Sample records for ii model validation

  1. Inhibitor-based validation of a homology model of the active-site of tripeptidyl peptidase II.

    PubMed

    De Winter, Hans; Breslin, Henry; Miskowski, Tamara; Kavash, Robert; Somers, Marijke

    2005-04-01

    A homology model of the active site region of tripeptidyl peptidase II (TPP II) was constructed based on the crystal structures of four subtilisin-like templates. The resulting model was subsequently validated by judging expectations of the model versus observed activities for a broad set of prepared TPP II inhibitors. The structure-activity relationships observed for the prepared TPP II inhibitors correlated nicely with the structural details of the TPP II active site model, supporting the validity of this model and its usefulness for structure-based drug design and pharmacophore searching experiments.

  2. Testing the Predictive Validity of the Hendrich II Fall Risk Model.

    PubMed

    Jung, Hyesil; Park, Hyeoun-Ae

    2018-03-01

    Cumulative data on patient fall risk have been compiled in electronic medical record systems, making it possible to test the validity of fall-risk assessment tools using data collected between admission and the occurrence of a fall. Hendrich II Fall Risk Model scores assessed at three time points of the hospital stay were extracted and used to test predictive validity: (a) upon admission, (b) at the time of the maximum fall-risk score between admission and falling or discharge, and (c) immediately before falling or discharge. Predictive validity was examined using seven predictive indicators. In addition, logistic regression analysis was used to identify factors that significantly affect the occurrence of a fall. Among the three time points, the maximum fall-risk score assessed between admission and falling or discharge showed the best predictive performance. Confusion or disorientation and a poor ability to rise from a sitting position were significant risk factors for a fall.
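    The predictive indicators mentioned in this abstract (sensitivity, specificity, and the predictive values) follow directly from a 2×2 classification at a chosen cut-off. A minimal sketch with made-up scores and outcomes, not the study's data:

```python
# Hedged sketch: predictive indicators for a fall-risk score at a cut-off.
# Scores, outcomes, and the cut-off below are illustrative only.

def predictive_indicators(scores, fell, cutoff):
    """Classify score >= cutoff as 'at risk' and compare with outcomes."""
    tp = sum(1 for s, f in zip(scores, fell) if s >= cutoff and f)
    fn = sum(1 for s, f in zip(scores, fell) if s < cutoff and f)
    fp = sum(1 for s, f in zip(scores, fell) if s >= cutoff and not f)
    tn = sum(1 for s, f in zip(scores, fell) if s < cutoff and not f)
    return {
        "sensitivity": tp / (tp + fn),   # fallers correctly flagged
        "specificity": tn / (tn + fp),   # non-fallers correctly cleared
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

scores = [7, 2, 5, 1, 6, 3, 8, 2]   # hypothetical Hendrich II scores
fell   = [1, 0, 1, 0, 0, 0, 1, 0]   # 1 = patient fell during the stay
print(predictive_indicators(scores, fell, cutoff=5))
```

    Sweeping `cutoff` over the observed score range is what produces the ROC curve the paper evaluates.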

  3. Person Heterogeneity of the BDI-II-C and Its Effects on Dimensionality and Construct Validity: Using Mixture Item Response Models

    ERIC Educational Resources Information Center

    Wu, Pei-Chen; Huang, Tsai-Wei

    2010-01-01

    This study applied the mixed Rasch model to investigate person heterogeneity of the Beck Depression Inventory-II-Chinese version (BDI-II-C) and its effects on dimensionality and construct validity. Person heterogeneity was reflected by two latent classes that differed qualitatively. Additionally, person heterogeneity adversely affected the…

  4. Model Validation Against The Modelers’ Data Archive

    DTIC Science & Technology

    2014-08-01

    completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as... readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are... available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release

  5. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies only without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection-line modeling and results of several test cases used to validate the capability.

  6. Predictive validity of the Hendrich fall risk model II in an acute geriatric unit.

    PubMed

    Ivziku, Dhurata; Matarese, Maria; Pedone, Claudio

    2011-04-01

    Falls are the most common adverse events reported in acute care hospitals, and older patients are the most likely to fall. The risk of falling cannot be completely eliminated, but it can be reduced through the implementation of a fall prevention program. A major evidence-based intervention to prevent falls has been the use of fall-risk assessment tools. Many tools have been developed in recent years, but most instruments have not been investigated regarding reliability, validity and clinical usefulness. This study evaluates the predictive validity and inter-rater reliability of the Hendrich fall risk model II (HFRM II) in order to identify older patients at risk of falling in geriatric units and to recommend its use in clinical practice. A prospective descriptive design was used. All patients over 65 years old consecutively admitted to a geriatric acute care unit of an Italian university hospital over an 8-month period were enrolled. Enrolled patients were screened for fall risk by nurses with the HFRM II within 24 h of admission. Falls occurring during the patients' hospital stay were registered. Inter-rater reliability, area under the ROC curve, sensitivity, specificity, positive and negative predictive values, and administration time were evaluated. 179 elderly patients were included. The inter-rater reliability was 0.87 (95% CI 0.71-1.00). The administration time was about 1 min. The most frequently reported risk factors were depression, incontinence and vertigo. Sensitivity and specificity were 86% and 43%, respectively. The optimal cut-off score for screening at-risk patients was 5, with an area under the ROC curve of 0.72. The risk factors most strongly associated with falls were confusion and depression. As falls of older patients are a common problem in acute care settings, it is necessary that nurses use specific, validated and reliable…
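    The area under the ROC curve reported here (0.72) can be computed nonparametrically as the probability that a faller's score outranks a non-faller's (the Mann-Whitney statistic, with ties counted as one half). A sketch on illustrative data, not the study's 179 patients:

```python
def auc_mann_whitney(scores, outcome):
    """Nonparametric AUC: fraction of (faller, non-faller) pairs in which
    the faller has the higher score, counting ties as 1/2."""
    pos = [s for s, y in zip(scores, outcome) if y]        # fallers
    neg = [s for s, y in zip(scores, outcome) if not y]    # non-fallers
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Four hypothetical patients: two fallers (scores 5, 6), two non-fallers (2, 5).
print(auc_mann_whitney([5, 2, 6, 5], [1, 0, 1, 0]))  # → 0.875
```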

  7. The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models

    EPA Science Inventory

    The second phase of the MicroArray Quality Control (MAQC-II) project evaluated common practices for developing and validating microarray-based models aimed at predicting toxicological and clinical endpoints. Thirty-six teams developed classifiers for 13 endpoints - some easy, som...

  8. Validation of PEP-II Resonantly Excited Turn-by-Turn BPM Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Yiton T.; Cai, Yunhai; Colocho, William.

    2007-06-28

    For optics measurement and modeling of the PEP-II electron (HER) and positron (LER) storage rings, we have been doing well with MIA [1], which requires analyzing turn-by-turn Beam Position Monitor (BPM) data that are resonantly excited at the horizontal, vertical, and longitudinal tunes. However, in anticipation that certain BPM buttons and even pins in the PEP-II IR region would be missing for the run starting in January 2007, we had been developing a data validation process to reduce the effect of degraded BPM data accuracy on PEP-II optics measurement and modeling. Besides the routine process for ranking BPM noise level through data correlation among BPMs with a singular-value decomposition (SVD), we could also check BPM data symplecticity by comparing the invariant ratios. Results from PEP-II measurement will be presented.
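    The SVD-based noise ranking described here can be sketched as follows: build a turn-by-turn BPM matrix, keep the dominant temporal modes of the resonant excitation (two per plane, the sine and cosine of the tune), and rank each BPM by its residual. All numbers below (tune, beta functions, noise level) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
turns, nbpm = 1024, 20
tune = 0.28                                         # illustrative betatron tune
phase = 2 * np.pi * tune * np.arange(turns)
beta = rng.uniform(5.0, 25.0, nbpm)                 # fake beta functions
# Coherent resonantly excited motion: rank-2 in time (sin/cos of the tune).
data = np.sqrt(beta) * np.sin(phase[:, None] + rng.uniform(0, 2 * np.pi, nbpm))
data[:, 7] += rng.normal(0, 2.0, turns)             # one deliberately noisy BPM

# Keep the two dominant temporal modes; the per-BPM rms residual ranks noise.
u, s, vt = np.linalg.svd(data, full_matrices=False)
clean = (u[:, :2] * s[:2]) @ vt[:2]
noise = np.sqrt(((data - clean) ** 2).mean(axis=0))
print("noisiest BPM index:", int(np.argmax(noise)))
```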

  9. Building and validating a prediction model for paediatric type 1 diabetes risk using next generation targeted sequencing of class II HLA genes.

    PubMed

    Zhao, Lue Ping; Carlsson, Annelie; Larsson, Helena Elding; Forsander, Gun; Ivarsson, Sten A; Kockum, Ingrid; Ludvigsson, Johnny; Marcus, Claude; Persson, Martina; Samuelsson, Ulf; Örtqvist, Eva; Pyo, Chul-Woo; Bolouri, Hamid; Zhao, Michael; Nelson, Wyatt C; Geraghty, Daniel E; Lernmark, Åke

    2017-11-01

    It is of interest to predict possible lifetime risk of type 1 diabetes (T1D) in young children for recruiting high-risk subjects into longitudinal studies of effective prevention strategies. Utilizing a case-control study in Sweden, we applied a recently developed next generation targeted sequencing technology to genotype class II genes and applied an object-oriented regression to build and validate a prediction model for T1D. In the training set, estimated risk scores were significantly different between patients and controls (P = 8.12 × 10⁻⁹²), and the area under the curve (AUC) from the receiver operating characteristic (ROC) analysis was 0.917. Using the validation data set, we validated the result with AUC of 0.886. Combining both training and validation data resulted in a predictive model with AUC of 0.903. Further, we performed a "biological validation" by correlating risk scores with 6 islet autoantibodies, and found that the risk score was significantly correlated with IA-2A (Z-score = 3.628, P < 0.001). When applying this prediction model to the Swedish population, where the lifetime T1D risk ranges from 0.5% to 2%, we anticipate identifying approximately 20 000 high-risk subjects after testing all newborns, and this calculation would identify approximately 80% of all patients expected to develop T1D in their lifetime. Through both empirical and biological validation, we have established a prediction model for estimating lifetime T1D risk, using class II HLA. This prediction model should prove useful for future investigations to identify high-risk subjects for prevention research in high-risk populations. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Physical properties of solar chromospheric plages. III - Models based on Ca II and Mg II observations

    NASA Technical Reports Server (NTRS)

    Kelch, W. L.; Linsky, J. L.

    1978-01-01

    Solar plages are modeled using observations of both the Ca II K and the Mg II h and k lines. A partial-redistribution approach is employed for calculating the line profiles on the basis of a grid of five model chromospheres. The computed integrated emission intensities for the five atmospheric models are compared with observations of six regions on the sun as well as with models of active-chromosphere stars. It is concluded that the basic plage model grid proposed by Shine and Linsky (1974) is still valid when the Mg II lines are included in the analysis and the Ca II and Mg II lines are analyzed using partial-redistribution diagnostics.

  11. Validated Competing Event Model for the Stage I-II Endometrial Cancer Population

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmona, Ruben; Gulaya, Sachin; Murphy, James D.

    2014-07-15

    Purpose/Objective(s): Early-stage endometrial cancer patients are at higher risk of noncancer mortality than of cancer mortality. Competing event models incorporating comorbidity could help identify women most likely to benefit from treatment intensification. Methods and Materials: 67,397 women with stage I-II endometrioid adenocarcinoma after total hysterectomy diagnosed from 1988 to 2009 were identified in Surveillance, Epidemiology, and End Results (SEER) and linked SEER-Medicare databases. Using demographic and clinical information, including comorbidity, we sought to develop and validate a risk score to predict the incidence of competing mortality. Results: In the validation cohort, increasing competing mortality risk score was associated with increased risk of noncancer mortality (subdistribution hazard ratio [SDHR], 1.92; 95% confidence interval [CI], 1.60-2.30) and decreased risk of endometrial cancer mortality (SDHR, 0.61; 95% CI, 0.55-0.78). Controlling for other variables, Charlson Comorbidity Index (CCI) = 1 (SDHR, 1.62; 95% CI, 1.45-1.82) and CCI >1 (SDHR, 3.31; 95% CI, 2.74-4.01) were associated with increased risk of noncancer mortality. The 10-year cumulative incidences of competing mortality within low-, medium-, and high-risk strata were 27.3% (95% CI, 25.2%-29.4%), 34.6% (95% CI, 32.5%-36.7%), and 50.3% (95% CI, 48.2%-52.6%), respectively. With increasing competing mortality risk score, we observed a significant decline in omega (ω), indicating a diminishing likelihood of benefit from treatment intensification. Conclusion: Comorbidity and other factors influence the risk of competing mortality among patients with early-stage endometrial cancer. Competing event models could improve our ability to identify patients likely to benefit from treatment intensification.
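    Cumulative incidence in the presence of competing mortality is typically estimated with an Aalen-Johansen-type estimator rather than 1 − Kaplan-Meier, so that deaths from the competing cause properly remove risk. A minimal sketch assuming untied event times, on toy data rather than the SEER cohort:

```python
def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence of `cause`.
    events: 0 = censored, 1 = cause of interest, 2 = competing cause.
    Assumes untied event times (ties would need grouped increments)."""
    pairs = sorted(zip(times, events))
    surv, cif, at_risk = 1.0, 0.0, len(pairs)
    curve = []
    for t, e in pairs:
        if e == cause:
            cif += surv * (1 / at_risk)     # increment by S(t-) * hazard
        if e != 0:
            surv *= 1 - 1 / at_risk         # any event depletes overall survival
        at_risk -= 1
        curve.append((t, cif))
    return curve

# Toy data: events of cause 1 at t=1 and t=3, a competing death at t=2, one censoring.
curve = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0], cause=1)
print(curve[-1])  # → (4, 0.5)
```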

  12. Validation of the Parental Facilitation of Mastery Scale-II.

    PubMed

    Zalta, Alyson K; Allred, Kelly M; Jayawickreme, Eranda; Blackie, Laura E R; Chambless, Dianne L

    2017-10-01

    To develop a more reliable and comprehensive version of the Parental Facilitation of Mastery Scale (PFMS). METHOD: In Study 1, 387 undergraduates completed an expanded PFMS (PFMS-II) and measures of parenting, perceived control, responses to early life challenges, and psychopathology. In Study 2, 182 trauma-exposed community participants completed the PFMS-II and measures of perceived control, psychopathology, and well-being. RESULTS: In Study 1, exploratory factor analysis of the PFMS-II revealed two factors. These factors replicated in Study 2; one item was removed to achieve measurement invariance across race. The final PFMS-II comprised a 10-item overprotection scale and a 7-item challenge scale. In both samples, this measure demonstrated good convergent and discriminant validity and was more reliable than the original PFMS. Parental challenge was a unique predictor of perceived control in both samples. CONCLUSION: The PFMS-II is a valid measure of important parenting behaviors not fully captured in other measures. © 2016 Wiley Periodicals, Inc.

  13. Filament winding cylinders. II - Validation of the process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  14. Establishment and Validation of GV-SAPS II Scoring System for Non-Diabetic Critically Ill Patients

    PubMed Central

    Liu, Wen-Yue; Lin, Shi-Gang; Zhu, Gui-Qi; Poucke, Sven Van; Braddock, Martin; Zhang, Zhongheng; Mao, Zhi; Shen, Fei-Xia

    2016-01-01

    Background and Aims: Recently, glucose variability (GV) has been reported as an independent risk factor for mortality in non-diabetic critically ill patients. However, GV is currently not incorporated in any severity scoring system for critically ill patients. The aim of this study was to establish and validate a modified Simplified Acute Physiology Score II (SAPS II) scoring system, integrated with GV parameters and named GV-SAPS II, specifically for non-diabetic critically ill patients to predict short-term and long-term mortality. Methods: Training and validation cohorts were extracted from the Multiparameter Intelligent Monitoring in Intensive Care database III version 1.3 (MIMIC-III v1.3). The GV-SAPS II score was constructed by Cox proportional hazard regression analysis and compared with the original SAPS II, Sepsis-related Organ Failure Assessment (SOFA) and Elixhauser scoring systems using the area under the receiver operating characteristic curve (auROC). Results: 4,895 and 5,048 eligible individuals were included in the training and validation cohorts, respectively. The GV-SAPS II score was established with four independent risk factors: hyperglycemia, hypoglycemia, standard deviation of blood glucose levels (GluSD), and SAPS II score. In the validation cohort, the auROC values of the new scoring system were 0.824 (95% CI: 0.813–0.834, P < 0.001) and 0.738 (95% CI: 0.725–0.750, P < 0.001) for 30 days and 9 months, respectively, which were significantly higher than those of the other models used in our study (all P < 0.001). Moreover, Kaplan-Meier plots demonstrated significantly worse outcomes in higher GV-SAPS II score groups for both 30-day and 9-month mortality endpoints (all P < 0.001). Conclusions: We established and validated a modified prognostic scoring system that integrates glucose variability for non-diabetic critically ill patients, named GV-SAPS II. It demonstrated superior prognostic capability and may be an optimal scoring system

  15. Establishment and Validation of GV-SAPS II Scoring System for Non-Diabetic Critically Ill Patients.

    PubMed

    Liu, Wen-Yue; Lin, Shi-Gang; Zhu, Gui-Qi; Poucke, Sven Van; Braddock, Martin; Zhang, Zhongheng; Mao, Zhi; Shen, Fei-Xia; Zheng, Ming-Hua

    2016-01-01

    Recently, glucose variability (GV) has been reported as an independent risk factor for mortality in non-diabetic critically ill patients. However, GV is currently not incorporated in any severity scoring system for critically ill patients. The aim of this study was to establish and validate a modified Simplified Acute Physiology Score II (SAPS II) scoring system, integrated with GV parameters and named GV-SAPS II, specifically for non-diabetic critically ill patients to predict short-term and long-term mortality. Training and validation cohorts were extracted from the Multiparameter Intelligent Monitoring in Intensive Care database III version 1.3 (MIMIC-III v1.3). The GV-SAPS II score was constructed by Cox proportional hazard regression analysis and compared with the original SAPS II, Sepsis-related Organ Failure Assessment (SOFA) and Elixhauser scoring systems using the area under the receiver operating characteristic curve (auROC). 4,895 and 5,048 eligible individuals were included in the training and validation cohorts, respectively. The GV-SAPS II score was established with four independent risk factors: hyperglycemia, hypoglycemia, standard deviation of blood glucose levels (GluSD), and SAPS II score. In the validation cohort, the auROC values of the new scoring system were 0.824 (95% CI: 0.813-0.834, P < 0.001) and 0.738 (95% CI: 0.725-0.750, P < 0.001) for 30 days and 9 months, respectively, which were significantly higher than those of the other models used in our study (all P < 0.001). Moreover, Kaplan-Meier plots demonstrated significantly worse outcomes in higher GV-SAPS II score groups for both 30-day and 9-month mortality endpoints (all P < 0.001). We established and validated a modified prognostic scoring system that integrates glucose variability for non-diabetic critically ill patients, named GV-SAPS II. It demonstrated superior prognostic capability and may be an optimal scoring system for prognostic evaluation in this patient group.
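    The glucose-variability inputs to the score (GluSD plus hypo-/hyperglycemia occurrence) are straightforward to compute from a patient's glucose series. A sketch using common clinical thresholds (70 and 180 mg/dL), which are assumptions here rather than the study's definitions:

```python
import statistics

def glucose_variability(glucose_mg_dl, hypo=70, hyper=180):
    """Summarize the GV-related inputs: sample SD of glucose (GluSD-style)
    and whether any hypo-/hyperglycemic readings occurred.
    Thresholds are illustrative clinical conventions, not the study's."""
    return {
        "sd": statistics.stdev(glucose_mg_dl),
        "hypoglycemia": any(g < hypo for g in glucose_mg_dl),
        "hyperglycemia": any(g > hyper for g in glucose_mg_dl),
    }

# Hypothetical bedside glucose readings for one ICU stay:
print(glucose_variability([95, 140, 185, 60, 110]))
```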

  16. TAMDAR Sensor Validation in 2003 AIRS II

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Murray, John J.; Anderson, Mark V.; Mulally, Daniel J.; Jensen, Kristopher R.; Grainger, Cedric A.; Delene, David J.

    2005-01-01

    This study entails an assessment of TAMDAR in situ temperature, relative humidity, and wind sensor data from seven flights of the UND Citation II. These data are undergoing rigorous assessment to determine their viability to significantly augment the domestic Meteorological Data Communications Reporting System (MDCRS) and international Aircraft Meteorological Data Reporting (AMDAR) observational databases, in order to improve the performance of regional and global numerical weather prediction models. NASA Langley Research Center participated in the Second Alliance Icing Research Study from November 17 to December 17, 2003. TAMDAR data taken during this period are compared with validation data from the UND Citation. The data indicate acceptable performance of the TAMDAR sensor when compared to measurements from the UND Citation research instruments.

  17. Social anxiety and fear of negative evaluation: construct validity of the BFNE-II.

    PubMed

    Carleton, R Nicholas; Collimore, Kelsey C; Asmundson, Gordon J G

    2007-01-01

    disorder. Psychological Assessment, 17, 179-190]; however [Carleton, R. N., McCreary, D., Norton, P. J., & Asmundson, G. J. G. (in press-a). The Brief Fear of Negative Evaluation Scale, Revised. Depression & Anxiety; Collins, K. A., Westra, H. A., Dozois, D. J. A., & Stewart, S. H. (2005). The validity of the brief version of the fear of negative evaluation scale. Journal of Anxiety Disorders, 19, 345-359] recommend that these items be reworded to maintain scale sensitivity. The present study examined the reliability and validity of the BFNE-II, a version of the BFNE evaluating revisions of the reverse-worded items, in a community sample. A unitary model of the BFNE-II resulted in excellent confirmatory factor analysis fit indices. Moderate convergent and discriminant validity were found when BFNE-II items were correlated with additional independent measures of social anxiety [i.e., Social Interaction Anxiety & Social Phobia Scales; Mattick, R. P., & Clarke, J. C. (1998). Development and validation of measures of social phobia scrutiny fear and social interaction anxiety. Behaviour Research and Therapy, 36, 455-470] and fear [i.e., Anxiety Sensitivity Index; Reiss, S., & McNally, R. J. (1985). The expectancy model of fear. In S. Reiss, R. R. Bootzin (Eds.), Theoretical issues in behaviour therapy (pp. 107-121). New York: Academic Press; and the Illness/Injury Sensitivity Index; Carleton, R. N., Park, I., & Asmundson, G. J. G. (in press-b). The Illness/Injury Sensitivity Index: an examination of construct validity. Depression & Anxiety]. These findings support the utility of the revised items and the validity of the BFNE-II as a measure of the fear of negative evaluation. Implications and future research directions are discussed.

  18. OC5 Project Phase II: Validation of Global Loads of the DeepCwind Floating Semisubmersible Wind Turbine

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2017-10-01

    This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems through the comparison of simulated responses of select system designs to physical test data. Validation activities such as these lead to improvement of offshore wind modeling tools, which will enable the development of more innovative and cost-effective offshore wind designs. For Phase II of the project, numerical models of the DeepCwind floating semisubmersible wind system were validated using measurement data from a 1/50th-scale validation campaign performed at the Maritime Research Institute Netherlands offshore wave basin. Validation of the models was performed by comparing the calculated ultimate and fatigue loads for eight different wave-only and combined wind/wave test cases against the measured data, after calibration was performed using free-decay, wind-only, and wave-only tests. The results show a decent estimation of both the ultimate and fatigue loads for the simulated results, but with a fairly consistent underestimation in the tower and upwind mooring line loads that can be attributed to an underestimation of wave-excitation forces outside the linear wave-excitation region, and the presence of broadband frequency excitation in the experimental measurements from wind. Participant results showed varied agreement with the experimental measurements based on the modeling approach used. Modeling attributes that enabled better agreement included: the use of a dynamic mooring model; wave stretching, or some other hydrodynamic modeling approach that excites frequencies outside the linear wave region; nonlinear wave kinematics models; and unsteady aerodynamics models. Also, it was observed that a Morison-only hydrodynamic modeling approach could create excessive

  19. OC5 Project Phase II: Validation of Global Loads of the DeepCwind Floating Semisubmersible Wind Turbine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.

    This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems through the comparison of simulated responses of select system designs to physical test data. Validation activities such as these lead to improvement of offshore wind modeling tools, which will enable the development of more innovative and cost-effective offshore wind designs. For Phase II of the project, numerical models of the DeepCwind floating semisubmersible wind system were validated using measurement data from a 1/50th-scale validation campaign performed at the Maritime Research Institute Netherlands offshore wave basin. Validation of the models was performed by comparing the calculated ultimate and fatigue loads for eight different wave-only and combined wind/wave test cases against the measured data, after calibration was performed using free-decay, wind-only, and wave-only tests. The results show a decent estimation of both the ultimate and fatigue loads for the simulated results, but with a fairly consistent underestimation in the tower and upwind mooring line loads that can be attributed to an underestimation of wave-excitation forces outside the linear wave-excitation region, and the presence of broadband frequency excitation in the experimental measurements from wind. Participant results showed varied agreement with the experimental measurements based on the modeling approach used. Modeling attributes that enabled better agreement included: the use of a dynamic mooring model; wave stretching, or some other hydrodynamic modeling approach that excites frequencies outside the linear wave region; nonlinear wave kinematics models; and unsteady aerodynamics models. Also, it was observed that a Morison-only hydrodynamic modeling approach could create excessive

  20. Validity of the Sleep Subscale of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II)

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Malone, Carrie J.

    2006-01-01

    Currently there are no available sleep disorder measures for individuals with severe and profound intellectual disability. We, therefore, attempted to establish the external validity of the "Diagnostic Assessment for the Severely Handicapped-II" (DASH-II) sleep subscale by comparing daily observational sleep data with the responses of…

  1. Derivation of a 3D pharmacophore model for the angiotensin-II site one receptor

    NASA Astrophysics Data System (ADS)

    Prendergast, Kristine; Adams, Kym; Greenlee, William J.; Nachbar, Robert B.; Patchett, Arthur A.; Underwood, Dennis J.

    1994-10-01

    A systematic search has been used to derive a hypothesis for the receptor-bound conformation of A-II antagonists at the AT1 receptor. The validity of the pharmacophore hypothesis has been tested using CoMFA, which included 50 diverse A-II antagonists, spanning four orders of magnitude in activity. The resulting cross-validated R2 of 0.64 (conventional R2 of 0.76) is indicative of a good predictive model of activity, and has been used to estimate potency for a variety of non-peptidyl antagonists. The structural model for the non-peptide has been compared with respect to the natural substrate, A-II, by generating peptide to non-peptide overlays.
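    The cross-validated R² (q²) quoted for the CoMFA model comes from a leave-one-out loop: each compound is predicted by a model fit to the others, and the resulting PRESS is compared with the total sum of squares. This sketch substitutes ordinary least squares on toy descriptors for CoMFA's PLS-on-fields machinery, so it illustrates only the cross-validation scheme:

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated R^2 (q^2) for a least-squares model.
    X: list of descriptor rows, y: activities. A stand-in for PLS."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # hold out compound i
        Xi = np.c_[np.ones(mask.sum()), X[mask]] # intercept + descriptors
        coef, *_ = np.linalg.lstsq(Xi, y[mask], rcond=None)
        pred = np.r_[1.0, X[i]] @ coef
        press += (y[i] - pred) ** 2
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - press / ss_tot

# Toy, exactly linear data: q^2 should be 1.
print(loo_q2([[0], [1], [2], [3], [4]], [1, 3, 5, 7, 9]))
```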

  2. Monte Carlo-based validation of neutronic methodology for EBR-II analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, J.R.; Finck, P.J.

    1993-01-01

    The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.

  3. Homology modeling, binding site identification and docking study of human angiotensin II type I (Ang II-AT1) receptor.

    PubMed

    Vyas, Vivek K; Ghate, Manjunath; Patel, Kinjal; Qureshi, Gulamnizami; Shah, Surmil

    2015-08-01

    Ang II-AT1 receptors play an important role in mediating virtually all of the physiological actions of Ang II. Several drugs (SARTANs) are available that can block the AT1 receptor effectively and lower blood pressure in patients with hypertension. Currently, no experimental Ang II-AT1 structure is available; therefore, in this study we modeled the Ang II-AT1 receptor structure using homology modeling, followed by identification and characterization of binding sites, thereby assessing the druggability of the receptor. Homology models were constructed using MODELLER and the I-TASSER server, then refined and validated using PROCHECK, in which 96.9% of 318 residues were present in the favoured regions of the Ramachandran plots. Various Ang II-AT1 receptor antagonists are available on the market as antihypertensive drugs, so we performed docking studies with binding-site prediction algorithms to predict different binding pockets on the modeled proteins. The identification of 3D structures and binding sites for various known drugs will guide structure-based drug design of novel compounds as Ang II-AT1 receptor antagonists for the treatment of hypertension. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  4. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
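The verification exercise described above compares a finite-difference solution against problems with a known analytical answer. A minimal sketch of that idea: steady-state 1D conduction between two fixed-temperature boundaries, solved by Jacobi iteration, whose exact solution is a linear profile (grid size, boundary values, and iteration scheme are illustrative assumptions, not HYDRA-II's actual numerics).

```python
def solve_conduction(n, t_left, t_right, iters=20000):
    """Jacobi iteration for T'' = 0 on n interior nodes with
    Dirichlet boundary temperatures t_left and t_right."""
    T = [0.0] * n
    for _ in range(iters):
        Tn = T[:]
        for i in range(n):
            left = t_left if i == 0 else T[i - 1]
            right = t_right if i == n - 1 else T[i + 1]
            Tn[i] = 0.5 * (left + right)  # discrete Laplacian = 0
        T = Tn
    return T

# 9 interior nodes between 100-degree and 0-degree walls:
# the converged solution is the exact linear profile 90, 80, ..., 10.
T = solve_conduction(9, 100.0, 0.0)
```

Matching the numerical result against the known linear profile is exactly the kind of verification/validation comparison Volume III documents, just at toy scale.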

  5. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
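The force interaction via flexible lines can be sketched as a tension-only elastic coupling: the line pulls the bodies together when stretched and carries no load when slack. This is an illustrative stand-in for the coupling concept, not POST 2 code; the stiffness model and values are assumptions.

```python
import math

def line_force(p1, p2, rest_length, stiffness):
    """Force on body 1 from an elastic line to body 2 (3D positions).
    Returns zero when the line is slack (tension-only behavior)."""
    d = [b - a for a, b in zip(p1, p2)]
    dist = math.sqrt(sum(c * c for c in d))
    stretch = dist - rest_length
    if stretch <= 0.0 or dist == 0.0:
        return [0.0, 0.0, 0.0]  # slack line carries no load
    mag = stiffness * stretch
    return [mag * c / dist for c in d]  # directed toward body 2

# Line of natural length 8 stretched to 10: tension 100 * 2 = 200
# acting on body 1 along +z, toward the other body.
f = line_force([0, 0, 0], [0, 0, 10.0], rest_length=8.0, stiffness=100.0)
```

Each body's trajectory would then be integrated separately with this coupling force (and its reaction) added to the aerodynamic and gravitational forces.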

  6. The Danish Barriers Questionnaire-II: preliminary validation in cancer pain patients.

    PubMed

    Jacobsen, Ramune; Møldrup, Claus; Christrup, Lona; Sjøgren, Per; Hansen, Ole Bo

    2009-01-01

The objective of this study was to examine the psychometric properties of the Danish version of the Barriers Questionnaire-II (DBQ-II). The validated Norwegian version of the DBQ-II was translated into Danish. Cancer patients for the study were recruited from specialized pain management facilities. Thirty-three patients responded to the DBQ-II, Hospital Anxiety and Depression Scale, and Brief Pain Inventory pain severity scale. A factor analysis of the DBQ-II resulted in six scales. Scale one, Fatalism, consisted of three items addressing fatalistic beliefs regarding cancer pain management. Scale two, Immune System, consisted of three items addressing the belief that pain medications harm the immune system. Scale three, Monitor, consisted of three items addressing the fear that pain medicine masks changes in one's body. Scale four, Communication, consisted of five items addressing the concern that reports of pain distract the physician from treating the cancer, and the belief that "good" patients do not complain. Scale five, Addiction, consisted of two items addressing the fear of becoming addicted to pain medication. Finally, scale six, Tolerance, consisted of three items addressing the fear of developing tolerance to the analgesic effect of pain medicine. Items related to medication side effects were analyzed as separate units. The DBQ-II total had an internal consistency of 0.87. The DBQ-II total score was related to measures of pain relief and anxiety. The DBQ-II seems to be a reliable and valid measure of the barriers to pain management among Danish cancer patients.
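The internal consistency of 0.87 reported above is a Cronbach's alpha. A minimal sketch of how that coefficient is computed from per-item scores (illustrative data; population variances used):

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: one score list per item, with the
    j-th entry of each list belonging to respondent j."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(it) for it in items)           # sum of item variances
    totals = [sum(it[j] for it in items) for j in range(n)]  # per-respondent totals
    return k / (k - 1) * (1 - item_vars / var(totals))

# Two perfectly correlated items yield alpha = 1.0; weakly related
# items pull alpha down toward 0.
a = cronbach_alpha([[1, 2, 3, 4], [2, 1, 4, 3]])
```

Alpha rises as items covary more strongly, which is why a multi-scale instrument like the DBQ-II also reports a factor structure rather than relying on the total-score alpha alone.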

  7. Reliability and validity of the test of gross motor development-II in Korean preschool children: applying AHP.

    PubMed

    Kim, Chung-Il; Han, Dong-Wook; Park, Il-Hyeok

    2014-04-01

The Test of Gross Motor Development-II (TGMD-II) is a frequently used assessment tool for measuring motor ability. The purpose of this study is to investigate the reliability and validity of TGMD-II's weighting scores (by comparing pre-weighted TGMD-II scores with post-weighted ones) as well as examine applicability of the TGMD-II on Korean preschool children. A total of 121 Korean children (three kindergartens) participated in this study. There were 65 preschoolers who were 5 years old (37 boys and 28 girls) and 56 preschoolers who were 6 years old (34 boys and 22 girls). For internal consistency, reliability, and construct validity, only one researcher evaluated all of the children using the TGMD-II in the following areas: running; galloping; sliding; hopping; leaping; horizontal jumping; overhand throwing; underhand rolling; striking a stationary ball; stationary dribbling; kicking; and catching. For concurrent validity, the evaluator measured physical fitness (strength, flexibility, power, agility, endurance, and balance). The key findings were as follows: first, the reliability coefficient and the validity coefficient between pre-weighted and post-weighted TGMD-II scores were quite similar. Second, the research showed adequate reliability and validity of the TGMD-II for Korean preschool children. The TGMD-II is a proper instrument to test Korean children's motor development. Yet, applying relative weighting on the TGMD-II should be a point of consideration.

  8. Groundwater Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed E. Hassan

    2006-01-24

Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of
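The question of how many stochastic realizations conform with validation data can be illustrated with one simple acceptance metric: score each realization by its RMSE against the field observations and report the fraction that meet a tolerance. This is a sketch in the spirit of the hierarchical approach above; the RMSE criterion and tolerance are illustrative assumptions, not the paper's five actual metrics.

```python
def acceptable_fraction(realizations, observed, tol):
    """Fraction of stochastic realizations whose RMSE against the
    field observations is within the acceptance tolerance."""
    def rmse(sim):
        return (sum((s - o) ** 2 for s, o in zip(sim, observed))
                / len(observed)) ** 0.5

    accepted = [r for r in realizations if rmse(r) <= tol]
    return len(accepted) / len(realizations)

# Three realizations of heads at two wells vs. observed values:
# two conform within tolerance, one is far off.
frac = acceptable_fraction([[1, 1], [2, 2], [10, 10]], [1, 1], tol=1.5)
```

A decision tree would then compare this fraction (and the other metrics) against thresholds to decide whether the ensemble is adequate or the model needs refinement.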

  9. Validation and refinement of mixture volumetric material properties identified in superpave monitoring project II : phase II.

    DOT National Transportation Integrated Search

    2015-02-01

This study was initiated to validate and refine mixture volumetric material properties identified in the Superpave Monitoring Project II. It has been found that differences in performance are primarily controlled by differences in gradation and r...

  10. Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.

    PubMed

    Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo

    2018-01-01

This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability (WA) as outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. This study contributed to the literature by showing that: A) The scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) Job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) Both the health impairment and motivational processes were associated with WA, and the results suggested that leadership may impact WA, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported as the JD-R model can be operationalized by the instrument. This may be helpful for transferral of complex survey results and work life theories to practitioners in the field.

  11. Simulation model calibration and validation : phase II : development of implementation handbook and short course.

    DOT National Transportation Integrated Search

    2006-01-01

    A previous study developed a procedure for microscopic simulation model calibration and validation and evaluated the procedure via two relatively simple case studies using three microscopic simulation models. Results showed that default parameters we...

  12. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural

  13. Reliability and Validity of the Beck Depression Inventory--II with Adolescent Psychiatric Inpatients

    ERIC Educational Resources Information Center

    Osman, Augustine; Kopper, Beverly A; Barrios, Frank; Gutierrez, Peter M.; Bagge, Courtney L.

    2004-01-01

    This investigation was conducted to validate the Beck Depression Inventory--II (BDI-II; A. T. Beck, R. A. Steer, & G. K. Brown, 1996) in samples of adolescent psychiatric inpatients. The sample in each substudy was primarily Caucasian. In Study 1, expert raters (N=7) and adolescent psychiatric inpatients (N=13) evaluated the BDI-II items to assess…

  14. Comparison of mortality prediction models and validation of SAPS II in critically ill burns patients.

    PubMed

    Pantet, O; Faouzi, M; Brusselaers, N; Vernay, A; Berger, M M

    2016-06-30

Specific burn outcome prediction scores such as the Abbreviated Burn Severity Index (ABSI), Ryan, Belgian Outcome of Burn Injury (BOBI) and revised Baux scores have been extensively studied. Validation studies of the critical care score SAPS II (Simplified Acute Physiology Score) have included burns patients but not addressed them as a cohort. The study aimed to compare their performance in a Swiss burns intensive care unit (ICU) and to observe whether they were affected by a standardized definition of inhalation injury. We conducted a retrospective cohort study, including all consecutive ICU burn admissions (n=492) between 1996 and 2013: 5 epochs were defined by protocol changes. As required for SAPS II calculation, stays <24h were excluded. Data were collected on age, gender, total body surface area burned (TBSA) and inhalation injury (systematic standardized diagnosis since 2006). Study epochs were compared (χ2 test, ANOVA). Score performance was assessed by receiver operating characteristic curve analysis. SAPS II performed well (AUC 0.89), particularly in burns <40% TBSA (AUC 0.93). Revised Baux and ABSI scores were not affected by the standardized diagnosis of inhalation injury and showed the best performance (AUC 0.92 and 0.91 respectively). In contrast, the accuracy of the BOBI and Ryan scores was lower (AUC 0.84 and 0.81) and reduced after 2006. The excellent predictive performance of the classic scores (revised Baux score and ABSI) was confirmed. SAPS II was nearly as accurate, particularly in burns <40% TBSA. Ryan and BOBI scores were least accurate, as they heavily weight inhalation injury.
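The AUC values compared above have a direct probabilistic reading: the chance that a randomly chosen patient who died received a higher risk score than a randomly chosen survivor (ties counting half). A minimal sketch of that rank-based computation (illustrative data, not the study's):

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation:
    fraction of positive/negative pairs ranked correctly."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Risk scores for two deaths (label 1) and two survivors (label 0):
# one survivor outranks one death, so 3 of 4 pairs are concordant.
auc = roc_auc([0.9, 0.6, 0.4, 0.3], [1, 0, 1, 0])
```

An AUC of 0.5 means the score ranks no better than chance; values near 0.9, as for SAPS II and the revised Baux score here, indicate strong discrimination.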

  16. Validation Studies for Diet History Questionnaire II | EGRP/DCCPS/NCI/NIH

    Cancer.gov

    Links to validation findings from the original Diet History Questionnaire (DHQ). These findings are unlikely to be greatly modified by minimal modifications to the DHQ II food list and the updated nutrient database.

  17. Translation, adaptation and validation of a Portuguese version of the Moorehead-Ardelt Quality of Life Questionnaire II.

    PubMed

    Maciel, João; Infante, Paulo; Ribeiro, Susana; Ferreira, André; Silva, Artur C; Caravana, Jorge; Carvalho, Manuel G

    2014-11-01

The prevalence of obesity has increased worldwide. An assessment of the impact of obesity on health-related quality of life (HRQoL) requires specific instruments. The Moorehead-Ardelt Quality of Life Questionnaire II (MA-II) is a widely used instrument to assess HRQoL in morbidly obese patients. The objective of this study was to translate and validate a Portuguese version of the MA-II. The study included forward and backward translations of the original MA-II. The reliability of the Portuguese MA-II was estimated using the internal consistency and test-retest methods. For validation purposes, the Spearman's rank correlation coefficient was used to evaluate the correlation between the Portuguese MA-II and the Portuguese versions of two other questionnaires, the 36-item Short Form Health Survey (SF-36) and the Impact of Weight on Quality of Life-Lite (IWQOL-Lite). One hundred and fifty morbidly obese patients were randomly assigned to test the reliability and validity of the Portuguese MA-II. Good internal consistency was demonstrated by a Cronbach's alpha coefficient of 0.80, and a very good agreement in terms of test-retest reliability was recorded, with an overall intraclass correlation coefficient (ICC) of 0.88. The total sums of MA-II scores and each item of MA-II were significantly correlated with all domains of SF-36 and IWQOL-Lite. A statistically significant negative correlation was found between the MA-II total score and BMI. Moreover, age, gender and surgical status were independent predictors of MA-II total score. A reliable and valid Portuguese version of the MA-II was produced, thus enabling the routine use of MA-II in the morbidly obese Portuguese population.
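The convergent-validity correlations above use Spearman's rank coefficient, which compares the rank orderings of two questionnaires rather than their raw scores. A minimal sketch for the no-ties case, using the classic d-squared formula (illustrative data):

```python
def spearman_rho(x, y):
    """Spearman rank correlation for data without ties:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
    where d_i is the difference between the two rank vectors."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Two instruments that order three patients almost identically:
rho = spearman_rho([1, 2, 3], [1, 3, 2])
```

Tied scores, common on Likert-style items, require the mid-rank correction (computing Pearson's r on mid-ranks), which this sketch omits for brevity.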

  18. External Validation of European System for Cardiac Operative Risk Evaluation II (EuroSCORE II) for Risk Prioritization in an Iranian Population

    PubMed Central

    Atashi, Alireza; Amini, Shahram; Tashnizi, Mohammad Abbasi; Moeinipour, Ali Asghar; Aazami, Mathias Hossain; Tohidnezhad, Fariba; Ghasemi, Erfan; Eslami, Saeid

    2018-01-01

    Introduction The European System for Cardiac Operative Risk Evaluation II (EuroSCORE II) is a prediction model which maps 18 predictors to a 30-day post-operative risk of death concentrating on accurate stratification of candidate patients for cardiac surgery. Objective The objective of this study was to determine the performance of the EuroSCORE II risk-analysis predictions among patients who underwent heart surgeries in one area of Iran. Methods A retrospective cohort study was conducted to collect the required variables for all consecutive patients who underwent heart surgeries at Emam Reza hospital, Northeast Iran between 2014 and 2015. Univariate and multivariate analysis were performed to identify covariates which significantly contribute to higher EuroSCORE II in our population. External validation was performed by comparing the real and expected mortality using area under the receiver operating characteristic curve (AUC) for discrimination assessment. Also, Brier Score and Hosmer-Lemeshow goodness-of-fit test were used to show the overall performance and calibration level, respectively. Results A total of 2581 patients (59.6% male) were included. The observed mortality rate was 3.3%, but EuroSCORE II had a prediction of 4.7%. Although the overall performance was acceptable (Brier score=0.047), the model showed poor discriminatory power by AUC=0.667 (sensitivity=61.90, and specificity=66.24) and calibration (Hosmer-Lemeshow test, P<0.01). Conclusion Our study showed that the EuroSCORE II discrimination power is less than optimal for outcome prediction and less accurate for resource allocation programs. It highlights the need for recalibration of this risk stratification tool aiming to improve post cardiac surgery outcome predictions in Iran. PMID:29617500
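The overall-performance figure above (Brier score = 0.047) is simply the mean squared difference between each patient's predicted risk and the 0/1 outcome. A minimal sketch (illustrative data, not the study's):

```python
def brier_score(probs, outcomes):
    """Brier score: mean squared error between predicted probabilities
    of death and observed binary outcomes. Lower is better; 0 is a
    perfect forecaster."""
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

# A confident, correct forecaster scores 0; an uninformative one
# predicting 0.5 for everyone scores 0.25.
b = brier_score([0.5, 0.5], [1, 0])
```

Because the Brier score mixes calibration and discrimination, studies like this one report it alongside the AUC (discrimination) and the Hosmer-Lemeshow test (calibration) rather than on its own.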

  20. A new 3D finite element model of the IEC 60318-1 artificial ear: II. Experimental and numerical validation

    NASA Astrophysics Data System (ADS)

    Bravo, Agustín; Barham, Richard; Ruiz, Mariano; López, Juan Manuel; De Arcas, Guillermo; Alonso, Jesus

    2012-12-01

In part I, the feasibility of using three-dimensional (3D) finite elements (FEs) to model the acoustic behaviour of the IEC 60318-1 artificial ear was studied and the numerical approach compared with classical lumped-element modelling. It was shown that by using a more complex acoustic model that took account of thermo-viscous effects, geometric shapes and dimensions, it was possible to develop a realistic model. This model then had clear advantages in comparison with the models based on equivalent circuits using lumped parameters. In fact, results from FE modelling produce a better understanding of the physical phenomena produced inside ear simulator couplers, facilitating spatial and temporal visualization of the sound fields produced. The objective of this study (part II) is to extend the investigation by validating the numerical calculations against measurements on an ear simulator conforming to IEC 60318-1. For this purpose, an appropriate commercially available device is taken and a complete 3D FE model developed for it. The numerical model is based on key dimensional data obtained with a non-destructive x-ray inspection technique. Measurements of the acoustic transfer impedance have been carried out on the same device at a national measurement institute using the method embodied in IEC 60318-1. Having accounted for the actual device dimensions, the thermo-viscous effects inside narrow slots and holes and environmental conditions, the results of the numerical modelling were found to be in good agreement with the measured values.

  1. Modeling and validating the cost and clinical pathway of colorectal cancer.

    PubMed

    Joranger, Paal; Nesbakken, Arild; Hoff, Geir; Sorbye, Halfdan; Oshaug, Arne; Aas, Eline

    2015-02-01

Cancer is a major cause of morbidity and mortality, and colorectal cancer (CRC) is the third most common cancer in the world. The estimated costs of CRC treatment vary considerably, and if CRC costs in a model are based on empirically estimated total costs of stage I, II, III, or IV treatments, then they lack some flexibility to capture future changes in CRC treatment. The purpose was 1) to describe how to model CRC costs and survival and 2) to validate the model in a transparent and reproducible way. We applied a semi-Markov model with 70 health states and tracked age and time since specific health states (using tunnels and a 3-dimensional data matrix). The model parameters are based on an observational study at Oslo University Hospital (2049 CRC patients), the National Patient Register, literature, and expert opinion. The target population was patients diagnosed with CRC. The model followed patients diagnosed with CRC from age 70 until death or age 100. The study focused on the perspective of health care payers. The model was validated for face validity, internal and external validity, and cross-validity. The validation showed a satisfactory match with other models and empirical estimates for both cost and survival time, without any preceding calibration of the model. The model can be used to 1) address a range of CRC-related themes (general model) like survival and evaluation of the cost of treatment and prevention measures; 2) make predictions from intermediate to final outcomes; 3) estimate changes in resource use and costs due to changing guidelines; and 4) adjust for future changes in treatment and trends over time. The model is adaptable to other populations.
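The cohort-trace mechanics behind a semi-Markov cost model can be sketched at toy scale: a state distribution is advanced through annual transition probabilities while per-cycle costs accumulate. The three states, probabilities, and costs below are invented for illustration; the actual model has 70 states with tunnel states tracking time since events.

```python
TRANS = {  # current state -> {next state: annual transition probability}
    "well":       {"well": 0.85, "progressed": 0.10, "dead": 0.05},
    "progressed": {"progressed": 0.70, "dead": 0.30},
    "dead":       {"dead": 1.0},  # absorbing state
}
COST = {"well": 1000.0, "progressed": 8000.0, "dead": 0.0}

def run_cohort(years):
    """Advance a cohort (everyone starts 'well') through annual cycles,
    accumulating expected cost per cycle; returns final state
    distribution and total expected cost per patient."""
    dist = {"well": 1.0, "progressed": 0.0, "dead": 0.0}
    total_cost = 0.0
    for _ in range(years):
        total_cost += sum(dist[s] * COST[s] for s in dist)
        nxt = {s: 0.0 for s in dist}
        for s, p in dist.items():
            for t, q in TRANS[s].items():
                nxt[t] += p * q
        dist = nxt
    return dist, total_cost

dist, cost = run_cohort(10)
```

Tunnel states extend this scheme by splitting a state like "progressed" into year-1, year-2, ... copies so that transition probabilities and costs can depend on time since progression, which a plain Markov state cannot represent.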

  2. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  3. PACIC Instrument: disentangling dimensions using published validation models.

    PubMed

    Iglesias, K; Burnand, B; Peytremann-Bridevaux, I

    2014-06-01

To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument. More specifically to test all published validation models, using one single data set and appropriate statistical tools. Validation study using data from cross-sectional survey. A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). French version of the 20-items PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA: based on (i) a Pearson estimator of variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) a likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Median of item responses varied between 1 and 4 (range 1-5), and range of missing values was between 5.7 and 12.3%. Strong floor and ceiling effects were present. Even though loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in complement to the consideration of single-item results, might be used instead of the five dimensions usually described.

  4. Comparative validity of MMPI-2 and MCMI-II personality disorder classifications.

    PubMed

    Wise, E A

    1996-06-01

Minnesota Multiphasic Personality Inventory-2 (MMPI-2) overlapping and nonoverlapping scales were demonstrated to perform comparably to their original MMPI forms. They were then evaluated for convergent and discriminant validity with the Millon Clinical Multiaxial Inventory-II (MCMI-II) personality disorder scales. The MMPI-2 and MCMI-II personality disorder scales demonstrated convergent and discriminant coefficients similar to their original forms. However, the MMPI-2 personality scales classified significantly more of the sample as Dramatic, whereas the MCMI-II diagnosed more of the sample as Anxious. Furthermore, single-scale and 2-point code type classification rates were quite low, indicating that at the level of the individual, the personality disorder scales are not measuring comparable constructs. Hence, each instrument is providing similar and unique information, justifying their continued use together for the purpose of diagnosing personality disorders.

  5. Predicting Backdrafting and Spillage for Natural-Draft Gas Combustion Appliances: Validating VENT-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rapp, Vi H.; Pastor-Perez, Albert; Singer, Brett C.

    2013-04-01

VENT-II is a computer program designed to provide detailed analysis of natural draft and induced draft combustion appliance vent-systems (i.e., furnace or water heater). This program is capable of predicting house depressurization thresholds that lead to backdrafting and spillage of combustion appliances; however, validation reports of the program being applied for this purpose are not readily available. The purpose of this report is to assess VENT-II’s ability to predict combustion gas spillage events due to house depressurization by comparing VENT-II simulated results with experimental data for four appliance configurations. The results show that VENT-II correctly predicts depressurizations resulting in spillage for natural draft appliances operating in cold and mild outdoor conditions, but not for hot conditions. In the latter case, the predicted depressurizations depend on whether the vent section is defined as part of the vent connector or the common vent when setting up the model. Overall, the VENT-II solver requires further investigation before it can be used reliably to predict spillage caused by depressurization over a full year of weather conditions, especially where hot conditions occur.

  6. Update: Validation, Edits, and Application Processing. Phase II and Error-Prone Model Report.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    An update to the Validation, Edits, and Application Processing and Error-Prone Model Report (Section 1, July 3, 1980) is presented. The objective is to present the most current data obtained from the June 1980 Basic Educational Opportunity Grant applicant and recipient files and to determine whether the findings reported in Section 1 of the July…

  7. Concurrent and Predictive Validity of the Phelps Kindergarten Readiness Scale-II

    ERIC Educational Resources Information Center

    Duncan, Jennifer; Rafter, Erin M.

    2005-01-01

    The purpose of this research was to establish the concurrent and predictive validity of the Phelps Kindergarten Readiness Scale, Second Edition (PKRS-II; L. Phelps, 2003). Seventy-four kindergarten students of diverse ethnic backgrounds enrolled in a northeastern suburban school participated in the study. The concurrent administration of the…

  8. Validation of Arabic and English versions of the ARSMA-II Acculturation Rating Scale.

    PubMed

    Jadalla, Ahlam; Lee, Jerry

    2015-02-01

To translate and adapt the Acculturation Rating Scale of Mexican-Americans II (ARSMA-II) for Arab Americans. A multistage translation process followed by a pilot and a large study. The translated and adapted versions, Acculturation Rating Scale for Arabic Americans-II Arabic and English (ARSAA-IIA, ARSAA-IIE), were validated in a sample of 297 Arab Americans. Factor analyses with principal axis factoring extractions and direct oblimin rotations were used to identify the underlying structure of ARSAA-II. Factor analysis confirmed the underlying structure of ARSAA-II and produced two interpretable factors labeled as 'Attraction to American Culture' (AAmC) and 'Attraction to Arabic Culture' (AArC). The Cronbach's alphas of AAmC and AArC were .89 and .85 respectively. Findings support the use of the ARSAA-IIA and ARSAA-IIE to assess acculturation among Arab Americans. The emergent factors of ARSAA-II support the theoretical structure of the original ARSMA-II tool and show high internal consistency.

  9. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool automates the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main features include: interaction with GE PSLF; use of the GE PSLF Play-In function for generator model validation; databases of projects (model validation studies), historic events, and power plants; advanced visualization capabilities; and automatic report generation.

  10. Validation of the Beck Depression Inventory-II in a Low-Income African American Sample of Medical Outpatients

    ERIC Educational Resources Information Center

    Grothe, Karen B.; Dutton, Gareth R.; Jones, Glenn N.; Bodenlos, Jamie; Ancona, Martin; Brantley, Phillip J.

    2005-01-01

    The psychometric properties of the Beck Depression Inventory-II (BDI-II) are well established with primarily Caucasian samples. However, little is known about its reliability and validity with minority groups. This study evaluated the psychometric properties of the BDI-II in a sample of low-income African American medical outpatients (N = 220).…

  11. Using airborne laser scanning profiles to validate marine geoid models

    NASA Astrophysics Data System (ADS)

    Julge, Kalev; Gruno, Anti; Ellmann, Artu; Liibusk, Aive; Oja, Tõnis

    2014-05-01

    Airborne laser scanning (ALS) is a remote sensing method which utilizes LiDAR (Light Detection And Ranging) technology. The datasets collected are important sources for a wide range of scientific and engineering applications. ALS is mostly used to measure terrain surfaces for the compilation of Digital Elevation Models, but it can also be used in other applications. This contribution focuses on the use of an ALS system for measuring sea surface heights and validating gravimetric geoid models over marine areas, based on the ability of ALS to register echoes of the LiDAR pulse from the water surface. A case study was carried out to analyse the possibilities of validating marine geoid models using ALS profiles. A test area at the southern shores of the Gulf of Finland was selected for regional geoid validation. ALS measurements were carried out by the Estonian Land Board in spring 2013 at different altitudes and using different scan rates. The single-wavelength Leica ALS50-II laser scanner on board a small aircraft was used to determine the sea level (with respect to the GRS80 reference ellipsoid), which roughly follows the equipotential surface of the Earth's gravity field. For the validation a high-resolution (1'x2') regional gravimetric GRAV-GEOID2011 model was used. This geoid model covers the entire area of Estonia and the surrounding waters of the Baltic Sea. The fit between the geoid model and GNSS/levelling data within the Estonian dry land revealed an RMS of residuals of ±1… ±2 cm. Note that such a fitting validation cannot proceed over marine areas. Therefore, an ALS observation-based methodology was developed to evaluate the GRAV-GEOID2011 quality over marine areas. The accuracy of the acquired ALS datasets was analyzed, and an optimal width of the nadir corridor containing good-quality ALS data was determined. The impact of ALS scan angle range and flight altitude on the obtainable vertical accuracy was investigated as well. The quality of the point cloud is analysed by cross

  12. Concurrent Validity of the Online Version of the Keirsey Temperament Sorter II.

    ERIC Educational Resources Information Center

    Kelly, Kevin R.; Jugovic, Heidi

    2001-01-01

    Data from the Keirsey Temperament Sorter II online instrument and Myers Briggs Type Indicator (MBTI) for 203 college freshmen were analyzed. Positive correlations appeared between the concurrent MBTI and Keirsey measures of psychological type, giving preliminary support to the validity of the online version of Keirsey. (Contains 28 references.)…

  13. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes relevant to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the chance of success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD turbulence models in a complex flow environment, within a building-block approach to validation in which cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.

  14. Are the binary typology models of alcoholism valid in polydrug abusers?

    PubMed

    Pombo, Samuel; da Costa, Nuno F; Figueira, Maria L

    2015-01-01

    To evaluate the dichotomy of type I/II and type A/B alcoholism typologies in opiate-dependent patients with a comorbid alcohol dependence problem (ODP-AP). The validity assessment process comprised the information regarding the history of alcohol use (internal validity), cognitive-behavioral variables regarding substance use (external validity), and indicators of treatment during 6-month follow-up (predictive validity). ODP-AP subjects classified as type II/B presented an early and much more severe drinking problem and a worse clinical prognosis when considering opiate treatment variables as compared with ODP-AP subjects defined as type I/A. Furthermore, type II/B patients endorse more general positive beliefs and expectancies related to the effect of alcohol and tend to drink heavily across several intra- and interpersonal situations as compared with type I/A patients. These findings confirm two different forms of alcohol dependence, recognized as a low-severity/vulnerability subgroup and a high-severity/vulnerability subgroup, in an opiate-dependent population with a lifetime diagnosis of alcohol dependence.

  15. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness (DCS). These models are statistically assessed against DCS cases and, over time, have gradually incorporated the biophysics of bubble formation. This paper reviews this evolution and discusses its limitations. The review is organized around a comparison of the biophysical criteria and theoretical foundations of decompression models. The DCS-predictive capability was then analyzed to assess whether it could be improved by combining different approaches. Most operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the circulating bubble amount has become a major output. By merging both views, it seems possible to build a relevant global decompression model that simulates bubble production while predicting DCS risk for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate such a global model. Doppler ultrasound and DCS data are essential: i. to make the correlation and validation phases reliable; ii. to adjust biophysical criteria to best fit the observed bubble kinetics; and iii. to build a relevant risk function.

  16. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  17. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series

  18. Effects of Mg II and Ca II ionization on ab-initio solar chromosphere models

    NASA Technical Reports Server (NTRS)

    Rammacher, W.; Cuntz, M.

    1991-01-01

    Acoustically heated solar chromosphere models are computed considering radiation damping by (non-LTE) emission from H(-) and by Mg II and Ca II emission lines. The radiative transfer equations for the Mg II k and Ca II K emission lines are solved using the core-saturation method with complete redistribution. The Mg II k and Ca II K cooling rates are compared with the VAL model C. Several substantial improvements over the work of Ulmschneider et al. (1987) are included. It is found that the rapid temperature rises caused by the ionization of Mg II are not formed in the middle chromosphere, but occur at larger atmospheric heights. These models represent the temperature structure of the 'real' solar chromosphere much better. This result is a major precondition for the study of ab-initio models for solar flux tubes based on MHD wave propagation and also for ab-initio models for the solar transition layer.

  19. Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part I: Benchmark comparisons of WIMS-D5 and DRAGON cell and control rod parameters with MCNP5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mollerach, R.; Leszczynski, F.; Fink, J.

    2006-07-01

    In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which had been progressing slowly during the previous ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO{sub 2} rods with an active length of 530 cm. For the reactor physics area, a revision and update of calculation methods and models was recently carried out covering cell, supercell (control rod) and core calculations. As a validation of the new models, some benchmark comparisons were made with Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems, based on a slightly idealized model of the Atucha-I core, obtained with the WIMS-D5 and DRAGON codes against MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)

  20. Transport of fluid and solutes in the body II. Model validation and implications.

    PubMed

    Gyenge, C C; Bowen, B D; Reed, R K; Bert, J L

    1999-09-01

    A mathematical model of short-term whole body fluid, protein, and ion distribution and transport developed earlier [see companion paper: C. C. Gyenge, B. D. Bowen, R. K. Reed, and J. L. Bert. Am. J. Physiol. 277 (Heart Circ. Physiol. 46): H1215-H1227, 1999] is validated using experimental data available in the literature. The model was tested against data measured for the following three types of experimental infusions: 1) hyperosmolar saline solutions with an osmolarity in the range of 2,000-2,400 mosmol/l, 2) saline solutions with an osmolarity of approximately 270 mosmol/l and composition comparable with Ringer solution, and 3) an isosmotic NaCl solution with an osmolarity of approximately 300 mosmol/l. Good agreement between the model predictions and the experimental data was obtained with respect to the trends and magnitudes of fluid shifts between the intra- and extracellular compartments, extracellular ion and protein contents, and hematocrit values. The model is also able to yield information about inaccessible or difficult-to-measure system variables such as intracellular ion contents, cellular volumes, and fluid fluxes across the vascular capillary membrane, data that can be used to help interpret the behavior of the system.

  1. Validation of the Offending-Related Attitudes Questionnaire of CRIME-PICS II Scale (Chinese)

    ERIC Educational Resources Information Center

    Chui, Wing Hong; Wu, Joseph; Kwok, Yan Yuen; Liu, Liu

    2017-01-01

    This study examined the factor structure, reliability, and validity of the first part of the Chinese version of the CRIME-PICS II Scale, a self-administrated instrument assessing offending-related attitudes. Data were collected from three samples: male Hong Kong young offenders, female Mainland Chinese prisoners, and Hong Kong college students.…

  2. Validation of CRIB II for prediction of mortality in premature babies.

    PubMed

    Rastogi, Pallav Kumar; Sreenivas, V; Kumar, Nirmal

    2010-02-01

    Validation of the Clinical Risk Index for Babies (CRIB II) score in predicting neonatal mortality in preterm neonates < or = 32 weeks gestational age. Prospective cohort study. Tertiary care neonatal unit. 86 consecutively born preterm neonates with gestational age < or = 32 weeks. The five variables related to CRIB II were recorded within the first hour of admission for data analysis. The receiver operating characteristic (ROC) curve was used to check the accuracy of the mortality prediction. The Hosmer-Lemeshow goodness-of-fit test was used to assess the discrepancy between observed and expected outcomes. A total of 86 neonates (males 59.6%; mean birthweight: 1228 +/- 398 grams; mean gestational age: 28.3 +/- 2.4 weeks) were enrolled in the study, of which 17 (19.8%) left hospital against medical advice (LAMA) before reaching the study end point. Among the 69 neonates completing the study, 24 (34.8%) had an adverse outcome during the hospital stay and 45 (65.2%) had a favorable outcome. CRIB II correctly predicted adverse outcome in 90.3% (Hosmer-Lemeshow goodness-of-fit test P=0.6). The area under the curve (AUC) for CRIB II was 0.9032. In an intention-to-treat analysis with LAMA cases included as survivors, the mortality prediction was 87%; if these were included as having died, the mortality prediction was 83.1%. The CRIB II score was found to be a good predictive instrument for mortality in preterm infants < or = 32 weeks gestation.
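    The AUC figure reported above has a simple interpretation worth sketching: the area under the ROC curve equals the probability that a randomly chosen case with an adverse outcome receives a higher risk score than a randomly chosen case with a favorable outcome (the Mann-Whitney view). The scores and labels below are hypothetical, not the study's data.

    ```python
    def auroc(scores, labels):
        """AUC via the Mann-Whitney U statistic: the fraction of
        (positive, negative) pairs in which the positive case scores
        higher; ties count as half a win."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical risk scores (higher = higher predicted risk);
    # label 1 = adverse outcome, 0 = favorable outcome.
    scores = [12, 9, 14, 4, 6, 11, 10, 8]
    labels = [ 1, 1,  1, 0, 0,  1,  0, 0]
    auc = auroc(scores, labels)  # 15 of 16 pairs ranked correctly -> 0.9375
    ```

    An AUC around 0.9, as reported for CRIB II, means roughly nine out of ten such pairs are ranked correctly by the score.
    
    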

  3. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  4. Empirical agreement in model validation.

    PubMed

    Jebeile, Julie; Barberousse, Anouk

    2016-04-01

    Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion, as a model can be so adjusted as to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate the uses of empirical agreement within the process of model validation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Preliminary validation of the Spanish version of the Multiple Stimulus Types Ambiguity Tolerance Scale (MSTAT-II).

    PubMed

    Arquero, José L; McLain, David L

    2010-05-01

    Despite widespread interest in ambiguity tolerance and other information-related individual differences, existing measures are conceptually dispersed and psychometrically weak. This paper presents the Spanish version of MSTAT-II, a short, stimulus-oriented, and psychometrically improved measure of an individual's orientation toward ambiguous stimuli. Results obtained reveal adequate reliability, validity, and temporal stability. These results support the use of MSTAT-II as an adequate measure of ambiguity tolerance.

  6. An eleven-year validation of a physically-based distributed dynamic ecohydrological model tRIBS+VEGGIE: Walnut Gulch Experimental Watershed

    NASA Astrophysics Data System (ADS)

    Sivandran, G.; Bisht, G.; Ivanov, V. Y.; Bras, R. L.

    2008-12-01

    A coupled, dynamic vegetation and hydrologic model, tRIBS+VEGGIE, was applied to the semiarid Walnut Gulch Experimental Watershed in Arizona. The physically-based, distributed nature of the coupled model allows for parameterization and simulation of watershed vegetation-water-energy dynamics on timescales varying from hourly to interannual. The model also allows for explicit spatial representation of processes that vary due to complex topography, such as lateral redistribution of moisture and partitioning of radiation with respect to aspect and slope. Model parameterization and forcing were conducted using readily available databases for topography, soil types, and land use cover, as well as data from the network of meteorological stations located within the Walnut Gulch watershed. In order to test the performance of the model, three sets of simulations were conducted over an 11-year period from 1997 to 2007. Two simulations focus on heavily instrumented nested watersheds within the Walnut Gulch basin: (i) the Kendall watershed, which is dominated by annual grasses; and (ii) the Lucky Hills watershed, which is dominated by a mixture of deciduous and evergreen shrubs. The third set of simulations covers the entire Walnut Gulch Watershed. Model validation and performance were evaluated in relation to three broad categories: (i) energy balance components, where the network of meteorological stations was used to validate the key energy fluxes; (ii) water balance components, where the network of flumes, rain gauges and soil moisture stations installed within the watershed was utilized to validate the manner in which the model partitions moisture; and (iii) vegetation dynamics, where remote sensing products from MODIS were used to validate spatial and temporal vegetation dynamics. Model results demonstrate satisfactory spatial and temporal agreement with observed data, giving confidence that key ecohydrological processes can be adequately represented for future applications of tRIBS+VEGGIE in

  7. Long-term ELBARA-II Assistance to SMOS Land Product and Algorithm Validation at the Valencia Anchor Station (MELBEX Experiment 2010-2013)

    NASA Astrophysics Data System (ADS)

    Lopez-Baeza, Ernesto; Wigneron, Jean-Pierre; Schwank, Mike; Miernecki, Maciej; Kerr, Yann; Casal, Tania; Delwart, Steven; Fernandez-Moran, Roberto; Mecklenburg, Susanne; Coll Pajaron, M. Amparo; Salgado Hernanz, Paula

    The main activity of the Valencia Anchor Station (VAS) is currently to support the validation of SMOS (Soil Moisture and Ocean Salinity) Level 2 and 3 land products (soil moisture, SM, and vegetation optical depth, TAU). With this aim, the European Space Agency (ESA) has provided the Climatology from Satellites Group of the University of Valencia with an ELBARA-II microwave radiometer under a loan agreement since September 2009. During this time, brightness temperatures (TB) have been acquired continuously, except during normal maintenance or minor repair interruptions. ELBARA-II is an L-band dual-polarization radiometer with two channels (1400-1418 MHz, 1409-1427 MHz). It continuously measures over a vineyard field (El Renegado, Caudete de las Fuentes, Valencia) from a 15 m platform with a constant protocol for calibration and angular scanning measurements, with the aim of assisting the validation of SMOS land products and the calibration of the L-MEB (L-Band Emission of the Biosphere) model, the basis for the SMOS Level 2 Land Processor, over the VAS validation site. One of the advantages of the VAS site is the possibility of studying two different environmental conditions along the year: while the vine cycle extends mainly between April and October, during the rest of the year the area remains under bare soil conditions, adequate for the calibration of the soil model. The measurement protocol currently running has proven robust during the whole operation time and will be extended as long as possible to continue providing a long-term data set of ELBARA-II TB measurements and retrieved SM and TAU. This data set is also proving useful in support of SMOS scientific activities: the VAS area and, specifically, the ELBARA-II site offer good conditions to monitor the long-term evolution of SMOS Level 2 and Level 3 land products and to interpret eventual anomalies that may obscure hidden sensor biases. In addition, SM and TAU that are currently

  8. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    USGS Publications Warehouse

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  9. Hydrodynamical models of cometary H II regions

    NASA Astrophysics Data System (ADS)

    Steggles, H. G.; Hoare, M. G.; Pittard, J. M.

    2017-04-01

    We have modelled the evolution of cometary H II regions produced by zero-age main-sequence stars of O and B spectral types, which are driving strong winds and are born off-centre from spherically symmetric cores with power-law (α = 2) density slopes. A model parameter grid was produced that spans stellar mass, age and core density. Exploring this parameter space, we investigated limb-brightening, a feature commonly seen in cometary H II regions. We found that stars with mass M⋆ ≥ 12 M⊙ produce this feature. Our models have a cavity bounded by a contact discontinuity separating hot shocked wind and ionized ambient gas that is similar in size to the surrounding H II region. Because of early pressure confinement, we did not see shocks outside of the contact discontinuity for stars with M⋆ ≤ 40 M⊙, but the cavities were found to continue to grow. The cavity size in each model plateaus as the H II region stagnates. The spectral energy distributions of our models are similar to those from identical stars evolving in uniform density fields. The turn-over frequency is slightly lower in our power-law models as a result of a higher proportion of low-density gas covered by the H II regions.

  10. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to determine whether any of the underlying assumptions are violated.

  11. INACTIVATION OF CRYPTOSPORIDIUM OOCYSTS IN A PILOT-SCALE OZONE BUBBLE-DIFFUSER CONTACTOR - II: MODEL VALIDATION AND APPLICATION

    EPA Science Inventory

    The ADR model developed in Part I of this study was successfully validated with experimental data obtained for the inactivation of C. parvum and C. muris oocysts with a pilot-scale ozone bubble-diffuser contactor operated with treated Ohio River water. Kinetic parameters, required...

  12. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
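    The permutation-testing idea recommended above can be sketched in a few lines: outcome labels are repeatedly shuffled to build a null distribution of the performance measure, and the p-value is the fraction of permutations whose performance reaches the observed one. The sketch below uses a simple AUC as the performance measure, on hypothetical predicted probabilities and outcomes; it is an illustration of the principle, not the study's LASSO-based pipeline.

    ```python
    import random

    def auc(scores, labels):
        """Rank-based AUC: fraction of (positive, negative) pairs where
        the positive case has the higher score (ties count 0.5)."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def permutation_pvalue(scores, labels, n_perm=2000, seed=0):
        """Shuffle the outcome labels n_perm times to build the null
        distribution of AUC, and count how often a permuted AUC reaches
        the observed one."""
        rng = random.Random(seed)
        observed = auc(scores, labels)
        shuffled = list(labels)
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)
            if auc(scores, shuffled) >= observed:
                hits += 1
        return (hits + 1) / (n_perm + 1)  # add-one rule avoids p = 0

    # Hypothetical predicted complication probabilities and observed
    # complications (1 = complication occurred).
    scores = [0.9, 0.8, 0.75, 0.7, 0.6, 0.4, 0.3, 0.2, 0.15, 0.1]
    labels = [1,   1,   1,    0,   1,   0,   0,   0,   0,    0]
    p = permutation_pvalue(scores, labels)  # small p: performance beats chance
    ```

    A small p-value indicates the model's apparent performance is unlikely to arise from chance labeling alone, which is exactly the significance evidence the abstract argues should accompany NTCP model validation.
    
    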

  13. Reliability and validity: Part II.

    PubMed

    Davis, Debora Winders

    2004-01-01

    Determining measurement reliability and validity involves complex processes. There is usually room for argument about most instruments. It is important that the researcher clearly describes the processes upon which she made the decision to use a particular instrument, and presents the evidence available showing that the instrument is reliable and valid for the current purposes. In some cases, the researcher may need to conduct pilot studies to obtain evidence upon which to decide whether the instrument is valid for a new population or a different setting. In all cases, the researcher must present a clear and complete explanation for the choices she has made regarding reliability and validity. The consumer must then judge the degree to which the researcher has provided adequate and theoretically sound rationale. Although I have tried to touch on most of the important concepts related to measurement reliability and validity, it is beyond the scope of this column to be exhaustive. There are textbooks devoted entirely to specific measurement issues if readers require more in-depth knowledge.

  14. Turbine Engine Mathematical Model Validation

    DTIC Science & Technology

    1976-12-01

    AEDC-TR-76-90. Turbine Engine Mathematical Model Validation. Engine Test Facility, Arnold Engineering Development Center, Air Force... Keywords: YJ101-GE-100 engine; turbine engines; mathematical models; computations. This report presents and discusses the results of an investigation to develop a rationale and technique for the validation of turbine engine steady-state

  15. Validation of the LOD score compared with APACHE II score in prediction of the hospital outcome in critically ill patients.

    PubMed

    Khwannimit, Bodin

    2008-01-01

    The Logistic Organ Dysfunction (LOD) score is an organ dysfunction score that can predict hospital mortality. The aim of this study was to validate the performance of the LOD score compared with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score in a mixed intensive care unit (ICU) at a tertiary referral university hospital in Thailand. The data were collected prospectively on consecutive ICU admissions over a 24-month period from July 1, 2004 until June 30, 2006. Discrimination was evaluated by the area under the receiver operating characteristic curve (AUROC). Calibration was assessed by the Hosmer-Lemeshow goodness-of-fit H statistic. The overall fit of each model was evaluated by the Brier score. Overall, 1,429 patients were enrolled during the study period. Mortality was 20.9% in the ICU and 27.9% in the hospital. The median ICU and hospital lengths of stay were 3 and 18 days, respectively, for all patients. Both models showed excellent discrimination. The AUROC was 0.860 [95% confidence interval (CI) = 0.838-0.882] for the LOD and 0.898 (95% CI = 0.879-0.917) for the APACHE II. The LOD score had perfect calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-square of 10 (p = 0.44). However, the APACHE II had poor calibration, with a Hosmer-Lemeshow goodness-of-fit H chi-square of 75.69 (p < 0.001). The Brier scores for overall fit were 0.123 (95% CI = 0.107-0.141) for the LOD and 0.114 (95% CI = 0.098-0.132) for the APACHE II. Thus, the LOD score was found to be accurate for predicting hospital mortality in general critically ill patients in Thailand.
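The three assessment measures used in this study, AUROC for discrimination, the Hosmer-Lemeshow H statistic for calibration, and the Brier score for overall fit, are straightforward to compute. Below is a minimal Python sketch with toy predicted risks and outcomes; the data and function names are illustrative, not the study's dataset or code.

```python
def auroc(probs, outcomes):
    """Discrimination: area under the ROC curve (rank formulation)."""
    pos = [p for p, o in zip(probs, outcomes) if o == 1]
    neg = [p for p, o in zip(probs, outcomes) if o == 0]
    wins = sum((a > b) + 0.5 * (a == b) for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

def brier(probs, outcomes):
    """Overall fit: mean squared difference between predicted risk and outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def hosmer_lemeshow(probs, outcomes, groups=10):
    """Calibration: chi-square of observed vs expected events per risk group."""
    pairs = sorted(zip(probs, outcomes))
    n, chi2 = len(pairs), 0.0
    for g in range(groups):
        chunk = pairs[g * n // groups:(g + 1) * n // groups]
        if not chunk:
            continue
        m = len(chunk)
        expected = sum(p for p, _ in chunk)
        observed = sum(o for _, o in chunk)
        pbar = expected / m
        if 0 < pbar < 1:
            chi2 += (observed - expected) ** 2 / (m * pbar * (1 - pbar))
    return chi2

# Toy predicted risks and hospital outcomes (illustrative only).
outcomes = [0, 0, 1, 0, 1, 1, 0, 1]
probs = [0.1, 0.2, 0.8, 0.3, 0.7, 0.9, 0.4, 0.6]
```

A large Hosmer-Lemeshow chi-square (small p-value) signals poor calibration even when, as for APACHE II here, discrimination is excellent; the two properties are independent and both need checking.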

  16. External validation of the Intensive Care National Audit & Research Centre (ICNARC) risk prediction model in critical care units in Scotland.

    PubMed

    Harrison, David A; Lone, Nazir I; Haddow, Catriona; MacGillivray, Moranne; Khan, Angela; Cook, Brian; Rowan, Kathryn M

    2014-01-01

    Risk prediction models are used in critical care for risk stratification, summarising and communicating risk, supporting clinical decision-making and benchmarking performance. However, they require validation before they can be used with confidence, ideally using independently collected data from a different source to that used to develop the model. The aim of this study was to validate the Intensive Care National Audit & Research Centre (ICNARC) model using independently collected data from critical care units in Scotland. Data were extracted from the Scottish Intensive Care Society Audit Group (SICSAG) database for the years 2007 to 2009. Recoding and mapping of variables was performed, as required, to apply the ICNARC model (2009 recalibration) to the SICSAG data using standard computer algorithms. The performance of the ICNARC model was assessed for discrimination, calibration and overall fit and compared with that of the Acute Physiology And Chronic Health Evaluation (APACHE) II model. There were 29,626 admissions to 24 adult, general critical care units in Scotland between 1 January 2007 and 31 December 2009. After exclusions, 23,269 admissions were included in the analysis. The ICNARC model outperformed APACHE II on measures of discrimination (c index 0.848 versus 0.806), calibration (Hosmer-Lemeshow chi-squared statistic 18.8 versus 214) and overall fit (Brier's score 0.140 versus 0.157; Shapiro's R 0.652 versus 0.621). Model performance was consistent across the three years studied. The ICNARC model performed well when validated, using independently collected data, in a population external to that in which it was developed.

  17. On the validity of travel-time based nonlinear bioreactive transport models in steady-state flow.

    PubMed

    Sanz-Prat, Alicia; Lu, Chuanhe; Finkel, Michael; Cirpka, Olaf A

    2015-01-01

    Travel-time based models simplify the description of reactive transport by replacing the spatial coordinates with the groundwater travel time, posing a quasi one-dimensional (1-D) problem and potentially rendering the determination of multidimensional parameter fields unnecessary. While the approach is exact for strictly advective transport in steady-state flow if the reactive properties of the porous medium are uniform, its validity is unclear when local-scale mixing affects the reactive behavior. We compare a two-dimensional (2-D), spatially explicit, bioreactive, advective-dispersive transport model, considered as "virtual truth", with three 1-D travel-time based models which differ in the conceptualization of longitudinal dispersion: (i) neglecting dispersive mixing altogether, (ii) introducing a local-scale longitudinal dispersivity constant in time and space, and (iii) using an effective longitudinal dispersivity that increases linearly with distance. The reactive system considers biodegradation of dissolved organic carbon, which is introduced into a hydraulically heterogeneous domain together with oxygen and nitrate. Aerobic and denitrifying bacteria use the energy of the microbial transformations for growth. We analyze six scenarios differing in the variance of log-hydraulic conductivity and in the inflow boundary conditions (constant versus time-varying concentration). The concentrations of the 1-D models are mapped to the 2-D domain by means of the kinematic (for case i), and mean groundwater age (for cases ii & iii), respectively. The comparison between concentrations of the "virtual truth" and the 1-D approaches indicates extremely good agreement when using an effective, linearly increasing longitudinal dispersivity in the majority of the scenarios, while the other two 1-D approaches reproduce at least the concentration tendencies well. At late times, all 1-D models give valid approximations of two-dimensional transport. We conclude that the

  18. Global precipitation measurements for validating climate models

    NASA Astrophysics Data System (ADS)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. In order to fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as reference for the satellites. While the resulting product can be deemed the best available source of quality validation data, awareness of the limitations of such data sets is important to avoid drawing wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps to elaborate on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.

  19. Beware of external validation! - A Comparative Study of Several Validation Techniques used in QSAR Modelling.

    PubMed

    Majumdar, Subhabrata; Basak, Subhash C

    2018-04-26

    Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but a small n (i.e., n < p). Motivated by the evidence of inadequacies of external validation in estimating the true predictive capability of a statistical model in recent literature, this paper performs an extensive and comparative study of this method with several other validation techniques. We compared four validation methods: leave-one-out, K-fold, external and multi-split validation, using statistical models built using the LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics show high variation among different random splits of the data and hence are not recommended for predictive QSAR models. LOO has the overall best performance among all validation methods applied in our scenario. Results from external validation are too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
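The instability of external validation that the authors report can be reproduced on synthetic data: the held-out q2 of even a simple one-descriptor model varies widely across random train/test splits, while leave-one-out yields a single deterministic value. The Python sketch below is a hedged illustration on simulated data, not the paper's dataset or LASSO models.

```python
import random
import statistics

rng = random.Random(0)

# Simulated small QSAR-like dataset: activity = descriptor + noise.
x = [rng.uniform(0, 10) for _ in range(20)]
y = [xi + rng.gauss(0, 1.5) for xi in x]

def fit_line(xs, ys):
    """Ordinary least squares for a single descriptor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / \
        sum((u - mx) ** 2 for u in xs)
    return my - b * mx, b

def q2(xs, ys, a, b, ref_mean):
    """Predictive squared correlation of test predictions vs a reference mean."""
    press = sum((v - (a + b * u)) ** 2 for u, v in zip(xs, ys))
    return 1 - press / sum((v - ref_mean) ** 2 for v in ys)

# External validation: the metric varies from one random 75/25 split to another.
ext_q2 = []
idx = list(range(20))
for _ in range(50):
    rng.shuffle(idx)
    train, test = idx[:15], idx[15:]
    a, b = fit_line([x[i] for i in train], [y[i] for i in train])
    ref = statistics.mean(y[i] for i in train)
    ext_q2.append(q2([x[i] for i in test], [y[i] for i in test], a, b, ref))
spread = max(ext_q2) - min(ext_q2)

# Leave-one-out: one deterministic summary over all n folds.
press = 0.0
for i in range(20):
    tr = [j for j in range(20) if j != i]
    a, b = fit_line([x[j] for j in tr], [y[j] for j in tr])
    press += (y[i] - (a + b * x[i])) ** 2
loo_q2 = 1 - press / sum((v - statistics.mean(y)) ** 2 for v in y)
```

The nonzero spread of `ext_q2` across splits is the variance the paper warns about; `loo_q2` does not depend on any random split at all.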

  20. Validation of biomarkers to predict response to immunotherapy in cancer: Volume II - clinical validation and regulatory considerations.

    PubMed

    Dobbin, Kevin K; Cesano, Alessandra; Alvarez, John; Hawtin, Rachael; Janetzki, Sylvia; Kirsch, Ilan; Masucci, Giuseppe V; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Zhang, Jenny; Butterfield, Lisa H; Thurin, Magdalena

    2016-01-01

    There is growing recognition that immunotherapy is likely to significantly improve health outcomes for cancer patients in the coming years. Currently, while a subset of patients experience substantial clinical benefit in response to different immunotherapeutic approaches, the majority of patients do not, yet are still exposed to significant drug toxicities. Therefore, a growing need for the development and clinical use of predictive biomarkers exists in the field of cancer immunotherapy. Predictive cancer biomarkers can be used to identify the patients who are or who are not likely to derive benefit from specific therapeutic approaches. In order to be applicable in a clinical setting, predictive biomarkers must be carefully shepherded through a step-wise, highly regulated developmental process. Volume I of this two-volume document focused on the pre-analytical and analytical phases of the biomarker development process, by providing background, examples and "good practice" recommendations. In the current Volume II, the focus is on the clinical validation, validation of clinical utility and regulatory considerations for biomarker development. Together, this two-volume series is meant to provide guidance on the entire biomarker development process, with a particular focus on the unique aspects of developing immune-based biomarkers. Specifically, knowledge about the challenges to clinical validation of predictive biomarkers, which has been gained from numerous successes and failures in other contexts, will be reviewed together with statistical methodological issues related to bias and overfitting. The different trial designs used for the clinical validation of biomarkers will also be discussed, as the selection of clinical metrics and endpoints becomes critical to establish the clinical utility of the biomarker during the clinical validation phase of the biomarker development. Finally, the regulatory aspects of submission of biomarker assays to the U.S. Food and

  1. Transient PVT measurements and model predictions for vessel heat transfer. Part II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.

    2010-07-01

    Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.

  2. Better prognostic marker in ICU - APACHE II, SOFA or SAP II!

    PubMed

    Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim

    2016-01-01

    This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. This was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. Predicted mortality of APACHE II, SAP II and SOFA was calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97±8.53) was higher than in survivors (15.82±8.79), with a statistically significant p value (<0.001). The average SOFA score in non-survivors (9.68±4.88) was higher than in survivors (5.63±3.63), with a statistically significant p value (<0.001). The average SAP II score in non-survivors (53.71±19.05) was higher than in survivors (30.18±16.24), with a statistically significant p value (<0.001). All three tested scoring models (APACHE II, SAP II and SOFA) would be accurate enough for a general description of our ICU patients. APACHE II showed better calibration and discrimination power than SAP II and SOFA.

  3. Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes

    PubMed Central

    Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

    2011-01-01

    Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559

  4. Reliability and Validity of Kurdish Language Version of Health Promoting Lifestyle Profile II among Kurdish Healthcare Providers Kurdish Version of HPLP-II.

    PubMed

    Kamali, Aram Salih Mohammed Amin; Sadeghi, Roya; Tol, Azar; Yaseri, Mahdi

    2016-12-01

    Unhealthy lifestyles pose a significant threat to public health. This study aimed to assess the validity and reliability of a Kurdish version of the HPLP-II instrument among Kurdish healthcare providers, whose society and culture differ from those of North America and Spain, where the instrument was developed. The instrument was translated into Kurdish, back translated, and pilot tested to ascertain cultural sensitivity. It was then evaluated using a convenience sample of 460 healthcare providers in the Kurdistan region of northern Iraq using a principal components factor analysis. The order of factors was entirely identical to those isolated previously during the psychometric assessment of the English language version. The majority of study participants were male (55%); 39.2% were nurses, 42% had less than five years of working experience, and 82.1% held a high school diploma. The mean (SE) of the Physical Activities dimension was low (15.3 ± 4.8) compared to the Spiritual Growth dimension (24.5 ± 4.4). Moreover, the Cronbach's alpha coefficient for the overall HPLP-II questionnaire was 0.870; however, the coefficient for the Nutrition dimension was low (0.622) compared to the Physical Activities dimension (0.792). Furthermore, the correlation between items ranged from 0.099 to 0.611. The Kurdish version of the HPLP-II demonstrated initial reliability and validity. It is a valuable tool for evaluating lifestyles and for assessing lifestyle interventions built to improve the health of Kurds.

  5. Validation of the Health-Promoting Lifestyle Profile II for Hispanic male truck drivers in the Southwest.

    PubMed

    Mullins, Iris L; O'Day, Trish; Kan, Tsz Yin

    2013-08-01

    The aims of the study were to validate the English and Spanish versions of the Health-Promoting Lifestyle Profile II (HPLP II) with Hispanic male truck drivers and to determine if there were any differences in drivers' responses based on driving responsibility. The methods included a descriptive correlation design, the HPLP II (English and Spanish versions), and a demographic questionnaire. Fifty-two Hispanic drivers participated in the study. There were no significant differences in long haul and short haul drivers' responses to the HPLP II. Cronbach's alpha for the Spanish version was .97, and the subscale alphas ranged from .74 to .94. The English version alpha was .92, and the subscale alphas ranged from .68 to .84. Findings suggest the subscales of Health Responsibility, Physical Activities, Nutrition, and Spiritual Growth on the HPLP II Spanish and English versions may not adequately assess health-promoting behaviors and cultural influences for the Hispanic male population in the southwestern border region.

  6. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods lack completeness, operate at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Then, complete testing scenarios are produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  7. Concurrent Validity of the WISC-IV and DAS-II in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Kuriakose, Sarah

    2014-01-01

    Cognitive assessments are used for a variety of research and clinical purposes in children with autism spectrum disorder (ASD). This study establishes concurrent validity of the Wechsler Intelligence Scales for Children-fourth edition (WISC-IV) and Differential Ability Scales-second edition (DAS-II) in a sample of children with ASD with a broad…

  8. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model with insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data in a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit based on insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
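Comparing a simulated inundation map against observations cell by cell reduces to contingency-table metrics; one common choice in flood model validation is the critical success index F = TP / (TP + FP + FN). The small Python sketch below is illustrative only; the masks and names are invented, not the study's data or metrics code.

```python
def contingency(observed, modelled):
    """Cell-by-cell counts for two binary flood masks of equal length."""
    tp = sum(1 for o, m in zip(observed, modelled) if o and m)
    fp = sum(1 for o, m in zip(observed, modelled) if not o and m)
    fn = sum(1 for o, m in zip(observed, modelled) if o and not m)
    return tp, fp, fn

def fit_measure(observed, modelled):
    """Critical success index F = TP / (TP + FP + FN)."""
    tp, fp, fn = contingency(observed, modelled)
    return tp / (tp + fp + fn)

# Invented flattened rasters: 1 = flooded cell (or claim location), 0 = dry.
observed_mask = [1, 1, 1, 0, 0, 0, 1, 0]
modelled_mask = [1, 1, 0, 1, 0, 0, 1, 0]
```

The same formula applies whether the observed mask comes from event documentation or from geocoded insurance claims, which is what makes the two validation datasets directly comparable.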

  9. Validating EHR clinical models using ontology patterns.

    PubMed

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
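The closed-world flavour of this kind of validation, where unexpected fields and violated cardinalities are both reported as errors, can be illustrated without an RDF stack. The sketch below is a plain-Python analogy only: the shape, field names, and record values are hypothetical, and this is neither SHACL syntax nor the API of any SHACL implementation.

```python
# Hypothetical shape: a record must have exactly one terminology-bound
# 'code' and exactly one numeric 'value'; anything else is an error.
shape = {
    "code": {"min": 1, "max": 1, "type": str},
    "value": {"min": 1, "max": 1, "type": (int, float)},
}

def validate(record, shape):
    """Closed-world check: unknown keys and violated constraints are errors."""
    errors = []
    for key in record:
        if key not in shape:
            errors.append(f"unexpected field: {key}")
    for key, rule in shape.items():
        values = record.get(key, [])
        if not rule["min"] <= len(values) <= rule["max"]:
            errors.append(f"{key}: cardinality {len(values)} outside "
                          f"[{rule['min']}, {rule['max']}]")
        for v in values:
            if not isinstance(v, rule["type"]):
                errors.append(f"{key}: {v!r} has wrong datatype")
    return errors

good = {"code": ["8480-6"], "value": [120]}
bad = {"code": [], "value": ["high"], "note": ["free text"]}
```

Under the open-world assumption of OWL, the extra `note` field and the missing `code` would not be flagged; the closed-world reading is what makes shape-based validation suitable for clinical models.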

  10. Validation of urban freeway models.

    DOT National Transportation Integrated Search

    2015-01-01

    This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...

  11. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  12. Measuring health-promoting behaviors: cross-cultural validation of the Health-Promoting Lifestyle Profile-II.

    PubMed

    Sousa, Pedro; Gaspar, Pedro; Vaz, Daniela C; Gonzaga, Sílvia; Dixe, M Anjos

    2015-04-01

    Individual lifestyles have emerged as valuable health constructs. This study aims to psychometrically test the Portuguese (European) version of the Health-Promoting Lifestyle Profile-II. After an adequate linguistic and cultural adaptation of the Health-Promoting Lifestyle Profile-II scale, their psychometric properties were assessed (N = 889) by Cronbach's alpha and confirmatory factor analysis. Results showed an adequate fit to a 52-item, six-factor structure. A global alpha of .925 was obtained. The Portuguese version demonstrated good validity and reliability in a wide adult sample, and can thus be applied to the Portuguese population. This instrument is useful as an evaluation tool for health-promoting lifestyles and as an instrument for testing the effectiveness of health-promoting programs. © 2014 NANDA International, Inc.
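Cronbach's alpha, reported here for the overall scale, is the variance-ratio formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A self-contained Python sketch with made-up item scores (illustrative only, not the study's data):

```python
import statistics

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item (same respondents)."""
    k = len(items)
    item_var = sum(statistics.pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Made-up 5-point responses: 3 items, 4 respondents (illustrative only).
items = [[2, 4, 3, 5],
         [3, 5, 4, 5],
         [2, 5, 3, 4]]
alpha = cronbach_alpha(items)
```

When items move together, the variance of the totals dominates the summed item variances and alpha approaches 1, which is why values such as the .925 reported above indicate high internal consistency.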

  13. Ab initio structural modeling of and experimental validation for Chlamydia trachomatis protein CT296 reveal structural similarity to Fe(II) 2-oxoglutarate-dependent enzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemege, Kyle E.; Hickey, John M.; Lovell, Scott

    2012-02-13

    Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur.

  14. Conformational Analysis of the DFG-Out Kinase Motif and Biochemical Profiling of Structurally Validated Type II Inhibitors

    PubMed Central

    2015-01-01

    Structural coverage of the human kinome has been steadily increasing over time. The structures provide valuable insights into the molecular basis of kinase function and also provide a foundation for understanding the mechanisms of kinase inhibitors. There are a large number of kinase structures in the PDB for which the Asp and Phe of the DFG motif on the activation loop swap positions, resulting in the formation of a new allosteric pocket. We refer to these structures as “classical DFG-out” conformations in order to distinguish them from conformations that have also been referred to as DFG-out in the literature but that do not have a fully formed allosteric pocket. We have completed a structural analysis of almost 200 small molecule inhibitors bound to classical DFG-out conformations; we find that they are recognized by both type I and type II inhibitors. In contrast, we find that nonclassical DFG-out conformations strongly select against type II inhibitors because these structures have not formed a large enough allosteric pocket to accommodate this type of binding mode. In the course of this study we discovered that the number of structurally validated type II inhibitors that can be found in the PDB and that are also represented in publicly available biochemical profiling studies of kinase inhibitors is very small. We have obtained new profiling results for several additional structurally validated type II inhibitors identified through our conformational analysis. Although the available profiling data for type II inhibitors is still much smaller than for type I inhibitors, a comparison of the two data sets supports the conclusion that type II inhibitors are more selective than type I. We comment on the possible contribution of the DFG-in to DFG-out conformational reorganization to the selectivity. PMID:25478866

  15. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of...referent determined either from theory or SME opinion. 4. CGM Overview The CGM is a government-owned, open source, data-driven multi-agent social...HSCB, validation, social network analysis ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  16. Modelling human skull growth: a validated computational model

    PubMed Central

    Marghoub, Arsalan; Johnson, David; Khonsari, Roman H.; Fagan, Michael J.; Moazen, Mehran

    2017-01-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. PMID:28566514

  17. Modelling human skull growth: a validated computational model.

    PubMed

    Libby, Joseph; Marghoub, Arsalan; Johnson, David; Khonsari, Roman H; Fagan, Michael J; Moazen, Mehran

    2017-05-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions ( n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. © 2017 The Author(s).

  18. External validation of preexisting first trimester preeclampsia prediction models.

    PubMed

    Allen, Rebecca E; Zamora, Javier; Arroyo-Manzano, David; Velauthar, Luxmilar; Allotey, John; Thangaratinam, Shakila; Aquilina, Joseph

    2017-10-01

    To validate the increasing number of prognostic models being developed for preeclampsia using our own prospective study. A systematic review of the literature that assessed biomarkers, uterine artery Doppler and maternal characteristics in the first trimester for the prediction of preeclampsia was performed, and models were selected based on predefined criteria. Validation was performed by applying the regression coefficients published in the different derivation studies to our cohort. We assessed the models' discrimination ability and calibration. Twenty models were identified for validation. The discrimination ability observed in the derivation studies (area under the curve, AUC) ranged from 0.70 to 0.96; when these models were validated against the validation cohort, the AUCs varied substantially, ranging from 0.504 to 0.833. Comparing the AUCs obtained in the derivation studies with those in the validation cohort, we found statistically significant differences in several studies. There is currently no definitive prediction model with adequate ability to discriminate for preeclampsia, which performs as well when applied to a different population and can differentiate well between the highest- and lowest-risk groups within the tested population. The large number of pre-existing models limits the value of further model development; future research should focus on further attempts to validate existing models and on assessing whether their implementation improves patient care. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
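    The drop in discrimination reported above can be checked mechanically: given a published model's predicted risks and the observed outcomes in a new cohort, the AUC is a rank statistic. A minimal sketch with invented numbers (not data from the study):

    ```python
    # Hypothetical sketch: AUC of a published prediction model on an
    # external validation cohort. Risks and outcomes below are illustrative.

    def auc(scores, labels):
        """AUC via the Mann-Whitney statistic: the probability that a
        randomly chosen case outranks a randomly chosen non-case."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Toy validation cohort: predicted risks vs. observed outcomes.
    risks   = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
    outcome = [1,   1,   0,   1,   0,   0,   1,   0]
    print(round(auc(risks, outcome), 3))  # → 0.75
    ```

    An AUC near 0.5, as some of the validated models showed, means the model ranks cases no better than chance in the new population.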

  19. Calibration and validation of rockfall models

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation, starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on optimizing the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization.
In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the

  20. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
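    The level-1 validation described above (prognostic index only) can be sketched with a toy example. The coefficients and patients below are invented, and Harrell's C, computed over censoring-admissible pairs, stands in for the discrimination assessment:

    ```python
    # Illustrative sketch (not the paper's code): external validation of a
    # published Cox model using only its coefficients. Each patient's
    # prognostic index (PI) is computed from the published betas, then
    # discrimination is assessed with Harrell's C on the validation sample.

    def prognostic_index(beta, x):
        """PI = sum of published log-hazard-ratio coefficients times covariates."""
        return sum(b * v for b, v in zip(beta, x))

    def harrell_c(pi, time, event):
        """Among usable pairs (the earlier time ends in an event), the
        fraction where the earlier failure carries the higher PI."""
        conc = ties = usable = 0
        n = len(pi)
        for i in range(n):
            for j in range(n):
                if event[i] and time[i] < time[j]:  # i fails before j is at risk no longer
                    usable += 1
                    if pi[i] > pi[j]:
                        conc += 1
                    elif pi[i] == pi[j]:
                        ties += 1
        return (conc + 0.5 * ties) / usable

    beta = [0.7, -0.2]  # hypothetical published coefficients
    # (covariates, survival time, event indicator) for a toy validation cohort
    cohort = [([1, 0], 2.0, 1), ([0, 1], 5.0, 0), ([0, 0], 3.0, 1), ([1, 1], 6.0, 1)]
    pis    = [prognostic_index(beta, x) for x, _, _ in cohort]
    times  = [t for _, t, _ in cohort]
    events = [e for _, _, e in cohort]
    print(round(harrell_c(pis, times, events), 3))  # → 0.8
    ```

    Assessing calibration additionally requires the baseline survival function (level 3 in the authors' scheme), which is why they propose approximating it when it is not published.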

  1. Comparison with CLPX II airborne data using DMRT model

    USGS Publications Warehouse

    Xu, X.; Liang, D.; Andreadis, K.M.; Tsang, L.; Josberger, E.G.

    2009-01-01

    In this paper, we considered a physically based model that uses numerical solutions of Maxwell's equations in three-dimensional simulations within Dense Media Radiative Transfer (DMRT) theory. The model is validated against two datasets from the second Cold Land Processes Experiment (CLPX II) in Alaska and Colorado. The data were all obtained from Ku-band (13.95 GHz) observations using the airborne imaging polarimetric scatterometer (POLSCAT). Snow is a densely packed medium. To take into account both collective scattering and incoherent scattering, the analytical Quasi-Crystalline Approximation (QCA) and the Numerical Maxwell Equation Method in 3-D simulation (NMM3D) are used to calculate the extinction coefficient and phase matrix. The DMRT equations were solved by an iterative solution up to second order for the case of small optical thickness; a full multiple-scattering solution, obtained by decomposing the diffuse intensities into Fourier series, was used when the optical thickness exceeded unity. It was shown that the model predictions agree with the field experiment in both co-polarization and cross-polarization. For the Alaska region, the input snow structure data were obtained from in situ ground observations, while for the Colorado region the snow profile was derived using the VIC model. ©2009 IEEE.

  2. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  3. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  4. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
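    The core idea, scoring candidate models by held-out predictive performance, is independent of the phylogenetic setting. A generic sketch on simulated data, with squared prediction error standing in for the predictive likelihood and two toy models standing in for the clock models (all names and data invented):

    ```python
    # Generic k-fold cross-validation for model selection (not BEAST code):
    # the model with the lowest average held-out error is preferred.

    def kfold_indices(n, k):
        """Yield (train, test) index lists for k contiguous folds."""
        fold = n // k
        for i in range(k):
            test = list(range(i * fold, (i + 1) * fold))
            train = [j for j in range(n) if j not in test]
            yield train, test

    def fit_mean(xs, ys):          # simple model: one constant value
        m = sum(ys) / len(ys)
        return lambda x: m

    def fit_line(xs, ys):          # richer model: least-squares linear trend
        n = len(xs)
        xbar, ybar = sum(xs) / n, sum(ys) / n
        b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
        return lambda x, a=ybar - b * xbar, b=b: a + b * x

    def cv_error(fit, xs, ys, k=5):
        """Mean squared error accumulated over the held-out folds."""
        err, n = 0.0, len(xs)
        for train, test in kfold_indices(n, k):
            model = fit([xs[i] for i in train], [ys[i] for i in train])
            err += sum((ys[i] - model(xs[i])) ** 2 for i in test)
        return err / n

    xs = list(range(20))
    ys = [0.5 * x + ((-1) ** x) * 0.3 for x in xs]   # trend plus alternating perturbation
    scores = {"constant": cv_error(fit_mean, xs, ys),
              "linear": cv_error(fit_line, xs, ys)}
    print(min(scores, key=scores.get))   # the trending model predicts better
    ```

    In the Bayesian setting the held-out score is the log predictive density of the test fold under the posterior fitted to the training folds, but the selection logic is the same.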

  5. Validity of the KABC-II Culture-Language Interpretive Matrix: A Comparison of Native English Speakers and Spanish-Speaking English Language Learners

    ERIC Educational Resources Information Center

    Van Deth, Leah M.

    2013-01-01

    The purpose of the present study was to investigate the validity of the Culture-Language Interpretive Matrix (C-LIM; Flanagan, Ortiz, & Alfonso, 2013) when applied to scores from the Kaufman Assessment Battery for Children, 2nd Edition (KABC-II; Kaufman & Kaufman, 2004). Data were analyzed from the KABC-II standardization sample as well as…

  6. In silico prediction of ROCK II inhibitors by different classification approaches.

    PubMed

    Cai, Chuipu; Wu, Qihui; Luo, Yunxia; Ma, Huili; Shen, Jiangang; Zhang, Yongbin; Yang, Lei; Chen, Yunbo; Wen, Zehuai; Wang, Qi

    2017-11-01

    ROCK II is an important pharmacological target linked to central nervous system disorders such as Alzheimer's disease. The purpose of this research was to generate ROCK II inhibitor prediction models using machine learning approaches. Firstly, four sets of descriptors were calculated with MOE 2010 and PaDEL-Descriptor, and optimized by F-score and linear forward selection methods. In addition, four classification algorithms were used to initially build 16 classifiers: k-nearest neighbors, naïve Bayes, random forest, and support vector machine. Furthermore, three sets of structural fingerprint descriptors were introduced to enhance the predictive capacity of the classifiers, which were assessed with fivefold cross-validation, test set validation and external test set validation. The best two models, MFK + MACCS and MLR + SubFP, both have MCC values of 0.925 for the external test set. After that, a privileged substructure analysis was performed to reveal common chemical features of ROCK II inhibitors. Finally, binding modes were analyzed to identify relationships between molecular descriptors and activity, while main interactions were revealed by comparing the docking interactions of the most potent and the weakest ROCK II inhibitors. To the best of our knowledge, this is the first report on ROCK II inhibitors utilizing machine learning approaches, providing a new method for discovering novel ROCK II inhibitors.
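    The Matthews correlation coefficient (MCC) used above to rank the classifiers can be computed directly from the confusion matrix; a minimal sketch with an invented external test set:

    ```python
    # MCC from a binary confusion matrix. Labels below are toy data:
    # 1 = ROCK II inhibitor, 0 = inactive.
    import math

    def mcc(y_true, y_pred):
        tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
        tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
        fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
        fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return (tp * tn - fp * fn) / denom if denom else 0.0

    y_true = [1, 1, 1, 1, 0, 0, 0, 0]
    y_pred = [1, 1, 1, 0, 0, 0, 0, 1]
    print(round(mcc(y_true, y_pred), 3))  # → 0.5
    ```

    MCC ranges from -1 to +1 and, unlike plain accuracy, stays informative when actives and inactives are imbalanced, which is why it is a common choice for virtual-screening classifiers.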

  7. Validation of mechanical models for reinforced concrete structures: Presentation of the French project ``Benchmark des Poutres de la Rance''

    NASA Astrophysics Data System (ADS)

    L'Hostis, V.; Brunet, C.; Poupard, O.; Petre-Lazar, I.

    2006-11-01

    Several ageing models are available for the prediction of the mechanical consequences of rebar corrosion. They are used for service life prediction of reinforced concrete structures. Concerning corrosion diagnosis of reinforced concrete, some Non-Destructive Testing (NDT) tools have been developed and have been in use for some years. However, these developments require validation on existing concrete structures. The French project “Benchmark des Poutres de la Rance” contributes to this aspect. It has two main objectives: (i) validation of mechanical models to estimate the influence of rebar corrosion on the load-bearing capacity of a structure, and (ii) qualification of the use of NDT results to collect information on steel corrosion within reinforced-concrete structures. Ten French and European institutions from both academic research laboratories and industrial companies contributed during the years 2004 and 2005. This paper presents the project, which was divided into several work packages: (i) the reinforced concrete beams were characterized using non-destructive testing tools, (ii) the mechanical behaviour of the beams was tested experimentally, (iii) complementary laboratory analyses were performed, and (iv) numerical simulation results were compared with the experimental results obtained from the mechanical tests.

  8. Validating the Thinking Styles Inventory-Revised II among Chinese University Students with Hearing Impairment through Test Accommodations

    ERIC Educational Resources Information Center

    Cheng, Sanyin; Zhang, Li-Fang

    2014-01-01

    The present study pioneered in adopting test accommodations to validate the Thinking Styles Inventory-Revised II (TSI-R2; Sternberg, Wagner, & Zhang, 2007) among Chinese university students with hearing impairment. A series of three studies were conducted that drew their samples from the same two universities, in which accommodating test…

  9. A Unified Constitutive Model for Subglacial Till, Part II: Laboratory Tests, Disturbed State Modeling, and Validation for Two Subglacial Tills

    NASA Astrophysics Data System (ADS)

    Desai, C. S.; Sane, S. M.; Jenson, J. W.; Contractor, D. N.; Carlson, A. E.; Clark, P. U.

    2006-12-01

    This presentation, which is complementary to Part I (Jenson et al.), describes the application of the Disturbed State Concept (DSC) constitutive model to define the behavior of the deforming sediment (till) underlying glaciers and ice sheets. The DSC includes elastic, plastic, and creep strains, and microstructural changes leading to degradation, failure, and sometimes strengthening or healing. Here, we describe comprehensive laboratory experiments conducted on samples of two regionally significant tills deposited by the Laurentide Ice Sheet: the Tiskilwa Till and Sky Pilot Till. The tests are used to determine the parameters to calibrate the DSC model, which is validated with respect to the laboratory tests by comparing the predictions with test data used to find the parameters, and also comparing them with independent tests not used to find the parameters. Discussion of the results also includes comparison of the DSC model with the classical Mohr-Coulomb model, which has been commonly used for glacial tills. A numerical procedure based on finite element implementation of the DSC is used to simulate an idealized field problem, and its predictions are discussed. Based on these analyses, the unified DSC model is proposed to provide an improved model for subglacial tills compared to other models used commonly, and thus to provide the potential for improved predictions of ice sheet movements.

  10. PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambers, F W; Raymond, B A; Falabella, S

    To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system where it can read magnet current settings for real time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for the efficient convergence on a usable tune. We show the real time comparisons of simulation and experiment and explore the successes and limitations of this close coupled approach.

  11. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity) and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. There is added value of DES models in complex treatment strategies such as glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
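    The event-queue mechanism underlying any DES model of this kind can be sketched in a few lines. The event names, chaining rules, and delays below are invented and bear no relation to the published model's parameters:

    ```python
    # Bare-bones discrete event simulation: a priority queue of time-stamped
    # events processed in chronological order, where handlers may schedule
    # follow-up events (how disease-progression chains arise in such models).
    import heapq

    def run_des(events, horizon):
        """Process (time, name) events until the horizon is reached."""
        queue = list(events)
        heapq.heapify(queue)
        log = []
        while queue:
            t, name = heapq.heappop(queue)
            if t > horizon:
                break
            log.append((t, name))
            if name == "iop_rise":          # hypothetical: pressure rise triggers a visit
                heapq.heappush(queue, (t + 1.0, "visit"))
            elif name == "visit":           # hypothetical: a visit may trigger a switch
                heapq.heappush(queue, (t + 0.5, "switch_treatment"))
        return log

    trace = run_des([(0.0, "iop_rise"), (4.0, "progression")], horizon=10.0)
    for t, name in trace:
        print(f"t={t:.1f}  {name}")
    ```

    Tracing individual patients through such a queue under extreme settings is exactly the debugging step the authors describe for establishing internal validity.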

  12. Update of the trauma risk adjustment model of the TraumaRegister DGU™: the Revised Injury Severity Classification, version II.

    PubMed

    Lefering, Rolf; Huber-Wagner, Stefan; Nienaber, Ulrike; Maegele, Marc; Bouillon, Bertil

    2014-09-05

    The TraumaRegister DGU™ (TR-DGU) has used the Revised Injury Severity Classification (RISC) score for outcome adjustment since 2003. In recent years, however, the observed mortality rate has fallen to about 2% below the prognosis, and it was felt that further prognostic factors, like pupil size and reaction, should be included as well. Finally, an increasing number of cases did not receive a RISC prognosis due to missing values. Therefore, an updated model for predicting risk of death in severely injured patients needed to be developed and validated using the most recent data. The TR-DGU has been collecting data from severely injured patients since 1993. All injuries are coded according to the Abbreviated Injury Scale (AIS, version 2008). Severely injured patients from Europe (ISS ≥ 4) documented between 2010 and 2011 were selected for developing the new score (n = 30,866), and 21,918 patients from 2012 were used for validation. Age and injury codes were required, and transferred patients were excluded. Logistic regression analysis was applied with hospital mortality as the dependent variable. Results were evaluated in terms of discrimination (area under the receiver operating characteristic curve, AUC), precision (observed versus predicted mortality), and calibration (Hosmer-Lemeshow goodness-of-fit statistic). The mean age of the development population was 47.3 years; 71.6% were males, and the average ISS was 19.3 points. The hospital mortality rate was 11.5% in this group. The new RISC II model consists of the following predictors: worst and second-worst injury (AIS severity level), head injury, age, sex, pupil reactivity and size, pre-injury health status, blood pressure, acidosis (base deficit), coagulation, haemoglobin, and cardiopulmonary resuscitation. Missing values are included as a separate category for every variable. In the development and the validation dataset, the new RISC II outperformed the original RISC score, for example AUC in
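    The calibration check named above, the Hosmer-Lemeshow goodness-of-fit statistic, compares observed with expected deaths within groups ordered by predicted risk. A toy sketch with invented risks, using only two groups instead of the usual ten for brevity:

    ```python
    # Hosmer-Lemeshow-style chi-square: observed vs. expected events in
    # risk-ordered groups. All numbers are illustrative toy data.

    def hosmer_lemeshow(pred, obs, groups=2):
        """Sum over groups of (observed - expected)^2 / expected,
        counted for both events and non-events."""
        order = sorted(range(len(pred)), key=lambda i: pred[i])
        size = len(pred) // groups
        chi2 = 0.0
        for g in range(groups):
            idx = order[g * size:(g + 1) * size]
            exp_events = sum(pred[i] for i in idx)      # expected deaths
            obs_events = sum(obs[i] for i in idx)       # observed deaths
            n = len(idx)
            chi2 += (obs_events - exp_events) ** 2 / exp_events
            chi2 += ((n - obs_events) - (n - exp_events)) ** 2 / (n - exp_events)
        return chi2

    pred = [0.1, 0.2, 0.1, 0.2, 0.7, 0.8, 0.7, 0.8]   # predicted death risks
    obs  = [0,   0,   0,   1,   1,   1,   0,   1]      # observed outcomes
    print(round(hosmer_lemeshow(pred, obs), 3))  # → 0.314
    ```

    A small statistic (relative to the chi-square reference) indicates that predicted and observed mortality agree across the risk spectrum, which is the sense in which RISC II is better calibrated than its predecessor.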

  13. Validating the Copenhagen Psychosocial Questionnaire (COPSOQ-II) Using Set-ESEM: Identifying Psychosocial Risk Factors in a Sample of School Principals

    PubMed Central

    Dicke, Theresa; Marsh, Herbert W.; Riley, Philip; Parker, Philip D.; Guo, Jiesi; Horwood, Marcus

    2018-01-01

    School principals world-wide report high levels of strain and attrition resulting in a shortage of qualified principals. It is thus crucial to identify psychosocial risk factors that reflect principals' occupational wellbeing. For this purpose, we used the Copenhagen Psychosocial Questionnaire (COPSOQ-II), a widely used self-report measure covering multiple psychosocial factors identified by leading occupational stress theories. We evaluated the COPSOQ-II regarding factor structure and longitudinal, discriminant, and convergent validity using latent structural equation modeling in a large sample of Australian school principals (N = 2,049). Results reveal that confirmatory factor analysis produced marginally acceptable model fit. A novel approach we call set exploratory structural equation modeling (set-ESEM), where cross-loadings were only allowed within a priori defined sets of factors, fit well and was more parsimonious than a full ESEM. Further multitrait-multimethod models based on the set-ESEM confirm the importance of a principal's psychosocial risk factors; stressors and depression were related to demands and ill-being, while confidence and autonomy were related to wellbeing. We also show that working in the private sector was beneficial for showing a low psychosocial risk, while other demographics had little effect. Finally, we identify five latent risk profiles (high risk to no risk) of school principals based on all psychosocial factors. Overall, the research presented here closes the theory-application gap of a strong multi-dimensional measure of psychosocial risk factors. PMID:29760670

  15. Validation of SAGE II ozone measurements

    NASA Technical Reports Server (NTRS)

    Cunnold, D. M.; Chu, W. P.; Mccormick, M. P.; Veiga, R. E.; Barnes, R. A.

    1989-01-01

    Five ozone profiles from the Stratospheric Aerosol and Gas Experiment (SAGE) II are compared with coincident ozonesonde measurements obtained at Natal, Brazil, and Wallops Island, Virginia. It is shown that the mean difference between all of the measurements is about 1 percent and that the agreement is within 7 percent at altitudes between 20 and 53 km. Good agreement is also found for ozone mixing ratios on pressure surfaces. It is concluded that the SAGE II profiles provide useful ozone information up to about 60 km altitude.
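
    The profile-comparison statistic described here (a mean percent difference between coincident SAGE II and ozonesonde measurements) can be sketched as follows; the profile values below are invented for illustration and are not taken from the paper.

```python
def mean_percent_difference(instrument, reference):
    """Mean of pointwise percent differences between two coincident profiles."""
    diffs = [100.0 * (a - b) / b for a, b in zip(instrument, reference)]
    return sum(diffs) / len(diffs)

# Hypothetical ozone values at four matched altitudes (arbitrary units).
sage_profile = [10.2, 12.1, 9.8, 7.5]
sonde_profile = [10.0, 12.0, 10.0, 7.4]
bias = mean_percent_difference(sage_profile, sonde_profile)
```

    An agreement criterion such as "within 7 percent" would bound the absolute pointwise difference at each altitude, while the "about 1 percent" figure refers to the mean over all coincidences.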

  16. Validity and reproducibility of HOMA-IR, 1/HOMA-IR, QUICKI and McAuley's indices in patients with hypertension and type II diabetes.

    PubMed

    Sarafidis, P A; Lasaridis, A N; Nilsson, P M; Pikilidou, M I; Stafilas, P C; Kanaki, A; Kazakos, K; Yovos, J; Bakris, G L

    2007-09-01

    The aim of this study was to evaluate the validity and reliability of the homeostasis model assessment-insulin resistance (HOMA-IR) index, its reciprocal (1/HOMA-IR), the quantitative insulin sensitivity check index (QUICKI) and McAuley's index in hypertensive diabetic patients. In 78 patients with hypertension and type II diabetes, glucose, insulin and triglyceride levels were determined after a 12-h fast to calculate these indices, and insulin sensitivity (IS) was measured with the hyperinsulinemic euglycemic clamp technique. Two weeks later, the subjects again had their glucose, insulin and triglyceride levels measured. Simple and multiple linear regression analyses were applied to assess the validity of these indices against clamp-derived IS, and coefficients of variation between the two visits were estimated to assess their reproducibility. The HOMA-IR index was strongly and inversely correlated with the basic clamp IS index, the M-value (r=-0.572, P<0.001), with the M-value normalized by body weight or fat-free mass, and with every other clamp-derived index. The 1/HOMA-IR and QUICKI indices were positively correlated with the M-value (r=0.342, P<0.05 and r=0.456, P<0.01, respectively) and with the remaining clamp indices. McAuley's index generally showed weaker correlations (r=0.317, P<0.05 with the M-value). In multivariate analysis, HOMA-IR was the best fit of clamp-derived IS. Coefficients of variation between the two visits were 23.5% for HOMA-IR, 19.2% for 1/HOMA-IR, 7.8% for QUICKI and 15.1% for McAuley's index. In conclusion, HOMA-IR, 1/HOMA-IR and QUICKI are valid estimates of clamp-derived IS in patients with hypertension and type II diabetes, whereas the validity of McAuley's index needs further evaluation. QUICKI displayed better reproducibility than the other indices.
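
    For orientation, the four surrogate indices compared in this record have closed-form definitions. The formulas below are the standard textbook versions (not quoted from the paper), and units matter: HOMA-IR conventionally uses glucose in mmol/L, QUICKI uses glucose in mg/dL, and McAuley's index uses triglycerides in mmol/L.

```python
import math

def homa_ir(glucose_mmol_l, insulin_uu_ml):
    # Homeostasis model assessment of insulin resistance.
    return glucose_mmol_l * insulin_uu_ml / 22.5

def quicki(glucose_mg_dl, insulin_uu_ml):
    # Quantitative insulin sensitivity check index (log10-based).
    return 1.0 / (math.log10(glucose_mg_dl) + math.log10(insulin_uu_ml))

def mcauley(insulin_uu_ml, triglycerides_mmol_l):
    # McAuley's index (natural-log-based).
    return math.exp(2.63 - 0.28 * math.log(insulin_uu_ml)
                    - 0.31 * math.log(triglycerides_mmol_l))

# Example: fasting glucose 5.5 mmol/L (~99 mg/dL), insulin 10 uU/mL, TG 1.5 mmol/L.
h = homa_ir(5.5, 10.0)
q = quicki(99.0, 10.0)
m = mcauley(10.0, 1.5)
```

    These are fasting surrogates; the hyperinsulinemic euglycemic clamp used in the study remains the reference method they are validated against.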

  17. STUDENT-TEACHER POPULATION GROWTH MODEL--DYNAMOD II.

    ERIC Educational Resources Information Center

    ZABROWSKI, EDWARD K.; AND OTHERS

    DYNAMOD II IS A COMPUTERIZED MARKOVIAN-TYPE FLOW MODEL DEVELOPED TO PROVIDE ESTIMATES OF THE EDUCATIONAL POPULATION OF STUDENTS AND TEACHERS OVER SELECTED INTERVALS OF TIME. THE POPULATION IS CROSS-CLASSIFIED INTO 108 GROUPS BY SEX, RACE, AGE, AND EDUCATIONAL CATEGORY. THIS NOTE DESCRIBES THE METHODOLOGY USED IN DYNAMOD II, COMPARES DYNAMOD II…

  18. Assessing youth who sexually offended: the predictive validity of the ERASOR, J-SOAP-II, and YLS/CMI in a non-Western context.

    PubMed

    Chu, Chi Meng; Ng, Kynaston; Fong, June; Teoh, Jennifer

    2012-04-01

    Recent research suggested that the predictive validity of adult sexual offender risk assessment measures can be affected when used cross-culturally, but there is no published study on the predictive validity of risk assessment measures for youth who sexually offended in a non-Western context. This study compared the predictive validity of three youth risk assessment measures (i.e., the Estimate of Risk of Adolescent Sexual Offense Recidivism [ERASOR], the Juvenile Sex Offender Assessment Protocol-II [J-SOAP-II], and the Youth Level of Service/Case Management Inventory [YLS/CMI]) for sexual and nonviolent recidivism in a sample of 104 male youth who sexually offended within a Singaporean context (M follow-up = 1,637 days; SD follow-up = 491). Results showed that the ERASOR overall clinical rating and total score significantly predicted sexual recidivism but only the former significantly predicted time to sexual reoffense. All of the measures (i.e., the ERASOR overall clinical rating and total score, the J-SOAP-II total score, as well as the YLS/CMI) significantly predicted nonsexual recidivism and time to nonsexual reoffense for this sample of youth who sexually offended. Overall, the results suggest that the ERASOR appears to be suited for assessing youth who sexually offended in a non-Western context, but the J-SOAP-II and the YLS/CMI have limited utility for such a purpose.
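
    Predictive validity for risk measures like these is typically summarized as an AUC: the probability that a randomly chosen recidivist receives a higher risk score than a randomly chosen non-recidivist. A minimal rank-based sketch, with invented scores rather than data from the study:

```python
def auc(recidivist_scores, nonrecidivist_scores):
    """Rank-based AUC: P(recidivist score > non-recidivist score); ties count 0.5."""
    wins = 0.0
    for r in recidivist_scores:
        for n in nonrecidivist_scores:
            if r > n:
                wins += 1.0
            elif r == n:
                wins += 0.5
    return wins / (len(recidivist_scores) * len(nonrecidivist_scores))

# Hypothetical total scores; an AUC of 0.5 is chance-level prediction.
example_auc = auc([41, 35, 28], [30, 22, 18])
```

    The same statistic applies whether the predictor is a total score or an overall clinical rating, which is how instruments with different scales can be compared on one footing.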

  19. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  20. A Model Structure for the Heterodimer apoA-IMilano–apoA-II Supports Its Peculiar Susceptibility to Proteolysis

    PubMed Central

    Rocco, Alessandro Guerini; Mollica, Luca; Gianazza, Elisabetta; Calabresi, Laura; Franceschini, Guido; Sirtori, Cesare R.; Eberini, Ivano

    2006-01-01

    In this study, we propose a structure for the heterodimer between apolipoprotein A-IMilano and apolipoprotein A-II (apoA-IM–apoA-II) in a synthetic high-density lipoprotein (HDL) containing L-α-palmitoyloleoyl phosphatidylcholine. We applied bioinformatics and computational tools and procedures, such as molecular docking and molecular and essential dynamics, starting from published crystal structures of apolipoprotein A-I and apolipoprotein A-II. Structural and energetic analyses of the simulated system showed that the molecular dynamics produced a stabilized synthetic HDL. The essential dynamics analysis showed a deviation from the starting belt structure. Our structural results were validated by limited proteolysis experiments on HDL from apoA-IM carriers in comparison with control HDL. The high sensitivity of apoA-IM–apoA-II to proteases was in agreement with the high root mean-square fluctuation values and the reduction in secondary structure content from the molecular dynamics data. Circular dichroism on synthetic HDL containing apoA-IM–apoA-II was consistent with the α-helix content computed for the proposed model. PMID:16891368

  1. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982, "Development of a Conservative Model Validation Approach for Reliable Analysis." ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account ...

  2. Empirical Modeling of the Redshift Evolution of the [N II]/Hα Ratio for Galaxy Redshift Surveys

    NASA Astrophysics Data System (ADS)

    Faisst, Andreas L.; Masters, Daniel; Wang, Yun; Merson, Alexander; Capak, Peter; Malhotra, Sangeeta; Rhoads, James E.

    2018-03-01

    We present an empirical parameterization of the [N II]/Hα flux ratio as a function of stellar mass and redshift, valid at 0 < z < 2.7 and 8.5 < log(M/M⊙) < 11.0. This description can (i) easily be applied to simulations for modeling [N II]λ6584 line emission, (ii) deblend [N II] and Hα in current low-resolution grism and narrow-band observations to derive intrinsic Hα fluxes, and (iii) reliably forecast the number counts of Hα emission-line galaxies for future surveys, such as those planned for Euclid and the Wide Field Infrared Survey Telescope (WFIRST). Our model combines the evolution of the locus on the Baldwin, Phillips & Terlevich (BPT) diagram measured in spectroscopic data out to z ∼ 2.5 with the strong dependence of [N II]/Hα on stellar mass and [O III]/Hβ observed in local galaxy samples. We find large variations in the [N II]/Hα flux ratio at a fixed redshift due to its dependence on stellar mass; hence, the assumption of a constant [N II] flux contamination fraction can lead to a significant under- or overestimate of Hα luminosities. Specifically, measurements of the intrinsic Hα luminosity function derived from current low-resolution grism spectroscopy assuming a constant 29% contamination by [N II] can be overestimated by factors of ∼8 at log(L) > 43.0 for galaxies at redshifts z ∼ 1.5. This has implications for the prediction of Hα emitters for Euclid and WFIRST. We also study the impact of blended Hα and [N II] on the accuracy of measured spectroscopic redshifts.
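
    The deblending arithmetic behind these statements is simple. A sketch of both corrections, contrasting a mass- and redshift-dependent line ratio with the constant contamination fraction the authors criticize (the flux and ratio values are illustrative only, not from the paper's parameterization):

```python
def halpha_from_ratio(blended_flux, r):
    """Intrinsic Ha from a blended Ha+[N II] flux given r = [N II]/Ha.

    blended = Ha * (1 + r)  =>  Ha = blended / (1 + r)
    """
    return blended_flux / (1.0 + r)

def halpha_constant_fraction(blended_flux, f_nii=0.29):
    """Constant-contamination assumption: [N II] carries a fixed fraction of the blend."""
    return blended_flux * (1.0 - f_nii)

# A massive galaxy (high r) vs. a low-mass galaxy (low r), same blended flux:
flux = 1.0e-16  # erg/s/cm^2, hypothetical
ha_massive = halpha_from_ratio(flux, r=0.5)
ha_dwarf = halpha_from_ratio(flux, r=0.05)
```

    Because r varies strongly with stellar mass, applying the constant-fraction correction to every galaxy systematically biases the recovered Hα luminosities, which is the source of the factor-of-∼8 overestimate quoted above.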

  3. Capitalizing on Citizen Science Data for Validating Models and Generating Hypotheses Describing Meteorological Drivers of Mosquito-Borne Disease Risk

    NASA Astrophysics Data System (ADS)

    Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.

    2017-12-01

    Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship between mosquito species and regions.

  4. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in chemometrics and econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability than the LOO.
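
    The contrast between LOO and MCCV can be illustrated with a toy model: a plain least-squares line fit standing in for the paper's GLSR (a deliberate simplification), with invented site data. LOO leaves out each observation exactly once; MCCV repeatedly leaves out a random fraction.

```python
import random

def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def sse_on_holdout(xs, ys, test_idx):
    # Fit on the complement of test_idx, return squared error on test_idx.
    train = [i for i in range(len(xs)) if i not in test_idx]
    a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
    return sum((ys[i] - (a + b * xs[i])) ** 2 for i in test_idx)

def loo_mse(xs, ys):
    # Leave-one-out: each observation held out once.
    return sum(sse_on_holdout(xs, ys, {i}) for i in range(len(xs))) / len(xs)

def mccv_mse(xs, ys, n_splits=200, leave_frac=0.3, seed=1):
    # Monte Carlo CV: repeated random holdout of a fixed fraction.
    rng = random.Random(seed)
    k = max(1, int(leave_frac * len(xs)))
    total = 0.0
    for _ in range(n_splits):
        test = set(rng.sample(range(len(xs)), k))
        total += sse_on_holdout(xs, ys, test) / k
    return total / n_splits
```

    For a well-specified model the two estimates agree; MCCV's larger holdout sets penalize overfitted models more heavily, which is consistent with the more parsimonious selections reported above.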

  5. Nonlinear convective pulsation models of type II Cepheids

    NASA Astrophysics Data System (ADS)

    Smolec, Radoslaw

    2015-08-01

    We present a grid of nonlinear convective pulsation models of type-II Cepheids: BL Her stars, W Vir stars and RV Tau stars. The models cover a wide range of masses, luminosities, effective temperatures and chemical compositions. The most interesting result is the detection of deterministic chaos in the models. Different routes to chaos are detected (period doubling, intermittent route), as well as a variety of phenomena intrinsic to chaotic dynamics (periodic islands within chaotic bands, crisis bifurcation, type-I and type-III intermittency). Some of these phenomena (period doubling in BL Her and RV Tau stars, irregular pulsation of RV Tau stars) are well known in the pulsation of type-II Cepheids; prospects of discovering the others are briefly discussed. The transition from BL Her-type pulsation through W Vir-type to RV Tau-type pulsation is analysed. In the most luminous models a dynamical instability is detected, which indicates that pulsation-driven mass loss is an important process in type-II Cepheids.

  6. Model Validation | Center for Cancer Research

    Cancer.gov

    Research Investigation and Animal Model Validation This activity is also under development and thus far has included increasing pathology resources, delivering pathology services, as well as using imaging and surgical methods to develop and refine animal models in collaboration with other CCR investigators.

  7. Quantitative model validation of manipulative robot systems

    NASA Astrophysics Data System (ADS)

    Kartowisastro, Iman Herwidiana

    This thesis is concerned with applying the distortion quantitative validation technique to a robot manipulator system with revolute joints. Using the distortion technique to validate a model quantitatively, the model parameter uncertainties are taken into account in assessing the faithfulness of the model, and this approach is more objective than the common visual comparison method. The industrial robot is represented by the TQ MA2000 robot arm. Details of the mathematical derivation of the distortion technique are given, which explain the required distortion of the constant parameters within the model and the assessment of model adequacy. Due to the complexity of a robot model, only the first three degrees of freedom are considered, and all links are assumed rigid. The modelling uses the Newton-Euler approach to obtain the dynamics model, and the Denavit-Hartenberg convention is used throughout the work. A conventional feedback control system is used in developing the model. The system's sensitivity to parameter changes is investigated, as some parameters are redundant. This analysis makes it possible to select the most important parameters to distort, leading to a new term: the fundamental parameters. The transfer function approach was chosen to validate an industrial robot quantitatively against measured data due to its practicality. Initially, the assessment of the model fidelity criterion indicated that the model was not capable of explaining the transient record in terms of the model parameter uncertainties. Further investigations led to significant improvements of the model and a better understanding of its properties. After several improvements, the fidelity criterion was almost satisfied. Although the fidelity criterion is slightly less than unity, it has been shown that the distortion technique can be applied to a robot manipulator system. Using the validated model, the importance of

  8. Economic analysis of model validation for a challenge problem

    DOE PAGES

    Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.

    2016-02-19

    It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only, or no testing and no modeling, may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than a testing-only or no-modeling-and-no-testing option.
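
    The benefit–cost comparison the paper formalizes can be caricatured as an expected-value calculation over the three options. All figures below are hypothetical placeholders, not values from the challenge problem.

```python
def expected_net_benefit(benefit, direct_cost, failure_prob, failure_loss):
    """Expected net benefit of a development option (illustrative figures only)."""
    return benefit - direct_cost - failure_prob * failure_loss

# Hypothetical numbers: validation costs more up front but lowers failure risk.
options = {
    "model + validation": expected_net_benefit(100.0, 30.0, 0.05, 200.0),
    "testing only":       expected_net_benefit(100.0, 45.0, 0.10, 200.0),
    "no model, no test":  expected_net_benefit(100.0,  0.0, 0.40, 200.0),
}
best = max(options, key=options.get)
```

    With these placeholder numbers the validated model wins, but flipping the failure probabilities or losses can flip the ranking, which is exactly the situation-dependence the paper's framework is meant to quantify.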

  9. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  10. A colored petri nets based workload evaluation model and its validation through Multi-Attribute Task Battery-II.

    PubMed

    Wang, Peng; Fang, Weining; Guo, Beiyuan

    2017-04-01

    This paper proposes a colored Petri net based workload evaluation model. A formal interpretation of workload is first introduced, based on the mapping of Petri net components to task elements. A Petri net based description of Multiple Resources theory is given by approaching it from a new angle. A new application of the VACP rating scales, named the V/A-C-P unit, and a definition of colored transitions are proposed to build a model of the task process. The calculation of workload has four main steps: determine the tokens' initial positions and values; calculate the weights of the directed arcs on the basis of the proposed rules; calculate workload from the different transitions; and correct for the influence of repetitive behaviors. Verification experiments were carried out using the Multi-Attribute Task Battery-II software. Our results show a strong correlation between the model values and NASA Task Load Index scores (r=0.9513). In addition, the method can also distinguish behavior characteristics between different people. Copyright © 2016 Elsevier Ltd. All rights reserved.
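
    The validation statistic quoted here is a Pearson correlation between model outputs and subjective NASA-TLX ratings. For reference, the coefficient can be computed as:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

    Values near ±1 indicate a strong linear relationship; an r of 0.9513 means the model values track the subjective ratings closely, though correlation alone does not show the workload estimates are calibrated in absolute terms.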

  11. An investigation of the factor structure of the beck depression inventory-II in anorexia nervosa.

    PubMed

    Fuss, Samantha; Trottier, Kathryn; Carter, Jacqueline

    2015-01-01

    Symptoms of depression frequently co-occur with eating disorders and have been associated with negative outcomes. Self-report measures such as the Beck Depression Inventory-II (BDI-II) are commonly used to assess for the presence of depressive symptoms in eating disorders, but the instrument's factor structure in this population has not been examined. The purposes of this study were to explore the factor structure of the BDI-II in a sample of individuals (N = 437) with anorexia nervosa undergoing inpatient treatment and to examine changes in depressive symptoms on each of the identified factors following a course of treatment for anorexia nervosa in order to provide evidence supporting the construct validity of the measure. Exploratory factor analysis revealed that a three-factor model reflected the best fit for the data. Confirmatory factor analysis was used to validate this model against competing models and the three-factor model exhibited strong model fit characteristics. BDI-II scores were significantly reduced on all three factors following inpatient treatment, which supported the construct validity of the scale. The BDI-II appears to be reliable in this population, and the factor structure identified through this analysis may offer predictive utility for identifying individuals who may have more difficulty achieving weight restoration in the context of inpatient treatment. Copyright © 2014 John Wiley & Sons, Ltd and Eating Disorders Association.

  12. Identification of age-dependent motor and neuropsychological behavioural abnormalities in a mouse model of Mucopolysaccharidosis Type II

    PubMed Central

    Gleitz, Hélène F. E.; O’Leary, Claire; Holley, Rebecca J.

    2017-01-01

    Severe mucopolysaccharidosis type II (MPS II) is a progressive lysosomal storage disease caused by mutations in the IDS gene, leading to a deficiency in the iduronate-2-sulfatase enzyme that is involved in heparan sulphate and dermatan sulphate catabolism. In constitutive form, MPS II is a multi-system disease characterised by progressive neurocognitive decline, severe skeletal abnormalities and hepatosplenomegaly. Although enzyme replacement therapy has been approved for treatment of peripheral organs, no therapy effectively treats the cognitive symptoms of the disease and novel therapies are in development to remediate this. Therapeutic efficacy and subsequent validation can be assessed using a variety of outcome measures that are translatable to clinical practice, such as behavioural measures. We sought to consolidate current knowledge of the cognitive, skeletal and motor abnormalities present in the MPS II mouse model by performing time course behavioural examinations of working memory, anxiety, activity levels, sociability and coordination and balance, up to 8 months of age. Cognitive decline associated with alterations in spatial working memory is detectable at 8 months of age in MPS II mice using spontaneous alternation, together with an altered response to novel environments and anxiolytic behaviour in the open-field. Coordination and balance on the accelerating rotarod were also significantly worse at 8 months, and may be associated with skeletal changes seen in MPS II mice. We demonstrate that the progressive nature of MPS II disease is also seen in the mouse model, and that cognitive and motor differences are detectable at 8 months of age using spontaneous alternation, the accelerating rotarod and the open-field tests. This study establishes neurological, motor and skeletal measures for use in pre-clinical studies to develop therapeutic approaches in MPS II. PMID:28207863

  13. Real-time remote scientific model validation

    NASA Technical Reports Server (NTRS)

    Frainier, Richard; Groleau, Nicolas

    1994-01-01

    This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment to an investigator's preflight model. The comparison, performed in real-time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team is able to react to experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of twelve preflight hypotheses associated with the investigator's model, seven were confirmed and five were rejected or compromised.

  14. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  15. Imidazole derivatives as angiotensin II AT1 receptor blockers: Benchmarks, drug-like calculations and quantitative structure-activity relationships modeling

    NASA Astrophysics Data System (ADS)

    Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi

    2018-03-01

    We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post-Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. We then examined the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were performed on these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to derive the relationships between the molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.
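
    The QSAR step pairs each molecule's descriptor vector with its measured activity and fits a multiple linear regression. A self-contained sketch of that procedure (ordinary least squares via the normal equations; the descriptor and activity values are invented, not the paper's data):

```python
def mlr_fit(X, y):
    """OLS coefficients [intercept, w1, w2, ...] via normal equations.

    X: list of descriptor rows; an intercept column is prepended.
    Solved with Gaussian elimination and partial pivoting.
    """
    A = [[1.0] + list(row) for row in X]
    n, p = len(A), len(A[0])
    G = [[sum(A[i][j] * A[i][k] for i in range(n)) for k in range(p)]
         for j in range(p)]
    b = [sum(A[i][j] * y[i] for i in range(n)) for j in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(G[r][col]))
        G[col], G[piv] = G[piv], G[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = G[r][col] / G[col][col]
            for k in range(col, p):
                G[r][k] -= f * G[col][k]
            b[r] -= f * b[col]
    w = [0.0] * p
    for r in range(p - 1, -1, -1):
        w[r] = (b[r] - sum(G[r][k] * w[k] for k in range(r + 1, p))) / G[r][r]
    return w

# Hypothetical descriptors (e.g., logP, dipole moment) vs. activity.
X = [[1.2, 3.1], [2.0, 2.5], [2.9, 4.0], [3.5, 1.8]]
y = [5.1, 5.9, 7.4, 7.0]
coeffs = mlr_fit(X, y)  # [intercept, w_logP, w_dipole]
```

    In practice a QSAR fit of this kind is judged by r², cross-validated q², and the ratio of compounds to descriptors, to guard against chance correlation.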

  16. Absolute, pressure-dependent validation of a calibration-free, airborne laser hygrometer transfer standard (SEALDH-II) from 5 to 1200 ppmv using a metrological humidity generator

    NASA Astrophysics Data System (ADS)

    Buchholz, Bernhard; Ebert, Volker

    2018-01-01

    Highly accurate water vapor measurements are indispensable for understanding a variety of scientific questions as well as industrial processes. While in metrology water vapor concentrations can be defined, generated, and measured with relative uncertainties in the single-percent range, field-deployable airborne instruments deviate by up to 10-20% even under quasistatic laboratory conditions. The novel SEALDH-II hygrometer, a calibration-free tuneable diode laser spectrometer, bridges this gap by implementing a new holistic concept to achieve higher accuracy levels in the field. We present in this paper the absolute validation of SEALDH-II at a traceable humidity generator during 23 days of permanent operation at 15 different H2O mole fraction levels between 5 and 1200 ppmv. At each mole fraction level, we studied the pressure dependence at six different gas pressures between 65 and 950 hPa. Further, we describe the setup for this metrological validation, the challenges to overcome when assessing water vapor measurements at a high accuracy level, and the comparison results. With this validation, SEALDH-II is the first airborne, metrologically validated humidity transfer standard linking several scientific airborne and laboratory measurement campaigns to the international metrological water vapor scale.

  17. EXODUS II: A finite element data model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoof, L.A.; Yarberry, V.R.

    1994-09-01

    EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).

  18. Validation of Model Forecasts of the Ambient Solar Wind

    NASA Technical Reports Server (NTRS)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD-based models will supplant semi-empirical potential-based models such as the WSA model as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  19. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    PubMed

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including an assessment of whether this validation targets the transferability or the reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it suggests a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models [area under the receiver operating curve (AUC) of the cohort differences model: 0.85], signaling that this validation leans towards transferability. Two of the three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training and validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
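    The cohort-differences idea above can be illustrated with a hedged sketch: train any classifier to distinguish training-cohort from validation-cohort patients, then measure its c-statistic (AUC). This is not the authors' implementation; the scores, labels, and the 0.7 decision threshold below are illustrative assumptions, and the c-statistic is computed by exhaustive pairwise comparison.

    ```python
    def c_statistic(scores, labels):
        """Concordance (AUC): the probability that a randomly chosen
        positive case receives a higher score than a randomly chosen
        negative case. Ties count as 0.5."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        if not pos or not neg:
            raise ValueError("need both classes")
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # labels: 1 = validation cohort, 0 = training cohort (toy data);
    # scores: output of any classifier trained to tell the cohorts apart
    scores = [0.9, 0.8, 0.75, 0.4, 0.3, 0.2]
    labels = [1,   1,   1,    0,   0,   0]
    auc = c_statistic(scores, labels)
    # AUC near 0.5 -> cohorts look alike, so the study validates
    # reproducibility; a high AUC (e.g. the 0.85 reported) -> transferability.
    # The 0.7 cut-off below is an arbitrary illustrative choice.
    interpretation = "transferability" if auc >= 0.7 else "reproducibility"
    ```

    On this toy data the cohorts separate perfectly, so the verdict is "transferability".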

  20. Single-arm phase II trial design under parametric cure models.

    PubMed

    Wu, Jianrong

    2015-01-01

    The current practice of designing single-arm phase II survival trials is limited under the exponential model. Trial design under the exponential model may not be appropriate when a portion of patients are cured. There is no literature available for designing single-arm phase II trials under the parametric cure model. In this paper, a test statistic is proposed, and a sample size formula is derived for designing single-arm phase II trials under a class of parametric cure models. Extensive simulations showed that the proposed test and sample size formula perform very well under different scenarios. Copyright © 2015 John Wiley & Sons, Ltd.
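    The motivation above, that an exponential model misrepresents survival when a fraction of patients is cured, can be sketched with a generic two-parameter mixture cure model (cure fraction plus exponential latency). This is not the specific parametric class of the paper, and the parameter values are illustrative.

    ```python
    import math

    def cure_model_survival(t, cure_rate, hazard):
        """Mixture cure model: a fraction `cure_rate` of patients is cured
        (never experiences the event); the rest follow exponential survival
        S0(t) = exp(-hazard * t). Overall survival therefore plateaus at
        the cure rate instead of decaying to zero."""
        return cure_rate + (1.0 - cure_rate) * math.exp(-hazard * t)

    # A pure exponential model (cure_rate = 0) drives survival to ~0,
    # which misrepresents a population with 30% cured patients:
    s_exp  = cure_model_survival(10.0, 0.0, 0.5)
    s_cure = cure_model_survival(10.0, 0.3, 0.5)
    ```

    At t = 10 the exponential model predicts under 1% survival while the cure model correctly plateaus just above the 30% cure fraction, which is why sample-size formulas derived under the exponential model can be inappropriate here.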

  1. Measurements of Humidity in the Atmosphere and Validation Experiments (Mohave, Mohave II): Results Overview

    NASA Technical Reports Server (NTRS)

    Leblanc, Thierry; McDermid, Iain S.; McGee, Thomas G.; Twigg, Laurence W.; Sumnicht, Grant K.; Whiteman, David N.; Rush, Kurt D.; Cadirola, Martin P.; Venable, Demetrius D.; Connell, R.; hide

    2008-01-01

    The Measurements of Humidity in the Atmosphere and Validation Experiments (MOHAVE, MOHAVE-II) inter-comparison campaigns took place at the Jet Propulsion Laboratory (JPL) Table Mountain Facility (TMF, 34.5°N) in October 2006 and 2007 respectively. Both campaigns aimed at evaluating the capability of three Raman lidars for the measurement of water vapor in the upper troposphere and lower stratosphere (UT/LS). During each campaign, more than 200 hours of lidar measurements were compared to balloon-borne measurements obtained from 10 Cryogenic Frost-point Hygrometer (CFH) flights and over 50 Vaisala RS92 radiosonde flights. During MOHAVE, fluorescence in all three lidar receivers was identified, causing a significant wet bias above 10-12 km in the lidar profiles as compared to the CFH. All three lidars were reconfigured after MOHAVE, and no such bias was observed during the MOHAVE-II campaign. The lidar profiles agreed very well with the CFH up to 13-17 km altitude, where the lidar measurements become noise-limited. The results from MOHAVE-II have shown that the water vapor Raman lidar will be an appropriate technique for the long-term monitoring of water vapor in the UT/LS given a slight increase in its power-aperture, as well as careful calibration.

  2. Assessing Discriminative Performance at External Validation of Clinical Prediction Models

    PubMed Central

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.

    2016-01-01

    Introduction External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. Methods We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  3. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
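    The benchmark that behaved correctly in both scenarios above, the standard deviation of the linear predictor, is straightforward to compute. A minimal sketch, assuming a logistic model with made-up coefficients and toy development/validation samples:

    ```python
    import statistics

    def linear_predictor(x, coefs, intercept):
        """Logistic-model linear predictor: intercept + sum(b_i * x_i)."""
        return intercept + sum(b * xi for b, xi in zip(coefs, x))

    def case_mix_sd(X, coefs, intercept):
        """SD of the linear predictor, a proxy for case-mix heterogeneity.
        A smaller SD in the validation set predicts a lower c-statistic
        even when the regression coefficients are perfectly correct."""
        lps = [linear_predictor(x, coefs, intercept) for x in X]
        return statistics.stdev(lps)

    # Illustrative model and data (not from the study):
    coefs, intercept = [1.2, -0.8], -0.5
    dev = [[2.0, 1.0], [0.1, 3.0], [4.0, 0.0], [1.0, 2.0]]   # broad case-mix
    val = [[1.9, 1.1], [2.0, 1.0], [2.1, 0.9], [2.0, 1.1]]   # narrow case-mix
    sd_dev = case_mix_sd(dev, coefs, intercept)
    sd_val = case_mix_sd(val, coefs, intercept)
    ```

    Here sd_val is much smaller than sd_dev, so a drop in c-statistic at validation should be attributed to the narrower case-mix rather than to incorrect coefficients.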

  4. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
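    The core splitting step of repeated V-fold cross-validation described above can be sketched as follows. This is a generic illustration, not the authors' algorithm; assigning folds by strided slicing of a shuffled index vector is one simple choice among several.

    ```python
    import random

    def repeated_vfold_indices(n, v, repeats, seed=0):
        """Yield (train_idx, test_idx) pairs for `repeats` independent
        V-fold partitions of n samples. Repeating the split and averaging
        the resulting performance estimates reduces the variance caused
        by any single random choice of partition."""
        rng = random.Random(seed)
        for _ in range(repeats):
            idx = list(range(n))
            rng.shuffle(idx)
            folds = [idx[k::v] for k in range(v)]
            for k in range(v):
                test = folds[k]
                train = [i for j, f in enumerate(folds) if j != k for i in f]
                yield train, test

    splits = list(repeated_vfold_indices(n=10, v=5, repeats=3))
    # 3 repeats x 5 folds = 15 train/test pairs, each test fold of size 2
    ```

    For model assessment, the same generator would be nested: an outer loop estimates prediction error while an inner loop over the training portion tunes parameters, exactly the repeated nested scheme the paper advocates.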

  5. A momentum source model for wire-wrapped rod bundles—Concept, validation, and application

    DOE PAGES

    Hu, Rui; Fanning, Thomas H.

    2013-06-19

    Large uncertainties still exist in the treatment of wire spacers and drag models for momentum transfer in current lumped-parameter models. Here, to improve the hydraulic modeling of wire-wrap spacers in a rod bundle, a three-dimensional momentum source model (MSM) has been developed to model the anisotropic flow without the need to resolve the geometric details of the wire-wraps. The MSM is examined in steady-state simulations of 7-pin and 37-pin bundles using the commercial CFD code STAR-CCM+. The calculated steady-state inter-subchannel cross-flow velocities match very well in comparisons between bare bundles with the MSM applied and wire-wrapped bundles with explicit geometry. The validity of the model is further verified by mesh and parameter sensitivity studies. Furthermore, the MSM is applied to a 61-pin EBR-II experimental subassembly for both steady-state and PLOF transient simulations. Reasonably accurate predictions of temperature, pressure, and fluid flow velocities have been achieved using the MSM for both steady-state and transient conditions. Significant computing resources are saved with the MSM since it can be used on a much coarser computational mesh.

  6. Validation of Magnetospheric Magnetohydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used both for prediction and for modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis, applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The techniques of parameter variability/sensitivity analysis and model intercomparison were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare, including: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar

  7. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  8. Validity of empirical models of exposure in asphalt paving

    PubMed Central

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed a similar to expected effect of re-paving and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

  9. Time-domain simulation of damped impacted plates. II. Numerical model and results.

    PubMed

    Lambourg, C; Chaigne, A; Matignon, D

    2001-04-01

    A time-domain model for the flexural vibrations of damped plates was presented in a companion paper [Part I, J. Acoust. Soc. Am. 109, 1422-1432 (2001)]. In this paper (Part II), the damped-plate model is extended to impact excitation, using Hertz's law of contact, and is solved numerically in order to synthesize sounds. The numerical method is based on the use of a finite-difference scheme of second order in time and fourth order in space. As a consequence of the damping terms, the stability and dispersion properties of this scheme are modified, compared to the undamped case. The numerical model is used for the time-domain simulation of vibrations and sounds produced by impact on isotropic and orthotropic plates made of various materials (aluminum, glass, carbon fiber and wood). The efficiency of the method is validated by comparisons with analytical and experimental data. The sounds produced show a high degree of similarity with real sounds and allow a clear recognition of each constitutive material of the plate without ambiguity.
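    To illustrate the kind of discretization described above (second order in time, fourth order in space), here is the standard fourth-order central stencil for a second spatial derivative, reduced to 1-D. The full plate scheme discretizes a biharmonic operator with damping terms; this sketch only shows the spatial-accuracy idea and is not the authors' scheme.

    ```python
    def second_derivative_4th(f, h):
        """Fourth-order-accurate central finite-difference approximation
        of f'' on a uniform grid of spacing h. The stencil
        (-1, 16, -30, 16, -1)/(12 h^2) needs two neighbours on each side,
        so the two points at each boundary are omitted here."""
        return [(-f[i-2] + 16*f[i-1] - 30*f[i] + 16*f[i+1] - f[i+2]) / (12*h*h)
                for i in range(2, len(f) - 2)]

    # The stencil is exact for low-order polynomials: f(x) = x^2 has f'' = 2.
    h = 0.1
    xs = [i * h for i in range(11)]
    f = [x * x for x in xs]
    d2 = second_derivative_4th(f, h)
    ```

    Pairing such a spatial stencil with a second-order leapfrog step in time gives the explicit scheme family the paper builds on; as the abstract notes, adding damping terms then changes the stability and dispersion analysis relative to the undamped case.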

  10. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that operate at large scales and exhibit dynamic, complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable, scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles, concepts and applications of ABM, the validation techniques and challenges of ABM validation are discussed.

  11. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
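    The maximin estimation criterion described above, choosing the parameter realization whose smallest requirement-compliance margin is largest, can be sketched with a toy grid search. The margin functions here are hypothetical stand-ins for the time- and frequency-domain requirements; they are not from the paper.

    ```python
    def maximin_estimate(candidates, margin_fns):
        """Return the candidate parameter whose *smallest* compliance
        margin across all requirements is largest. By convention a
        positive margin means the requirement is met."""
        best, best_margin = None, float("-inf")
        for theta in candidates:
            worst = min(fn(theta) for fn in margin_fns)
            if worst > best_margin:
                best, best_margin = theta, worst
        return best, best_margin

    # Hypothetical margins: requirement 1 prefers small theta,
    # requirement 2 prefers large theta, so the maximin point balances them.
    m1 = lambda t: 1.0 - t      # satisfied while t < 1.0
    m2 = lambda t: t - 0.2      # satisfied while t > 0.2
    theta, margin = maximin_estimate([i / 10 for i in range(11)], [m1, m2])
    ```

    The set of candidates whose worst margin stays positive is exactly the set of model parameters "yielding predictions that comply with all the requirements" that the abstract proposes to bound.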

  12. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere and of their updates and upgrades grows steadily, as does the number and character of the model inputs. Maintaining up to date validation of these models, in the face of this constant model evolution, is a necessary but very labor intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results of L1 wind representation. I will discuss the semi-automated web based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  13. Predicting the success of IVF: external validation of the van Loendersloot's model.

    PubMed

    Sarais, Veronica; Reschini, Marco; Busnelli, Andrea; Biancardi, Rossella; Paffoni, Alessio; Somigliana, Edgardo

    2016-06-01

    Is the predictive model for IVF success proposed by van Loendersloot et al. valid in a different geographical and cultural context? The model discriminates well but was less accurate than in the original context where it was developed. Several independent groups have developed models that combine different variables with the aim of estimating the chance of pregnancy with IVF, but only four of them have been externally validated. One of these four, the van Loendersloot model, deserves particular attention and further investigation for at least three reasons: (i) the reported area under the receiver operating characteristics curve (c-statistic) in the temporal validation setting was the highest reported to date (0.68); (ii) the perspective of the model is clinically sensible, since it includes variables obtained from previous failed cycles, if any, so it can be applied to any woman entering an IVF cycle; (iii) the model lacks external validation in a geographically different center. Retrospective cohort study of women undergoing oocyte retrieval for IVF between January 2013 and December 2013 at the infertility unit of the Fondazione Ca' Granda, Ospedale Maggiore Policlinico of Milan, Italy. Only the first oocyte retrieval cycle performed during the study period was included in the study. Women with previous IVF cycles were excluded if the last one before the study cycle was in another center. The main outcome was the cumulative live birth rate per oocyte retrieval. Seven hundred seventy-two women were selected. Variables included in the van Loendersloot model and the relative weights (beta) were used. The variable resulting from this combination (Y) was transformed into a probability. The discriminatory capacity was assessed using the c-statistic. Calibration was performed using a logistic regression that included Y as the unique variable and live birth as the outcome. Data are presented using both the original and the calibrated models. Performance was evaluated
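    The recalibration step described above, a logistic regression with Y as the only covariate, has a simple special case that can be sketched without an optimizer: the calibration intercept ("calibration-in-the-large"). This is a generic illustration, not the study's code, and the toy predictions and outcomes are illustrative assumptions.

    ```python
    import math

    def logit(p):
        """Log-odds of a probability p in (0, 1)."""
        return math.log(p / (1.0 - p))

    def calibration_in_the_large(pred_probs, outcomes):
        """Calibration intercept: observed outcome log-odds minus the mean
        predicted log-odds. Zero means the model is calibrated on average;
        a positive value means it under-predicts in the new population,
        a negative value means it over-predicts. (A full recalibration
        would also fit a slope on the linear predictor Y.)"""
        observed = sum(outcomes) / len(outcomes)
        mean_lp = sum(logit(p) for p in pred_probs) / len(pred_probs)
        return logit(observed) - mean_lp

    preds = [0.2, 0.4, 0.6, 0.8]   # toy predicted live-birth probabilities
    outs  = [0, 1, 1, 1]           # toy observed live-birth indicators
    a = calibration_in_the_large(preds, outs)
    ```

    Here the mean predicted log-odds is 0 while 75% of outcomes are positive, so the intercept is log(3) ≈ 1.1: the toy model under-predicts and its probabilities should be shifted upward.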

  14. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  15. Validation of the Poisson Stochastic Radiative Transfer Model

    NASA Technical Reports Server (NTRS)

    Zhuravleva, Tatiana; Marshak, Alexander

    2004-01-01

    A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure, the cloud aspect ratio, is determined entirely by matching measurements and calculations of the direct solar radiation. When measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

  16. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to address this trade-off by creating a practical tool that provides model users with a structured view into the validation status of HE decision models. A Delphi panel was organized and complemented by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation of the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  17. Development of the AGREE II, part 2: assessment of validity of items and tools to support application

    PubMed Central

    Brouwers, Melissa C.; Kho, Michelle E.; Browman, George P.; Burgers, Jako S.; Cluzeau, Françoise; Feder, Gene; Fervers, Béatrice; Graham, Ian D.; Hanna, Steven E.; Makarski, Julie

    2010-01-01

    Background We established a program of research to improve the development, reporting and evaluation of practice guidelines. We assessed the construct validity of the items and user’s manual in the β version of the AGREE II. Methods We designed guideline excerpts reflecting high- and low-quality guideline content for 21 of the 23 items in the tool. We designed two study packages so that one low-quality and one high-quality version of each item were randomly assigned to each package. We randomly assigned 30 participants to one of the two packages. Participants reviewed and rated the guideline content according to the instructions of the user’s manual and completed a survey assessing the manual. Results In all cases, content designed to be of high quality was rated higher than low-quality content; in 18 of 21 cases, the differences were significant (p < 0.05). The manual was rated by participants as appropriate, easy to use, and helpful in differentiating guidelines of varying quality, with all scores above the mid-point of the seven-point scale. Considerable feedback was offered on how the items and manual of the β-AGREE II could be improved. Interpretation The validity of the items was established and the user’s manual was rated as highly useful by users. We used these results and those of our study presented in part 1 to modify the items and user’s manual. We recommend AGREE II (available at www.agreetrust.org) as the revised standard for guideline development, reporting and evaluation. PMID:20513779

  18. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I2 statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I2 statistics and prediction intervals for c-statistics. Conclusion This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
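    The pooling of hospital-specific performance estimates described above can be sketched with a DerSimonian-Laird random-effects meta-analysis of c-statistics; the per-hospital values and variances below are hypothetical, for illustration only, and the implementation is a minimal sketch rather than the authors' exact procedure.

    ```python
    import numpy as np

    def dersimonian_laird(theta, var):
        """Random-effects pooling of per-hospital performance estimates
        (e.g. c-statistics), as used in internal-external cross-validation."""
        theta, var = np.asarray(theta, float), np.asarray(var, float)
        w = 1.0 / var                                  # fixed-effect weights
        theta_fe = np.sum(w * theta) / np.sum(w)       # fixed-effect mean
        q = np.sum(w * (theta - theta_fe) ** 2)        # Cochran's Q
        k = len(theta)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)             # between-hospital variance
        w_re = 1.0 / (var + tau2)                      # random-effects weights
        pooled = np.sum(w_re * theta) / np.sum(w_re)
        i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
        return pooled, tau2, i2

    # Hypothetical per-hospital c-statistics and their sampling variances
    c_stats = [0.73, 0.76, 0.75, 0.71, 0.78]
    variances = [0.0004, 0.0005, 0.0003, 0.0006, 0.0004]
    pooled, tau2, i2 = dersimonian_laird(c_stats, variances)
    print(f"pooled c = {pooled:.3f}, tau^2 = {tau2:.5f}, I^2 = {i2:.1f}%")
    ```

    A wide prediction interval (driven by a large tau²) would signal limited geographic transportability even when the pooled c-statistic looks good.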

  19. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part II - Validation and localization analysis

    NASA Astrophysics Data System (ADS)

    Das, Arghya; Tengattini, Alessandro; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    We study the mechanical failure of cemented granular materials (e.g., sandstones) using a constitutive model based on breakage mechanics for grain crushing and damage mechanics for cement fracture. The theoretical aspects of this model are presented in Part I: Tengattini et al. (2014), A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables, Part I - Theory (Journal of the Mechanics and Physics of Solids, 10.1016/j.jmps.2014.05.021). In this Part II we investigate the constitutive and structural responses of cemented granular materials through analyses of Boundary Value Problems (BVPs). The multiple failure mechanisms captured by the proposed model enable the behavior of cemented granular rocks to be well reproduced for a wide range of confining pressures. Furthermore, through comparison of the model predictions and experimental data, the micromechanical basis of the model provides improved understanding of the failure mechanisms of cemented granular materials. In particular, we show that grain crushing is the predominant inelastic deformation mechanism under high pressures, while cement failure is the relevant mechanism at low pressures. Over an intermediate pressure regime, a mixed mode of failure mechanisms is observed. In addition, the micromechanical roots of the model allow the effects of various initial microstructures on localized deformation modes to be studied. The results obtained from both the constitutive responses and the BVP solutions indicate that the proposed approach and model provide a promising basis for future theoretical studies on cemented granular materials.

  20. External Validation Study of First Trimester Obstetric Prediction Models (Expect Study I): Research Protocol and Population Characteristics.

    PubMed

    Meertens, Linda Jacqueline Elisabeth; Scheepers, Hubertina Cj; De Vries, Raymond G; Dirksen, Carmen D; Korstjens, Irene; Mulder, Antonius Lm; Nieuwenhuijze, Marianne J; Nijhuis, Jan G; Spaanderman, Marc Ea; Smits, Luc Jm

    2017-10-26

    A number of first-trimester prediction models addressing important obstetric outcomes have been published. However, most models have not been externally validated. External validation is essential before implementing a prediction model in clinical practice. The objective of this paper is to describe the design of a study to externally validate existing first trimester obstetric prediction models, based upon maternal characteristics and standard measurements (eg, blood pressure), for the risk of pre-eclampsia (PE), gestational diabetes mellitus (GDM), spontaneous preterm birth (PTB), small-for-gestational-age (SGA) infants, and large-for-gestational-age (LGA) infants among Dutch pregnant women (Expect Study I). The results of a pilot study on the feasibility and acceptability of the recruitment process and the comprehensibility of the Pregnancy Questionnaire 1 are also reported. A multicenter prospective cohort study was performed in The Netherlands between July 1, 2013 and December 31, 2015. First trimester obstetric prediction models were systematically selected from the literature. Predictor variables were measured by the Web-based Pregnancy Questionnaire 1 and pregnancy outcomes were established using the Postpartum Questionnaire 1 and medical records. Information about maternal health-related quality of life, costs, and satisfaction with Dutch obstetric care was collected from a subsample of women. A pilot study was carried out before the official start of inclusion. External validity of the models will be evaluated by assessing discrimination and calibration. Based on the pilot study, minor improvements were made to the recruitment process and online Pregnancy Questionnaire 1. The validation cohort consists of 2614 women. Data analysis of the external validation study is in progress. This study will offer insight into the generalizability of existing, non-invasive first trimester prediction models for various obstetric outcomes in a Dutch obstetric population.

  1. SAGE III Aerosol Extinction Validation in the Arctic Winter: Comparisons with SAGE II and POAM III

    NASA Technical Reports Server (NTRS)

    Thomason, L. W.; Poole, L. R.; Randall, C. E.

    2007-01-01

    The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm, except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020-nm extinction ratio shows a consistent bias of approx. 30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than the SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find that both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.

  2. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.

  3. Advanced Residual Strength Degradation Rate Modeling for Advanced Composite Structures. Volume II. Tasks II and III.

    DTIC Science & Technology

    1981-07-01

    Advanced Residual Strength Degradation Rate Modeling for Advanced Composite Structures, Volume II - Tasks II and III. K. N. Lauraitis, J. T. Ryder, and D. E. Pettit, Lockheed-California Company, Burbank. Final Report, 1 July 1979 to 29 May 1981. Keywords: composites, graphite/epoxy, impact damage, damaged holes, fatigue, damage propagation, residual strength, NDI.

  4. Validation of the phase II feasibility study in a palliative care setting: gastrografin in malignant bowel obstruction.

    PubMed

    Lee, Cindy; Vather, Ryash; O'Callaghan, Anne; Robinson, Jackie; McLeod, Briar; Findlay, Michael; Bissett, Ian

    2013-12-01

    Malignant bowel obstruction (MBO) is common in patients with advanced cancer. To perform a phase II study to assess the feasibility of conducting a phase III trial investigating the therapeutic value of gastrografin in MBO. Randomized double-blinded placebo-controlled feasibility study. Participants received 100 mL of either gastrografin or placebo. Over 8 months, 57 patients were screened and 9 enrolled (15.8% recruitment rate). Of the 9 enrolled, 4 received gastrografin (with 2 completing assessment) and 5 received placebo (with 4 completing assessment). It is not feasible to conduct a phase III trial using the same study protocol. This study validates the use of the phase II feasibility study to assess protocol viability in a palliative population prior to embarking on a larger trial.

  5. Carbonate-mediated Fe(II) oxidation in the air-cathode fuel cell: a kinetic model in terms of Fe(II) speciation.

    PubMed

    Song, Wei; Zhai, Lin-Feng; Cui, Yu-Zhi; Sun, Min; Jiang, Yuan

    2013-06-06

    Due to the high redox activity of Fe(II) and its abundance in natural waters, the electro-oxidation of Fe(II) can be found in many air-cathode fuel cell systems, such as acid mine drainage fuel cells and sediment microbial fuel cells. To deeply understand these iron-related systems, it is essential to elucidate the kinetics and mechanisms involved in the electro-oxidation of Fe(II). This work aims to develop a kinetic model that adequately describes the electro-oxidation process of Fe(II) in air-cathode fuel cells. The speciation of Fe(II) is incorporated into the model, and contributions of individual Fe(II) species to the overall Fe(II) oxidation rate are quantitatively evaluated. The results show that the kinetic model can accurately predict the electro-oxidation rate of Fe(II) in air-cathode fuel cells. FeCO3, Fe(OH)2, and Fe(CO3)2(2-) are the most important species determining the electro-oxidation kinetics of Fe(II). The Fe(II) oxidation rate is primarily controlled by the oxidation of FeCO3 species at low pH, whereas at high pH Fe(OH)2 and Fe(CO3)2(2-) are the dominant species. Solution pH, carbonate concentration, and solution salinity are able to influence the electro-oxidation kinetics of Fe(II) through changing both distribution and kinetic activity of Fe(II) species.
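    The speciation-based rate law described above, in which each Fe(II) species contributes its own term to the overall oxidation rate, can be sketched as a species-weighted pseudo-first-order model. The speciation fractions and rate constants below are hypothetical placeholders, not values from the study.

    ```python
    def overall_rate(total_fe2, fractions, rate_constants):
        """Species-resolved oxidation rate: r = sum_i k_i * alpha_i * C_Fe(II),
        where alpha_i is the molar fraction of species i."""
        assert abs(sum(fractions.values()) - 1.0) < 1e-9, "fractions must sum to 1"
        return sum(rate_constants[s] * fractions[s] for s in fractions) * total_fe2

    # Hypothetical speciation at a given pH/carbonate level (illustrative only)
    fractions = {"Fe2+": 0.55, "FeCO3": 0.30, "Fe(OH)2": 0.05, "Fe(CO3)2^2-": 0.10}
    # Hypothetical per-species rate constants (s^-1); carbonate/hydroxo species
    # are assumed far more reactive than the free aquo ion, as the study reports
    k = {"Fe2+": 1e-4, "FeCO3": 5e-2, "Fe(OH)2": 8e-1, "Fe(CO3)2^2-": 4e-1}
    r = overall_rate(1e-3, fractions, k)
    print(f"r = {r:.3e} mol L^-1 s^-1")
    ```

    Because the fractions shift with pH and carbonate concentration, the same total Fe(II) can oxidize at very different rates, which is the mechanism the abstract describes.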

  6. Modeling Fe II Emission and Revised Fe II (UV) Empirical Templates for the Seyfert 1 Galaxy I Zw 1

    NASA Astrophysics Data System (ADS)

    Bruhweiler, F.; Verner, E.

    2008-03-01

    We use the narrow-lined broad-line region (BLR) of the Seyfert 1 galaxy, I Zw 1, as a laboratory for modeling the ultraviolet (UV) Fe II 2100-3050 Å emission complex. We calculate a grid of Fe II emission spectra representative of BLR clouds and compare them with the observed I Zw 1 spectrum. Our predicted spectrum for log[nH/(cm-3)] = 11.0, log[ΦH/(cm-2 s-1)] = 20.5, and ξ/(1 km s-1) = 20, using Cloudy and an 830-level model atom for Fe II with energies up to 14.06 eV, gives a better fit to the UV Fe II emission than models with fewer levels. Our analysis indicates (1) the observed UV Fe II emission must be corrected for an underlying Fe II pseudocontinuum; (2) Fe II emission peaks can be misidentified as those of other ions in active galactic nuclei (AGNs) with narrow-lined BLRs, possibly affecting deduced physical parameters; (3) the shape of 4200-4700 Å Fe II emission in I Zw 1 and other AGNs is a relative indicator of narrow-line region (NLR) and BLR Fe II emission; (4) predicted ratios of Lyα, C III], and Fe II emission relative to Mg II λ2800 agree with extinction-corrected observed I Zw 1 fluxes, except for C IV λ1549; (5) the sensitivity of Fe II emission strength to microturbulence ξ casts doubt on existing relative Fe/Mg abundances derived from Fe II (UV)/Mg II flux ratios. Our calculated Fe II emission spectra, suitable for BLRs in AGNs, are available at http://iacs.cua.edu/people/verner/FeII. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 05-26555.

  7. Validation of urban freeway models. [supporting datasets

    DOT National Transportation Integrated Search

    2015-01-01

    The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...

  8. Developing rural palliative care: validating a conceptual model.

    PubMed

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  9. Validation of the 12-gene colon cancer recurrence score as a predictor of recurrence risk in stage II and III rectal cancer patients.

    PubMed

    Reimers, Marlies S; Kuppen, Peter J K; Lee, Mark; Lopatin, Margarita; Tezcan, Haluk; Putter, Hein; Clark-Langone, Kim; Liefers, Gerrit Jan; Shak, Steve; van de Velde, Cornelis J H

    2014-11-01

    The 12-gene Recurrence Score assay is a validated predictor of recurrence risk in stage II and III colon cancer patients. We conducted a prospectively designed study to validate this assay for prediction of recurrence risk in stage II and III rectal cancer patients from the Dutch Total Mesorectal Excision (TME) trial. RNA was extracted from fixed paraffin-embedded primary rectal tumor tissue from stage II and III patients randomized to TME surgery alone, without (neo)adjuvant treatment. Recurrence Score was assessed by quantitative real time-polymerase chain reaction using previously validated colon cancer genes and algorithm. Data were analysed by Cox proportional hazards regression, adjusting for stage and resection margin status. All statistical tests were two-sided. Recurrence Score predicted risk of recurrence (hazard ratio [HR] = 1.57, 95% confidence interval [CI] = 1.11 to 2.21, P = .01), risk of distant recurrence (HR = 1.50, 95% CI = 1.04 to 2.17, P = .03), and rectal cancer-specific survival (HR = 1.64, 95% CI = 1.15 to 2.34, P = .007). The effect of Recurrence Score was most prominent in stage II patients and attenuated with more advanced stage (P(interaction) ≤ .007 for each endpoint). In stage II, five-year cumulative incidence of recurrence ranged from 11.1% in the predefined low Recurrence Score group (48.5% of patients) to 43.3% in the high Recurrence Score group (23.1% of patients). The 12-gene Recurrence Score is a predictor of recurrence risk and cancer-specific survival in rectal cancer patients treated with surgery alone, suggesting a similar underlying biology in colon and rectal cancers.

  10. Multiple Versus Single Set Validation of Multivariate Models to Avoid Mistakes.

    PubMed

    Harrington, Peter de Boves

    2018-01-02

    Validation of multivariate models is of current importance for a wide range of chemical applications. Although important, it is often neglected. The common practice is to use a single external validation set for evaluation. This approach is deficient and may mislead investigators with results that are specific to the single validation set of data. In addition, no statistics are available regarding the precision of a derived figure of merit (FOM). A statistical approach using bootstrapped Latin partitions is advocated. This validation method makes efficient use of the data because each object is used exactly once for validation. The method was reviewed a decade earlier, but primarily for the optimization of chemometric models; this review presents the reasons it should be used for generalized statistical validation. Average FOMs with confidence intervals are reported, and powerful matched-sample statistics may be applied for comparing models and methods. Examples demonstrate the problems with single validation sets.
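    The core idea of a Latin partition, in which every object is used exactly once for validation while class proportions stay balanced across partitions, can be sketched as below. The round-robin assignment and the toy two-class labels are a minimal illustrative construction, not the exact algorithm from the review.

    ```python
    import random

    def latin_partitions(y, n_parts, seed=0):
        """Assign each object to exactly one validation partition, keeping
        class proportions approximately equal across partitions."""
        rng = random.Random(seed)
        parts = [[] for _ in range(n_parts)]
        by_class = {}
        for i, label in enumerate(y):
            by_class.setdefault(label, []).append(i)
        for idx in by_class.values():
            rng.shuffle(idx)                    # randomize within each class
            for j, i in enumerate(idx):
                parts[j % n_parts].append(i)    # round-robin keeps strata even
        return parts

    y = ["a"] * 6 + ["b"] * 6
    parts = latin_partitions(y, 3)
    # Every object appears in exactly one partition
    assert sorted(i for p in parts for i in p) == list(range(12))
    ```

    Bootstrapping would repeat this assignment with different seeds, yielding a distribution of figures of merit rather than a single point estimate.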

  11. Outward Bound Outcome Model Validation and Multilevel Modeling

    ERIC Educational Resources Information Center

    Luo, Yuan-Chun

    2011-01-01

    This study was intended to measure construct validity for the Outward Bound Outcomes Instrument (OBOI) and to predict outcome achievement from individual characteristics and course attributes using multilevel modeling. A sample of 2,340 participants was collected by Outward Bound USA between May and September 2009 using the OBOI. Two phases of…

  12. Validation of Community Models: Identifying Events in Space Weather Model Timelines

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.
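    A minimal sketch of detecting slow-to-fast solar wind transitions using a simple hysteresis rule: an event is flagged when the speed rises from below a low threshold to above a high one. The 400 and 500 km/s thresholds and the sample speed series are illustrative assumptions, not the algorithm from the paper.

    ```python
    def detect_transitions(speed, low=400.0, high=500.0):
        """Flag slow-to-fast transitions with hysteresis: a transition is
        recorded when speed exceeds `high` after having dropped below `low`."""
        events, armed = [], False
        for i, v in enumerate(speed):
            if v < low:
                armed = True          # we are in a slow-wind interval
            elif v > high and armed:
                events.append(i)      # start of a high-speed stream
                armed = False         # wait for the next slow interval
        return events

    # Hypothetical daily solar wind speeds (km/s)
    speed = [350, 380, 420, 520, 610, 480, 390, 360, 450, 550]
    events = detect_transitions(speed)
    print(events)
    ```

    The hysteresis band prevents small fluctuations around a single threshold from being counted as repeated transitions, which mirrors the paper's concern with separating genuine stream interfaces from high-frequency variation.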

  13. Cross-Cultural Validation of the Beck Depression Inventory-II across U.S. and Turkish Samples

    ERIC Educational Resources Information Center

    Canel-Cinarbas, Deniz; Cui, Ying; Lauridsen, Erica

    2011-01-01

    The purpose of this study was to test the Beck Depression Inventory-II (BDI-II) for factorial invariance across Turkish and U.S. college student samples. The results indicated that (a) a two-factor model has an adequate fit for both samples, thus providing evidence of configural invariance, and (b) there is a metric invariance but "no"…

  14. Kinetics of Cd(ii) adsorption and desorption on ferrihydrite: experiments and modeling.

    PubMed

    Liang, Yuzhen; Tian, Lei; Lu, Yang; Peng, Lanfang; Wang, Pei; Lin, Jingyi; Cheng, Tao; Dang, Zhi; Shi, Zhenqing

    2018-05-15

    The kinetics of Cd(ii) adsorption/desorption on ferrihydrite is an important process affecting the fate, transport, and bioavailability of Cd(ii) in the environment, which was rarely systematically studied and understood at quantitative levels. In this work, a combination of stirred-flow kinetic experiments, batch adsorption equilibrium experiments, high-resolution transmission electron microscopy (HR-TEM), and mechanistic kinetic modeling were used to study the kinetic behaviors of Cd(ii) adsorption/desorption on ferrihydrite. HR-TEM images showed the open, loose, and sponge-like structure of ferrihydrite. The batch adsorption equilibrium experiments revealed that higher pH and initial metal concentration increased Cd(ii) adsorption on ferrihydrite. The stirred-flow kinetic results demonstrated the increased adsorption rate and capacity as a result of the increased pH, influent concentration, and ferrihydrite concentration. The mechanistic kinetic model successfully described the kinetic behaviors of Cd(ii) during the adsorption and desorption stages under various chemistry conditions. The model calculations showed that the adsorption rate coefficients varied as a function of solution chemistry, and the relative contributions of the weak and strong ferrihydrite sites for Cd(ii) binding varied with time at different pH and initial metal concentrations. Our model is able to quantitatively assess the contributions of each individual ferrihydrite binding site to the overall Cd(ii) adsorption/desorption kinetics. This study provided insights into the dynamic behavior of Cd(ii) and a predictive modeling tool for Cd(ii) adsorption/desorption kinetics when ferrihydrite is present, which may be helpful for the risk assessment and management of Cd contaminated sites.

  15. Validating Computational Human Behavior Models: Consistency and Accuracy Issues

    DTIC Science & Technology

    2004-06-01

    Includes a discussion of SME demographics, content, and organization of the datasets. This research generalizes data from two pilot studies and two base... meet requirements for validating the varied and complex behavioral models. Through a series of empirical studies, this research identifies subject...

  16. Adolescent Personality: A Five-Factor Model Construct Validation

    ERIC Educational Resources Information Center

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.

    2004-01-01

    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  17. Validation study of a quantitative multigene reverse transcriptase-polymerase chain reaction assay for assessment of recurrence risk in patients with stage II colon cancer.

    PubMed

    Gray, Richard G; Quirke, Philip; Handley, Kelly; Lopatin, Margarita; Magill, Laura; Baehner, Frederick L; Beaumont, Claire; Clark-Langone, Kim M; Yoshizawa, Carl N; Lee, Mark; Watson, Drew; Shak, Steven; Kerr, David J

    2011-12-10

    We developed quantitative gene expression assays to assess recurrence risk and benefits from chemotherapy in patients with stage II colon cancer. We sought validation by using RNA extracted from fixed paraffin-embedded primary colon tumor blocks from 1,436 patients with stage II colon cancer in the QUASAR (Quick and Simple and Reliable) study of adjuvant fluoropyrimidine chemotherapy versus surgery alone. A recurrence score (RS) and a treatment score (TS) were calculated from gene expression levels of 13 cancer-related genes (n = 7 recurrence genes and n = 6 treatment benefit genes) and from five reference genes with prespecified algorithms. Cox proportional hazards regression models and log-rank methods were used to analyze the relationship between the RS and risk of recurrence in patients treated with surgery alone and between TS and benefits of chemotherapy. Risk of recurrence was significantly associated with RS (hazard ratio [HR] per interquartile range, 1.38; 95% CI, 1.11 to 1.74; P = .004). Recurrence risks at 3 years were 12%, 18%, and 22% for predefined low, intermediate, and high recurrence risk groups, respectively. T stage (HR, 1.94; P < .001) and mismatch repair (MMR) status (HR, 0.31; P < .001) were the strongest histopathologic prognostic factors. The continuous RS was associated with risk of recurrence (P = .006) beyond these and other covariates. There was no trend for increased benefit from chemotherapy at higher TS (P = .95). The continuous 12-gene RS has been validated in a prospective study for assessment of recurrence risk in patients with stage II colon cancer after surgery and provides prognostic value that complements T stage and MMR. The TS was not predictive of chemotherapy benefit.

  18. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  19. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    NASA Astrophysics Data System (ADS)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for Quality of e-Government Services (QeGS). We built upon our previous work, in which a conceptual model was identified, and focused on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which benchmarked very positively against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  20. Factors associated with therapeutic inertia in hypertension: validation of a predictive model.

    PubMed

    Redón, Josep; Coca, Antonio; Lázaro, Pablo; Aguilar, Ma Dolores; Cabañas, Mercedes; Gil, Natividad; Sánchez-Zamorano, Miguel Angel; Aranda, Pedro

    2010-08-01

    To study factors associated with therapeutic inertia in treating hypertension and to develop a predictive model to estimate the probability of therapeutic inertia in a given medical consultation, based on variables related to the consultation, patient, physician, clinical characteristics, and level of care. National, multicentre, observational, cross-sectional study in primary care and specialist (hospital) physicians who each completed a questionnaire on therapeutic inertia, provided professional data and collected clinical data on four patients. Therapeutic inertia was defined as a consultation in which treatment change was indicated (i.e., SBP >or= 140 or DBP >or= 90 mmHg in all patients; SBP >or= 130 or DBP >or= 80 in patients with diabetes or stroke), but did not occur. A predictive model was constructed and validated according to the factors associated with therapeutic inertia. Data were collected on 2595 patients and 13,792 visits. Therapeutic inertia occurred in 7546 (75%) of the 10,041 consultations in which treatment change was indicated. Factors associated with therapeutic inertia were primary care setting, male sex, older age, SBP and/or DBP values close to normal, treatment with more than one antihypertensive drug, treatment with an ARB II, and more than six visits/year. Physician characteristics did not weigh heavily in the association. The predictive model was valid internally and externally, with acceptable calibration, discrimination and reproducibility, and explained one-third of the variability in therapeutic inertia. Although therapeutic inertia is frequent in the management of hypertension, the factors explaining it are not completely clear. Whereas some aspects of the consultations were associated with therapeutic inertia, physician characteristics were not a decisive factor.

  1. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
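    The basic hydrostatic relation underlying such a column model can be sketched as follows: the wellhead pressure equals the pressure at depth minus the weight of each fluid layer above it. The layer densities, heights, and bottom pressure below are hypothetical round numbers, not SPR well data.

    ```python
    G = 9.80665  # standard gravity, m/s^2

    def wellhead_pressure(p_bottom, layers):
        """Hydrostatic column: subtract the weight of each fluid layer
        (density in kg/m^3, height in m) from the pressure at depth (Pa)."""
        return p_bottom - sum(rho * G * h for rho, h in layers)

    # Hypothetical column: 500 m of crude oil below 300 m of nitrogen
    layers = [(850.0, 500.0), (200.0, 300.0)]  # (density, height), illustrative
    p_top = wellhead_pressure(12.0e6, layers)
    print(f"wellhead pressure = {p_top / 1e6:.2f} MPa")
    ```

    In a leak test, a measured wellhead pressure drifting below the value predicted from the known column composition (after temperature corrections) would suggest nitrogen loss or interface movement.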

  2. Longitudinal Models of Reliability and Validity: A Latent Curve Approach.

    ERIC Educational Resources Information Center

    Tisak, John; Tisak, Marie S.

    1996-01-01

    Dynamic generalizations of reliability and validity that will incorporate longitudinal or developmental models, using latent curve analysis, are discussed. A latent curve model formulated to depict change is incorporated into the classical definitions of reliability and validity. The approach is illustrated with sociological and psychological…

  3. Effect of Cu(II), Cd(II) and Zn(II) on Pb(II) biosorption by algae Gelidium-derived materials.

    PubMed

    Vilar, Vítor J P; Botelho, Cidália M S; Boaventura, Rui A R

    2008-06-15

    Biosorption of Pb(II), Cu(II), Cd(II) and Zn(II) from binary metal solutions onto the algae Gelidium sesquipedale, an algal industrial waste and a waste-based composite material was investigated at pH 5.3, in a batch system. Binary Pb(II)/Cu(II), Pb(II)/Cd(II) and Pb(II)/Zn(II) solutions have been tested. For the same equilibrium concentrations of both metal ions (1 mmol L⁻¹), approximately 66, 85 and 86% of the total uptake capacity of the biosorbents is taken by lead ions in the systems Pb(II)/Cu(II), Pb(II)/Cd(II) and Pb(II)/Zn(II), respectively. Two-metal results were fitted to a discrete and a continuous model, showing the inhibition of the primary metal biosorption by the co-cation. The model parameters suggest that Cd(II) and Zn(II) have the same decreasing effect on the Pb(II) uptake capacity. The uptake of Pb(II) was highly sensitive to the presence of Cu(II). From the discrete model it was possible to obtain the Langmuir affinity constant for Pb(II) biosorption. The presence of the co-cations decreases the apparent affinity of Pb(II). The experimental results were successfully fitted by the continuous model, at different pH values, for each biosorbent. The following sequence for the equilibrium affinity constants was found: Pb > Cu > Cd ≈ Zn.
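    The abstract does not reproduce the discrete model's equations, but the classical extended (competitive) Langmuir isotherm captures the reported behaviour, with the co-cation term depressing the apparent uptake of Pb(II). A minimal sketch, with illustrative (not fitted) parameters:

    ```python
    def binary_langmuir(c1, c2, qmax, k1, k2):
        """Extended Langmuir isotherm for species 1 in a binary mixture:

            q1 = qmax * k1 * c1 / (1 + k1*c1 + k2*c2)

        The co-cation term k2*c2 in the denominator lowers the uptake of
        species 1; with c2 = 0 this reduces to the single-metal Langmuir form."""
        return qmax * k1 * c1 / (1.0 + k1 * c1 + k2 * c2)
    ```

    Evaluating the same isotherm with and without a co-cation present reproduces qualitatively the inhibition the two-metal experiments show.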

  4. Development and validation of a mass casualty conceptual model.

    PubMed

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
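    Two of the three consensus indicators can be checked per item from a single round of ratings; the sketch below assumes "percent agreement" means the share of ratings at 5 or above on the 7-point scale, which the abstract does not specify (stability requires comparing two rounds and is omitted):

    ```python
    import statistics


    def consensus_reached(ratings, agree_threshold=0.7):
        """Check two of the study's Delphi indicators on 7-point ratings:
        (i) interquartile range of no more than 1 scale point, and
        (ii) percent agreement of 70% or greater (here taken as the share
        of ratings >= 5, an assumption not stated in the abstract)."""
        q1, _, q3 = statistics.quantiles(ratings, n=4)  # default exclusive method
        iqr = q3 - q1
        agreement = sum(1 for r in ratings if r >= 5) / len(ratings)
        return iqr <= 1 and agreement >= agree_threshold
    ```

    An item with tightly clustered high ratings passes; widely spread ratings fail on both criteria.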

  5. The Beck Depression Inventory-II: Testing for Measurement Equivalence and Factor Mean Differences across Hong Kong and American Adolescents

    ERIC Educational Resources Information Center

    Byrne, Barbara M.; Stewart, Sunita M.; Kennard, Betsy D.; Lee, Peter W. H.

    2007-01-01

    Working within the framework of a confirmatory factor analytic (CFA) model, this study adds another dimension to construct validation of both the Beck Depression Inventory-II (BDI-II; Beck, Steer, & Brown, 1996) and a Chinese version of the BDI-II (C-BDI-II; Chinese Behavioral Sciences Society, 2000). Specifically, we tested for measurement…

  6. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  7. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight time testing and increase flight safety. The success of ASE models is determined by the ability to take into account varying flight conditions and the possibility to perform flight monitoring under the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control relevant robust identification and model validation of aeroservoelastic structures. The closed-loop model robust identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  8. Using remote sensing for validation of a large scale hydrologic and hydrodynamic model in the Amazon

    NASA Astrophysics Data System (ADS)

    Paiva, R. C.; Bonnet, M.; Buarque, D. C.; Collischonn, W.; Frappart, F.; Mendes, C. B.

    2011-12-01

    We present the validation of the large-scale, catchment-based hydrological MGB-IPH model in the Amazon River basin. In this model, physically-based equations are used to simulate the hydrological processes, such as the Penman-Monteith method to estimate evapotranspiration, or the Moore and Clarke infiltration model. A new feature recently introduced in the model is a 1D hydrodynamic module for river routing. It uses the full Saint-Venant equations and a simple floodplain storage model. River and floodplain geometry parameters are extracted from the SRTM DEM using specially developed GIS algorithms that provide catchment discretization, estimation of river cross-section geometry and water storage volume variations in the floodplains. The model was forced using satellite-derived daily rainfall TRMM 3B42, calibrated against discharge data and first validated using daily discharges and water levels from 111 and 69 stream gauges, respectively. Then, we performed a validation against remote sensing derived hydrological products, including (i) monthly Terrestrial Water Storage (TWS) anomalies derived from GRACE, (ii) river water levels derived from ENVISAT satellite altimetry data (212 virtual stations from Santos da Silva et al., 2010) and (iii) a multi-satellite monthly global inundation extent dataset at ~25 x 25 km spatial resolution (Papa et al., 2010). Validation against river discharges shows good performance of the MGB-IPH model. For 70% of the stream gauges, the Nash-Sutcliffe efficiency index (ENS) is higher than 0.6 and at Óbidos, close to the Amazon River outlet, ENS equals 0.9 and the model bias equals -4.6%. Largest errors are located in drainage areas outside Brazil and we speculate that this is due to the poor quality of rainfall datasets in these poorly monitored and/or mountainous areas. Validation against water levels shows that the model performs well in the major tributaries. For 60% of virtual stations, ENS is higher than 0.6. But, similarly, largest
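    The ENS index used throughout this validation is straightforward to compute. A minimal sketch (the percent-bias sign convention below is an assumption, not taken from the paper):

    ```python
    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
        ENS = 1 is a perfect fit; ENS > 0.6 is the paper's threshold for good
        performance; ENS < 0 means the model is worse than the observed mean."""
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        var = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - sse / var


    def percent_bias(observed, simulated):
        """Model bias as a percentage of the observed total (sign convention
        assumed: positive means overprediction)."""
        return 100.0 * (sum(simulated) - sum(observed)) / sum(observed)
    ```

    Predicting the observed mean everywhere yields ENS = 0 exactly, which is why 0.6 is a meaningful bar.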

  9. A multi-source satellite data approach for modelling Lake Turkana water level: Calibration and validation using satellite altimetry data

    USGS Publications Warehouse

    Velpuri, N.M.; Senay, G.B.; Asante, K.O.

    2012-01-01

    model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins. © Author(s) 2012.

  10. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.; Gilbert, M. R.; Fleming, M.; Arter, W.

    2017-01-01

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation
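    The inventory equations FISPACT-II solves generalize the textbook Bateman decay chain to thousands of coupled nuclides with stiff numerical methods. The two-nuclide special case below is only an illustration of the kind of system involved, not FISPACT-II's actual solver:

    ```python
    import math


    def two_step_decay(n0, lam1, lam2, t):
        """Analytic Bateman solution for a chain A -> B -> (stable):

            N_A(t) = N0 * exp(-lam1 * t)
            N_B(t) = N0 * lam1/(lam2 - lam1) * (exp(-lam1 * t) - exp(-lam2 * t))

        lam1, lam2 are the decay constants of A and B (lam1 != lam2)."""
        na = n0 * math.exp(-lam1 * t)
        nb = n0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
        return na, nb
    ```

    With lam1 = ln 2 and lam2 = 2 ln 2 (half-lives of 1 and 0.5 time units), after one unit half the parent remains and a quarter of the initial inventory sits in the daughter.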

  11. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  12. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  13. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST-9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  14. Semipermeable Hollow Fiber Phantoms for Development and Validation of Perfusion-Sensitive MR Methods and Signal Models

    PubMed Central

    Anderson, J.R.; Ackerman, J.J.H.; Garbow, J.R.

    2015-01-01

    Two semipermeable, hollow fiber phantoms for the validation of perfusion-sensitive magnetic resonance methods and signal models are described. Semipermeable hollow fibers harvested from a standard commercial hemodialysis cartridge serve to mimic tissue capillary function. Flow of aqueous media through the fiber lumen is achieved with a laboratory-grade peristaltic pump. Diffusion of water and solute species (e.g., Gd-based contrast agent) occurs across the fiber wall, allowing exchange between the lumen and the extralumenal space. Phantom design attributes include: i) small physical size, ii) easy and low-cost construction, iii) definable compartment volumes, and iv) experimental control over media content and flow rate. PMID:26167136

  15. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    2012-01-01

    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  16. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
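    Stages one and three are standard lumped-capacitance convective cooling. A sketch under that assumption (all constants illustrative; stage two's nucleation and latent-heat coupling is omitted):

    ```python
    import math


    def droplet_temperature(t, t0, t_gas, h, area, mass, cp):
        """Lumped-capacitance convective cooling of a droplet (stages 1 and 3):

            dT/dt = -(h*A / (m*cp)) * (T - T_gas)
            T(t)  = T_gas + (T0 - T_gas) * exp(-h*A*t / (m*cp))

        h: heat transfer coefficient, area: droplet surface area, cp: specific
        heat. Valid while the droplet interior stays near-uniform in temperature."""
        k = h * area / (mass * cp)
        return t_gas + (t0 - t_gas) * math.exp(-k * t)
    ```

    In the full model, stage one runs this cooling law down to the freezing temperature, then hands over to the nucleation/crystal-growth stage.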

  17. Validation of recent geopotential models in Tierra Del Fuego

    NASA Astrophysics Data System (ADS)

    Gomez, Maria Eugenia; Perdomo, Raul; Del Cogliano, Daniel

    2017-10-01

    This work presents a validation study of global geopotential models (GGM) in the region of Fagnano Lake, located in the southern Andes. This is an excellent area for this type of validation because it is surrounded by the Andes Mountains, and there is no terrestrial gravity or GNSS/levelling data. However, there are mean lake level (MLL) observations, and its surface is assumed to be almost equipotential. Furthermore, in this article, we propose improved geoid solutions through the Residual Terrain Modelling (RTM) approach. Using a global geopotential model, the results achieved allow us to conclude that it is possible to use this technique to extend an existing geoid model to those regions that lack any information (neither gravimetric nor GNSS/levelling observations). As GGMs have evolved, our results have improved progressively. While the validation of EGM2008 with MLL data shows a standard deviation of 35 cm, GOCO05C shows a deviation of 13 cm, similar to the results obtained on land.

  18. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
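    The nesting property is what makes an ordinary likelihood ratio test applicable here. Given the fitted log-likelihoods of the restricted model (standard Gumbel errors) and the semi-nonparametric general model, the test reduces to a sketch like this (the critical value comes from chi-square tables and is supplied by the caller):

    ```python
    def likelihood_ratio_test(loglik_restricted, loglik_general, critical_value):
        """LR statistic for nested models: 2 * (LL_general - LL_restricted).
        Under the null (the restriction holds, e.g. standard Gumbel errors) the
        statistic is asymptotically chi-square with df equal to the difference
        in parameter counts. Reject the restriction when the statistic exceeds
        the chi-square critical value (e.g. 3.84 for df = 1 at the 5% level)."""
        stat = 2.0 * (loglik_general - loglik_restricted)
        return stat, stat > critical_value
    ```

    A large gap between the two log-likelihoods rejects the Gumbel restriction; a small gap does not.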

  19. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  20. Mortality Probability Model III and Simplified Acute Physiology Score II

    PubMed Central

    Vasilevskis, Eduard E.; Kuzniewicz, Michael W.; Cason, Brian A.; Lane, Rondall K.; Dean, Mitzi L.; Clay, Ted; Rennie, Deborah J.; Vittinghoff, Eric; Dudley, R. Adams

    2009-01-01

    Background: To develop and compare ICU length-of-stay (LOS) risk-adjustment models using three commonly used mortality or LOS prediction models. Methods: Between 2001 and 2004, we performed a retrospective, observational study of 11,295 ICU patients from 35 hospitals in the California Intensive Care Outcomes Project. We compared the accuracy of the following three LOS models: a recalibrated acute physiology and chronic health evaluation (APACHE) IV-LOS model; and models developed using risk factors in the mortality probability model III at zero hours (MPM0) and the simplified acute physiology score (SAPS) II mortality prediction model. We evaluated models by calculating the following: (1) grouped coefficients of determination; (2) differences between observed and predicted LOS across subgroups; and (3) intraclass correlations of observed/expected LOS ratios between models. Results: The grouped coefficients of determination were APACHE IV with coefficients recalibrated to the LOS values of the study cohort (APACHE IVrecal) [R2 = 0.422], mortality probability model III at zero hours (MPM0 III) [R2 = 0.279], and simplified acute physiology score (SAPS II) [R2 = 0.008]. For each decile of predicted ICU LOS, the mean predicted LOS vs the observed LOS was significantly different (p ≤ 0.05) for three, two, and six deciles using APACHE IVrecal, MPM0 III, and SAPS II, respectively. Plots of the predicted vs the observed LOS ratios of the hospitals revealed a threefold variation in LOS among hospitals with high model correlations. Conclusions: APACHE IV and MPM0 III were more accurate than SAPS II for the prediction of ICU LOS. APACHE IV is the most accurate and best calibrated model. Although it is less accurate, MPM0 III may be a reasonable option if the data collection burden or the treatment effect bias is a consideration. PMID:19363210
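    The observed-to-expected LOS ratios compared across hospitals are simple to form. A sketch (aggregating at the hospital level by mean is an assumption about how the ratios were built):

    ```python
    def observed_expected_ratio(observed_los, predicted_los):
        """Hospital-level O/E ratio: mean observed LOS divided by mean
        model-predicted LOS for that hospital's patients. Ratios above 1
        indicate longer-than-expected stays after risk adjustment."""
        mean_obs = sum(observed_los) / len(observed_los)
        mean_pred = sum(predicted_los) / len(predicted_los)
        return mean_obs / mean_pred
    ```

    The threefold variation the paper reports corresponds to hospitals whose O/E ratios differ by a factor of about three despite high model correlations.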

  1. Empirical calibration of the near-infrared CaII triplet - IV. The stellar population synthesis models

    NASA Astrophysics Data System (ADS)

    Vazdekis, A.; Cenarro, A. J.; Gorgas, J.; Cardiel, N.; Peletier, R. F.

    2003-04-01

    We present a new evolutionary stellar population synthesis model, which predicts spectral energy distributions for single-age single-metallicity stellar populations (SSPs) at resolution 1.5 Å (FWHM) in the spectral region of the near-infrared CaII triplet feature. The main ingredient of the model is a new extensive empirical stellar spectral library that has been recently presented by Cenarro et al., which is composed of more than 600 stars with an unprecedented coverage of the stellar atmospheric parameters. Two main products of interest for stellar population analysis are presented. The first is a spectral library for SSPs with metallicities -1.7 < [Fe/H] < +0.2, a large range of ages (0.1-18 Gyr) and initial mass function (IMF) types. They are well suited to modelling galaxy data, since the SSP spectra, with flux-calibrated response curves, can be smoothed to the resolution of the observational data, taking into account the internal velocity dispersion of the galaxy, allowing the user to analyse the observed spectrum in its own system. We also produce integrated absorption-line indices (namely CaT*, CaT and PaT) for the same SSPs in the form of equivalent widths. We find the following behaviour for the CaII triplet feature in old-aged SSPs: (i) the strength of the CaT* index does not change much with time for all metallicities for ages larger than ~3 Gyr; (ii) this index shows a strong dependence on metallicity for values below [M/H]~-0.5 and (iii) for larger metallicities this feature does not show a significant dependence either on age or on the metallicity, being more sensitive to changes in the slope of power-law-like IMF shapes. The SSP spectra have been calibrated with measurements for globular clusters by Armandroff & Zinn, which are well reproduced, proving the validity of using the integrated CaII triplet feature for determining the metallicities of these systems. Fitting the models to two early-type galaxies of different luminosities (NGC 4478 and 4365

  2. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  3. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    PubMed

    Gomes, Anna; van der Wijk, Lars; Proost, Johannes H; Sinha, Bhanu; Touw, Daan J

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
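
    The MDPE and MDAPE statistics used above to test the models can be sketched as relative prediction errors in percent (a minimal sketch of the usual definitions, PE = 100 * (predicted - observed) / observed; variable names are illustrative):

```python
from statistics import median

def prediction_errors(predicted, observed):
    """Relative prediction errors in percent: PE = 100 * (pred - obs) / obs."""
    return [100.0 * (p - o) / o for p, o in zip(predicted, observed)]

def mdpe(predicted, observed):
    """Median Prediction Error (a measure of bias, %)."""
    return median(prediction_errors(predicted, observed))

def mdape(predicted, observed):
    """Median Absolute Prediction Error (a measure of imprecision, %)."""
    return median(abs(e) for e in prediction_errors(predicted, observed))
```

    An MDPE near zero indicates little systematic bias, while MDAPE quantifies typical prediction error magnitude, matching how the abstract compares the three models.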

  4. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    PubMed Central

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid

  5. Predicting the ungauged basin: model validation and realism assessment

    NASA Astrophysics Data System (ADS)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  6. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
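
    The "calibration can be viewed as estimation" point above can be made concrete with a toy example: choose the parameter value that minimizes the squared disagreement between model outputs and data (a hypothetical one-parameter model and grid search, not drawn from the paper):

```python
import math

def model_output(theta, times):
    """Hypothetical one-parameter structural model: exponential decline over time."""
    return [math.exp(-theta * t) for t in times]

def calibrate(times, data, grid):
    """Calibration viewed as estimation: pick the parameter value on `grid` that
    minimizes the sum of squared residuals between model outputs and data."""
    def sse(theta):
        return sum((m - d) ** 2 for m, d in zip(model_output(theta, times), data))
    return min(grid, key=sse)
```

    Framing calibration this way makes parameter identifiability visible: if several grid values give nearly identical residuals, the data do not pin the parameter down.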

  7. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent of brands or the per cent of eating occasions within a food group that contained an additive. Since each of the three model components allowed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
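
    The model structure described, combining food intake amounts, a probability that the additive is present, and additive concentrations, can be sketched as a simple Monte Carlo simulation (field names and data below are illustrative, not the authors' implementation):

```python
import random

def simulate_intakes(foods, n_sims=10000, rng=None):
    """Probabilistic additive-intake model: for each simulated day, total intake =
    sum over food groups of amount * presence(Bernoulli) * concentration.
    `foods` is a list of dicts with keys: amounts (g), p_present,
    concentrations (mg/g), and mpl (maximum permitted level, mg/g)."""
    rng = rng or random.Random(0)          # seeded for reproducibility
    sims = []
    for _ in range(n_sims):
        total = 0.0
        for f in foods:
            amount = rng.choice(f["amounts"])            # raw food-intake data
            present = rng.random() < f["p_present"]      # additive present or not
            conc = rng.choice(f["concentrations"])       # raw concentration data
            total += amount * conc * present
        sims.append(total)
    return sims

def conservative_point_estimate(foods):
    """Traditional conservative estimate: additive at the MPL in every food."""
    return sum(max(f["amounts"]) * f["mpl"] for f in foods)
```

    By construction, every simulated intake lies at or below the MPL-based conservative estimate, which is one side of the "valid region" check described in the abstract.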

  8. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Surveys of our own closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  9. Mitigation of turbidity currents in reservoirs with passive retention systems: validation of CFD modeling

    NASA Astrophysics Data System (ADS)

    Ferreira, E.; Alves, E.; Ferreira, R. M. L.

    2012-04-01

    Sediment deposition by continuous turbidity currents may affect eco-environmental river dynamics in natural reservoirs and hinder the maneuverability of bottom discharge gates in dam reservoirs. In recent years, innovative techniques have been proposed to enforce the deposition of turbidity further upstream in the reservoir (and away from the dam), namely, the use of solid and permeable obstacles such as water jet screens, geotextile screens, etc. The main objective of this study is to validate a computational fluid dynamics (CFD) code applied to the simulation of the interaction between a turbidity current and a passive retention system, designed to induce sediment deposition. To accomplish the proposed objective, laboratory tests were conducted where a simple obstacle configuration was subjected to the passage of currents with different initial sediment concentrations. The experimental data were used to build benchmark cases to validate the 3D CFD software ANSYS-CFX. Sensitivity tests of mesh design, turbulence models and discretization requirements were performed. The validation consisted in comparing experimental and numerical results, involving instantaneous and time-averaged sediment concentrations and velocities. In general, a good agreement between the numerical and the experimental values is achieved when: i) realistic outlet conditions are specified, ii) channel roughness is properly calibrated, iii) two-equation k-ε models are employed, and iv) a fine mesh is employed near the bottom boundary. Acknowledgements This study was funded by the Portuguese Foundation for Science and Technology through the project PTDC/ECM/099485/2008. The first author thanks Professor Moitinho de Almeida of ICIST for his assistance, as well as all members of the project and of the Fluvial Hydraulics group of CEHIDRO.

  10. CheS-Mapper 2.0 for visual validation of (Q)SAR models

    PubMed Central

    2014-01-01

    Background Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Although a number of visualization tools for analyzing (Q)SAR information in small molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: It is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract Comparing actual and predicted activity values with CheS-Mapper.

  11. The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Y.S.; Beers, T.C.; Sivarani, T.

    2007-10-01

    The authors validate the performance and accuracy of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) by comparing derived overall metallicities and radial velocities from selected likely members of three globular clusters (M 13, M 15, and M 2) and two open clusters (NGC 2420 and M 67) to the literature values. Spectroscopic and photometric data obtained during the course of the original Sloan Digital Sky Survey (SDSS-I) and its first extension (SDSS-II/SEGUE) are used to determine stellar radial velocities and atmospheric parameter estimates for stars in these clusters. Based on the scatter in the metallicities derived for the members of each cluster, they quantify the typical uncertainty of the SSPP values, σ([Fe/H]) = 0.13 dex for stars in the range of 4500 K ≤ Teff ≤ 7500 K and 2.0 ≤ log g ≤ 5.0, at least over the metallicity interval spanned by the clusters studied (-2.3 ≤ [Fe/H] < 0). The surface gravities and effective temperatures derived by the SSPP are also compared with those estimated from the comparison of the color-magnitude diagrams with stellar evolution models; they find satisfactory agreement. At present, the SSPP underestimates [Fe/H] for near-solar-metallicity stars, represented by members of M 67 in this study, by ~0.3 dex.

  12. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  13. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
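
    The Jaccard Index used above to compare CT iso-density contours against simulated vapor-concentration contours can be sketched for two binary masks (a minimal sketch; the masks here stand in for rasterized contour regions):

```python
def jaccard_index(mask_a, mask_b):
    """Jaccard index |A ∩ B| / |A ∪ B| of two equal-length binary masks (0/1).
    Two empty masks are counted as identical (index 1.0)."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    return inter / union if union else 1.0
```

    A value of 1.0 means perfect overlap and 0.0 means no overlap, so the rising 0.27 to 0.69 sequence in the abstract reflects steadily improving agreement over the 5-minute ablation.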

  14. Mortality in severe trauma patients attended by emergency services in Navarre, Spain: validation of a new prediction model and comparison with the Revised Injury Severity Classification Score II.

    PubMed

    Ali Ali, Bismil; Lefering, Rolf; Fortún Moral, Mariano; Belzunegui Otano, Tomás

    2018-01-01

    To validate the Mortality Prediction Model of Navarre (MPMN) for predicting death after severe trauma and to compare it with the Revised Injury Severity Classification Score II (RISCII). Retrospective analysis of a cohort of severe trauma patients (New Injury Severity Score >15) who were attended by emergency services in the Spanish autonomous community of Navarre between 2013 and 2015. The outcome variable was 30-day all-cause mortality. Risk was calculated with the MPMN and the RISCII. The performance of each model was assessed with the area under the receiver operating characteristic (ROC) curve and precision with respect to observed mortality. Calibration was assessed with the Hosmer-Lemeshow test. We included 516 patients. The mean (SD) age was 56 (23) years, and 363 (70%) were males. Ninety patients (17.4%) died within 30 days. The 30-day mortality rates predicted by the MPMN and RISCII were 16.4% and 15.4%, respectively. The areas under the ROC curves were 0.925 (95% CI, 0.902-0.952) for the MPMN and 0.941 (95% CI, 0.921-0.962) for the RISCII (P=0.269, DeLong test). Calibration statistics were 13.6 (P=.09) for the MPMN and 8.9 (P=.35) for the RISCII. Both the MPMN and the RISCII show good ability to discriminate risk and predict 30-day all-cause mortality in severe trauma patients.
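
    The area under the ROC curve used above to assess discrimination can be computed for a small dataset via its Mann-Whitney interpretation: the probability that a randomly chosen patient who died received a higher predicted risk than one who survived (a minimal sketch; this is the AUC itself, not the DeLong comparison of two AUCs):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive case scores higher; ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    An AUC of 0.5 means no discrimination and 1.0 perfect discrimination, which puts the reported 0.925 and 0.941 in context.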

  15. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  16. Strategies for the generation, validation and application of in silico ADMET models in lead generation and optimization.

    PubMed

    Gleeson, Matthew Paul; Montanari, Dino

    2012-11-01

    The most desirable chemical starting point in drug discovery is a hit or lead with a good overall profile; where there are issues, a clear SAR strategy should be identifiable to minimize them. Filtering based on drug-likeness concepts is a first step, but more accurate theoretical methods are needed to i) estimate the biological profile of the molecule in question and ii) based on the underlying structure-activity relationships used by the model, estimate whether it is likely that the molecule in question can be altered to remove these liabilities. In this paper, the authors discuss the generation of ADMET models and their practical use in decision making. They discuss the issues surrounding data collation, experimental errors, the model assessment and validation steps, as well as the different types of descriptors and statistical models that can be used. This is followed by a discussion on how the model accuracy will dictate when and where it can be used in the drug discovery process. The authors also discuss how models can be developed to more effectively enable multiple parameter optimization. Models can be applied in lead generation and lead optimization steps to i) rank order a collection of hits, ii) prioritize the experimental assays needed for different hit series, iii) assess the likelihood of resolving a problem that might be present in a particular series in lead optimization and iv) screen a virtual library based on a hit or lead series to assess the impact of diverse structural changes on the predicted properties.
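
    The "rank order a collection of hits" application can be sketched as a simple multiple-parameter desirability score: each predicted property is mapped to [0, 1] and the per-property desirabilities are multiplied (a hypothetical scheme for illustration, not the authors' method; property names and thresholds are invented, and properties where lower is better would use an inverted ramp):

```python
def desirability(value, low, high):
    """Linear ramp desirability: 0 at or below `low`, 1 at or above `high`."""
    if high == low:
        return 1.0 if value >= high else 0.0
    return min(max((value - low) / (high - low), 0.0), 1.0)

def rank_hits(hits, criteria):
    """Rank compounds by the product of per-property desirabilities.
    `criteria` maps a property name to its (low, high) ramp; `hits` are dicts
    of predicted property values."""
    def score(hit):
        s = 1.0
        for prop, (low, high) in criteria.items():
            s *= desirability(hit[prop], low, high)
        return s
    return sorted(hits, key=score, reverse=True)
```

    Multiplying the desirabilities means a single very poor property drives the score toward zero, which is often the intended behavior in multiple parameter optimization.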

  17. Cross-validation of an employee safety climate model in Malaysia.

    PubMed

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  18. Surface complexation modeling calculation of Pb(II) adsorption onto the calcined diatomite

    NASA Astrophysics Data System (ADS)

    Ma, Shu-Cui; Zhang, Ji-Lin; Sun, De-Hui; Liu, Gui-Xia

    2015-12-01

    Removal of noxious heavy metal ions (e.g. Pb(II)) by surface adsorption onto minerals (e.g. diatomite) is an important means of controlling aqueous environmental pollution. Thus, it is essential to understand the surface adsorption behavior and mechanism. In this work, the Pb(II) apparent surface complexation reaction equilibrium constants on the calcined diatomite and the distributions of Pb(II) surface species were investigated through modeling calculations based on a diffuse double layer model (DLM) with three amphoteric sites. Batch experiments were used to study the adsorption of Pb(II) onto the calcined diatomite as a function of pH (3.0-7.0) and different ionic strengths (0.05 and 0.1 mol L-1 NaCl) under ambient atmosphere. Adsorption of Pb(II) can be well described by a Freundlich isotherm model. The apparent surface complexation equilibrium constants (log K) were obtained by fitting the batch experimental data using the PEST 13.0 together with PHREEQC 3.1.2 codes, and there is good agreement between measured and predicted data. The distribution of Pb(II) surface species on the diatomite calculated by the PHREEQC 3.1.2 program indicates that the impurity cations (e.g. Al3+, Fe3+) in the diatomite play a leading role in Pb(II) adsorption, and that the formation of surface complexes, together with additional electrostatic interaction, is the main adsorption mechanism of Pb(II) on the diatomite under weakly acidic conditions.
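
    A Freundlich isotherm fit of the kind mentioned above can be sketched by ordinary least squares on the log-linearized form, log q = log Kf + (1/n) log C (a minimal sketch; the paper's actual parameter fitting used the PEST and PHREEQC codes):

```python
import math

def fit_freundlich(c, q):
    """Fit the Freundlich isotherm q = Kf * C**(1/n) by least squares on
    the log-linearized form log q = log Kf + (1/n) * log C.
    `c` are equilibrium concentrations and `q` the adsorbed amounts."""
    x = [math.log(ci) for ci in c]
    y = [math.log(qi) for qi in q]
    m = len(x)
    mx, my = sum(x) / m, sum(y) / m
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return math.exp(intercept), 1.0 / slope   # (Kf, n)
```

    The slope of the log-log line is 1/n (a measure of adsorption nonlinearity) and the intercept gives Kf (the capacity factor).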

  19. Comparison of LIDAR system performance for alternative single-mode receiver architectures: modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.

    2013-05-01

    In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct-detection, (ii) optically-preamplified PIN receiver, (iii) PIN-based coherent-detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally-validated detection statistics can be used as part of an end-to-end system model for projecting rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.

  20. Evidence from mathematical modeling that carbonic anhydrase II and IV enhance CO2 fluxes across Xenopus oocyte plasma membranes

    PubMed Central

    Musa-Aziz, Raif; Boron, Walter F.

    2014-01-01

    Exposing an oocyte to CO2/HCO3− causes intracellular pH (pHi) to decline and extracellular-surface pH (pHS) to rise to a peak and decay. The two companion papers showed that oocytes injected with cytosolic carbonic anhydrase II (CA II) or expressing surface CA IV exhibit increased maximal rate of pHi change (dpHi/dt)max, increased maximal pHS changes (ΔpHS), and decreased time constants for pHi decline and pHS decay. Here we investigate these results using refinements of an earlier mathematical model of CO2 influx into a spherical cell. Refinements include 1) reduced cytosolic water content, 2) reduced cytosolic diffusion constants, 3) refined CA II activity, 4) layer of intracellular vesicles, 5) reduced membrane CO2 permeability, 6) microvilli, 7) refined CA IV activity, 8) a vitelline membrane, and 9) a new simulation protocol for delivering and removing the bulk extracellular CO2/HCO3− solution. We show how these features affect the simulated pHi and pHS transients and use the refined model with the experimental data for 1.5% CO2/10 mM HCO3− (pHo = 7.5) to find parameter values that approximate ΔpHS, the time to peak pHS, the time delay to the start of the pHi change, (dpHi/dt)max, and the change in steady-state pHi. We validate the revised model against data collected as we vary levels of CO2/HCO3− or of extracellular HEPES buffer. The model confirms the hypothesis that CA II and CA IV enhance transmembrane CO2 fluxes by maximizing CO2 gradients across the plasma membrane, and it predicts that the pH effects of simultaneously implementing intracellular and extracellular-surface CA are supra-additive. PMID:24965589

  1. The topoisomerase II-Hsp90 complex: a new chemotherapeutic target?

    PubMed

    Barker, Catherine R; Hamlett, Jane; Pennington, Stephen R; Burrows, Francis; Lundgren, Karen; Lough, Rachel; Watson, Alastair J M; Jenkins, John R

    2006-06-01

    The modulation of DNA topology by topoisomerase II plays a crucial role during chromosome condensation and segregation in mitosis and has thus become a highly attractive target for chemotherapeutic drugs. However, these drugs are highly toxic, and so new approaches are required. One such strategy is to target topoisomerase II-interacting proteins. Here we report the identification of potential topoisomerase II-associated proteins using immunoprecipitation, followed by 1-D and 2-D gel electrophoresis and MALDI-TOF mass spectrometry. A total of 23 proteins were identified and, of these, 17 were further validated as topoisomerase IIalpha-associated proteins by coimmunoprecipitation and Western blot. Six of the interacting proteins were cellular chaperones, including 3 members of the heat shock protein-90 (Hsp90) family, and so the effect of Hsp90 modulation on the antitumor activity of topoisomerase II drugs was tested using the sulforhodamine B assay, clonogenic assays and a xenograft model. The Hsp90 inhibitors geldanamycin, 17-AAG (17-allylamino-17-demethoxygeldanamycin) and radicicol significantly enhanced the activity of the topoisomerase II poisons etoposide and mitoxantrone in vitro and in vivo. Thus, our method of identifying topoisomerase II-interacting proteins appears to be effective, and at least 1 novel topoisomerase IIalpha-associated protein, Hsp90, may represent a valid drug target in the context of topoisomerase II-directed chemotherapy.

  2. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to NextGen technology and procedures.

  3. Is the Acute NMDA Receptor Hypofunction a Valid Model of Schizophrenia?

    PubMed Central

    Adell, Albert; Jiménez-Sánchez, Laura; López-Gil, Xavier; Romón, Tamara

    2012-01-01

    Several genetic, neurodevelopmental, and pharmacological animal models of schizophrenia have been established. This short review examines the validity of one of the most widely used pharmacological models of the illness, i.e., the acute administration of N-methyl-D-aspartate (NMDA) receptor antagonists in rodents. In some cases, data on chronic or prenatal NMDA receptor antagonist exposure have been introduced for comparison. The face validity of acute NMDA receptor blockade is granted inasmuch as hyperlocomotion and stereotypies induced by phencyclidine, ketamine, and MK-801 are regarded as a surrogate for the positive symptoms of schizophrenia. In addition, the loss of parvalbumin-containing cells (one of the most compelling findings in the postmortem schizophrenia brain) following NMDA receptor blockade adds construct validity to this model. However, the lack of changes in glutamic acid decarboxylase (GAD67) is at variance with human studies. It is possible that changes in GAD67 are more reflective of the neurodevelopmental condition of schizophrenia. Finally, the model also has predictive validity, in that the behavioral effects and transmitter activation it produces in rodents are responsive to antipsychotic treatment. Overall, although not devoid of drawbacks, the acute administration of NMDA receptor antagonists can be considered a good model of schizophrenia with a satisfactory degree of validity. PMID:21965469

  4. Validation of Self-Report Impairment Measures for Section III Obsessive-Compulsive and Avoidant Personality Disorders.

    PubMed

    Liggett, Jacqueline; Carmichael, Kieran L C; Smith, Alexander; Sellbom, Martin

    2017-01-01

    This study examined the validity of newly developed disorder-specific impairment scales (IS), modeled on the Level of Personality Functioning Scale, for obsessive-compulsive (OCPD) and avoidant (AvPD) personality disorders. The IS focused on content validity (items directly reflected the disorder-specific impairments listed in DSM-5 Section III) and severity of impairment. A community sample of 313 adults completed personality inventories indexing the DSM-5 Sections II and III diagnostic criteria for OCPD and AvPD, as well as measures of impairment in the domains of self- and interpersonal functioning. Results indicated that both impairment measures (for AvPD in particular) showed promise in their ability to measure disorder-specific impairment, demonstrating convergent validity with their respective Section II counterparts and discriminant validity with their noncorresponding Section II disorder and with each other. The pattern of relationships between scores on the IS and scores on external measures of personality functioning, however, did not indicate that it is useful to maintain a distinction between impairment in the self- and interpersonal domains, at least for AvPD and OCPD.

  5. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    PubMed

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  6. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from samples that are "different but related" to the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the model's performance in the validation sample and interpret it in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, the two other samples assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
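
    The discrimination part of assessing performance in a validation sample is usually summarized by the concordance (c) statistic. A minimal sketch by pairwise comparison, with made-up outcomes and predicted probabilities (not data from the study):

```python
def c_statistic(y, p):
    """Concordance (c) statistic (equivalently the AUC) by pairwise comparison.

    y: binary outcomes (1 = event), p: predicted probabilities.
    Counts the fraction of (event, non-event) pairs in which the event
    receives the higher predicted probability; ties score one half.
    """
    pos = [pi for yi, pi in zip(y, p) if yi == 1]
    neg = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum(1.0 if a > b else 0.5 if a == b else 0.0
               for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical validation sample: two events, two non-events
print(c_statistic([1, 1, 0, 0], [0.9, 0.6, 0.7, 0.2]))  # -> 0.75
```

    A c-statistic of 0.5 is chance-level discrimination; values near 1 indicate that events are consistently assigned higher risk than non-events.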

  7. Large Black Holes in the Randall-Sundrum II Model

    NASA Astrophysics Data System (ADS)

    Yaghoobpour Tari, Shima

    The Einstein equation with a negative cosmological constant Λ in five dimensions for the Randall-Sundrum II model, which includes a black hole, has been solved numerically. We have constructed an AdS5-CFT4 solution numerically, using a spectral method to minimize the integral of the square of the error of the Einstein equation, with 210 parameters determined by optimization. This metric is conformal to the Schwarzschild metric at an AdS5 boundary with an infinite scale factor, so we regard this solution as an infinite-mass black hole solution. We have rewritten the infinite-mass black hole in the Fefferman-Graham form and obtained the numerical components of the CFT energy-momentum tensor. Using them, we have perturbed the metric to relocate the brane from infinity and obtained a large static black hole solution for the Randall-Sundrum II model. The changes in mass, entropy, temperature and horizon area of the large black hole relative to the Schwarzschild metric are studied to first order in the perturbation parameter 1/(-Λ5M²). The Hawking temperature and entropy of our large black hole have the same values as for the Schwarzschild metric of the same mass, but the horizon area is increased by about 4.7/(-Λ5). Figueras, Lucietti, and Wiseman found an AdS5-CFT4 solution using an independent method different from ours, the Ricci-DeTurck-flow method. Figueras and Wiseman then perturbed this solution in the same way as we have done and obtained the solution for the large black hole in the Randall-Sundrum II model. These two numerical solutions are the first mathematical demonstrations that a large black hole exists in the Randall-Sundrum II model. We have compared their results with ours for the CFT energy-momentum tensor components and the perturbed metric, and shown that the results are closely in agreement, which can be considered evidence that the large black hole solution in the Randall-Sundrum II model exists.

  8. Modeling MgII Absorbers from SDSS Spectroscopic and Imaging Catalogs

    NASA Astrophysics Data System (ADS)

    Rimoldini, L. G.; Menard, B.; Nestor, D. B.; Rao, S. M.; Sheth, R. K.; Turnshek, D. A.; Zibetti, S.; Feather, S.; Quider, A.

    2005-12-01

    The detection of more than 14,000 MgII absorption doublets along the sight-lines to SDSS DR4 QSOs (pursued by Turnshek, Nestor, Rao, and collaborators) has produced the largest sample of MgII absorbers to date in the redshift interval 0.37 < z < 2.30. The statistical relation between galaxies and MgII systems is investigated by cross-correlating the spectroscopic MgII catalog with the SDSS imaging catalog of galaxies in the neighborhood of QSO sight-lines. A model for MgII absorbers is derived to account for the measured MgII rest equivalent width distribution and the absorbing galaxy properties (e.g., luminosity, impact parameter, and morphological type). Some preliminary results of our analysis are presented. This work was supported in part by the National Science Foundation. L.G.R. acknowledges further support from the Z. Daniel's Predoctoral Fellowship.

  9. Modelling Short-Term Maximum Individual Exposure from Airborne Hazardous Releases in Urban Environments. Part II: Validation of a Deterministic Model with Wind Tunnel Experimental Data.

    PubMed

    Efthimiou, George C; Bartzis, John G; Berbekar, Eva; Hertwig, Denise; Harms, Frank; Leitl, Bernd

    2015-06-26

    The capability to predict short-term maximum individual exposure is very important for several applications including, for example, deliberate/accidental release of hazardous substances, odour fluctuations or material flammability level exceedance. Recently, the authors proposed a simple approach relating maximum individual exposure to parameters such as the fluctuation intensity and the concentration integral time scale. In the first part of this study (Part I), the methodology was validated against field measurements, which are governed by the natural variability of atmospheric boundary conditions. In Part II of this study, an in-depth validation of the approach is performed using reference data recorded under truly stationary and well-documented flow conditions. For this reason, a boundary-layer wind-tunnel experiment was used. The experimental dataset includes 196 time-resolved concentration measurements capturing the dispersion from a continuous point source within an urban model of semi-idealized complexity. The data analysis allowed the improvement of an important model parameter. The model performed very well in predicting the maximum individual exposure, with 95% of predictions falling within a factor of two of the observations. For large time intervals, an exponential correction term has been introduced into the model based on the experimental observations. The new model covers all time intervals, with an overall 100% of predictions falling within a factor of two of the observations.
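
    The "factor of two of observations" (FAC2) score used in this abstract can be sketched as follows; the numbers below are illustrative, not values from the study:

```python
import numpy as np

def fac2(observed, predicted):
    """Fraction of predictions within a factor of two of the observations.

    A prediction counts as a hit when 0.5 <= predicted/observed <= 2.0,
    the usual FAC2 acceptance criterion for dispersion-model validation.
    """
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ratio = predicted / observed
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

# Hypothetical exposures (arbitrary units): 3 of 4 predictions are hits
print(fac2([1.0, 2.0, 4.0, 8.0], [1.5, 1.2, 9.0, 7.0]))  # -> 0.75
```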

  10. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations

  11. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  12. Validation of an Evaluation Model for Learning Management Systems

    ERIC Educational Resources Information Center

    Kim, S. W.; Lee, M. G.

    2008-01-01

    This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principal factors, was…

  13. [Algorithm for estimating chlorophyll-a concentration in case II water body based on bio-optical model].

    PubMed

    Yang, Wei; Chen, Jin; Mausushita, Bunki

    2009-01-01

    In the present study, a novel retrieval method for estimating chlorophyll-a concentration in case II waters based on a bio-optical model was proposed and tested with data measured in the laboratory. A series of reflectance spectra, for which the concentration of each sample constituent (for example chlorophyll-a, NPSS, etc.) was obtained from accurate experiments, were used to calculate the absorption and backscattering coefficients of the constituents of the case II waters. The non-negative least squares method was then applied to calculate the concentrations of chlorophyll-a and non-phytoplankton suspended sediments (NPSS). Green algae were first collected from Lake Kasumigaura in Japan and then cultured in the laboratory. The reflectance spectra of waters with different amounts of phytoplankton and NPSS were measured in a dark room using a FieldSpec Pro VNIR (Analytical Spectral Devices Inc., Boulder, CO, USA). In order to validate whether this method can be applied to multispectral data (for example Landsat TM), the spectra measured in the laboratory were resampled to Landsat TM bands 1, 2, 3 and 4. Different combinations of TM bands were compared to derive the most appropriate wavelengths for detecting chlorophyll-a in case II water for green algae. The results indicated that the combination of TM bands 2, 3 and 4 achieved much better accuracy than other combinations, and the estimated concentration of chlorophyll-a was significantly more accurate than with empirical methods. It is expected that this method can be applied directly to real remotely sensed images because it is based on a bio-optical model.
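
    The non-negative least-squares step can be illustrated with a toy unmixing problem. This is a sketch assuming a linearized mixing model with made-up endmember "spectra" (the paper's bio-optical model is nonlinear, with coefficients from measured absorption and backscattering); in practice one would call scipy.optimize.nnls, but a small coordinate-descent version shows the idea:

```python
import numpy as np

def nnls_cd(A, b, iters=500):
    """Non-negative least squares, min ||Ax - b|| subject to x >= 0,
    solved by cyclic coordinate descent (a simple NNLS sketch;
    assumes no column of A is all zeros)."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    x = np.zeros(A.shape[1])
    col_sq = (A * A).sum(axis=0)  # squared norm of each column
    for _ in range(iters):
        for j in range(A.shape[1]):
            r = b - A @ x + A[:, j] * x[j]  # residual with column j removed
            x[j] = max(0.0, A[:, j] @ r / col_sq[j])
    return x

# Two constituents measured at three "bands"; the unconstrained fit
# would assign a negative concentration, which NNLS clips to zero.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, -0.5, 2.0]
print(nnls_cd(A, b))  # -> approximately [2. 0.]
```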

  14. Molprobity's ultimate rotamer-library distributions for model validation.

    PubMed

    Hintze, Bradley J; Lewis, Steven M; Richardson, Jane S; Richardson, David C

    2016-09-01

    Here we describe the updated MolProbity rotamer-library distributions derived from an order-of-magnitude larger and more stringently quality-filtered dataset of about 8000 (vs. 500) protein chains, and we explain the resulting changes and improvements to model validation as seen by users. To include only side-chains with satisfactory justification for their given conformation, we added residue-specific filters for electron-density value and model-to-density fit. The combined new protocol retains a million residues of data, while cleaning up false-positive noise in the multi-χ datapoint distributions. It enables unambiguous characterization of conformational clusters nearly 1000-fold less frequent than the most common ones. We describe examples of local interactions that favor these rare conformations, including the role of authentic covalent bond-angle deviations in enabling presumably strained side-chain conformations. Further, along with favored and outlier, an allowed category (0.3-2.0% occurrence in reference data) has been added, analogous to Ramachandran validation categories. The new rotamer distributions are used for current rotamer validation in MolProbity and PHENIX, and for rotamer choice in PHENIX model-building and refinement. The multi-dimensional χ distributions and Top8000 reference dataset are freely available on GitHub. These rotamers are termed "ultimate" because data sampling and quality are now fully adequate for this task, and also because we believe the future of conformational validation should integrate side-chain with backbone criteria. Proteins 2016; 84:1177-1189. © 2016 Wiley Periodicals, Inc.

  15. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  16. Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study

    PubMed Central

    2012-01-01

    Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
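
    The kappa computation behind these concordance results can be sketched for the two-rater case (Cohen's kappa) with made-up 'Yes'/'Unclear'/'No' verdicts; the study's eight assessors would call for a multi-rater generalization such as Fleiss' kappa, but the agreement-beyond-chance logic is the same:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters judging the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the agreement expected by chance from the marginal counts.
    """
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical per-domain verdicts from two assessors
a = ["Yes", "Yes", "No", "Unclear", "Yes", "No"]
b = ["Yes", "No", "No", "Unclear", "Yes", "No"]
print(round(cohens_kappa(a, b), 3))  # -> 0.739
```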

  17. Institutional Effectiveness: A Model for Planning, Assessment & Validation.

    ERIC Educational Resources Information Center

    Truckee Meadows Community Coll., Sparks, NV.

    The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

  18. THE HYDRODYNAMICAL MODELS OF THE COMETARY COMPACT H ii REGION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Feng-Yao; Zhu, Qing-Feng; Li, Juan

    2015-10-10

    We have developed a full numerical method to study the gas dynamics of cometary ultracompact H ii regions, and associated photodissociation regions (PDRs). The bow-shock and champagne-flow models with a 40.9/21.9 M⊙ star are simulated. In the bow-shock models, the massive star is assumed to move through dense (n = 8000 cm⁻³) molecular material with a stellar velocity of 15 km s⁻¹. In the champagne-flow models, an exponential distribution of density with a scale height of 0.2 pc is assumed. The profiles of the [Ne ii] 12.81 μm and H₂ S(2) lines from the ionized regions and PDRs are compared for the two sets of models. In champagne-flow models, emission lines from the ionized gas clearly show the effect of acceleration along the direction toward the tail due to the density gradient. The kinematics of the molecular gas inside the dense shell are mainly due to the expansion of the H ii region. However, in bow-shock models the ionized gas mainly moves in the same direction as the stellar motion. The kinematics of the molecular gas inside the dense shell simply reflects the motion of the dense shell with respect to the star. These differences can be used to distinguish the two sets of models.

  19. (NTF) National Transonic Facility Test 213-SFW Flow Control II,

    NASA Image and Video Library

    2012-11-19

    (NTF) National Transonic Facility Test 213-SFW Flow Control II, Fast-MAC Model: The Fundamental Aerodynamics Subsonic Transonic-Modular Active Control (Fast-MAC) model was tested for the second time in the NTF. The objectives were to document the effects of Reynolds number on circulation control aerodynamics and to develop an open data set for CFD code validation. Image taken in building 1236, National Transonic Facility.

  20. Validating a Technology Enhanced Student-Centered Learning Model

    ERIC Educational Resources Information Center

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  1. Finite Element Model Development and Validation for Aircraft Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

    2000-01-01

    The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.
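
    A standard quantity for the mode-by-mode comparison described above (ubiquitous in experimental modal validation, though not named in the abstract) is the Modal Assurance Criterion, which scores the correlation between a predicted and a measured mode shape:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors.

    MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a)(phi_b . phi_b));
    1 means identical shapes up to scale, 0 means orthogonal shapes.
    """
    phi_a, phi_b = np.asarray(phi_a, float), np.asarray(phi_b, float)
    return (phi_a @ phi_b) ** 2 / ((phi_a @ phi_a) * (phi_b @ phi_b))

print(mac([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # same shape, rescaled -> 1.0
```

    In practice a MAC matrix is built between all FE and test modes; values near 1 on the diagonal support the pairing of predicted and measured modes.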

  2. Design, synthesis, pharmacological evaluation and in silico ADMET prediction of novel substituted benzimidazole derivatives as angiotensin II-AT1 receptor antagonists based on predictive 3D QSAR models.

    PubMed

    Vyas, V K; Gupta, N; Ghate, M; Patel, S

    2014-01-01

    In this study we designed novel substituted benzimidazole derivatives and predicted their absorption, distribution, metabolism, excretion and toxicity (ADMET) properties, based on a predictive 3D QSAR study on 132 substituted benzimidazoles as AngII-AT1 receptor antagonists. The two best predicted compounds were synthesized and evaluated for AngII-AT1 receptor antagonism. Three different alignment tools for comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were used. The best 3D QSAR models were obtained using the rigid body (Distill) alignment method. CoMFA and CoMSIA models were found to be statistically significant with leave-one-out correlation coefficients (q²) of 0.630 and 0.623, respectively, cross-validated coefficients (r²cv) of 0.651 and 0.630, respectively, and conventional coefficients of determination (r²) of 0.848 and 0.843, respectively. 3D QSAR models were validated using a test set of 24 compounds, giving satisfactory predictive correlation coefficients (r²pred) of 0.727 and 0.689 for the CoMFA and CoMSIA models, respectively. We have identified some key features of substituted benzimidazole derivatives, such as lipophilicity and H-bonding at the 2- and 5-positions of the benzimidazole nucleus, respectively, for AT1 receptor antagonistic activity. We designed 20 novel substituted benzimidazole derivatives and predicted their activity. In silico ADMET properties were also predicted for these designed molecules. Finally, the compounds with the best predicted activity were synthesized and evaluated for in vitro angiotensin II-AT1 receptor antagonism.
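
    The leave-one-out q² values quoted above measure how well a model predicts each compound when it is held out of the fit. A generic sketch for an ordinary linear model (the actual CoMFA/CoMSIA models are PLS regressions on 3D field descriptors, and the data below are made up):

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated q2 = 1 - PRESS / SS for a linear fit.

    PRESS sums the squared errors of predictions made with each sample
    held out; SS is the total sum of squares around the mean of y.
    """
    y = np.asarray(y, float)
    Xd = np.column_stack([np.ones(len(y)), X])  # add intercept column
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        beta, *_ = np.linalg.lstsq(Xd[mask], y[mask], rcond=None)
        press += (y[i] - Xd[i] @ beta) ** 2
    return 1.0 - press / ((y - y.mean()) ** 2).sum()

# One made-up descriptor vs. activity, nearly linear
print(round(loo_q2([[1.0], [2.0], [3.0], [4.0], [5.0]],
                   [2.1, 3.9, 6.2, 7.8, 10.1]), 3))
```

    Unlike the conventional r², q² penalizes overfitting: a model that merely memorizes the training compounds predicts held-out compounds poorly and its q² drops.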

  3. Athletes' Perceptions of Coaching Competency Scale II-High School Teams

    ERIC Educational Resources Information Center

    Myers, Nicholas D.; Chase, Melissa A.; Beauchamp, Mark R.; Jackson, Ben

    2010-01-01

    The purpose of this validity study was to improve measurement of athletes' evaluations of their head coach's coaching competency, an important multidimensional construct in models of coaching effectiveness. A revised version of the Coaching Competency Scale (CCS) was developed for athletes of high school teams (APCCS II-HST). Data were collected…

  4. Finite Element Model and Validation of Nasal Tip Deformation

    PubMed Central

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian JF

    2016-01-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach in understanding the mechanics and nuances of the nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 mm ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow. PMID:27633018

  5. Finite Element Model and Validation of Nasal Tip Deformation.

    PubMed

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach in understanding the mechanics and nuances of the nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.

  6. Validation and Trustworthiness of Multiscale Models of Cardiac Electrophysiology

    PubMed Central

    Pathmanathan, Pras; Gray, Richard A.

    2018-01-01

    Computational models of cardiac electrophysiology have a long history in basic science applications and device design and evaluation, but have significant potential for clinical applications in all areas of cardiovascular medicine, including functional imaging and mapping, drug safety evaluation, disease diagnosis, patient selection, and therapy optimisation or personalisation. For all stakeholders to be confident in model-based clinical decisions, cardiac electrophysiological (CEP) models must be demonstrated to be trustworthy and reliable. Credibility, that is, the belief in the predictive capability, of a computational model is primarily established by performing validation, in which model predictions are compared to experimental or clinical data. However, there are numerous challenges to performing validation for highly complex multi-scale physiological models such as CEP models. As a result, credibility of CEP model predictions is usually founded upon a wide range of distinct factors, including various types of validation results, underlying theory, evidence supporting model assumptions, evidence from model calibration, all at a variety of scales from ion channel to cell to organ. Consequently, it is often unclear, or a matter for debate, the extent to which a CEP model can be trusted for a given application. The aim of this article is to clarify potential rationale for the trustworthiness of CEP models by reviewing evidence that has been (or could be) presented to support their credibility. We specifically address the complexity and multi-scale nature of CEP models which makes traditional model evaluation difficult. In addition, we make explicit some of the credibility justification that we believe is implicitly embedded in the CEP modeling literature. Overall, we provide a fresh perspective to CEP model credibility, and build a depiction and categorisation of the wide-ranging body of credibility evidence for CEP models. This paper also represents a step

  7. Transfer matrix modeling and experimental validation of cellular porous material with resonant inclusions.

    PubMed

    Doutres, Olivier; Atalla, Noureddine; Osman, Haisam

    2015-06-01

Porous materials are widely used for improving sound absorption and sound transmission loss of vibrating structures. However, their efficiency is limited to medium and high frequencies of sound. A solution for improving their low frequency behavior while keeping an acceptable thickness is to embed resonant structures such as Helmholtz resonators (HRs). This work investigates the absorption and transmission acoustic performances of a cellular porous material with a two-dimensional periodic arrangement of HR inclusions. A low frequency model of a resonant periodic unit cell based on the parallel transfer matrix method is presented. The model is validated by comparison with impedance tube measurements and simulations based on both the finite element method and a homogenization based model. At the HR resonance frequency (i) the transmission loss is greatly improved and (ii) the sound absorption of the foam can be either decreased or improved depending on the HR tuning frequency and on the thickness and properties of the host foam. Finally, the diffuse field sound absorption and diffuse field sound transmission loss performance of a 2.6 m² resonant cellular material are measured. It is shown that the improvements observed at the Helmholtz resonant frequency on a single cell are confirmed at a larger scale.

  8. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    PubMed

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Detailed validation of the bidirectional effect in various Case I and Case II waters.

    PubMed

    Gleason, Arthur C R; Voss, Kenneth J; Gordon, Howard R; Twardowski, Michael; Sullivan, James; Trees, Charles; Weidemann, Alan; Berthon, Jean-François; Clark, Dennis; Lee, Zhong-Ping

    2012-03-26

    Simulated bidirectional reflectance distribution functions (BRDF) were compared with measurements made just beneath the water's surface. In Case I water, the set of simulations that varied the particle scattering phase function depending on chlorophyll concentration agreed more closely with the data than other models. In Case II water, however, the simulations using fixed phase functions agreed well with the data and were nearly indistinguishable from each other, on average. The results suggest that BRDF corrections in Case II water are feasible using single, average, particle scattering phase functions, but that the existing approach using variable particle scattering phase functions is still warranted in Case I water.

  10. Using the split Hopkinson pressure bar to validate material models.

    PubMed

    Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

    2014-08-28

    This paper gives a discussion of the use of the split-Hopkinson bar with particular reference to the requirements of materials modelling at QinetiQ. This is to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock wave loading. This means understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals comparison with an output stress v strain curve is not sufficient as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from deployed instrumentation including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen and recognize that this is itself also a valuable validation test of the material model. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  11. Construct validity of the ovine model in endoscopic sinus surgery training.

    PubMed

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores (P < .001). Experience of the intermediate group was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  12. Community-wide validation of geospace model local K-index predictions to support model transition to operations

    NASA Astrophysics Data System (ADS)

    Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.

    2016-07-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
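A minimal sketch of the contingency-table scoring this record describes, using the Heidke skill score alongside probability of detection and false alarm ratio. The tallies and the idea of thresholding on a K-index value are hypothetical illustrations, not results from the study.

```python
def skill_scores(hits, false_alarms, misses, correct_negatives):
    """Common contingency-table metrics used in forecast validation."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    pod = a / (a + c)                      # probability of detection
    far = b / (a + b)                      # false alarm ratio
    # Heidke skill score: accuracy relative to random chance.
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return {"POD": pod, "FAR": far, "HSS": hss}

# Hypothetical tallies: did the model predict a disturbed local K-index
# in the same intervals where one was observed?
scores = skill_scores(hits=20, false_alarms=5, misses=10, correct_negatives=65)
print({k: round(v, 3) for k, v in scores.items()})
```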

  13. Community-Wide Validation of Geospace Model Local K-Index Predictions to Support Model Transition to Operations

    NASA Technical Reports Server (NTRS)

Glocer, A.; Rastaetter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.

    2016-01-01

We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.

  14. Making Validated Educational Models Central in Preschool Standards.

    ERIC Educational Resources Information Center

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  15. Using DFT methodology for more reliable predictive models: Design of inhibitors of Golgi α-Mannosidase II.

    PubMed

    Bobovská, Adela; Tvaroška, Igor; Kóňa, Juraj

    2016-05-01

Human Golgi α-mannosidase II (GMII), a zinc ion co-factor dependent glycoside hydrolase (E.C.3.2.1.114), is a pharmaceutical target for the design of inhibitors with anti-cancer activity. The discovery of an effective inhibitor is complicated by the fact that all known potent inhibitors of GMII are involved in unwanted co-inhibition with lysosomal α-mannosidase (LMan, E.C.3.2.1.24), a relative to GMII. Routine empirical QSAR models for both GMII and LMan did not work with a required accuracy. Therefore, we have developed a fast computational protocol to build predictive models combining interaction energy descriptors from an empirical docking scoring function (Glide-Schrödinger), Linear Interaction Energy (LIE) method, and quantum mechanical density functional theory (QM-DFT) calculations. The QSAR models were built and validated with a library of structurally diverse GMII and LMan inhibitors and non-active compounds. A critical role of QM-DFT descriptors for the more accurate prediction abilities of the models is demonstrated. The predictive ability of the models was significantly improved when going from the empirical docking scoring function to mixed empirical-QM-DFT QSAR models (Q² = 0.78-0.86 when cross-validation procedures were carried out; and R² = 0.81-0.83 for a testing set). The average error for the predicted ΔGbind decreased to 0.8-1.1 kcal mol⁻¹. Also, 76-80% of non-active compounds were successfully filtered out from GMII and LMan inhibitors. The QSAR models with the fragmented QM-DFT descriptors may find a useful application in structure-based drug design where pure empirical and force field methods reached their limits and where quantum mechanics effects are critical for ligand-receptor interactions. The optimized models will apply in lead optimization processes for GMII drug developments. Copyright © 2016 Elsevier Inc. All rights reserved.
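The cross-validated Q² statistic quoted in this record can be illustrated with a leave-one-out sketch on a toy one-descriptor linear model, where Q² = 1 − PRESS / SS_tot. The descriptor values and activities below are invented for illustration; the paper's models use many descriptors and a docking/LIE/QM-DFT pipeline.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def q2_loo(xs, ys):
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_tot."""
    my = sum(ys) / len(ys)
    press = 0.0
    for i in range(len(xs)):
        # Refit with point i held out, then predict it.
        m, b = fit_line(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
        press += (ys[i] - (m * xs[i] + b)) ** 2
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - press / ss_tot

# Hypothetical descriptor (e.g. an interaction energy) vs. activity.
x = [-9.1, -8.4, -7.9, -7.2, -6.8, -6.1, -5.5]
y = [ 7.8,  7.1,  6.9,  6.2,  5.9,  5.1,  4.6]
print(round(q2_loo(x, y), 3))
```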

  16. Web Based Semi-automatic Scientific Validation of Models of the Corona and Inner Heliosphere

    NASA Astrophysics Data System (ADS)

    MacNeice, P. J.; Chulaki, A.; Taktakishvili, A.; Kuznetsova, M. M.

    2013-12-01

Validation is a critical step in preparing models of the corona and inner heliosphere for future roles supporting either or both the scientific research community and the operational space weather forecasting community. Validation of forecasting quality tends to focus on a short list of key features in the model solutions, with an unchanging order of priority. Scientific validation exposes a much larger range of physical processes and features, and as the models evolve to better represent features of interest, the research community tends to shift its focus to other areas which are less well understood and modeled. Given the more comprehensive and dynamic nature of scientific validation, and the limited resources available to the community to pursue this, it is imperative that the community establish a semi-automated process which engages the model developers directly into an ongoing and evolving validation process. In this presentation we describe the ongoing design and development of a web-based facility to enable this type of validation of models of the corona and inner heliosphere, applied to the growing list of model results being generated, and strategies we have been developing to account for model results that incorporate adaptively refined numerical grids.

  17. Enhanced Stability of the Fe(II)/Mn(II) State in a Synthetic Model of Heterobimetallic Cofactor Assembly.

    PubMed

    Kerber, William D; Goheen, Joshua T; Perez, Kaitlyn A; Siegler, Maxime A

    2016-01-19

    Heterobimetallic Mn/Fe cofactors are found in the R2 subunit of class Ic ribonucleotide reductases (R2c) and R2-like ligand binding oxidases (R2lox). Selective cofactor assembly is due at least in part to the thermodynamics of M(II) binding to the apoprotein. We report here equilibrium studies of Fe(II)/Mn(II) discrimination in the biomimetic model system H5(F-HXTA) (5-fluoro-2-hydroxy-1,3-xylene-α,α'-diamine-N,N,N',N'-tetraacetic acid). The homobimetallic F-HXTA complexes [Fe(H2O)6][1]2·14H2O and [Mn(H2O)6][2]2·14H2O (1 = [Fe(II)2(F-HXTA)(H2O)4](-); 2 = [Mn(II)2(F-HXTA)(H2O)4](-)) were characterized by single crystal X-ray diffraction. NMR data show that 1 retains its structure in solution (2 is NMR silent). Metal exchange is facile, and the heterobimetallic complex [Fe(II)Mn(II)(F-HXTA)(H2O)4](-) (3) is formed from mixtures of 1 and 2. (19)F NMR was used to quantify 1 and 3 in the presence of excess M(II)(aq) at various metal ratios, and equilibrium constants for Fe(II)/Mn(II) discrimination were calculated from these data. Fe(II) is preferred over Mn(II) with K1 = 182 ± 13 for complete replacement (2 ⇌ 1). This relatively modest preference is attributed to a hard-soft acid-base mismatch between the divalent cations and the polycarboxylate ligand. The stepwise constants for replacement are K2 = 20.1 ± 1.3 (2 ⇌ 3) and K3 = 9.1 ± 1.1 (3 ⇌ 1). K2 > K3 demonstrates enhanced stability of the heterobimetallic state beyond what is expected for simple Mn(II) → Fe(II) replacement. The relevance to Fe(II)/Mn(II) discrimination in R2c and R2lox proteins is discussed.

  18. Validation of the Sexual Assault Symptom Scale II (SASS II) Using a Panel Research Design

    ERIC Educational Resources Information Center

    Ruch, Libby O.; Wang, Chang-Hwai

    2006-01-01

    To examine the utility of a self-report scale of sexual assault trauma, 223 female victims were interviewed with the 43-item Sexual Assault Symptom Scale II (SASS II) at 1, 3, 7, 11, and 15 months postassault. Factor analyses using principal-components extraction with an oblimin rotation yielded 7 common factors with 31 items. The internal…

  19. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability and depended on the method of forecasting-static or dynamic. Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive
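The deviation between an expected epidemic curve and an observed one, as described in this record, is commonly summarized by an error metric on weekly intensity plus the offset in peak timing. A minimal sketch, with hypothetical weekly case counts (not the study's data):

```python
import math

def forecast_error(predicted, observed):
    """RMSE between weekly curves, plus peak-week offset in weeks
    (positive means the forecast peaks later than observed)."""
    rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))
    peak_shift = predicted.index(max(predicted)) - observed.index(max(observed))
    return rmse, peak_shift

# Hypothetical weekly laboratory-confirmed case counts.
observed  = [3, 8, 20, 45, 70, 55, 30, 12, 5]
predicted = [2, 10, 25, 50, 65, 60, 28, 10, 4]

rmse, shift = forecast_error(predicted, observed)
print(round(rmse, 2), shift)
```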

  20. Validation of the Yale-Brown Obsessive-Compulsive Severity Scale in African Americans with obsessive-compulsive disorder.

    PubMed

    Williams, Monnica T; Wetterneck, Chad T; Thibodeau, Michel A; Duque, Gerardo

    2013-09-30

    The Yale-Brown Obsessive Compulsive Scale (Y-BOCS) is widely used in the assessment of obsessive-compulsive disorder (OCD), but the psychometric properties of the instrument have not been examined in African Americans with OCD. Therefore, the purpose of this study is to explore the properties of the Y-BOCS severity scale in this population. Participants were 75 African American adults with a lifetime diagnosis of OCD. They completed the Y-BOCS, the Beck Anxiety Inventory (BAI), the Beck Depression Inventory-II (BDI-II), and the Multigroup Ethnic Identity Measure (MEIM). Evaluators rated OCD severity using the Clinical Global Impression Scale (CGI) and their global assessment of functioning (GAF). The Y-BOCS was significantly correlated with both the CGI and GAF, indicating convergent validity. It also demonstrated good internal consistency (α=0.83) and divergent validity when compared to the BAI and BDI-II. Confirmatory factor analyses tested five previously reported models and supported a three-factor solution, although no model exhibited excellent fit. An exploratory factor analysis was conducted, supporting a three-factor solution. A linear regression was conducted, predicting CGI from the three factors of the Y-BOCS and the MEIM, and the model was significant. The Y-BOCS appears to be a valid measure for African American populations. © 2013 Elsevier Ireland Ltd. All rights reserved.
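The internal-consistency figure quoted above (α = 0.83) is Cronbach's alpha, which can be computed directly from item-level scores. A minimal sketch, assuming hypothetical Likert responses rather than the study's Y-BOCS data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per item,
    one entry per respondent)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical responses: 4 items x 6 respondents.
items = [
    [3, 4, 2, 5, 4, 3],
    [2, 4, 2, 5, 3, 3],
    [3, 5, 1, 4, 4, 2],
    [2, 4, 2, 5, 4, 3],
]
print(round(cronbach_alpha(items), 3))
```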

  1. Forward ultrasonic model validation using wavefield imaging methods

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.

    2018-04-01

    The validation of forward ultrasonic wave propagation models in a complex titanium polycrystalline material system is accomplished using wavefield imaging methods. An innovative measurement approach is described that permits the visualization and quantitative evaluation of bulk elastic wave propagation and scattering behaviors in the titanium material for a typical focused immersion ultrasound measurement process. Results are provided for the determination and direct comparison of the ultrasonic beam's focal properties, mode-converted shear wave position and angle, and scattering and reflection from millimeter-sized microtexture regions (MTRs) within the titanium material. The approach and results are important with respect to understanding the root-cause backscatter signal responses generated in aerospace engine materials, where model-assisted methods are being used to understand the probabilistic nature of the backscatter signal content. Wavefield imaging methods are shown to be an effective means for corroborating and validating important forward model predictions in a direct manner using time- and spatially-resolved displacement field amplitude measurements.

  2. Available for the Apple II: FIRM: Florida InteRactive Modeler.

    ERIC Educational Resources Information Center

    Levy, C. Michael; And Others

    1983-01-01

    The Apple II microcomputer program described allows instructors with minimal programing experience to construct computer models of psychological phenomena for students to investigate. Use of these models eliminates need to maintain/house/breed animals or purchase sophisticated laboratory equipment. Several content models are also described,…

  3. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
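The Monte Carlo step described at the end of this record can be sketched as sampling uncertain inputs and propagating them through a model to build a distribution of indoor air concentration. Everything below is an illustrative placeholder: the indoor concentration is reduced to source concentration times an attenuation factor, and both lognormal distributions are invented, not site parameters.

```python
import random
import statistics

random.seed(42)

def indoor_concentration(c_source, alpha):
    """Toy model: indoor air concentration = source vapor
    concentration times an overall attenuation factor."""
    return c_source * alpha

# Illustrative uncertain inputs (ug/m3 and dimensionless).
samples = []
for _ in range(10_000):
    c_source = random.lognormvariate(mu=4.0, sigma=0.5)
    alpha = random.lognormvariate(mu=-7.0, sigma=1.0)
    samples.append(indoor_concentration(c_source, alpha))

samples.sort()
p95 = samples[int(0.95 * len(samples))]
print(f"median={statistics.median(samples):.4f}  p95={p95:.4f}")
```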

  4. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
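The record's core check, comparing implementation outputs to formal-model outputs "up to a given tolerance" over a suite of inputs, can be sketched in miniature. The distance/speed function below is a hypothetical stand-in for the PVS-animated models, not the system's actual algorithms:

```python
import math
import random

def formal_model(d, v):
    """Reference time-to-travel function standing in for the
    formally verified definition."""
    return d / v

def implementation(d, v):
    """'Fielded' version: algebraically equal, but its floating
    point rounding may differ from the reference."""
    return d * (1.0 / v)

def agree(f, g, cases, tol=1e-9):
    """Check |f(x) - g(x)| <= tol over every test case."""
    return all(math.isclose(f(*c), g(*c), abs_tol=tol) for c in cases)

random.seed(7)
cases = [(random.uniform(1, 500), random.uniform(60, 600))
         for _ in range(1000)]
print(agree(formal_model, implementation, cases))
```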

  5. Calibration of Predictor Models Using Multiple Validation Experiments

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitutes a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.
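A simple special case of an interval predictor model can be sketched as a least-squares center line widened by the smallest symmetric offset that encloses every observation. This is only an illustration of the "minimal spread containing all observations" idea on toy data; the paper's optimization-based formulations are more general.

```python
def interval_predictor(xs, ys):
    """Constant-width interval predictor: least-squares center line
    plus the smallest symmetric offset enclosing all observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - m * mx
    spread = max(abs(y - (m * x + b)) for x, y in zip(xs, ys))
    return lambda x: (m * x + b - spread, m * x + b + spread)

# Toy calibration data.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]
band = interval_predictor(xs, ys)
lo, hi = band(2.0)
print(round(lo, 3), round(hi, 3))
```

By construction every observation lies inside the band, so the interval's width is a direct, set-based statement of prediction uncertainty.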

  6. Validation of the measure automobile emissions model : a statistical analysis

    DOT National Transportation Integrated Search

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  7. Diet History Questionnaire II FAQs | EGRP/DCCPS/NCI/NIH

    Cancer.gov

    Answers to general questions about the Diet History Questionnaire II (DHQ II), as well as those related to DHQ II administration, validation, scanning, nutrient estimates, calculations, DHQ II modification, data quality, and more.

  8. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  9. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  10. Testing spectral models for stellar populations with star clusters - II. Results

    NASA Astrophysics Data System (ADS)

    González Delgado, Rosa M.; Cid Fernandes, Roberto

    2010-04-01

    High spectral resolution evolutionary synthesis models have become a routinely used ingredient in extragalactic work, and as such deserve thorough testing. Star clusters are ideal laboratories for such tests. This paper applies the spectral fitting methodology outlined in Paper I to a sample of clusters, mainly from the Magellanic Clouds and spanning a wide range in age and metallicity, fitting their integrated light spectra with a suite of modern evolutionary synthesis models for single stellar populations. The combinations of model plus spectral library employed in this investigation are Galaxev/STELIB, Vazdekis/MILES, SED@/GRANADA and Galaxev/MILES+GRANADA, which provide a representative sample of models currently available for spectral fitting work. A series of empirical tests are performed with these models, comparing the quality of the spectral fits and the values of age, metallicity and extinction obtained with each of them. A comparison is also made between the properties derived from these spectral fits and literature data on these nearby, well studied clusters. These comparisons are done with the general goal of providing useful feedback for model makers, as well as guidance to the users of such models. We find the following. (i) All models are able to derive ages that are in good agreement both with each other and with literature data, although ages derived from spectral fits are on average slightly older than those based on the S-colour-magnitude diagram (S-CMD) method as calibrated by Girardi et al. (ii) There is less agreement between the models for the metallicity and extinction. In particular, Galaxev/STELIB models underestimate the metallicity by ~0.6 dex, and the extinction is overestimated by 0.1 mag. (iii) New generations of models using the GRANADA and MILES libraries are superior to STELIB-based models both in terms of spectral fit quality and regarding the accuracy with which age and metallicity are retrieved. 
Accuracies of about 0.1 dex in

  11. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  12. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. The Space Weather Modeling Framework (SWMF): Models and Validation

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV

    In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SE), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics as well as some validation studies.

  14. CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment

    NASA Technical Reports Server (NTRS)

    Gaffney, Richard L., Jr.; Cutler, Andrew D.

    2005-01-01

    If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of a particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.

  15. A Validated Open-Source Multisolver Fourth-Generation Composite Femur Model.

    PubMed

    MacLeod, Alisdair R; Rose, Hannah; Gill, Harinderjit S

    2016-12-01

    Synthetic biomechanical test specimens are frequently used for preclinical evaluation of implant performance, often in combination with numerical modeling, such as finite-element (FE) analysis. Commercial and freely available FE packages are widely used with three FE packages in particular gaining popularity: abaqus (Dassault Systèmes, Johnston, RI), ansys (ANSYS, Inc., Canonsburg, PA), and febio (University of Utah, Salt Lake City, UT). To the best of our knowledge, no study has yet made a comparison of these three commonly used solvers. Additionally, despite the femur being the most extensively studied bone in the body, no freely available validated model exists. The primary aim of the study was to conduct a comparison of mesh convergence and strain prediction between the three solvers (abaqus, ansys, and febio) and to provide validated open-source models of a fourth-generation composite femur for use with all three FE packages. Second, we evaluated the geometric variability around the femoral neck region of the composite femurs. Experimental testing was conducted using fourth-generation Sawbones® composite femurs instrumented with strain gauges at four locations. A generic FE model and four specimen-specific FE models were created from CT scans. The study found that the three solvers produced excellent agreement, with strain predictions being within an average of 3.0% for all the solvers (r2 > 0.99) and 1.4% for the two commercial codes. The average of the root mean squared error against the experimental results was 134.5% (r2 = 0.29) for the generic model and 13.8% (r2 = 0.96) for the specimen-specific models. It was found that composite femurs had variations in cortical thickness around the neck of the femur of up to 48.4%. For the first time, an experimentally validated finite-element model of the femur is presented for use in three solvers. This model is freely available online along with all the supporting validation data.
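
    The solver comparison above is quantified with r² and percent root-mean-squared error between FE-predicted and gauge-measured strains. A minimal sketch of that style of comparison, on invented strain values rather than the study's data:

```python
import numpy as np

def agreement_metrics(predicted, measured):
    """r^2 and RMSE as a percent of the mean absolute measured value --
    the style of comparison used when judging FE strain predictions
    against strain-gauge readings (illustrative, not the paper's code)."""
    predicted = np.asarray(predicted, float)
    measured = np.asarray(measured, float)
    rmse = np.sqrt(np.mean((predicted - measured) ** 2))
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    pct_rmse = 100.0 * rmse / np.mean(np.abs(measured))
    return r2, pct_rmse

# hypothetical microstrain readings at four gauge locations
r2, pct = agreement_metrics([812.0, -455.0, 1020.0, -240.0],
                            [800.0, -470.0, 1000.0, -250.0])
```

    Perfect agreement yields r² = 1 and 0% error; a generic (non-specimen-specific) model typically scores far worse on both, as the abstract reports.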

  16. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    PubMed

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  17. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  18. KINEROS2/AGWA: Model use, calibration and validation

    USGS Publications Warehouse

    Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.

    2012-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
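
    The kinematic overland-flow equations that KINEROS couples to an infiltration model can be discretized with an explicit upwind finite-difference scheme. The sketch below solves dh/dt + d(alpha*h^m)/dx = r - f on a uniform plane with Manning friction; the geometry, parameter values, and rainfall/infiltration rates are illustrative assumptions, not KINEROS's internals:

```python
import numpy as np

def kinematic_wave(rain, infil, length=100.0, slope=0.01, n_man=0.05,
                   nx=50, dt=0.5, t_end=600.0):
    """Explicit upwind finite-difference kinematic-wave solver for
    overland flow depth h on a plane; rain and infil are rates in m/s."""
    dx = length / nx
    alpha = np.sqrt(slope) / n_man      # Manning: q = alpha * h**(5/3)
    m = 5.0 / 3.0
    h = np.zeros(nx)                    # flow depth (m) in each cell
    for _ in range(int(t_end / dt)):
        q = alpha * h ** m              # unit-width discharge (m^2/s)
        dq = np.diff(np.concatenate(([0.0], q)))   # upwind; no upstream inflow
        h = np.maximum(h - dt / dx * dq + dt * (rain - infil), 0.0)
    return h

# ~36 mm/h rainfall and ~7 mm/h infiltration, both expressed in m/s
depths = kinematic_wave(rain=1e-5, infil=2e-6)
```

    The time step must respect the kinematic-wave CFL condition (celerity alpha*m*h^(m-1) times dt/dx below one); the defaults above satisfy it comfortably for these rates.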

  19. Validation of elk resource selection models with spatially independent data

    Treesearch

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  20. Implementation and validation of a wake model for vortex-surface interactions in low speed forward flight

    NASA Technical Reports Server (NTRS)

    Komerath, Narayanan M.; Schreiber, Olivier A.

    1987-01-01

    The wake model was implemented using a VAX 750 and a Microvax II workstation. Online graphics capability was provided by a DISSPLA graphics package. The rotor model used by Beddoes was significantly extended to include azimuthal variations due to forward flight and a simplified scheme for locating critical points where vortex elements are placed. A test case was obtained for validation of the predictions of induced velocity. Comparison of the results indicates that the code requires some more features before satisfactory predictions can be made over the whole rotor disk. Specifically, shed vorticity due to the azimuthal variation of blade loading must be incorporated into the model. Interactions between vortices shed from the four blades of the model rotor must be included. The Scully code for calculating the velocity field is being modified in parallel with these efforts to enable comparison with experimental data. To date, some comparisons with flow visualization data obtained at Georgia Tech were performed and show good agreement for the isolated rotor case. Comparison of time-resolved velocity data obtained at Georgia Tech also shows good agreement. Modifications are being implemented to enable generation of time-averaged results for comparison with NASA data.
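
    Induced-velocity predictions of the kind validated above are typically assembled from the Biot-Savart law applied to straight vortex filament segments. A sketch of that standard kernel (the Katz-Plotkin segment formula with a simple core cutoff; an illustration of the general technique, not the code described in the abstract):

```python
import numpy as np

def segment_induced_velocity(p, a, b, gamma, core=1e-6):
    """Velocity induced at point p by a straight vortex filament from a
    to b carrying circulation gamma (Biot-Savart law). The core cutoff
    desingularizes evaluation points on or near the filament axis."""
    r1, r2 = p - a, p - b
    r0 = b - a
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < core ** 2:
        return np.zeros(3)
    k = gamma / (4.0 * np.pi * denom) * (
        np.dot(r0, r1) / np.linalg.norm(r1)
        - np.dot(r0, r2) / np.linalg.norm(r2))
    return k * cross
```

    Summing this kernel over all wake segments gives the induced velocity field; a very long segment recovers the 2-D result gamma/(2*pi*d) at distance d.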

  1. Neutrinoless double beta decay in type I+II seesaw models

    NASA Astrophysics Data System (ADS)

    Borah, Debasish; Dasgupta, Arnab

    2015-11-01

    We study neutrinoless double beta decay in left-right symmetric extension of the standard model with type I and type II seesaw origin of neutrino masses. Due to the enhanced gauge symmetry as well as extended scalar sector, there are several new physics sources of neutrinoless double beta decay in this model. Ignoring the left-right gauge boson mixing and heavy-light neutrino mixing, we first compute the contributions to neutrinoless double beta decay for type I and type II dominant seesaw separately and compare with the standard light neutrino contributions. We then repeat the exercise by considering the presence of both type I and type II seesaw, having non-negligible contributions to light neutrino masses and show the difference in results from individual seesaw cases. Assuming the new gauge bosons and scalars to be around a TeV, we constrain different parameters of the model including both heavy and light neutrino masses from the requirement of keeping the new physics contribution to neutrinoless double beta decay amplitude below the upper limit set by the GERDA experiment and also satisfying bounds from lepton flavor violation, cosmology and colliders.

  2. User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs

    Treesearch

    Joseph E. Horn; E. Lee Medema; Ervin G. Schuster

    1986-01-01

    CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....

  3. Assessment of perioperative mortality risk in patients with infective endocarditis undergoing cardiac surgery: performance of the EuroSCORE I and II logistic models.

    PubMed

    Madeira, Sérgio; Rodrigues, Ricardo; Tralhão, António; Santos, Miguel; Almeida, Carla; Marques, Marta; Ferreira, Jorge; Raposo, Luís; Neves, José; Mendes, Miguel

    2016-02-01

    The European System for Cardiac Operative Risk Evaluation (EuroSCORE) has been established as a tool for assisting decision-making in surgical patients and as a benchmark for quality assessment. Infective endocarditis often requires surgical treatment and is associated with high mortality. This study was undertaken to (i) validate both versions of the EuroSCORE, the older logistic EuroSCORE I and the recently developed EuroSCORE II, and to compare their performances; and (ii) identify predictors other than those included in the EuroSCORE models that might further improve their performance. We retrospectively studied 128 patients from a single-centre registry who underwent heart surgery for active infective endocarditis between January 2007 and November 2014. Binary logistic regression was used to find independent predictors of mortality and to create a new prediction model. Discrimination and calibration of models were assessed by receiver-operating characteristic curve analysis, calibration curves and the Hosmer-Lemeshow test. The observed perioperative mortality was 16.4% (n = 21). The median EuroSCORE I and EuroSCORE II were 13.9% [interquartile range (IQR), 7.0-35.0] and 6.6% (IQR, 3.5-18.2), respectively. Discriminative power was numerically higher for EuroSCORE II {area under the curve (AUC) of 0.83 [95% confidence interval (CI), 0.75-0.91]} than for EuroSCORE I [0.75 (95% CI, 0.66-0.85), P = 0.09]. The Hosmer-Lemeshow test showed good calibration for EuroSCORE II (P = 0.08) but not for EuroSCORE I (P = 0.04). EuroSCORE I tended to over-predict and EuroSCORE II to under-predict mortality. Among the variables known to be associated with greater infective endocarditis severity, only prosthetic valve infective endocarditis remained an independent predictor of mortality [odds ratio (OR) 6.6; 95% CI, 1.1-39.5; P = 0.04]. The new model including the EuroSCORE II variables and variables known to be associated with greater infective endocarditis severity showed an AUC of 0
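
    Discrimination and calibration of a risk score such as the EuroSCORE are commonly assessed with the area under the ROC curve and the Hosmer-Lemeshow statistic. A sketch of both computations on generic predicted probabilities (synthetic inputs, not the registry data from this study):

```python
import numpy as np
from scipy.stats import chi2, rankdata

def roc_auc(p, y):
    """Rank-based (Mann-Whitney) estimate of the area under the ROC curve:
    the probability that a random event outranks a random non-event."""
    p, y = np.asarray(p, float), np.asarray(y, int)
    ranks = rankdata(p)
    n1 = y.sum()
    n0 = y.size - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2.0) / (n0 * n1)

def hosmer_lemeshow(p, y, g=10):
    """Hosmer-Lemeshow chi-square over g risk groups; a large p-value
    indicates no evidence of miscalibration."""
    p, y = np.asarray(p, float), np.asarray(y, int)
    edges = np.quantile(p, np.linspace(0.0, 1.0, g + 1))
    idx = np.clip(np.searchsorted(edges[1:-1], p, side="right"), 0, g - 1)
    h = 0.0
    for j in range(g):
        m = idx == j
        if not m.any():
            continue
        nj, obs, exp = m.sum(), y[m].sum(), p[m].sum()
        h += (obs - exp) ** 2 / max(exp * (1.0 - exp / nj), 1e-12)
    return h, chi2.sf(h, g - 2)
```

    With predicted mortality probabilities `p` and observed outcomes `y`, `roc_auc(p, y)` measures discrimination and `hosmer_lemeshow(p, y)` measures calibration, mirroring the AUC and P-values reported above.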

  4. A validated approach for modeling collapse of steel structures

    NASA Astrophysics Data System (ADS)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  5. Assessing the reliability and validity of the Chinese Sexual Assault Symptom Scale (C-SASS): scale development and validation.

    PubMed

    Wang, Chang-Hwai; Lee, Jin-Chuan; Yuan, Yu-Hsi

    2014-01-01

    The purpose of this research is to establish and verify the psychometric and structural properties of the self-report Chinese Sexual Assault Symptom Scale (C-SASS) to assess the trauma experienced by Chinese victims of sexual assault. An earlier version of the C-SASS was constructed using a modified list of the same trauma symptoms administered to an American sample and used to develop and validate the Sexual Assault Symptom Scale II (SASS II). The rationale of this study is to revise the earlier version of the C-SASS, using a larger and more representative sample and more robust statistical analysis than in earlier research, to permit a more thorough examination of the instrument and further confirm the dimensions of sexual assault trauma in Chinese victims of rape. In this study, a sample of 418 victims from northern Taiwan was collected to confirm the reliability and validity of the C-SASS. Exploratory factor analysis yielded five common factors: Safety Fears, Self-Blame, Health Fears, Anger and Emotional Lability, and Fears About the Criminal Justice System. Further tests of the validity and composite reliability of the C-SASS were provided by the structural equation modeling (SEM). The results indicated that the C-SASS was a brief, valid, and reliable instrument for assessing sexual assault trauma among Chinese victims in Taiwan. The scale can be used to evaluate victims in sexual assault treatment centers around Taiwan, as well as to capture the characteristics of sexual assault trauma among Chinese victims.

  6. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  7. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  8. A Historical Forcing Ice Sheet Model Validation Framework for Greenland

    NASA Astrophysics Data System (ADS)

    Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.

    2014-12-01

    We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990's. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.

  9. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  10. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  11. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    DTIC Science & Technology

    2015-12-01

    Dissertation, December 2015, by Sang M. Sok (distribution is unlimited). The improved conceptual models methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for both face and

  12. Understanding variability of the Southern Ocean overturning circulation in CORE-II models

    NASA Astrophysics Data System (ADS)

    Downes, S. M.; Spence, P.; Hogg, A. M.

    2018-03-01

    The current generation of climate models exhibit a large spread in the steady-state and projected Southern Ocean upper and lower overturning circulation, with mechanisms for deep ocean variability remaining less well understood. Here, common Southern Ocean metrics in twelve models from the Coordinated Ocean-ice Reference Experiment Phase II (CORE-II) are assessed over a 60 year period. Specifically, stratification, surface buoyancy fluxes, and eddies are linked to the magnitude of the strengthening trend in the upper overturning circulation, and a decreasing trend in the lower overturning circulation across the CORE-II models. The models evolve similarly in the upper 1 km and the deep ocean, with an almost equivalent poleward intensification trend in the Southern Hemisphere westerly winds. However, the models differ substantially in their eddy parameterisation and surface buoyancy fluxes. In general, models with a larger heat-driven water mass transformation where deep waters upwell at the surface ( ∼ 55°S) transport warmer waters into intermediate depths, thus weakening the stratification in the upper 2 km. Models with a weak eddy induced overturning and a warm bias in the intermediate waters are more likely to exhibit larger increases in the upper overturning circulation, and more significant weakening of the lower overturning circulation. We find the opposite holds for a cool model bias in intermediate depths, combined with a more complex 3D eddy parameterisation that acts to reduce isopycnal slope. In summary, the Southern Ocean overturning circulation decadal trends in the coarse resolution CORE-II models are governed by biases in surface buoyancy fluxes and the ocean density field, and the configuration of the eddy parameterisation.

  13. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    PubMed Central

    Lafave, Mark R.; Butterwick, Dale; Eubank, Breda

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events, including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury to the athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the athletic therapy (AT) profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897

  14. Validation of cell-based fluorescence assays: practice guidelines from the ICSH and ICCS - part II - preanalytical issues.

    PubMed

    Davis, Bruce H; Dasgupta, Amar; Kussick, Steven; Han, Jin-Yeong; Estrellado, Annalee

    2013-01-01

    As a matter of good laboratory practice, flow cytometry and other cell-based fluorescence assay technologies require validation of all assays, which in clinical practice may pass through regulatory review processes using criteria often defined with a soluble analyte in plasma or serum samples in mind. Recently the U.S. Food and Drug Administration (FDA) has entered into a public dialogue regarding its regulatory interest in laboratory developed tests (LDTs), the so-called "home brew" assays performed in clinical laboratories. The absence of well-defined guidelines for validation of cell-based assays using fluorescence detection has thus become a subject of concern for the International Council for Standardization of Haematology (ICSH) and the International Clinical Cytometry Society (ICCS). Accordingly, a group of over 40 international experts in the areas of test development, test validation, and clinical practice of a variety of assay types using flow cytometry and/or morphologic image analysis were invited to develop a set of practical guidelines useful to in vitro diagnostic (IVD) innovators, clinical laboratories, regulatory scientists, and laboratory inspectors. The focus of the group was restricted to fluorescence reporter reagents, although some common principles are shared by immunohistochemistry or immunocytochemistry techniques and noted where appropriate. The work product of this two-year effort is the content of this special issue of this journal, which is published as 5 separate articles, this being Validation of Cell-based Fluorescence Assays: Practice Guidelines from the ICSH and ICCS - Part II - Preanalytical issues. © 2013 International Clinical Cytometry Society.

  15. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  16. I-15 San Diego, California, model validation and calibration report.

    DOT National Transportation Integrated Search

    2010-02-01

    The Integrated Corridor Management (ICM) initiative requires the calibration and validation of simulation models used in the Analysis, Modeling, and Simulation of Pioneer Site proposed integrated corridors. This report summarizes the results and proc...

  17. Phase II Testing of Liquid Cooling Garments Using a Sweating Manikin, Controlled by a Human Physiological Model

    NASA Technical Reports Server (NTRS)

    Paul, Heather; Trevino, Luis; Bue, Grant; Rugh, John

    2006-01-01

    An Advanced Automotive Manikin (ADAM) developed at the National Renewable Energy Laboratory (NREL) is used to evaluate NASA's liquid cooling garments (LCGs) used in advanced space suits for extravehicular applications. The manikin has 120 separate heated/sweating zones and is controlled by a finite element physiological model of the human thermoregulatory system. Previous testing showed the thermal sensation and comfort followed the expected trends as the LCG inlet fluid temperature was changed. The Phase II test data demonstrates the repeatability of ADAM by retesting the baseline LCG. Skin and core temperature predictions using ADAM in an LCG/Arctic suit combination are compared to NASA physiological data to validate the manikin/model. Additional LCG configurations are assessed using the manikin and compared to the baseline LCG. Results can extend to other personal protective clothing, including HAZMAT suits, nuclear/biological/chemical protective suits, and fire protection suits.

  18. Animal models of binge drinking, current challenges to improve face validity.

    PubMed

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is an increasing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD was conducted mostly using forced administration of alcohol, but more recent studies have used scheduled access to alcohol to model more voluntary excessive intake and to achieve signs of intoxication that mimic human behavior. The main challenges for future research are discussed regarding the need for good face validity, construct validity and predictive validity of animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

  20. Elasto-dynamic analysis of a gear pump-Part III: Experimental validation procedure and model extension to helical gears

    NASA Astrophysics Data System (ADS)

    Mucchi, E.; Dalpiaz, G.

    2015-01-01

    This work concerns external gear pumps for automotive applications, which operate at high speed and low pressure. In the authors' previous works (Parts I and II [1,2]), a non-linear lumped-parameter kineto-elastodynamic model for predicting the dynamic behaviour of external gear pumps was presented. It takes into account the most important phenomena involved in the operation of this kind of machine. The two main sources of noise and vibration are considered: pressure pulsation and gear meshing. The model has been used to foresee the influence of working conditions and design modifications on vibration generation. Experimental validation of the model is a difficult task. Thus, Part III proposes a novel methodology for validation, carried out by comparing simulations and experimental results concerning forces and moments: it deals with the external and inertial components acting on the gears, estimated by the model, and the reactions and inertial components on the pump casing and the test plate, obtained by measurements. The validation is carried out by comparing the level of the time-synchronous average in the time domain and the waterfall maps in the frequency domain, with particular attention to identifying system resonances. Globally, the validation results are satisfactory, but discrepancies are still present. Moreover, the assessed model has been modified for application to a new virtual pump prototype with helical gears in order to foresee gear accelerations and dynamic forces. Part IV focuses on improvements in the modelling and analysis of the phenomena bound to the pressure evolution around the gears in order to achieve results closer to the measured values. As a matter of fact, the simulation results have shown that a variable meshing stiffness has a notable contribution to the dynamic behaviour of the pump, but this is not as important as the pressure phenomena. As a consequence, the original model was modified with the

  1. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamison, Ryan Dale; Buchheit, Thomas E.; Emery, John M

    Sealing glasses are ubiquitous in high pressure and temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of material models are shown. These model predictions are compared to measured data. Validity of the finite-element predictions is discussed. It will be shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  2. A structural model for apolipoprotein C-II amyloid fibrils: experimental characterization and molecular dynamics simulations.

    PubMed

    Teoh, Chai Lean; Pham, Chi L L; Todorova, Nevena; Hung, Andrew; Lincoln, Craig N; Lees, Emma; Lam, Yuen Han; Binger, Katrina J; Thomson, Neil H; Radford, Sheena E; Smith, Trevor A; Müller, Shirley A; Engel, Andreas; Griffin, Michael D W; Yarovsky, Irene; Gooley, Paul R; Howlett, Geoffrey J

    2011-02-04

    The self-assembly of specific proteins to form insoluble amyloid fibrils is a characteristic feature of a number of age-related and debilitating diseases. Lipid-free human apolipoprotein C-II (apoC-II) forms characteristic amyloid fibrils and is one of several apolipoproteins that accumulate in amyloid deposits located within atherosclerotic plaques. X-ray diffraction analysis of aligned apoC-II fibrils indicated a simple cross-β-structure composed of two parallel β-sheets. Examination of apoC-II fibrils using transmission electron microscopy, scanning transmission electron microscopy, and atomic force microscopy indicated that the fibrils are flat ribbons composed of one apoC-II molecule per 4.7-Å rise of the cross-β-structure. Cross-linking results using single-cysteine substitution mutants are consistent with a parallel in-register structural model for apoC-II fibrils. Fluorescence resonance energy transfer analysis of apoC-II fibrils labeled with specific fluorophores provided distance constraints for selected donor-acceptor pairs located within the fibrils. These findings were used to develop a simple 'letter-G-like' β-strand-loop-β-strand model for apoC-II fibrils. Fully solvated all-atom molecular dynamics (MD) simulations showed that the model contained a stable cross-β-core with a flexible connecting loop devoid of persistent secondary structure. The time course of the MD simulations revealed that charge clusters in the fibril rearrange to minimize the effects of same-charge interactions inherent in parallel in-register models. Our structural model for apoC-II fibrils suggests that apoC-II monomers fold and self-assemble to form a stable cross-β-scaffold containing relatively unstructured connecting loops. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Neuro-evolutionary computing paradigm for Painlevé equation-II in nonlinear optics

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Ahmad, Sufyan; Awais, Muhammad; Ul Islam Ahmad, Siraj; Asif Zahoor Raja, Muhammad

    2018-05-01

    The aim of this study is to investigate the numerical treatment of the Painlevé equation-II arising in physical models of nonlinear optics through artificial intelligence procedures by incorporating a single layer structure of neural networks optimized with genetic algorithms, sequential quadratic programming and active set techniques. We constructed a mathematical model for the nonlinear Painlevé equation-II with the help of networks by defining an error-based cost function in mean square sense. The performance of the proposed technique is validated through statistical analyses by means of the one-way ANOVA test conducted on a dataset generated by a large number of independent runs.
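
    The error-based cost function in the mean-square sense described above can be sketched without the network machinery: Painlevé II reads u''(x) = 2u(x)^3 + x·u(x) + α, and the cost is the mean squared ODE residual over collocation points. The cubic trial solution, grid, and α below are illustrative placeholders, not the paper's neural-network ansatz or its optimizers.

```python
# Hedged sketch: mean-square residual cost for Painleve II,
# u''(x) = 2*u(x)**3 + x*u(x) + alpha, on a collocation grid.

def cost(params, xs, alpha=1.0, h=1e-4):
    """Mean squared ODE residual for a cubic trial solution
    u(x) = a + b*x + c*x**2 + d*x**3 (a stand-in for the network)."""
    a, b, c, d = params
    u = lambda x: a + b * x + c * x ** 2 + d * x ** 3
    total = 0.0
    for x in xs:
        u_xx = (u(x + h) - 2 * u(x) + u(x - h)) / h ** 2  # central difference
        residual = u_xx - (2 * u(x) ** 3 + x * u(x) + alpha)
        total += residual ** 2
    return total / len(xs)

xs = [i / 10 for i in range(11)]       # collocation points on [0, 1]
print(cost((0.0, 0.0, 0.0, 0.0), xs))  # zero trial solution: residual is -alpha everywhere, cost 1.0
```

    In the paper's setup, an optimizer (genetic algorithm, SQP, or active set) would minimize this cost over the trial-solution parameters.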

  4. External validation of EPIWIN biodegradation models.

    PubMed

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation, and an expert survey model for estimating primary and ultimate biodegradation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, but readily biodegradable substances much less reliably. In view of the high environmental concern over persistent chemicals, and given that not-readily biodegradable chemicals greatly outnumber readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for not-readily biodegradability. However, the highest score for overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6.
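
    The combination strategy described above, accepting "readily biodegradable" only when multiple models agree so as to minimize false positives, can be sketched as follows. This is an illustrative toy example: the data and the boolean agreement rule are placeholders, not the actual BIOWIN probability outputs or pass levels.

```python
# Hedged sketch: combining two binary biodegradability classifiers.
# 'Positive' means predicted readily biodegradable; a conservative
# combination trades a few extra false negatives for fewer false
# positives, as preferred when screening for persistent chemicals.

def confusion(y_true, y_pred):
    """Return (tp, fp, tn, fn) counts for boolean labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if not t and p)
    tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t and not p)
    return tp, fp, tn, fn

def combined(pred_a, pred_b):
    """Call a substance readily biodegradable only if both agree."""
    return [a and b for a, b in zip(pred_a, pred_b)]

# Toy data: True = readily biodegradable in the experimental test.
y_true  = [True, False, False, True, False, False]
model_a = [True, True,  False, True, False, True]
model_b = [True, False, False, True, False, True]

print(confusion(y_true, model_a))                     # → (2, 2, 2, 0)
print(confusion(y_true, combined(model_a, model_b)))  # → (2, 1, 3, 0)
```

    In this toy data the combined rule produces one false positive versus two for model A alone, without adding false negatives.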

  5. Assessing the Incremental Value of KABC-II Luria Model Scores in Predicting Achievement: What Do They Tell Us beyond the MPI?

    ERIC Educational Resources Information Center

    McGill, Ryan J.; Spurgin, Angelia R.

    2016-01-01

    The current study examined the incremental validity of the Luria interpretive scheme for the Kaufman Assessment Battery for Children-Second Edition (KABC-II) for predicting scores on the Kaufman Test of Educational Achievement-Second Edition (KTEA-II). All participants were children and adolescents (N = 2,025) drawn from the nationally…

  6. Ventilation tube insertion simulation: a literature review and validity assessment of five training models.

    PubMed

    Mahalingam, S; Awad, Z; Tolley, N S; Khemani, S

    2016-08-01

    The objective of this study was to identify and investigate the face and content validity of ventilation tube insertion (VTI) training models described in the literature. A review of the literature was carried out to identify articles describing VTI simulators. Feasible models were replicated and assessed by a group of experts in a postgraduate simulation centre. Experts were defined as surgeons who had performed at least 100 VTIs on patients. Seventeen experts participated, ensuring sufficient statistical power for analysis. A standardised 18-item Likert-scale questionnaire was used. This addressed face validity (realism), global and task-specific content validity (suitability of the model for teaching) and curriculum recommendation. The search revealed eleven models, of which only five had associated validity data. Five models were found to be feasible to replicate. None of the tested models achieved face or global content validity. Only one model achieved task-specific validity, and hence there was no agreement on curriculum recommendation. The quality of simulation models is moderate and there is room for improvement. There is a need for new models to be developed or existing ones to be refined in order to construct a more realistic training platform for VTI simulation. © 2015 John Wiley & Sons Ltd.

  7. First Principles Modeling of RFQ Cooling System and Resonant Frequency Responses for Fermilab’s PIP-II Injector Test

    DOE PAGES

    Edelen, J. P.; Edelen, A. L.; Bowring, D.; ...

    2016-12-23

    In this study we develop an a priori method for simulating dynamic resonant frequency and temperature responses in a radio frequency quadrupole (RFQ) and its associated water-based cooling system, respectively. Our model provides a computationally efficient means to evaluate the transient response of the RFQ over a large range of system parameters. The model was constructed prior to the delivery of the PIP-II Injector Test RFQ and was used to aid in the design of the water-based cooling system, data acquisition system, and resonance control system. Now that the model has been validated with experimental data, it can confidently be used to aid in the design of future RFQ resonance controllers and their associated water-based cooling systems. Finally, without any empirical fitting, it has demonstrated the ability to predict absolute temperature and frequency changes to 11% accuracy on average, and relative changes to 7% accuracy.

  8. A Validity Agenda for Growth Models: One Size Doesn't Fit All!

    ERIC Educational Resources Information Center

    Patelis, Thanos

    2012-01-01

    This is a keynote presentation given at AERA on developing a validity agenda for growth models in a large scale (e.g., state) setting. The emphasis of this presentation was to indicate that growth models and the validity agenda designed to provide evidence in supporting the claims to be made need to be personalized to meet the local or…

  9. Objective validation of central sensitization in the rat UVB and heat rekindling model

    PubMed Central

    Weerasinghe, NS; Lumb, BM; Apps, R; Koutsikou, S; Murrell, JC

    2014-01-01

    Background The UVB and heat rekindling (UVB/HR) model shows potential as a translatable inflammatory pain model. However, the occurrence of central sensitization in this model, a fundamental mechanism underlying chronic pain, has been debated. Face, construct and predictive validity are key requisites of animal models; electromyogram (EMG) recordings were utilized to objectively demonstrate validity of the rat UVB/HR model. Methods The UVB/HR model was induced on the heel of the hind paw under anaesthesia. Mechanical withdrawal thresholds (MWTs) were obtained from biceps femoris EMG responses to a gradually increasing pinch at the mid hind paw region under alfaxalone anaesthesia, 96 h after UVB irradiation. MWT was compared between UVB/HR and SHAM-treated rats (anaesthetic only). Underlying central mechanisms in the model were pharmacologically validated by MWT measurement following intrathecal N-methyl-d-aspartate (NMDA) receptor antagonist, MK-801, or saline. Results Secondary hyperalgesia was confirmed by a significantly lower pre-drug MWT {mean [±standard error of the mean (SEM)]} in UVB/HR [56.3 (±2.1) g/mm2, n = 15] compared with SHAM-treated rats [69.3 (±2.9) g/mm2, n = 8], confirming face validity of the model. Predictive validity was demonstrated by the attenuation of secondary hyperalgesia by MK-801, where mean (±SEM) MWT was significantly higher [77.2 (±5.9) g/mm2, n = 7] in comparison with pre-drug [57.8 (±3.5) g/mm2, n = 7] and saline [57.0 (±3.2) g/mm2, n = 8] at peak drug effect. The occurrence of central sensitization confirmed construct validity of the UVB/HR model. Conclusions This study used objective outcome measures of secondary hyperalgesia to validate the rat UVB/HR model as a translational model of inflammatory pain. What's already known about this topic? Most current animal chronic pain models lack translatability to human subjects. Primary hyperalgesia is an established feature of the UVB/heat rekindling

  10. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    PubMed

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with protease inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions on treatment variability and the pattern of cholesterol reduction over time. The last recorded cholesterol level, the difference from baseline, the average difference from baseline, and the level evolution are the considered endpoints. Specific validation criteria, based on a standardized distance in means and variances of plus or minus 10%, were used to compare the real and the simulated data. The validity criterion was met by all models for individual endpoints. However, only two models met the validity criterion when all endpoints were considered together. The model based on the assumption that within-subject variability of cholesterol levels changes over time is the one that minimizes the validity criterion (standardized distance of plus or minus 1% or less). Simulation is a useful technique for the calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
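
    A criterion of the kind described above can be sketched as follows; the exact standardization used by the authors is an assumption here, taken as the relative distance between observed and simulated means and variances, and the data are illustrative.

```python
# Hedged sketch of a standardized mean/variance validity criterion:
# simulated output passes when its mean and variance each fall
# within a tolerance (here 10%) of the observed values.
from statistics import mean, pvariance

def standardized_distance(observed, simulated):
    """Relative distance of the simulated mean and variance from
    the observed ones, as fractions (0.05 = 5%)."""
    d_mean = abs(mean(simulated) - mean(observed)) / abs(mean(observed))
    d_var = abs(pvariance(simulated) - pvariance(observed)) / pvariance(observed)
    return d_mean, d_var

def is_valid(observed, simulated, tol=0.10):
    d_mean, d_var = standardized_distance(observed, simulated)
    return d_mean <= tol and d_var <= tol

observed  = [5.2, 5.0, 4.8, 5.1, 4.9]   # e.g. observed cholesterol levels
simulated = [5.1, 5.0, 4.9, 5.2, 4.8]   # model output for the same endpoint

print(is_valid(observed, simulated))  # → True
```

    Tightening `tol` to 0.01 would reproduce the stricter plus-or-minus 1% criterion mentioned for the best-performing model.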

  11. Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict Solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line of sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.
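
    Forecast skill of the kind reported above is commonly summarized as a skill score against a reference forecast; whether the WSA validation uses exactly this definition is not stated in the abstract, so the mean-square-error form below is an assumption, with illustrative numbers.

```python
# Hedged sketch: MSE skill score relative to a reference forecast
# (e.g. a climatological mean). 1 is a perfect forecast, 0 matches
# the reference, and negative values are worse than the reference.

def skill_score(obs, pred, ref):
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(obs, pred) / mse(obs, ref)

# Illustrative solar wind speeds (km/s): observations, model
# predictions, and a constant 500 km/s climatological reference.
obs  = [400.0, 600.0, 500.0]
pred = [420.0, 580.0, 510.0]
ref  = [500.0] * len(obs)

print(round(skill_score(obs, pred, ref), 3))  # → 0.955
```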

  12. Angiotensin II, hypertension and angiotensin II receptor antagonism: Roles in the behavioural and brain pathology of a mouse model of Alzheimer's disease.

    PubMed

    Wiesmann, Maximilian; Roelofs, Monica; van der Lugt, Robert; Heerschap, Arend; Kiliaan, Amanda J; Claassen, Jurgen Ahr

    2017-07-01

    Elevated angiotensin II causes hypertension and contributes to Alzheimer's disease by affecting cerebral blood flow. Angiotensin II receptor blockers may provide candidates to reduce (vascular) risk factors for Alzheimer's disease. We studied the effects of two months of angiotensin II-induced hypertension on systolic blood pressure, and of treatment with the angiotensin II receptor blocker eprosartan mesylate after one month of induced hypertension, in wild-type C57bl/6j and AβPPswe/PS1ΔE9 (AβPP/PS1/Alzheimer's disease) mice. AβPP/PS1 mice showed higher systolic blood pressure than wild-type. Subsequent eprosartan mesylate treatment restored this elevated systolic blood pressure in all mice. Functional connectivity was decreased in angiotensin II-infused Alzheimer's disease and wild-type mice, and only 12-month-old Alzheimer's disease mice showed impaired cerebral blood flow. Only angiotensin II-infused Alzheimer's disease mice exhibited decreased spatial learning in the Morris water maze. Altogether, angiotensin II-induced hypertension exacerbated Alzheimer's disease-like pathological changes such as impairment of cerebral blood flow, functional connectivity, and cognition only in Alzheimer's disease model mice, but it also induced decreased functional connectivity in wild-type mice. However, we could not detect hypertension-induced overexpression of Aβ nor increased neuroinflammation. Our findings suggest a link between midlife hypertension and decreased cerebral hemodynamics and connectivity in an Alzheimer's disease mouse model. Eprosartan mesylate treatment restored and beneficially affected cerebral blood flow and connectivity. This model could be used to investigate prevention/treatment strategies in early Alzheimer's disease.

  13. Development and validation of a cost-utility model for Type 1 diabetes mellitus.

    PubMed

    Wolowacz, S; Pearson, I; Shannon, P; Chubb, B; Gundgaard, J; Davies, M; Briggs, A

    2015-08-01

    To develop a health economic model to evaluate the cost-effectiveness of new interventions for Type 1 diabetes mellitus by their effects on long-term complications (measured through mean HbA1c ) while capturing the impact of treatment on hypoglycaemic events. Through a systematic review, we identified complications associated with Type 1 diabetes mellitus and data describing the long-term incidence of these complications. An individual patient simulation model was developed and included the following complications: cardiovascular disease, peripheral neuropathy, microalbuminuria, end-stage renal disease, proliferative retinopathy, ketoacidosis, cataract, hypoglycemia and adverse birth outcomes. Risk equations were developed from published cumulative incidence data and hazard ratios for the effect of HbA1c , age and duration of diabetes. We validated the model by comparing model predictions with observed outcomes from studies used to build the model (internal validation) and from other published data (external validation). We performed illustrative analyses for typical patient cohorts and a hypothetical intervention. Model predictions were within 2% of expected values in the internal validation and within 8% of observed values in the external validation (percentages represent absolute differences in the cumulative incidence). The model utilized high-quality, recent data specific to people with Type 1 diabetes mellitus. In the model validation, results deviated less than 8% from expected values. © 2014 Research Triangle Institute d/b/a RTI Health Solutions. Diabetic Medicine © 2014 Diabetes UK.

  14. Towards Automatic Validation and Healing of CityGML Models for Geometric and Semantic Consistency

    NASA Astrophysics Data System (ADS)

    Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.

    2013-09-01

    A steadily growing number of application fields for large 3D city models have emerged in recent years. As in many other domains, data quality is recognized as a key factor for successful business, and quality management is mandatory in the production chain nowadays. Automated domain-specific tools are widely used for validation of business-critical data, but common standards defining correct geometric modeling are still not precise enough to provide a sound basis for data validation of 3D city models. Although the workflow for 3D city models is well established from data acquisition to processing, analysis and visualization, quality management is not yet a standard part of this workflow. Processing data sets with unclear specification leads to erroneous results and application defects; we show that this problem persists even if data are standard compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.

  15. Modern modeling techniques had limited external validity in predicting mortality from traumatic brain injury.

    PubMed

    van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W

    2016-10-01

    Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs), and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by the RF and SVM models (median validated AUC values, 0.735 and 0.732, respectively). With each predictor set, the classification and regression trees models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
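The study's external-validation design (develop a model on one data set, validate it on each remaining set, summarize by median AUC) can be sketched on synthetic data. The five toy "studies", the effect sizes, and the minimal gradient-descent logistic fit below are stand-ins, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Minimal gradient-descent logistic regression (a stand-in for the fitted models)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        prob = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (prob - y) / len(y)
    return w

# Five synthetic "studies" sharing one underlying risk structure.
studies = []
for _ in range(5):
    X = rng.normal(size=(200, 3))
    risk = 1.0 / (1.0 + np.exp(-(1.0 * X[:, 0] - 0.5 * X[:, 1])))
    studies.append((X, (rng.random(200) < risk).astype(float)))

# Develop on one study, validate on every remaining study, summarize by median AUC.
aucs = []
for i, (Xi, yi) in enumerate(studies):
    w = fit_logistic(Xi, yi)
    aucs += [auc(Xj @ w, yj) for j, (Xj, yj) in enumerate(studies) if j != i]
print(round(float(np.median(aucs)), 3))
```

With 15 studies instead of 5, the loop yields the paper's 15 × 14 = 210 validations per technique.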

  16. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. Because of the uncertainty in some of the system parameters, the calibration is performed with measured data in order to bring the behavior of the collector model close to that of the real collector. The calibrated model is then validated by comparing its results with those obtained during real operation of a collector at the Plataforma Solar de Almería (PSA).

  17. Pecan nutshell as biosorbent to remove Cu(II), Mn(II) and Pb(II) from aqueous solutions.

    PubMed

    Vaghetti, Julio C P; Lima, Eder C; Royer, Betina; da Cunha, Bruna M; Cardoso, Natali F; Brasil, Jorge L; Dias, Silvio L P

    2009-02-15

    In the present study we report for the first time the feasibility of pecan nutshell (PNS, Carya illinoensis) as an alternative biosorbent to remove Cu(II), Mn(II) and Pb(II) metallic ions from aqueous solutions. The ability of PNS to remove the metallic ions was investigated using a batch biosorption procedure. The effects of pH and biosorbent dosage on the adsorption capacities of PNS were studied. Four kinetic models were tested, with the adsorption kinetics best fitted by a fractionary-order kinetic model. In addition, the kinetic data were also fitted to an intra-particle diffusion model, presenting three linear regions, indicating that the kinetics of adsorption should follow multiple sorption rates. The equilibrium data were fitted to the Langmuir, Freundlich, Sips and Redlich-Peterson isotherm models. Taking into account a statistical error function, the data were best fitted by the Sips isotherm model. The maximum biosorption capacities of PNS were 1.35, 1.78 and 0.946 mmol g(-1) for Cu(II), Mn(II) and Pb(II), respectively.
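The isotherm comparison described above (ranking the two-parameter Langmuir model against the three-parameter Sips model by a statistical error function) can be illustrated as follows. The synthetic equilibrium data and parameter values are assumptions, loosely echoing the reported ~1.35 mmol/g Cu(II) capacity:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, qmax, KL):
    return qmax * KL * C / (1 + KL * C)

def sips(C, qmax, KS, n):
    return qmax * (KS * C) ** n / (1 + (KS * C) ** n)

# Synthetic equilibrium data shaped like a Sips isotherm (illustrative values only).
C = np.linspace(0.05, 5.0, 25)                      # equilibrium concentration, mmol/L
rng = np.random.default_rng(1)
q_obs = sips(C, 1.35, 1.2, 0.7) + rng.normal(0, 0.01, C.size)

pL, _ = curve_fit(langmuir, C, q_obs, p0=[1.0, 1.0], bounds=(1e-6, 10.0))
pS, _ = curve_fit(sips, C, q_obs, p0=[1.0, 1.0, 1.0], bounds=(1e-6, 10.0))

def sse(model, params):                             # error function used to rank the fits
    return float(np.sum((q_obs - model(C, *params)) ** 2))

print(round(sse(langmuir, pL), 4), round(sse(sips, pS), 4))
```

Because the Sips form reduces to Langmuir at n = 1, its fitted error can only be equal or smaller; model-selection criteria that penalize the extra parameter give a fairer ranking.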

  18. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effect, surface tension, and type of the ferrofluid-magnetic field coupling on the accuracy of the model prediction.

  19. German validation of the Conners Adult ADHD Rating Scales (CAARS) II: reliability, validity, diagnostic sensitivity and specificity.

    PubMed

    Christiansen, H; Kis, B; Hirsch, O; Matthies, S; Hebebrand, J; Uekermann, J; Abdel-Hamid, M; Kraemer, M; Wiltfang, J; Graf, E; Colla, M; Sobanski, E; Alm, B; Rösler, M; Jacob, C; Jans, T; Huss, M; Schimmelmann, B G; Philipsen, A

    2012-07-01

    The German version of the Conners Adult ADHD Rating Scales (CAARS) has proven to show very high model fit in confirmatory factor analyses with the established factors inattention/memory problems, hyperactivity/restlessness, impulsivity/emotional lability, and problems with self-concept in both large healthy control and ADHD patient samples. This study now presents data on the psychometric properties of the German CAARS self-report (CAARS-S) and observer-report (CAARS-O) questionnaires. The CAARS-S/O and questions on sociodemographic variables were filled out by 466 patients with ADHD and 847 healthy control subjects who had already participated in two prior studies; a total of 896 observer data sets were available. Cronbach's alpha was calculated to obtain internal reliability coefficients. Pearson correlations were performed to assess test-retest reliability and concurrent, criterion, and discriminant validity. Receiver operating characteristic (ROC) analyses were used to establish sensitivity and specificity for all subscales. Coefficient alphas ranged from .74 to .95, and test-retest reliability from .85 to .92 for the CAARS-S and from .65 to .85 for the CAARS-O. All CAARS subscales except problems with self-concept correlated significantly with the Barratt Impulsiveness Scale (BIS), but not with the Wender Utah Rating Scale (WURS). Criterion validity was established with ADHD subtype and diagnosis based on DSM-IV criteria. Sensitivity and specificity were high for all four subscales. The reported results confirm our previous study and show that the German CAARS-S/O do indeed represent a reliable and cross-culturally valid measure of current ADHD symptoms in adults. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
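The sensitivity/specificity analysis above rests on choosing a cutoff on the subscale score distribution. A minimal sketch on synthetic patient and control scores, using Youden's J to pick the cutoff (the score distributions here are hypothetical, not the CAARS norms):

```python
import numpy as np

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of the rule `score >= cutoff` against binary labels."""
    pred = scores >= cutoff
    sens = (pred & (labels == 1)).sum() / (labels == 1).sum()
    spec = (~pred & (labels == 0)).sum() / (labels == 0).sum()
    return sens, spec

rng = np.random.default_rng(3)
patients = rng.normal(2.0, 1.0, 300)     # hypothetical subscale scores, shifted upward
controls = rng.normal(0.0, 1.0, 300)
scores = np.concatenate([patients, controls])
labels = np.concatenate([np.ones(300), np.zeros(300)]).astype(int)

# Choose the cutoff maximizing Youden's J = sensitivity + specificity - 1.
cutoffs = np.linspace(scores.min(), scores.max(), 200)
best = max(cutoffs, key=lambda c: sum(sens_spec(scores, labels, c)) - 1)
sens, spec = sens_spec(scores, labels, best)
print(round(float(best), 2), round(float(sens), 2), round(float(spec), 2))
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 - specificity traces exactly the ROC curve the study reports per subscale.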

  20. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. As a result, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
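The logical-inference step over analytical redundancy relations can be sketched as follows. The tank-style plant model, sensor names, and threshold are hypothetical, chosen only to show how a sensor is implicated when every relation involving it fires:

```python
# Each analytical redundancy relation (ARR) evaluates to ~0 when the sensors it
# involves are healthy; a sensor is logically implicated when every relation
# containing it fires. Plant model, sensor names, and threshold are hypothetical.
RELATIONS = {
    "r1": (("flow_in", "flow_out"), lambda s: s["flow_in"] - s["flow_out"]),
    "r2": (("flow_in", "level_rate"), lambda s: s["flow_in"] - s["level_rate"]),
    "r3": (("flow_out", "level_rate"), lambda s: s["flow_out"] - s["level_rate"]),
}
THRESHOLD = 0.1

def diagnose(readings):
    """Return the set of sensors all of whose ARR residuals exceed the threshold."""
    fired = {name for name, (_, resid) in RELATIONS.items()
             if abs(resid(readings)) > THRESHOLD}
    suspects = set()
    for sensor in readings:
        involved = {name for name, (sensors, _) in RELATIONS.items() if sensor in sensors}
        if involved and involved <= fired:
            suspects.add(sensor)
    return suspects

# Steady-state tank: inflow, outflow, and rate of level change should all agree.
print(diagnose({"flow_in": 1.0, "flow_out": 1.6, "level_rate": 1.0}))
```

Note the purely logical character: no failure priors are needed, which is the contrast with Bayesian-network approaches drawn in the abstract.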

  1. Association of Adjuvant Chemotherapy With Survival in Patients With Stage II or III Gastric Cancer

    PubMed Central

    Jiang, Yuming; Li, Tuanjie; Liang, Xiaoling; Hu, Yanfeng; Huang, Lei; Liao, Zhenchen; Zhao, Liying; Han, Zhen; Zhu, Shuguang; Wang, Menglan; Xu, Yangwei; Qi, Xiaolong; Liu, Hao; Yang, Yang; Yu, Jiang; Liu, Wei; Cai, Shirong

    2017-01-01

    Importance The current staging system of gastric cancer is not adequate for defining a prognosis and predicting the patients most likely to benefit from chemotherapy. Objective To construct a survival prediction model based on specific tumor and patient characteristics that enables individualized predictions of the net survival benefit of adjuvant chemotherapy for patients with stage II or stage III gastric cancer. Design, Setting, and Participants In this multicenter retrospective analysis, a survival prediction model was constructed using data from a training cohort of 746 patients with stage II or stage III gastric cancer who satisfied the study’s inclusion criteria and underwent surgery between January 1, 2004, and December 31, 2012, at Nanfang Hospital in Guangzhou, China. Patient and tumor characteristics were included as covariates, and their association with overall survival and disease-free survival with and without adjuvant chemotherapy was assessed. The model was internally validated for discrimination and calibration using bootstrap resampling. To externally validate the model, data were included from a validation cohort of 973 patients with stage II or stage III gastric cancer who met the inclusion criteria and underwent surgery at First Affiliated Hospital in Guangzhou, China, and at West China Hospital of Sichuan University in Chengdu, China, between January 1, 2000, and June 30, 2009. Data were analyzed from July 10, 2016, to September 1, 2016. Main Outcomes and Measures Concordance index and decision curve analysis for each measure associated with postoperative overall survival and disease-free survival. Results Of the 1719 patients analyzed, 1183 (68.8%) were men and 536 (31.2%) were women, and the median (interquartile range) age was 57 (49-66) years. Age, location, differentiation, carcinoembryonic antigen, cancer antigen 19-9, depth of invasion, lymph node metastasis, and adjuvant chemotherapy were significantly associated with overall survival

  2. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    PubMed

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutes for providing medical and health services, where facilities, equipment, and human resources are concentrated, is of great importance. The present research aims at developing a model for assessing hospitals' safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 samples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, Cronbach's alpha, the ICC test (to measure reliability of the test), and the composite reliability coefficient were used to measure primary reliability. The relationship between variables and factors was confirmed at the 0.05 significance level by conducting confirmatory factor analysis (CFA) and the structural equation modeling (SEM) technique with the use of Smart-PLS. R-square and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) had the greatest weights in determining the inherent safety of a hospital, in that order. Moderation, simplification, and substitution thus carry more weight in inherent safety, while minimization carries the least, which could be due to its definition as minimizing the risk.
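The internal-consistency step reported above can be reproduced generically. A minimal sketch of Cronbach's alpha on a hypothetical item-score matrix (the scores below are invented, not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical questionnaire: 3 items rated by 6 respondents on a 5-point scale.
scores = [[4, 4, 5], [2, 3, 2], [5, 5, 5], [3, 3, 4], [1, 2, 2], [4, 5, 4]]
print(round(cronbach_alpha(scores), 3))
```

Alpha rises when items covary strongly relative to their individual variances, which is why it is read as internal consistency of a subscale.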

  3. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  4. MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES

    EPA Science Inventory

    This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

  5. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, since it enables an informed prediction of wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    PubMed

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, much research work was done to develop deterministic models describing this separation process, but to date these models are not commonly accepted in Germany. Water authorities are sceptical with regard to model validation and transferability. This paper examines whether this scepticism is reasonable. A framework proposal is presented for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes in stormwater and combined sewer overflow treatment. This proposal was applied to publications of repute on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research in sewer solids sedimentation and remobilization!

  7. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    PubMed

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. U.S. 75 Dallas, Texas, Model Validation and Calibration Report

    DOT National Transportation Integrated Search

    2010-02-01

    This report presents the model validation and calibration results of the Integrated Corridor Management (ICM) analysis, modeling, and simulation (AMS) for the U.S. 75 Corridor in Dallas, Texas. The purpose of the project was to estimate the benefits ...

  9. The Adsorption of Cd(II) on Manganese Oxide Investigated by Batch and Modeling Techniques.

    PubMed

    Huang, Xiaoming; Chen, Tianhu; Zou, Xuehua; Zhu, Mulan; Chen, Dong; Pan, Min

    2017-09-28

    Manganese (Mn) oxide is a ubiquitous metal oxide in sub-environments. The adsorption of Cd(II) on Mn oxide as a function of adsorption time, pH, ionic strength, temperature, and initial Cd(II) concentration was investigated by batch techniques. The adsorption kinetics showed that the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by a pseudo-second-order kinetic model with high correlation coefficients (R² > 0.999). The adsorption of Cd(II) on Mn oxide significantly decreased with increasing ionic strength at pH < 5.0, whereas Cd(II) adsorption was independent of ionic strength at pH > 6.0, which indicated that outer-sphere and inner-sphere surface complexation dominated the adsorption of Cd(II) on Mn oxide at pH < 5.0 and pH > 6.0, respectively. The maximum adsorption capacity of Mn oxide for Cd(II) calculated from the Langmuir model was 104.17 mg/g at pH 6.0 and 298 K. The thermodynamic parameters showed that the adsorption of Cd(II) on Mn oxide was an endothermic and spontaneous process. According to the results of surface complexation modeling, the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by ion exchange sites (X₂Cd) at low pH and inner-sphere surface complexation sites (SOCd⁺ and (SO)₂CdOH⁻ species) at high pH conditions. The findings presented herein play an important role in understanding the fate and transport of heavy metals at the water-mineral interface.
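The pseudo-second-order fit reported above is usually obtained from the linearized form t/q_t = 1/(k₂·qe²) + t/qe, so regressing t/q_t on t yields qe from the slope and k₂ from the intercept. A minimal sketch on model-generated data (the time grid and rate constant are assumptions, with qe set near the reported 104.17 mg/g):

```python
import numpy as np

# Pseudo-second-order kinetics: t/q_t = 1/(k2*qe^2) + t/qe. The uptake data
# below are generated from the model itself, so the fit recovers the parameters.
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 120.0])      # contact time, min (assumed)
qe, k2 = 104.0, 0.002                                   # mg/g, g/(mg*min) (assumed)
q = qe**2 * k2 * t / (1 + qe * k2 * t)                  # adsorbed amount at time t

slope, intercept = np.polyfit(t, t / q, 1)              # linear regression of t/q on t
qe_fit = 1.0 / slope
k2_fit = slope**2 / intercept                           # since intercept = 1/(k2*qe^2)
print(round(float(qe_fit), 2), round(float(k2_fit), 5))
```

On real data the R² of this regression is what the abstract quotes (>0.999) as evidence for the pseudo-second-order mechanism.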

  10. The Adsorption of Cd(II) on Manganese Oxide Investigated by Batch and Modeling Techniques

    PubMed Central

    Huang, Xiaoming; Chen, Tianhu; Zou, Xuehua; Zhu, Mulan; Chen, Dong

    2017-01-01

    Manganese (Mn) oxide is a ubiquitous metal oxide in sub-environments. The adsorption of Cd(II) on Mn oxide as a function of adsorption time, pH, ionic strength, temperature, and initial Cd(II) concentration was investigated by batch techniques. The adsorption kinetics showed that the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by pseudo-second-order kinetic model with high correlation coefficients (R2 > 0.999). The adsorption of Cd(II) on Mn oxide significantly decreased with increasing ionic strength at pH < 5.0, whereas Cd(II) adsorption was independent of ionic strength at pH > 6.0, which indicated that outer-sphere and inner-sphere surface complexation dominated the adsorption of Cd(II) on Mn oxide at pH < 5.0 and pH > 6.0, respectively. The maximum adsorption capacity of Mn oxide for Cd(II) calculated from Langmuir model was 104.17 mg/g at pH 6.0 and 298 K. The thermodynamic parameters showed that the adsorption of Cd(II) on Mn oxide was an endothermic and spontaneous process. According to the results of surface complexation modeling, the adsorption of Cd(II) on Mn oxide can be satisfactorily simulated by ion exchange sites (X2Cd) at low pH and inner-sphere surface complexation sites (SOCd+ and (SO)2CdOH− species) at high pH conditions. The finding presented herein plays an important role in understanding the fate and transport of heavy metals at the water–mineral interface. PMID:28956849

  11. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    PubMed

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure score. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ² was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ² was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
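The Hosmer-Lemeshow statistic used above to assess calibration bins patients by predicted risk and compares observed with expected event counts per bin. A minimal sketch on synthetic, well-calibrated predictions (the risk distribution is invented; with ten groups the statistic should sit near its reference χ² value rather than in its tail):

```python
import numpy as np

def hosmer_lemeshow(pred, outcome, groups=10):
    """Hosmer-Lemeshow chi-square: bin by predicted risk, compare observed vs expected."""
    order = np.argsort(pred)
    pred, outcome = pred[order], outcome[order]
    chi2 = 0.0
    for gp, gy in zip(np.array_split(pred, groups), np.array_split(outcome, groups)):
        expected = gp.sum()                          # expected events in this risk decile
        observed = gy.sum()
        variance = expected * (1.0 - expected / len(gp))
        chi2 += (observed - expected) ** 2 / variance
    return chi2

rng = np.random.default_rng(2)
pred = rng.uniform(0.05, 0.95, 1000)                 # hypothetical predicted probabilities
outcome = (rng.random(1000) < pred).astype(float)    # outcomes drawn from those risks
chi2 = hosmer_lemeshow(pred, outcome)                # well calibrated, so chi2 stays small
print(round(float(chi2), 2))
```

A large χ² with a small P value, like the 34.50 (P=0.001) for model 2 in the validation cohort, signals that predicted and observed risks diverge in at least some deciles.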

  12. Independent external validation of predictive models for urinary dysfunction following external beam radiotherapy of the prostate: Issues in model development and reporting.

    PubMed

    Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W

    2016-08-01

    Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunction following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. Overall performance, calibration and discrimination were assessed. 14 models from four publications were validated. The discrimination of the predictive models in the independent external validation cohort, measured as the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation. Four models had an AUC >0.6. Shrinkage was required for all predictive models' coefficients, with shrinkage factors ranging from -0.309 (predicted probability inversely related to observed proportion) to 0.823. Predictive models which include baseline symptoms as a feature produced the highest discrimination. Two models produced predicted probabilities of 0 and 1 for all patients. Predictive models vary in performance and transferability, illustrating the need for improvements in model development and reporting. Several models showed reasonable potential, but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  13. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  14. Modeling the effects of argument length and validity on inductive and deductive reasoning.

    PubMed

    Rotello, Caren M; Heit, Evan

    2009-09-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.

  15. Validating soil phosphorus routines in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  16. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    PubMed Central

    2011-01-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than Drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example, anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  17. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    NASA Astrophysics Data System (ADS)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could occur in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of the country-specific vulnerability modifiers, by use of past damage observations in the country. The ground motion prediction relationship of Benouar (1994) proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, the M7.3 1980 El-Asnam and the M7.3 1856 Djidjelli events. The calculated return periods of the losses for client market portfolio align with the

  18. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and include 7009 tools in reporting VV&C status of the IMM. RESULTS: IMM VV&C updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM V&V Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VV&C status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. 
End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  19. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from earlier studies were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. Given these high correlations, the developed model can be used to predict the water contents at different soil depths and temperatures.
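    The validation metric used here, the correlation coefficient between simulated and measured profiles, can be sketched as follows; the water-content values below are invented for illustration and are not the paper's data.

    ```python
    import numpy as np

    # Hypothetical simulated vs. measured liquid water contents at five soil depths
    simulated = np.array([0.31, 0.28, 0.24, 0.21, 0.19])
    measured = np.array([0.33, 0.27, 0.25, 0.20, 0.18])

    # Pearson correlation coefficient between the two profiles,
    # the agreement metric reported in the abstract
    r = np.corrcoef(simulated, measured)[0, 1]
    ```

    A value of r close to 1 (the abstract reports 0.83-0.99) indicates that the simulated profile tracks the measured one closely.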

  20. SPARTAN II: An Instructional High Resolution Land Combat Model

    DTIC Science & Technology

    1993-03-01

    93M-09 SPARTAN II: AN INSTRUCTIONAL HIGH RESOLUTION LAND COMBAT MODEL. THESIS. Presented to the Faculty of the School of Engineering of the...ADVISOR: MAJ Edward Negrelli/ENS. READER: MAJ Bruce Marlan/MA... The goal of this thesis was to improve SPARTAN, a high resolution land combat model...should serve as a useful tool for learning about the advantages and disadvantages of high resolution combat modeling. I wish to thank MAJ Edward

  1. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high-fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate the flight characteristics of the simulation models. By and large, the pilots confirmed good similarity in flight characteristics compared to the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane, and identified some differences in pilot forces. The elevator hinge-moment model and the implementation of the control forces on the ICEFTD were identified as drivers of the pitch-up and control-force issues, and will be an area for future work.

  2. MolProbity’s Ultimate Rotamer-Library Distributions for Model Validation

    PubMed Central

    Hintze, Bradley J.; Lewis, Steven M.; Richardson, Jane S.; Richardson, David C.

    2016-01-01

    Here we describe the updated MolProbity rotamer-library distributions derived from an order-of-magnitude larger and more stringently quality-filtered dataset of about 8000 (vs. 500) protein chains, and we explain the resulting changes and improvements to model validation as seen by users. To include only sidechains with satisfactory justification for their given conformation, we added residue-specific filters for electron-density value and model-to-density fit. The combined new protocol retains a million residues of data, while cleaning up false-positive noise in the multi-χ datapoint distributions. It enables unambiguous characterization of conformational clusters nearly 1000-fold less frequent than the most common ones. We describe examples of local interactions that favor these rare conformations, including the role of authentic covalent bond-angle deviations in enabling presumably strained sidechain conformations. Further, along with favored and outlier, an allowed category (0.3% to 2.0% occurrence in reference data) has been added, analogous to Ramachandran validation categories. The new rotamer distributions are used for current rotamer validation in MolProbity and PHENIX, and for rotamer choice in PHENIX model-building and refinement. The multi-dimensional χ distributions and Top8000 reference dataset are freely available on GitHub. These rotamers are termed "ultimate" because data sampling and quality are now fully adequate for this task, and also because we believe the future of conformational validation should integrate sidechain with backbone criteria. PMID:27018641

  3. Data-Driven Residential Load Modeling and Validation in GridLAB-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gotseff, Peter; Lundstrom, Blake

    Accurately characterizing the impacts of high penetrations of distributed energy resources (DER) on the electric distribution system has driven modeling methods from traditional static snapshots, often representing a critical point in time (e.g., summer peak load), to quasi-static time series (QSTS) simulations capturing all the effects of variable DER, associated controls and, hence, impacts on the distribution system over a given time period. Unfortunately, the high-time-resolution DER source and load data required for model inputs are often scarce or non-existent. This paper presents work performed within the GridLAB-D model environment to synthesize, calibrate, and validate 1-second residential load models based on measured transformer loads and physics-based models suitable for QSTS electric distribution system modeling. The modeling and validation approach taken was to create a typical GridLAB-D model home that, when replicated to represent multiple diverse houses on a single transformer, creates a statistically similar load to a measured load for a given weather input. The model homes are constructed to represent the range of actual homes on an instrumented transformer: square footage, thermal integrity, heating and cooling system definition, as well as realistic occupancy schedules. House model calibration and validation were performed using the distribution transformer load data and corresponding weather. The modeled loads were found to be similar to the measured loads for four evaluation metrics: 1) daily average energy, 2) daily average and standard deviation of power, 3) power spectral density, and 4) load shape.

  4. Turbulence Modeling Validation, Testing, and Development

    NASA Technical Reports Server (NTRS)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models against experimental results. Four popular turbulence models have been tested and validated against experimental data for ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows, consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows, consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary-layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model development. The results are presented in four parts: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  5. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new and improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  6. Derivation and validation of a simple clinical risk-model in heart failure based on 6 minute walk test performance and NT-proBNP status--do we need specificity for sex and beta-blockers?

    PubMed

    Frankenstein, L; Goode, K; Ingle, L; Remppis, A; Schellberg, D; Nelles, M; Katus, H A; Clark, A L; Cleland, J G F; Zugck, C

    2011-02-17

    It is unclear whether risk prediction strategies in chronic heart failure (CHF) need to be specific for sex or beta-blocker use. We examined this problem and developed and validated the consequent risk models based on the 6-minute walk test and NT-proBNP. The derivation cohort comprised 636 German patients with systolic dysfunction. They were validated against 676 British patients with similar aetiology. ROC curves for 1-year mortality identified cut-off values separately for each level of specificity (none, sex, beta-blocker, or both). Patients were grouped according to the number of cut-offs met (groups I/II/III - 0/1/2 cut-offs). The widest separation between groups was achieved with sex- and beta-blocker-specific cut-offs. In the derivation population, 1-year mortality was 0%, 8%, and 31% for groups I, II and III, respectively. In the validation population, 1-year mortality rates in the three risk groups were 2%, 7%, and 14%, respectively, after application of the same cut-offs. Risk stratification for CHF should perhaps take sex and beta-blocker usage into account. We derived and independently validated relevant risk models based on 6-minute walk tests and NT-proBNP. Specifying sex and use of beta-blockers identified three distinct sub-groups with widely differing prognosis. In clinical practice, it may be appropriate to tailor the intensity of follow-up and/or the treatment strategy according to the risk group. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  7. Criterion for evaluating the predictive ability of nonlinear regression models without cross-validation.

    PubMed

    Kaneko, Hiromasa; Funatsu, Kimito

    2013-09-23

    We propose predictive performance criteria for nonlinear regression models that do not require cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error computed for the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, whereas cross-validation cannot be performed in such a situation. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and quantitative structure-activity relationships, we confirm that the proposed criteria enable the predictive ability of nonlinear regression models to be appropriately quantified.
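    A minimal sketch of the midpoint idea: pair each training point with its nearest neighbor, form the midpoints of both inputs and targets, and score the model's predictions at those midpoints. The dataset and the random-forest regressor below are illustrative assumptions, not the authors' code or data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error, r2_score
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)

    model = RandomForestRegressor(random_state=0).fit(X, y)

    # Midpoints between each sample and its nearest neighbor (k = 1 here;
    # kneighbors returns the point itself first, so take column 1)
    nn = NearestNeighbors(n_neighbors=2).fit(X)
    _, idx = nn.kneighbors(X)
    X_mid = (X + X[idx[:, 1]]) / 2
    y_mid = (y + y[idx[:, 1]]) / 2  # interpolated reference targets

    # Determination coefficient and RMSE at the midpoints,
    # evaluated without any cross-validation split
    r2 = r2_score(y_mid, model.predict(X_mid))
    rmse = mean_squared_error(y_mid, model.predict(X_mid)) ** 0.5
    ```

    The attraction is that the midpoints act as pseudo test points even when the full dataset was used for training, which is the situation the abstract describes.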

  8. Mono-component versus binary isotherm models for Cu(II) and Pb(II) sorption from binary metal solution by the green alga Pithophora oedogonia.

    PubMed

    Kumar, Dhananjay; Singh, Alpana; Gaur, J P

    2008-11-01

    The sorption of Cu(II) and Pb(II) by Pithophora markedly decreased as the concentration of the secondary metal ion, Cu(II) or Pb(II), increased in the binary metal solution. However, the test alga showed a greater affinity for sorbing Cu(II) than Pb(II) from the binary metal solution. Mono-component Freundlich, Langmuir, Redlich-Peterson and Sips isotherms successfully predicted the sorption of Cu(II) and Pb(II) from both single and binary metal solutions. None of the tested binary sorption isotherms could realistically predict the Cu(II) and Pb(II) sorption capacity and affinity of the test alga for binary metal solutions of varying composition, a task that the mono-component isotherms accomplished well. Hence, mono-component isotherm modeling at different concentrations of the secondary metal ion seems to be a better option than binary isotherms for describing metal sorption from binary metal solutions.
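    As background, a mono-component Langmuir fit of the kind applied here can be sketched with a nonlinear least-squares fit; the equilibrium data below are fabricated for illustration and do not come from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(Ce, qmax, b):
        """Mono-component Langmuir isotherm: qe = qmax * b * Ce / (1 + b * Ce)."""
        return qmax * b * Ce / (1 + b * Ce)

    # Hypothetical equilibrium concentrations (mg/L) and sorbed amounts (mg/g)
    Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
    qe = np.array([12.0, 20.0, 30.0, 38.0, 44.0, 47.0])

    # Fit the maximum sorption capacity qmax and affinity constant b
    (qmax, b), _ = curve_fit(langmuir, Ce, qe, p0=(50.0, 0.05))
    ```

    The fitted qmax and b are the capacity and affinity parameters that the abstract compares across single and binary metal solutions.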

  9. Validation of Modelled Ice Dynamics of the Greenland Ice Sheet using Historical Forcing

    NASA Astrophysics Data System (ADS)

    Hoffman, M. J.; Price, S. F.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Tezaur, I.; Kennedy, J. H.; Lenaerts, J.; Lipscomb, W. H.; Neumann, T.; Nowicki, S.; Perego, M.; Saba, J. L.; Salinger, A.; Guerber, J. R.

    2015-12-01

    Although ice sheet models are used for sea level rise projections, the degree to which these models have been validated by observations is fairly limited, due in part to the limited duration of the satellite observation era and the long adjustment time scales of ice sheets. Here we describe a validation framework for the Greenland Ice Sheet applied to the Community Ice Sheet Model by forcing the model annually with flux anomalies at the major outlet glaciers (Enderlin et al., 2014, observed from Landsat/ASTER/Operation IceBridge) and surface mass balance (van Angelen et al., 2013, calculated from RACMO2) for the period 1991-2012. The ice sheet model output is compared to ice surface elevation observations from ICESat and ice sheet mass change observations from GRACE. Early results show promise for assessing the performance of different model configurations. Additionally, we explore the effect of ice sheet model resolution on validation skill.

  10. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    PubMed

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

    The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012 that included 6351 premature infants who received ROP examinations. The main outcome measures were sensitivity and specificity for severe (Early Treatment of ROP [ETROP] type 1 or 2) ROP, and the reduction in the number of infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points (infants would have received examinations if they met all 3 criteria: birth weight <1501 g; gestational age <30 weeks; and WG-28 <650 g). Infants missing WG-28 information were included in a secondary analysis in which WG-28 was considered fewer than 650 g. Of 7438 infants in the G-ROP study, 3575 (48.1%) were girls, and maternal race/ethnicity was 2310 (31.1%) African American, 3615 (48.6%) white, 233 (3.1%) Asian, 40 (0.52%) American Indian/Alaskan Native, and 93 (1.3%) Pacific Islander. In the study cohort, 747 infants (11.8%) had type 1 or 2 ROP, 2068 (32.6%) had lower-grade ROP, and 3536 (55.6%) had no ROP. The CO-ROP model had a sensitivity of 96.9% (95% CI, 95.4%-97.9%) and a specificity of 40.9% (95% CI, 39.3%-42.5%). It missed 23 (3.1%) infants who developed severe ROP. The CO-ROP model would have reduced the number of infants who received examinations by 26.1% (95% CI, 25.0%-27.2%). 
The CO-ROP model demonstrated high
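    The CO-ROP rule is a conjunction of three thresholds (birth weight < 1501 g, gestational age < 30 weeks, WG-28 < 650 g); a sketch of applying it and computing sensitivity and specificity on a handful of fabricated infants (not G-ROP data):

    ```python
    def co_rop_flag(birth_weight_g, gest_age_wk, wg28_g):
        """Flag an infant for ROP examination under the CO-ROP criteria."""
        return birth_weight_g < 1501 and gest_age_wk < 30 and wg28_g < 650

    # (birth weight g, gestational age wk, weight gain at 28 days g, severe ROP?)
    infants = [
        (900, 26, 400, True),
        (1200, 28, 500, True),
        (1400, 29, 700, False),  # misses the WG-28 criterion
        (1600, 31, 800, False),
        (1000, 27, 300, False),
    ]

    flagged = [co_rop_flag(bw, ga, wg) for bw, ga, wg, _ in infants]
    severe = [s for *_, s in infants]

    tp = sum(f and s for f, s in zip(flagged, severe))
    fn = sum(not f and s for f, s in zip(flagged, severe))
    tn = sum(not f and not s for f, s in zip(flagged, severe))
    fp = sum(f and not s for f, s in zip(flagged, severe))

    sensitivity = tp / (tp + fn)  # 1.0 in this toy example
    specificity = tn / (tn + fp)
    ```

    For a screening rule, sensitivity (catching every severe case) matters most; the abstract's 96.9% sensitivity at 40.9% specificity reflects that trade-off.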

  11. Modeling Type II-P/II-L Supernovae Interacting with Recent Episodic Mass Ejections from Their Presupernova Stars with MESA and SNEC

    NASA Astrophysics Data System (ADS)

    Das, Sanskriti; Ray, Alak

    2017-12-01

    We show how dense, compact, discrete shells of circumstellar gas immediately outside of red supergiants affect the optical light curves of Type II-P/II-L supernovae (SNe), using the example of SN 2013ej. Earlier efforts in the literature had used an artificial circumstellar medium (CSM) stitched to the surface of an evolved star that had not gone through a phase of late-stage heavy mass loss, which, in essence, is the original source of the CSM. In contrast, we allow enhanced mass-loss rate from the modeled star during the 16O and 28Si burning stages and construct the CSM from the resulting mass-loss history in a self-consistent way. Once such evolved pre-SN stars are exploded, we find that the models with early interaction between the shock and the dense CSM reproduce light curves far better than those without that mass loss and, hence, having no nearby dense CSM. The required explosion energy for the progenitors with a dense CSM is reduced by almost a factor of two compared to those without the CSM. Our model, with a more realistic CSM profile and presupernova and explosion parameters, fits observed data much better throughout the rise, plateau, and radioactive tail phases as compared to previous studies. This points to an intermediate class of supernovae between Type II-P/II-L and Type II-n SNe with the characteristics of simultaneous UV and optical peak, slow decline after peak, and a longer plateau.

  12. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns

    PubMed Central

    Chérel, Guillaume; Cottineau, Clémentine; Reuillon, Romain

    2015-01-01

    Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model’s predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic. PMID:26368917

  13. Validation of Fatigue Modeling Predictions in Aviation Operations

    NASA Technical Reports Server (NTRS)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet, most models have not been rigorously evaluated and independently validated for the operations to which they are being applied and many users are not fully aware of the limitations in which model results should be interpreted and applied.

  14. Line profile studies of hydrodynamical models of cometary compact H II regions

    NASA Astrophysics Data System (ADS)

    Zhu, Feng-Yao; Zhu, Qing-Feng

    2015-06-01

    We simulate the evolution of cometary H II regions based on several champagne flow models and bow shock models, and calculate the profiles of the [Ne II] fine-structure line at 12.81 μm, the H30α recombination line and the [Ne III] fine-structure line at 15.55 μm for these models at inclinations of 0°, 30° and 60°. We find that the profiles in the bow shock models are generally different from those in the champagne flow models, but the profiles in bow shock models with low stellar velocity (≤ 5 km s⁻¹) are similar to those in the champagne flow models. In the champagne flow models, both the velocity of peak flux and the flux-weighted central velocities of all three lines point outward from the molecular clouds. In the bow shock models, the directions of these velocities depend on the speed of the stars: the central velocities of these lines are consistent with the stellar motion in the high-stellar-speed cases, but opposite to the stellar motion in the low-speed cases. We notice that the line profiles from a slit along the symmetry axis of the projected 2D image of these models are useful for distinguishing bow shock models from champagne flow models. The calculations also confirm that the flux-weighted central velocity and the line luminosity of the [Ne III] line can be estimated from the [Ne II] line and the H30α line.

  15. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893

  16. Comparative study on kinetic adsorption of Cu(II), Cd(II) and Ni(II) ions from aqueous solutions using activated sludge and dried sludge

    NASA Astrophysics Data System (ADS)

    Ong, Soon-An; Toorisaka, Eiichi; Hirata, Makoto; Hano, Tadashi

    2013-03-01

    The adsorption of Cu(II), Cd(II) and Ni(II) ions from aqueous solutions by activated sludge and dried sludge was investigated under laboratory conditions to assess its potential in removing metal ions. The adsorption behavior of metal ions onto activated sludge and dried sludge was analyzed with the Weber-Morris intra-particle diffusion model, the Lagergren first-order model and the pseudo-second-order model. The rate constant of intra-particle diffusion on activated sludge and dried sludge increased in the sequence Cu(II) > Ni(II) > Cd(II). According to the regression coefficients, the kinetic adsorption data were fitted better by the pseudo-second-order model than by the first-order Lagergren model (R² > 0.997). The adsorption capacities of metal ions onto activated sludge and dried sludge followed the sequences Ni(II) ≈ Cu(II) > Cd(II) and Cu(II) > Ni(II) > Cd(II), respectively.
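    A sketch of the pseudo-second-order fit, qt = k·qe²·t / (1 + k·qe·t), against fabricated uptake data (the contact times and uptakes below are invented, not the study's measurements):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pseudo_second_order(t, qe, k):
        """Pseudo-second-order kinetics: qt = k * qe^2 * t / (1 + k * qe * t)."""
        return k * qe**2 * t / (1 + k * qe * t)

    # Hypothetical contact times (min) and metal uptake (mg/g)
    t = np.array([5, 10, 20, 40, 60, 120], dtype=float)
    qt = np.array([8.0, 12.5, 16.0, 18.5, 19.2, 19.8])

    # Fit the equilibrium uptake qe and rate constant k
    (qe, k), _ = curve_fit(pseudo_second_order, t, qt, p0=(20.0, 0.01))

    # Coefficient of determination, the comparison criterion in the abstract
    residuals = qt - pseudo_second_order(t, qe, k)
    r_squared = 1 - np.sum(residuals**2) / np.sum((qt - qt.mean()) ** 2)
    ```

    A high R² for this model relative to the first-order Lagergren fit is what supports the abstract's conclusion that the kinetics are pseudo-second-order.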

  17. Validation of SAM 2 and SAGE satellite

    NASA Technical Reports Server (NTRS)

    Kent, G. S.; Wang, P.-H.; Farrukh, U. O.; Yue, G. K.

    1987-01-01

    Presented are the results of a validation study of data obtained by the Stratospheric Aerosol and Gas Experiment I (SAGE I) and Stratospheric Aerosol Measurement II (SAM II) satellite experiments. The study includes the entire SAGE I data set (February 1979 - November 1981) and the first four and one-half years of SAM II data (October 1978 - February 1983). These data sets have been validated by their use in the analysis of dynamical, physical and chemical processes in the stratosphere. They have been compared with other existing data sets and the SAGE I and SAM II data sets intercompared where possible. The study has shown the data to be of great value in the study of the climatological behavior of stratospheric aerosols and ozone. Several scientific publications and user-oriented data summaries have appeared as a result of the work carried out under this contract.

  18. Longitudinal Stability of the Beck Depression Inventory II: A Latent Trait-State-Occasion Model

    ERIC Educational Resources Information Center

    Wu, Pei-Chen

    2016-01-01

    In a six-wave longitudinal study with two cohorts (660 adolescents and 630 young adults), this study investigated the longitudinal stability of the Beck Depression Inventory II (BDI-II) using the Trait-State-Occasion (TSO) model. The results revealed that the full TSO model was the best fitting representation of the depression measured by the…

  19. Borderline personality disorder subscale (Chinese version) of the structured clinical interview for DSM-IV axis II personality disorders: a validation study in Cantonese-speaking Hong Kong Chinese.

    PubMed

    Wong, H M; Chow, L Y

    2011-06-01

    Borderline personality disorder is an important but under-recognised clinical entity, for which there are only a few available diagnostic instruments in the Chinese language. None has been tested for its psychometric properties in the Cantonese-speaking population of Hong Kong. The present study aimed to assess the validity of the Chinese version of the Borderline Personality Disorder subscale of the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders Axis II Personality Disorders (SCID-II) in Cantonese-speaking Hong Kong Chinese. A convenience sampling method was used. The subjects were seen first by a multidisciplinary clinical team, which arrived at a best-estimate diagnosis, and then by a SCID-II rater who administered the Chinese version of the Borderline Personality Disorder subscale. The study was carried out at the psychiatric clinic of the Prince of Wales Hospital in Hong Kong. A total of 87 patients of Chinese ethnicity aged 18 to 64 years who attended the clinic in April 2007 were recruited. These data were used to examine the internal consistency, best-estimate clinical diagnosis-SCID diagnosis agreement, sensitivity, and specificity of the Chinese version of the subscale. The Borderline Personality Disorder subscale (Chinese version) of the SCID-II had an internal consistency of 0.82 (Cronbach's alpha coefficient), best-estimate clinical diagnosis-SCID diagnosis agreement of 0.82 (kappa), sensitivity of 0.92, and specificity of 0.94. The subscale thus showed reasonable validity when applied to Cantonese-speaking Chinese subjects in Hong Kong.

  20. [Reliability and Validity of the Korean Version of the Perinatal Post-Traumatic Stress Disorder Questionnaire].

    PubMed

    Park, Yu Kyung; Ju, Hyeon Ok; Na, Hunjoo

    2016-02-01

    The Perinatal Post-Traumatic Stress Disorder Questionnaire (PPQ) was designed to measure post-traumatic symptoms related to childbirth and symptoms during the postnatal period. The purpose of this study was to develop a translated Korean version of the PPQ and to evaluate the reliability and validity of the Korean PPQ. Participants were 196 mothers 1 to 18 months after childbirth, and data were collected through e-mail. The PPQ was translated into Korean using the World Health Organization translation guidelines. Cronbach's alpha and split-half reliability were used to evaluate the reliability of the PPQ. Exploratory Factor Analysis (EFA), Confirmatory Factor Analysis (CFA), and known-group validity were conducted to examine construct validity. Correlations of the PPQ with the Impact of Event Scale (IES), the Beck Depression Inventory II (BDI-II), and the Beck Anxiety Inventory (BAI) were used to test the criterion validity of the PPQ. Cronbach's alpha and the Spearman-Brown split-half correlation coefficient were 0.91 and 0.77, respectively. EFA identified a 3-factor solution including arousal, avoidance, and intrusion factors, and CFA revealed the strongest support for the 3-factor model. The correlations of the PPQ with the IES, BDI-II, and BAI were .99, .60, and .72, respectively, pointing to a high level of criterion validity. The Korean version of the PPQ is a useful tool for screening and assessing mothers experiencing emotional distress related to childbirth and the postnatal period. The PPQ also reflects the diagnostic criteria for post-traumatic stress disorder well.
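    The two reliability statistics reported above, Cronbach's alpha and the Spearman-Brown split-half coefficient, can be computed directly from an item-response matrix. A sketch with invented Likert-scale responses (not the study's data):

    ```python
    import numpy as np

    # Hypothetical item responses: 6 respondents x 4 items (0-3 Likert scores).
    X = np.array([
        [3, 2, 3, 3],
        [1, 1, 0, 1],
        [2, 2, 2, 3],
        [0, 1, 1, 0],
        [3, 3, 2, 3],
        [1, 0, 1, 1],
    ], dtype=float)

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance).
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Spearman-Brown split-half: correlate odd-item and even-item half scores,
    # then step the correlation up to full test length.
    half1 = X[:, ::2].sum(axis=1)
    half2 = X[:, 1::2].sum(axis=1)
    r = np.corrcoef(half1, half2)[0, 1]
    split_half = 2 * r / (1 + r)
    ```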

  1. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Technical Reports Server (NTRS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard; Hearty, Thomas; hide

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model (Tinetti et al., 2006a,b). This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of approx. 100 pixels on the visible disk, and four categories of water clouds, which were defined using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square error of typically less than 3% for the multiwavelength lightcurves, and residuals of approx. 10% for the absolute brightness throughout the visible and NIR spectral range. We extend our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of approx. 7%, and temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated

  2. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  3. Earth as an extrasolar planet: Earth model validation using EPOXI earth observations.

    PubMed

    Robinson, Tyler D; Meadows, Victoria S; Crisp, David; Deming, Drake; A'hearn, Michael F; Charbonneau, David; Livengood, Timothy A; Seager, Sara; Barry, Richard K; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M; McFadden, Lucy A; Wellnitz, Dennis D

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  4. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    PubMed Central

    Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward

  5. Computational Modeling and Validation for Hypersonic Inlets

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1996-01-01

    Hypersonic inlet research activity at NASA is reviewed. The basis for the paper is the experimental tests performed with three inlets: the NASA Lewis Research Center Mach 5, the McDonnell Douglas Mach 12, and the NASA Langley Mach 18. Both three-dimensional PNS and NS codes have been used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave-boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes and the experimental data are helping to develop a clearer understanding of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.

  6. Comparison of measured and calculated composition of irradiated EBR-II blanket assemblies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimm, K. N.

    1998-07-13

    In anticipation of processing irradiated EBR-II depleted-uranium blanket subassemblies in the Fuel Conditioning Facility (FCF) at ANL-West, it has been possible to obtain a limited set of destructive chemical analyses of samples from a single EBR-II blanket subassembly. Comparison of calculated values with these measurements is being used to validate a depletion methodology based on a limited number of generic models of EBR-II to simulate the irradiation history of these subassemblies. Initial comparisons indicate these methods are adequate to meet the operations and material control and accountancy (MC&A) requirements for the FCF, but also indicate several shortcomings which may be corrected or improved.

  7. Validation of Slosh Modeling Approach Using STAR-CCM+

    NASA Technical Reports Server (NTRS)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to longer slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh, and STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right-cylinder tank and a right cylinder with a single ring baffle.

  8. Quantitative impedance measurements for eddy current model validation

    NASA Astrophysics Data System (ADS)

    Khan, T. A.; Nakagawa, N.

    2000-05-01

    This paper reports on a series of laboratory-based impedance measurements, collected with a quantitatively accurate, mechanically controlled measurement station. The purpose of the measurements is to validate a BEM-based eddy current model against experiment. We therefore selected two "validation probes," both split-D differential probes. Their internal structures and dimensions were extracted from x-ray CT scan data and are thus known within the measurement tolerance. A series of measurements was carried out using the validation probes and two Ti-6Al-4V block specimens, one containing two 1-mm long fatigue cracks and the other containing six EDM notches of a range of sizes. A motor-controlled XY scanner performed raster scans over the cracks, with the probe riding on the surface on a spring-loaded mechanism to maintain the lift-off. Both an impedance analyzer and a commercial EC instrument were used in the measurements. The probes were driven in both differential and single-coil modes for the specific purpose of model validation. The differential measurements were made exclusively with the eddyscope, while the single-coil data were taken with both the impedance analyzer and the eddyscope. From the single-coil measurements, we obtained the transfer function needed to translate the voltage output of the eddyscope into impedance values, and then used it to translate the differential measurement data into impedance results. The presentation will highlight the schematics of the measurement procedure, representative raw data, an explanation of the post-processing procedure, and a series of resulting 2D flaw impedance results. A noise estimate will also be given, in order to quantify the accuracy of these measurements and for use in probability-of-detection estimation. This work was supported by the NSF Industry/University Cooperative Research Program.

  9. Dynamic modelling and experimental validation of three wheeled tilting vehicles

    NASA Astrophysics Data System (ADS)

    Amati, Nicola; Festini, Andrea; Pelizza, Luigi; Tonoli, Andrea

    2011-06-01

    The present paper describes a study of the straight-running stability of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of motorcycles. Field tests were carried out to validate the model and identify some critical parameters, such as the damping of the steering system. The stability analysis demonstrates that the lateral dynamics are characterised by vibration modes similar to those of a motorcycle. Additionally, it shows that the wobble mode is significantly affected by the castor trail, whereas it is only slightly affected by the dynamics of the front suspension. For the present case study, the frame compliance has no influence on the weave and wobble modes.

  10. Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions

    DTIC Science & Technology

    2017-09-01

    Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions, by Matthew D. Bouwense; distribution unlimited.

  11. AIM-9X Block II Sidewinder (AIM-9X Blk II)

    DTIC Science & Technology

    2013-12-01

    Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-442, AIM-9X Block II Sidewinder (AIM-9X Blk II), as of the FY 2015 President's Budget. Report date: December 2013.

  12. Selection, calibration, and validation of models of tumor growth.

    PubMed

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach is presented that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models for the given data and to determine whether a model is a valid tool for predicting tumor growth and morphology in vivo. All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOIs. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory
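    The reaction-diffusion class of tumor growth models mentioned above is often written in Fisher-KPP form, du/dt = D ∇²u + rho u (1 - u), for a normalized cell density u. A minimal one-dimensional explicit-Euler sketch with illustrative, uncalibrated parameters:

    ```python
    import numpy as np

    # 1-D Fisher-KPP reaction-diffusion sketch of normalized tumor cell
    # density u(x, t):  du/dt = D * d2u/dx2 + rho * u * (1 - u).
    # All parameter values are illustrative, not calibrated to any experiment.
    D, rho = 0.01, 1.0        # diffusion (cm^2/day) and proliferation rate (1/day)
    nx, dx, dt, steps = 101, 0.1, 0.01, 2000   # grid and explicit time step

    u = np.zeros(nx)
    u[45:56] = 0.5            # small initial tumor seed at the domain centre

    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # central difference
        u = u + dt * (D * lap + rho * u * (1 - u))           # explicit Euler update

    total_mass = u.sum() * dx  # grows as the tumor front expands outward
    ```

    Calibration in the OPAL sense would treat D and rho as uncertain parameters to be inferred from data, and validation would test the calibrated model against held-out observations.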

  13. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.

    PubMed

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle

    2017-02-01

    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. A sequential mixed-methods design was used. A preliminary Clinical Nurse Leader practice model was refined, and survey items were developed to correspond with the model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. The sample data had good fit with the specified model and the two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.

  14. Validation Study of Maternal Recall on Breastfeeding Duration 6 Years After Childbirth.

    PubMed

    Amissah, Emma Ayorkor; Kancherla, Vijaya; Ko, Yi-An; Li, Ruowei

    2017-05-01

    Breastfeeding duration is an important indicator commonly measured in maternal and child health and nutrition research. Maternal short-term recall for both initiation and duration of breastfeeding has been shown to be valid; however, validity of long-term recall is not well understood. Research aim: This study aims to assess the validity of maternal recall of breastfeeding duration 6 years after childbirth and its association with sociodemographic factors. Among 635 mother-child pairs, breastfeeding duration data collected monthly throughout the 1st year after childbirth in the Infant Feeding Practices Study II (IFPS II) were compared to recall data obtained 6 years later during the Year 6 Follow-Up. The intraclass correlation coefficient (ICC) and Bland-Altman plots were examined to study the agreement between the two data sets. Sociodemographic factors associated with accurate recall to within 1 month of the IFPS II breastfeeding duration were assessed using multivariable logistic regression modeling. Maternal recall of breastfeeding duration was found to be valid 6 years after childbirth with a small median overall bias (1 week) toward overestimation. The overall concordance was high (ICC = 0.84), except for high school graduates (ICC = 0.63) and smokers (ICC = 0.61). Smokers (adjusted odds ratio = 0.52; 95% confidence interval [0.4, 0.8]) and multiparous women (adjusted odds ratio = 0.57; 95% confidence interval [0.4, 0.9]) were also less likely to give an accurate recall of their breastfeeding duration to within 1 month. Our study found that maternal recall of breastfeeding duration varies by sociodemographic factors but is accurate 6 years after childbirth.
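    The Bland-Altman analysis used above reduces to the mean difference between the two measurements (the bias) and its 95% limits of agreement. A sketch with invented paired durations (not the IFPS II data):

    ```python
    import numpy as np

    # Hypothetical paired durations (weeks): breastfeeding recorded during
    # infancy vs. recalled six years later, for ten mothers.
    recorded = np.array([4, 12, 26, 8, 52, 16, 30, 20, 40, 10], dtype=float)
    recalled = np.array([4, 13, 26, 9, 52, 17, 32, 20, 42, 10], dtype=float)

    # Bland-Altman agreement statistics: mean bias and 95% limits of agreement.
    diff = recalled - recorded
    bias = diff.mean()                      # positive value = overestimation on recall
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    ```

    A small positive bias with narrow limits of agreement corresponds to the study's finding of a slight overestimation on long-term recall.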

  15. Adsorption of Pb(II), Cu(II), Cd(II), Zn(II), Ni(II), Fe(II), and As(V) on bacterially produced metal sulfides.

    PubMed

    Jong, Tony; Parry, David L

    2004-07-01

    The adsorption of Pb(II), Cu(II), Cd(II), Zn(II), Ni(II), Fe(II) and As(V) onto bacterially produced metal sulfide (BPMS) material was investigated using a batch equilibrium method. It was found that the sulfide material had adsorptive properties comparable with those of other adsorbents with respect to the specific uptake of a range of metals and the levels to which dissolved metal concentrations in solution can be reduced. The percentage of adsorption increased with increasing pH and adsorbent dose, but decreased with increasing initial dissolved metal concentration. The pH of the solution was the most important parameter controlling the adsorption of Cd(II), Cu(II), Fe(II), Ni(II), Pb(II), Zn(II), and As(V) by BPMS. The adsorption data were successfully modeled using the Langmuir adsorption isotherm. Desorption experiments showed that the reversibility of adsorption was low, suggesting high-affinity adsorption governed by chemisorption. The mechanism of adsorption for the divalent metals was thought to be the formation of strong, inner-sphere complexes involving surface hydroxyl groups. However, the mechanism for the adsorption of As(V) by BPMS appears to be distinct from that of surface hydroxyl exchange. These results have important implications for the management of metal sulfide sludge produced by bacterial sulfate reduction.
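    Fitting the Langmuir isotherm, as done above, is commonly performed on the linearized form Ce/qe = 1/(qmax*K) + Ce/qmax, so that a straight-line fit yields the monolayer capacity qmax and the affinity constant K. A sketch with hypothetical equilibrium data (not the study's measurements):

    ```python
    import numpy as np

    # Hypothetical equilibrium data: residual dissolved concentration Ce (mg/L)
    # and adsorbed amount qe (mg/g); values are illustrative only.
    Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    qe = np.array([4.4, 7.3, 10.9, 15.5, 18.1, 19.8])

    # Linearized Langmuir form: Ce/qe = 1/(qmax*K) + Ce/qmax.
    y = Ce / qe
    slope, intercept = np.polyfit(Ce, y, 1)
    qmax = 1.0 / slope          # monolayer adsorption capacity (mg/g)
    K = slope / intercept       # Langmuir affinity constant (L/mg)
    ```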

  16. Summary of EASM Turbulence Models in CFL3D With Validation Test Cases

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Gatski, Thomas B.

    2003-01-01

    This paper summarizes the Explicit Algebraic Stress Model in k-omega form (EASM-ko) and in k-epsilon form (EASM-ke) in the Reynolds-averaged Navier-Stokes code CFL3D. These models have been actively used over the last several years in CFL3D, and have undergone some minor modifications during that time. Details of the equations and method for coding the latest versions of the models are given, and numerous validation cases are presented. This paper serves as a validation archive for these models.

  17. Studying the highly bent spectra of FR II-type radio galaxies with the KDA EXT model

    NASA Astrophysics Data System (ADS)

    Kuligowska, Elżbieta

    2018-04-01

    Context: The Kaiser, Dennett-Thorpe & Alexander (KDA, 1997, MNRAS, 292, 723) EXT model, an extension of the KDA model of Fanaroff & Riley (FR) II-type source evolution, is applied and confronted with observational data for selected FR II-type radio sources with significantly aged radio spectra. Aims: A sample of FR II-type radio galaxies with radio spectra strongly bent at the highest frequencies is used to test the usefulness of the KDA EXT model. Methods: The dynamical evolution of FR II-type sources predicted by the KDA EXT model is briefly presented and discussed. The results are then compared with those obtained with the classical KDA approach, which assumes continuous injection and self-similar source growth. Results: The results and corresponding diagrams obtained for the eight sample sources indicate that the KDA EXT model reproduces the observed radio spectra significantly better than the best spectral fit provided by the original KDA model.

  18. Rationality Validation of a Layered Decision Model for Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Zhang, Du

    2007-08-31

    We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

  19. Data Assimilation of Photosynthetic Light-use Efficiency using Multi-angular Satellite Data: II Model Implementation and Validation

    NASA Technical Reports Server (NTRS)

    Hilker, Thomas; Hall, Forest G.; Tucker, J.; Coops, Nicholas C.; Black, T. Andrew; Nichol, Caroline J.; Sellers, Piers J.; Barr, Alan; Hollinger, David Y.; Munger, J. W.

    2012-01-01

    Spatially explicit and temporally continuous estimates of photosynthesis will be of great importance for increasing our understanding of, and ultimately closing, the terrestrial carbon cycle. Current capabilities to model photosynthesis, however, are limited by insufficiently accurate representations of the complexity of the underlying biochemical processes and of the numerous environmental constraints imposed upon plant primary production. A potentially powerful alternative to modeling photosynthesis through such indirect observations is the use of multi-angular satellite data to infer light-use efficiency (e) directly from spectral reflectance properties in connection with canopy shadow fractions. Hall et al. (this issue) introduced a new approach for predicting gross ecosystem production that would allow the use of such observations in a data assimilation mode to obtain spatially explicit variations in e from infrequent polar-orbiting satellite observations, while meteorological data are used to account for the more dynamic responses of e to variations in environmental conditions caused by changes in weather and illumination. In this second part of the study, we implement and validate the approach of Hall et al. (this issue) across an ecologically diverse array of eight flux-tower sites in North America, using data acquired from the Compact High Resolution Imaging Spectroradiometer (CHRIS) and eddy-flux observations. Our results show significantly enhanced estimates of e, and therefore of cumulative gross ecosystem production (GEP), over the course of one year at all examined sites. We also demonstrate that e is highly heterogeneous even across small study areas. Data assimilation and direct inference of GEP from space using a new, proposed sensor could therefore be a significant step towards closing the terrestrial carbon cycle.

  20. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. The model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects, and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the problem solution to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic conditions. To validate the model, engineers at the Aircraft Landing Dynamics facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
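
    The hybrid integration strategy described above — a multistep predictor-corrector that hands off to a single-step Runge-Kutta method when the stick-slip friction discontinuity is crossed — can be sketched on a toy mass-spring system with Coulomb friction. All parameter values and function names here are illustrative, not from the A-6 model.

```python
import math

def deriv(t, y, mu=0.05, k=1.0, m=1.0, g=9.81):
    """Mass-spring with Coulomb friction: the friction force is
    discontinuous at v = 0, the stick-slip situation described above."""
    x, v = y
    friction = -mu * m * g * math.copysign(1.0, v) if v != 0.0 else 0.0
    return (v, (-k * x + friction) / m)

def rk4_step(f, t, y, h):
    """Classical fourth-order Runge-Kutta, used to bridge discontinuities."""
    k1 = f(t, y)
    k2 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k1)))
    k3 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k2)))
    k4 = f(t + h, tuple(yi + h * ki for yi, ki in zip(y, k3)))
    return tuple(yi + h / 6 * (a + 2 * b + 2 * c + d)
                 for yi, a, b, c, d in zip(y, k1, k2, k3, k4))

def integrate(f, y0, t0, t1, h):
    """Two-step Adams-Bashforth-Moulton predictor-corrector; when the
    velocity changes sign (the friction discontinuity), fall back to a
    single RK4 step across it, then resume the multistep method."""
    t, y = t0, y0
    prev = f(t, y)
    y = rk4_step(f, t, y, h)          # prime the multistep history
    t += h
    while t < t1 - 1e-12:
        cur = f(t, y)
        # predictor (AB2), then one corrector pass (AM2, i.e. trapezoidal)
        yp = tuple(yi + h * (1.5 * ci - 0.5 * pi)
                   for yi, ci, pi in zip(y, cur, prev))
        fp = f(t + h, yp)
        yn = tuple(yi + h / 2 * (ci + fi) for yi, ci, fi in zip(y, cur, fp))
        if yn[1] * y[1] < 0:          # discontinuity crossed this step
            yn = rk4_step(f, t, y, h)
        prev = cur
        y, t = yn, t + h
    return y
```

    The restart logic is the point of the sketch: a multistep method assumes smooth history, so a step that crosses the friction discontinuity is redone with a self-starting single-step method.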

  1. The Development and Validation of a New Land Surface Model for Regional and Global Climate Modeling

    NASA Astrophysics Data System (ADS)

    Lynch-Stieglitz, Marc

    1995-11-01

    A new land-surface scheme intended for use in mesoscale and global climate models has been developed and validated. The ground scheme consists of six soil layers; diffusion and a modified tipping-bucket model govern heat and water flow, respectively. A three-layer snow model has been incorporated into a modified BEST vegetation scheme. TOPMODEL equations and Digital Elevation Model data are used to generate baseflow, which supports lowland saturated zones. Soil moisture heterogeneity, represented by saturated lowlands, subsequently impacts watershed evapotranspiration, the partitioning of surface fluxes, and the development of the storm hydrograph. Five years of meteorological and hydrological data from the Sleepers River watershed, located in the eastern highlands of Vermont where winter snow cover is significant, were then used to drive and validate the new scheme. Site validation data were sufficient to evaluate model performance with regard to various aspects of the watershed water balance, including snowpack growth/ablation, the spring snowmelt hydrograph, storm hydrographs, and the seasonal development of watershed evapotranspiration and soil moisture. By including topographic effects, not only are the main spring hydrographs and individual storm hydrographs adequately resolved, but the mechanisms generating runoff are consistent with current views of hydrologic processes. The seasonal movement of the mean water table depth and the saturated area of the watershed are consistent with site data, and the overall model hydroclimatology, including the surface fluxes, seems reasonable.
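
    The TOPMODEL relations invoked above — baseflow as an exponential function of the catchment-mean saturation deficit, and saturated lowlands as the cells whose topographic index exceeds a deficit-dependent threshold — can be sketched from the standard equations. This is a minimal illustration of those textbook relations, not the scheme's actual code.

```python
import math

def topmodel_diagnostics(ti, s_bar, m, q0):
    """Standard TOPMODEL relations: baseflow Q = Q0 * exp(-S_bar / m), and a
    cell is saturated when its local deficit S_i = S_bar - m * (ti_i - lam)
    drops to zero or below, where lam is the catchment-mean topographic index."""
    lam = sum(ti) / len(ti)
    baseflow = q0 * math.exp(-s_bar / m)
    saturated_fraction = (
        sum(1 for t in ti if s_bar - m * (t - lam) <= 0) / len(ti))
    return baseflow, saturated_fraction
```

    As the mean deficit `s_bar` shrinks after snowmelt or storms, both the baseflow and the saturated (runoff-generating) fraction grow, which is the mechanism linking topography to the storm hydrograph in the abstract.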

  2. Validation of the PVSyst Performance Model for the Concentrix CPV Technology

    NASA Astrophysics Data System (ADS)

    Gerstmaier, Tobias; Gomez, María; Gombert, Andreas; Mermoud, André; Lejeune, Thibault

    2011-12-01

    The accuracy of the two-stage PVSyst model for the Concentrix CPV Technology is determined by comparing modeled to measured values. For both stages, i) the module model and ii) the power plant model, the underlying approaches are explained and methods for obtaining the model parameters are presented. The performance of both models is quantified using 19 months of outdoor measurements for the module model and 9 months of measurements at four different sites for the power plant model. Results are presented by giving statistical quantities for the model accuracy.
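
    The abstract does not name the statistical quantities used; typical choices when quantifying model accuracy against outdoor measurements are the mean bias error and the (relative) root-mean-square error, sketched here as an assumption.

```python
import math

def accuracy_stats(modeled, measured):
    """Mean bias error and RMSE of modeled vs. measured values, plus both
    quantities normalized by the mean measured value."""
    n = len(modeled)
    residuals = [mo - me for mo, me in zip(modeled, measured)]
    mbe = sum(residuals) / n
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    mean_measured = sum(measured) / n
    return {"mbe": mbe, "rmse": rmse,
            "rel_mbe": mbe / mean_measured, "rel_rmse": rmse / mean_measured}
```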

  3. Validation of land-surface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

    The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their ''native'' AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: ''To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?'' There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were to be alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g., vegetation properties, surface albedos and roughnesses) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies and statistics that are sufficiently penetrating to reveal ''signatures'' of particular LSS representations (e.g., ''bucket'' vs. more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  4. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowance for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared to other theoretical models mathematically. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted by using the recovery model, and individual recovery rates were calculated after fitting. Good fitting values (r² > .8) were found for all the subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after fatiguing operation. The determined recovery rate may be useful to represent individual recovery attributes.
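
    The per-subject curve fitting described above — fitting a recovery profile and extracting an individual recovery rate and an r² value — can be sketched under the assumption of a single-exponential recovery model s(t) = 1 − (1 − s0)·exp(−R·t); the paper's exact model form is not given in the abstract, so this form and all names are illustrative.

```python
import math, random

def simulate_recovery(rate, s0=0.6, times=range(11), noise=0.005, seed=1):
    """Synthetic strength measurements (fraction of pre-fatigue maximum)."""
    rng = random.Random(seed)
    return [(t, 1 - (1 - s0) * math.exp(-rate * t) + rng.gauss(0, noise))
            for t in times]

def fit_recovery(data):
    """Fit s(t) = 1 - (1 - s0) * exp(-R * t) by log-linearizing ln(1 - s),
    then report the recovery rate R, the fitted s0, and r-squared."""
    pts = [(t, math.log(1 - s)) for t, s in data if s < 1]
    n = len(pts)
    sx = sum(t for t, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(t * t for t, _ in pts); sxy = sum(t * y for t, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    r_rate = -slope
    s0 = 1 - math.exp((sy - slope * sx) / n)
    mean_s = sum(s for _, s in data) / len(data)
    ss_tot = sum((s - mean_s) ** 2 for _, s in data)
    ss_res = sum((s - (1 - (1 - s0) * math.exp(-r_rate * t))) ** 2
                 for t, s in data)
    return r_rate, s0, 1 - ss_res / ss_tot
```

    Fitting each subject's measurements separately yields the individual recovery rates whose between-muscle-group differences the study then tests.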

  5. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. After outlining its components, we review the application and reporting of model performance evaluation. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross-model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed

  6. Lessons learned from recent geomagnetic disturbance model validation activities

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of these activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action continues under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on past experience and expands the collaboration to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.
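
    Validation exercises of this kind typically score models on threshold crossings of the measured versus modeled perturbation using contingency-table metrics such as probability of detection, false-alarm ratio, and the Heidke skill score. The sketch below illustrates that style of event-based validation; the specific metric set is an assumption, not a quote from the paper.

```python
def event_skill(obs, mod, threshold):
    """Contingency-table verification of threshold crossings: probability of
    detection (POD), false-alarm ratio (FAR), and Heidke skill score (HSS)."""
    hits = miss = false = corr = 0
    for o, m in zip(obs, mod):
        oe, me = o >= threshold, m >= threshold
        if oe and me:
            hits += 1
        elif oe:
            miss += 1
        elif me:
            false += 1
        else:
            corr += 1
    n = hits + miss + false + corr
    pod = hits / (hits + miss) if hits + miss else float("nan")
    far = false / (hits + false) if hits + false else float("nan")
    # expected number correct by chance, then skill relative to that baseline
    expected = ((hits + miss) * (hits + false)
                + (corr + miss) * (corr + false)) / n
    hss = (hits + corr - expected) / (n - expected) if n != expected else float("nan")
    return pod, far, hss
```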

  7. Temporal and external validation of a prediction model for adverse outcomes among inpatients with diabetes.

    PubMed

    Adderley, N J; Mallett, S; Marshall, T; Ghosh, S; Rayman, G; Bellary, S; Coleman, J; Akiboye, F; Toulis, K A; Nirantharakumar, K

    2018-06-01

    To temporally and externally validate our previously developed prediction model, which used data from University Hospitals Birmingham to identify inpatients with diabetes at high risk of adverse outcome (mortality or excessive length of stay), in order to demonstrate its applicability to other hospital populations within the UK. Temporal validation was performed using data from University Hospitals Birmingham and external validation was performed using data from both the Heart of England NHS Foundation Trust and Ipswich Hospital. All adult inpatients with diabetes were included. Variables included in the model were age, gender, ethnicity, admission type, intensive therapy unit admission, insulin therapy, albumin, sodium, potassium, haemoglobin, C-reactive protein, estimated GFR and neutrophil count. Adverse outcome was defined as excessive length of stay or death. Model discrimination in the temporal and external validation datasets was good. In temporal validation using data from University Hospitals Birmingham, the area under the curve was 0.797 (95% CI 0.785-0.810), sensitivity was 70% (95% CI 67-72) and specificity was 75% (95% CI 74-76). In external validation using data from Heart of England NHS Foundation Trust, the area under the curve was 0.758 (95% CI 0.747-0.768), sensitivity was 73% (95% CI 71-74) and specificity was 66% (95% CI 65-67). In external validation using data from Ipswich, the area under the curve was 0.736 (95% CI 0.711-0.761), sensitivity was 63% (95% CI 59-68) and specificity was 69% (95% CI 67-72). These results were similar to those for the internally validated model derived from University Hospitals Birmingham. The prediction model to identify patients with diabetes at high risk of developing an adverse event while in hospital performed well in temporal and external validation. The externally validated prediction model is a novel tool that can be used to improve care pathways for inpatients with diabetes. Further research to assess
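
    The reported quantities — area under the ROC curve, and sensitivity and specificity at a chosen risk-score threshold — can be computed from predicted scores and observed outcomes as follows. This is a generic sketch of the standard definitions, not the authors' code.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    the probability a random positive outscores a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when scores >= threshold are called positive."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < threshold)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < threshold)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)
```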

  8. Differential Validation of a Path Analytic Model of University Dropout.

    ERIC Educational Resources Information Center

    Winteler, Adolf

    Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Education and Economics Department. A…

  9. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting
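
    The pre-processing step described — dropping variables absent in more than half the records, then applying an adjustment method such as the mean-based imputation named in the abstract — might look like the sketch below. The dict-of-records representation and the function name are assumptions for illustration.

```python
def preprocess(records, max_missing=0.5):
    """Drop variables missing in more than max_missing of the records, then
    mean-impute the remaining gaps (one of the adjustment methods named)."""
    n = len(records)
    variables = sorted({v for r in records for v in r})
    keep = [v for v in variables
            if sum(1 for r in records if r.get(v) is None) / n <= max_missing]
    means = {}
    for v in keep:
        present = [r[v] for r in records if r.get(v) is not None]
        means[v] = sum(present) / len(present)
    return [{v: (r[v] if r.get(v) is not None else means[v]) for v in keep}
            for r in records]
```

    The study's alternative adjustment method, complete-case analysis, would instead discard any record that still has a missing value after the variable-dropping step.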

  10. Exploring the Validity of Proposed Transgenic Animal Models of Attention-Deficit Hyperactivity Disorder (ADHD).

    PubMed

    de la Peña, June Bryan; Dela Peña, Irene Joy; Custodio, Raly James; Botanas, Chrislean Jun; Kim, Hee Jin; Cheong, Jae Hoon

    2018-05-01

    Attention-deficit/hyperactivity disorder (ADHD) is a common, behavioral, and heterogeneous neurodevelopmental condition characterized by hyperactivity, impulsivity, and inattention. Symptoms of this disorder are managed by treatment with methylphenidate, amphetamine, and/or atomoxetine. The cause of ADHD is unknown, but substantial evidence indicates that this disorder has a significant genetic component. Transgenic animals have become an essential tool in uncovering the genetic factors underlying ADHD. Although they cannot accurately reflect the human condition, they can provide insights into the disorder that cannot be obtained from human studies due to various limitations. An ideal animal model of ADHD must have face (similarity in symptoms), predictive (similarity in response to treatment or medications), and construct (similarity in etiology or underlying pathophysiological mechanism) validity. As the exact etiology of ADHD remains unclear, the construct validity of animal models of ADHD would always be limited. The proposed transgenic animal models of ADHD have substantially increased and diversified over the years. In this paper, we compiled and explored the validity of proposed transgenic animal models of ADHD. Each of the reviewed transgenic animal models has strengths and limitations. Some fulfill most of the validity criteria of an animal model of ADHD and have been extensively used, while there are others that require further validation. Nevertheless, these transgenic animal models of ADHD have provided and will continue to provide valuable insights into the genetic underpinnings of this complex disorder.

  11. Design and validation of an immunoaffinity LC-MS/MS assay for the quantification of a collagen type II neoepitope peptide in human urine: application as a biomarker of osteoarthritis.

    PubMed

    Nemirovskiy, Olga; Li, Wenlin Wendy; Szekely-Klepser, Gabriella

    2010-01-01

    Biomarkers play an increasingly important role in drug efficacy and safety evaluation at all stages of drug development. It is especially important to develop and validate sensitive and selective biomarkers for diseases where the onset is very slow and/or the disease progression is hard to follow, e.g., osteoarthritis (OA). The degradation of type II collagen has been associated with the disease state of OA. Matrix metalloproteinases (MMPs) are enzymes that catalyze the degradation of collagen and are therefore pursued as potential targets for the treatment of OA. Peptide biomarkers of MMP activity related to type II collagen degradation were identified, and the presence of these peptides in MMP digests of human articular cartilage (HAC) explants and in human urine was confirmed. An immunoaffinity LC-MS/MS assay for the quantification of the most abundant urinary type II collagen neoepitope (uTIINE) peptide, a 45-mer with five hydroxyproline residues, was developed and clinically validated. The assay has subsequently been applied to analyze human urine samples from clinical studies. We have shown that the assay is able to differentiate between symptomatic OA and normal subjects, indicating that uTIINE can be used as a potential biomarker for OA. This chapter discusses the assay procedure and provides information on the validation experiments used to evaluate the accuracy, precision, and selectivity data, with attention to the specific challenges related to the quantification of endogenous protein/peptide biomarker analytes. The generalized approach can be used as a follow-up to studies in which proteomics-based urinary biomarkers are identified and an assay needs to be developed. Considerations for the validation of such an assay are described.

  12. Chronic inhibition of Ca(2+)/calmodulin kinase II activity in the pilocarpine model of epilepsy.

    PubMed

    Churn, S B; Kochan, L D; DeLorenzo, R J

    2000-09-01

    The development of symptomatic epilepsy is a model of long-term plasticity changes in the central nervous system. The rat pilocarpine model of epilepsy was utilized to study persistent alterations in calcium/calmodulin-dependent kinase II (CaM kinase II) activity associated with epileptogenesis. CaM kinase II-dependent substrate phosphorylation and autophosphorylation were significantly inhibited for up to 6 weeks following epileptogenesis in both the cortex and hippocampus, but not in the cerebellum. The net decrease in CaM kinase II autophosphorylation and substrate phosphorylation was shown to be due to decreased kinase activity and not due to increased phosphatase activity. The inhibition in CaM kinase II activity and the development of epilepsy were blocked by pretreating seizure rats with MK-801, indicating that the long-lasting decrease in CaM kinase II activity was dependent on N-methyl-D-aspartate receptor activation. In addition, the inhibition of CaM kinase II activity was associated in time and regional localization with the development of spontaneous recurrent seizure activity. The decrease in enzyme activity was not attributed to a decrease in the alpha or beta kinase subunit protein expression level. Thus, the significant inhibition of the enzyme occurred without changes in kinase protein expression, suggesting a long-lasting, post-translational modification of the enzyme. This is the first published report of a persistent, post-translational alteration of CaM kinase II activity in a model of epilepsy characterized by spontaneous recurrent seizure activity.

  13. Polarized Light Scanning Cryomacroscopy, Part II: Thermal Modeling and Analysis of Experimental Observations

    PubMed Central

    Feig, Justin S.G.; Solanki, Prem K.; Eisenberg, David P.; Rabin, Yoed

    2016-01-01

    This study aims at developing thermal analysis tools and explaining experimental observations made by means of polarized-light cryomacroscopy (Part I). Thermal modeling is based on finite elements analysis (FEA), where two model parameters are extracted from thermal measurements: (i) the overall heat transfer coefficient between the cuvette and the cooling chamber, and (ii) the effective thermal conductivity within the cryoprotective agent (CPA) at the upper part of the cryogenic temperature range. The effective thermal conductivity takes into account enhanced heat transfer due to convection currents within the CPA, creating the so-called Bénard cells. Comparison of experimental results with simulation data indicates that the uncertainty in simulations due to the propagation of uncertainty in measured physical properties exceeds the uncertainty in experimental measurements, which validates the modeling approach. It is shown in this study that while a cavity may form in the upper-center portion of the vitrified CPA, it has very little effect on estimating the temperature distribution within the domain. This cavity is driven by thermal contraction of the CPA, with the upper-center of the domain transitioning to glass last. Finally, it is demonstrated in this study that additional stresses may develop within the glass transition temperature range due to nonlinear behavior of the thermal expansion coefficient. This effect is reported here for the first time in the context of cryobiology, using the capabilities of polarized-light cryomacroscopy. PMID:27343139

  15. Competitive adsorption of copper(II), cadmium(II), lead(II) and zinc(II) onto basic oxygen furnace slag.

    PubMed

    Xue, Yongjie; Hou, Haobo; Zhu, Shujing

    2009-02-15

    Polluted and contaminated water can often contain more than one heavy metal species. It is possible that the behavior of a particular metal species in a solution system will be affected by the presence of other metals. In this study, we have investigated the adsorption of Cd(II), Cu(II), Pb(II), and Zn(II) onto basic oxygen furnace slag (BOF slag) in single- and multi-element solution systems as a function of pH and concentration, in a background solution of 0.01 M NaNO3. In adsorption edge experiments, the pH was varied from 2.0 to 13.0, with a total metal concentration of 0.84 mM in the single-element system and 0.21 mM each of Cd(II), Cu(II), Pb(II), and Zn(II) in the multi-element system. The value of pH50 (the pH at which 50% adsorption occurs) was found to follow the sequence Zn > Cu > Pb > Cd in single-element systems, but Pb > Cu > Zn > Cd in the multi-element system. Adsorption isotherms at pH 6.0 in the multi-element systems showed that there is competition among the various metals for adsorption sites on BOF slag. The adsorption and potentiometric titration data for the various slag-metal systems were modeled using an extended constant-capacitance surface complexation model that assumed an ion-exchange process below pH 6.5 and the formation of inner-sphere surface complexes at higher pH. Inner-sphere complexation was more dominant for the Cu(II), Pb(II) and Zn(II) systems.

  16. Temporal validation for landsat-based volume estimation model

    Treesearch

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  17. Calcium-manganese oxides as structural and functional models for active site in oxygen evolving complex in photosystem II: lessons from simple models.

    PubMed

    Najafpour, Mohammad Mahdi

    2011-01-01

    The oxygen evolving complex in photosystem II, which induces the oxidation of water to dioxygen in plants, algae and certain bacteria, contains a cluster of one calcium and four manganese ions. It serves as a model for splitting water with sunlight. Reports on the mechanism and structure of photosystem II provide a more detailed architecture of the oxygen evolving complex and the surrounding amino acids. One challenge in this field is the development of artificial model compounds to study the oxygen evolution reaction outside the complicated environment of the enzyme. Calcium-manganese oxides as structural and functional models for the active site of photosystem II are explained and reviewed in this paper. Because of the related structures of these calcium-manganese oxides and the catalytic centers of the active site of the oxygen evolving complex of photosystem II, the study may help to understand more about the mechanism of oxygen evolution by the oxygen evolving complex of photosystem II. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Modeling and validating HL7 FHIR profiles using semantic web Shape Expressions (ShEx).

    PubMed

    Solbrig, Harold R; Prud'hommeaux, Eric; Grieve, Grahame; McKenzie, Lloyd; Mandel, Joshua C; Sharma, Deepak K; Jiang, Guoqian

    2017-03-01

    HL7 Fast Healthcare Interoperability Resources (FHIR) is an emerging open standard for the exchange of electronic healthcare information. FHIR resources are defined in a specialized modeling language. FHIR instances can currently be represented in either XML or JSON. The FHIR and Semantic Web communities are developing a third FHIR instance representation format in Resource Description Framework (RDF). Shape Expressions (ShEx), a formal RDF data constraint language, is a candidate for describing and validating the FHIR RDF representation. Our objective was to create a FHIR-to-ShEx model transformation and assess its ability to describe and validate FHIR RDF data. We created the methods and tools that generate the ShEx schemas modeling the FHIR to RDF specification being developed by the HL7 ITS/W3C RDF Task Force, and evaluated the applicability of ShEx in the description and validation of FHIR to RDF transformations. The ShEx models contributed significantly to workgroup consensus. Algorithmic transformations from the FHIR model to ShEx schemas and FHIR example data to RDF transformations were incorporated into the FHIR build process. ShEx schemas representing 109 FHIR resources were used to validate 511 FHIR RDF data examples from the Standards for Trial Use (STU 3) Ballot version. We were able to uncover unresolved issues in the FHIR to RDF specification and detect 10 types of errors and root causes in the actual implementation. The FHIR ShEx representations have been included in the official FHIR web pages for the STU 3 Ballot version since September 2016. ShEx can be used to define and validate the syntax of a FHIR resource, which is complementary to the use of RDF Schema (RDFS) and Web Ontology Language (OWL) for semantic validation. ShEx proved useful for describing a standard model of FHIR RDF data. The combination of a formal model and a succinct format enabled comprehensive review and automated validation. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Validation of the World Health Organization Disability Assessment Schedule (WHO-DAS II) in Greek and its added value to the Short Form 36 (SF-36) in a sample of people with or without disabilities.

    PubMed

    Xenouli, Georgia; Xenoulis, Kostis; Sarafis, Pavlos; Niakas, Dimitris; Alexopoulos, Evangelos C

    2016-07-01

    There is controversy and ongoing interest in the measurement of functionality at the personal and social level. The aims were (1) to validate the Greek version of the World Health Organization Disability Assessment Schedule (WHO-DAS II) and (2) to determine its added value to the physical and psychological health subscales of the Short Form 36 (SF-36). In a cross-sectional design, data were collected between December 2014 and March 2015 using three questionnaires (WHO-DAS II, SF-36, PSS-14) in a sample of people with disabilities (n = 101) and without disabilities (n = 109) in Athens, Greece. WHO-DAS II internal consistency, construct and criterion-related validity were assessed by Cronbach's alpha, exploratory factor analysis and correlations; its added value by multivariable linear regression. Cronbach's alphas were satisfactory for the WHO-DAS II, PSS-14 and SF-36 (0.85, 0.88 and 0.96, respectively). Exploratory factor analysis confirmed the existence of one or two factors in people with or without disabilities, respectively. The WHO-DAS II score showed a significant negative correlation with the physical and mental health scales of the SF-36, especially strong for physical health, while it was positively related to the PSS-14 score. In multivariate analysis, mental health appraisal was related to perceived stress in both groups. This study supports the validity of the Greek version of the WHO-DAS II and warrants its use in the assessment and follow-up of people with disabilities, contributing to the development of suitable policies to cover their needs and providing data comparable with other surveys using the same instrument. Copyright © 2016 Elsevier Inc. All rights reserved.
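
    Cronbach's alpha, the internal-consistency statistic reported above, is computed from the item variances and the variance of the total score. A minimal sketch of the standard formula:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
    total scores). `items` is a list of k items, each a list of n responses."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

    Perfectly parallel items give alpha = 1; values around 0.85-0.96, as reported for the three instruments, indicate high internal consistency.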

  20. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.
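The discrimination measure used above, Harrell's C-index, counts concordant pairs among comparable pairs in right-censored survival data. A minimal pure-Python sketch of that calculation; the toy data are illustrative, not HMS Consortium data:

```python
import itertools

def harrell_c(scores, times, events):
    """Harrell's concordance index for right-censored data.

    A pair is comparable when the subject with the shorter time had an
    observed event (events[i] == 1); the pair is concordant when that
    subject also has the higher risk score. Score ties count as 0.5.
    """
    concordant = comparable = 0.0
    for i, j in itertools.combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i  # make i the earlier time
        if times[i] == times[j] or not events[i]:
            continue  # tied times or earlier subject censored: not comparable
        comparable += 1
        if scores[i] > scores[j]:
            concordant += 1
        elif scores[i] == scores[j]:
            concordant += 0.5
    return concordant / comparable

# Higher risk scores paired with earlier events -> perfect discrimination
print(harrell_c(scores=[3, 2, 1], times=[1, 2, 3], events=[1, 1, 1]))
```

A C-index of 0.5 corresponds to chance-level discrimination, which puts the 0.58-0.64 range reported above in context.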

  1. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

The Space Launch System (SLS), NASA's new large launch vehicle for long-range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculating interface loads, natural frequencies, and mode shapes for guidance, navigation, and control (GNC). Because of program and schedule constraints, a single modal test of the SLS will be performed while it is bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be used to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question remains whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique for developing a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ, UP, and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to develop more reasoned metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating from the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the use of a limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
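The Monte Carlo step described above (dispersing model properties and selecting the model that best matches the test frequencies) can be illustrated on a single spring-mass oscillator. All numbers below are hypothetical stand-ins, not SLS values:

```python
import math
import random

random.seed(0)  # deterministic for illustration

def natural_freq_hz(k, m):
    """Natural frequency of a single spring-mass oscillator: f = sqrt(k/m)/(2*pi)."""
    return math.sqrt(k / m) / (2 * math.pi)

# Hypothetical nominal properties and a hypothetical measured test frequency
k_nom, m_nom = 4.0e6, 100.0  # N/m, kg (illustrative values only)
f_test = 31.0                # Hz, pretend modal-test result

# Monte Carlo: disperse stiffness +/-10% and mass +/-5%, keep the best match
best = min(
    ((random.uniform(0.9, 1.1) * k_nom, random.uniform(0.95, 1.05) * m_nom)
     for _ in range(10000)),
    key=lambda km: abs(natural_freq_hz(*km) - f_test),
)
print(abs(natural_freq_hz(*best) - f_test) < 0.01)  # a close match was found
```

The real problem is of course high-dimensional (thousands of models, many modes), but the selection criterion has the same shape.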

  2. Radiative transfer model validations during the First ISLSCP Field Experiment

    NASA Technical Reports Server (NTRS)

    Frouin, Robert; Breon, Francois-Marie; Gautier, Catherine

    1990-01-01

Two simple radiative transfer models, the 5S model based on Tanre et al. (1985, 1986) and the wide-band model of Morcrette (1984), are validated by comparing their outputs with concomitant radiosonde, aerosol turbidity, and radiation measurements and sky photographs obtained during the First ISLSCP Field Experiment. Results showed that the 5S model overestimated the short-wave irradiance by 13.2 W/sq m, whereas the Morcrette model underestimated the long-wave irradiance by 7.4 W/sq m.

  3. Heat Transfer Modeling and Validation for Optically Thick Alumina Fibrous Insulation

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran

    2009-01-01

Combined radiation/conduction heat transfer through unbonded alumina fibrous insulation was modeled using the diffusion approximation for the radiation component of heat transfer in the optically thick insulation. The validity of the heat transfer model was investigated by comparison to previously reported experimental effective thermal conductivity data over the insulation density range of 24 to 96 kg/cu m, a pressure range of 0.001 to 750 torr (0.1 to 101.3 × 10³ Pa), and a test sample hot-side temperature range of 530 to 1360 K. The model was further validated by comparison to thermal conductivity measurements using the transient step heating technique on an insulation sample at a density of 144 kg/cu m over a pressure range of 0.001 to 760 torr and a temperature range of 290 to 1090 K.
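In optically thick media, the diffusion approximation typically represents radiation as an effective (Rosseland) conductivity that adds to the gas and solid conduction terms. A sketch of that standard formula, with illustrative inputs rather than values from the paper:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_conductivity(T, beta_R, n=1.0):
    """Rosseland diffusion approximation for an optically thick medium:

        k_rad = 16 * n^2 * sigma * T^3 / (3 * beta_R)

    T      : temperature (K)
    beta_R : Rosseland mean extinction coefficient (1/m)
    n      : effective refractive index
    """
    return 16.0 * n**2 * SIGMA * T**3 / (3.0 * beta_R)

# Illustrative: 1000 K and beta_R = 5000 1/m give k_rad of order 0.06 W/m/K
print(radiative_conductivity(1000.0, beta_R=5000.0))
```

The strong T³ dependence is why radiation dominates the effective conductivity of fibrous insulation at the high end of the temperature range tested.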

  4. Fracture characterization of human cortical bone under mode II loading using the end-notched flexure test.

    PubMed

    Silva, F G A; de Moura, M F S F; Dourado, N; Xavier, J; Pereira, F A M; Morais, J J L; Dias, M I R; Lourenço, P J; Judas, F M

    2017-08-01

Fracture characterization of human cortical bone under mode II loading was analyzed using a miniaturized version of the end-notched flexure test. A data reduction scheme based on the crack equivalent concept was employed to overcome uncertainties in crack length monitoring during the test. The crack tip shear displacement was experimentally measured using the digital image correlation technique to determine the cohesive law that mimics bone fracture behavior under mode II loading. The developed procedure was validated by finite element analysis using cohesive zone modeling, considering a trapezoidal relationship with bilinear softening. Experimental load-displacement curves, resistance curves, and crack tip shear displacement versus applied displacement were used to validate the numerical procedure. The excellent agreement observed between the numerical and experimental results reveals the appropriateness of the proposed test and procedure to characterize human cortical bone fracture under mode II loading. The proposed methodology can be viewed as a novel, valuable tool for parametric and methodical clinical studies of features (e.g., age, diseases, drugs) influencing bone shear fracture under mode II loading.

  5. Turning Defense into Offense: Defensin Mimetics as Novel Antibiotics Targeting Lipid II

    PubMed Central

    Ateh, Eugene; Oashi, Taiji; Lu, Wuyuan; Huang, Jing; Diepeveen-de Buin, Marlies; Bryant, Joseph; Breukink, Eefjan; MacKerell, Alexander D.; de Leeuw, Erik P. H.

    2013-01-01

    We have previously reported on the functional interaction of Lipid II with human alpha-defensins, a class of antimicrobial peptides. Lipid II is an essential precursor for bacterial cell wall biosynthesis and an ideal and validated target for natural antibiotic compounds. Using a combination of structural, functional and in silico analyses, we present here the molecular basis for defensin-Lipid II binding. Based on the complex of Lipid II with Human Neutrophil peptide-1, we could identify and characterize chemically diverse low-molecular weight compounds that mimic the interactions between HNP-1 and Lipid II. Lead compound BAS00127538 was further characterized structurally and functionally; it specifically interacts with the N-acetyl muramic acid moiety and isoprenyl tail of Lipid II, targets cell wall synthesis and was protective in an in vivo model for sepsis. For the first time, we have identified and characterized low molecular weight synthetic compounds that target Lipid II with high specificity and affinity. Optimization of these compounds may allow for their development as novel, next generation therapeutic agents for the treatment of Gram-positive pathogenic infections. PMID:24244161

  6. SHERMAN, a shape-based thermophysical model. I. Model description and validation

    NASA Astrophysics Data System (ADS)

    Magri, Christopher; Howell, Ellen S.; Vervack, Ronald J.; Nolan, Michael C.; Fernández, Yanga R.; Marshall, Sean E.; Crowell, Jenna L.

    2018-03-01

    SHERMAN, a new thermophysical modeling package designed for analyzing near-infrared spectra of asteroids and other solid bodies, is presented. The model's features, the methods it uses to solve for surface and subsurface temperatures, and the synthetic data it outputs are described. A set of validation tests demonstrates that SHERMAN produces accurate output in a variety of special cases for which correct results can be derived from theory. These cases include a family of solutions to the heat equation for which thermal inertia can have any value and thermophysical properties can vary with depth and with temperature. An appendix describes a new approximation method for estimating surface temperatures within spherical-section craters, more suitable for modeling infrared beaming at short wavelengths than the standard method.

  7. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    PubMed

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  8. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  9. Modeling and optimization by particle swarm embedded neural network for adsorption of zinc (II) by palm kernel shell based activated carbon from aqueous environment.

    PubMed

    Karri, Rama Rao; Sahu, J N

    2018-01-15

Zn (II) is one of the common pollutants among the heavy metals found in industrial effluents. Removal of pollutants from industrial effluents can be accomplished by various techniques, of which adsorption has been found to be an efficient method. The application of adsorption is limited by the high cost of adsorbents. In this regard, a low-cost adsorbent produced from palm oil kernel shell, an agricultural waste, is examined for its efficiency in removing Zn (II) from wastewater and aqueous solution. The influence of independent process variables such as initial concentration, pH, residence time, activated carbon (AC) dosage, and process temperature on the removal of Zn (II) by palm kernel shell based AC in a batch adsorption process is studied systematically. Based on the design-of-experiments matrix, 50 experimental runs are performed with each process variable in the experimental range. The optimal values of the process variables for maximum removal efficiency are studied using response surface methodology (RSM) and artificial neural network (ANN) approaches. A quadratic model, consisting of first-order and second-order regression terms, is developed using analysis of variance and the RSM-CCD framework. Particle swarm optimization, a meta-heuristic method, is embedded in the ANN architecture to optimize the search space of the neural network. The optimized, trained neural network fits the testing and validation data well, with R 2 equal to 0.9106 and 0.9279, respectively. The outcomes indicate the superiority of the ANN-PSO based model predictions over the quadratic model predictions provided by RSM. Copyright © 2017 Elsevier Ltd. All rights reserved.
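The PSO step described above uses the standard particle swarm velocity/position update. A minimal sketch of that optimizer follows, applied to a stand-in quadratic objective rather than the paper's trained network (all parameter values are conventional defaults, not from the study):

```python
import random

random.seed(1)  # deterministic for illustration

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer minimizing f over a box."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: minimize (x-2)^2 + (y+1)^2. A real use would minimize
# the ANN's prediction error over the adsorption process variables.
best, val = pso(lambda p: (p[0] - 2) ** 2 + (p[1] + 1) ** 2,
                bounds=[(-10, 10), (-10, 10)])
print(round(best[0]), round(best[1]))  # converges near (2, -1)
```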

  10. When is the Anelastic Approximation a Valid Model for Compressible Convection?

    NASA Astrophysics Data System (ADS)

    Alboussiere, T.; Curbelo, J.; Labrosse, S.; Ricard, Y. R.; Dubuffet, F.

    2017-12-01

Compressible convection is ubiquitous in large natural systems such as planetary atmospheres and stellar and planetary interiors. Its modelling is notoriously more difficult than the case where the Boussinesq approximation applies. One reason for this difficulty was put forward by Ogura and Phillips (1961): the compressible equations generate sound waves with very short time scales, which need to be resolved. This is why they introduced an anelastic model, based on an expansion of the solution around an isentropic hydrostatic profile. How accurate is that anelastic model? What are the conditions for its validity? To answer these questions, we have developed a numerical model for the full set of compressible equations and compared its solutions with those of the corresponding anelastic model. We considered a simple rectangular 2D Rayleigh-Bénard configuration and restricted the analysis to infinite Prandtl numbers. This choice is valid for convection in the mantles of rocky planets, but more importantly leads to a zero Mach number, which removes the question of the interference of acoustic waves with convection. In that simplified context, we used the entropy balances (that of the full set of equations and that of the anelastic model) to investigate the differences between exact and anelastic solutions. We found that the validity of the anelastic model is dictated by two conditions: first, the superadiabatic temperature difference must be small compared with the adiabatic temperature difference (as expected), ε = ΔT_SA / ΔT_a ≪ 1; and second, the product of ε with the Nusselt number must also be small.
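The two validity conditions can be checked mechanically once the superadiabatic and adiabatic temperature differences and the Nusselt number are known. A small sketch (the numeric threshold standing in for "≪ 1" is an illustrative choice, not from the paper):

```python
def anelastic_valid(dT_superadiabatic, dT_adiabatic, nusselt, tol=0.1):
    """Check the two reported validity conditions for the anelastic model:

        epsilon = dT_SA / dT_a << 1   and   epsilon * Nu << 1

    where '<<' is read here as 'smaller than tol' (an arbitrary threshold
    chosen for illustration).
    """
    eps = dT_superadiabatic / dT_adiabatic
    return eps < tol and eps * nusselt < tol

print(anelastic_valid(1.0, 100.0, nusselt=5.0))   # eps = 0.01, eps*Nu = 0.05
print(anelastic_valid(1.0, 100.0, nusselt=50.0))  # eps*Nu = 0.5: too large
```

The second condition is the non-obvious one: even a small ε can invalidate the anelastic approximation when convection is vigorous (large Nu).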

  11. On the validation of a code and a turbulence model appropriate to circulation control airfoils

    NASA Technical Reports Server (NTRS)

    Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.

    1988-01-01

    A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.

  12. Use of the Ames Check Standard Model for the Validation of Wall Interference Corrections

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Amaya, M.; Flach, R.

    2018-01-01

    The new check standard model of the NASA Ames 11-ft Transonic Wind Tunnel was chosen for a future validation of the facility's wall interference correction system. The chosen validation approach takes advantage of the fact that test conditions experienced by a large model in the slotted part of the tunnel's test section will change significantly if a subset of the slots is temporarily sealed. Therefore, the model's aerodynamic coefficients have to be recorded, corrected, and compared for two different test section configurations in order to perform the validation. Test section configurations with highly accurate Mach number and dynamic pressure calibrations were selected for the validation. First, the model is tested with all test section slots in open configuration while keeping the model's center of rotation on the tunnel centerline. In the next step, slots on the test section floor are sealed and the model is moved to a new center of rotation that is 33 inches below the tunnel centerline. Then, the original angle of attack sweeps are repeated. Afterwards, wall interference corrections are applied to both test data sets and response surface models of the resulting aerodynamic coefficients in interference-free flow are generated. Finally, the response surface models are used to predict the aerodynamic coefficients for a family of angles of attack while keeping dynamic pressure, Mach number, and Reynolds number constant. The validation is considered successful if the corrected aerodynamic coefficients obtained from the related response surface model pair show good agreement. Residual differences between the corrected coefficient sets will be analyzed as well because they are an indicator of the overall accuracy of the facility's wall interference correction process.

  13. The customization of APACHE II for patients receiving orthotopic liver transplants

    PubMed Central

    Moreno, Rui

    2002-01-01

    General outcome prediction models developed for use with large, multicenter databases of critically ill patients may not correctly estimate mortality if applied to a particular group of patients that was under-represented in the original database. The development of new diagnostic weights has been proposed as a method of adapting the general model – the Acute Physiology and Chronic Health Evaluation (APACHE) II in this case – to a new group of patients. Such customization must be empirically tested, because the original model cannot contain an appropriate set of predictive variables for the particular group. In this issue of Critical Care, Arabi and co-workers present the results of the validation of a modified model of the APACHE II system for patients receiving orthotopic liver transplants. The use of a highly heterogeneous database for which not all important variables were taken into account and of a sample too small to use the Hosmer–Lemeshow goodness-of-fit test appropriately makes their conclusions uncertain. PMID:12133174

  14. Qualitative Validation of the IMM Model for ISS and STS Programs

    NASA Technical Reports Server (NTRS)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  15. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  16. Validation of the Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM).

    PubMed

    Willis, Michael; Johansen, Pierre; Nilsson, Andreas; Asseburg, Christian

    2017-03-01

The Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM) was developed to address study questions pertaining to the cost-effectiveness of treatment alternatives in the care of patients with type 2 diabetes mellitus (T2DM). Naturally, the usefulness of a model is determined by the accuracy of its predictions. A previous version of ECHO-T2DM was validated against actual trial outcomes and the model predictions were generally accurate. However, there have been recent upgrades to the model, which modify model predictions and necessitate an update of the validation exercises. The objectives of this study were to extend the methods available for evaluating model validity, to conduct a formal model validation of ECHO-T2DM (version 2.3.0) in accordance with the principles espoused by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM), and secondarily to evaluate the relative accuracy of four sets of macrovascular risk equations included in ECHO-T2DM. We followed the ISPOR/SMDM guidelines on model validation, evaluating face validity, verification, cross-validation, and external validation. Model verification involved 297 'stress tests', in which specific model inputs were modified systematically to ascertain correct model implementation. Cross-validation consisted of a comparison between ECHO-T2DM predictions and those of the seminal National Institutes of Health model. In external validation, study characteristics were entered into ECHO-T2DM to replicate the clinical results of 12 studies (including 17 patient populations), and model predictions were compared to observed values using established statistical techniques as well as measures of average prediction error, separately for the four sets of macrovascular risk equations supported in ECHO-T2DM. Sub-group analyses were conducted for dependent vs. independent outcomes and for microvascular vs. macrovascular vs. mortality outcomes.

  17. Antibody-directed neutralization of annexin II (ANX II) inhibits neoangiogenesis and human breast tumor growth in a xenograft model.

    PubMed

    Sharma, Meena; Blackman, Marc R; Sharma, Mahesh C

    2012-02-01

Activation of the fibrinolytic pathway has long been associated with human breast cancer. Plasmin is the major end product of the fibrinolytic pathway and is critical for normal physiological functions. The mechanism by which plasmin is generated in breast cancer is not yet fully described. We previously identified annexin II (ANX II), a fibrinolytic receptor, in human breast tumor tissue samples and observed a strong positive correlation with advanced stage cancer (Sharma et al., 2006a). We further demonstrated that tissue plasminogen activator (tPA) binds to ANX II in invasive breast cancer MDA-MB231 cells, which leads to plasmin generation (Sharma et al., 2010). We hypothesize that ANX II-dependent plasmin generation in breast tumors is necessary to trigger the switch to neoangiogenesis, thereby stimulating a more aggressive cancer phenotype. Our immunohistochemical studies of human breast tumor tissues provide compelling evidence of a strong positive correlation between ANX II expression and neoangiogenesis, and suggest that ANX II is a potential target to slow or inhibit breast tumor growth by inhibiting neoangiogenesis. We now report that administration of anti-ANX II antibody potently inhibits the growth of human breast tumors in a xenograft model. Inhibition of tumor growth is at least partly due to attenuation of neoangiogenic activity within the tumor. In vitro studies demonstrate that anti-ANX II antibody inhibits angiogenesis in three-dimensional Matrigel cultures by eliciting endothelial cell (EC) death, likely due to apoptosis. Taken together, these data suggest that selective disruption of the fibrinolytic activity of ANX II may provide a novel strategy for specific inhibition of neoangiogenesis in human breast cancer. Published by Elsevier Inc.

  18. Addendum to validation of FHWA's Traffic Noise Model (TNM) : phase 1

    DOT National Transportation Integrated Search

    2004-07-01

The Federal Highway Administration (FHWA) is conducting a multiple-phase study to assess the accuracy of, and make recommendations on the use of, FHWA's Traffic Noise Model (TNM). The TNM Validation Study involves highway noise data collection and TNM modeling for the purpose of data com...

  19. Modelling and validation of electromechanical shock absorbers

    NASA Astrophysics Data System (ADS)

    Tonoli, Andrea; Amati, Nicola; Girardello Detoni, Joaquim; Galluzzi, Renato; Gasparin, Enrico

    2013-08-01

    Electromechanical vehicle suspension systems represent a promising substitute to conventional hydraulic solutions. However, the design of electromechanical devices that are able to supply high damping forces without exceeding geometric dimension and mass constraints is a difficult task. All these challenges meet in off-road vehicle suspension systems, where the power density of the dampers is a crucial parameter. In this context, the present paper outlines a particular shock absorber configuration where a suitable electric machine and a transmission mechanism are utilised to meet off-road vehicle requirements. A dynamic model is used to represent the device. Subsequently, experimental tests are performed on an actual prototype to verify the functionality of the damper and validate the proposed model.

  20. Validation of the filament winding process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  1. Classification of case-II waters using hyperspectral (HICO) data over North Indian Ocean

    NASA Astrophysics Data System (ADS)

    Srinivasa Rao, N.; Ramarao, E. P.; Srinivas, K.; Deka, P. C.

    2016-05-01

State-of-the-art ocean color algorithms are proven for retrieving ocean constituents (chlorophyll-a, CDOM, and suspended sediments) in case-I waters. However, these algorithms do not perform well in case-II waters because of their optical complexity. Hyperspectral data are found to be promising for classifying case-II waters. The aim of this study is to propose spectral bands for future ocean color sensors to classify case-II waters. The study was performed with HICO remote sensing reflectances (Rrs) at the estuaries of the Indus and GBM rivers in the North Indian Ocean. Appropriate field samples were not available to validate and propose empirical models for retrieving concentrations, and the HICO sensor is no longer operational, so a validation exercise could not be planned. Aqua MODIS data in case-I and case-II waters were used as a complement to in situ data. Analysis of the spectral reflectance curves suggests the band ratios of Rrs at 484 nm to 581 nm and of Rrs at 490 nm to 426 nm to classify chlorophyll-a and CDOM, respectively. Rrs at 610 nm gives the best scope for suspended sediment retrieval. The work suggests the need for ocean color sensors with central wavelengths of 426, 484, 490, 581, and 610 nm to estimate the concentrations of chlorophyll-a, suspended sediments, and CDOM in case-II waters.
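The suggested band ratios are simple to evaluate once Rrs values at the proposed wavelengths are available. A sketch with made-up reflectances (illustrative only; as the abstract notes, field data were not available to calibrate actual empirical models):

```python
def band_ratios(rrs):
    """Band-ratio indices suggested in the abstract.

    rrs: dict mapping wavelength in nm -> remote sensing reflectance (1/sr).
    Returns the chlorophyll-a and CDOM ratio indices; Rrs(610) is used
    directly as the suspended sediment index.
    """
    return {
        "chl_a": rrs[484] / rrs[581],
        "cdom": rrs[490] / rrs[426],
        "sediment": rrs[610],
    }

# Hypothetical spectrum, not HICO data
sample = {426: 0.004, 484: 0.006, 490: 0.006, 581: 0.003, 610: 0.002}
print(round(band_ratios(sample)["chl_a"], 6))
```

Turning these indices into concentrations would require empirical regression against in situ samples, which is exactly the gap the abstract identifies.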

  2. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high-resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real-world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted, with multi-agency support and participation, in chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. They then present results from one of the Florida fires and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  3. Validation of the 'full reconnection model' of the sawtooth instability in KSTAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nam, Y. B.; Ko, J. S.; Choe, G. H.

    In this paper, the central safety factor (q 0) during sawtooth oscillation has been measured with great accuracy with the motional Stark effect (MSE) system on KSTAR. However, this measurement alone cannot definitively validate the disputed full and partial reconnection models due to a non-trivial offset error (~0.05). A supplemental experiment on the excited m = 2, m = 3 modes, which are extremely sensitive to the background q 0 and core magnetic shear, definitively validates the 'full reconnection model'. The radial position of the excited modes right after the crash, and their time evolution into the 1/1 kink mode before the crash in a sawtoothing plasma, constrain q 0 in the MHD quiescent period both after and before the crash. Finally, an additional measurement of the long-lived m = 3, m = 5 modes in a non-sawtoothing discharge further validates the 'full reconnection model'.

  4. Validation of the 'full reconnection model' of the sawtooth instability in KSTAR

    DOE PAGES

    Nam, Y. B.; Ko, J. S.; Choe, G. H.; ...

    2018-03-26

    In this paper, the central safety factor (q 0) during sawtooth oscillation has been measured with great accuracy with the motional Stark effect (MSE) system on KSTAR. However, this measurement alone cannot definitively validate the disputed full and partial reconnection models due to a non-trivial offset error (~0.05). A supplemental experiment on the excited m = 2, m = 3 modes, which are extremely sensitive to the background q 0 and core magnetic shear, definitively validates the 'full reconnection model'. The radial position of the excited modes right after the crash, and their time evolution into the 1/1 kink mode before the crash in a sawtoothing plasma, constrain q 0 in the MHD quiescent period both after and before the crash. Finally, an additional measurement of the long-lived m = 3, m = 5 modes in a non-sawtoothing discharge further validates the 'full reconnection model'.

  5. Predictive data modeling of human type II diabetes related statistics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.

    2009-04-01

    During the course of routine Type II diabetes treatment of one of the authors, it was decided to derive predictive analytical data models of the daily sampled vital statistics, namely weight, blood pressure, and blood sugar, to determine whether the covariance among the observed variables could yield a descriptive equation-based model or, better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger stickings required to monitor blood sugar levels. The personal history and analysis with resulting models are presented.

  6. Propeller aircraft interior noise model utilization study and validation

    NASA Technical Reports Server (NTRS)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  7. Validation of PV-RPM Code in the System Advisor Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.

  8. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  9. Spatial calibration and temporal validation of flow for regional scale hydrologic modeling

    USDA-ARS?s Scientific Manuscript database

    Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such regional scale model is necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...

  10. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, plus the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107: a valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges, no gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closedness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, along with self-intersections outside of defined corner points and edges.
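
    A planarity check of the kind described can be sketched by computing a polygon normal with Newell's method and testing every vertex against the fitted plane within a tolerance. The function names and the tolerance value below are illustrative, not CityDoctor's actual API:

```python
import math

def newell_normal(pts):
    """Unit normal of a 3D polygon via Newell's method (robust to noise)."""
    nx = ny = nz = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(pts, pts[1:] + pts[:1]):
        nx += (y1 - y2) * (z1 + z2)
        ny += (z1 - z2) * (x1 + x2)
        nz += (x1 - x2) * (y1 + y2)
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / norm, ny / norm, nz / norm)

def is_planar(pts, tol=1e-3):
    """Every vertex must lie within `tol` of the plane through the polygon."""
    n = newell_normal(pts)
    d0 = sum(a * b for a, b in zip(n, pts[0]))
    return all(abs(sum(a * b for a, b in zip(n, p)) - d0) <= tol for p in pts)

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]       # flat quad
bent = [(0, 0, 0), (1, 0, 0), (1, 1, 0.2), (0, 1, 0)]       # folded quad
```

The choice of `tol` is exactly the kind of tolerance discussion the paper highlights: too tight rejects valid survey data, too loose passes folded faces.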

  11. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
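
    The kappa statistic mentioned above corrects raw agreement between two categorical ratings for agreement expected by chance. A minimal sketch of Cohen's kappa, with toy ratings for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two categorical raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    # Expected agreement under independence of the two raters.
    expected = sum(pa[c] * pb[c] for c in pa) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["pos", "pos", "neg", "neg", "pos", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "neg"]
kappa = cohens_kappa(a, b)  # 5/6 observed agreement, 1/2 expected
```

Unlike a raw percent agreement, kappa would report zero here if both raters simply guessed according to their marginal frequencies.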

  12. Validity test and its consistency in the construction of patient loyalty model

    NASA Astrophysics Data System (ADS)

    Yanuar, Ferra

    2016-04-01

    The main objective of this study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data in the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, and each factor was measured by several indicator variables. The respondents involved in this study were patients who had received healthcare at Puskesmas in Padang, West Sumatera. All 394 respondents with complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant measures of their corresponding latent variable. Service quality was measured most strongly by tangibles, patient satisfaction by satisfaction with service, and patient loyalty by good service quality. Meanwhile, in the structural equations, this study found that patient loyalty was affected positively and directly by patient satisfaction, while service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. This study also showed that the validity values obtained here were consistent, based on a simulation study using a bootstrap approach.

  13. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.

  14. Development and Validation of a Disease Severity Scoring Model for Pediatric Sepsis.

    PubMed

    Hu, Li; Zhu, Yimin; Chen, Mengshi; Li, Xun; Lu, Xiulan; Liang, Ying; Tan, Hongzhuan

    2016-07-01

    Multiple severity scoring systems have been devised and evaluated in adult sepsis, but a simplified scoring model for pediatric sepsis has not yet been developed. This study aimed to develop and validate a new scoring model to stratify the severity of pediatric sepsis, thus assisting the treatment of sepsis in children. Data from 634 consecutive patients who presented with sepsis at the Children's Hospital of Hunan Province in China in 2011-2013 were analyzed, with 476 patients placed in the training group and 158 patients in the validation group. Stepwise discriminant analysis was used to develop the accurate discriminant model, and a simplified scoring model was generated using weightings defined by the discriminant coefficients. The discriminant ability of the model was tested with receiver operating characteristic (ROC) curves. The discriminant analysis showed that prothrombin time, D-dimer, total bilirubin, serum total protein, uric acid, PaO2/FiO2 ratio and myoglobin were associated with the severity of sepsis. These seven variables were assigned values of 4, 3, 3, 4, 3, 3 and 3, respectively, based on the standardized discriminant coefficients. Patients with higher scores had a higher risk of severe sepsis. The areas under the ROC curve (AROC) were 0.836 for the accurate discriminant model and 0.825 for the simplified scoring model in the validation group. The proposed disease severity scoring model for pediatric sepsis showed adequate discriminatory capacity and sufficient accuracy, which has important clinical significance in evaluating the severity of pediatric sepsis and predicting its progress.
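
    A simplified scoring model of this kind reduces to summing fixed weights over abnormal findings. A sketch using the seven variable weights reported in the abstract; which findings count as abnormal for the example patient is purely illustrative:

```python
# Weights (4, 3, 3, 4, 3, 3, 3) as reported in the abstract; the variable
# keys and the example patient below are illustrative.
WEIGHTS = {
    "prothrombin_time": 4, "d_dimer": 3, "total_bilirubin": 3,
    "serum_total_protein": 4, "uric_acid": 3,
    "pao2_fio2_ratio": 3, "myoglobin": 3,
}

def sepsis_score(abnormal_flags):
    """Sum the weights of variables flagged abnormal (1) versus normal (0)."""
    return sum(WEIGHTS[k] * v for k, v in abnormal_flags.items())

patient = {"prothrombin_time": 1, "d_dimer": 1, "total_bilirubin": 0,
           "serum_total_protein": 0, "uric_acid": 0,
           "pao2_fio2_ratio": 1, "myoglobin": 0}
# score = 4 + 3 + 3 = 10; higher scores indicate higher risk of severe sepsis
```

The maximum attainable score under these weights is 23; where to draw the severity cut-point is what the ROC analysis in the paper determines.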

  15. Evaluation and cross-validation of Environmental Models

    NASA Astrophysics Data System (ADS)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a commission of professional experts appointed by an established international union or association (e.g. IAGA for Geomagnetism and Aeronomy) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values for the Earth radius have been employed in different data processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more of them to be uncovered by careful, independent examination and benchmarking. The meter prototype, the standard unit of length, was determined on 20 May 1875, during the Diplomatic Conference of the Meter, and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent the wild, uncontrolled dissemination of pseudo environmental models and standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the market place have been built consistently with the same units system, and that they are based on identical definitions for the coordinate systems, etc... Therefore

  16. Crazy like a fox. Validity and ethics of animal models of human psychiatric disease.

    PubMed

    Rollin, Michael D H; Rollin, Bernard E

    2014-04-01

    Animal models of human disease play a central role in modern biomedical science. Developing animal models for human mental illness presents unique practical and philosophical challenges. In this article we argue that (1) existing animal models of psychiatric disease are not valid, (2) attempts to model syndromes are undermined by current nosology, (3) models of symptoms are rife with circular logic and anthropomorphism, (4) any model must make unjustified assumptions about subjective experience, and (5) any model deemed valid would be inherently unethical, for if an animal adequately models human subjective experience, then there is no morally relevant difference between that animal and a human.

  17. Competitive adsorption of Pb(II), Cu(II), and Zn(II) ions onto hydroxyapatite-biochar nanocomposite in aqueous solutions

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Ying; Liu, Yu-Xue; Lu, Hao-Hao; Yang, Rui-Qin; Yang, Sheng-Mao

    2018-05-01

    A hydroxyapatite-biochar nanocomposite (HAP-BC) was successfully fabricated and its physicochemical properties were characterized. The analyses showed that HAP nanoparticles were successfully loaded on the biochar surface. The adsorption of Pb(II), Cu(II), and Zn(II) by HAP-BC was systematically studied in single and ternary metal systems. The results demonstrated that pH affects the adsorption of heavy metals onto HAP-BC. Regarding the adsorption kinetics, the pseudo-second-order model showed the best fit for all three heavy metal ions on HAP-BC. In both single and ternary metal ion systems, the adsorption isotherm of Pb(II) by HAP-BC followed the Langmuir model, while those of Cu(II) and Zn(II) fitted well with the Freundlich model. The maximum adsorption capacity for each tested metal by HAP-BC was higher than that of pristine rice straw biochar (especially for Pb(II)) and those of other reported adsorbents. Therefore, HAP-BC could be explored as a new material for future application in heavy metal removal.
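
    The Freundlich isotherm q = Kf·C^(1/n) used for Cu(II) and Zn(II) is commonly fitted by linear least squares after a log transform. A minimal sketch on synthetic equilibrium data; the constants are illustrative, not the paper's fitted values:

```python
import math

def fit_freundlich(c, q):
    """Fit q = Kf * C**(1/n) by least squares on log-transformed data."""
    x = [math.log(v) for v in c]   # log equilibrium concentration
    y = [math.log(v) for v in q]   # log adsorbed amount
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
            sum((xi - xbar) ** 2 for xi in x)
    kf = math.exp(ybar - slope * xbar)
    return kf, 1.0 / slope         # (Kf, n)

# Synthetic data generated from Kf = 2.0, n = 2, i.e. q = 2 * sqrt(C)
c = [1.0, 4.0, 9.0, 16.0]
q = [2.0, 4.0, 6.0, 8.0]
kf, n_exp = fit_freundlich(c, q)
```

The Langmuir model used for Pb(II) can be linearized analogously (C/q versus C) to recover its maximum capacity and affinity constant.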

  18. A primer for biomedical scientists on how to execute model II linear regression analysis.

    PubMed

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
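
    The OLP (geometric mean, also called reduced major axis) slope discussed in points 3-5 has a simple closed form: sign(Sxy)·sqrt(Syy/Sxx), with the line forced through the means. A sketch without the bootstrap confidence intervals that smatr provides:

```python
import math

def olp_regression(x, y):
    """Ordinary least products (geometric mean) regression coefficients."""
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    # Slope magnitude is the ratio of spreads; sign comes from the covariance.
    slope = math.copysign(math.sqrt(syy / sxx), sxy)
    intercept = ybar - slope * xbar
    return slope, intercept

slope, intercept = olp_regression([1, 2, 3, 4], [2, 4, 6, 8])
```

Unlike ordinary least squares, this fit is symmetric: swapping x and y simply inverts the slope, which is why it suits Model II problems where both variables carry error.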

  19. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
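
    The 'internal-external cross-validation' described in the conclusions can be sketched as a leave-one-study-out rotation; `fit` and `evaluate` stand in for the actual model development and performance assessment steps:

```python
def internal_external_cv(studies, fit, evaluate):
    """Develop on all-but-one study, validate on the omitted study, rotate."""
    results = []
    for i, holdout in enumerate(studies):
        training = [s for j, s in enumerate(studies) if j != i]
        model = fit(training)                  # develop on remaining IPD
        results.append(evaluate(model, holdout))  # test in excluded study
    return results

# Toy example: each "study" is a list of outcome values, the "model" is the
# pooled mean, and performance is the absolute error against the held-out mean.
studies = [[1.0, 1.2], [0.9, 1.1], [2.0, 2.2]]
fit = lambda tr: sum(v for s in tr for v in s) / sum(len(s) for s in tr)
evaluate = lambda m, s: abs(m - sum(s) / len(s))
results = internal_external_cv(studies, fit, evaluate)
```

Note how the third study, with a different baseline, produces the worst held-out error; this is exactly the between-study heterogeneity in intercepts the review warns about.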

  20. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.

  1. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture and the choice of interaction potentials, and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and

  2. Approaches to Validation of Models for Low Gravity Fluid Behavior

    NASA Technical Reports Server (NTRS)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases, and examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlations rather than fundamental relations; detailed measurements of the flow field have not been made; free surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

  3. Analysis of JSI TRIGA MARK II reactor physical parameters calculated with TRIPOLI and MCNP.

    PubMed

    Henry, R; Tiselj, I; Snoj, L

    2015-03-01

    A new computational model of the JSI TRIGA Mark II research reactor was built for the TRIPOLI computer code and compared with the existing MCNP code model. The same modelling assumptions were used in order to check the differences between the mathematical models of the two Monte Carlo codes. Differences between the TRIPOLI and MCNP predictions of keff were up to 100 pcm. Further validation was performed with analyses of the normalized reaction rates and computations of kinetic parameters for various core configurations. Copyright © 2014 Elsevier Ltd. All rights reserved.
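
    Differences between two keff estimates are conventionally quoted in pcm (per cent mille, 10^-5). One common convention expresses the code-to-code difference as a reactivity difference; the keff values below are illustrative, not the paper's results:

```python
def pcm_difference(k1, k2):
    """Reactivity difference between two keff estimates, in pcm.

    Uses the convention delta_rho = (k1 - k2) / (k1 * k2) * 1e5; some
    authors instead quote the simple difference (k1 - k2) * 1e5.
    """
    return (k1 - k2) / (k1 * k2) * 1e5

# Hypothetical TRIPOLI vs MCNP keff estimates differing by ~100 pcm
delta = pcm_difference(1.00100, 1.00000)
```

For near-critical systems (keff close to 1) the two conventions agree to within a fraction of a pcm, which is well below typical Monte Carlo statistical uncertainties.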

  4. Validation workflow for a clinical Bayesian network model in multidisciplinary decision making in head and neck oncology treatment.

    PubMed

    Cypko, Mario A; Stoehr, Matthaeus; Kozniewski, Marcin; Druzdzel, Marek J; Dietz, Andreas; Berliner, Leonard; Lemke, Heinz U

    2017-11-01

    Oncological treatment is becoming increasingly complex, and therefore decision making in multidisciplinary teams is becoming the key activity in clinical pathways. The increased complexity is related to the number and variability of possible treatment decisions that may be relevant to a patient. In this paper, we describe the validation of a multidisciplinary cancer treatment decision in the clinical domain of head and neck oncology. Probabilistic graphical models and corresponding inference algorithms, in the form of Bayesian networks (BNs), can support complex decision-making processes by providing mathematically reproducible and transparent advice. The quality of BN-based advice depends on the quality of the model; therefore, it is vital to validate the model before it is applied in practice. For an example BN subnetwork of laryngeal cancer with 303 variables, we evaluated 66 patient records. To validate the model on this dataset, a validation workflow was applied in combination with quantitative and qualitative analyses. In the subsequent analyses, we observed four sources of imprecise predictions: incorrect data, incomplete patient data, outvoting of relevant observations, and an incorrect model. Finally, the four problems were solved by modifying the data and the model. The presented validation effort is related to the model complexity: for simpler models, the validation workflow is the same, although it may require fewer validation methods. The validation success is related to the model's well-founded knowledge base. The remaining laryngeal cancer model may disclose additional sources of imprecise predictions.

  5. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables: NEW METRIC FOR SEA ICE MODEL VALIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.

    Here, we implement a variance-based distance metric (Dn) to objectively assess the skill of sea ice models when multiple output variables, or uncertainties in both model predictions and observations, need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The Dn metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The Dn statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning by using the Dn metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
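
    The exact Dn statistic is defined in the paper; the flavor of a variance-weighted model-observation distance over paired grid points can be sketched as below, where the specific normalization (sum of model and observation variances) is an assumption for illustration only:

```python
import math

def variance_weighted_distance(model, obs, var_model, var_obs):
    """Illustrative variance-normalized distance over paired grid points.

    Each squared model-observation mismatch is weighted by the combined
    uncertainty of the two estimates, so well-constrained points count more.
    """
    terms = ((m - o) ** 2 / (vm + vo)
             for m, o, vm, vo in zip(model, obs, var_model, var_obs))
    return math.sqrt(sum(terms) / len(model))

# Toy two-point grid: the second point disagrees by one unit.
d = variance_weighted_distance([1.0, 2.0], [1.0, 1.0], [0.5, 0.5], [0.5, 0.5])
```

Because both variance arrays enter the weighting, this kind of metric drops the χ2 assumption that only the observations are uncertain, which is the generalization the abstract emphasizes.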

  7. Development and validation of a mortality risk model for pediatric sepsis.

    PubMed

    Chen, Mengshi; Lu, Xiulan; Hu, Li; Liu, Pingping; Zhao, Wenjiao; Yan, Haipeng; Tang, Liang; Zhu, Yimin; Xiao, Zhenghui; Chen, Lizhang; Tan, Hongzhuan

    2017-05-01

    Pediatric sepsis is a burdensome public health problem. Assessing the mortality risk of pediatric sepsis patients, offering effective treatment guidance, and improving prognosis to reduce mortality rates are crucial. We extracted data derived from electronic medical records of pediatric sepsis patients that were collected during the first 24 hours after admission to the pediatric intensive care unit (PICU) of the Hunan Children's Hospital from January 2012 to June 2014. A total of 788 children were randomly divided into a training (592, 75%) and validation group (196, 25%). The risk factors for mortality among these patients were identified by conducting multivariate logistic regression in the training group. Based on the established logistic regression equation, the logit probabilities for all patients (in both groups) were calculated to verify the model's internal and external validities. According to the training group, 6 variables (brain natriuretic peptide, albumin, total bilirubin, D-dimer, lactate levels, and mechanical ventilation in 24 hours) were included in the final logistic regression model. The areas under the curves of the model were 0.854 (0.826, 0.881) and 0.844 (0.816, 0.873) in the training and validation groups, respectively. The Mortality Risk Model for Pediatric Sepsis we established in this study showed acceptable accuracy to predict the mortality risk in pediatric sepsis patients.
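    Scoring a patient with a fitted logistic regression equation, as described above, amounts to evaluating the logit and applying the sigmoid. The intercept and coefficients below are hypothetical placeholders for illustration only; the paper's fitted values are not reproduced in the abstract.

```python
import math

# Hypothetical coefficients (NOT the paper's fitted values).
INTERCEPT = -3.2
COEFS = {"lactate": 0.45, "albumin": -0.08, "mech_vent_24h": 1.1}

def mortality_risk(patient):
    """Predicted mortality probability from a logistic regression
    equation: p = 1 / (1 + exp(-logit))."""
    logit = INTERCEPT + sum(COEFS[k] * patient[k] for k in COEFS)
    return 1.0 / (1.0 + math.exp(-logit))

p = mortality_risk({"lactate": 4.0, "albumin": 28.0, "mech_vent_24h": 1})
print(p)
```

Computing this probability for every patient in the training and validation groups is what allows the AUC-based internal and external validity checks the abstract reports.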

  9. Copper(II) and zinc(II) dinuclear enzymes model compounds: The nature of the metal ion in the biological function

    NASA Astrophysics Data System (ADS)

    Ferraresso, L. G.; de Arruda, E. G. R.; de Moraes, T. P. L.; Fazzi, R. B.; Da Costa Ferreira, A. M.; Abbehausen, C.

    2017-12-01

    First series transition metals are used abundantly by nature to perform catalytic transformations of several substrates. Furthermore, the cooperative activity of two proximal metal ions is common and represents a highly efficient catalytic system in living organisms. In this work, three dinuclear μ-phenolate bridged metal complexes were prepared with copper(II) and zinc(II), resulting in ZnZn, CuCu, and CuZn complexes with the ligand 2-ethylaminodimethylamino phenol (saldman) as model compounds of superoxide dismutase (CuCu and CuZn) and metallo-β-lactamases (ZnZn). The metals are coordinated in a μ-phenolate bridged symmetric system. Cu(II) presents a more distorted structure, while zinc is very symmetric. For this reason, [CuCu(saldman)] shows higher water solubility and also higher lability of the bridge. The antioxidant and hydrolytic beta-lactamase-like activities of the complexes were evaluated. The lability of the bridge seems to be important for the antioxidant activity, which may explain why [CuCu(saldman)] presents a lower antioxidant capacity than [CuZn(saldman)], whose bridge proved more stable in solution. The hydrolytic activity of the bimetallic complexes was assayed using nitrocefin as substrate and showed [ZnZn(saldman)] to be a better catalyst than the Cu(II) analog. The series demonstrates the importance of the nature of the metal center for the biological function and how the reactivity of the model complex can be modulated by coordination chemistry.

  10. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.

  11. Rational selection of training and test sets for the development of validated QSAR models

    NASA Astrophysics Data System (ADS)

    Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander

    2003-02-01

    Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R2 (q2) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q2! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q2 for the training set and accuracy of prediction (R2) for the test set and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of predictive power of QSAR models.
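    The LOO cross-validated q2 criticized above is defined as 1 - PRESS/SS, where PRESS sums the squared errors of predictions made with each point left out in turn. A minimal sketch for a simple linear model (the toy data are invented; a real QSAR model would use kNN or another learner over molecular descriptors):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (b, a)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

def loo_q2(xs, ys):
    """Leave-one-out cross-validated q2 = 1 - PRESS/SS: refit with each
    point held out, predict it, and accumulate the squared errors."""
    press = 0.0
    for i in range(len(xs)):
        b, a = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a + b * xs[i])) ** 2
    my = sum(ys) / len(ys)
    ss = sum((y - my) ** 2 for y in ys)
    return 1 - press / ss

q2 = loo_q2([1, 2, 3, 4, 5], [1.1, 1.9, 3.2, 3.8, 5.1])
```

The paper's point is that a high q2 like this one says nothing by itself about prediction accuracy on an external test set, which is why rationally selected training/test splits are recommended.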

  12. Predictive Model for Particle Residence Time Distributions in Riser Reactors. Part 1: Model Development and Validation

    DOE PAGES

    Foust, Thomas D.; Ziegler, Jack L.; Pannala, Sreekanth; ...

    2017-02-28

    In this computational study, we model the mixing of biomass pyrolysis vapor with solid catalyst in circulating riser reactors with a focus on the determination of solid catalyst residence time distributions (RTDs). A comprehensive set of 2D and 3D simulations was conducted for a pilot-scale riser using the Eulerian-Eulerian two-fluid modeling framework with and without sub-grid-scale models for the gas-solids interaction. A validation test case was also simulated and compared to experiments, showing agreement in the pressure gradient and RTD mean and spread. For the simulation cases, it was found that for accurate RTD prediction, the Johnson and Jackson partial-slip solids boundary condition was required for all models, and a sub-grid model is useful so that ultra-high-resolution grids, which are very computationally intensive, are not required. Finally, we discovered a 2/3 scaling relation for the RTD mean and spread when comparing resolved 2D simulations to validated unresolved 3D sub-grid-scale model simulations.
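    The RTD mean and spread compared above are the first two moments of the residence time distribution, computed from a tracer curve. A minimal sketch with invented tracer data:

```python
def rtd_moments(times, conc):
    """Mean residence time and spread (variance) of a residence time
    distribution, computed as concentration-weighted moments of the
    tracer exit curve."""
    total = sum(conc)
    mean = sum(t * c for t, c in zip(times, conc)) / total
    var = sum((t - mean) ** 2 * c for t, c in zip(times, conc)) / total
    return mean, var

# Hypothetical tracer exit concentrations sampled at 1 s intervals
mean, var = rtd_moments([1, 2, 3, 4], [1, 3, 3, 1])
print(mean, var)
```

Matching these two moments between simulation and experiment is the validation comparison the abstract describes, and the 2/3 scaling relation applies to exactly these quantities.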

  13. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be
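    The comparison of |S-D| against |Threshold-S| can be sketched as a paired t-statistic on the margin between the two quantities. The shear-stress values and the threshold below are invented, and this reduces the method to its core arithmetic rather than reproducing the study's full ASME V&V 20 uncertainty treatment.

```python
import statistics

def threshold_margin_t(sim, meas, threshold):
    """Paired t-statistic for (|Threshold-S| - |S-D|): a large positive
    value indicates the comparison error is small relative to the
    margin between simulation and the safety threshold."""
    err = [abs(s, ) if False else abs(s - d) for s, d in zip(sim, meas)]
    margin = [abs(threshold - s) for s in sim]
    diff = [m - e for m, e in zip(margin, err)]
    return statistics.mean(diff) / (statistics.stdev(diff) / len(diff) ** 0.5)

# Hypothetical viscous shear stresses (Pa) and hemolysis threshold
t = threshold_margin_t([100, 110, 105, 95, 102],
                       [104, 108, 109, 98, 100], 600)
print(t)
```

Compared against a t-distribution critical value, a large t supports the conclusion that, relative to the threshold, simulation and experiment are statistically similar even when a direct S-versus-D test fails.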

  14. Alzheimer's Disease Diagnosis in Individual Subjects using Structural MR Images: Validation Studies

    PubMed Central

    Vemuri, Prashanthi; Gunter, Jeffrey L.; Senjem, Matthew L.; Whitwell, Jennifer L.; Kantarci, Kejal; Knopman, David S.; Boeve, Bradley F.; Petersen, Ronald C.; Jack, Clifford R.

    2008-01-01

    OBJECTIVE To develop and validate a tool for Alzheimer's disease (AD) diagnosis in individual subjects using support vector machine (SVM) based classification of structural MR (sMR) images. BACKGROUND Libraries of sMR scans of clinically well characterized subjects can be harnessed for the purpose of diagnosing new incoming subjects. METHODS 190 patients with probable AD were age- and gender-matched with 190 cognitively normal (CN) subjects. Three different classification models were implemented: Model I uses tissue densities obtained from sMR scans to give a STructural Abnormality iNDex (STAND) score; Models II and III use tissue densities as well as covariates (demographics and Apolipoprotein E genotype) to give an adjusted-STAND (aSTAND) score. Data from 140 AD and 140 CN subjects were used for training. The SVM parameter optimization and training were done by four-fold cross-validation. The remaining independent sample of 50 AD and 50 CN subjects was used to obtain a minimally biased estimate of the generalization error of the algorithm. RESULTS The CV accuracies of the Model II and Model III aSTAND scores were 88.5% and 89.3%, respectively, and the developed models generalized well on the independent test datasets. Anatomic patterns best differentiating the groups were consistent with the known distribution of neurofibrillary AD pathology. CONCLUSIONS This paper presents preliminary evidence that application of SVM-based classification of an individual sMR scan relative to a library of scans can provide useful information in individual subjects for diagnosis of AD. Including demographic and genetic information in the classification algorithm slightly improves diagnostic accuracy. PMID:18054253
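    For a linear SVM, scoring a new scan reduces to a signed linear decision function over the tissue-density features, optionally shifted by covariate terms (the Model II/III idea). All weights and coefficients below are hypothetical placeholders, not the trained STAND model.

```python
# Hypothetical weights; a trained STAND model would learn these from
# the library of labeled sMR scans.
WEIGHTS = [0.8, -0.5, 1.2]  # one weight per tissue-density feature
BIAS = -0.3

def stand_score(densities):
    """Signed linear decision score: positive leans AD, negative leans
    cognitively normal (CN)."""
    return sum(w * x for w, x in zip(WEIGHTS, densities)) + BIAS

def a_stand_score(densities, age, apoe4_count):
    """Covariate-adjusted score: fold demographics and APOE genotype
    into the decision function (coefficients invented)."""
    return stand_score(densities) + 0.02 * (age - 75) + 0.4 * apoe4_count
```

The four-fold cross-validation in the abstract tunes the SVM hyperparameters on the 140+140 training subjects, while the held-out 50+50 sample estimates how well such a scoring function generalizes.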

  15. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Alexander; Hawes, Frederick; Fox, Marsha

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. The Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation

  16. Antisocial Personality Disorder Subscale (Chinese Version) of the Structured Clinical Interview for the DSM-IV Axis II disorders: validation study in Cantonese-speaking Hong Kong Chinese.

    PubMed

    Tang, D Y Y; Liu, A C Y; Leung, M H T; Siu, B W M

    2013-06-01

    OBJECTIVE. Antisocial personality disorder (ASPD) is a risk factor for violence and is associated with poor treatment response when it is a co-morbid condition with substance abuse. It is an under-recognised clinical entity in the local Hong Kong setting, for which there are only a few available Chinese-language diagnostic instruments. None has been tested for its psychometric properties in the Cantonese-speaking population in Hong Kong. This study therefore aimed to assess the reliability and validity of the Chinese version of the ASPD subscale of the Structured Clinical Interview for the DSM-IV Axis II Disorders (SCID-II) in Hong Kong Chinese. METHODS. This assessment tool was modified according to dialectal differences between Mainland China and Hong Kong. Inpatients in Castle Peak Hospital, Hong Kong, who were designated for priority follow-up based on their assessed propensity for violence and who fulfilled the inclusion criteria for the study were recruited. To assess the level of agreement, the best-estimate diagnosis made by a multidisciplinary team was compared with the diagnostic status determined by the SCID-II ASPD subscale. The internal consistency, sensitivity, and specificity of the subscale were also calculated. RESULTS. The internal consistency of the subscale was acceptable at 0.79, whereas the test-retest reliability and inter-rater reliability showed excellent and good agreement of 0.90 and 0.86, respectively. Best-estimate clinical diagnosis-SCID diagnosis agreement was acceptable at 0.76. The sensitivity, specificity, positive and negative predictive values were 0.91, 0.86, 0.83, and 0.93, respectively. CONCLUSION. The Chinese version of the SCID-II ASPD subscale is reliable and valid for diagnosing ASPD in a Cantonese-speaking clinical population.
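    The sensitivity, specificity, and predictive values reported above all derive from a 2x2 agreement table between the subscale diagnosis and the best-estimate diagnosis. A minimal sketch (the counts below are invented, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # of true cases, fraction detected
        "specificity": tn / (tn + fp),  # of non-cases, fraction cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration
m = diagnostic_metrics(tp=9, fp=2, fn=1, tn=8)
print(m)
```

Note that sensitivity and specificity are properties of the instrument, while PPV and NPV also depend on the prevalence of ASPD in the sampled population.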

  17. Storm Water Management Model Reference Manual Volume II ...

    EPA Pesticide Factsheets

    SWMM is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. The runoff component of SWMM operates on a collection of subcatchment areas that receive precipitation and generate runoff and pollutant loads. The routing portion of SWMM transports this runoff through a system of pipes, channels, storage/treatment devices, pumps, and regulators. SWMM tracks the quantity and quality of runoff generated within each subcatchment, and the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period comprising multiple time steps. The reference manual for this edition of SWMM comprises three volumes. Volume I describes SWMM’s hydrologic models, Volume II its hydraulic models, and Volume III its water quality and low impact development models. This document provides the underlying mathematics for the hydraulic calculations of the Storm Water Management Model (SWMM).

  18. An Examination of the Validity of the Family Affluence Scale II (FAS II) in a General Adolescent Population of Canada

    ERIC Educational Resources Information Center

    Boudreau, Brock; Poulin, Christiane

    2009-01-01

    This study examined the performance of the FAS II in a general population of 17,545 students in grades 7, 9, 10 and 12 in the Atlantic provinces of Canada. The FAS II was assessed against two other measures of socioeconomic status: mother's highest level of education and family structure. Our study found that the FAS II reduces the likelihood of…

  19. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurements uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

  20. Explicit validation of a surface shortwave radiation balance model over snow-covered complex terrain

    NASA Astrophysics Data System (ADS)

    Helbig, N.; Löwe, H.; Mayer, B.; Lehning, M.

    2010-09-01

    A model that computes the surface radiation balance for all sky conditions in complex terrain is presented. The spatial distribution of direct and diffuse sky radiation is determined from observations of incident global radiation, air temperature, and relative humidity at a single measurement location. Incident radiation under cloudless sky is spatially derived from a parameterization of the atmospheric transmittance. Direct and diffuse sky radiation for all sky conditions are obtained by decomposing the measured global radiation value. Spatial incident radiation values under all atmospheric conditions are computed by adjusting the spatial radiation values obtained from the parametric model with the radiation components obtained from the decomposition model at the measurement site. Topographic influences such as shading are accounted for. The radiosity approach is used to compute anisotropic terrain reflected radiation. Validations of the shortwave radiation balance model are presented in detail for a day with cloudless sky. For a day with overcast sky a first validation is presented. Validation of a section of the horizon line as well as of individual radiation components is performed with high-quality measurements. A new measurement setup was designed to determine terrain reflected radiation. There is good agreement between the measurements and the modeled terrain reflected radiation values as well as with incident radiation values. A comparison of the model with a fully three-dimensional radiative transfer Monte Carlo model is presented. That validation reveals a good agreement between modeled radiation values.
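    The decomposition step described above, splitting measured global radiation into direct and diffuse components, is commonly driven by the clearness index. The piecewise coefficients below are a simplified stand-in for published decomposition correlations, not the model's actual parameterization.

```python
def diffuse_fraction(kt):
    """Illustrative clearness-index decomposition: fraction of measured
    global radiation that is diffuse, as a piecewise function of the
    clearness index kt = G / G_extraterrestrial. Coefficients are a
    simplified stand-in for published correlations."""
    if kt < 0.3:      # overcast: almost all diffuse
        return 1.0 - 0.2 * kt
    if kt < 0.78:     # intermediate: diffuse share falls with clearness
        return 1.45 - 1.67 * kt
    return 0.15       # clear sky: mostly direct beam

# Mixed-sky example: about 62% of global radiation is diffuse
print(diffuse_fraction(0.5))
```

Given the diffuse fraction, the direct component follows by subtraction, and both can then be distributed over the terrain with shading and the radiosity treatment of terrain-reflected radiation.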

  1. A Public-Private Partnership Develops and Externally Validates a 30-Day Hospital Readmission Risk Prediction Model

    PubMed Central

    Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat

    2013-01-01

    Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use vigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. 
The ACC plans to continue its partnership to
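    The C-statistic used above to judge discrimination ability is the probability that a randomly chosen readmitted patient receives a higher risk score than a randomly chosen non-readmitted patient. A minimal all-pairs sketch (the scores are invented):

```python
def c_statistic(scores_pos, scores_neg):
    """Concordance statistic (equivalent to ROC AUC): fraction of
    positive/negative pairs ranked correctly, counting ties as half."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores: readmitted vs. not readmitted
auc = c_statistic([0.9, 0.7, 0.6], [0.5, 0.4, 0.7])
print(auc)
```

By this measure, the 0.66 average reported for prior readmission models is only modestly better than the 0.5 of random ranking, which is the performance gap the ACC models aim to close.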

  2. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles and their extensibility to a full scale integrated subassembly model is given. The independent verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.

  3. MT3DMS: Model use, calibration, and validation

    USGS Publications Warehouse

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.

  4. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. 
In our comparison, small differences in either QALYs or costs led to changes in
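
    The comparison above hinges on incremental cost-effectiveness ratios (ICERs): the extra cost per extra QALY of one strategy over another. A minimal sketch with hypothetical numbers (not taken from the study):

```python
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0:
        raise ValueError("new strategy yields no QALY gain (dominated or equal)")
    return d_cost / d_qaly

# Hypothetical strategies: test-guided chemotherapy vs. chemotherapy for all
print(round(icer(cost_new=32000.0, cost_ref=30000.0,
                 qaly_new=14.2, qaly_ref=14.0), 2))  # 10000.0 per QALY
```

    Strategies whose ICER exceeds the willingness-to-pay threshold fall off the efficiency frontier, which is why small cost or QALY differences between models can change the set of nondominated strategies.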

  5. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.


  6. First results of GERDA Phase II and consistency with background models

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Allardt, M.; Bakalyarov, A. M.; Balata, M.; Barabanov, I.; Baudis, L.; Bauer, C.; Bellotti, E.; Belogurov, S.; Belyaev, S. T.; Benato, G.; Bettini, A.; Bezrukov, L.; Bode1, T.; Borowicz, D.; Brudanin, V.; Brugnera, R.; Caldwell, A.; Cattadori, C.; Chernogorov, A.; D'Andrea, V.; Demidova, E. V.; Di Marco, N.; Domula, A.; Doroshkevich, E.; Egorov, V.; Falkenstein, R.; Frodyma, N.; Gangapshev, A.; Garfagnini, A.; Gooch, C.; Grabmayr, P.; Gurentsov, V.; Gusev, K.; Hakenmüller, J.; Hegai, A.; Heisel, M.; Hemmer, S.; Hofmann, W.; Hult, M.; Inzhechik, L. V.; Janicskó Csáthy, J.; Jochum, J.; Junker, M.; Kazalov, V.; Kihm, T.; Kirpichnikov, I. V.; Kirsch, A.; Kish, A.; Klimenko, A.; Kneißl, R.; Knöpfle, K. T.; Kochetov, O.; Kornoukhov, V. N.; Kuzminov, V. V.; Laubenstein, M.; Lazzaro, A.; Lebedev, V. I.; Lehnert, B.; Liao, H. Y.; Lindner, M.; Lippi, I.; Lubashevskiy, A.; Lubsandorzhiev, B.; Lutter, G.; Macolino, C.; Majorovits, B.; Maneschg, W.; Medinaceli, E.; Miloradovic, M.; Mingazheva, R.; Misiaszek, M.; Moseev, P.; Nemchenok, I.; Palioselitis, D.; Panas, K.; Pandola, L.; Pelczar, K.; Pullia, A.; Riboldi, S.; Rumyantseva, N.; Sada, C.; Salamida, F.; Salathe, M.; Schmitt, C.; Schneider, B.; Schönert, S.; Schreiner, J.; Schulz, O.; Schütz, A.-K.; Schwingenheuer, B.; Selivanenko, O.; Shevzik, E.; Shirchenko, M.; Simgen, H.; Smolnikov, A.; Stanco, L.; Vanhoefer, L.; Vasenko, A. A.; Veresnikova, A.; von Sturm, K.; Wagner, V.; Wegmann, A.; Wester, T.; Wiesinger, C.; Wojcik, M.; Yanovich, E.; Zhitnikov, I.; Zhukov, S. V.; Zinatulina, D.; Zuber, K.; Zuzel, G.

    2017-01-01

    The GERDA (GERmanium Detector Array) experiment searches for neutrinoless double beta decay (0νββ) in 76Ge and is located at the Laboratori Nazionali del Gran Sasso of INFN (Italy). GERDA operates bare high-purity germanium detectors submersed in liquid argon (LAr). Phase II of data taking started in December 2015 and is currently ongoing. In Phase II, 35 kg of germanium detectors enriched in 76Ge, including thirty newly produced Broad Energy Germanium (BEGe) detectors, are operating to reach an exposure of 100 kg·yr within about 3 years of data taking. The design goal of Phase II is to reduce the background by one order of magnitude and thereby reach a sensitivity of T_{1/2}^{0ν} = O(10^{26}) yr. To achieve the necessary background reduction, the setup was complemented with a LAr veto. Analysis of the Phase II background spectrum demonstrates consistency with the background models. Furthermore, the 226Ra and 232Th contamination levels are consistent with screening results. In the first Phase II data release we found no hint of a 0νββ decay signal and place a limit on this process of T_{1/2}^{0ν} > 5.3·10^{25} yr (90% C.L., sensitivity 4.0·10^{25} yr). First results of GERDA Phase II will be presented.

  7. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
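
    The POD model characterizes the probability of detection as a continuous function of concentration. A minimal illustration with hypothetical collaborative-study counts and an assumed logistic curve shape (the coefficients and data here are illustrative, not from the AOAC model):

```python
import math

def pod_estimates(results):
    """Point estimate of the probability of detection at each spiking level."""
    return {conc: detected / trials
            for conc, (detected, trials) in results.items()}

def pod_logistic(conc, b0, b1):
    """A smooth POD-vs-concentration response curve (logistic form, illustrative)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log10(conc))))

# Hypothetical counts: concentration -> (detections, trials)
data = {0.1: (4, 60), 1.0: (33, 60), 10.0: (59, 60)}
print(pod_estimates(data))
```

    Plotting such level-by-level estimates against a fitted curve is what allows the graphical comparison of candidate and reference methods the abstract describes.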

  8. Design, development, and application of LANDIS-II, a spatial landscape simulation model with flexible temporal and spatial resolution

    Treesearch

    Robert M. Scheller; James B. Domingo; Brian R. Sturtevant; Jeremy S. Williams; Arnold Rudy; Eric J. Gustafson; David J. Mladenoff

    2007-01-01

    We introduce LANDIS-II, a landscape model designed to simulate forest succession and disturbances. LANDIS-II builds upon and preserves the functionality of previous LANDIS forest landscape simulation models. LANDIS-II is distinguished by the inclusion of variable time steps for different ecological processes; our use of a rigorous development and testing process used...

  9. Multi-dimensional modelling of gas turbine combustion using a flame sheet model in KIVA II

    NASA Technical Reports Server (NTRS)

    Cheng, W. K.; Lai, M.-C.; Chue, T.-H.

    1991-01-01

    A flame sheet model for heat release is incorporated into a multi-dimensional fluid mechanical simulation for gas turbine application. The model assumes that the chemical reaction takes place in thin sheets compared to the length scale of mixing, which is valid for the primary combustion zone in a gas turbine combustor. In this paper, the details of the model are described and computational results are discussed.

  10. DEVELOPMENT OF GUIDELINES FOR CALIBRATING, VALIDATING, AND EVALUATING HYDROLOGIC AND WATER QUALITY MODELS: ASABE ENGINEERING PRACTICE 621

    USDA-ARS?s Scientific Manuscript database

    Information to support application of hydrologic and water quality (H/WQ) models abounds, yet modelers commonly use arbitrary, ad hoc methods to conduct, document, and report model calibration, validation, and evaluation. Consistent methods are needed to improve model calibration, validation, and e...

  11. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2013-11-20

    Granger causality F-test validation 3.1.2. Dynamic time warping for uneven temporal relationships Many causal relationships are imperfectly...mapping for dynamic feedback models Granger causality and DTW can identify causal relationships and consider complex temporal factors. However, many ...variant of the tf-idf algorithm (Manning, Raghavan, Schutze et al., 2008), typically used in search engines, to “score” features. The (-log tf) in
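
    The excerpt mentions dynamic time warping (DTW) for handling uneven temporal relationships. A minimal sketch of the classic DTW distance (illustrative, not the toolkit's implementation):

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two numeric sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # d[i][j] = minimal accumulated cost aligning a[:i] with b[:j]
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

print(dtw_distance([0, 1, 2], [0, 0, 1, 2]))  # 0.0 -- same shape, shifted timing
```

    Because DTW aligns similar features even when they are offset in time, it can expose causal relationships that a fixed-lag test such as Granger causality would miss.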

  12. Development and Validation of New Discriminative Dissolution Method for Carvedilol Tablets

    PubMed Central

    Raju, V.; Murthy, K. V. R.

    2011-01-01

    The objective of the present study was to develop and validate a discriminative dissolution method for the evaluation of carvedilol tablets. Different conditions, such as the type and volume of dissolution medium and the paddle rotation speed, were evaluated. The best in vitro dissolution profile was obtained using Apparatus II (paddle) at 50 rpm with 900 ml of pH 6.8 phosphate buffer as the dissolution medium. Drug release was evaluated by a high-performance liquid chromatographic method. The dissolution method was validated according to current ICH and FDA guidelines; parameters such as specificity, accuracy, precision, and stability were evaluated, and the results obtained were within the acceptable ranges. The dissolution profiles of three different products were compared using ANOVA-based, model-dependent, and model-independent methods, and the results showed a significant difference between the products. The dissolution test developed and validated was adequate, given its high discriminative capacity in differentiating the release characteristics of the products tested, and could be applied for the development and quality control of carvedilol tablets. PMID:22923865

  13. L-shaped piezoelectric motor--part II: analytical modeling.

    PubMed

    Avirovik, Dragan; Karami, M Amin; Inman, Daniel; Priya, Shashank

    2012-01-01

    This paper develops an analytical model for an L-shaped piezoelectric motor. The motor structure has been described in detail in Part I of this study. The coupling of the bending vibration modes of the bimorphs results in an elliptical motion at the tip. The emphasis of this paper is on the development of a precise analytical model that can predict the dynamic behavior of the motor based on its geometry. The motor was first modeled mechanically to identify the natural frequencies and mode shapes of the structure. Next, an electromechanical model of the motor was developed to take into account the piezoelectric effect, and the dynamics of the L-shaped piezoelectric motor were obtained as a function of voltage and frequency. Finally, the analytical model was validated by comparing it with experimental results and the finite element method (FEM). © 2012 IEEE

  14. Convergent, discriminant, and criterion validity of DSM-5 traits.

    PubMed

    Yalch, Matthew M; Hopwood, Christopher J

    2016-10-01

    Section III of the Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5; American Psychiatric Association, 2013) contains a system for diagnosing personality disorder based in part on assessing 25 maladaptive traits. Initial research suggests that this aspect of the system improves the validity and clinical utility of the Section II model. The Computer Adaptive Test of Personality Disorder (CAT-PD; Simms et al., 2011) contains many traits similar to those of the DSM-5, as well as several additional traits seemingly not covered in the DSM-5. In this study we evaluate the convergent and discriminant validity between the DSM-5 traits, as assessed by the Personality Inventory for DSM-5 (PID-5; Krueger et al., 2012), and the CAT-PD in an undergraduate sample, and test whether traits included in the CAT-PD but not the DSM-5 provide incremental validity in association with clinically relevant criterion variables. Results supported the convergent and discriminant validity of the PID-5 and CAT-PD scales in their assessment of 23 of 25 DSM-5 traits. DSM-5 traits were consistently associated with 11 criterion variables, despite our having intentionally selected clinically relevant criterion constructs not directly assessed by DSM-5 traits. However, the additional CAT-PD traits provided incremental information above and beyond the DSM-5 traits for all criterion variables examined. These findings support the validity of pathological trait models in general and the DSM-5 and CAT-PD models in particular, while also suggesting that the CAT-PD may include additional traits for consideration in future iterations of the DSM-5 system. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Calibration and validation of a general infiltration model

    NASA Astrophysics Data System (ADS)

    Mishra, Surendra Kumar; Ranjan Kumar, Shashi; Singh, Vijay P.

    1999-08-01

    A general infiltration model proposed by Singh and Yu (1990) was calibrated and validated using a split sampling approach for 191 sets of infiltration data observed in the states of Minnesota and Georgia in the USA. Of the five model parameters, fc (the final infiltration rate), So (the available storage space) and exponent n were found to be more predictable than the other two parameters: m (exponent) and a (proportionality factor). A critical examination of the general model revealed that it is related to the Soil Conservation Service (1956) curve number (SCS-CN) method and its parameter So is equivalent to the potential maximum retention of the SCS-CN method and is, in turn, found to be a function of soil sorptivity and hydraulic conductivity. The general model was found to describe infiltration rate with time varying curve number.
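
    The abstract equates the parameter So with the potential maximum retention S of the SCS-CN method. The standard SCS-CN relations (in US customary units) can be sketched as follows; the storm values are hypothetical:

```python
def retention_from_cn(cn):
    """Potential maximum retention S (inches) from a curve number CN."""
    return 1000.0 / cn - 10.0

def scs_runoff(p, cn, ia_ratio=0.2):
    """SCS-CN direct runoff depth Q (inches) for storm rainfall P (inches).

    Uses the conventional initial abstraction Ia = 0.2 * S.
    """
    s = retention_from_cn(cn)
    ia = ia_ratio * s
    if p <= ia:
        return 0.0  # all rainfall absorbed before runoff begins
    return (p - ia) ** 2 / (p - ia + s)

# Hypothetical storm: 4 inches of rain on a soil with CN = 80 (S = 2.5 in)
print(round(scs_runoff(p=4.0, cn=80), 3))  # 2.042
```

    A time-varying curve number, as the general model implies, would make `cn` (and hence S) a function of the infiltration history rather than a constant.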

  16. Acute Brain Dysfunction: Development and Validation of a Daily Prediction Model.

    PubMed

    Marra, Annachiara; Pandharipande, Pratik P; Shotwell, Matthew S; Chandrasekhar, Rameela; Girard, Timothy D; Shintani, Ayumi K; Peelen, Linda M; Moons, Karl G M; Dittus, Robert S; Ely, E Wesley; Vasilevskis, Eduard E

    2018-03-24

    The goal of this study was to develop and validate a dynamic risk model to predict daily changes in acute brain dysfunction (ie, delirium and coma), discharge, and mortality in ICU patients. Using data from a multicenter prospective ICU cohort, a daily acute brain dysfunction-prediction model (ABD-pm) was developed by using multinomial logistic regression that estimated 15 transition probabilities (from one of three brain function states [normal, delirious, or comatose] to one of five possible outcomes [normal, delirious, comatose, ICU discharge, or died]) using baseline and daily risk factors. Model discrimination was assessed by using predictive characteristics such as negative predictive value (NPV). Calibration was assessed by plotting empirical vs model-estimated probabilities. Internal validation was performed by using a bootstrap procedure. Data were analyzed from 810 patients (6,711 daily transitions). The ABD-pm included individual risk factors: mental status, age, preexisting cognitive impairment, baseline and daily severity of illness, and daily administration of sedatives. The model yielded very high NPVs for "next day" delirium (NPV: 0.823), coma (NPV: 0.892), normal cognitive state (NPV: 0.875), ICU discharge (NPV: 0.905), and mortality (NPV: 0.981). The model demonstrated outstanding calibration when predicting the total number of patients expected to be in any given state across predicted risk. We developed and internally validated a dynamic risk model that predicts the daily risk for one of three cognitive states, ICU discharge, or mortality. The ABD-pm may be useful for predicting the proportion of patients for each outcome state across entire ICU populations to guide quality, safety, and care delivery activities. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
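
    The model's discrimination is reported through negative predictive values. A minimal sketch of how an NPV is computed from prediction counts (the counts below are hypothetical, not the study's):

```python
def negative_predictive_value(tn, fn):
    """NPV: of all 'no event' predictions, the fraction that were correct.

    tn = true negatives (predicted no event, none occurred)
    fn = false negatives (predicted no event, event occurred)
    """
    return tn / (tn + fn)

# Hypothetical day-level counts for 'next-day delirium' predictions
print(round(negative_predictive_value(tn=4200, fn=900), 3))  # 0.824
```

    A high NPV means that when the model predicts a patient will not transition to delirium or coma the next day, that prediction is rarely wrong, which is the property emphasized in the abstract.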

  17. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    PubMed

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.
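
    The trade-off the authors describe, where a nonlinear model's better fit may not justify its extra complexity, is exactly what information criteria quantify. A minimal sketch using the least-squares form of the AIC with hypothetical fit results:

```python
import math

def aic(n, sse, k):
    """Akaike information criterion for a least-squares fit:
    n observations, sum of squared errors sse, k fitted parameters.
    Lower is better; the 2*k term penalizes model complexity."""
    return n * math.log(sse / n) + 2 * k

# Hypothetical fits: a nonlinear model must cut the error enough
# to pay for its additional parameters.
linear = aic(n=200, sse=50.0, k=3)
nonlinear = aic(n=200, sse=48.0, k=20)
print(linear < nonlinear)  # True: the small error reduction does not justify 20 params
```

    In this hypothetical case the linear model wins despite its larger error, mirroring the paper's point that apparent nonlinearity alone does not validate a nonlinear model.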

  18. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  19. Design and validation of a model to predict early mortality in haemodialysis patients.

    PubMed

    Mauri, Joan M; Clèries, Montse; Vela, Emili

    2008-05-01

    Mortality and morbidity rates are higher in patients receiving haemodialysis therapy than in the general population. Detection of risk factors related to early death in these patients could be of aid for clinical and administrative decision making. Objectives. The aims of this study were (1) to identify risk factors (comorbidity and variables specific to haemodialysis) associated with death in the first year following the start of haemodialysis and (2) to design and validate a prognostic model to quantify the probability of death for each patient. An analysis was carried out on all patients starting haemodialysis treatment in Catalonia during the period 1997-2003 (n = 5738). The data source was the Renal Registry of Catalonia, a mandatory population registry. Patients were randomly divided into two samples: 60% (n = 3455) of the total were used to develop the prognostic model and the remaining 40% (n = 2283) to validate the model. Logistic regression analysis was used to construct the model. One-year mortality in the total study population was 16.5%. The predictive model included the following variables: age, sex, primary renal disease, grade of functional autonomy, chronic obstructive pulmonary disease, malignant processes, chronic liver disease, cardiovascular disease, initial vascular access and malnutrition. The analyses showed adequate calibration for both the sample to develop the model and the validation sample (Hosmer-Lemeshow statistic 0.97 and P = 0.49, respectively) as well as adequate discrimination (ROC curve 0.78 in both cases). Risk factors implicated in mortality at one year following the start of haemodialysis have been determined and a prognostic model designed. The validated, easy-to-apply model quantifies individual patient risk attributable to various factors, some of them amenable to correction by directed interventions.

  20. Theoretical Modeling of the Magnetic Behavior of Thiacalix[4]arene Tetranuclear Mn(II)2Gd(III)2 and Co(II)2Eu(III)2 Complexes.

    PubMed

    Aldoshin, Sergey M; Sanina, Nataliya A; Palii, Andrew V; Tsukerblat, Boris S

    2016-04-04

    In view of a wide perspective of 3d-4f complexes in single-molecule magnetism, here we propose an explanation of the magnetic behavior of the two thiacalix[4]arene tetranuclear heterometallic complexes Mn(II)2Gd(III)2 and Co(II)2Eu(III)2. The energy pattern of the Mn(II)2Gd(III)2 complex evaluated in the framework of the isotropic exchange model exhibits a rotational band of the low-lying spin excitations within which the Landé intervals are affected by the biquadratic spin-spin interactions. The nonmonotonic temperature dependence of the χT product observed for the Mn(II)2Gd(III)2 complex is attributed to the competitive influence of the ferromagnetic Mn-Gd and antiferromagnetic Mn-Mn exchange interactions, the latter being stronger (J(Mn, Mn) = -1.6 cm(-1), Js(Mn, Gd) = 0.8 cm(-1), g = 1.97). The model for the Co(II)2Eu(III)2 complex includes uniaxial anisotropy of the seven-coordinate Co(II) ions and an isotropic exchange interaction in the Co(II)2 pair, while the Eu(III) ions are diamagnetic in their ground states. Best-fit analysis of χT versus T showed that the anisotropic contribution (arising from a large zero-field splitting in Co(II) ions) dominates (weak-exchange limit) in the Co(II)2Eu(III)2 complex (D = 20.5 cm(-1), J = -0.4 cm(-1), gCo = 2.22). This complex is concluded to exhibit an easy plane of magnetization (arising from the Co(II) pair). It is shown that the low-lying part of the spectrum can be described by a highly anisotropic effective spin-(1)/2 Hamiltonian that is deduced for the Co(II)2 pair in the weak-exchange limit.

  1. A Parameter Study for Modeling Mg ii h and k Emission during Solar Flares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio da Costa, Fatima; Kleint, Lucia, E-mail: frubio@stanford.edu

    2017-06-20

    Solar flares show highly unusual spectra in which the thermodynamic conditions of the solar atmosphere are encoded. Current models are unable to fully reproduce the spectroscopic flare observations, especially the single-peaked spectral profiles of the Mg ii h and k lines. We aim to understand the formation of the chromospheric and optically thick Mg ii h and k lines in flares through radiative transfer calculations. We take a flare atmosphere obtained from a simulation with the radiative hydrodynamic code RADYN as input for a radiative transfer modeling with the RH code. By iteratively changing this model atmosphere and varying thermodynamic parameters such as temperature, electron density, and velocity, we study their effects on the emergent intensity spectra. We reproduce the typical single-peaked Mg ii h and k flare spectral shape and approximate the intensity ratios to the subordinate Mg ii lines by increasing either densities, temperatures, or velocities at the line core formation height range. Additionally, by combining unresolved upflows and downflows up to ∼250 km s⁻¹ within one resolution element, we reproduce the widely broadened line wings. While we cannot unambiguously determine which mechanism dominates in flares, future modeling efforts should investigate unresolved components, additional heat dissipation, larger velocities, and higher densities and combine the analysis of multiple spectral lines.


  2. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model.

    PubMed

    Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term.

  3. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers.

    PubMed

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    2010-09-01

    To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate predictors). First, principal component analysis was used to reduce the number of candidate predictors. Then, multivariable logistic regression analysis was used to develop the model. Internal validation and extent of optimism was assessed with bootstrapping. External validation was studied in 390 independent Dutch bakery workers (validation set, prevalence of sensitization 20%). The prediction model contained the predictors nasoconjunctival symptoms, asthma symptoms, shortness of breath and wheeze, work-related upper and lower respiratory symptoms, and traditional bakery. The model showed good discrimination with an area under the receiver operating characteristic (ROC) curve area of 0.76 (and 0.75 after internal validation). Application of the model in the validation set gave a reasonable discrimination (ROC area=0.69) and good calibration after a small adjustment of the model intercept. A simple model with questionnaire items only can be used to stratify bakers according to their risk of sensitization to wheat allergens. Its use may increase the cost-effectiveness of (subsequent) medical surveillance.
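
    The discrimination figures above are ROC areas. The ROC area equals the probability that a randomly chosen positive case receives a higher model score than a randomly chosen negative one, and can be computed directly via the Mann-Whitney statistic (the scores below are hypothetical):

```python
def roc_auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney statistic: the probability that a
    random positive case scores higher than a random negative one."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count as half
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores from a questionnaire-based model:
# sensitized workers vs. non-sensitized workers
print(round(roc_auc([0.9, 0.7, 0.6], [0.8, 0.3, 0.2]), 3))  # 0.778
```

    An area of 0.5 means no discrimination and 1.0 means perfect separation, so values around 0.7-0.8, as reported for this model, indicate reasonable to good discrimination.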

  4. Model improvements and validation of TerraSAR-X precise orbit determination

    NASA Astrophysics Data System (ADS)

    Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.

    2017-05-01

    The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as estimated empirical accelerations and compares different dynamic models in a RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particular high correlation of solar radiation pressure modeling and estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced by 30% in tangential direction when working with the refined dynamical models. Likewise the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from

  5. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  6. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan

    Regional reliability organizations require power plants to validate the dynamic models that represent them to ensure that power systems studies are performed with the best representation of the components installed. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. Validating the dynamic model of a WPP is required to be performed periodically. This is because the control parameters of the WTGs and the other supporting components within a WPP may be modified to comply with new grid codes or upgrades to the WTG controller with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the type of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validation of WTGs and WPPs, the available recorded data that must be screened before it is used for the dynamic validations, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without understanding the correct process, the validations may lead to the wrong representations of the WTG and WPP modeled.

  7. Development and validation of a two-dimensional fast-response flood estimation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory-controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
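    A first-order upwind (Rusanov-type) finite difference scheme for the shallow water equations, of the kind the abstract describes in two dimensions, can be sketched in one dimension as follows. This is an illustrative sketch, not the paper's model; the grid size, time step, and idealized dam-break initial condition are assumptions.

```python
import numpy as np

# 1D shallow water equations, first-order upwind (Rusanov) fluxes on a
# uniform grid. Conserved variables: h (depth, m), hu (discharge, m^2/s).
def upwind_step(h, hu, dx, dt, g=9.81):
    u = np.where(h > 1e-8, hu / np.maximum(h, 1e-8), 0.0)
    # Physical fluxes for the mass and momentum equations.
    f_h = hu
    f_hu = hu * u + 0.5 * g * h**2
    # Local wave speed bounds the upwind dissipation at each interface.
    c = np.abs(u) + np.sqrt(g * h)
    a = np.maximum(c[:-1], c[1:])
    F_h = 0.5 * (f_h[:-1] + f_h[1:]) - 0.5 * a * (h[1:] - h[:-1])
    F_hu = 0.5 * (f_hu[:-1] + f_hu[1:]) - 0.5 * a * (hu[1:] - hu[:-1])
    # Conservative update of interior cells (boundary cells held fixed).
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1] -= dt / dx * (F_h[1:] - F_h[:-1])
    hu_new[1:-1] -= dt / dx * (F_hu[1:] - F_hu[:-1])
    return h_new, hu_new

# Idealized dam break: deep water on the left, shallow on the right.
n, dx, dt = 200, 1.0, 0.02   # dt chosen to satisfy the CFL condition
h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
hu = np.zeros(n)
for _ in range(100):
    h, hu = upwind_step(h, hu, dx, dt)
```

    Because the update is written in conservation form, total water volume is preserved (up to boundary effects), which is the property that makes such schemes suitable for inundation-extent estimation.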

  8. Review and evaluation of performance measures for survival prediction models in external validation settings.

    PubMed

    Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z

    2017-04-18

    When developing a prediction model for survival data, it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston's D is routinely reported to assess discrimination, since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings, and we recommend reporting it routinely. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics
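    Harrell's concordance measure discussed above can be computed directly from its definition: among usable pairs, the fraction where the subject with the shorter observed event time also has the higher predicted risk. This is an illustrative implementation of the standard formula, not code from the paper; the small synthetic dataset is an assumption.

```python
import numpy as np

def harrell_c(time, event, risk):
    """Harrell's C. A pair (i, j) is usable only if the earlier observed
    time belongs to a subject with an observed event (not censored)."""
    n_conc, n_tied, n_usable = 0, 0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] and time[i] < time[j]:
                n_usable += 1
                if risk[i] > risk[j]:
                    n_conc += 1          # higher risk failed earlier
                elif risk[i] == risk[j]:
                    n_tied += 1          # tied scores count half
    return (n_conc + 0.5 * n_tied) / n_usable

time = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
event = np.array([1, 1, 0, 1, 0])        # 0 = censored observation
risk = np.array([0.9, 0.5, 0.7, 0.6, 0.1])
c = harrell_c(time, event, risk)
```

    Pairs whose earlier time is censored are simply dropped, which is exactly why the measure drifts as censoring increases, motivating the paper's recommendation of Uno's inverse-probability-weighted alternative.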

  9. The Safety Culture Enactment Questionnaire (SCEQ): Theoretical model and empirical validation.

    PubMed

    de Castro, Borja López; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2017-06-01

    This paper presents the Safety Culture Enactment Questionnaire (SCEQ), designed to assess the degree to which safety is an enacted value in the day-to-day running of nuclear power plants (NPPs). The SCEQ is based on a theoretical safety culture model that is manifested in three fundamental components of the functioning and operation of any organization: strategic decisions, human resources practices, and daily activities and behaviors. The extent to which the importance of safety is enacted in each of these three components provides information about the pervasiveness of the safety culture in the NPP. To validate the SCEQ and the model on which it is based, two separate studies were carried out with data collection in 2008 and 2014, respectively. In Study 1, the SCEQ was administered to the employees of two Spanish NPPs (N=533) belonging to the same company. Participants in Study 2 included 598 employees from the same NPPs, who completed the SCEQ and other questionnaires measuring different safety outcomes (safety climate, safety satisfaction, job satisfaction and risky behaviors). Study 1 comprised item formulation and examination of the factorial structure and reliability of the SCEQ. Study 2 tested internal consistency and provided evidence of factorial validity, validity based on relationships with other variables, and discriminant validity between the SCEQ and safety climate. Exploratory Factor Analysis (EFA) carried out in Study 1 revealed a three-factor solution corresponding to the three components of the theoretical model. Reliability analyses showed strong internal consistency for the three scales of the SCEQ, and each of the 21 items on the questionnaire contributed to the homogeneity of its theoretically developed scale. Confirmatory Factor Analysis (CFA) carried out in Study 2 supported the internal structure of the SCEQ; internal consistency of the scales was also supported. Furthermore, the three scales of the SCEQ showed the expected correlation

  10. Integrated corridor management (ICM) analysis, modeling, and simulation (AMS) for Minneapolis site : model calibration and validation report.

    DOT National Transportation Integrated Search

    2010-02-01

    This technical report documents the calibration and validation of the baseline (2008) mesoscopic model for the I-394 Minneapolis, Minnesota, Pioneer Site. DynusT was selected as the mesoscopic model for analyzing operating conditions in the I-394 cor...

  11. Calibration and validation of toxicokinetic-toxicodynamic models for three neonicotinoids and some aquatic macroinvertebrates.

    PubMed

    Focks, Andreas; Belgers, Dick; Boerwinkel, Marie-Claire; Buijse, Laura; Roessink, Ivo; Van den Brink, Paul J

    2018-05-01

    Exposure patterns in ecotoxicological experiments often do not match the exposure profiles for which a risk assessment needs to be performed. This limitation can be overcome by using toxicokinetic-toxicodynamic (TKTD) models for the prediction of effects under time-variable exposure. For the use of TKTD models in the environmental risk assessment of chemicals, it is required to calibrate and validate the model for specific compound-species combinations. In this study, the survival of macroinvertebrates after exposure to neonicotinoid insecticides was modelled using TKTD models from the General Unified Threshold models of Survival (GUTS) framework. The models were calibrated on existing survival data from acute or chronic tests under a static exposure regime. Validation experiments were performed for two sets of species-compound combinations: one set focussed on the sensitivity of multiple species to a single compound, imidacloprid, and the other set on the effects of multiple compounds on a single species, i.e., the three neonicotinoid compounds imidacloprid, thiacloprid and thiamethoxam on the survival of the mayfly Cloeon dipterum. The calibrated models were used to predict survival over time, including uncertainty ranges, for the different time-variable exposure profiles used in the validation experiments. From the comparison between observed and predicted survival, it appeared that the accuracy of the model predictions was acceptable for four of the five tested species in the multiple-species data set. For compounds such as neonicotinoids, which are known to have the potential to show increased toxicity under prolonged exposure, the calibration and validation of TKTD models for survival should ideally be performed using calibration data from both acute and chronic tests.
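    The core of a reduced GUTS model with stochastic death (GUTS-RED-SD) can be sketched in a few lines: scaled internal damage follows first-order kinetics toward the external concentration, and the hazard rate grows linearly above a damage threshold. This is an illustrative sketch under common GUTS notation (kd, z, b, hb); the parameter values and pulsed exposure profile are assumptions, not calibrated values from the study.

```python
import numpy as np

# Minimal GUTS-RED-SD sketch: damage kinetics plus threshold hazard,
# integrated with a simple Euler step over a time-variable exposure.
def guts_red_sd(conc_t, dt, kd=0.5, z=1.0, b=0.3, hb=0.01):
    D, H = 0.0, 0.0                  # scaled damage, cumulative hazard
    survival = [1.0]
    for Cw in conc_t:
        D += dt * kd * (Cw - D)                    # damage kinetics
        H += dt * (b * max(D - z, 0.0) + hb)       # dose + background hazard
        survival.append(np.exp(-H))                # S(t) = exp(-H(t))
    return np.array(survival)

# Pulsed exposure profile: two 2-day pulses at 3.0 concentration units.
dt, days = 0.1, 10.0
t = np.arange(0.0, days, dt)
conc = np.where(((t >= 1) & (t < 3)) | ((t >= 6) & (t < 8)), 3.0, 0.0)
surv = guts_red_sd(conc, dt)
```

    Predicted survival declines fastest while damage exceeds the threshold during and shortly after each pulse, which is how such models translate static-test calibrations into predictions for time-variable exposure.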

  12. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model. Thesis, John Haiducek, 1st Lt, USAF (BS, Physics), AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. Abstract: The High Energy Laser End-to-End

  13. Gene-environment interactions and construct validity in preclinical models of psychiatric disorders.

    PubMed

    Burrows, Emma L; McOmish, Caitlin E; Hannan, Anthony J

    2011-08-01

    The contributions of genetic risk factors to susceptibility for brain disorders are often so closely intertwined with environmental factors that studying genes in isolation cannot provide the full picture of pathogenesis. With recent advances in our understanding of psychiatric genetics and environmental modifiers we are now in a position to develop more accurate animal models of psychiatric disorders which exemplify the complex interaction of genes and environment. Here, we consider some of the insights that have emerged from studying the relationship between defined genetic alterations and environmental factors in rodent models. A key issue in such animal models is the optimization of construct validity, at both genetic and environmental levels. Standard housing of laboratory mice and rats generally includes ad libitum food access and limited opportunity for physical exercise, leading to metabolic dysfunction under control conditions, and thus reducing validity of animal models with respect to clinical populations. A related issue, of specific relevance to neuroscientists, is that most standard-housed rodents have limited opportunity for sensory and cognitive stimulation, which in turn provides reduced incentive for complex motor activity. Decades of research using environmental enrichment has demonstrated beneficial effects on brain and behavior in both wild-type and genetically modified rodent models, relative to standard-housed littermate controls. One interpretation of such studies is that environmentally enriched animals more closely approximate average human levels of cognitive and sensorimotor stimulation, whereas the standard housing currently used in most laboratories models a more sedentary state of reduced mental and physical activity and abnormal stress levels. The use of such standard housing as a single environmental variable may limit the capacity for preclinical models to translate into successful clinical trials. Therefore, there is a need to

  14. Proposed Modifications to the Conceptual Model of Coaching Efficacy and Additional Validity Evidence for the Coaching Efficacy Scale II-High School Teams

    ERIC Educational Resources Information Center

    Myers, Nicholas; Feltz, Deborah; Chase, Melissa

    2011-01-01

    The purpose of this study was to determine whether theoretically relevant sources of coaching efficacy could predict the measures derived from the Coaching Efficacy Scale II-High School Teams (CES II-HST). Data were collected from head coaches of high school teams in the United States (N = 799). The analytic framework was a multiple-group…

  15. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and
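    The threshold-based comparison described above can be sketched as a one-sided, one-sample t-test on the per-location difference between the safety margin |Threshold-S| and the comparison error |S-D|. This is an illustrative sketch, not the paper's procedure: the shear-stress values are hypothetical, and the critical value 2.132 (one-sided 5% with 4 degrees of freedom) is hard-coded rather than computed from a t distribution.

```python
import math
import statistics as st

# For each measurement location, form E = |S - D| (comparison error) and
# M = |threshold - S| (margin to the safety threshold), then test whether
# M - E is positive on average.
def threshold_validation(S, D, threshold, t_crit):
    diffs = [abs(threshold - s) - abs(s - d) for s, d in zip(S, D)]
    mean = st.mean(diffs)
    sem = st.stdev(diffs) / math.sqrt(len(diffs))
    t_stat = mean / sem
    return t_stat, t_stat > t_crit     # True: margin significantly exceeds error

# Hypothetical viscous shear-stress data (Pa) at five locations,
# against an assumed hemolysis threshold of 600 Pa.
S = [180.0, 220.0, 250.0, 300.0, 340.0]   # simulation
D = [170.0, 235.0, 240.0, 310.0, 330.0]   # experiment
t_stat, ok = threshold_validation(S, D, threshold=600.0, t_crit=2.132)
```

    The design choice matches the abstract's point: a model whose error E is large in absolute terms can still be adequate for the context of use if E is small relative to the distance from the safety threshold.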

  16. Validation of a dynamic linked segment model to calculate joint moments in lifting.

    PubMed

    de Looze, M P; Kingma, I; Bussmann, J B; Toussaint, H M

    1992-08-01

    A two-dimensional dynamic linked segment model was constructed and applied to a lifting activity. Reactive forces and moments were calculated by an instantaneous approach involving the application of Newtonian mechanics to individual adjacent rigid segments in succession. The analysis started once at the feet and once at a hands/load segment. The model was validated by comparing predicted external forces and moments at the feet or at a hands/load segment to actual values, which were simultaneously measured (ground reaction force at the feet) or assumed to be zero (external moments at feet and hands/load, and external forces, besides gravitation, at hands/load). In addition, results of both procedures, in terms of joint moments, including the moment at the intervertebral disc between the fifth lumbar and first sacral vertebra (L5-S1), were compared. A correlation of r = 0.88 between calculated and measured vertical ground reaction forces was found. The calculated external forces and moments at the hands showed only minor deviations from the expected zero level. The moments at L5-S1, calculated starting from the feet compared to starting from the hands/load, yielded a coefficient of correlation of r = 0.99. However, moments calculated from the hands/load were 3.6% (averaged values) and 10.9% (peak values) higher. This difference is assumed to be due mainly to erroneous estimations of the positions of centres of gravity and joint rotation centres. The estimation of the location of the L5-S1 rotation axis can affect the results significantly. Despite the numerous studies estimating the load on the low back during lifting on the basis of linked segment models, only a few attempts to validate these models have been made. This study is concerned with the validity of the presented linked segment model. The results support the model's validity. Effects of several sources of error threatening the validity are discussed. Copyright © 1992. Published by Elsevier Ltd.
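    One step of the segment-by-segment Newtonian recursion described above can be sketched in 2D: given the load a distal neighbour exerts on a segment, the Newton-Euler balance yields the reactive force and moment at the proximal joint, which is then passed to the next segment. This is an illustrative sketch, not the paper's model; the segment mass, inertia, and geometry in the check below are assumptions.

```python
import numpy as np

G = np.array([0.0, -9.81])   # gravity (m/s^2), planar model

def cross2d(r, f):
    # 2D cross product r x F = rx*Fy - ry*Fx (a scalar moment).
    return r[0] * f[1] - r[1] * f[0]

def proximal_load(m, I, com, p_joint, d_joint, acc, alpha, f_dist, m_dist):
    """Force and moment the proximal neighbour exerts on this segment.
    m, I: mass and moment of inertia about the COM; com/p_joint/d_joint:
    positions (m); acc, alpha: COM linear and angular acceleration;
    f_dist, m_dist: load applied at the distal joint."""
    f_prox = m * acc - m * G - f_dist              # Newton: force balance
    m_prox = (I * alpha - m_dist                   # Euler: moments about COM
              - cross2d(d_joint - com, f_dist)
              - cross2d(p_joint - com, f_prox))
    return f_prox, m_prox

# Static sanity check: a 2 kg horizontal segment at rest with no distal
# load -> the proximal joint carries the full weight plus its moment.
f, mo = proximal_load(
    m=2.0, I=0.05,
    com=np.array([0.0, 0.0]),
    p_joint=np.array([-0.2, 0.0]),
    d_joint=np.array([0.2, 0.0]),
    acc=np.array([0.0, 0.0]), alpha=0.0,
    f_dist=np.array([0.0, 0.0]), m_dist=0.0,
)
```

    Starting this recursion once from the feet (using measured ground reaction forces) and once from the hands/load segment, and comparing the two L5-S1 moment estimates, is precisely the validation strategy the abstract describes.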

  17. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Steve

    2011-01-01

    The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.

  18. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS III: BOUNDARY AND INITIAL CONDITIONS, MODEL GRID RESOLUTION, AND HG(II) REDUCTION MECHANISMS

    EPA Science Inventory

    In this study we investigate the CMAQ model response in terms of simulated mercury concentration and deposition to boundary/initial conditions (BC/IC), model grid resolution (12- versus 36-km), and two alternative Hg(II) reduction mechanisms. The model response to the change of g...

  19. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study.

    PubMed

    VijayGanapathy, Sundaramoorthy; Karthikeyan, VIlvapathy Senguttuvan; Sreenivas, Jayaram; Mallya, Ashwin; Keshavamurthy, Ramaiah

    2017-11-01

    Urosepsis implies clinically evident severe infection of the urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. A prospective observational study was done in 178 patients admitted with urosepsis to the Department of Urology in a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed with urosepsis using SIRS criteria, with positive urine or blood culture for bacteria, were included. At 24 hours after admission to the intensive care unit, the APACHE II score was calculated using 12 physiological variables, age and chronic health. The mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those who expired (p<0.001). Among patients undergoing surgery, the mean±SD score of those who expired (30.74±4.85) was higher than that of survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed an area under the curve (AUC) of 0.825, with a cutoff of 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. The mean±SD score in those undergoing surgery was 25.22±6.70 and was lower than in those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed an AUC of 0.760, with a cutoff of 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.
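    The cutoff sensitivity/specificity and ROC AUC reported above are computed from simple counts over score-outcome pairs. The sketch below illustrates both computations on a small synthetic dataset; the scores and outcomes are assumptions, not the study's data.

```python
# Sensitivity/specificity of a score cutoff, and the ROC AUC as the
# probability that a randomly chosen death outranks a randomly chosen
# survivor (rank-sum formulation, ties counted as half).
def sens_spec(scores, died, cutoff):
    tp = sum(1 for s, d in zip(scores, died) if d and s > cutoff)
    fn = sum(1 for s, d in zip(scores, died) if d and s <= cutoff)
    tn = sum(1 for s, d in zip(scores, died) if not d and s <= cutoff)
    fp = sum(1 for s, d in zip(scores, died) if not d and s > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def auc(scores, died):
    pos = [s for s, d in zip(scores, died) if d]
    neg = [s for s, d in zip(scores, died) if not d]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical APACHE II scores and mortality outcomes (1 = expired).
scores = [18, 21, 24, 26, 27, 29, 31, 33, 35, 38]
died   = [0,  0,  0,  0,  1,  0,  1,  1,  1,  1]
se, sp = sens_spec(scores, died, cutoff=25.5)
a = auc(scores, died)
```

    Scanning the cutoff over all observed score values traces out the full ROC curve; a single reported cutoff such as 25.5 is one operating point on that curve.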

  20. IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haihua; Zou, Ling; Zhang, Hongbin

    2016-01-01

    This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton Krylov (JFNK) method, based on the similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated against the LOFT L3-1 test and benchmarked with RELAP5 results. RELAP-7 and RELAP5 had almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models, owing to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate the advanced accumulator design.