These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Lattice Discrete Particle Model (LDPM) for failure behavior of concrete. II: Calibration and validation  

Microsoft Academic Search

The Lattice Discrete Particle Model (LDPM) formulated in the preceding Part I of this study is calibrated and validated in the present Part II. Calibration and validation are performed by comparing the results of numerical simulations with experimental data gathered from the literature. Simulated experiments include uniaxial and multiaxial compression, tensile fracture, shear strength, and cyclic compression tests.

Gianluca Cusatis; Andrea Mencarelli; Daniele Pelessone; James Baylot

2011-01-01

2

Fatigue crack growth under variable-amplitude loading: Part II - Code development and model validation

E-print Network

A state-space model of fatigue crack growth has been developed in Part I of this study. This Part II presents the information needed for code development and validates the state-space model with fatigue test data for different types of variable-amplitude loading.

Ray, Asok

3

Large-scale Validation of AMIP II Land-surface Simulations: Preliminary Results for Ten Models  

SciTech Connect

This report summarizes initial findings of a large-scale validation of the land-surface simulations of ten atmospheric general circulation models that are entries in phase II of the Atmospheric Model Intercomparison Project (AMIP II). This validation is conducted by AMIP Diagnostic Subproject 12 on Land-surface Processes and Parameterizations, which is focusing on putative relationships between the continental climate simulations and the associated models' land-surface schemes. The selected models typify the diversity of representations of land-surface climate that are currently implemented by the global modeling community. The current dearth of global-scale terrestrial observations makes exacting validation of AMIP II continental simulations impractical. Thus, selected land-surface processes of the models are compared with several alternative validation data sets, which include merged in-situ/satellite products, climate reanalyses, and off-line simulations of land-surface schemes that are driven by observed forcings. The aggregated spatio-temporal differences between each simulated process and a chosen reference data set are then quantified by means of root-mean-square error statistics; the differences among alternative validation data sets are similarly quantified as an estimate of the current observational uncertainty in the selected land-surface process. Examples of these metrics are displayed for land-surface air temperature, precipitation, and the latent and sensible heat fluxes. It is found that the simulations of surface air temperature, when aggregated over all land and seasons, agree most closely with the chosen reference data, while the simulations of precipitation agree least. In the latter case, there is also considerable inter-model scatter in the error statistics, with the reanalysis estimates of precipitation resembling the AMIP II simulations more closely than they do the chosen reference data. In aggregate, the simulations of land-surface latent and sensible heat fluxes appear to occupy intermediate positions between these extremes, but the existing large observational uncertainties in these processes make this a provisional assessment. In all selected processes as well, the error statistics are found to be sensitive to season and latitude sector, confirming the need for finer-scale analyses, which are also in progress.
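As a rough illustration of the aggregation step described in this record, the sketch below computes an area-weighted root-mean-square error between a simulated field and a reference field. The variable names, grid, and cosine-latitude weighting are illustrative assumptions, not the subproject's actual code.

    import numpy as np

    def rms_error(simulated, reference, lat_deg):
        """Area-weighted RMS difference between two (time, lat, lon) fields.

        A minimal sketch of the kind of spatio-temporal error statistic
        described above; the cosine-latitude area weighting is an assumption.
        """
        weights = np.cos(np.deg2rad(lat_deg))[None, :, None]   # approximate grid-cell area weights
        sq_diff = (simulated - reference) ** 2
        return np.sqrt(np.average(sq_diff, weights=np.broadcast_to(weights, sq_diff.shape)))

    # Example with random placeholder data standing in for monthly surface air temperature
    # from a model and from a reference product (12 months x 73 latitudes x 144 longitudes).
    lat = np.linspace(-90.0, 90.0, 73)
    model_tas = 288.0 + np.random.randn(12, 73, 144)
    ref_tas = 288.0 + np.random.randn(12, 73, 144)
    print(rms_error(model_tas, ref_tas, lat))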

Phillips, T J; Henderson-Sellers, A; Irannejad, P; McGuffie, K; Zhang, H

2005-12-01

4

Person Heterogeneity of the BDI-II-C and Its Effects on Dimensionality and Construct Validity: Using Mixture Item Response Models  

ERIC Educational Resources Information Center

The purpose of this study was to apply the mixed Rasch model to investigate person heterogeneity of the Beck Depression Inventory-II-Chinese version (BDI-II-C) and its effects on dimensionality and construct validity. Person heterogeneity was reflected by two latent classes that differ qualitatively. Additionally, person heterogeneity adversely affected the…

Wu, Pei-Chen; Huang, Tsai-Wei

2010-01-01

5

Black liquor combustion validated recovery boiler modeling: Final year report. Volume 2 (Appendices I, section 5 and II, section 1)  

SciTech Connect

This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 2 contains the last section of Appendix I, Radiative heat transfer in kraft recovery boilers, and the first section of Appendix II, The effect of temperature and residence time on the distribution of carbon, sulfur, and nitrogen between gaseous and condensed phase products from low temperature pyrolysis of kraft black liquor.

Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

1998-08-01

6

The model SIRANE for atmospheric urban pollutant dispersion; PART II, validation of the model on a real case study  

NASA Astrophysics Data System (ADS)

We analyse the performance of the model SIRANE by comparing its outputs to field data measured within an urban district. SIRANE is the first urban dispersion model based on the concept of a street network, and contains specific parametric laws to explicitly simulate the main transfer mechanisms within the urban canopy. The model validation is performed by means of field data collected during a 15-day measurement campaign in an urban district in Lyon, France. The campaign provided information on traffic fluxes and vehicle emissions, meteorological conditions, background pollution levels and pollutant concentrations at different locations within the district. This data set, together with complementary modelling tools needed to estimate the spatial distribution of traffic fluxes, allowed us to estimate the input data required by the model. The data set also provides the information essential to evaluate the accuracy of the model outputs. Comparison between model predictions and field measurements was performed in two ways: by evaluating the reliability of the model in simulating the spatial distribution of the pollutants and their time variability. The study includes a sensitivity analysis to identify the key input parameters influencing the performance of the model, namely the emission rates and the wind velocity. The analysis focuses only on the influence of varying input parameters in the modelling chain on the model predictions, and complements the analyses provided by wind tunnel studies focussing on the parameterisation implemented in the model. The study also elucidates the critical role of background concentrations, which represent a significant contribution to local pollution levels. The overall model performance, measured using the Chang and Hanna (2004) criteria, can be considered 'good' except for NO and some of the BTX species. The results suggest that improving the performance for NO requires testing new photochemical models, whereas the improvement for BTX could be achieved by correcting their vehicular emission factors.
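The Chang and Hanna (2004) criteria mentioned above are usually expressed through paired observed/predicted statistics such as the fractional bias (FB), the normalized mean square error (NMSE) and the fraction of predictions within a factor of two of observations (FAC2). The sketch below computes these metrics; the sample concentrations are placeholders, and the acceptance thresholds applied to them should be taken from the original reference.

    import numpy as np

    def hanna_metrics(obs, pred):
        """Common air-quality evaluation metrics (in the spirit of Chang and Hanna, 2004).

        FB   : fractional bias of the means
        NMSE : normalized mean square error
        FAC2 : fraction of predictions within a factor of two of observations
        """
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        fb = 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
        nmse = ((obs - pred) ** 2).mean() / (obs.mean() * pred.mean())
        fac2 = np.mean((pred >= 0.5 * obs) & (pred <= 2.0 * obs))
        return fb, nmse, fac2

    # Hypothetical paired concentrations (ug/m3) at one receptor, for illustration only.
    observed = np.array([42.0, 55.0, 61.0, 38.0, 47.0])
    predicted = np.array([39.0, 60.0, 50.0, 45.0, 52.0])
    print(hanna_metrics(observed, predicted))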

Soulhac, L.; Salizzoni, P.; Mejean, P.; Didier, D.; Rios, I.

2012-03-01

7

Assessing Wildlife Habitat Value of New England Salt Marshes: II. Model Testing and Validation  

EPA Science Inventory

We test a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. Assessment scores ranged f...

8

Error characterization of the Gaia astrometric solution. II. Validating the covariance expansion model  

NASA Astrophysics Data System (ADS)

Context. To use the data in the future Gaia catalogue it is important to have accurate estimates of the statistical uncertainties and correlations of the errors in the astrometric data given in the catalogue. Aims: In a previous paper we derived a mathematical model for computing the covariances of the astrometric data based on series expansions and a simplified attitude description. The aim of the present paper is to determine to what extent this model provides an accurate representation of the expected random errors in the astrometric solution for Gaia. Methods: We simulate the astrometric core solution by making least-squares solutions of the astrometric parameters for one million stars and the attitude parameters for a five-year mission, using nearly one billion simulated elementary observations for a total of 26 million unknowns. Two cases are considered: one in which all stars have the same magnitude, and another with 30% brighter and 70% fainter stars. The resulting astrometric errors are statistically compared with the model predictions. Results: In all cases considered, and within the statistical uncertainties of the numerical experiments (typically below 0.4%), the theoretically calculated variances and covariances are consistent with the simulations. To achieve this it is however necessary to expand the covariances to at least third or fourth order, and to apply a (theoretically motivated and derived) "fudge factor" in the kinematographic model. Conclusions: The model provides a feasible method to estimate the covariance of arbitrary astrometric data, accurate enough for most applications, and as such it should be available as part of the user's interface to the Gaia catalogue. A main assumption in the current model is that the observational errors are uncorrelated (e.g., photon noise), and further studies are needed on how correlated modelling errors, in particular in the attitude, can be taken into account.

Holl, B.; Lindegren, L.; Hobbs, D.

2012-07-01

9

TAMDAR Sensor Validation in 2003 AIRS II  

NASA Technical Reports Server (NTRS)

This study entails an assessment of TAMDAR in situ temperature, relative humidity and winds sensor data from seven flights of the UND Citation II. These data are undergoing rigorous assessment to determine their viability to significantly augment domestic Meteorological Data Communications Reporting System (MDCRS) and the international Aircraft Meteorological Data Reporting (AMDAR) system observational databases to improve the performance of regional and global numerical weather prediction models. NASA Langley Research Center participated in the Second Alliance Icing Research Study from November 17 to December 17, 2003. TAMDAR data taken during this period is compared with validation data from the UND Citation. The data indicate acceptable performance of the TAMDAR sensor when compared to measurements from the UND Citation research instruments.

Daniels, Taumi S.; Murray, John J.; Anderson, Mark V.; Mulally, Daniel J.; Jensen, Kristopher R.; Grainger, Cedric A.; Delene, David J.

2005-01-01

10

INACTIVATION OF CRYPTOSPORIDIUM OOCYSTS IN A PILOT-SCALE OZONE BUBBLE-DIFFUSER CONTACTOR - II: MODEL VALIDATION AND APPLICATION  

EPA Science Inventory

The ADR model developed in Part I of this study was successfully validated with experimental data obtained for the inactivation of C. parvum and C. muris oocysts with a pilot-scale ozone-bubble diffuser contactor operated with treated Ohio River water. Kinetic parameters, required...

11

Black liquor combustion validated recovery boiler modeling: Final year report. Volume 3 (Appendices II, sections 2--3 and III)  

SciTech Connect

This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 3 contains the following appendix sections: Formation and destruction of nitrogen oxides in recovery boilers; Sintering and densification of recovery boiler deposits: laboratory data and a rate model; and Experimental data on rates of particulate formation during char bed burning.

Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

1998-08-01

12

Macrotransport-solidification kinetics modeling of equiaxed dendritic growth: Part II. Computation problems and validation on INCONEL 718 superalloy castings  

NASA Astrophysics Data System (ADS)

In Part I of the article, a new analytical model that describes solidification of equiaxed dendrites was presented. In this part of the article, the model is used to simulate the solidification of INCONEL 718 superalloy castings. The model was incorporated into a commercial finite-element code, PROCAST. A special procedure called microlatent heat method (MLHM) was used for coupling between macroscopic heat flow and microscopic growth kinetics. A criterion for time-stepping selection in microscopic modeling has been derived in conjunction with MLHM. Reductions in computational (CPU) time up to 90 pct over the classic latent heat method were found by adopting this coupling. Validation of the model was performed against experimental data for an INCONEL 718 superalloy casting. In the present calculations, the model for globulitic dendrite was used. The evolution of fraction of solid calculated with the present model was compared with Scheil’s model and experiments. An important feature in solidification of INCONEL 718 is the detrimental Laves phase. Laves phase content is directly related to the intensity of microsegregation of niobium, which is very sensitive to the evolution of the fraction of solid. It was found that there is a critical cooling rate at which the amount of Laves phase is maximum. The critical cooling rate is not a function of material parameters (diffusivity, partition coefficient, etc.). It depends only on the grain size and solidification time. The predictions generated with the present model are shown to agree very well with experiments.
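The Scheil comparison mentioned above refers to the classical non-equilibrium solidification relation. A minimal sketch is given below, assuming a linear liquidus, no diffusion in the solid and complete mixing in the liquid; the numerical parameters are illustrative placeholders, not measured INCONEL 718 data.

    import numpy as np

    def scheil_fraction_solid(T, T_m, T_liq, k):
        """Scheil equation: fraction solid as a function of temperature.

        Assumes no back-diffusion in the solid and complete mixing in the liquid;
        T_m is the melting point of the pure solvent, T_liq the alloy liquidus,
        and k the equilibrium partition coefficient.
        """
        fs = 1.0 - ((T_m - T) / (T_m - T_liq)) ** (1.0 / (k - 1.0))
        return np.clip(fs, 0.0, 1.0)

    # Illustrative values only (not actual INCONEL 718 thermophysical data).
    T = np.linspace(1336.0, 1260.0, 5)                       # degrees C
    print(scheil_fraction_solid(T, T_m=1453.0, T_liq=1336.0, k=0.5))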

Nastac, L.; Stefanescu, D. M.

1996-12-01

13

A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part II - Validation and localization analysis  

NASA Astrophysics Data System (ADS)

We study the mechanical failure of cemented granular materials (e.g., sandstones) using a constitutive model based on breakage mechanics for grain crushing and damage mechanics for cement fracture. The theoretical aspects of this model are presented in Part I: Tengattini et al. (2014), A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables, Part I - Theory (Journal of the Mechanics and Physics of Solids, 10.1016/j.jmps.2014.05.021). In this Part II we investigate the constitutive and structural responses of cemented granular materials through analyses of Boundary Value Problems (BVPs). The multiple failure mechanisms captured by the proposed model enable the behavior of cemented granular rocks to be well reproduced for a wide range of confining pressures. Furthermore, through comparison of the model predictions and experimental data, the micromechanical basis of the model provides improved understanding of failure mechanisms of cemented granular materials. In particular, we show that grain crushing is the predominant inelastic deformation mechanism under high pressures while cement failure is the relevant mechanism at low pressures. Over an intermediate pressure regime a mixed mode of failure mechanisms is observed. Furthermore, the micromechanical roots of the model allow the effects on localized deformation modes of various initial microstructures to be studied. The results obtained from both the constitutive responses and BVP solutions indicate that the proposed approach and model provide a promising basis for future theoretical studies on cemented granular materials.

Das, Arghya; Tengattini, Alessandro; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

2014-10-01

14

Validation of SAGE II NO2 measurements  

NASA Technical Reports Server (NTRS)

The validity of NO2 measurements from the stratospheric aerosol and gas experiment (SAGE) II is examined by comparing the data with climatological distributions of NO2 and by examining the consistency of the observations themselves. The precision at high altitudes is found to be 5 percent, which is also the case at specific low altitudes for certain latitudes where the mixing ratio is 4 ppbv, and the precision is 0.2 ppbv at low altitudes. The autocorrelation distance of the smoothed profile measurement noise is 3-5 km and 10 km for 1-km and 5-km smoothing, respectively. The SAGE II measurements agree with spectroscopic measurements to within 10 percent, and the SAGE measurements are about 20 percent smaller than average limb monitor measurements at the mixing ratio peak. SAGE I and SAGE II measurements are slightly different, but the difference is not attributed to changes in atmospheric NO2.

Cunnold, D. M.; Zawodny, J. M.; Chu, W. P.; Mccormick, M. P.; Pommereau, J. P.; Goutail, F.

1991-01-01

15

Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part II: Benchmark comparisons of PUMA core parameters with MCNP5 and improvements due to a simple cell heterogeneity correction  

SciTech Connect

In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which has been progressing slowly during the last ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. For the reactor physics area, a revision and update of reactor physics calculation methods and models was recently carried out covering cell, supercell (control rod) and core calculations. This paper presents benchmark comparisons of core parameters of a slightly idealized model of the Atucha-I core obtained with the PUMA reactor code with MCNP5. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, more symmetric than Atucha-II, and has some experimental data available. To validate the new models, benchmark comparisons of k-effective, channel power and axial power distributions obtained with PUMA and MCNP5 have been performed. In addition, a simple cell heterogeneity correction recently introduced in PUMA is presented, which significantly improves the agreement of calculated channel powers with MCNP5. To complete the validation, the calculation of some of the critical configurations of the Atucha-I reactor measured during the experiments performed at first criticality is also presented. (authors)

Grant, C. [Comision Nacional de Energia Atomica, Av del Libertador 8250, Buenos Aires 1429 (Argentina); Mollerach, R. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina); Leszczynski, F.; Serra, O.; Marconi, J. [Comision Nacional de Energia Atomica, Av del Libertador 8250, Buenos Aires 1429 (Argentina); Fink, J. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina)

2006-07-01

16

Resolving the mass-anisotropy degeneracy of the spherically symmetric Jeans equation - II. Optimum smoothing and model validation  

NASA Astrophysics Data System (ADS)

The spherical Jeans equation is widely used to estimate the mass content of stellar systems with apparent spherical symmetry. However, this method suffers from a degeneracy between the assumed mass density and the kinematic anisotropy profile, β(r). In a previous work, we laid the theoretical foundations for an algorithm that combines smoothing B splines with equations from dynamics to remove this degeneracy. Specifically, our method reconstructs a unique kinematic profile of σ_rr² and σ_tt² for an assumed free functional form of the potential and mass density (Φ, ρ) and given a set of observed line-of-sight velocity dispersion measurements, σ_los². In Paper I, we demonstrated the efficiency of our algorithm with a very simple example and we commented on the need for optimum smoothing of the B-spline representation; this is in order to avoid unphysical variational behaviour when we have large uncertainty in our data. In the current contribution, we present a process of finding the optimum smoothing for a given data set by using information of the behaviour from known ideal theoretical models. Markov Chain Monte Carlo methods are used to explore the degeneracy in the dynamical modelling process. We validate our model through applications to synthetic data for systems with constant or variable mass-to-light ratio Υ. In all cases, we recover excellent fits of theoretical functions to observables and unique solutions. Our algorithm is a robust method for the removal of the mass-anisotropy degeneracy of the spherically symmetric Jeans equation for an assumed functional form of the mass density.
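For reference, the spherically symmetric Jeans equation that underlies the mass-anisotropy degeneracy discussed above can be written, in standard stellar-dynamics notation (quoted from general usage rather than from this paper, which uses its own σ_rr, σ_tt convention), as

    \frac{d\left(\nu \sigma_r^{2}\right)}{dr}
      + \frac{2\,\beta(r)}{r}\,\nu \sigma_r^{2}
      = -\nu \frac{d\Phi}{dr},
    \qquad
    \beta(r) \equiv 1 - \frac{\sigma_\theta^{2}}{\sigma_r^{2}},

where ν(r) is the tracer density, Φ(r) the gravitational potential, and σ_r, σ_θ the radial and tangential velocity dispersions. The degeneracy arises because different combinations of β(r) and Φ(r) can reproduce the same projected line-of-sight dispersion σ_los².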

Diakogiannis, Foivos I.; Lewis, Geraint F.; Ibata, Rodrigo A.

2014-09-01

17

Probabilistic Methods for Model Validation  

E-print Network

Examples from physics are used to argue that validation and refinement are constructive iterative processes [9, 10] fundamental to scientific and technological advancement. One of the early examples of model validation/invalidation came from Nicolaus Copernicus in 1543, who proposed the heliocentric model that opposed the centuries-old geocentric model of Aristotle and Ptolemy. However, experimental validation of Copernicus's model had to wait until the invention of the telescope in the 17th century that led...

Halder, Abhishek

2014-05-01

18

Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes

PubMed Central

Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559

Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

2011-01-01

19

Macrotransport-solidification kinetics modeling of equiaxed dendritic growth: Part II. Computation problems and validation on INCONEL 718 superalloy castings  

Microsoft Academic Search

In Part I of the article, a new analytical model that describes solidification of equiaxed dendrites was presented. In this part of the article, the model is used to simulate the solidification of INCONEL 718 superalloy castings. The model was incorporated into a commercial finite-element code, PROCAST. A special procedure called microlatent heat method (MLHM) was used for coupling between macroscopic heat flow and microscopic growth kinetics.

L. Nastac; D. M. Stefanescu

1996-01-01

20

Predicting germination in semi-arid wildland seedbeds II. Field validation of wet thermal-time models  

Microsoft Academic Search

Accurate prediction of germination for species used for semi-arid land revegetation would support selection of plant materials for specific climatic conditions and sites. Wet thermal-time models predict germination time by summing progress toward germination subpopulation percentages as a function of temperature across intermittent wet periods or within singular wet periods. Wet periods may be defined by any reasonable seedbed water

Jennifer K. Rawlins; Bruce A. Roundy; Dennis Egget; Nathan Cline

21

Thermospheric dynamics during September 18-19, 1984. II - Validation of the NCAR thermospheric general circulation model  

NASA Technical Reports Server (NTRS)

The winds, temperatures, and densities predicted by the thermospheric GCM are compared with measurements from the Equinox Transition Study of September 17-24, 1984. Agreement between predictions and observation is good in many respects. The quiet day observations contain a strong semidiurnal wind variation which is mainly due to upward-propagating tides. The storm day wind behavior is significantly different and includes a surge of equatorward winds due to a global propagating disturbance associated with the storm onset. A quantitative statistical comparison of the predicted and measured winds indicates that the equatorward winds in the model are weaker than the observed winds, particularly during storm times. A quiet day phase anomaly in the measured F region winds which is not reproduced by the model suggests the occurrence of an important unmodeled interaction between upward propagating semidiurnal tides and high-latitude effects.

Crowley, G.; Emery, B. A.; Roble, R. G.; Carlson, H. C., Jr.; Salah, J. E.

1989-01-01

22

Mechanical behavior of glass/epoxy tubes under combined static loading. Part II: Validation of FEA progressive damage model

Microsoft Academic Search

Experimental results from a series of biaxial static tests of E-Glass/Epoxy tubular specimens [±45]2 were compared successfully with numerical predictions from thick shell FE calculations. Stress analysis was performed in a progressive damage sense consisting of layer piece-wise linear elastic behavior, simulating lamina anisotropic non-linear constitutive equations, failure mode-dependent criteria and property degradation strategies. The effect of accurate modeling of

Alexandros E. Antoniou; Christoph Kensche; Theodore P. Philippidis

2009-01-01

23

Model Fe-Al Steel with Exceptional Resistance to High Temperature Coarsening. Part II: Experimental Validation and Applications  

NASA Astrophysics Data System (ADS)

In order to achieve a fine uniform grain-size distribution using the process of thin slab casting and direct rolling (TSCDR), it is necessary to control the grain size prior to the onset of thermomechanical processing. In the companion paper, Model Fe-Al Steel with Exceptional Resistance to High Temperature Coarsening. Part I: Coarsening Mechanism and Particle Pinning Effects, a new steel composition which uses a small volume fraction of austenite particles to pin the growth of delta-ferrite grains at high temperature was proposed, and grain growth was studied in reheated samples. This paper will focus on the development of a simple laboratory-scale setup to simulate thin-slab casting of the newly developed steel and demonstrate the potential for grain-size control under industrial conditions. Steel bars with different diameters are briefly dipped into the molten steel to create a shell of solidified material. These are then cooled down to room temperature at different cooling rates. During cooling, the austenite particles nucleate along the delta-ferrite grain boundaries and greatly retard grain growth. With decreasing temperature, more austenite particles precipitate, and grain growth can be completely arrested in the holding furnace. Additional applications of the model alloy are discussed, including grain-size control in the heat-affected zone in welds and grain-growth resistance at high temperature.

Zhou, Tihe; Zhang, Peng; O'Malley, Ronald J.; Zurob, Hatem S.; Subramanian, Mani

2015-01-01

24

Coupling an Advanced Land Surface Hydrology Model with the Penn State NCAR MM5 Modeling System. Part II: Preliminary Model Validation  

Microsoft Academic Search

A number of short-term numerical experiments conducted by the Penn State-NCAR fifth-generation Mesoscale Model (MM5) coupled with an advanced land surface model, alongside the simulations coupled with a simple slab model, are verified with observations. For clear sky day cases, the MM5 model gives reasonable estimates of radiation forcing at the surface with solar radiation being slightly overestimated probably due

Fei Chen; Jimy Dudhia

2001-01-01

25

Modeling diesel engine using KIVA II 3D-code: Validation of a new global combustion model and its sensitivity to the spatial discretization  

SciTech Connect

The present work consists of two main parts: the first part deals with the simulation, with the aid of a modified version of the KIVA-II code, of the global combustion process in a compression ignition engine with direct injection; the second part describes the sensitivity of the code to spatial discretization. The results obtained from the simulations of the entire analysis are discussed in relation to the experimental data relevant to a DI unit of medium displacement, a Ruggerini RP 170. The first part of this work describes some of the considerable changes made to the combustion model of the original KIVA-II code. These changes have remarkably improved the code's ability in simulating the overall combustion process. In particular, models were implemented to take into account the auto-ignition process delay, the diffusively controlled combustion and a transition criterion--based on a dynamic calculation of the delay time--between the first phase of combustion (kinetic control) and the diffusive control phase. The second part of this work analyzes the sensitivity of the modified code to variations of the refinement degree in the azimuthal direction of the mesh. This analysis is carried out by keeping all the parameters of the global combustion model--described in the first part of this work--constant. Both the fuel spray dynamics and the combustion phase are analyzed in detail. A significant dependence upon the mesh is noticed, in relation both to the spray evolution and to the combustion process. In particular, by refining the mesh, an increasing displacement of the simulated pressure profile from the experimental one has been noticed, while the trend of the pressure profile is still quite correct.

Mariani, F.; Postrioti, L.

1996-09-01

26

TUTORIAL: Validating biorobotic models  

NASA Astrophysics Data System (ADS)

Some issues in neuroscience can be addressed by building robot models of biological sensorimotor systems. What we can conclude from building models or simulations, however, is determined by a number of factors in addition to the central hypothesis we intend to test. These include the way in which the hypothesis is represented and implemented in simulation, how the simulation output is interpreted, how it is compared to the behaviour of the biological system, and the conditions under which it is tested. These issues will be illustrated by discussing a series of robot models of cricket phonotaxis behaviour.

Webb, Barbara

2006-09-01

27

Validation of SAGE II ozone measurements  

NASA Technical Reports Server (NTRS)

Five ozone profiles from the Stratospheric Aerosol and Gas Experiment (SAGE) II are compared with coincident ozonesonde measurements obtained at Natal, Brazil, and Wallops Island, Virginia. It is shown that the mean difference between all of the measurements is about 1 percent and that the agreement is within 7 percent at altitudes between 20 and 53 km. Good agreement is also found for ozone mixing ratios on pressure surfaces. It is concluded that the SAGE II profiles provide useful ozone information up to about 60 km altitude.

Cunnold, D. M.; Chu, W. P.; Mccormick, M. P.; Veiga, R. E.; Barnes, R. A.

1989-01-01

28

Statistical validation of system models  

SciTech Connect

It is common practice in system analysis to develop mathematical models for system behavior. Frequently, the actual system being modeled is also available for testing and observation, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of systems when data taken during operation of the system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. An extension of the technique is also suggested, wherein randomness may be included in the mathematical model through the introduction of random variable and random process terms. These terms cause random system behavior that can be compared to the randomness in the bootstrap evaluation of experimental system behavior. In this framework, the stochastic mathematical model can be evaluated. A numerical example is presented to demonstrate the application of the technique.
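A minimal sketch of the kind of bootstrap-based check described above: experimental observations are resampled to build a confidence interval for a statistic of interest, and the corresponding statistic computed from the mathematical model is judged by whether it falls inside that interval. The chosen statistic, data, and interface are illustrative assumptions, not the paper's actual procedure.

    import numpy as np

    def bootstrap_interval(data, statistic, n_boot=2000, alpha=0.05, seed=None):
        """Percentile bootstrap confidence interval for a user-specified statistic."""
        rng = np.random.default_rng(seed)
        boot = [statistic(rng.choice(data, size=len(data), replace=True))
                for _ in range(n_boot)]
        return np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])

    # Hypothetical peak-response measurements from repeated tests of the real system.
    experimental = np.array([1.92, 2.05, 1.88, 2.11, 1.97, 2.03, 1.85, 2.08])
    lo, hi = bootstrap_interval(experimental, np.mean, seed=0)

    model_prediction = 1.99          # the same statistic computed from the mathematical model
    acceptable = lo <= model_prediction <= hi
    print(f"95% CI = [{lo:.3f}, {hi:.3f}]; model statistic inside interval: {acceptable}")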

Barney, P. [Sandia National Labs., Albuquerque, NM (United States); Ferregut, C.; Perez, L.E. [Texas Univ., El Paso, TX (United States); Hunter, N.F. [Los Alamos National Lab., NM (United States); Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States)

1997-01-01

29

Uncertainty Modeling Via Frequency Domain Model Validation  

NASA Technical Reports Server (NTRS)

The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much attention in the literature. The influence of the level of accuracy of the uncertainty model on closed loop performance has received even less attention. The research reported herein is an initial step in applying and extending the concept of model validation to the problem of obtaining practical uncertainty models for robust control analysis and design applications. An extension of model validation called 'sequential validation' is presented and applied to a simple spring-mass-damper system to establish the feasibility of the approach and demonstrate the benefits of the new developments.

Waszak, Martin R.; Andrisani, Dominick, II

1999-01-01

30

Turbulence Modeling Verification and Validation  

NASA Technical Reports Server (NTRS)

Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. 
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
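As a toy illustration of the verification step via the method of manufactured solutions (MMS) mentioned above, the sketch below verifies a 1-D diffusion solver: a solution is chosen in advance, the matching source term is derived analytically, and the observed order of accuracy is checked under grid refinement. This is a generic exercise under simple assumptions, not any workshop's or NASA's actual code.

    import numpy as np

    # Manufactured solution u(x) = sin(pi x) for -u'' = f on [0, 1] with u(0) = u(1) = 0,
    # which requires the source term f(x) = pi^2 sin(pi x).
    u_exact = lambda x: np.sin(np.pi * x)
    source = lambda x: np.pi ** 2 * np.sin(np.pi * x)

    def solve(n):
        """Second-order central-difference solve of -u'' = f with the manufactured source."""
        x = np.linspace(0.0, 1.0, n + 1)
        h = 1.0 / n
        A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h ** 2
        u = np.zeros(n + 1)
        u[1:-1] = np.linalg.solve(A, source(x[1:-1]))
        return np.max(np.abs(u - u_exact(x)))        # maximum discretization error

    # The error should drop by roughly 4x per grid doubling (observed order of accuracy ~2).
    errors = [solve(n) for n in (16, 32, 64)]
    orders = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
    print(errors, orders)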

Rumsey, Christopher L.

2014-01-01

31

Statistical validation of stochastic models  

SciTech Connect

It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.

Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

1996-12-31

32

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor 2. Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1992-01-01

33

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor 2. Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1992-07-01

34

The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models.  

PubMed

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis. PMID:20676074
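The "tested on data that had not been used for training" step can be illustrated with a generic held-out-test-set sketch; the data, the classifier, and the use of the Matthews correlation coefficient as the reported metric are placeholders for illustration, not the MAQC-II pipelines themselves.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import matthews_corrcoef

    # Placeholder "expression matrix": 200 samples x 500 genes, binary endpoint.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 500))
    y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

    # Hold out an external test set that plays no role in training or model selection.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("MCC on held-out data:", matthews_corrcoef(y_test, model.predict(X_test)))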

Shi, Leming; Campbell, Gregory; Jones, Wendell D; Campagne, Fabien; Wen, Zhining; Walker, Stephen J; Su, Zhenqiang; Chu, Tzu-Ming; Goodsaid, Federico M; Pusztai, Lajos; Shaughnessy, John D; Oberthuer, André; Thomas, Russell S; Paules, Richard S; Fielden, Mark; Barlogie, Bart; Chen, Weijie; Du, Pan; Fischer, Matthias; Furlanello, Cesare; Gallas, Brandon D; Ge, Xijin; Megherbi, Dalila B; Symmans, W Fraser; Wang, May D; Zhang, John; Bitter, Hans; Brors, Benedikt; Bushel, Pierre R; Bylesjo, Max; Chen, Minjun; Cheng, Jie; Cheng, Jing; Chou, Jeff; Davison, Timothy S; Delorenzi, Mauro; Deng, Youping; Devanarayan, Viswanath; Dix, David J; Dopazo, Joaquin; Dorff, Kevin C; Elloumi, Fathi; Fan, Jianqing; Fan, Shicai; Fan, Xiaohui; Fang, Hong; Gonzaludo, Nina; Hess, Kenneth R; Hong, Huixiao; Huan, Jun; Irizarry, Rafael A; Judson, Richard; Juraeva, Dilafruz; Lababidi, Samir; Lambert, Christophe G; Li, Li; Li, Yanen; Li, Zhen; Lin, Simon M; Liu, Guozhen; Lobenhofer, Edward K; Luo, Jun; Luo, Wen; McCall, Matthew N; Nikolsky, Yuri; Pennello, Gene A; Perkins, Roger G; Philip, Reena; Popovici, Vlad; Price, Nathan D; Qian, Feng; Scherer, Andreas; Shi, Tieliu; Shi, Weiwei; Sung, Jaeyun; Thierry-Mieg, Danielle; Thierry-Mieg, Jean; Thodima, Venkata; Trygg, Johan; Vishnuvajjala, Lakshmi; Wang, Sue Jane; Wu, Jianping; Wu, Yichao; Xie, Qian; Yousef, Waleed A; Zhang, Liang; Zhang, Xuegong; Zhong, Sheng; Zhou, Yiming; Zhu, Sheng; Arasappan, Dhivya; Bao, Wenjun; Lucas, Anne Bergstrom; Berthold, Frank; Brennan, Richard J; Buness, Andreas; Catalano, Jennifer G; Chang, Chang; Chen, Rong; Cheng, Yiyu; Cui, Jian; Czika, Wendy; Demichelis, Francesca; Deng, Xutao; Dosymbekov, Damir; Eils, Roland; Feng, Yang; Fostel, Jennifer; Fulmer-Smentek, Stephanie; Fuscoe, James C; Gatto, Laurent; Ge, Weigong; Goldstein, Darlene R; Guo, Li; Halbert, Donald N; Han, Jing; Harris, Stephen C; Hatzis, Christos; Herman, Damir; Huang, Jianping; Jensen, Roderick V; Jiang, Rui; Johnson, Charles D; Jurman, Giuseppe; Kahlert, Yvonne; Khuder, Sadik A; Kohl, Matthias; Li, Jianying; Li, Li; Li, Menglong; Li, Quan-Zhen; Li, Shao; Li, Zhiguang; Liu, Jie; Liu, Ying; Liu, Zhichao; Meng, Lu; Madera, Manuel; Martinez-Murillo, Francisco; Medina, Ignacio; Meehan, Joseph; Miclaus, Kelci; Moffitt, Richard A; Montaner, David; Mukherjee, Piali; Mulligan, George J; Neville, Padraic; Nikolskaya, Tatiana; Ning, Baitang; Page, Grier P; Parker, Joel; Parry, R Mitchell; Peng, Xuejun; Peterson, Ron L; Phan, John H; Quanz, Brian; Ren, Yi; Riccadonna, Samantha; Roter, Alan H; Samuelson, Frank W; Schumacher, Martin M; Shambaugh, Joseph D; Shi, Qiang; Shippy, Richard; Si, Shengzhu; Smalter, Aaron; Sotiriou, Christos; Soukup, Mat; Staedtler, Frank; Steiner, Guido; Stokes, Todd H; Sun, Qinglan; Tan, Pei-Yi; Tang, Rong; Tezak, Zivana; Thorn, Brett; Tsyganova, Marina; Turpaz, Yaron; Vega, Silvia C; Visintainer, Roberto; von Frese, Juergen; Wang, Charles; Wang, Eric; Wang, Junwei; Wang, Wei; Westermann, Frank; Willey, James C; Woods, Matthew; Wu, Shujian; Xiao, Nianqing; Xu, Joshua; Xu, Lei; Yang, Lun; Zeng, Xiao; Zhang, Jialu; Zhang, Li; Zhang, Min; Zhao, Chen; Puri, Raj K; Scherf, Uwe; Tong, Weida; Wolfinger, Russell D

2010-08-01

35

The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models  

PubMed Central

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis. PMID:20676074

2012-01-01

36

Developing better and more valid animal models of brain disorders.  

PubMed

Valid, sensitive animal models are crucial for understanding the pathobiology of complex human disorders, such as anxiety, autism, depression and schizophrenia, all of which have a 'spectrum' nature. Discussing new important strategic directions of research in this field, here we focus on (i) cross-species validation of animal models, (ii) ensuring their population (external) validity, and (iii) the need to target the interplay between multiple disordered domains. We note that optimal animal models of brain disorders should target evolutionarily conserved 'core' traits/domains and specifically mimic the clinically relevant inter-relationships between these domains. PMID:24384129

Stewart, Adam Michael; Kalueff, Allan V

2015-01-01

37

Assimilation of Geosat Altimetric Data in a Nonlinear Shallow-Water Model of the Indian Ocean by Adjoint Approach. Part II: Some Validation and Interpretation of the Assimilated Results

NASA Technical Reports Server (NTRS)

This paper examines the results of assimilating Geosat sea level variations, relative to the November 1986-November 1988 mean reference, in a nonlinear reduced-gravity model of the Indian Ocean. Data have been assimilated during one year starting in November 1986 with the objective of optimizing the initial conditions and the yearly averaged reference surface. The thermocline slope simulated by the model with or without assimilation is validated by comparison with the signal, which can be derived from expendable bathythermograph measurements performed in the Indian Ocean at that time. The topography simulated with assimilation on November 1986 is in very good agreement with the hydrographic data. The slopes corresponding to the South Equatorial Current and to the South Equatorial Countercurrent are better reproduced with assimilation than without during the first nine months. The whole circulation of the cyclonic gyre south of the equator is then strongly intensified by assimilation. Another assimilation experiment is run over the following year starting in November 1987. The difference between the two yearly mean surfaces simulated with assimilation is in excellent agreement with Geosat. In the southeastern Indian Ocean, the correction to the yearly mean dynamic topography due to assimilation over the second year is negatively correlated with the one the year before. This correction is also in agreement with hydrographic data. It is likely that the signal corrected by assimilation is not only due to wind error, because simulations driven by various wind forcings present the same features over the two years. Model simulations run with a prescribed throughflow transport anomaly indicate that assimilation is instead correcting, in the interior of the model domain, for inadequate boundary conditions with the Pacific.

Greiner, Eric; Perigaud, Claire

1996-01-01

38

Ground-water models cannot be validated  

Microsoft Academic Search

Ground-water models are embodiments of scientific hypotheses. As such, the models cannot be proven or validated, but only tested and invalidated. However, model testing and the evaluation of predictive errors lead to improved models and a better understanding of the problem at hand. In applying ground-water models to field problems, errors arise from conceptual deficiencies, numerical errors, and inadequate parameter

Leonard F. Konikow; John D. Bredehoeft

1992-01-01

39

Tritium modeling/BEATRIX-II data analysis  

SciTech Connect

Models have been developed to describe the tritium transport in Li2O. The mechanisms considered are bulk diffusion, surface desorption, surface adsorption, and solubility. These models have been incorporated into the TIARA steady-state inventory code and the DISPL2 steady-state and transient code. Preliminary validation efforts have focused on the inventory and tritium release rate data from in-reactor, purge-flow tests VOM-15H, EXOTIC-2, CRITIC-1, and MOZART. The models and validation effort are reported in detail in ANL/FPP/TM-260. Since the BEATRIX-II data were released officially in November 1991, validation efforts have been concentrated on the tritium release rate data from the "isothermal" thin-ring sample. In this report, results are presented for the comparison of predicted long-time inventory changes (in response to temperature and hydrogen purge pressure changes) to values determined from the tritium release data.

Billone, M.C.; Attaya, H.; Johnson, C.E.; Kopasz, J.P.

1992-12-31

40

Computational Modeling and Experimental Validation of Aviation Security Procedures

E-print Network

Security of civil aviation has become a major concern in recent years, leading to this work on computational modeling and experimental validation of aviation security procedures, combining abstract state machine (ASM) specification techniques...

Zhang, Richard "Hao"

41

Algorithm for model validation: theory and applications.  

PubMed

Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability. PMID:17420476

Sornette, D; Davis, A B; Ide, K; Vixie, K R; Pisarenko, V; Kamm, J R

2007-04-17

42

Factorial validity and measurement invariance across intelligence levels and gender of the Overexcitabilities Questionnaire-II (OEQ-II).  

PubMed

The concept of overexcitability, derived from Dabrowski's theory of personality development, offers a promising approach for the study of the developmental dynamics of giftedness. The present study aimed at (a) examining the factorial structure of the Overexcitabilities Questionnaire-II scores (OEQ-II) and (b) testing measurement invariance of these scores across intelligence and gender. A sample of 641 Dutch-speaking adolescents from 11 to 15 years old, 363 girls and 278 boys, participated in this study. Results showed that a model without cross-loadings did not fit the data well (using confirmatory factor analysis), whereas a factor model in which all cross-loadings were included yielded fit statistics that were in support of the factorial structure of the OEQ-II scores (using exploratory structural equation modeling). Furthermore, our findings supported the assumption of (partial) strict measurement invariance of the OEQ-II scores across intelligence levels and across gender. Such levels of measurement invariance allow valid comparisons between factor means and factor relationships across groups. In particular, the gifted group scored significantly higher on intellectual and sensual overexcitability (OE) than the nongifted group, girls scored higher on emotional and sensual OE than boys, and boys scored higher on intellectual and psychomotor OE than girls. PMID:24079958

Van den Broeck, Wim; Hofmans, Joeri; Cooremans, Sven; Staels, Eva

2014-03-01

43

Inert doublet model and LEP II limits  

SciTech Connect

The inert doublet model is a minimal extension of the standard model introducing an additional SU(2) doublet with new scalar particles that could be produced at accelerators. While there exists no LEP II analysis dedicated for these inert scalars, the absence of a signal within searches for supersymmetric neutralinos can be used to constrain the inert doublet model. This translation however requires some care because of the different properties of the inert scalars and the neutralinos. We investigate what restrictions an existing DELPHI Collaboration study of neutralino pair production can put on the inert scalars and discuss the result in connection with dark matter. We find that although an important part of the inert doublet model parameter space can be excluded by the LEP II data, the lightest inert particle still constitutes a valid dark matter candidate.

Lundstroem, Erik; Gustafsson, Michael; Edsjoe, Joakim [Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); INFN, Sezione di Padova, Department of Physics 'Galileo Galilei', Via Marzolo 8, I-35131, Padua (Italy) and Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden)

2009-02-01

44

Validation of the Hot Strip Mill Model  

SciTech Connect

The Hot Strip Mill Model (HSMM) is an off-line, PC-based software package originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of five North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

Richard Shulkosky; David Rosberg; Jerrud Chapman

2005-03-30

45

SIMPLIFIED GINZBURG-LANDAU MODELS FOR SUPERCONDUCTIVITY VALID FOR HIGH KAPPA AND HIGH FIELDS  

E-print Network

SIMPLIFIED GINZBURG-LANDAU MODELS FOR SUPERCONDUCTIVITY VALID FOR HIGH KAPPA AND HIGH FIELDS. An expansion is used to simplify the Ginzburg-Landau model of superconductivity in the limit of large values of the Ginzburg-Landau parameter and high applied fields … by a vortex of superconducting current. At the core of each vortex the density of superconducting charge

Chapman, Jon

46

Validating Mediator Cost Models (Hubert Naacke)  

E-print Network

…of Disco's cost model based on experimentation with real Web data sources. This validation shows the effectiveness … (From the French abstract:) … a mediation system developed at INRIA for accessing heterogeneous, distributed data sources … real data sources accessible on the Web. This validation shows the effectiveness of our cost model …

Zimmerman, John

47

On validation and invalidation of biological models  

PubMed Central

Background Very frequently the same biological system is described by several, sometimes competing mathematical models. This usually creates confusion around their validity, ie, which one is correct. However, this is unnecessary since validity of a model cannot be established; model validation is actually a misnomer. In principle the only statement that one can make about a system model is that it is incorrect, ie, invalid, a fact which can be established given appropriate experimental data. Nonlinear models of high dimension and with many parameters are impossible to invalidate through simulation and as such the invalidation process is often overlooked or ignored. Results We develop different approaches for showing how competing ordinary differential equation (ODE) based models of the same biological phenomenon containing nonlinearities and parametric uncertainty can be invalidated using experimental data. We first emphasize the strong interplay between system identification and model invalidation and we describe a method for obtaining a lower bound on the error between candidate model predictions and data. We then turn to model invalidation and formulate a methodology for discrete-time and continuous-time model invalidation. The methodology is algorithmic and uses Semidefinite Programming as the computational tool. It is emphasized that trying to invalidate complex nonlinear models through exhaustive simulation is not only computationally intractable but also inconclusive. Conclusion Biological models derived from experimental data can never be validated. In fact, in order to understand biological function one should try to invalidate models that are incompatible with available data. This work describes a framework for invalidating both continuous and discrete-time ODE models based on convex optimization techniques. The methodology does not require any simulation of the candidate models; the algorithms presented in this paper have a worst case polynomial time complexity and can provide an exact answer to the invalidation problem. PMID:19422679
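The paper's Semidefinite Programming machinery is well beyond a short example, but a toy sketch (with entirely hypothetical data and candidate model) can illustrate the basic comparison it formalizes: how small the error between a candidate ODE model's predictions and the data can be made over the admissible parameter range, relative to a measurement-error bound. Note that the grid search below only approximates the best fit (it upper-bounds the true minimal error); the paper's approach instead certifies a lower bound, which is what makes an invalidation conclusive.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical data: noisy observations of x(t) (assumed, not from the paper)
t_obs = np.linspace(0.0, 5.0, 11)
x_obs = np.array([1.00, 0.64, 0.42, 0.28, 0.19, 0.13, 0.09, 0.06, 0.04, 0.03, 0.02])
noise_bound = 0.05  # assumed worst-case measurement error per sample

def simulate(decay, saturation):
    """Candidate model: dx/dt = -decay * x / (1 + saturation * x)."""
    rhs = lambda t, x: -decay * x / (1.0 + saturation * x)
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), [x_obs[0]], t_eval=t_obs, rtol=1e-8)
    return sol.y[0]

# Coarse grid over the admissible parameter box: approximates the best-fitting
# parameters; an SDP certificate would be needed to turn this into a proof.
best_err = min(
    np.max(np.abs(simulate(k, s) - x_obs))
    for k in np.linspace(0.1, 2.0, 20)
    for s in np.linspace(0.0, 2.0, 20)
)
print(f"best worst-case error over grid: {best_err:.3f}")
print("no grid point consistent with the data" if best_err > noise_bound
      else "not invalidated by this (inconclusive) test")
```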

Anderson, James; Papachristodoulou, Antonis

2009-01-01

48

Numerical model representation and validation strategies  

SciTech Connect

This paper describes model representation and validation strategies for use in numerical tools that define models in terms of topology, geometry, or topography. Examples of such tools include Computer-Assisted Engineering (CAE), Computer-Assisted Manufacturing (CAM), Finite Element Analysis (FEA), and Virtual Environment Simulation (VES) tools. These tools represent either physical objects or conceptual ideas using numerical models for the purpose of posing a question, performing a task, or generating information. Dependence on these numerical representations requires that models be precise, consistent across different applications, and verifiable. This paper describes a strategy for ensuring precise, consistent, and verifiable numerical model representations in a topographic framework. The main assertion put forth is that topographic model descriptions are more appropriate for numerical applications than topological or geometrical descriptions. A topographic model verification and validation methodology is presented.

Dolin, R.M.; Hefele, J.

1997-10-01

49

Statistical validation of physical system models  

SciTech Connect

It is common practice in applied mechanics to develop mathematical models for mechanical system behavior. Frequently, the actual physical system being modeled is also available for testing, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of physical systems when data taken during operation of the physical system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a physical system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the physical system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. A numerical example is presented to demonstrate the application of the technique.
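A minimal sketch, with entirely hypothetical data, of the bootstrap idea this abstract describes: resample the experimental measurements to estimate a confidence interval for a statistic of interest, then see where the mathematical model's value of the same statistic falls. The measured values, statistic, and model prediction below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test data: peak responses measured over repeated runs
# of the physical system (illustrative values only).
measured = rng.normal(loc=9.8, scale=0.6, size=40)
model_prediction = 10.4  # the mathematical model's value of the same statistic

# Bootstrap the mean peak response to estimate its sampling distribution.
boot_means = np.array([
    rng.choice(measured, size=measured.size, replace=True).mean()
    for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])

print(f"95% bootstrap interval for the measured statistic: [{lo:.2f}, {hi:.2f}]")
print("model inside interval" if lo <= model_prediction <= hi
      else "model outside interval -> questionable for this measure")
```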

Paez, T.L.; Barney, P. [Sandia National Lab., Albuquerque, NM (United States); Hunter, N.F. [Los Alamos National Lab., NM (United States); Ferregut, C.; Perez, L.E. [Univ. of Texas, El Paso, TX (United States). FAST Center for Structural Integrity of Aerospace Systems

1996-10-01

50

Evaluation (not validation) of quantitative models.  

PubMed Central

The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid. Modelers and policymakers must continue to work toward finding effective ways to evaluate and judge the quality of their models, and to develop appropriate terminology to communicate these judgments to the public whose health and safety may be at stake. PMID:9860904

Oreskes, N

1998-01-01

51

Evaluation (not validation) of quantitative models.  

PubMed

The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid. Modelers and policymakers must continue to work toward finding effective ways to evaluate and judge the quality of their models, and to develop appropriate terminology to communicate these judgments to the public whose health and safety may be at stake. PMID:9860904

Oreskes, N

1998-12-01

52

Holism and Entrenchment in Climate Model Validation  

Microsoft Academic Search

Recent work in the domain of the validation of complex computational models reveals that modelers of complex systems, particularly modelers of the earth's climate, face a deeply entrenched form of confirmation holism. Confirmation holism, as it is traditionally understood, is the thesis that a single hypothesis cannot be tested in isolation, but that such tests always depend on other theories

Johannes Lenhard; Eric Winsberg

53

Validation of an Experimentally Derived Uncertainty Model  

NASA Technical Reports Server (NTRS)

The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.

Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.

1996-01-01

54

Validation of Space Weather Models at Community Coordinated Modeling Center  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) is a multiagency partnership which aims at the creation of next-generation space weather models. The CCMC's goal is to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate the deployment of advanced models in forecasting operations. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. The presentation will demonstrate the recent progress in CCMC metrics and validation activities.

Kuznetsova, M. M.; Pulkkinen, A.; Rastaetter, L.; Hesse, M.; Chulaki, A.; Maddox, M.

2011-01-01

55

Reliability and Validity of the Beck Depression Inventory--II with Adolescent Psychiatric Inpatients  

ERIC Educational Resources Information Center

This investigation was conducted to validate the Beck Depression Inventory--II (BDI-II; A. T. Beck, R. A. Steer, & G. K. Brown, 1996) in samples of adolescent psychiatric inpatients. The sample in each substudy was primarily Caucasian. In Study 1, expert raters (N=7) and adolescent psychiatric inpatients (N=13) evaluated the BDI-II items to assess…

Osman, Augustine; Kopper, Beverly A; Barrios, Frank; Gutierrez, Peter M.; Bagge, Courtney L.

2004-01-01

56

Verification validation and accreditation of simulation models  

Microsoft Academic Search

This paper presents guidelines for conducting verification, validation and accreditation (VV&A) of simulation models. Fifteen guiding principles are introduced to help researchers, practitioners and managers better comprehend what VV&A is all about. The VV&A activities are described in the modeling and simulation life cycle. A taxonomy of more than 77 V&V techniques is provided to assist simulationists

Osman Balci

1997-01-01

57

A Hierarchical Systems Approach to Model Validation  

NASA Astrophysics Data System (ADS)

Existing approaches to the question of how climate models should be evaluated tend to rely on either philosophical arguments about the status of models as scientific tools, or on empirical arguments about how well runs from a given model match observational data. These have led to quantitative approaches expressed in terms of model bias or forecast skill, and ensemble approaches where models are assessed according to the extent to which the ensemble brackets the observational data. Unfortunately, such approaches focus the evaluation on models per se (or more specifically, on the simulation runs they produce) as though the models can be isolated from their context. Such an approach may overlook a number of important aspects of the use of climate models: - the process by which models are selected and configured for a given scientific question. - the process by which model outputs are selected, aggregated and interpreted by a community of expertise in climatology. - the software fidelity of the models (i.e. whether the running code is actually doing what the modellers think it's doing). - the (often convoluted) history that begat a given model, along with the modelling choices long embedded in the code. - variability in the scientific maturity of different model components within a coupled system. These omissions mean that quantitative approaches cannot assess whether a model produces the right results for the wrong reasons, or conversely, the wrong results for the right reasons (where, say, the observational data is problematic, or the model is configured to be unlike the earth system for a specific reason). Hence, we argue that it is a mistake to think that validation is a post-hoc process to be applied to an individual "finished" model, to ensure it meets some criteria for fidelity to the real world. We are therefore developing a framework for model validation that extends current approaches down into the detailed codebase and the processes by which the code is built and tested; and up into the broader scientific context in which models are selected and used to explore theories and test hypotheses. By taking software testing into account, we can build up a picture of the day-to-day practices by which modellers make small changes to the model and test the effect of such changes, both in isolated sections of code, and on the climatology of a full model. By taking the broader scientific context into account, we examine how features of the entire scientific enterprise improve (or impede) model validity, from the collection of observational data, creation of theories, use of these theories to develop models, choices for which model and which model configuration to use, choices for how to set up the runs, and interpretation of the results. Our approach cannot quantify model validity, but it can provide a systematic account of how the detailed practices involved in the development and use of climate models contribute to the quality of modelling systems and the scientific enterprise that they support. By making the relationships between these practices and model quality more explicit, we expect to identify specific strengths and weaknesses of the modelling systems, particularly with respect to structural uncertainty in the models, and better characterize the "unknown unknowns".

Easterbrook, S. M.

2011-12-01

58

Concepts of Model Verification and Validation  

SciTech Connect

Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Code developers developing computer programs perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. 
Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for highly accurate numerical models for making predictions to s

B.H.Thacker; S.W.Doebling; F.M.Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

2004-10-30

59

CSC6870 Computer Graphics II Geometric Modeling  

E-print Network

CSC6870 Computer Graphics II: Geometric Modeling. Lecture slides covering an overview of 3D shape representations (subdivision surfaces, implicit surfaces, particles, solids) and basic and fundamental shapes.

Hua, Jing

60

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor II (EBR-II). Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP, have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs. 29 refs., 6 figs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1991-01-01

61

Teaching "Instant Experience" with Graphical Model Validation Techniques  

ERIC Educational Resources Information Center

Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
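The article concerns reading such plots rather than producing them, but a minimal sketch of the standard diagnostics for a linear normal model (residuals versus fitted values and a normal QQ-plot), using made-up data, may help fix ideas.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical data from a linear normal model y = 2 + 0.5 x + noise
x = rng.uniform(0, 10, 80)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

# Ordinary least squares fit and residuals
slope, intercept = np.polyfit(x, y, 1)
fitted = intercept + slope * x
residuals = y - fitted

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
ax1.scatter(fitted, residuals, s=15)
ax1.axhline(0, color="grey", lw=1)
ax1.set(xlabel="Fitted values", ylabel="Residuals", title="Residuals vs fitted")
stats.probplot(residuals, dist="norm", plot=ax2)  # normal QQ-plot
plt.tight_layout()
plt.show()
```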

Ekstrøm, Claus Thorn

2014-01-01

62

Validating the Beck Depression Inventory-II for Hong Kong Community Adolescents  

ERIC Educational Resources Information Center

The primary purpose of this study was to test for the validity of a Chinese version of the Beck Depression Inventory-II (C-BDI-II) for use with Hong Kong community (i.e., nonclinical) adolescents. Based on a randomized triadic split of the data (N = 1460), we conducted exploratory factor analysis on Group1 (n = 486) and confirmatory factor…

Byrne, Barbara M.; Stewart, Sunita M.; Lee, Peter W. H.

2004-01-01

63

Validation of Space Weather Models at Community Coordinated Modeling Center  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate the deployment of advanced models in forecasting operations. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. CCMC has been leading recent comprehensive modeling challenges under the GEM, CEDAR and SHINE programs. The presentation will focus on experience in carrying out comprehensive and systematic validation of large sets of space weather models.

Kuznetsova, M. M.; Hesse, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Taktakishvili, A.; Chulaki, A.

2011-01-01

64

Turbulence Modeling Validation, Testing, and Development  

NASA Technical Reports Server (NTRS)

The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model developments. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

Bardina, J. E.; Huang, P. G.; Coakley, T. J.

1997-01-01

65

Validation of Computational Models in Biomechanics  

PubMed Central

The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

2010-01-01

66

Measuring avoidance of pain: validation of the Acceptance and Action Questionnaire II-pain version.  

PubMed

Psychometric research on widely used questionnaires aimed at measuring experiential avoidance of chronic pain has led to inconclusive results. The aim was to test the structural validity, internal consistency, and construct validity of a recently developed short questionnaire: the Acceptance and Action Questionnaire II-pain version (AAQ-II-P). The design was a cross-sectional validation study among 388 adult patients with chronic nonspecific musculoskeletal pain admitted for multidisciplinary pain rehabilitation in four tertiary rehabilitation centers in the Netherlands. Cronbach's α was calculated to analyze internal consistency. Principal component analysis was performed to analyze factor structure. Construct validity was analyzed by examining the association between acceptance of pain and measures of psychological flexibility (two scales and sum), pain catastrophizing (three scales and sum), and mental and physical functioning. Interpretation was based on a priori defined hypotheses. The compound of the seven items of the AAQ-II-P shows a Cronbach's α of 0.87. The single component explained 56.2% of the total variance. Correlations ranged from r=-0.21 to 0.73. Two of the predefined hypotheses were rejected and seven were not rejected. The AAQ-II-P measures a single component and has good internal consistency, and construct validity is not rejected. Thus, the construct validity of the AAQ-II-P sum scores as an indicator of experiential avoidance of pain was supported. PMID:24418966
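For reference, Cronbach's α of the kind reported in this abstract can be computed directly from an item-score matrix. The seven-item responses below are hypothetical, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical 7-item Likert responses (1-7) for a handful of respondents
scores = np.array([
    [5, 6, 5, 4, 6, 5, 5],
    [2, 3, 2, 3, 2, 2, 3],
    [6, 6, 7, 6, 5, 6, 6],
    [4, 4, 3, 4, 4, 5, 4],
    [3, 2, 3, 3, 2, 3, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```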

Reneman, Michiel F; Kleen, Marco; Trompetter, Hester R; Schiphorst Preuper, Henrica R; Köke, Albère; van Baalen, Bianca; Schreurs, Karlein M G

2014-06-01

67

New metrics for permafrost model validation  

NASA Astrophysics Data System (ADS)

Meteorological data from Arctic regions are historically scarce, due principally to their remote and inhospitable nature and, therefore, decreased human habitation compared with more temperate environments. Simulating the future climate of these regions has become a problem of significant importance, as recent projections indicate a high degree of sensitivity to forecasted increases in temperature, as well as the possibility of strong positive feedbacks to the climate system. For these climate projections to be properly constrained, they must be validated through comparison with relevant climate observables in a past time frame. Active layer thickness (ALT) has become a key descriptor of the state of permafrost, in both observation and simulation. As such, it is an ideal metric for model validation as well. A concerted effort to create a database of ALT measurements in Arctic regions culminated in the inception of the Circumpolar Active Layer Measurement (CALM) project over 20 years ago. This paper examines in detail the utility of Alaskan CALM data as a model validation tool. Derivation of ALT data from soil temperature stations and boreholes is also examined, as well as forced numerical modelling of soil temperatures by surface air temperature (SAT) and ground surface temperature (GST). Results indicate that existing individual or repeated borehole temperature logs are generally unsuitable for deriving ALT because of coarse vertical resolution and failure to capture the exact timing of maximum annual thaw. However, because of their systematic temporal resolution, and comparatively fine vertical resolution, daily soil temperature data compare favourably with the ALT measurements from CALM data. Numerical simulation of subsurface temperatures also agrees well with CALM data if forced by GST; results from SAT-forced simulations are less straightforward due to coupling processes, such as snow cover, that complicate heat conduction at the ground surface.
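A minimal sketch, with invented sensor depths and temperatures, of the kind of ALT derivation from daily soil temperature profiles that the abstract describes: interpolate each day's profile for the 0 °C crossing and take the annual maximum as the active layer thickness.

```python
import numpy as np

def thaw_depth(depths, temps):
    """Depth (m) of the 0 degC isotherm for one day's profile via linear interpolation.

    Assumes temperature decreases with depth near the frost table; returns 0 if the
    whole profile is frozen and the deepest sensor depth if none of it is.
    """
    if temps[0] <= 0.0:
        return 0.0
    for i in range(1, len(depths)):
        if temps[i] <= 0.0:
            frac = temps[i - 1] / (temps[i - 1] - temps[i])
            return depths[i - 1] + frac * (depths[i] - depths[i - 1])
    return depths[-1]

# Hypothetical sensor depths (m) and a few mid-summer daily profiles (degC)
depths = np.array([0.05, 0.20, 0.50, 0.80, 1.20])
daily_profiles = np.array([
    [8.1, 4.2, 1.1, -0.4, -1.9],
    [9.4, 5.0, 1.8, 0.2, -1.5],
    [7.2, 3.9, 0.9, -0.6, -2.1],
])

# Active layer thickness ~ maximum annual thaw depth
alt = max(thaw_depth(depths, p) for p in daily_profiles)
print(f"estimated active layer thickness: {alt:.2f} m")
```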

Stevens, M. B.; Beltrami, H.; Gonzalez-Rouco, J. F.

2012-04-01

68

Validity  

NSDL National Science Digital Library

In this chapter, the authors will describe the four types of validity: construct validity, content validity, concurrent validity, and predictive validity. Depending on the test and the rationale or purpose for its administration, an understanding of the

Christmann, Edwin P.; Badgett, John L.

2008-11-01

69

Plasma Reactor Modeling and Validation Experiments  

NASA Technical Reports Server (NTRS)

Plasma processing is a key processing step in integrated circuit manufacturing. Low-pressure, high-density plasma reactors are widely used for etching and deposition. The inductively coupled plasma (ICP) source has become popular recently in many processing applications. In order to accelerate equipment and process design, an understanding of the physics and chemistry, particularly plasma power coupling, plasma and processing uniformity, and mechanisms, is important. This understanding is facilitated by comprehensive modeling and simulation as well as plasma diagnostics to provide the necessary data for model validation, which are addressed in this presentation. We have developed a complete code for simulating an ICP reactor and the model consists of transport of electrons, ions, and neutrals, Poisson's equation, and Maxwell's equation along with gas flow and energy equations. Results will be presented for chlorine and fluorocarbon plasmas and compared with data from Langmuir probe, mass spectrometry and FTIR.

Meyyappan, M.; Bose, D.; Hash, D.; Hwang, H.; Cruden, B.; Sharma, S. P.; Rao, M. V. V. S.; Arnold, Jim (Technical Monitor)

2001-01-01

70

Model-Based Method for Sensor Validation  

NASA Technical Reports Server (NTRS)

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
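The abstract does not give the algorithm, but a toy sketch of analytical redundancy relations for sensor fault isolation, with an invented three-relation, four-sensor system, illustrates the basic idea: each relation should hold among healthy sensor readings, and the pattern of violated relations points at the suspect sensors.

```python
# Toy system: a redundant pair of temperature sensors (t1a, t1b), a second
# temperature sensor (t2) and a heat-flux sensor (q) across a conductive link
# with known conductance k, so that q = k * (t1 - t2). Values are hypothetical.
k = 2.0
readings = {"t1a": 20.1, "t1b": 20.0, "t2": 15.0, "q": 14.3}  # q should be ~10
threshold = 1.0

# Analytical redundancy relations: each residual should be ~0 for healthy sensors
residuals = {
    "r1 (t1a = t1b)":        readings["t1a"] - readings["t1b"],
    "r2 (q = k*(t1a - t2))": readings["q"] - k * (readings["t1a"] - readings["t2"]),
    "r3 (q = k*(t1b - t2))": readings["q"] - k * (readings["t1b"] - readings["t2"]),
}
fired = {name: abs(val) > threshold for name, val in residuals.items()}
for name, val in residuals.items():
    print(f"{name}: {val:+.2f} {'VIOLATED' if fired[name] else 'ok'}")

# Logical isolation: r2 and r3 fire while r1 does not, which is consistent
# with a fault in q or t2 but not in the redundant pair t1a/t1b.
if fired["r2 (q = k*(t1a - t2))"] and fired["r3 (q = k*(t1b - t2))"] and not fired["r1 (t1a = t1b)"]:
    print("suspect sensors: q or t2")
```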

Vatan, Farrokh

2012-01-01

71

Modeling Earth Dynamics: Complexity, Uncertainty, and Validation  

NASA Astrophysics Data System (ADS)

28th IUGG Conference on Mathematical Geophysics; Pisa, Italy, 7-11 June 2010; The capabilities and limits of mathematical models applied to a variety of geophysical processes were discussed during the 28th international Conference on Mathematical Geophysics, held in Italy (see the conference Web site (http://cmg2010.pi.ingv.it), which includes abstracts). The conference was organized by the International Union of Geodesy and Geophysics (IUGG) Commission on Mathematical Geophysics (CMG) and the Istituto Nazionale di Geofisica e Vulcanologia and was cosponsored by the U.S. National Science Foundation. The meeting was attended by more than 160 researchers from 26 countries and was dedicated to the theme “Modelling Earth Dynamics: Complexity, Uncertainty, and Validation.” Many talks were dedicated to illustration of the complexities affecting geophysical processes. Novel applications of geophysical fluid dynamics were presented, with specific reference to volcanological and ­subsurface/surface flow processes. In most cases, investigations highlighted the need for multidimensional and multiphase flow models able to describe the nonlinear effects associated with the nonhomogeneous nature of the matter. Fluid dynamic models of atmospheric, oceanic, and environmental systems also illustrated the fundamental role of nonlinear couplings between the different subsystems. Similarly, solid Earth models have made it possible to obtain the first tomographies of the planet; to formulate nonlocal and dynamic damage models of rocks; to investigate statistically the triggering, clustering, and synchronization of faults; and to develop realistic simulators of the planetary dynamo, plate tectonics, and gravity and magnetic fields.

Neri, A.

2010-12-01

72

Validation of HEDR models. Hanford Environmental Dose Reconstruction Project  

SciTech Connect

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

1994-05-01

73

Validation of the Korean version Moorehead-Ardelt quality of life questionnaire II  

PubMed Central

Purpose To investigate weight loss effects with higher sensitivity, disease-specific quality of life (QoL) instruments are important. The Moorehead-Ardelt quality of life questionnaire II (MA-II) is widely used because it is simple and has been validated in several languages. The aim of the present study was to translate the MA-II into Korean and to validate the translation against the EuroQol-5 dimension (EQ-5D), the obesity-related problems scale (OP-scale), and the impact of weight on quality of life-lite (IWQoL-Lite). Methods The study design was a multicenter, cross-sectional survey that included postoperative patients. The validation procedure consisted of a translation and back-translation procedure, a pilot study, and a field study. The instruments for measuring QoL included the MA-II, EQ-5D, OP-scale, and IWQoL-Lite. Reliability was checked through internal consistency using Cronbach alpha coefficients. Construct validity was assessed using the Spearman rank correlation between the 6 domains of the MA-II and the EQ-5D, OP-scale, and 5 domains of the IWQoL-Lite. Results The Cronbach alpha of the MA-II was 0.763, confirming internal consistency. The total score of the MA-II was significantly correlated with all other instruments: EQ-5D, OP-scale, and IWQoL-Lite. The IWQoL-Lite (ρ = 0.623, P < 0.001) showed the strongest correlation with the MA-II, followed by the OP-scale (ρ = 0.588, P < 0.001) and the EQ-5D (ρ = 0.378, P < 0.01). Conclusion The Korean version of the MA-II is a valid instrument for measuring obesity-specific QoL. The present study confirmed that the MA-II has good reliability and validity and is also simple to administer. Thus, the MA-II can provide a sensitive and accurate estimate of QoL in obesity patients. PMID:25368853
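Construct validity checks of this kind come down to rank correlations between instrument scores. A minimal sketch with synthetic scores (not the study's data), using scipy:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)

# Hypothetical total scores for 30 patients on the MA-II and one comparison
# instrument (values illustrative only; the study used EQ-5D, OP-scale, IWQoL-Lite).
ma_ii = rng.normal(1.0, 1.0, 30)
iwqol = 0.6 * ma_ii + rng.normal(0.0, 0.8, 30)

rho, p_value = spearmanr(ma_ii, iwqol)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```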

Lee, Yeon Ji; Song, Hyun Jin; Oh, Sung-Hee; Kwon, Jin Won; Moon, Kon-Hak; Park, Joong-Min; Lee, Sang Kuon

2014-01-01

74

Diurnal ocean surface layer model validation  

NASA Technical Reports Server (NTRS)

The diurnal ocean surface layer (DOSL) model at the Fleet Numerical Oceanography Center forecasts the 24-hour change in global sea surface temperature (SST). Validating the DOSL model is a difficult task due to the huge areas involved and the lack of in situ measurements. Therefore, this report details the use of satellite infrared multichannel SST imagery to provide day and night SSTs that can be directly compared to DOSL products. This water-vapor-corrected imagery has the advantages of high thermal sensitivity (0.12 C), large synoptic coverage (nearly 3000 km across), and high spatial resolution that enables diurnal heating events to be readily located and mapped. Several case studies in the subtropical North Atlantic readily show that DOSL results during extreme heating periods agree very well with satellite-imagery-derived values in terms of the pattern of diurnal warming. The low wind and cloud-free conditions necessary for these events to occur lend themselves well to observation via infrared imagery. Thus, the normally cloud-limited aspects of satellite imagery do not come into play for these particular environmental conditions. The fact that the DOSL model does well in extreme events is beneficial from the standpoint that these cases can be associated with the destruction of the surface acoustic duct. This so-called afternoon effect happens as the afternoon warming of the mixed layer disrupts the sound channel and the propagation of acoustic energy.

Hawkins, Jeffrey D.; May, Douglas A.; Abell, Fred, Jr.

1990-01-01

75

An upgraded track structure model: experimental validation.  

PubMed

The track nanodosemeter developed at the National Laboratories of Legnaro (LNL), Italy allows the direct investigation of the properties of particle tracks, by measuring ionisation-cluster-size distributions caused by ionising particles within a 'nanometre-sized' target volume while passing it at a well-specified impact parameter. To supplement the measurements, a dedicated Monte Carlo code was developed which is able to reproduce the general shape of measured cluster-size distributions with a satisfactory quality. To reduce the still existing quantitative differences between measured and simulated data, the validity of cross sections used in the Monte Carlo model was revisited again, taking into account the large amount of data available now from recent track structure measurements at LNL. Here, special emphasis was laid on a deeper and detailed investigation of the cross sections applied to calculate the energy of secondary electrons after impact ionisation of primary particles: the cross sections due to the HKS model and the so-called Rudd model. Representative results for 240 MeV (12)C-ions are presented. PMID:24327751

Grosswendt, B; Conte, V; Colautti, P

2014-10-01

76

Boron-10 Lined Proportional Counter Model Validation  

SciTech Connect

The decreasing supply of 3He is stimulating a search for alternative neutron detectors; one potential 3He replacement is 10B-lined proportional counters. Simulations are being performed to predict the performance of systems designed with 10B-lined tubes. Boron-10-lined tubes are challenging to model accurately because the neutron capture material is not the same as the signal generating material. Thus, to simulate the efficiency, the neutron capture reaction products that escape the lining and enter the signal generating fill gas must be tracked. The tube lining thickness and composition are typically proprietary vendor information, and therefore add additional variables to the system simulation. The modeling methodologies used to predict the neutron detection efficiency of 10B-lined proportional counters were validated by comparing simulated to measured results. The measurements were made with a 252Cf source positioned at several distances from a moderated 2.54-cm diameter 10B-lined tube. Models were constructed of the experimental configurations using the Monte Carlo transport code MCNPX, which is capable of tracking the reaction products from the (n,10B) reaction. Several different lining thicknesses and compositions were simulated for comparison with the measured data. This paper presents the results of the evaluation of the experimental and simulated data, and a summary of how the different linings affect the performance of a coincidence counter configuration designed with 10B-lined proportional counters.

Lintereur, Azaree T.; Ely, James H.; Kouzes, Richard T.; Rogers, Jeremy L.; Siciliano, Edward R.

2012-11-18

77

Outward Bound Outcome Model Validation and Multilevel Modeling  

ERIC Educational Resources Information Center

This study was intended to measure construct validity for the Outward Bound Outcomes Instrument (OBOI) and to predict outcome achievement from individual characteristics and course attributes using multilevel modeling. A sample of 2,340 participants was collected by Outward Bound USA between May and September 2009 using the OBOI. Two phases of…

Luo, Yuan-Chun

2011-01-01

78

Modeling distributed hybrid systems in Ptolemy II  

Microsoft Academic Search

We present Ptolemy II as a modeling and simulation environment for distributed hybrid systems. In Ptolemy II, a distributed hybrid system is specified as a hierarchy of models: an event-based top level and distributed islands of hybrid systems. Each hybrid system is in turn a hierarchy of continuous-time models and finite state machines. A variety of models of computation was

Jie Liu; Xiaojun Liu; Edward A. Lee

2001-01-01

79

Validating agent based models through virtual worlds.  

SciTech Connect

As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior. Results from our work indicate that virtual worlds have the potential for serving as a proxy in allocating and populating behaviors that would be used within further agent-based modeling studies.

Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina [Sandia National Laboratories, Livermore, CA]; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E. [North Carolina State University, Raleigh, NC]; Bernstein, Jeremy Ray Rhythm [Gaikai, Inc., Aliso Viejo, CA]

2014-01-01

80

Model Based Test Generation for Microprocessor Architecture Validation  

E-print Network

Model Based Test Generation for Microprocessor Architecture Validation. Sreekumar V. Kodakara. Abstract: Functional validation of microprocessors is growing in complexity in current and future microprocessors. Traditionally, the different components (or validation collaterals) used in simulation based

Minnesota, University of

81

External validation of a Cox prognostic model: principles and methods  

PubMed Central

Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
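A minimal sketch of the discrimination part of external validation, using synthetic derivation and validation samples and the lifelines library (one possible toolchain, not the paper's): fit a Cox model on the derivation data, compute the prognostic index on the independent validation data, and report Harrell's C-index there.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(3)

def make_sample(n):
    """Synthetic survival data with one prognostic covariate (illustrative only)."""
    x = rng.normal(size=n)
    time = rng.exponential(scale=np.exp(-0.7 * x))      # higher x -> shorter survival
    censor = rng.exponential(scale=2.0, size=n)
    return pd.DataFrame({
        "x": x,
        "time": np.minimum(time, censor),
        "event": (time <= censor).astype(int),
    })

derivation, validation = make_sample(300), make_sample(200)

cph = CoxPHFitter()
cph.fit(derivation, duration_col="time", event_col="event")

# Level-1 information: the prognostic index (linear predictor) in the validation sample
pi = cph.predict_log_partial_hazard(validation[["x"]])

# Discrimination: Harrell's C-index (higher PI means higher risk, hence the minus sign)
c_index = concordance_index(validation["time"], -pi, validation["event"])
print(f"validation C-index: {c_index:.3f}")
```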

2013-01-01

82

Concurrent and Predictive Validity of the Phelps Kindergarten Readiness Scale-II  

ERIC Educational Resources Information Center

The purpose of this research was to establish the concurrent and predictive validity of the Phelps Kindergarten Readiness Scale, Second Edition (PKRS-II; L. Phelps, 2003). Seventy-four kindergarten students of diverse ethnic backgrounds enrolled in a northeastern suburban school participated in the study. The concurrent administration of the…

Duncan, Jennifer; Rafter, Erin M.

2005-01-01

83

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

Microsoft Academic Search

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable though the comparisons have been greatly weighted toward measurements made at mid-latitudes.

L. W. Thomason; L. R. Poole; C. E. Randall

2007-01-01

84

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

Microsoft Academic Search

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable though the comparisons have been greatly weighted toward measurements made at mid-latitudes.

L. W. Thomason; L. R. Poole; C. E. Randall

2006-01-01

85

The Risk Map: A New Tool for Validating Risk Models  

E-print Network

The Risk Map: A New Tool for Validating Risk Models. Gilbert Colletaz, Christophe Hurlin, Christophe Pérignon, October 2012. Abstract: This paper presents a new method to validate risk models: the Risk Map, which … information about the performance of a risk model. It relies on the concept of a super exception, which is defined

Paris-Sud XI, Université de

86

What do we mean by validating a prognostic model?  

Microsoft Academic Search

SUMMARY Prognostic models are used in medicine for investigating patient outcome in relation to patient and disease characteristics. Such models do not always work well in practice, so it is widely recommended that they need to be validated. The idea of validating a prognostic model is generally taken to mean establishing that it works satisfactorily for patients other than those

Douglas G. Altman; Patrick Royston

2000-01-01

87

Narrowband to broadband conversions of land surface albedo: II. Validation  

E-print Network

…T. Daughtry, Raymond Hunt Jr. (Laboratory for Global Remote Sensing Studies, 2181 LeFrak Hall, Department of Geography, University of Maryland, College Park, MD 20742, USA; Hydrology and Remote Sensing …) … that surface albedo is among the main radiative uncertainties in current climate modeling. Remote sensing

Liang, Shunlin

88

DEVELOPMENT OF THE MESOPUFF II DISPERSION MODEL  

EPA Science Inventory

The development of the MESOPUFF II regional-scale air quality model is described. MESOPUFF II is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion and removal of air pollutants from multiple point and area sources at transpor...

89

Experimental validation of new software technology  

E-print Network

When to apply a new technology in an organization is a critical decision for every software development organization. Earlier work defines a set of methods that the research community uses when a new technology

Zelkowitz, Marvin V.

90

Techniques and Issues in Agent-Based Modeling Validation  

SciTech Connect

Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABM has many differences in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

Pullum, Laura L [ORNL]; Cui, Xiaohui [New York Institute of Technology (NYIT)]

2012-01-01

91

Brief Report: Construct Validity of Two Identity Status Measures: The EIPQ and the EOM-EIS-II  

ERIC Educational Resources Information Center

The present study was designed to examine construct validity of two identity status measures, the Ego Identity Process Questionnaire (EIPQ; J. Adolescence 18 (1995) 179) and the Extended Objective Measure of Ego Identity Status II (EOM-EIS-II; J. Adolescent Res. 1 (1986) 183). Construct validity was operationalized in terms of how identity status…

Schwartz, Seth J.

2004-01-01

92

Parameterisation, calibration and validation of distributed hydrological models  

NASA Astrophysics Data System (ADS)

This paper emphasizes the different requirements for calibration and validation of lumped and distributed models. On the basis of a theoretically founded modelling protocol, the different steps in distributed hydrological modelling are illustrated through a case study based on the MIKE SHE code and the 440 km² Karup catchment in Denmark. The importance of a rigorous and purposeful parameterisation is emphasized in order to get as few "free" parameters as possible for which assessments through calibration are required. Calibration and validation using a split-sample procedure were carried out for catchment discharge and piezometric heads at seven selected observation wells. The validated model was then used for two further validation tests. Firstly, model simulations were compared with observations from three additional discharge sites and four additional wells located within the catchment. This internal validation showed significantly poorer results compared to the calibration/validation sites. Secondly, the validated model based on a 500 m model grid was used to generate three additional models with 1000 m, 2000 m and 4000 m grids through interpolation of model parameters. The results from the multi-scale validation suggested that a maximum grid size of 1000 m should be used for simulations of discharge and ground-water heads, while the results deteriorated with coarser model grids.
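
A minimal Python sketch of the split-sample idea (calibrate on one period, validate on another, score both with the Nash-Sutcliffe efficiency); the single linear-reservoir model and the synthetic rainfall and discharge series are illustrative stand-ins, not MIKE SHE or the Karup data:

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, size=730)            # two years of synthetic daily rainfall

def simulate(k, rain):
    # Single linear reservoir: storage s fills with rain, discharges q = k * s.
    s, q = 0.0, np.empty_like(rain)
    for i, p in enumerate(rain):
        s += p
        q[i] = k * s
        s -= q[i]
    return q

q_obs = simulate(0.3, rain) + rng.normal(0.0, 0.2, rain.size)   # synthetic "observations"

def nse(sim, obs):
    # Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the observed mean.
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

cal, val = slice(0, 365), slice(365, 730)        # split-sample periods
fit = minimize_scalar(lambda k: -nse(simulate(k, rain[cal]), q_obs[cal]),
                      bounds=(0.01, 0.99), method="bounded")
k_cal = fit.x
print("NSE calibration:", round(nse(simulate(k_cal, rain[cal]), q_obs[cal]), 3))
print("NSE validation :", round(nse(simulate(k_cal, rain[val]), q_obs[val]), 3))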

Refsgaard, Jens Christian

1997-11-01

93

Validation status of the TARDEC visual model (TVM)  

NASA Astrophysics Data System (ADS)

An extensive effort is ongoing to validate the TARDEC visual model (TVM). This paper describes in detail some recent efforts to utilize the model for dual need commercial and military target acquisition applications. The recent completion of a visual perception laboratory within TARDEC is a useful tool to calibrate and validate human performance models for specific visual tasks. Some validation examples will be given for low contrast targets along with a description of the TVM and perception laboratory capabilities.

Gerhart, Grant R.; Goetz, Richard C.; Meitzler, Thomas J.; Karlsen, Robert E.

1996-06-01

94

Statistical Validation of Normal Tissue Complication Probability Models  

SciTech Connect

Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
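
A hedged Python sketch of the validation strategy described: an L1-penalised (LASSO-type) logistic NTCP model assessed with nested (double) cross-validation and a permutation test using scikit-learn; predictors and outcomes are synthetic stand-ins for dose-volume data:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score, permutation_test_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 20))                                          # candidate predictors
y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=200) > 0).astype(int)   # complication yes/no

# Inner loop: tune the L1 penalty (LASSO-type selection of predictors).
lasso_ntcp = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000),
    param_grid={"C": np.logspace(-2, 2, 9)}, scoring="roc_auc", cv=5)

# Outer loop of the double cross-validation: a less biased performance estimate.
outer_auc = cross_val_score(lasso_ntcp, X, y, scoring="roc_auc", cv=5)
print("nested-CV AUC: %.2f +/- %.2f" % (outer_auc.mean(), outer_auc.std()))

# Permutation test: is the observed AUC better than chance?
score, _, pvalue = permutation_test_score(lasso_ntcp, X, y, scoring="roc_auc",
                                          cv=5, n_permutations=200, random_state=0)
print("AUC = %.2f, permutation p-value = %.3f" % (score, pvalue))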

Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]; Schaaf, Arjen van der; Veld, Aart A. van't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]; Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)]

2012-09-01

95

Validation of Arabic and English Versions of the ARSMA-II Acculturation Rating Scale.  

PubMed

To translate and adapt the Acculturation Rating Scale of Mexican-Americans II (ARSMA-II) for Arab Americans. A multistage translation process followed by a pilot and a large study. The translated and adapted versions, Acculturation Rating Scale for Arabic Americans-II Arabic and English (ARSAA-IIA, ARSAA-IIE), were validated in a sample of 297 Arab Americans. Factor analyses with principal axis factoring extractions and direct oblimin rotations were used to identify the underlying structure of ARSAA-II. Factor analysis confirmed the underlying structure of ARSAA-II and produced two interpretable factors labeled as 'Attraction to American Culture' (AAmC) and 'Attraction to Arabic Culture' (AArC). The Cronbach's alphas of AAmC and AArC were .89 and .85 respectively. Findings support ARSAA-II A & E to assess acculturation among Arab Americans. The emergent factors of ARSAA-II support the theoretical structure of the original ARSMA-II tool and show high internal consistency. PMID:23934518
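
A hedged Python sketch of the reported analyses, assuming the third-party factor_analyzer package for principal-axis factoring with a direct oblimin rotation and a hand-rolled Cronbach's alpha; the simulated item responses are illustrative only:

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer   # third-party package, an assumption

rng = np.random.default_rng(3)
n = 297
f_am, f_ar = rng.normal(size=n), rng.normal(size=n)              # two latent factors
items = np.column_stack([f_am + rng.normal(0, 0.6, n) for _ in range(6)] +
                        [f_ar + rng.normal(0, 0.6, n) for _ in range(6)])
df = pd.DataFrame(items, columns=[f"item{i + 1}" for i in range(12)])

# Principal-axis factoring with a direct oblimin (oblique) rotation.
fa = FactorAnalyzer(n_factors=2, method="principal", rotation="oblimin")
fa.fit(df)
print(pd.DataFrame(fa.loadings_, index=df.columns))              # pattern matrix

def cronbach_alpha(scale):
    # Classical alpha: k/(k-1) * (1 - sum of item variances / variance of the total score).
    scale = np.asarray(scale)
    k = scale.shape[1]
    return k / (k - 1) * (1 - scale.var(axis=0, ddof=1).sum() / scale.sum(axis=1).var(ddof=1))

print("alpha, items 1-6 :", round(cronbach_alpha(df.iloc[:, :6]), 2))
print("alpha, items 7-12:", round(cronbach_alpha(df.iloc[:, 6:]), 2))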

Jadalla, Ahlam; Lee, Jerry

2015-02-01

96

Open-source MFIX-DEM software for gas-solids flows: Part II - Validation studies  

Microsoft Academic Search

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate

Tingwen Li; Rahul Garg; Janine Galvin; Sreekanth Pannala

2012-01-01

97

SURVEY, ANALYSIS AND VALIDATION OF INFORMATION FOR BUSINESS PROCESS MODELING  

E-print Network

... A business process represents the organization's way of work, horizontally, allowing an analysis ... Keywords: Business Processes, Informational Resources, Activities, UML. Business processes modeling became

98

Cost model validation: a technical and cultural approach  

NASA Technical Reports Server (NTRS)

This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

2001-01-01

99

Validation of the Beck Depression Inventory-II in a Low-Income African American Sample of Medical Outpatients  

ERIC Educational Resources Information Center

The psychometric properties of the Beck Depression Inventory-II (BDI-II) are well established with primarily Caucasian samples. However, little is known about its reliability and validity with minority groups. This study evaluated the psychometric properties of the BDI-II in a sample of low-income African American medical outpatients (N = 220).…

Grothe, Karen B.; Dutton, Gareth R.; Jones, Glenn N.; Bodenlos, Jamie; Ancona, Martin; Brantley, Phillip J.

2005-01-01

100

Validity of the Autism/Pervasive Developmental Disorder Subscale of the Diagnostic Assessment for the Severely Handicapped-II.  

ERIC Educational Resources Information Center

An evaluation was made of the empirical validity of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II) with 51 individuals with severe/profound mental retardation, some of whom were also diagnosed with autism, and controls. The Autism/Pervasive Developmental Disorder subscale of the DASH-II was internally consistent and just as…

Matson, Johnny L.; Smiroldo, Brandi B.; Hastings, Theresa L.

1998-01-01

101

Computational Model Verification and Validation in Structural Mechanics  

Microsoft Academic Search

Goal-oriented error control for model verification combined with model validation in Computational Mechanics, here for the Finite Element Method, is presented regarding the safety and reliability requirements of the ASME V&V 10-2006 Guide for Verification and Validation in Computational Solid Mechanics, as well as efficiency aspects. In particular, model adaptivity with load- and process-depending applications of adequate mathematical models and

E. Stein; M. Rüter; S. Ohnimus

102

Validating Solar and Heliospheric Models at the CCMC  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) hosts a growing number of models of the ambient and transient corona and heliosphere which are ultimately intended for use in space weather forecasting. Independent validation of these models is a critical step in their development as potential forecasting tools for the space weather operations community. In this poster we report on validation studies of these models, all of which are also available for use by the research community through our runs-on-request system.

MacNeice, P. J.; Taktakishvilli, A.; Hesse, M.; Kuznetsova, M. M.

2009-01-01

103

The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters  

E-print Network

We validate the performance and accuracy of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) by comparing derived overall metallicities and radial velocities from selected likely members of three globular clusters (M 13, M 15, and M 2) and two open clusters (NGC 2420 and M 67) to the literature values. Spectroscopic and photometric data obtained during the course of the original Sloan Digital Sky Survey (SDSS-I) and its first extension (SDSS-II/SEGUE) are used to determine stellar radial velocities and atmospheric parameter estimates for stars in these clusters. Based on the scatter in the metallicities derived for the members of each cluster, we quantify the typical uncertainty of the SSPP values, sigma([Fe/H]) = 0.13 dex for stars in the range of 4500 K < Teff < 7500 K and 2.0 < log g < 5.0, at least over the metallicity interval spanned by the clusters studied (-2.3 < [Fe/H] < 0). The surface gravities and effective temperatures derived by the SSPP are also compared with those estimated from the comparison of the color-magnitude diagrams with stellar evolution models; we find satisfactory agreement. At present, the SSPP underestimates [Fe/H] for near-solar-metallicity stars, represented by members of M 67 in this study, by about 0.3 dex.

Y. S. Lee; T. C. Beers; T. Sivarani; J. A. Johnson; D. An; R. Wilhelm; C. Allende Prieto; L. Koesterke; P. Re Fiorentin; C. A. L. Bailer-Jones; J. E. Norris; B. Yanny; C. M. Rockosi; H. J. Newberg; K. M. Cudworth; K. Pan

2007-10-31

104

Reliability and validity of the test of gross motor development-II in Korean preschool children: applying AHP.  

PubMed

The Test of Gross Motor Development-II (TGMD-II) is a frequently used assessment tool for measuring motor ability. The purpose of this study is to investigate the reliability and validity of TGMD-II's weighting scores (by comparing pre-weighted TGMD-II scores with post ones) as well as examine applicability of the TGMD-II on Korean preschool children. A total of 121 Korean children (three kindergartens) participated in this study. There were 65 preschoolers who were 5-years-old (37 boys and 28 girls) and 56 preschoolers who were 6-years-old (34 boys and 22 girls). For internal consistency, reliability, and construct validity, only one researcher evaluated all of the children using the TGMD-II in the following areas: running; galloping; sliding; hopping; leaping; horizontal jumping; overhand throwing; underhand rolling; striking a stationary ball; stationary dribbling; kicking; and catching. For concurrent validity, the evaluator measured physical fitness (strength, flexibility, power, agility, endurance, and balance). The key findings were as follows: first, the reliability coefficient and the validity coefficient between pre-weighted and post-weighted TGMD-II scores were quite similar. Second, the research showed adequate reliability and validity of the TGMD-II for Korean preschool children. The TGMD-II is a proper instrument to test Korean children's motor development. Yet, applying relative weighting on the TGMD-II should be a point of consideration. PMID:24529860

Kim, Chung-Il; Han, Dong-Wook; Park, Il-Hyeok

2014-04-01

105

From Model to Assessment: Validating "Information Power."  

ERIC Educational Resources Information Center

Describes the development of an instrument for evaluating building-level school library media programs based on "Information Power," the American Association of School Libraries' guidelines for such programs. Data collection at an Oklahoma teleconference is described, the use of factor analysis is explained, construct validity is discussed, and…

Latrobe, Kathy; Swisher, Robert

1990-01-01

106

A platform for validation of FACTS models  

Microsoft Academic Search

The paper presents a platform system for the incorporation of flexible ac transmission systems (FACTS) devices. The platform permits detailed electromagnetic transients simulation as it is of manageable size. It manifests some of the common problems for which FACTS devices are used such as congestion management, stability improvement, and voltage support. The platform can be valuable for the validation of

Shan Jiang; U. D. Annakkage; A. M. Gole

2006-01-01

107

Uncertainty and validation of health economic decision models.  

PubMed

Health economic decision models are based on specific assumptions relating to model structure and parameter estimation. Validation of these models is recommended as an indicator of reliability, but is not commonly reported. Furthermore, models derived from different data and employing different assumptions may produce a variety of results. A Markov model for evaluating the long-term cost-effectiveness of screening for abdominal aortic aneurysm is described. Internal, prospective and external validations are carried out using individual participant data from two randomised trials. Validation is assessed in terms of total numbers and timings of key events, and total costs and life-years. Since the initial model validates well only internally, two further models are developed that better fit the prospective and external validation data. All three models are then extrapolated to a life-time horizon, producing cost-effectiveness estimates ranging from £1600 to £4200 per life-year gained. Parameter uncertainty is now commonly addressed in health economic decision modelling. However, the derivation of models from different data sources adds another level of uncertainty. This extra uncertainty should be recognised in practical decision-making and, where possible, specifically investigated through independent model validation. PMID:19206080
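
A minimal Python sketch of a Markov cohort model of this general kind, run to a lifetime horizon with discounting to give an incremental cost per life-year gained; all states, transition probabilities and costs below are invented placeholders, not the published model:

import numpy as np

def run_strategy(p_rupture_death, upfront_cost):
    # Annual transition matrix over three states: no AAA, AAA present, dead.
    P = np.array([[0.96, 0.02, 0.02],
                  [0.00, 0.98 - p_rupture_death, 0.02 + p_rupture_death],
                  [0.00, 0.00, 1.00]])
    cohort = np.array([1.0, 0.0, 0.0])            # the whole cohort starts AAA-free
    cost, life_years, disc = upfront_cost, 0.0, 0.035
    for year in range(60):                        # life-time horizon
        d = 1.0 / (1.0 + disc) ** year
        life_years += d * cohort[:2].sum()        # alive states accrue a discounted life-year
        cost += d * cohort[1] * 300.0             # surveillance cost while an AAA is present
        cohort = cohort @ P
    return cost, life_years

# Invented inputs: screening makes rupture deaths rarer but adds an upfront cost.
c_scr, ly_scr = run_strategy(p_rupture_death=0.005, upfront_cost=50.0)
c_no, ly_no = run_strategy(p_rupture_death=0.020, upfront_cost=0.0)
print("incremental cost per life-year gained:", round((c_scr - c_no) / (ly_scr - ly_no)))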

Kim, Lois G; Thompson, Simon G

2010-01-01

108

Validation of SAGE II aerosol measurements by comparison with correlative sensors  

Microsoft Academic Search

The SAGE II limb-scanning radiometer carried on the Earth Radiation Budget Satellite functions at wavelengths of 0.385, 0.45, 0.525, and 1.02 microns to identify vertical profiles of aerosol density by atmospheric extinction measurements from cloud tops upward. The data are being validated by correlating the satellite data with data gathered with, e.g., lidar, sunphotometer, and dustsonde instruments. Work thus far

T. J. Swissler

1986-01-01

109

Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis  

NASA Astrophysics Data System (ADS)

Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
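
A minimal Python sketch of the probabilistic step described, propagating the most uncertain inputs through a simple screening-level attenuation relation by Monte Carlo sampling; the parameter distributions and the attenuation relation itself are illustrative assumptions, not the 3-D model:

import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Most uncertain inputs, sampled from illustrative lognormal distributions.
c_source = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)     # source vapor conc., ug/m3
alpha = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n)         # attenuation factor
air_exchange = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)   # building air exchange, 1/h

# Screening-level relation: indoor concentration scales with the attenuation factor
# and inversely with the air-exchange rate (normalised to a 0.5 1/h reference).
c_indoor = alpha * c_source * (0.5 / air_exchange)

for q in (50, 90, 95, 99):
    print(f"P{q} indoor air concentration: {np.percentile(c_indoor, q):.3f} ug/m3")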

Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

2013-12-01

110

Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised  

NASA Technical Reports Server (NTRS)

Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

Lim, K. B.; Giesy, D. P.

2000-01-01

111

Validation of Numerical Shallow Water Models for Tidal Lagoons  

SciTech Connect

An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.

Eliason, D.; Bourgeois, A.

1999-11-01

112

Validation of an Evaluation Model for Learning Management Systems  

ERIC Educational Resources Information Center

This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principle factors, was…

Kim, S. W.; Lee, M. G.

2008-01-01

113

VALIDATION METHODS FOR CHEMICAL EXPOSURE AND HAZARD ASSESSMENT MODELS  

EPA Science Inventory

Mathematical models and computer simulation codes designed to aid in hazard assessment for environmental protection must be verified and validated before they can be used with confidence in a decision-making or priority-setting context. Operational validation, or full-scale testi...

114

Model validation for control and controller validation in a prediction error identification framework - Part I: theory  

Microsoft Academic Search

We propose a model validation procedure that consists of a prediction error identification experiment with a full order model. It delivers a parametric uncertainty ellipsoid and a corresponding set of parametrized transfer functions, which we call PE (for Prediction Error) uncertainty set. Such uncertainty set differs from the classical uncertainty descriptions used in robust control analysis and design.
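
A minimal Python sketch of the underlying idea of a PE uncertainty set: least-squares (prediction error) identification of a simple FIR model, with the estimated parameter covariance defining a confidence ellipsoid around the identified parameters; the model structure and noise level are invented for illustration:

import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)
N, theta_true = 500, np.array([0.8, -0.3])
u = rng.normal(size=N)                                # input signal
Phi = np.column_stack([u, np.r_[0.0, u[:-1]]])        # FIR regressors u(t), u(t-1)
y = Phi @ theta_true + 0.1 * rng.normal(size=N)       # noisy measured output

# Prediction error (least squares) estimate and its covariance.
theta_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
sigma2 = np.sum((y - Phi @ theta_hat) ** 2) / (N - 2)
P = sigma2 * np.linalg.inv(Phi.T @ Phi)

# 95% parametric uncertainty ellipsoid: (theta - theta_hat)' P^-1 (theta - theta_hat) <= chi2(0.95, 2).
level = chi2.ppf(0.95, df=2)
d = theta_true - theta_hat
print("theta_hat:", theta_hat)
print("true parameters inside the 95% ellipsoid:", bool(d @ np.linalg.inv(P) @ d <= level))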

Michel Gevers; Xavier Bombois; Brian D. O. Anderson; X. J. A. Bombois

115

EXODUS II: A finite element data model  

Microsoft Academic Search

EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise

Larry A. Schoof; Victor R. Yarberry

1994-01-01

116

Considerations for the validation of species-habitat models  

Microsoft Academic Search

The multitude of approaches to wildlife-habitat modeling reflects the broad objectives and goals of various research, management, and conservation programs. Validating models is an often overlooked component of using models effectively and confidently to achieve the desired objectives. Statistical models that attempt to predict the presence or absence of a species are often developed with logistic regression. In this paper,

Jennifer M. Psyllakis; Michael P. Gillingham

117

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

NASA Astrophysics Data System (ADS)

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020 nm extinction ratio shows a consistent bias of ~30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent though the comparisons show a much higher variability and larger bias than SAGE II/III comparisons. In addition, we find that SAGE II and POAM III data sets are not well correlated at and below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust and we find no evidence of a significant defect within the Arctic vortex.

Thomason, L. W.; Poole, L. R.; Randall, C. E.

2007-03-01

118

Using virtual reality to validate system models  

SciTech Connect

To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

Winter, V.L.; Caudell, T.P.

1999-12-09

119

ECOLOGICAL MODEL TESTING: VERIFICATION, VALIDATION OR NEITHER?  

EPA Science Inventory

Consider the need to make a management decision about a declining animal population. Two models are available to help. Before a decision is made based on model results, the astute manager or policy maker may ask, "Do the models work?" Or, "Have the models been verified or validat...

120

Particulate dispersion apparatus for the validation of plume models  

E-print Network

The purpose of this thesis is to document design, development, and fabrication of a transportable source of dry aerosol to improve testing and validation of atmospheric plume models. The proposed dispersion apparatus is intended to complement...

Bala, William D

2012-06-07

121

On the problem of model validation for predictive exposure assessments  

Microsoft Academic Search

The development and use of models for predicting exposures are increasingly common and are essential for many risk assessments of the United States Environmental Protection Agency (EPA). Exposure assessments conducted by the EPA to assist regulatory or policy decisions are often challenged to demonstrate their “scientific validity”. Model validation has thus inevitably become a major concern of both EPA officials

M. B. Beck; J. R. Ravetz; L. A. Mulkey; T. O. Barnwell

1997-01-01

122

Validation of the Archimedes Diabetes Model  

Microsoft Academic Search

controlled trials by repeating in the model the steps taken for the real trials and comparing the results calculated by the model with the results of the trial. Eighteen trials were chosen by an independent advisory committee. Half the trials had been used to help build the model

DAVID M. EDDY; LEONARD SCHLESSINGER

123

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

NASA Astrophysics Data System (ADS)

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020 nm extinction ratio shows a consistent bias of ~30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent though the comparisons show a much higher variability and larger bias than SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust and we find no evidence of a significant defect within the Arctic vortex.

Thomason, L. W.; Poole, L. R.; Randall, C. E.

2006-11-01

124

Atmospheric forcing validation for modeling the central Arctic  

Microsoft Academic Search

We compare daily data from the National Center for Atmospheric Research and National Centers for Environmental Prediction "Reanalysis 1" project with observational data obtained from the North Pole drifting stations in order to validate the atmospheric forcing data used in coupled ice-ocean models. This analysis is conducted to assess the role of errors associated with model forcing before performing model

A. Makshtas; D. Atkinson; M. Kulakov; S. Shutilin; R. Krishfield; A. Proshutinsky

2007-01-01

125

IC immunity modeling process validation using on-chip measurements  

E-print Network

Developing integrated circuit (IC) immunity models ... susceptibility tests before fabrication and avoid redesign cost. This paper presents an IC immunity modeling

Paris-Sud XI, Université de

126

Validation of Model Forecasts of the Ambient Solar Wind  

NASA Technical Reports Server (NTRS)

Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

2009-01-01

127

VALIDATION OF EROSION MODELING: PHYSICAL AND NUMERICAL  

E-print Network

The overall intent of this research is to develop numerical models of erosion of levees, dams ... a geotechnical centrifuge. The erosion is modeled in detail, from beginning to end, that is from the time

Varela, Carlos

128

Validation of Erosion Modeling: Physical and Numerical  

E-print Network

The overall intent of this research is to develop numerical models of erosion of levees, dams ... a geotechnical centrifuge. The erosion is modeled in detail, from beginning to end, that is from the time

Franklin, W. Randolph

129

End-to-end modelling of He II flow systems  

NASA Technical Reports Server (NTRS)

A practical computer code has been developed which uses the accepted two-fluid model to simulate He II flow in complicated systems. The full set of equations are used, retaining the coupling between the pressure, temperature and velocity fields. This permits modeling He II flow over the full range of conditions, from strongly or weakly driven flow through large pipes, narrow channels and porous media. The system may include most of the components used in modern superfluid flow systems: non-ideal thermomechanical pumps, tapered sections, constrictions, lines with heated side walls and heat exchangers. The model is validated by comparison with published experimental data. It is applied to a complex system to show some of the non-intuitive feedback effects that can occur. This code is ready to be used as a design tool for practical applications of He II. It can also be used for the design of He II experiments and as a tool for comparison of experimental data with the standard two-fluid model.

Mord, A. J.; Snyder, H. A.; Newell, D. A.

1992-01-01

130

Validation of the NATO-standard ship signature model (SHIPIR)  

NASA Astrophysics Data System (ADS)

An integrated naval infrared target, threat and countermeasure simulator (SHIPIR/NTCS) has been developed. The SHIPIR component of the model has been adopted by both NATO and the US Navy as a common tool for predicting the infrared (IR) signature of naval ships in their background. The US Navy has taken a lead role in further developing and validating SHIPIR for use in the Twenty-First Century Destroyer (DD-21) program. As a result, the US Naval Research Laboratory (NRL) has performed an in-depth validation of SHIPIR. This paper presents an overview of SHIPIR, the model validation methodology developed by NRL, and the results of the NRL validation study. The validation consists of three parts: a review of existing validation information, the design, execution, and analysis of a new panel test experiment, and the comparison of experiment with predictions from the latest version of SHIPIR (v2.5). The results show high levels of accuracy in the radiometric components of the model under clear-sky conditions, but indicate the need for more detailed measurement of solar irradiance and cloud model data for input to the heat transfer and in-band sky radiance sub-models, respectively.

Vaitekunas, David A.; Fraedrich, Douglas S.

1999-07-01

131

Design and validation of a multiphase 3D model to simulate tropospheric pollution.  

PubMed

This work presents the Transport and Chemical Aerosol Model (TCAM) formulation and its validation in the frame of CityDelta-CAFE project. TCAM is a 3D eulerian multiphase model simulating tropospheric secondary pollution at mesoscale. It is included in the GAMES (Gas Aerosol Modelling Evaluation System) modelling system, designed to support the analysis of secondary pollution dynamics and to assess the impact of emission control strategies. The presented validation assessment has been performed in the frame of the CityDelta II project over the Milan domain and concerns both gas and aerosol 1999 simulations. Ozone, nitrogen oxides and aerosol computed and observed patterns have been compared and analysed by means of statistical indicators showing high model performances for both winter and summer pollution regimes. PMID:17963821

Carnevale, Claudio; Decanini, Edoardo; Volta, Marialuisa

2008-02-01

132

Translation, adaptation and validation of a Portuguese version of the Moorehead-Ardelt Quality of Life Questionnaire II.  

PubMed

The prevalence of obesity has increased worldwide. An assessment of the impact of obesity on health-related quality of life (HRQoL) requires specific instruments. The Moorehead-Ardelt Quality of Life Questionnaire II (MA-II) is a widely used instrument to assess HRQoL in morbidly obese patients. The objective of this study was to translate and validate a Portuguese version of the MA-II. The study included forward and backward translations of the original MA-II. The reliability of the Portuguese MA-II was estimated using the internal consistency and test-retest methods. For validation purposes, the Spearman's rank correlation coefficient was used to evaluate the correlation between the Portuguese MA-II and the Portuguese versions of two other questionnaires, the 36-item Short Form Health Survey (SF-36) and the Impact of Weight on Quality of Life-Lite (IWQOL-Lite). One hundred and fifty morbidly obese patients were randomly assigned to test the reliability and validity of the Portuguese MA-II. Good internal consistency was demonstrated by a Cronbach's alpha coefficient of 0.80, and a very good agreement in terms of test-retest reliability was recorded, with an overall intraclass correlation coefficient (ICC) of 0.88. The total sums of MA-II scores and each item of MA-II were significantly correlated with all domains of SF-36 and IWQOL-Lite. A statistically significant negative correlation was found between the MA-II total score and BMI. Moreover, age, gender and surgical status were independent predictors of MA-II total score. A reliable and valid Portuguese version of the MA-II was produced, thus enabling the routine use of MA-II in the morbidly obese Portuguese population. PMID:24817428

Maciel, João; Infante, Paulo; Ribeiro, Susana; Ferreira, André; Silva, Artur C; Caravana, Jorge; Carvalho, Manuel G

2014-11-01

133

Validation of the Poisson Stochastic Radiative Transfer Model  

NASA Technical Reports Server (NTRS)

A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of the aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.
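
A minimal Python sketch of the matching step described, choosing the cloud aspect ratio so that a forward model reproduces the measured direct solar radiation; the toy transmittance relation and the measured value are invented placeholders for the real radiative transfer calculation:

import numpy as np
from scipy.optimize import brentq

def direct_transmittance(aspect_ratio, cloud_fraction=0.4, tau=8.0):
    # Toy broken-cloud relation: the direct beam mostly passes through gaps,
    # and taller clouds (larger aspect ratio) shade more of the sky.
    gap_probability = (1.0 - cloud_fraction) ** (1.0 + aspect_ratio)
    return gap_probability + cloud_fraction * np.exp(-tau)

measured = 0.35                                   # hypothetical direct-beam measurement
aspect_ratio = brentq(lambda ar: direct_transmittance(ar) - measured, 0.05, 5.0)
print("aspect ratio matching the direct solar radiation:", round(aspect_ratio, 3))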

Zhuravleva, Tatiana; Marshak, Alexander

2004-01-01

134

SWAT: Model use, calibration, and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

SWAT (Soil and Water Assessment Tool) is a comprehensive, semi-distributed river basin model that requires a large number of input parameters which complicates model parameterization and calibration. Several calibration techniques have been developed for SWAT including manual calibration procedures...

135

Validating the Mexican American Intergenerational Caregiving Model  

ERIC Educational Resources Information Center

The purpose of this study was to substantiate and further develop a previously formulated conceptual model of Role Acceptance in Mexican American family caregivers by exploring the theoretical strengths of the model. The sample consisted of women older than 21 years of age who self-identified as Hispanic, were related through consanguinal or…

Escandon, Socorro

2011-01-01

136

A framework for biodynamic feedthrough analysis--part II: validation and application.  

PubMed

Biodynamic feedthrough (BDFT) is a complex phenomenon, that has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, the framework for BDFT analysis, as presented in Part I of this dual publication, is validated and applied. The goal of this framework is twofold. First of all, it provides some common ground between the seemingly large range of different approaches existing in BDFT literature. Secondly, the framework itself allows for gaining new insights into BDFT phenomena. Using recently obtained measurement data, parts of the framework that were not already addressed elsewhere, are validated. As an example of a practical application of the framework, it will be demonstrated how the effects of control device dynamics on BDFT can be understood and accurately predicted. Other ways of employing the framework are illustrated by interpreting the results of three selected studies from the literature using the BDFT framework. The presentation of the BDFT framework is divided into two parts. This paper, Part II, addresses the validation and application of the framework. Part I, which is also published in this journal issue, addresses the theoretical foundations of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation. PMID:25137695

Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

2014-09-01

137

Experimental tests for the validation of active numerical human models.  

PubMed

The development of numerical human models is a topic of current interdisciplinary research. In the field of automotive safety these models can be applied for the optimization of protection systems. In forensic research human models can be used for the investigation of injury mechanisms and for the prediction and reproduction of injury patterns. However, up to now human models have been validated on the basis of PMHS tests without considering the effects of muscle activity. This paper shows two experimental volunteer test set-ups for the generation of experimental validation data. In a pendulum set-up the influence of muscle activity on the human kinematics was investigated. A drop test set-up was developed for the analysis of the effects of muscle activity on impact response characteristics of muscle tissue. Experimental results, presented in this paper, can be used for the validation and optimization of active numerical human models. PMID:18262372

Muggenthaler, Holger; von Merten, Katja; Peldschus, Steffen; Holley, Stephanie; Adamec, Jiri; Praxl, Norbert; Graw, Matthias

2008-05-20

138

Validation of SAGE II aerosol measurements by comparison with correlative sensors  

NASA Technical Reports Server (NTRS)

The SAGE II limb-scanning radiometer carried on the Earth Radiation Budget Satellite functions at wavelengths of 0.385, 0.45, 0.525, and 1.02 microns to identify vertical profiles of aerosol density by atmospheric extinction measurements from cloud tops upward. The data are being validated by correlating the satellite data with data gathered with, e.g., lidar, sunphotometer, and dustsonde instruments. Work thus far has shown that the 1 micron measurements from the ground and satellite are highly correlated and are therefore accurate to within measurement uncertainty.

Swissler, T. J.

1986-01-01

139

VERIFICATION AND VALIDATION OF THE SPARC MODEL  

EPA Science Inventory

Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

140

Validating predictions from climate envelope models.  

PubMed

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1 ) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species. PMID:23717452
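
A hedged Python sketch of the evaluation logic: calibrate two classifiers on t1 occurrences, score them on t2, and report sensitivity and specificity; a logistic model stands in for the maximum entropy algorithm (an assumption, since presence-only MaxEnt modelling is more involved), and the species data are synthetic:

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(6)

def make_period(n, shift=0.0):
    # Synthetic climate covariates and presence/absence of a species whose
    # suitable climate shifts between the two survey periods.
    clim = rng.normal(size=(n, 3)) + shift
    present = (clim[:, 0] + 0.5 * clim[:, 1] + rng.normal(0.0, 0.5, n) > 0).astype(int)
    return clim, present

X1, y1 = make_period(800)                 # t1: 1967-1971 calibration data
X2, y2 = make_period(400, shift=0.3)      # t2: 1998-2002 evaluation data

for name, model in [("random forest", RandomForestClassifier(random_state=0)),
                    ("max-ent stand-in (logistic)", LogisticRegression(max_iter=1000))]:
    pred = model.fit(X1, y1).predict(X2)
    tn, fp, fn, tp = confusion_matrix(y2, pred).ravel()
    print(f"{name}: sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")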

Watling, James I; Bucklin, David N; Speroterra, Carolina; Brandt, Laura A; Mazzotti, Frank J; Romañach, Stephanie S

2013-01-01

141

Validating Predictions from Climate Envelope Models  

PubMed Central

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species. PMID:23717452

Watling, James I.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.

2013-01-01

142

Validating Requirements for Fault Tolerant Systems Using Model Checking  

NASA Technical Reports Server (NTRS)

Model checking is shown to be an effective tool in validating the behavior of a fault tolerant embedded spacecraft controller. The case study presented here shows that by judiciously abstracting away extraneous complexity, the state space of the model could be exhaustively searched allowing critical functional requirements to be validated down to the design level. Abstracting away detail not germane to the problem of interest leaves by definition a partial specification behind. The success of this procedure shows that it is feasible to effectively validate a partial specification with this technique. Three anomalies were found in the system one of which is an error in the detailed requirements, and the other two are missing/ambiguous requirements. Because the method allows validation of partial specifications, it also is an effective methodology towards maintaining fidelity between a co-evolving specification and an implementation.

Schneider, Francis; Easterbrook, Steve M.; Callahan, John R.; Holzmann, Gerard J.

1997-01-01

143

Solution Verification Linked to Model Validation, Reliability, and Confidence  

SciTech Connect

The concepts of Verification and Validation (V&V) can be oversimplified in a succinct manner by saying that 'verification is doing things right' and 'validation is doing the right thing'. In the world of the Finite Element Method (FEM) and computational analysis, it is sometimes said that 'verification means solving the equations right' and 'validation means solving the right equations'. In other words, if one intends to give an answer to the equation '2+2=', then one must run the resulting code to assure that the answer '4' results. However, if the nature of the physics or engineering problem being addressed with this code is multiplicative rather than additive, then even though Verification may succeed (2+2=4 etc), Validation may fail because the equations coded are not those needed to address the real world (multiplicative) problem. We have previously provided a 4-step 'ABCD' quantitative implementation for a quantitative V&V process: (A) Plan the analyses and validation testing that may be needed along the way. Assure that the code[s] chosen have sufficient documentation of software quality and Code Verification (i.e., does 2+2=4?). Perform some calibration analyses and calibration based sensitivity studies (these are not validated sensitivities but are useful for planning purposes). Outline the data and validation analyses that will be needed to turn the calibrated model (and calibrated sensitivities) into validated quantities. (B) Solution Verification: For the system or component being modeled, quantify the uncertainty and error estimates due to spatial, temporal, and iterative discretization during solution. (C) Validation over the data domain: Perform a quantitative validation to provide confidence-bounded uncertainties on the quantity of interest over the domain of available data. (D) Predictive Adequacy: Extend the model validation process of 'C' out to the application domain of interest, which may be outside the domain of available data in one or more planes of multi-dimensional space. Part 'D' should provide the numerical information about the model and its predictive capability such that given a requirement, an adequacy assessment can be made to determine if more validation analyses or data are needed.

Logan, R W; Nitta, C K

2004-06-16

144

EXODUS II: A finite element data model  

SciTech Connect

EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).

Schoof, L.A.; Yarberry, V.R.

1994-09-01

145

Paediatric bed fall computer simulation model development and validation.  

PubMed

Falls from beds and other household furniture are common scenarios stated to conceal child abuse. Knowledge of the biomechanics associated with short-distance falls may aid clinicians in distinguishing between abusive and accidental injuries. Computer simulation is a useful tool to investigate injury-producing events and to study the effect of altering event parameters on injury risk. In this study, a paediatric bed fall computer simulation model was developed and validated. The simulation was created using Mathematical Dynamic Modeling(®) software with a child restraint air bag interaction (CRABI) 12-month-old anthropomorphic test device (ATD) representing the fall victim. The model was validated using data from physical fall experiments of the same scenario with an instrumented CRABI ATD. Validation was conducted using both observational and statistical comparisons. Future parametric sensitivity studies using this model will lead to an improved understanding of relationships between child (fall victim) parameters, fall environment parameters and injury potential. PMID:22185087

Thompson, Angela K; Bertocci, Gina E

2013-01-01

146

Validation of nuclear models used in space radiation shielding applications  

SciTech Connect

A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
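
A minimal Python sketch of an interval-based validation metric of the kind described, where a prediction is penalised only by its distance to the experimental interval and results are summarised by cumulative and median statistics; the cross-section values below are fabricated placeholders:

import numpy as np

rng = np.random.default_rng(7)
measured = rng.uniform(50.0, 500.0, size=100)          # cross sections, fabricated values
half_width = 0.08 * measured                           # experimental uncertainty as an interval
model = measured * rng.normal(1.0, 0.12, size=100)     # model predictions

# Distance from each prediction to the interval [measured - u, measured + u];
# zero whenever the prediction falls inside the experimental interval.
lower, upper = measured - half_width, measured + half_width
dist = np.maximum.reduce([lower - model, model - upper, np.zeros_like(model)])
rel_err = dist / measured

print("fraction of predictions inside the experimental interval:", np.mean(dist == 0.0))
print("cumulative relative error:", round(rel_err.sum(), 3))
print("median relative error    :", round(np.median(rel_err), 4))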

Norman, Ryan B., E-mail: Ryan.B.Norman@nasa.gov [NASA Langley Research Center, Hampton, VA 23681 (United States)]; Blattnig, Steve R. [NASA Langley Research Center, Hampton, VA 23681 (United States)]

2013-01-15
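A minimal sketch, under stated assumptions, of an interval-style validation metric in the spirit of the record above: each experimental cross section is treated as an interval (measurement plus or minus its uncertainty), the model-to-interval distance is accumulated, and both a cumulative and a median summary are reported. The numbers are invented and the exact metric definitions used by the authors may differ.

```python
import numpy as np

def interval_distance(prediction, measured, uncertainty):
    """Distance from a model prediction to the experimental interval
    [measured - uncertainty, measured + uncertainty]; zero if inside."""
    lower, upper = measured - uncertainty, measured + uncertainty
    return np.maximum(0.0, np.maximum(lower - prediction, prediction - upper))

# Hypothetical cross sections (mb): model predictions, measurements, 1-sigma bounds
pred = np.array([102.0, 87.5, 60.3, 41.0])
meas = np.array([100.0, 90.0, 55.0, 45.0])
unc  = np.array([  5.0,  4.0,  3.0,  2.5])

d = interval_distance(pred, meas, unc)
rel = d / meas                                # normalize by the measured value
print("cumulative metric:", rel.sum())        # overall accuracy view
print("median metric:    ", np.median(rel))   # robust, model-development view
```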

147

WEPP: Model use, calibration and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

The Water Erosion Prediction Project (WEPP) model is a process-based, continuous simulation, distributed parameter, hydrologic and soil erosion prediction system. It has been developed over the past 25 years to allow for easy application to a large number of land management scenarios. Most general o...

148

WEPP: Model use, calibration, and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

The Water Erosion Prediction Project (WEPP) model is a process-based, continuous simulation, distributed parameter, hydrologic and soil erosion prediction system. It has been developed over the past 25 years to allow for easy application to a large number of land management scenarios. Most general o...

149

Modeling of Alpine Atmospheric Dynamics II  

E-print Network

Modeling of Alpine Atmospheric Dynamics II (707.424, VU 2, SS2005), Unit 6: Post-processing with REVU. The REVU package is used for visualization of RAMS model output data.

Gohm, Alexander

150

Antisolvent crystallization: Model identification, experimental validation and dynamic simulation  

Microsoft Academic Search

This paper is concerned with the development, simulation and experimental validation of a detailed antisolvent crystallization model. A population balance approach is adopted to describe the dynamic change of particle size in crystallization processes under the effect of antisolvent addition. Maximum likelihood method is used to identify the nucleation and growth kinetic models using data derived from controlled experiments. The

S. Mostafa Nowee; Ali Abbas; Jose A. Romagnoli

2008-01-01

151

Near-real time validation of an operational hydrographic model  

Microsoft Academic Search

The Irish Marine Institute maintains an operational model of the NE Atlantic from which weekly hydrographic forecasts are published on the institute's web site. A method for the systematic validation of the operational model has been developed, making use of temperature and salinity profile data from ARGO floats, surface water temperature and salinity data from the Irish weather buoys and

H. Cannaby; M. Cure; K. Lyons; G. Nolan

152

THE FERNALD DOSIMETRY RECONSTRUCTION PROJECT Environmental Pathways -Models and Validation  

E-print Network

Contents include: Uncertainties in the Air Transport Model; Validation Exercises; Air Monitoring Data for Modeling the Transport of Airborne Releases; The Straight-Line Gaussian Plume and Related Air Transport and Plume Depletion; Plume Rise; Building Wake Effects; Parametric Uncertainty in the Air Transport Model.

153

Predicting Vehicle Crashworthiness: Validation of Computer Models for  

E-print Network

The CRASH computer model simulates the effect of a vehicle colliding against different barrier types. If it accurately represents real vehicle crashworthiness

Berger, Jim

154

NONLINEAR COMMON SOURCE MESFET BEHAVIOUR AND MODEL VALIDATION  

E-print Network

Nonlinear device modelling of MESFETs is of primary importance in the design of high frequency circuits. ... conditions and load resistance on the nonlinear behaviour of a MESFET Common Source (CS) amplifier at medium ...

155

Modelling the human pharyngeal airway: validation of numerical simulations  

E-print Network

Obstructive sleep apnea syndrome. Since the 1990s, biomechanical modelling of the human upper ... properties of the upper airway (geometry, rheology). This makes them of interest to improve the quality ...

Lagrée, Pierre-Yves

156

Modelling the human pharyngeal airway: validation of numerical simulations  

E-print Network

Since the 1990s, biomechanical modelling of the human upper airway has received a growing interest since it allows a better ... of the biomechanical properties of the upper airway (geometry, rheology). This makes them of interest to improve ...

Payan, Yohan

157

Development and validation of cognitive models for human error reduction  

Microsoft Academic Search

A three-dimensional conceptual model for human error reduction is proposed and partially validated. The three dimensions are instruction type, task type, and individual differences. Three experiments were conducted to test the model. The independent variables included subjects' knowledge level, task type, and instruction type. The dependent variables included subjects' attitude toward task, the number of finished tasks, errors in all

Xianzhan Lin

1997-01-01

158

A Formal Approach to Empirical Dynamic Model Optimization and Validation  

NASA Technical Reports Server (NTRS)

A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.

Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

2014-01-01
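A toy sketch of the max-min idea described in the record above, not the authors' implementation: parameters of a simple empirical model are chosen so that the smallest margin of requirement compliance across several validation data sets is as large as possible, using scipy's general-purpose optimizer. The data sets, error limits, and model form are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical validation data sets: (inputs, noisy outputs, admissible error limit)
rng = np.random.default_rng(1)
x1 = np.linspace(0.0, 1.0, 20)
x2 = np.linspace(0.0, 2.0, 30)
datasets = [
    (x1, 2.0 * x1 + 0.5 + 0.02 * rng.standard_normal(x1.size), 0.10),
    (x2, 2.0 * x2 + 0.5 + 0.02 * rng.standard_normal(x2.size), 0.15),
]

def margins(theta):
    """Requirement margin (limit minus worst prediction error) for each data set."""
    a, b = theta
    return np.array([lim - np.max(np.abs(a * x + b - y)) for x, y, lim in datasets])

# Estimate parameters by making the smallest compliance margin as large as possible.
result = minimize(lambda th: -margins(th).min(), x0=[1.0, 0.0], method="Nelder-Mead")
print("parameter estimate:", result.x, "smallest margin:", margins(result.x).min())
```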

159

Validating Finite Element Models of Assembled Shell Structures  

NASA Technical Reports Server (NTRS)

The validation of finite element models of assembled shell elements is presented. The topics include: 1) Problems with membrane rotations in assembled shell models; 2) Penalty stiffness for membrane rotations; 3) Physical stiffness for membrane rotations using shell elements with 6 dof per node; and 4) Connections avoiding rotations.

Hoff, Claus

2006-01-01

160

Reduced-complexity modeling of braided rivers: Assessing model performance by sensitivity analysis, calibration, and validation  

NASA Astrophysics Data System (ADS)

This paper addresses an important question of modeling stream dynamics: How may numerical models of braided stream morphodynamics be rigorously and objectively evaluated against a real case study? Using simulations from the Cellular Automaton Evolutionary Slope and River (CAESAR) reduced-complexity model (RCM) of a 33 km reach of a large gravel bed river (the Tagliamento River, Italy), this paper aims to (i) identify a sound strategy for calibration and validation of RCMs, (ii) investigate the effectiveness of multiperformance model assessments, (iii) assess the potential of using CAESAR at mesospatial and mesotemporal scales. The approach used has three main steps: first sensitivity analysis (using a screening method and a variance-based method), then calibration, and finally validation. This approach allowed us to analyze 12 input factors initially and then to focus calibration only on the factors identified as most important. Sensitivity analysis and calibration were performed on a 7.5 km subreach using a hydrological time series of 20 months, while validation was carried out on the whole 33 km study reach over a period of 8 years (2001-2009). CAESAR was able to reproduce the macromorphological changes of the study reach and gave good results for annual bed load sediment estimates, which turned out to be consistent with measurements in other large gravel bed rivers, but showed a poorer performance in reproducing the characteristics of the braided channel (e.g., braiding intensity). The approach developed in this study can be effectively applied in other similar RCM contexts, allowing the use of RCMs not only in an explorative manner but also in obtaining quantitative results and scenarios.

Ziliani, L.; Surian, N.; Coulthard, T. J.; Tarantola, S.

2013-12-01

161

Experiments for foam model development and validation.  

SciTech Connect

A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

2008-09-01

162

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory. (author)

Ko, Y.-C. [Nuclear Science and Engineering Department, MIT, Cambridge, MA 02139 (United States); Hu, L.-W. [Nuclear Reactor Laboratory, MIT, Cambridge, MA 02139 (United States)], E-mail: lwhu@mit.edu; Olson, Arne P.; Dunn, Floyd E. [RERTR Program, Argonne National Laboratory, Argonne, IL 60439 (United States)

2008-07-15

163

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU.  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

Ko, Y. C.; Hu, L. W.; Olson, A. P.; Dunn, F. E.; Nuclear Engineering Division; MIT

2007-01-01

164

Validity of empirical models of exposure in asphalt paving  

PubMed Central

Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed a similar to expected effect of re-paving and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236

Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

2002-01-01
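To make the bias and precision language of the record above concrete, here is a small, generic Python sketch that computes relative bias, precision, and the correlation between observed and model-predicted exposures. The numbers are invented and are not the study's data.

```python
import numpy as np

# Hypothetical observed and model-predicted bitumen-fume exposures (mg/m^3)
observed  = np.array([0.8, 1.5, 2.2, 0.6, 1.1, 3.0])
predicted = np.array([0.3, 0.5, 0.9, 0.2, 0.4, 1.0])

relative_bias = np.mean((predicted - observed) / observed)  # negative => underestimation
precision     = np.std(predicted - observed, ddof=1)        # spread of the errors
correlation   = np.corrcoef(observed, predicted)[0, 1]

print(f"relative bias {relative_bias:+.0%}, precision {precision:.2f}, r = {correlation:.2f}")
```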

165

Simultaneous model building and validation with uniform designs of experiments  

NASA Astrophysics Data System (ADS)

This article describes an implementation of a particular design of experiment (DoE) plan based upon optimal Latin hypercubes that have certain space-filling and uniformity properties with the goal of maximizing the information gained. The feature emphasized here is the concept of simultaneous model building and model validation plans whose union contains the same properties as the component sets. Two Latin hypercube DoE are constructed simultaneously for use in a meta-modelling context for model building and model validation. The goal is to optimize the uniformity of both sets with respect to space-filling properties of the designs whilst satisfying the key concept that the merged DoE, comprising the union of build and validation sets, has similar space-filling properties. This represents a development of an optimal sampling approach for the first iteration—the initial model building and validation where most information is gained to take the full advantage of parallel computing. A permutation genetic algorithm using several genetic operator strategies is implemented in which fitness evaluation is based upon the Audze-Eglais potential energy function, and an example is presented based upon the well-known six-hump camel back function. The relative efficiency of the strategies and the associated computational aspects are discussed with respect to the quality of the designs obtained. The requirement for such design approaches arises from the need for multiple calls to traditionally expensive system and discipline analyses within iterative multi-disciplinary optimisation frameworks.

Narayanan, A.; Toropov, V. V.; Wood, A. S.; Campean, I. F.

2007-07-01
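The Audze-Eglais criterion mentioned in the record above has a compact form: treat the design points as charges and sum the inverse squared pairwise distances, so more uniform designs have lower "potential energy". Below is a minimal sketch on a randomly generated Latin hypercube; the genetic-algorithm machinery and the simultaneous build/validation set construction of the article are not reproduced.

```python
import numpy as np
from itertools import combinations

def audze_eglais_energy(points):
    """Sum of 1 / L_pq^2 over all point pairs; lower is more uniform."""
    return sum(1.0 / np.sum((p - q) ** 2) for p, q in combinations(points, 2))

rng = np.random.default_rng(0)
n, dims = 10, 2
# A random Latin hypercube: each column is a random permutation of levels 0..n-1.
design = np.column_stack([rng.permutation(n) for _ in range(dims)]) / (n - 1)

print("Audze-Eglais energy:", audze_eglais_energy(design))
# A permutation GA would swap entries within columns to drive this energy down
# while keeping the Latin hypercube structure of both build and validation sets.
```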

166

Sub-nanometer Level Model Validation of the SIM Interferometer  

NASA Technical Reports Server (NTRS)

The Space Interferometer Mission (SIM) flight instrument will not undergo a full performance, end-to-end system test on the ground due to a number of constraints. Thus, analysis and physics-based models will play a significant role in providing confidence that SIM will meet its science goals on orbit. The various models themselves are validated against the experimental results obtained from the MicroArcsecond Metrology (MAM) testbed and the Diffraction testbed (DTB). The metric for validation is provided by the SIM astrometric error budget.

Korechoff, Robert P.; Hoppe, Daniel; Wang, Xu

2004-01-01

167

Theoretical models of ultrasonic inspection and their validation  

SciTech Connect

In response to the perception of demands by the public for higher than ever standards of safety, the nuclear industry in Britain embarked on an extensive program of nuclear safety research in support of the safety case for the new Sizewell B pressurized water reactor, which is now approaching completion. A suite of diverse computer models, of various aspects of ultrasonic inspection, is described, ranging from transducer design to ray-tracing in anisotropic stainless steel weldments or complex geometries. These provide aids to inspection design, verification, validation and data analysis, but the models must also be validated.

Birchall, D.; Daniels, W. [AEA Technology, Risley (United Kingdom); Hawker, B.M.; Ramsey, A.T.; Temple, J.A.G. [AEA Technology, Harwell (United Kingdom)

1994-12-31

168

Validation of green tea polyphenol biomarkers in a phase II human intervention trial  

Microsoft Academic Search

Health benefits of green tea polyphenols (GTPs) have been reported in many animal models, but human studies are inconclusive. This is partly due to a lack of biomarkers representing green tea consumption. In this study, GTP components and metabolites were analyzed in plasma and urine samples collected from a phase II intervention trial carried out in 124 healthy adults who

Jia-Sheng Wang; Haitao Luo; Piwen Wang; Lili Tang; Jiahua Yu; Tianren Huang; Stephen Cox; Weimin Gao

2008-01-01

169

Hazard function modeling using cross validation: From data collection to model selection  

Microsoft Academic Search

A general methodology for reliability modeling of component failures and model discrimination using cross validation is developed. First, the requirements for collection of failure, maintenance, and operation data are outlined, including left and right censored data. Cross validation is then used as a probabilistic measure of predictive performance for selection of the optimal model from a set of reliability model

Jonathan S. Tan; Mark A. Kramer

1995-01-01

170

Closed Form Solution for Minimum Norm Model-Validating Uncertainty  

NASA Technical Reports Server (NTRS)

A methodology in which structured uncertainty models are directly constructed from measurement data for use in robust control design of multivariable systems is proposed. The formulation allows a general linear fractional transformation uncertainty structure connections with respect to a given nominal model. Existence conditions are given, and under mild assumptions, a closed-form expression for the smallest norm structured uncertainty that validates the model is given. The uncertainty bound computation is simple and is formulated for both open and closed loop systems.

Lim, Kyong Been

1997-01-01

171

Validation of a Colonoscopy Simulation Model for Skills Assessment  

Microsoft Academic Search

OBJECTIVE:The purpose is to provide initial validation of a novel simulation model's fidelity and ability to assess competence in colonoscopy skills.METHODS:In a prospective, cross-sectional design, each of 39 endoscopists (13 staff, 13 second year fellows, and 13 novices) performed a colonoscopy on a novel bovine simulation model. Staff endoscopists also completed a survey examining different aspects of the model's realism

Robert E. Sedlack; Todd H. Baron; Steven M. Downing; Alan J. Schwartz

2007-01-01

172

Validation techniques of agent based modelling for geospatial simulations  

NASA Astrophysics Data System (ADS)

One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and show dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a relatively new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, validation techniques and the challenges of ABM validation are discussed.

Darvishi, M.; Ahmadi, G.

2014-10-01

173

Psychometric Validation of the BDI-II Among HIV-Positive CHARTER Study Participants.  

PubMed

Rates of depression are high among individuals living with HIV. Accurate assessment of depressive symptoms among this population is important for ensuring proper diagnosis and treatment. The Beck Depression Inventory-II (BDI-II) is a widely used measure for assessing depression; however, its psychometric properties have not yet been investigated for use with HIV-positive populations in the United States. The current study was the first to assess the psychometric properties of the BDI-II among a large cohort of HIV-positive participants sampled at multiple sites across the United States as part of the CNS HIV Antiretroviral Therapy Effects Research (CHARTER) study. The BDI-II test scores showed good internal consistency (α = .93) and adequate test-retest reliability (internal consistency coefficient = 0.83) over a 6-mo period. Using a "gold standard" of major depressive disorder determined by the Composite International Diagnostic Interview, sensitivity and specificity were maximized at a total cut-off score of 17 and a receiver operating characteristic analysis confirmed that the BDI-II is an adequate diagnostic measure for the sample (area under the curve = 0.83). The sensitivity and specificity of each score are provided graphically. Confirmatory factor analyses confirmed the best fit for a three-factor model over one-factor and two-factor models and models with a higher-order factor included. The results suggest that the BDI-II is an adequate measure for assessing depressive symptoms among U.S. HIV-positive patients. Cut-off scores should be adjusted to enhance sensitivity or specificity as needed and the measure can be differentiated into cognitive, affective, and somatic depressive symptoms. (PsycINFO Database Record (c) 2014 APA, all rights reserved). PMID:25419643

Hobkirk, Andréa L; Starosta, Amy J; De Leo, Joseph A; Marra, Christina M; Heaton, Robert K; Earleywine, Mitch

2014-11-24
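A small sketch of the cut-off analysis described in the record above, with invented scores and diagnoses rather than CHARTER data: sensitivity and specificity are computed at a BDI-II cut-off of 17, and a rank-based area under the ROC curve summarizes overall diagnostic adequacy.

```python
import numpy as np

# Hypothetical BDI-II totals and gold-standard MDD diagnoses (1 = depressed)
scores    = np.array([ 5, 12, 30, 16, 22,  8, 25, 18, 19,  3])
diagnosis = np.array([ 0,  0,  1,  1,  1,  0,  1,  0,  1,  0])

cutoff = 17
positive = scores >= cutoff
sensitivity = np.mean(positive[diagnosis == 1])
specificity = np.mean(~positive[diagnosis == 0])

# Rank-based AUC: probability a depressed case scores above a non-depressed one.
pos, neg = scores[diagnosis == 1], scores[diagnosis == 0]
auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])

print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, AUC {auc:.2f}")
```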

174

Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models  

SciTech Connect

One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from University of British Columbia for purposes of benchmarking the UBC model against other codes. In the course of discussions on recovery boiler modeling over the course of this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABBCE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.

1997-07-01

175

Modeling and Validation of Microwave Ablations with Internal Vaporization  

PubMed Central

Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481

Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

2014-01-01
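The Jaccard index used in the record above to compare CT iso-density contours with simulated vapor-concentration contours is straightforward to compute on binary masks. The sketch below uses small synthetic disk-shaped masks, not the study's images.

```python
import numpy as np

def jaccard_index(mask_a, mask_b):
    """Intersection over union of two boolean masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return intersection / union if union else 1.0

# Synthetic cross-sections: a disk from CT and a slightly offset disk from the model.
yy, xx = np.mgrid[0:100, 0:100]
ct_zone    = (xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2
model_zone = (xx - 55) ** 2 + (yy - 50) ** 2 <= 20 ** 2

print(f"Jaccard index: {jaccard_index(ct_zone, model_zone):.2f}")
```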

176

Modeling and validation of microwave ablations with internal vaporization.  

PubMed

Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this paper, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10, and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intraprocedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67, and 0.69 at 1, 2, 3, 4, and 5 min, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481

Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L

2015-02-01

177

Climate Model Validation Using Spectrally Resolved Shortwave Radiation Measurements  

NASA Astrophysics Data System (ADS)

The climate science community has made significant strides in the development and improvement of Global Climate Models (GCMs) to predict how the Earth's climate system will change over the next several decades. It is crucial to evaluate how well these models reproduce observed climate variability using strict validation techniques to assist the climate modeling community with improving GCM prediction. The ability of climate models to simulate Earth's present-day climate is an initial evaluation of their ability to predict future changes in climate. Models are evaluated in several ways, including model intercomparison projects and comparing model simulations of physical variables with observations of those variables. We are developing new methods for rigorous climate model validation and physical attribution of the cause of model errors using existing, direct measurements of hyperspectral shortwave reflectance. We have also developed a SCIAMACHY-based (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography) hyperspectral shortwave climate validation product to demonstrate using the product to validate GCMs. We are also investigating the information content added by using multispectral and hyperspectral data to study climate variability and validate climate models. The goal is to determine if it is necessary to use data with continuous spectral sampling across the shortwave spectral range, or if it is sufficient to use a subset of carefully selected spectral bands (e.g. MODIS-like data) to study climate trends and evaluate climate model performance. We are carrying out this activity by comparing the information content within broadband, multispectral (discrete-band sampling), and hyperspectral (high spectral resolution with continuous spectral sampling) data sets. Changes in climate-relevant atmospheric and surface variables impact the spectral, spatial, and temporal variability of Earth-reflected solar radiation (0.3-2.5 µm) through spectrally dependent scattering and absorption processes. Previous studies have demonstrated that highly accurate, hyperspectral (spectrally contiguous and overlapping) measurements of shortwave reflectance are important for monitoring climate variability from space. We are continuing to work to demonstrate that highly accurate, high information content hyperspectral shortwave measurements can be used to detect changes in climate, identify climate variance drivers, and validate GCMs.

Roberts, Y.; Taylor, P. C.; Lukashin, C.; Feldman, D.; Pilewskie, P.; Collins, W.

2013-12-01

178

Modeling of sensor nets in Ptolemy II  

Microsoft Academic Search

This paper describes a modeling and simulation framework called VisualSense for wireless sensor networks that builds on and leverages Ptolemy II. This framework supports actor-oriented definition of sensor nodes, wireless communication channels, physical media such as acoustic channels, and wired subsystems. The software architecture consists of a set of base classes for defining channels and sensor nodes, a library of

Philip Baldwin; Sanjeev Kohli; Edward A. Lee; Xiaojun Liu; Yang Zhao

2004-01-01

179

Radiative transfer model validations during the First ISLSCP Field Experiment  

NASA Technical Reports Server (NTRS)

Two simple radiative transfer models, the 5S model based on Tanre et al. (1985, 1986) and the wide-band model of Morcrette (1984) are validated by comparing their outputs with results obtained during the First ISLSCP Field Experiment on concomitant radiosonde, aerosol turbidity, and radiation measurements and sky photographs. Results showed that the 5S model overestimates the short-wave irradiance by 13.2 W/sq m, whereas the Morcrette model underestimated the long-wave irradiance by 7.4 W/sq m.

Frouin, Robert; Breon, Francois-Marie; Gautier, Catherine

1990-01-01

180

Validating soil phosphorus routines in the SWAT model  

Technology Transfer Automated Retrieval System (TEKTRAN)

Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

181

Validating Work Discrimination and Coping Strategy Models for Sexual Minorities  

ERIC Educational Resources Information Center

The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

2009-01-01

182

Testing the Validity of Cost-Effectiveness Models  

Microsoft Academic Search

A growing body of recent work has identified several problems with economic evaluations undertaken alongside controlled trials that can have potentially serious impacts on the ability of decision makers to draw valid conclusions. At the same time, the use of cost-effectiveness models has been drawn into question, due to the alleged arbitrary nature of their construction. This has led researchers

Chris McCabe; Simon Dixon

2000-01-01

183

Validity of Cardiovascular Risk Prediction Models in Kidney Transplant Recipients  

PubMed Central

Background. Predicting cardiovascular risk is of great interest in renal transplant recipients since cardiovascular disease is the leading cause of mortality. Objective. To conduct a systematic review to assess the validity of cardiovascular risk prediction models in this population. Methods. Five databases were searched (MEDLINE, EMBASE, SCOPUS, CINAHL, and Web of Science) and cohort studies with at least one year of follow-up were included. Variables that described population characteristics, study design, and prognostic performance were extracted. The Quality in Prognostic Studies (QUIPS) tool was used to evaluate bias. Results. Seven studies met the criteria for inclusion, of which five investigated the Framingham risk score and three used a transplant-specific model. Sample sizes ranged from 344 to 23,575, and three studies lacked sufficient event rates to confidently reach a conclusion. Four studies reported discrimination (as measured by c-statistic), which ranged from 0.701 to 0.75, while only one risk model was both internally and externally validated. Conclusion. The Framingham score has underestimated cardiovascular events in renal transplant recipients, but these studies have not been robust. A risk prediction model has been externally validated at least on one occasion, but comprehensive validation in multiple cohorts and impact analysis are recommended before widespread clinical application is advocated. PMID:24977223

Stewart, Samuel Alan; Shoker, Ahmed

2014-01-01

184

Validation of the NATO-standard ship signature model (SHIPIR)  

Microsoft Academic Search

An integrated naval infrared target, threat and countermeasure simulator (SHIPIR/NTCS) has been developed. The SHIPIR component of the model has been adopted by both NATO and the US Navy as a common tool for predicting the infrared (IR) signature of naval ships in their background. The US Navy has taken a lead role in further developing and validating SHIPIR for

David A. Vaitekunas; Douglas S. Fraedrich

1999-01-01

185

Specification and validation of communications in client/server models  

Microsoft Academic Search

Errors such as deadlock and race conditions are very common yet extremely difficult to debug in the communications design of client/server models based on remote procedure calls and multi-threading. This paper presents an effective approach to detecting these errors. It shows how to apply the specification and validation techniques used in protocol engineering to discover those errors in the early

F. Joe Lin

1994-01-01

186

Direct Methanol Fuel Cell Experimental and Model Validation Study  

E-print Network

Fuel cell performance; current density distribution measurements; conclusions. Method: flow rate, species inlet and fuel cell temperature, and humidity. Transparent polycarbonate windows ...

Wang, Chao-Yang

187

Validation of an operational model of direct recharge and evapotranspiration  

NASA Astrophysics Data System (ADS)

This work describes the validation of a distributed model for estimating direct recharge and evapotranspiration over arid and semiarid regions. This validation was performed for a lysimeter site planted with festuca (grown under a controlled irrigated treatment) over two months, June and July 2003. The model, which can be classified as a distributed water balance model, puts its emphasis on two distinguishing aspects. First, a detailed description of the effect of land use on the water balance through processes of evaporation/transpiration and the evolution in time of the vegetated surfaces in the area. Second, the operational character of the model. The model was conceived to run integrated into a Geographical Information System and incorporates the pre-processing of the needed input parameters. This pre-processing comprises the use of remote sensing observations to monitor plant status and dynamics. In this study, agrometeorological station records and information on irrigation scheduling, soil hydraulic properties and the festuca culture were used to run the model, whereas lysimeter measurements were used as validation data. Moreover, the performance of the model was checked for contrasting water conditions of the soil: completely wet and dried out.

Rubio, Eva; Mejuto, Miguel F.; Calera, Alfonso; Vela, Alicia; Castano, Santiago; Moratalla, Agueda

2004-02-01

188

VALIDATION OF ACOUSTIC MODELS OF AUDITORY NEURAL PROSTHESES  

PubMed Central

Acoustic models have been used in numerous studies over the past thirty years to simulate the percepts elicited by auditory neural prostheses. In these acoustic models, incoming signals are processed the same way as in a cochlear implant speech processor. The percepts that would be caused by electrical stimulation in a real cochlear implant are simulated by modulating the amplitude of either noise bands or sinusoids. Despite their practical usefulness these acoustic models have never been convincingly validated. This study presents a tool to conduct such validation using subjects who have a cochlear implant in one ear and have near perfect hearing in the other ear, allowing for the first time a direct perceptual comparison of the output of acoustic models to the stimulation provided by a cochlear implant. PMID:25435816

Svirsky, Mario A.; Ding, Nai; Sagi, Elad; Tan, Chin-Tuan; Fitzgerald, Matthew; Glassman, E. Katelyn; Seward, Keena; Neuman, Arlene C.

2014-01-01

189

Numerical model for the performance prediction of a PEM fuel cell. Model results and experimental validation  

Microsoft Academic Search

This work presents a Computational Fluid Dynamics (CFD) model developed for a 50cm2 fuel cell with parallel and serpentine flow field bipolar plates, and its validation against experimental measurements. The numerical CFD model was developed using the commercial ANSYS FLUENT software, and the results obtained were compared with the experimental results in order to perform a model validation. A single

Alfredo Iranzo; Miguel Muñoz; Felipe Rosa; Javier Pino

2010-01-01

190

Development, Verification, and Validation of Multiphase Models for Polydisperse Flows  

Microsoft Academic Search

This report describes in detail the technical findings of the DOE Award entitled 'Development, Verification, and Validation of Multiphase Models for Polydisperse Flows.' The focus was on high-velocity, gas-solid flows with a range of particle sizes. A complete mathematical model was developed based on first principles and incorporated into MFIX. The solid-phase description took two forms: the Kinetic Theory of

Christine Hrenya; Ray Cocco; Rodney Fox; Shankar Subramaniam; Sankaran Sundaresan

2011-01-01

191

Finite Element Model Development and Validation for Aircraft Fuselage Structures  

NASA Technical Reports Server (NTRS)

The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.

Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

2000-01-01

192

Analysis of Experimental Data for High Burnup PWR Spent Fuel Isotopic Validation - Vandellos II Reactor  

SciTech Connect

This report is one of the several recent NUREG/CR reports documenting benchmark-quality radiochemical assay data and the use of the data to validate computer code predictions of isotopic composition for spent nuclear fuel, to establish the uncertainty and bias associated with code predictions. The experimental data analyzed in the current report were acquired from a high-burnup fuel program coordinated by Spanish organizations. The measurements included extensive actinide and fission product data of importance to spent fuel safety applications, including burnup credit, decay heat, and radiation source terms. Six unique spent fuel samples from three uranium oxide fuel rods were analyzed. The fuel rods had a 4.5 wt % {sup 235}U initial enrichment and were irradiated in the Vandellos II pressurized water reactor operated in Spain. The burnups of the fuel samples range from 42 to 78 GWd/MTU. The measurements were used to validate the two-dimensional depletion sequence TRITON in the SCALE computer code system.

Ilas, Germina [ORNL; Gauld, Ian C [ORNL

2011-01-01

193

Validating the BHR RANS model for variable density turbulence  

SciTech Connect

The BHR RANS model is a turbulence model for multi-fluid flows in which density variation plays a strong role in the turbulence processes. In this paper they demonstrate the usefulness of BHR over a wide range of flows which include the effects of shear, buoyancy, and shocks. The results are in good agreement with experimental and DNS data across the entire set of validation cases, with no need to retune model coefficients between cases. The model has potential application to a number of aerospace related flow problems.

Israel, Daniel M [Los Alamos National Laboratory; Gore, Robert A [Los Alamos National Laboratory; Stalsberg - Zarling, Krista L [Los Alamos National Laboratory

2009-01-01

194

Tutorial: Building Ptolemy II Models Graphically Edward A. Lee  

E-print Network

Tutorial: Building Ptolemy II Models Graphically. Authors: Edward A. Lee, Stephen Neuendorffer. Introduction: This tutorial document explains how to build Ptolemy II models using Vergil.

195

A process improvement model for software verification and validation  

NASA Technical Reports Server (NTRS)

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

Callahan, John; Sabolish, George

1994-01-01

196

A process improvement model for software verification and validation  

NASA Technical Reports Server (NTRS)

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

Callahan, John; Sabolish, George

1994-01-01

197

Attempted validation of ICRP 30 and ICRP 66 respiratory models.  

PubMed

The validation of human biological models for inhaled radionuclides is nearly impossible. Requirements for validation are: (1) the measurement of the relevant human tissue data and (2) valid exposure measurements over the interval known to apply to tissue uptake. Two lung models, ICRP 30(1) and ICRP 66(2), are widely used to estimate lung doses following acute occupational or environmental exposure. Both ICRP 30 and 66 lung models are structured to estimate acute rather than chronic exposure. Two sets of human tissue measurements are available: (210)Po accumulated in tissue from inhaled cigarettes and ingested in diet and airborne global fallout (239,240)Pu accumulated in the lungs from inhalation. The human tissue measurements include pulmonary and bronchial tissue in smokers, ex-smokers and non-smokers analysed radiochemically for (210)Po, and pulmonary, bronchial and lymph nodes analysed for (239,240)Pu in lung tissue collected by the New York City Medical Examiner from 1972 to 1974. Both ICRP 30 and 66 models were included in a programme to accommodate chronic uptake. Neither lung model accurately described the estimated tissue concentrations but was within a factor of 2 from measurements. ICRP 66 was the exception and consistently overestimated the bronchial concentrations probably because of its assumption of an overly long 23-d clearance half-time in the bronchi and bronchioles. PMID:22923255

Harley, N H; Fisenne, I M; Robbins, E S

2012-11-01

198

Robust cross-validation of linear regression QSAR models.  

PubMed

A quantitative structure-activity relationship (QSAR) model is typically developed to predict the biochemical activity of untested compounds from the compounds' molecular structures. "The gold standard" of model validation is the blindfold prediction when the model's predictive power is assessed from how well the model predicts the activity values of compounds that were not considered in any way during the model development/calibration. However, during the development of a QSAR model, it is necessary to obtain some indication of the model's predictive power. This is often done by some form of cross-validation (CV). In this study, the concepts of the predictive power and fitting ability of a multiple linear regression (MLR) QSAR model were examined in the CV context allowing for the presence of outliers. Commonly used predictive power and fitting ability statistics were assessed via Monte Carlo cross-validation when applied to percent human intestinal absorption, blood-brain partition coefficient, and toxicity values of saxitoxin QSAR data sets, as well as three known benchmark data sets with known outlier contamination. It was found that (1) a robust version of MLR should always be preferred over the ordinary-least-squares MLR, regardless of the degree of outlier contamination and that (2) the model's predictive power should only be assessed via robust statistics. The Matlab and java source code used in this study is freely available from the QSAR-BENCH section of www.dmitrykonovalov.org for academic use. The Web site also contains the java-based QSAR-BENCH program, which could be run online via java's Web Start technology (supporting Windows, Mac OSX, Linux/Unix) to reproduce most of the reported results or apply the reported procedures to other data sets. PMID:18826208

Konovalov, Dmitry A; Llewellyn, Lyndon E; Vander Heyden, Yvan; Coomans, Danny

2008-10-01
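A compact sketch of the Monte Carlo cross-validation comparison described in the record above, on synthetic data with injected outliers rather than the QSAR sets; scikit-learn's ordinary and Huber regressors stand in for OLS and robust MLR (the original work used its own Matlab/Java code, available from the site cited in the record). The predictive statistic here is a simple q-squared averaged over random splits.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))                      # 4 hypothetical descriptors
y = X @ np.array([1.5, -2.0, 0.7, 0.0]) + 0.3 * rng.normal(size=120)
y[:6] += 15.0                                      # a few gross outliers

def mc_cv_q2(model, n_splits=50):
    """Monte Carlo CV: average predictive q^2 over random train/test splits."""
    q2 = []
    for seed in range(n_splits):
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=seed)
        pred = model.fit(X_tr, y_tr).predict(X_te)
        q2.append(1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2))
    return np.mean(q2)

print("OLS   q2:", round(mc_cv_q2(LinearRegression()), 3))
print("Huber q2:", round(mc_cv_q2(HuberRegressor()), 3))   # robust MLR stand-in
```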

199

KINEROS2-AGWA: Model Use, Calibration, and Validation  

NASA Technical Reports Server (NTRS)

KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

Goodrich, D C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R..

2013-01-01

200

Verifying and Validating Proposed Models for FSW Process Optimization  

NASA Technical Reports Server (NTRS)

This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

Schneider, Judith

2008-01-01

201

Validation of the SUNY Satellite Model in a Meteosat Environment  

SciTech Connect

The paper presents a validation of the SUNY satellite-to-irradiance model against four ground-truth stations from the Indian solar radiation network located in and around the province of Rajasthan, India. The SUNY model had initially been developed and tested to process US weather satellite data from the GOES series and has been used as part of the production of the US National Solar Resource Data Base (NSRDB). Here the model is applied to process data from the European weather satellites Meteosat 5 and 7.

Perez, R.; Schlemmer, J.; Renne, D.; Cowlin, S.; George, R.; Bandyopadhyay, B.

2009-01-01

202

Dynamic Modeling and Wavelet-Based Multi-Parametric Tuning and Validation for HVAC Systems  

E-print Network

of the dynamic models is important before their application. Parameter tuning and model validation is a crucial way to improve the accuracy and reliability of the dynamic models. Traditional parameter tuning and validation methods are generally time...

Liang, Shuangshuang

2014-07-10

203

Microelectronics package design using experimentally-validated modeling and simulation.  

SciTech Connect

Packaging high power radio frequency integrated circuits (RFICs) in low temperature cofired ceramic (LTCC) presents many challenges. Within the constraints of LTCC fabrication, the design must provide the usual electrical isolation and interconnections required to package the IC, with additional consideration given to RF isolation and thermal management. While iterative design and prototyping is an option for developing RFIC packaging, it would be expensive and most likely unsuccessful due to the complexity of the problem. To facilitate and optimize package design, thermal and mechanical simulations were used to understand and control the critical parameters in LTCC package design. The models were validated through comparisons to experimental results. This paper summarizes an experimentally-validated modeling approach to RFIC package design, and presents some results and key findings.

Johnson, Jay Dean; Young, Nathan Paul; Ewsuk, Kevin Gregory

2010-11-01

204

Rationality Validation of a Layered Decision Model for Network Defense  

SciTech Connect

We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

Wei, Huaqiang; Alves-Foss, James; Zhang, Du; Frincke, Deb

2007-08-31

205

Approaches to Validation of Models for Low Gravity Fluid Behavior  

NASA Technical Reports Server (NTRS)

This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed but through thick plastic cylinders, and therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, but the zero-gravity time available has been only seconds.

Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

2005-01-01

206

ENERGETIC MATERIAL RESPONSE IN A COOKOFF MODEL VALIDATION EXPERIMENT  

Microsoft Academic Search

The cookoff experiments described in this paper belong to the small-scale experimental portion of a three-year phased study of the slow cookoff problem. This paper presents the response of three energetic materials in a small-scale cookoff experiment. The experimental effort is being used to validate the cookoff models currently under development by the Department of Energy (DOE).1-2 In this phase

A. I. Atwood; P. O. Curran; D. T. Bui; T. L. Boggs; K. B. Lee

207

Validation of the NATO-standard ship signature model (SHIPIR)  

Microsoft Academic Search

An integrated naval infrared target, threat and countermeasure simulator (SHIPIR/NTCS) has been developed. The SHIPIR component of the model has been adopted by both NATO and the US Navy as a common tool for predicting the infrared (IR) signature of naval ships in their background. The US Navy has taken a lead role in further developing and validating SHIPIR for use

D. A. Vaitekunas; D. S. Fraedrich

208

Validation of a CFD model for predicting film cooling performance  

Microsoft Academic Search

The validation of a CFD code for predicting supersonic, tangential injection film cooling performance is presented. Three different experimental film cooling studies have been simulated using the MDNS3D CFD code, and results are shown for comparison with the experimental data. MDNS3D is a Reynolds Averaged Navier-Stokes code with fully coupled k-epsilon turbulence and finite rate chemical kinetics models. Free shear

S. C. Ward; D. A. Fricker; R. S. Lee

1993-01-01

209

Reuse of a Formal Model for Requirements Validation  

NASA Technical Reports Server (NTRS)

This paper reports experience from how a project engaged in the process of requirements analysis for evolutionary builds can reuse the formally specified design model produced for a similar, earlier project in the same domain. Two levels of reuse are described here. First, a formally specified generic design model was generated on one project to systematically capture the design commonality in a set of software monitors on board a spacecraft. These monitors periodically check for faults and invoke recovery software when needed. The paper summarizes the use of the design model to validate the software design of the various monitors on that first project. Secondly, the paper describes how the formal design model created for the first project was reused on a second, subsequent project. The model was reused to validate the evolutionary requirements for the second project's software monitors, which were being developed in a series of builds. Some mismatches due to the very different architectures on the two projects suggested changes to make the model more generic. In addition, several advantages to the reuse of the first project's formal model on the second project are reported.

Lutz, Robyn R.

1997-01-01

210

Validation of detailed thermal hydraulic models used for LMR safety and for improvement of technical specifications  

SciTech Connect

Detailed steady-state and transient coolant temperatures and flow rates from an operating reactor have been used to validate the multiple pin model in the SASSYS-1 liquid metal reactor systems analysis code. This multiple pin capability can be used for explicit calculations of axial and lateral temperature distributions within individual subassemblies. Thermocouples at a number of axial locations and in a number of different coolant sub-channels in the XX09 instrumented subassembly in the EBR-II reactor provided temperature data from the Shutdown Heat Removal Test (SHRT) series. Flow meter data for XX09 and for the overall system are also available from these tests. Results of consistent SASSYS-1 multiple pin analyses for both the SHRT-45 loss-of-flow-without-scram test and the SHRT-17 protected loss-of-flow test agree well with the experimental data, providing validation of the SASSYS-1 code over a wide range of conditions.

Dunn, F.E.

1995-12-31

211

Finite element modeling for validation of structural damage identification experimentation.  

SciTech Connect

The project described in this report was performed to couple experimental and analytical techniques in the field of structural health monitoring and damage identification. To do this, a finite element model was constructed of a simulated three-story building used for damage identification experiments. The model was used in conjunction with data from the physical structure to research damage identification algorithms. Of particular interest was modeling slip in joints as a function of bolt torque and predicting the smallest change of torque that could be detected experimentally. After being validated with results from the physical structure, the model was used to produce data to test the capabilities of damage identification algorithms. This report describes the finite element model constructed, the results obtained, and proposed future use of the model.

Stinemates, D. W. (Daniel W.); Bennett, J. G. (Joel G.)

2001-01-01

212

Open-source MFIX-DEM software for gas-solids flows: Part II Validation studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen [National Energy Technology Laboratory (NETL)]; Garg, Rahul [National Energy Technology Laboratory (NETL)]; Galvin, Janine [National Energy Technology Laboratory (NETL)]; Pannala, Sreekanth [ORNL]

2012-01-01

213

Open-Source MFIX-DEM Software for Gas-Solids Flows: Part II - Validation Studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas–solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen

2012-04-01

214

In-Drift Microbial Communities Model Validation Calculation  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

D. M. Jolley

2001-10-31

215

IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

D.M. Jolley

2001-12-18

216

In-Drift Microbial Communities Model Validation Calculations  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000) which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034 so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001) which includes controls for the management of electronic data.

D. M. Jolley

2001-09-24

217

Short-Term Mortality Prediction for Acute Lung Injury Patients: External Validation of the ARDSNet Prediction Model  

PubMed Central

Objective An independent cohort of acute lung injury (ALI) patients was used to evaluate the external validity of a simple prediction model for short-term mortality previously developed using data from ARDS Network (ARDSNet) trials. Design, Setting, and Patients Data for external validation were obtained from a prospective cohort study of ALI patients from 13 ICUs at four teaching hospitals in Baltimore, Maryland. Measurements and Main Results Of the 508 non-trauma, ALI patients eligible for this analysis, 234 (46%) died in-hospital. Discrimination of the ARDSNet prediction model for inhospital mortality, evaluated by the area under the receiver operator characteristics curves (AUC), was 0.67 for our external validation dataset versus 0.70 and 0.68 using APACHE II and the ARDSNet validation dataset, respectively. In evaluating calibration of the model, predicted versus observed in-hospital mortality for the external validation dataset was similar for both low risk (ARDSNet model score = 0) and high risk (score = 3 or 4+) patient strata. However, for intermediate risk (score = 1 or 2) patients, observed in-hospital mortality was substantially higher than predicted mortality (25.3% vs. 16.5% and 40.6% vs. 31.0% for score = 1 and 2, respectively). Sensitivity analyses limiting our external validation data set to only those patients meeting the ARDSNet trial eligibility criteria and to those who received mechanical ventilation in compliance with the ARDSNet ventilation protocol, did not substantially change the model’s discrimination or improve its calibration. Conclusions Evaluation of the ARDSNet prediction model using an external ALI cohort demonstrated similar discrimination of the model as was observed with the ARDSNet validation dataset. However, there were substantial differences in observed versus predicted mortality among intermediate risk ALI patients. The ARDSNet model provided reasonable, but imprecise, estimates of predicted mortality when applied to our external validation cohort of ALI patients. PMID:21761595
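
As a reader's aid, the sketch below illustrates how the two quantities discussed in this abstract, discrimination (AUC) and calibration by risk-score stratum, can be computed from a cohort of scores and outcomes. It is not the ARDSNet analysis itself; the scores, outcomes, and predicted-mortality values are hypothetical.

```python
import numpy as np

def auc_mann_whitney(scores, died):
    """Rank-based AUC: probability a random death has a higher score than a random survivor."""
    scores, died = np.asarray(scores, float), np.asarray(died, bool)
    pos, neg = scores[died], scores[~died]
    # Count score pairs where the death outranks the survivor (ties count half).
    wins = sum((p > neg).sum() + 0.5 * (p == neg).sum() for p in pos)
    return wins / (len(pos) * len(neg))

# Hypothetical cohort: integer risk scores and in-hospital outcomes.
score = np.array([0, 1, 1, 2, 2, 3, 4, 0, 1, 2, 3, 4])
died  = np.array([0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1], dtype=bool)

print("AUC:", round(auc_mann_whitney(score, died), 3))

# Calibration: observed vs. (hypothetical) predicted mortality within each score stratum.
predicted = {0: 0.10, 1: 0.165, 2: 0.310, 3: 0.45, 4: 0.60}   # illustrative values only
for s in sorted(predicted):
    mask = score == s
    if mask.any():
        print(f"score={s}: observed={died[mask].mean():.2f} predicted={predicted[s]:.2f}")
```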

Damluji, Abdulla; Colantuoni, Elizabeth; Mendez-Tellez, Pedro A.; Sevransky, Jonathan E.; Fan, Eddy; Shanholtz, Carl; Wojnar, Margaret; Pronovost, Peter J.; Needham, Dale M.

2011-01-01

218

A validated approach for modeling collapse of steel structures  

NASA Astrophysics Data System (ADS)

A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are shown. The calibration is performed using a particle swarm optimization algorithm to establish accurate parameters when calibrated to circumferentially notched tensile coupons. It is shown that consistent, accurate predictions are attained using the chosen models. The variation of triaxiality in steel material during plastic hardening and softening is reported. The range of triaxiality in steel structures undergoing collapse is investigated in detail and the accuracy of the chosen finite element deletion approaches is discussed. This is done through validation of different structural components and structural frames undergoing severe fracture and collapse.
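
The abstract describes fracture criteria expressed as a function of stress triaxiality, with element deletion once accumulated damage is exhausted. The sketch below illustrates that general idea with a generic Johnson-Cook-style failure-strain curve and a linear damage accumulation rule; the constants and loading history are invented for illustration and are not the calibrated parameters of this work.

```python
import numpy as np

def triaxiality(sig):
    """Stress triaxiality eta = sigma_mean / sigma_vonMises for a 3x3 Cauchy stress tensor."""
    sig_m = np.trace(sig) / 3.0
    s = sig - sig_m * np.eye(3)                    # deviatoric part
    sig_eq = np.sqrt(1.5 * np.tensordot(s, s))     # von Mises equivalent stress
    return sig_m / sig_eq

def fracture_strain(eta, d1=0.05, d2=0.8, d3=-1.5):
    """Johnson-Cook-like failure strain as a function of triaxiality (illustrative constants)."""
    return d1 + d2 * np.exp(d3 * eta)

# Synthetic loading history at one integration point: (plastic strain increment, stress tensor).
sig_uniaxial = np.diag([400e6, 0.0, 0.0])          # ~uniaxial tension, eta = 1/3
history = [(0.02, sig_uniaxial)] * 40

D = 0.0                                            # accumulated damage
for d_eps_p, sig in history:
    D += d_eps_p / fracture_strain(triaxiality(sig))
    if D >= 1.0:
        print("damage reached 1.0 -> element would be deleted")
        break
print("final damage:", round(D, 3))
```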

Saykin, Vitaliy Victorovich

219

Validation of the WATEQ4 geochemical model for uranium  

SciTech Connect

As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.
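
For readers unfamiliar with speciation-solubility codes, the sketch below shows the saturation-index arithmetic (SI = log10(IAP/Ksp)) that underlies identifying solubility controls such as schoepite or rutherfordine; the activities and log Ksp used here are placeholders, not values from the WATEQ4 data base.

```python
import math

def saturation_index(ion_activity_product, log_k):
    """SI = log10(IAP / Ksp); SI near 0 means the water is close to equilibrium with the solid."""
    return math.log10(ion_activity_product) - log_k

# Hypothetical solid dissolving as  MX(s) <-> M(2+) + X(2-)  with placeholder activities and log Ksp.
a_M, a_X = 1.0e-6, 2.0e-5          # free-ion activities (illustrative)
log_ksp = -10.5                    # illustrative solubility product

iap = a_M * a_X
si = saturation_index(iap, log_ksp)
print(f"SI = {si:+.2f} ->", "supersaturated" if si > 0 else "undersaturated" if si < 0 else "at equilibrium")
```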

Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

1983-09-01

220

Experimental validation of flexible robot arm modeling and control  

NASA Technical Reports Server (NTRS)

Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

Ulsoy, A. Galip

1989-01-01

221

Validation Analysis of the Shoal Groundwater Flow and Transport Model  

SciTech Connect

Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of ground water flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of groundwater withdrawal activities in the area. The conceptual and numerical models were developed based upon regional hydrogeologic investigations conducted in the 1960s, site characterization investigations (including ten wells and various geophysical and geologic studies) at Shoal itself prior to and immediately after the test, and two site characterization campaigns in the 1990s for environmental restoration purposes (including eight wells and a year-long tracer test). The new wells are denoted MV-1, MV-2, and MV-3, and are located to the northnortheast of the nuclear test. The groundwater model was generally lacking data in the north-northeastern area; only HC-1 and the abandoned PM-2 wells existed in this area. The wells provide data on fracture orientation and frequency, water levels, hydraulic conductivity, and water chemistry for comparison with the groundwater model. A total of 12 real-number validation targets were available for the validation analysis, including five values of hydraulic head, three hydraulic conductivity measurements, three hydraulic gradient values, and one angle value for the lateral gradient in radians. In addition, the fracture dip and orientation data provide comparisons to the distributions used in the model and radiochemistry is available for comparison to model output. Goodness-of-fit analysis indicates that some of the model realizations correspond well with the newly acquired conductivity, head, and gradient data, while others do not. 
Other tests indicated that additional model realizations may be needed to test if the model input distributions need refinement to improve model performance. This approach (generating additional realizations) was not followed because it was realized that there was a temporal component to the data disconnect: the new head measurements are on the high side of the model distributions, but the heads at the original calibration locations themselves have also increased over time. This indicates that the steady-state assumption of the groundwater model is in error. To test the robustness of the model d

A. Hassan; J. Chapman

2008-11-01

222

Calibration and validation of DRAINMOD to model bioretention hydrology  

NASA Astrophysics Data System (ADS)

Previous field studies have shown that the hydrologic performance of bioretention cells varies greatly because of factors such as underlying soil type, physiographic region, drainage configuration, surface storage volume, drainage area to bioretention surface area ratio, and media depth. To more accurately describe bioretention hydrologic response, a long-term hydrologic model that generates a water balance is needed. Some current bioretention models lack the ability to perform long-term simulations and others have never been calibrated from field monitored bioretention cells with underdrains. All peer-reviewed models lack the ability to simultaneously perform both of the following functions: (1) model an internal water storage (IWS) zone drainage configuration and (2) account for soil-water content using the soil-water characteristic curve. DRAINMOD, a widely-accepted agricultural drainage model, was used to simulate the hydrologic response of runoff entering a bioretention cell. The concepts of water movement in bioretention cells are very similar to those of agricultural fields with drainage pipes, so many bioretention design specifications corresponded directly to DRAINMOD inputs. Detailed hydrologic measurements were collected from two bioretention field sites in Nashville and Rocky Mount, North Carolina, to calibrate and test the model. Each field site had two sets of bioretention cells with varying media depths, media types, drainage configurations, underlying soil types, and surface storage volumes. After 12 months, one of these characteristics was altered - surface storage volume at Nashville and IWS zone depth at Rocky Mount. At Nashville, during the second year (post-repair period), the Nash-Sutcliffe coefficients for drainage and exfiltration/evapotranspiration (ET) both exceeded 0.8 during the calibration and validation periods. During the first year (pre-repair period), the Nash-Sutcliffe coefficients for drainage, overflow, and exfiltration/ET ranged from 0.6 to 0.9 during both the calibration and validation periods. The bioretention cells at Rocky Mount included an IWS zone. For both the calibration and validation periods, the modeled volume of exfiltration/ET was within 1% and 5% of the estimated volume for the cells with sand (Sand cell) and sandy clay loam (SCL cell) underlying soils, respectively. Nash-Sutcliffe coefficients for the SCL cell during both the calibration and validation periods were 0.92.
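
The abstract reports model performance as Nash-Sutcliffe coefficients. A minimal sketch of that statistic, computed from paired observed and simulated series, is given below; the drainage volumes are illustrative, not data from the monitored cells.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Illustrative monthly drainage volumes (mm), not data from the monitored cells.
obs = np.array([12.0, 30.5, 8.2, 0.0, 45.1, 22.3, 5.0, 17.8])
sim = np.array([10.8, 33.0, 9.0, 0.5, 41.9, 24.0, 4.2, 16.5])

print("NSE =", round(nash_sutcliffe(obs, sim), 3))
```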

Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

2013-04-01

223

VALIDATION OF COMPUTER MODELS FOR RADIOACTIVE MATERIAL SHIPPING PACKAGES  

SciTech Connect

Computer models are abstractions of physical reality and are routinely used for solving practical engineering problems. These models are prepared using large complex computer codes that are widely used in the industry. Patran/Thermal is such a finite element computer code that is used for solving complex heat transfer problems in the industry. Finite element models of complex problems involve making assumptions and simplifications that depend upon the complexity of the problem and upon the judgment of the analysts. The assumptions involve mesh size, solution methods, convergence criteria, material properties, boundary conditions, etc. that could vary from analyst to analyst. All of these assumptions are, in fact, candidates for a purposeful and intended effort to systematically vary each in connection with the others to determine their relative importance or expected overall effect on the modeled outcome. These kinds of models derive from the methods of statistical science and are based on the principles of experimental designs. These, as all computer models, must be validated to make sure that the output from such an abstraction represents reality [1,2]. A new nuclear material packaging design, called 9977, which is undergoing a certification design review, is used to assess the capability of the Patran/Thermal computer model to simulate 9977 thermal response. The computer model for the 9977 package is validated by comparing its output with the test data collected from an actual thermal test performed on a full size 9977 package. Inferences are drawn by performing statistical analyses on the residuals (test data - model predictions).
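
A minimal sketch of the residual analysis mentioned at the end of the abstract, testing whether the mean of (test data minus model predictions) is statistically indistinguishable from zero, is shown below; the temperatures are invented for illustration.

```python
import numpy as np
from scipy import stats

# Illustrative thermocouple readings (deg C) and model predictions at the same locations.
test_data   = np.array([61.2, 74.8, 88.1, 95.4, 103.9, 110.2])
predictions = np.array([60.0, 76.1, 87.0, 96.8, 102.5, 112.0])

residuals = test_data - predictions
t_stat, p_value = stats.ttest_1samp(residuals, popmean=0.0)   # is the mean residual zero?

print(f"mean residual = {residuals.mean():+.2f} C, std = {residuals.std(ddof=1):.2f} C")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```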

Gupta, N; Gene Shine, G; Cary Tuckfield, C

2007-05-07

224

Rapid target gene validation in complex cancer mouse models using re-derived embryonic stem cells  

PubMed Central

Human cancers modeled in Genetically Engineered Mouse Models (GEMMs) can provide important mechanistic insights into the molecular basis of tumor development and enable testing of new intervention strategies. The inherent complexity of these models, with often multiple modified tumor suppressor genes and oncogenes, has hampered their use as preclinical models for validating cancer genes and drug targets. In our newly developed approach for the fast generation of tumor cohorts we have overcome this obstacle, as exemplified for three GEMMs: two lung cancer models and one mesothelioma model. Three elements are central for this system: (i) the efficient derivation of authentic Embryonic Stem Cells (ESCs) from established GEMMs, (ii) the routine introduction of transgenes of choice in these GEMM-ESCs by Flp recombinase-mediated integration and (iii) the direct use of the chimeric animals in tumor cohorts. By applying stringent quality controls, the GEMM-ESC approach proves to be a reliable and effective method to speed up cancer gene assessment and target validation. As proof-of-principle, we demonstrate that MycL1 is a key driver gene in Small Cell Lung Cancer. PMID:24401838

Huijbers, Ivo J; Bin Ali, Rahmen; Pritchard, Colin; Cozijnsen, Miranda; Kwon, Min-Chul; Proost, Natalie; Song, Ji-Ying; Vries, Hilda; Badhai, Jitendra; Sutherland, Kate; Krimpenfort, Paul; Michalak, Ewa M; Jonkers, Jos; Berns, Anton

2014-01-01

225

Rapid target gene validation in complex cancer mouse models using re-derived embryonic stem cells.  

PubMed

Human cancers modeled in Genetically Engineered Mouse Models (GEMMs) can provide important mechanistic insights into the molecular basis of tumor development and enable testing of new intervention strategies. The inherent complexity of these models, with often multiple modified tumor suppressor genes and oncogenes, has hampered their use as preclinical models for validating cancer genes and drug targets. In our newly developed approach for the fast generation of tumor cohorts we have overcome this obstacle, as exemplified for three GEMMs: two lung cancer models and one mesothelioma model. Three elements are central for this system: (i) the efficient derivation of authentic Embryonic Stem Cells (ESCs) from established GEMMs, (ii) the routine introduction of transgenes of choice in these GEMM-ESCs by Flp recombinase-mediated integration and (iii) the direct use of the chimeric animals in tumor cohorts. By applying stringent quality controls, the GEMM-ESC approach proves to be a reliable and effective method to speed up cancer gene assessment and target validation. As proof-of-principle, we demonstrate that MycL1 is a key driver gene in Small Cell Lung Cancer. PMID:24401838

Huijbers, Ivo J; Bin Ali, Rahmen; Pritchard, Colin; Cozijnsen, Miranda; Kwon, Min-Chul; Proost, Natalie; Song, Ji-Ying; de Vries, Hilda; Badhai, Jitendra; Sutherland, Kate; Krimpenfort, Paul; Michalak, Ewa M; Jonkers, Jos; Berns, Anton

2014-02-01

226

Coupled Thermal-Chemical-Mechanical Modeling of Validation Cookoff Experiments  

SciTech Connect

The cookoff of energetic materials involves the combined effects of several physical and chemical processes. These processes include heat transfer, chemical decomposition, and mechanical response. The interaction and coupling between these processes influence both the time-to-event and the violence of reaction. The prediction of the behavior of explosives during cookoff, particularly with respect to reaction violence, is a challenging task. To this end, a joint DoD/DOE program has been initiated to develop models for cookoff, and to perform experiments to validate those models. In this paper, a series of cookoff analyses are presented and compared with data from a number of experiments for the aluminized, RDX-based, Navy explosive PBXN-109. The traditional thermal-chemical analysis is used to calculate time-to-event and characterize the heat transfer and boundary conditions. A reaction mechanism based on Tarver and McGuire's work on RDX{sup 2} was adjusted to match the spherical one-dimensional time-to-explosion data. The predicted time-to-event using this reaction mechanism compares favorably with the validation tests. Coupled thermal-chemical-mechanical analysis is used to calculate the mechanical response of the confinement and the energetic material state prior to ignition. The predicted state of the material includes the temperature, stress-field, porosity, and extent of reaction. There is little experimental data for comparison to these calculations. The hoop strain in the confining steel tube gives an estimation of the radial stress in the explosive. The inferred pressure from the measured hoop strain and calculated radial stress agree qualitatively. However, validation of the mechanical response model and the chemical reaction mechanism requires more data. A post-ignition burn dynamics model was applied to calculate the confinement dynamics. The burn dynamics calculations suffer from a lack of characterization of the confinement for the flaw-dominated failure mode experienced in the tests. High-pressure burning rates are needed for more detailed post-ignition studies. Sub-models for chemistry, mechanical response and burn dynamics need to be validated against data from less complex experiments. The sub-models can then be used in integrated analysis for comparison with experimental data taken during integrated tests.

ERIKSON, WILLIAM W.; SCHMITT, ROBERT G.; ATWOOD, A.I.; CURRAN, P.D.

2000-11-27

227

Validation of High Displacement Piezoelectric Actuator Finite Element Models  

NASA Technical Reports Server (NTRS)

The paper presents the results obtained by using NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) finite element codes to predict doming of the THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) used different methods for modeling piezoelectric effects. In NASTRAN(Registered Trademark), a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS(Registered Trademark) processed the voltage directly using piezoelectric finite elements. The results of finite element models were validated by using the experimental results.
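
The thermal analogy mentioned for NASTRAN can be illustrated with a few lines of arithmetic: choosing an equivalent thermal-expansion coefficient so that alpha*dT reproduces the piezoelectric strain d31*E when the applied voltage is imposed as a "temperature". The coefficient and geometry below are generic, not those of the THUNDER actuator.

```python
# Thermal analogy for piezoelectric actuation: represent the applied voltage as a temperature
# load and pick an equivalent expansion coefficient so the induced strains match.
d31 = -190e-12          # piezoelectric strain coefficient, m/V (generic PZT-class value)
t_layer = 0.25e-3       # piezoceramic layer thickness, m
V = 300.0               # applied voltage, V

# Direct piezoelectric in-plane strain: d31 * E, with E = V / t.
strain_piezo = d31 * V / t_layer

# Analogy: set alpha_equiv = d31 / t and apply a "temperature" equal to the voltage.
alpha_equiv = d31 / t_layer
delta_T = V
strain_thermal = alpha_equiv * delta_T

print(f"piezoelectric strain: {strain_piezo:.3e}")
print(f"thermal-analogy strain: {strain_thermal:.3e}")   # identical by construction
```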

Taleghani, B. K.

2000-01-01

228

Leading compounds for the validation of animal models of psychopathology.  

PubMed

Modelling of complex psychiatric disorders, e.g., depression and schizophrenia, in animals is a major challenge, since they are characterized by certain disturbances in functions that are absolutely unique to humans. Furthermore, we still have not identified the genetic and neurobiological mechanisms, nor do we know precisely the circuits in the brain that function abnormally in mood and psychotic disorders. Consequently, the pharmacological treatments used are mostly variations on a theme that was started more than 50 years ago. Thus, progress in novel drug development with improved therapeutic efficacy would benefit greatly from improved animal models. Here, we review the available animal models of depression and schizophrenia and focus on the way that they respond to various types of potential candidate molecules, such as novel antidepressant or antipsychotic drugs, as an index of predictive validity. We conclude that the generation of convincing and useful animal models of mental illnesses could be a bridge to success in drug discovery. PMID:23942897

Micale, Vincenzo; Kucerova, Jana; Sulcova, Alexandra

2013-10-01

229

Validation of coupled atmosphere-fire behavior models  

SciTech Connect

Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States)]; Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States)]; Riggan, P.J. [Forest Service, Riverside, CA (United States)]

1998-12-31

230

A model for the separation of cloud and aerosol in SAGE II occultation data  

NASA Technical Reports Server (NTRS)

The Stratospheric Aerosol and Gas Experiment (SAGE) II satellite experiment measures the extinction due to aerosols and thin cloud, at wavelengths of 0.525 and 1.02 micrometers, down to an altitude of 6 km. The wavelength dependence of the extinction due to aerosols differs from that of the extinction due to cloud and is used as the basis of a model for separating these two components. The model is presented and its validation using airborne lidar data, obtained coincident with SAGE II observations, is described. This comparison shows that smaller SAGE II cloud extinction values correspond to the presence of subvisible cirrus cloud in the lidar record. Examples of aerosol and cloud data products obtained using this model to interpret SAGE II upper tropospheric and lower stratospheric data are also shown.
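
A toy illustration of the separation principle, namely that cloud extinction is nearly wavelength-neutral between 0.525 and 1.02 micrometers while aerosol extinction is not, is sketched below; the extinction values and the ratio threshold are illustrative and are not the parameters of the SAGE II algorithm.

```python
def classify_extinction(k_525, k_1020, ratio_threshold=2.0):
    """Toy cloud/aerosol flag from the 0.525/1.02 micrometer extinction ratio.

    Cloud particles are large, so their extinction is nearly wavelength-neutral
    (ratio near 1); stratospheric aerosol extinction falls off with wavelength
    (ratio well above 1). The threshold here is illustrative only.
    """
    ratio = k_525 / k_1020
    return "aerosol" if ratio > ratio_threshold else "cloud"

samples = [
    {"alt_km": 18.5, "k_525": 4.0e-4, "k_1020": 1.1e-4},   # aerosol-like spectral slope
    {"alt_km": 10.0, "k_525": 2.1e-3, "k_1020": 1.9e-3},   # nearly neutral -> thin cirrus
]
for s in samples:
    print(s["alt_km"], "km ->", classify_extinction(s["k_525"], s["k_1020"]))
```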

Kent, G. S.; Winker, D. M.; Osborn, M. T.; Skeens, K. M.

1993-01-01

231

Validation of thermal models for a prototypical MEMS thermal actuator.  

SciTech Connect

This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the polycrystalline silicon test structures, as well as uncontrolled nonuniform changes in this quantity over time and during operation.
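
To give a sense of the quantity being compared (the temperature profile along a Joule-heated microbeam), the sketch below solves a plain continuum 1D conduction balance with a linearized loss term to the surrounding gas. It deliberately omits the noncontinuum gas- and solid-phase corrections that are the subject of the milestone, and all property values are generic.

```python
import numpy as np

# 1D steady conduction in a Joule-heated microbeam with linearized heat loss to the gas:
#   k * A * d2T/dx2 - g * (T - T_amb) + q = 0,   T(0) = T(L) = T_amb (anchors act as heat sinks)
L, n = 200e-6, 101                      # beam length (m), number of nodes
dx = L / (n - 1)
k, A = 30.0, 2e-6 * 2e-6                # polysilicon conductivity (W/m/K), cross-section (m^2)
g = 5.0e-3                              # gas-loss conductance per unit length (W/m/K), illustrative
q = 0.5                                 # Joule heating per unit length (W/m), illustrative
T_amb = 300.0

# Assemble the tridiagonal finite-difference system for all nodes.
main = np.full(n, -2.0 * k * A / dx**2 - g)
off = np.full(n - 1, k * A / dx**2)
Kmat = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
rhs = np.full(n, -q - g * T_amb)

# Dirichlet anchors at both ends.
Kmat[0, :] = 0.0; Kmat[0, 0] = 1.0; rhs[0] = T_amb
Kmat[-1, :] = 0.0; Kmat[-1, -1] = 1.0; rhs[-1] = T_amb

T = np.linalg.solve(Kmat, rhs)
print(f"peak temperature rise: {T.max() - T_amb:.1f} K at mid-span")
```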

Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

2008-09-01

232

Deviatoric constitutive model: domain of strain rate validity  

SciTech Connect

A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for utilization in the solution of IBVPs that are of interest to those working in the field of shock physics, e.g. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.
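
One way to picture the "domain of strain rate validity" issue is a rate-dependent flow stress whose parameters were fit over a limited strain-rate window; evaluating it outside that window is extrapolation. The sketch below uses a generic logarithmic rate-hardening form with invented constants, not the model or parameters discussed in the paper.

```python
import numpy as np

def flow_stress(strain_rate, sigma0=350e6, C=0.02, rate_ref=1.0):
    """Generic rate-dependent flow stress: sigma = sigma0 * (1 + C * ln(rate / rate_ref)).

    Parameters of this kind are usually fit to data spanning a limited strain-rate
    window; evaluating the model outside that window is extrapolation.
    """
    return sigma0 * (1.0 + C * np.log(strain_rate / rate_ref))

calibration_window = (1e-3, 1e3)        # strain rates (1/s) covered by the (hypothetical) test data

for rate in (1e-2, 1e2, 1e6):
    inside = calibration_window[0] <= rate <= calibration_window[1]
    tag = "within calibrated range" if inside else "EXTRAPOLATION - treat with caution"
    print(f"rate = {rate:8.0e} 1/s  ->  flow stress = {flow_stress(rate)/1e6:6.1f} MPa  ({tag})")
```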

Zocher, Marvin A [Los Alamos National Laboratory]

2009-01-01

233

Volumetric Intraoperative Brain Deformation Compensation: Model Development and Phantom Validation  

PubMed Central

During neurosurgery, nonrigid brain deformation may affect the reliability of tissue localization based on preoperative images. To provide accurate surgical guidance in these cases, preoperative images must be updated to reflect the intraoperative brain. This can be accomplished by warping these preoperative images using a biomechanical model. Due to the possible complexity of this deformation, intraoperative information is often required to guide the model solution. In this paper, a linear elastic model of the brain is developed to infer volumetric brain deformation associated with measured intraoperative cortical surface displacement. The developed model relies on known material properties of brain tissue, and does not require further knowledge about intraoperative conditions. To provide an initial estimation of volumetric model accuracy, as well as determine the model’s sensitivity to the specified material parameters and surface displacements, a realistic brain phantom was developed. Phantom results indicate that the linear elastic model significantly reduced localization error due to brain shift, from >16 mm to under 5 mm, on average. In addition, though in vivo quantitative validation is necessary, preliminary application of this approach to images acquired during neocortical epilepsy cases confirms the feasibility of applying the developed model to in vivo data. PMID:22562728

DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.

2012-01-01

234

Parallel Measurement and Modeling of Transport in the Darht II Beamline on ETA II  

Microsoft Academic Search

To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data

F. W. Chambers; B. A. Raymond; S. Falabella; B. S. Lee; R. A. Richardson; J. T. Weir; H. A. Davis; M. E. Schultze

2005-01-01

235

Dynamic validation of the Planck-LFI thermal model  

NASA Astrophysics Data System (ADS)

The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its validation is therefore extremely important in the context of the Planck mission. Our analysis shows that the measured thermal properties of the instrument show a thermal damping level better than predicted, therefore further reducing the expected systematic effect induced in the LFI maps. We then propose an explanation of the increased damping in terms of non-ideal thermal contacts.
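
The thermal transfer functions sampled in these tests describe how much of a fluctuation at a given frequency survives to the focal plane. A minimal sketch of a single-pole (first-order low-pass) damping factor is given below; the time constant and fluctuation periods are invented for illustration.

```python
import numpy as np

def damping_factor(freq_hz, tau_s):
    """Magnitude of a single-pole thermal transfer function: |H(f)| = 1/sqrt(1 + (2*pi*f*tau)^2)."""
    return 1.0 / np.sqrt(1.0 + (2.0 * np.pi * freq_hz * tau_s) ** 2)

tau = 600.0                                # illustrative thermal time constant, s
for period_s in (60.0, 600.0, 3600.0):     # fluctuation periods of 1 min, 10 min, 1 hour
    f = 1.0 / period_s
    print(f"period {period_s:6.0f} s -> fluctuation transmitted at {100*damping_factor(f, tau):5.1f}%")
```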

Tomasi, M.; Cappellini, B.; Gregorio, A.; Colombo, F.; Lapolla, M.; Terenzi, L.; Morgante, G.; Bersanelli, M.; Butler, R. C.; Galeotta, S.; Mandolesi, N.; Maris, M.; Mennella, A.; Valenziano, L.; Zacchei, A.

2010-01-01

236

Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology  

PubMed Central

We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations. PMID:25493967

Krishnamoorthi, Shankarjee; Perotti, Luigi E.; Borgstrom, Nils P.; Ajijola, Olujimi A.; Frid, Anna; Ponnaluri, Aditya V.; Weiss, James N.; Qu, Zhilin; Klug, William S.; Ennis, Daniel B.; Garfinkel, Alan

2014-01-01

237

Low frequency eddy current benchmark study for model validation  

SciTech Connect

This paper presents results of an eddy current model validation study. Precise measurements were made using an impedance analyzer to investigate changes in impedance due to Electrical Discharge Machining (EDM) notches in aluminum plates. Each plate contained one EDM notch at an angle of 0, 10, 20, or 30 degrees from the normal of the plate surface. Measurements were made with the eddy current probe both scanning parallel and perpendicular to the notch length. The experimental response from the vertical and oblique notches will be reported and compared to results from different numerical simulation codes.

Mooers, R. D.; Boehnlein, T. R. [University of Dayton Research Institute, Structural Integrity Division, Dayton, OH, 45469 (United States)]; Cherry, M. R.; Knopp, J. S. [Air Force Research Lab, NDE Division, Wright Patterson, OH 45433 (United States)]; Aldrin, J. C. [Computational Tools, Gurnee, IL 60031 (United States)]; Sabbagh, H. A. [Victor Technologies LLC, Bloomington, IN 47401 (United States)]

2011-06-23

238

Bolted connection modeling and validation through laser-aided testing  

NASA Astrophysics Data System (ADS)

Bolted connections are widely employed in facility structures, such as light masts, transmission poles, and wind turbine towers. The complex connection behavior plays a significant role in the overall dynamic characteristics of a structure. A finite element (FE) modeling study of a bolt-connected square tubular steel beam is presented in this paper. Modal testing was performed in a controlled laboratory condition to validate the FE model, developed for the bolted beam. Two laser Doppler vibrometers were used simultaneously to measure structural vibration. A simplified joint model was proposed to further save computation time for structures with bolted connections. This study is an on-going effort to marshal knowledge associated with detecting damage on facility structures with bolted connections.
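
When an FE model is validated against modal tests, measured and predicted mode shapes are commonly compared with the Modal Assurance Criterion (MAC). The sketch below shows that calculation for a single mode; the mode-shape vectors are made up, and the abstract does not state which correlation metric the authors used.

```python
import numpy as np

def mac(phi_test, phi_fe):
    """Modal Assurance Criterion between two mode-shape vectors (1.0 = perfectly correlated)."""
    num = np.abs(phi_test @ phi_fe) ** 2
    return num / ((phi_test @ phi_test) * (phi_fe @ phi_fe))

# Illustrative first bending mode measured at five points vs. the FE prediction at the same points.
phi_measured = np.array([0.00, 0.38, 0.71, 0.92, 1.00])
phi_fe_model = np.array([0.00, 0.35, 0.69, 0.94, 1.00])

print("MAC =", round(mac(phi_measured, phi_fe_model), 4))
```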

Dai, Kaoshan; Gong, Changqing; Smith, Benjiamin

2013-04-01

239

Validation of GOCE densities and evaluation of thermosphere models  

NASA Astrophysics Data System (ADS)

Atmospheric densities from ESA’s GOCE satellite at a mean altitude of 270 km are validated by comparison with predictions from the near real time model HASDM along the GOCE orbit in the time frame 1 November 2009 through 31 May 2012. Except for a scale factor of 1.29, which is due to different aerodynamic models being used in HASDM and GOCE, the agreement is at the 3% (standard deviation) level when comparing daily averages. The models NRLMSISE-00, JB2008 and DTM2012 are compared with the GOCE data. They match at the 10% level, but significant latitude-dependent errors as well as errors with semiannual periodicity are detected. Using the 0.1 Hz sampled data leads to much larger differences locally, and this dataset can be used presently to analyze variations down to scales as small as 150 km.
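
The comparison described, a mean model-to-data scale factor and a few-percent standard deviation of daily averages, can be reproduced on synthetic numbers as follows; the densities below are generated, not GOCE data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily-mean densities (kg/m^3): the "model" runs ~1.29x the observations plus 3% scatter.
rho_obs = rng.uniform(1.5e-11, 4.0e-11, size=200)
rho_model = 1.29 * rho_obs * (1.0 + 0.03 * rng.standard_normal(200))

ratio = rho_model / rho_obs
scale_factor = ratio.mean()
scatter = ratio.std(ddof=1) / scale_factor          # relative standard deviation after rescaling

print(f"mean model/data scale factor: {scale_factor:.2f}")
print(f"relative scatter about the scale factor: {100*scatter:.1f}%")
```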

Bruinsma, S. L.; Doornbos, E.; Bowman, B. R.

2014-08-01

240

Image based validation of dynamical models for cell reorientation.  

PubMed

A key feature of directed cell movement is the ability of cells to reorient quickly in response to changes in the direction of an extracellular stimulus. Mathematical models have suggested quite different regulatory mechanisms to explain reorientation, raising the question of how we can validate these models in a rigorous way. In this study, we fit three reaction-diffusion models to experimental data of Dictyostelium amoebae reorienting in response to alternating gradients of mechanical shear flow. The experimental readouts we use to fit are spatio-temporal distributions of a fluorescent reporter for cortical F-actin labeling the cell front. Experiments performed under different conditions are fitted simultaneously to challenge the models with different types of cellular dynamics. Although the model proposed by Otsuji is unable to provide a satisfactory fit, those suggested by Meinhardt and Levchenko fit equally well. Further, we show that reduction of the three-variable Meinhardt model to a two-variable model also provides an excellent fit, but has the advantage of all parameters being uniquely identifiable. Our work demonstrates that model selection and identifiability analysis, commonly applied to temporal dynamics problems in systems biology, can be a powerful tool when extended to spatio-temporal imaging data. © 2014 The Authors. Published by Wiley Periodicals, Inc. PMID:25492625

Lockley, Robert; Ladds, Graham; Bretschneider, Till

2014-12-01

241

A validation study of a stochastic model of human interaction  

NASA Astrophysics Data System (ADS)

The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as Fermions behave in momentum space.
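
The Fermi-Dirac fitting referred to at the end of the abstract can be sketched with a standard nonlinear least-squares routine; the data below are synthetic and the parameter values are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_dirac(x, mu, kT):
    """Fermi-Dirac occupancy: 1 / (exp((x - mu)/kT) + 1)."""
    return 1.0 / (np.exp((x - mu) / kT) + 1.0)

# Synthetic "attitude position" data drawn from a Fermi-Dirac-shaped probability curve plus noise.
rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 60)
y = fermi_dirac(x, mu=0.4, kT=0.6) + 0.02 * rng.standard_normal(x.size)

(mu_hat, kT_hat), pcov = curve_fit(fermi_dirac, x, y, p0=[0.0, 1.0])
perr = np.sqrt(np.diag(pcov))                      # estimated standard errors of the parameters

print(f"mu = {mu_hat:.3f} +/- {perr[0]:.3f}")
print(f"kT = {kT_hat:.3f} +/- {perr[1]:.3f}")
```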

Burchfield, Mitchel Talmadge

242

Evaluation and cross-validation of Environmental Models  

NASA Astrophysics Data System (ADS)

Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a Commission of professional experts appointed by an established International Union or Association (e.g. IAGA for Geomagnetism and Aeronomy, ...) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given, indicating that different values for the Earth radius have been employed in different data processing laboratories, institutes or agencies, to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belts fluxes and for their mapping, differ from one space mission data center to the other, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more of them to be uncovered by careful, independent examination and benchmarking. A meter prototype, the standard unit of length, was determined on 20 May 1875, during the Diplomatic Conference of the Meter, and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken, to prevent wild, uncontrolled dissemination of pseudo Environmental Models and Standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the marketplace have been built consistently with the same units system, and that they are based on identical definitions for the coordinate systems, etc. Therefore, preliminary analyses should be carried out under the control and authority of an established international professional Organization or Association, before any final political decision is made by ISO to select specific Environmental Models, such as IGRF and DGRF. Of course, Commissions responsible for checking the consistency of definitions, methods and algorithms for data processing might consider delegating specific tasks (e.g. benchmarking the technical tools, the calibration procedures, the methods of data analysis, and the software algorithms employed in building the different types of models, as well as their usage) to private, intergovernmental or international organizations/agencies (e.g. NASA, ESA, AGU, EGU, COSPAR, ...); eventually, the latter should report conclusions to the Commission members appointed by IAGA or any established authority like IUGG.

Lemaire, Joseph

243

Cross-Cultural Validation of the Beck Depression Inventory-II across U.S. and Turkish Samples  

ERIC Educational Resources Information Center

The purpose of this study was to test the Beck Depression Inventory-II (BDI-II) for factorial invariance across Turkish and U.S. college student samples. The results indicated that (a) a two-factor model has an adequate fit for both samples, thus providing evidence of configural invariance, and (b) there is a metric invariance but "no" sufficient…

Canel-Cinarbas, Deniz; Cui, Ying; Lauridsen, Erica

2011-01-01

244

Validating a spatially distributed hydrological model with soil morphology data  

NASA Astrophysics Data System (ADS)

Spatially distributed models are popular tools in hydrology claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation is often carried out only on discharge time series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for inputs of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km² catchment in Switzerland with moderate topography and artificial drainage. We translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons. This resulted in a data set with high spatial coverage on which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to observed groundwater levels and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the groundwater level predictions were not accurate enough to be used for the prediction of saturated areas. Groundwater level dynamics were not adequately reproduced and the predicted spatial saturation patterns did not correspond to those estimated from the soil map. Our results indicate that an accurate prediction of the groundwater level dynamics of the shallow groundwater in our catchment that is subject to artificial drainage would require a model that better represents processes at the boundary between the unsaturated and the saturated zone. However, data needed for such a more detailed model are not generally available. This severely hampers the practical use of such models despite their usefulness for scientific purposes.

Doppler, T.; Honti, M.; Zihlmann, U.; Weisskopf, P.; Stamm, C.

2014-09-01

245

Metric attributes of the unified Parkinson's disease rating scale 3.0 battery: part II, construct and content validity.  

PubMed

This article is the second of a two-part series concerning the metric properties of the following three Parkinson's disease (PD) scales: modified Hoehn and Yahr staging (H&Y), Schwab and England (S&E), and Unified Parkinson's Disease Rating Scale (UPDRS) 3.0. Part II focuses on construct and content validity. To assess construct validity, a sample of 1,136 PD patients completed the above-mentioned PD scales. Correlation coefficients between measures of disability and dysfunction [S&E, UPDRS Activities of Daily Living (ADL), and UPDRS Motor Examination] were |r| = 0.69-0.77, indicating good convergent validity. Results showed that the S&E (F(5,945) = 193.47; P < 0.0001) and UPDRS subscales discriminated between modified H&Y stages (F(20,2784) = 25.28; P < 0.001). A panel of 12 to 13 international experts rated the relevance of the scales and items. This enabled the scales' content validity index to be calculated, which ranged from 41.7% (UPDRS Mentation) to 83.3% (UPDRS Motor Examination). In conclusion, while the modified H&Y, S&E, and UPDRS displayed satisfactory construct validity, the content validity of all scales except UPDRS Motor Examination failed to attain adequate standards. PMID:16958134

Forjaz, Maria João; Martinez-Martin, Pablo

2006-11-01

246

A geomagnetically induced current warning system: model development and validation  

NASA Astrophysics Data System (ADS)

Geomagnetically Induced Currents (GIC), which can flow in technological systems at the Earth's surface, are a consequence of magnetic storms and Space Weather. A well-documented practical problem for the power transmission industry is that GIC can affect the lifetime and performance of transformers within the power grid. Operational mitigation is widely considered to be one of the best strategies to manage the Space Weather and GIC risk. Therefore in the UK a magnetic storm warning and GIC monitoring and analysis programme has been under development by the British Geological Survey and Scottish Power plc (the power grid operator for Central Scotland) since 1999. Under the auspices of the European Space Agency's service development activities BGS is developing the capability to meet two key user needs that have been identified. These needs are, firstly, the development of a near real-time solar wind shock/geomagnetic storm warning, based on L1 solar wind data and, secondly, the development of an integrated surface geo-electric field and power grid network model that should allow prediction of GIC throughout the power grid in near real time. While the final goal is a "seamless package", the components of the package utilise diverse scientific techniques. We review progress to date with particular regard to the validation of the individual components of the package. The Scottish power grid response to the October 2003 magnetic storms is also discussed and model and validation data are presented.

McKay, A.; Clarke, E.; Reay, S.; Thomson, A.

247

Predictive validity of behavioural animal models for chronic pain  

PubMed Central

Rodent models of chronic pain may elucidate pathophysiological mechanisms and identify potential drug targets, but whether they predict clinical efficacy of novel compounds is controversial. Several potential analgesics have failed in clinical trials, in spite of strong animal modelling support for efficacy, but there are also examples of successful modelling. Significant differences in how methods are implemented and results are reported mean that a literature-based comparison between preclinical data and clinical trials will not reveal whether a particular model is generally predictive. Limited reporting of negative outcomes prevents a reliable estimate of the specificity of any model. Animal models tend to be validated with standard analgesics and may be biased towards tractable pain mechanisms. But preclinical publications rarely contain drug exposure data, and drugs are usually given in high doses and as a single administration, which may lead to drug distribution and exposure deviating significantly from clinical conditions. The greatest challenge for predictive modelling is, however, the heterogeneity of the target patient populations, in terms of both symptoms and pharmacology, probably reflecting differences in pathophysiology. In well-controlled clinical trials, a majority of patients shows less than 50% reduction in pain. A model that responds well to current analgesics should therefore predict efficacy only in a subset of patients within a diagnostic group. It follows that successful translation requires several models for each indication, reflecting critical pathophysiological processes, combined with data linking exposure levels with effect on target. LINKED ARTICLES This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4 PMID:21371010

Berge, Odd-Geir

2011-01-01

248

Literature-derived bioaccumulation models for earthworms: Development and validation  

SciTech Connect

Estimation of contaminant concentrations in earthworms is a critical component in many ecological risk assessments. Without site-specific data, literature-derived uptake factors or models are frequently used. Although considerable research has been conducted on contaminant transfer from soil to earthworms, most studies focus on only a single location. External validation of transfer models has not been performed. The authors developed a database of soil and tissue concentrations for nine inorganic and two organic chemicals. Only studies that presented total concentrations in depurated earthworms were included. Uptake factors and simple and multiple regression models of natural-log-transformed concentrations of each analyte in soil and earthworms were developed using data from 26 studies. These models were then applied to data from six additional studies. Estimated and observed earthworm concentrations were compared using nonparametric Wilcoxon signed-rank tests. Relative accuracy and quality of different estimation methods were evaluated by calculating the proportional deviation of the estimate from the measured value. With the exception of Cr, significant, single-variable (e.g., soil concentration) regression models were fit for each analyte. Inclusion of soil Ca improved model fits for Cd and Pb. Soil pH only marginally improved model fits. The best general estimates of chemical concentrations in earthworms were generated by simple ln-ln regression models for As, Cd, Cu, Hg, Mn, Pb, Zn, and polychlorinated biphenyls. No method accurately estimated Cr or Ni in earthworms. Although multiple regression models including pH generated better estimates for a few analytes, in general, the predictive utility gained by incorporating environmental variables was marginal.
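
A minimal sketch of the simple ln-ln regression form described above, with invented concentrations standing in for the compiled soil and tissue data:

# Fit ln(Cworm) = b0 + b1 * ln(Csoil) and apply it to an independent site,
# mirroring the development/validation split of the study. All numbers invented.
import numpy as np
from scipy import stats

c_soil = np.array([1.2, 3.5, 8.0, 15.0, 40.0, 120.0])    # mg/kg, hypothetical
c_worm = np.array([2.1, 4.8, 9.5, 14.0, 33.0, 70.0])     # mg/kg, hypothetical

slope, intercept, r, p, se = stats.linregress(np.log(c_soil), np.log(c_worm))
print(f"ln-ln model: ln(Cworm) = {intercept:.2f} + {slope:.2f} ln(Csoil), r^2 = {r**2:.2f}")

# External check: estimated vs. observed tissue concentrations at new sites,
# compared with a Wilcoxon signed-rank test as in the abstract.
c_soil_new = np.array([2.0, 6.0, 10.0, 25.0, 50.0])
c_worm_new_obs = np.array([3.2, 6.0, 10.5, 21.0, 38.0])
c_worm_new_est = np.exp(intercept + slope * np.log(c_soil_new))
print(stats.wilcoxon(c_worm_new_obs, c_worm_new_est))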

Sample, B.E.; Suter, G.W. II; Beauchamp, J.J.; Efroymson, R.A.

1999-09-01

249

Assessing uncertainty in pollutant wash-off modelling via model validation.  

PubMed

Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that, unlike water quantity data, available water quality datasets in urban areas span only relatively short time scales, which limits the applicability of the developed models in the engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size which hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies. PMID:25169872
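
A minimal sketch contrasting LOO with MCCV on a small synthetic wash-off data set; the linear wash-off relation and all numbers are placeholders, not the paper's model:

# Compare leave-one-out and Monte Carlo (repeated random split) cross validation
# on a deliberately small sample, as in the situation described above.
import numpy as np
from sklearn.model_selection import LeaveOneOut, ShuffleSplit
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
rain_intensity = rng.uniform(5, 80, size=15).reshape(-1, 1)          # mm/h, hypothetical
washoff = 0.04 * rain_intensity.ravel() + rng.normal(0, 0.3, 15)     # g/m2, hypothetical

def cv_error(splitter):
    errors = []
    for train, test in splitter.split(rain_intensity):
        model = LinearRegression().fit(rain_intensity[train], washoff[train])
        errors.append(mean_squared_error(washoff[test], model.predict(rain_intensity[test])))
    return np.mean(errors)

loo = LeaveOneOut()
mccv = ShuffleSplit(n_splits=200, test_size=0.3, random_state=0)
print(f"LOO  mean squared error: {cv_error(loo):.3f}")
print(f"MCCV mean squared error: {cv_error(mccv):.3f}")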

Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

2014-11-01

250

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical "models." These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. (Westinghouse Savannah River Co., Aiken, SC (United States)); Chen, F.F.K. (Bechtel National, Inc., San Francisco, CA (United States))

1993-01-01

251

NAIRAS aircraft radiation model development, dose climatology, and initial validation  

NASA Astrophysics Data System (ADS)

The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.

Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

2013-10-01

252

Conformational Analysis of the DFG-Out Kinase Motif and Biochemical Profiling of Structurally Validated Type II Inhibitors.  

PubMed

Structural coverage of the human kinome has been steadily increasing over time. The structures provide valuable insights into the molecular basis of kinase function and also provide a foundation for understanding the mechanisms of kinase inhibitors. There are a large number of kinase structures in the PDB for which the Asp and Phe of the DFG motif on the activation loop swap positions, resulting in the formation of a new allosteric pocket. We refer to these structures as "classical DFG-out" conformations in order to distinguish them from conformations that have also been referred to as DFG-out in the literature but that do not have a fully formed allosteric pocket. We have completed a structural analysis of almost 200 small molecule inhibitors bound to classical DFG-out conformations; we find that they are recognized by both type I and type II inhibitors. In contrast, we find that nonclassical DFG-out conformations strongly select against type II inhibitors because these structures have not formed a large enough allosteric pocket to accommodate this type of binding mode. In the course of this study we discovered that the number of structurally validated type II inhibitors that can be found in the PDB and that are also represented in publicly available biochemical profiling studies of kinase inhibitors is very small. We have obtained new profiling results for several additional structurally validated type II inhibitors identified through our conformational analysis. Although the available profiling data for type II inhibitors is still much smaller than for type I inhibitors, a comparison of the two data sets supports the conclusion that type II inhibitors are more selective than type I. We comment on the possible contribution of the DFG-in to DFG-out conformational reorganization to the selectivity. PMID:25478866

Vijayan, R S K; He, Peng; Modi, Vivek; Duong-Ly, Krisna C; Ma, Haiching; Peterson, Jeffrey R; Dunbrack, Roland L; Levy, Ronald M

2015-01-01

253

Development and validation of a realistic head model for EEG  

NASA Astrophysics Data System (ADS)

The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain and thus understanding how current flows within the brain as a result of addition of structures in the forward model. Validation compares calculations using different forward models to the experimental results obtained by stimulating with implanted dipole electrodes. The human brain tissue displays inhomogeneity in electrical conductivity and also displays anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of the brain tissue type is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and computed x-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients present the unique opportunity to generate sources at known positions in the human brain using the depth electrodes. Known dipolar sources were created inside the human brain at known locations by injecting a weak biphasic current (sub-threshold) between alternate contacts on the depth electrode. The corresponding bioelectric fields (intracranial and scalp EEG) were recorded in patients during the injection of biphasic pulses. The in vivo depth stimulation data provides a direct test of the performance of the forward model. The factors affecting the accuracy of the intracranial measurements are quantified in a precise manner by studying the effects of including different tissue types and anisotropy. The results show that white matter anisotropy is crucial for predicting the electric fields in a precise manner for intracranial locations, thereby affecting the source reconstructions. Accurate modeling of the skull is necessary for predicting accurately the scalp measurements. In sum, with the aid of high-resolution finite element realistic head models it is possible to accurately predict electric fields generated by current sources in the brain and thus in a precise way, understand the relationship between electromagnetic measure and neuronal activity at the voxel-scale.
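
For orientation only, the simplest conceivable forward solution is the potential of a current dipole in an unbounded homogeneous conductor; the sketch below shows that baseline, which omits exactly the skull, CSF, and white-matter anisotropy effects the finite element head model described above is built to capture (positions, moment, and conductivity are hypothetical):

# Simplest possible EEG forward model: potential of a current dipole in an
# infinite homogeneous conductor, V(r) = p . (r - r0) / (4 pi sigma |r - r0|^3).
import numpy as np

def dipole_potential(r, r0, p, sigma=0.33):
    # r: (N, 3) sensor positions [m]; r0: dipole location [m]; p: moment [A m];
    # sigma: conductivity [S/m]
    d = r - r0
    dist = np.linalg.norm(d, axis=1)
    return (d @ p) / (4.0 * np.pi * sigma * dist ** 3)

electrodes = np.array([[0.00, 0.00, 0.090],      # hypothetical scalp positions [m]
                       [0.05, 0.00, 0.075],
                       [0.00, 0.05, 0.075]])
source = np.array([0.0, 0.0, 0.05])              # dipole 5 cm above the head centre
moment = np.array([0.0, 0.0, 20e-9])             # 20 nA m, radially oriented
print(dipole_potential(electrodes, source, moment))   # potentials in volts

A realistic head model replaces this closed-form kernel with a finite element solution on the segmented, anisotropic tissue geometry; the validation question is how much that replacement changes the predicted potentials.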

Bangera, Nitin Bhalchandra

254

First principles Candu fuel model and validation experimentation  

SciTech Connect

Many modeling projects on nuclear fuel rest on a quantitative understanding of the co-existing phases at various stages of burnup. Since the various fission products have considerably different abilities to chemically associate with oxygen, and the O/M ratio is slowly changing as well, the chemical potential (generally expressed as an equivalent oxygen partial pressure) is a function of burnup. Concurrently, well-recognized small fractions of new phases such as inert gas, noble metals, zirconates, etc. also develop. To further complicate matters, the dominant UO{sub 2} fuel phase may be non-stoichiometric and most of minor phases have a variable composition dependent on temperature and possible contact with the coolant in the event of a sheathing defect. A Thermodynamic Fuel Model to predict the phases in partially burned Candu nuclear fuel containing many major fission products has been under development. This model is capable of handling non-stoichiometry in the UO{sub 2} fluorite phase, dilute solution behaviour of significant solute oxides, noble metal inclusions, a second metal solid solution U(Pd-Rh-Ru)3, zirconate and uranate solutions as well as other minor solid phases, and volatile gaseous species. The treatment is a melding of several thermodynamic modeling projects dealing with isolated aspects of this important multi-component system. To simplify the computations, the number of elements has been limited to twenty major representative fission products known to appear in spent fuel. The proportion of elements must first be generated using SCALES-5. Oxygen is inferred from the concentration of the other elements. Provision to study the disposition of very minor fission products is included within the general treatment but these are introduced only on an as needed basis for a particular purpose. The building blocks of the model are the standard Gibbs energies of formation of the many possible compounds expressed as a function of temperature. To these data are added mixing terms associated with the appearance of the component species in particular phases. To validate the model, coulometric titration experiments relating the chemical potential of oxygen to the moles of oxygen introduced to SIMFUEL are underway. A description of the apparatus to oxidize and reduce samples in a controlled way in a H{sub 2}/H{sub 2}O mixture is presented. Existing measurements for irradiated fuel, both defected and non-defected, are also being incorporated into the validation process. (authors)

Corcoran, E.C.; Kaye, M.H.; Lewis, B.J.; Thompson, W.T. [Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Akbari, F. [Atomic Energy of Canada Limited - Chalk River Ontario, Ontario KOJ IJ0 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Higgs, J.D. [Atomic Energy of Canada Limited - 430 Bayside Drive, Saint John, NB E2J 1A8 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Verrall, R.A.; He, Z.; Mouris, J.F. [Atomic Energy of Canada Limited - Chalk River Laboratories, Chalk River Ontario, Ontario KOJ IJ0 (Canada)

2007-07-01

255

Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.  

SciTech Connect

A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model to represent experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and its (code) verification are presented. Experimental data from two activities are used to validate mathematical models. The first experiment assesses the chemistry model alone and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

2004-10-01

256

Using remote sensing for validation of a large scale hydrologic and hydrodynamic model in the Amazon  

NASA Astrophysics Data System (ADS)

We present the validation of the large-scale, catchment-based hydrological MGB-IPH model in the Amazon River basin. In this model, physically-based equations are used to simulate the hydrological processes, such as the Penman Monteith method to estimate evapotranspiration, or the Moore and Clarke infiltration model. A new feature recently introduced in the model is a 1D hydrodynamic module for river routing. It uses the full Saint-Venant equations and a simple floodplain storage model. River and floodplain geometry parameters are extracted from SRTM DEM using specially developed GIS algorithms that provide catchment discretization, estimation of river cross-sections geometry and water storage volume variations in the floodplains. The model was forced using satellite-derived daily rainfall TRMM 3B42, calibrated against discharge data and first validated using daily discharges and water levels from 111 and 69 stream gauges, respectively. Then, we performed a validation against remote sensing derived hydrological products, including (i) monthly Terrestrial Water Storage (TWS) anomalies derived from GRACE, (ii) river water levels derived from ENVISAT satellite altimetry data (212 virtual stations from Santos da Silva et al., 2010) and (iii) a multi-satellite monthly global inundation extent dataset at ~25 x 25 km spatial resolution (Papa et al., 2010). Validation against river discharges shows good performance of the MGB-IPH model. For 70% of the stream gauges, the Nash and Sutcliffe efficiency index (ENS) is higher than 0.6, and at Óbidos, close to the Amazon river outlet, ENS equals 0.9 and the model bias equals -4.6%. The largest errors are located in drainage areas outside Brazil, and we speculate that this is due to the poor quality of rainfall datasets in these poorly monitored and/or mountainous areas. Validation against water levels shows that the model performs well in the major tributaries. For 60% of virtual stations, ENS is higher than 0.6. But, similarly, the largest errors are also located in drainage areas outside Brazil, mostly the Japurá River, and in the lower Amazon River. In the latter, correlation with observations is high but the model underestimates the amplitude of water levels. We also found a large bias between model and ENVISAT water levels, ranging from -3 to -15 m. The model provided TWS in good accordance with GRACE estimates. The ENS value for TWS over the whole Amazon equals 0.93. We also analyzed results in 21 sub-regions of 4 x 4°. ENS is smaller than 0.8 only in 5 areas, and these are found mostly in the northwest part of the Amazon, possibly due to the same errors reported in discharge results. Flood extent validation is under development, but a previous analysis in the Brazilian part of the Solimões River basin suggests good model performance. The authors are grateful for the financial and operational support from the Brazilian agencies FINEP, CNPq and ANA and from the French observatories HYBAM and SOERE RBV.
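
The Nash-Sutcliffe efficiency used as the skill score above can be computed as follows (discharge values are invented):

# ENS = 1 for a perfect simulation; ENS < 0 means the simulation is worse
# than simply using the observed mean.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# hypothetical daily discharges (m3/s) at a gauge
q_obs = np.array([1200., 1500., 1800., 2100., 1900., 1600., 1400.])
q_sim = np.array([1100., 1450., 1900., 2000., 1850., 1700., 1300.])
print(f"ENS = {nash_sutcliffe(q_obs, q_sim):.2f}")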

Paiva, R. C.; Bonnet, M.; Buarque, D. C.; Collischonn, W.; Frappart, F.; Mendes, C. B.

2011-12-01

257

Long-term ELBARA-II Assistance to SMOS Land Product and Algorithm Validation at the Valencia Anchor Station (MELBEX Experiment 2010-2013)  

NASA Astrophysics Data System (ADS)

The main activity of the Valencia Anchor Station (VAS) is currently to support the validation of SMOS (Soil Moisture and Ocean Salinity) Level 2 and 3 land products (soil moisture, SM, and vegetation optical depth, TAU). With this aim, the European Space Agency (ESA) has provided the Climatology from Satellites Group of the University of Valencia with an ELBARA-II microwave radiometer under a loan agreement since September 2009. During this time, brightness temperatures (TB) have continuously been acquired, except during normal maintenance or minor repair interruptions. ELBARA-II is an L-band dual-polarization radiometer with two channels (1400-1418 MHz, 1409-1427 MHz). It is continuously measuring over a vineyard field (El Renegado, Caudete de las Fuentes, Valencia) from a 15 m platform with a constant protocol for calibration and angular scanning measurements, with the aim of assisting the validation of SMOS land products and the calibration of the L-MEB (L-Band Emission of the Biosphere) model, the basis for the SMOS Level 2 Land Processor, over the VAS validation site. One of the advantages of using the VAS site is the possibility of studying two different environmental conditions over the year. While the vine cycle extends mainly between April and October, during the rest of the year the area remains under bare soil conditions, adequate for the calibration of the soil model. The measurement protocol currently running has proven to be robust during the whole operation time and will be extended in time as much as possible to continue providing a long-term data set of ELBARA-II TB measurements and retrieved SM and TAU. This data set is also proving to be useful in support of SMOS scientific activities: the VAS area and, specifically, the ELBARA-II site offer good conditions to control the long-term evolution of SMOS Level 2 and Level 3 land products and interpret eventual anomalies that may obscure sensor hidden biases. In addition, SM and TAU that are currently retrieved from the ELBARA-II TB data by inversion of the L-MEB model can also be compared to the Level 2 and Level 3 SMOS products. L-band ELBARA-II measurements provide area-integrated estimations of SM and TAU that are much more representative of the soil and vegetation conditions at field scale than ground measurements (from capacitive probes for SM and destructive measurements for TAU). For instance, Miernecki et al. (2012) and Wigneron et al. (2012) showed that very good correlations could be obtained from TB data and SM retrievals obtained from both SMOS and ELBARA-II over the 2010-2011 time period. The analysis of the quality of these correlations over a long time period can be very useful to evaluate the SMOS measurements and retrieved products (Level 2 and 3). The present work, which extends the analysis over almost 4 years (2010-2013), emphasizes the need to (i) maintain the long-term record of ELBARA-II measurements and (ii) enhance as much as possible the control over other parameters, especially soil roughness (SR), vegetation water content (VWC) and surface temperature, to interpret the retrieved results obtained from both SMOS and ELBARA-II instruments.
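
The zeroth-order tau-omega emission model on which L-MEB is based can be sketched as below; parameter values are illustrative only, and the full L-MEB processor adds roughness, angular, and polarization refinements not shown here.

# Forward tau-omega model: given soil reflectivity r_p, vegetation optical depth
# tau and single-scattering albedo omega, return L-band brightness temperature
# at incidence angle theta. SM and TAU retrieval inverts this against measured TB.
import numpy as np

def tau_omega_tb(t_soil, t_veg, r_p, tau, omega, theta_deg):
    gamma = np.exp(-tau / np.cos(np.radians(theta_deg)))    # vegetation transmissivity
    tb_soil = (1.0 - r_p) * t_soil * gamma                   # attenuated soil emission
    tb_veg = (1.0 - omega) * (1.0 - gamma) * t_veg * (1.0 + r_p * gamma)
    return tb_soil + tb_veg                                   # direct + reflected vegetation emission

# hypothetical conditions over the vineyard site
print(tau_omega_tb(t_soil=295.0, t_veg=293.0, r_p=0.25, tau=0.12, omega=0.05, theta_deg=40.0))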

Lopez-Baeza, Ernesto; Wigneron, Jean-Pierre; Schwank, Mike; Miernecki, Maciej; Kerr, Yann; Casal, Tania; Delwart, Steven; Fernandez-Moran, Roberto; Mecklenburg, Susanne; Coll Pajaron, M. Amparo; Salgado Hernanz, Paula

258

Utilizing Chamber Data for Developing and Validating Climate Change Models  

NASA Technical Reports Server (NTRS)

Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers were found to overestimate responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD, etc.) and this limits what can be measured accurately. Chambers can be used to measure canopy-level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in such a manner as to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

Monje, Oscar

2012-01-01

259

A Data set for the Validation of Reflectance Models  

NASA Astrophysics Data System (ADS)

Three mature forest stands in the CHRIS scene of the Järvselja forest test site in Estonia have been selected for the validation of forest reflectance models. CHRIS data are supported by measurements of downward spectral fluxes and airborne reflectance measurements at the test site. Rigorous atmospheric correction of CHRIS data has been performed based on the use of spectral measurements at the test site and AERONET sun-photometer data. Airborne measurements are used for the updating of CHRIS calibration coefficients. The ground truth measurements include data on stand structure - exact positions and breast-height diameter of trees, tree crown dimensions, LAI-2000 data and hemispherical images of tree canopy. Reflectance spectra (400-1050 nm) of leaves and needles, of trunk and branch bark, and of ground vegetation have been measured.

Kuusk, A.; Kuusk, J.; Lang, M.

2008-08-01

260

Validation of atmospheric propagation models in littoral waters  

NASA Astrophysics Data System (ADS)

Various atmospheric propagation effects are limiting the long-range performance of electro-optical imaging systems. These effects include absorption and scattering by molecules and aerosols, refraction due to vertical temperature gradients, and scintillation and blurring due to turbulence. In maritime and coastal areas, ranges up to 25 km are relevant for detection and classification tasks on small targets (missiles, pirates). From November 2009 to October 2010 a measurement campaign was set up over a range of more than 15 km in the False Bay in South Africa, where all of the propagation effects could be investigated quantitatively. The results have been used to provide statistical information on basic parameters such as visibility, air-sea temperature difference, absolute humidity and wind speed. In addition, various propagation models on aerosol particle size distribution, temperature profile, blur and scintillation under strong turbulence conditions could be validated. Examples of collected data and associated results are presented in this paper.

de Jong, Arie N.; Schwering, Piet B. W.; van Eijk, Alexander M. J.; Gunter, Willem H.

2013-04-01

261

Validation of a CFD model for predicting film cooling performance  

NASA Astrophysics Data System (ADS)

The validation of a CFD code for predicting supersonic, tangential injection film cooling performance is presented. Three different experimental film cooling studies have been simulated using the MDNS3D CFD code, and results are shown for comparison with the experimental data. MDNS3D is a Reynolds Averaged Navier-Stokes code with fully coupled k-epsilon turbulence and finite rate chemical kinetics models. Free shear layer flow fields with both chemically reacting and nonreacting coolant layers are examined. Test case one simulates nitrogen coolant injection over a recessed window on a 3D interceptor forebody. Test case two involves helium coolant injection into an air freestream. Test case three simulates highly reactive N2O4/NO2 coolant injection over a flat plate with an external arcjet onset flow. The results presented demonstrate the capability of the CFD code to accurately predict film cooling performance for a variety of complex flow configurations.

Ward, S. C.; Fricker, D. A.; Lee, R. S.

1993-06-01

262

Circulation Control Model Experimental Database for CFD Validation  

NASA Technical Reports Server (NTRS)

A 2D circulation control wing was tested in the Basic Aerodynamic Research Tunnel at the NASA Langley Research Center. A traditional circulation control wing employs tangential blowing along the span over a trailing-edge Coanda surface for the purpose of lift augmentation. This model has been tested extensively at the Georgia Tech Research Institute for the purpose of performance documentation at various blowing rates. The current study seeks to expand on the previous work by documenting additional flow-field data needed for validation of computational fluid dynamics. Two jet momentum coefficients were tested during this entry: 0.047 and 0.114. Boundary-layer transition was investigated and turbulent boundary layers were established on both the upper and lower surfaces of the model. Chordwise and spanwise pressure measurements were made, and tunnel sidewall pressure footprints were documented. Laser Doppler Velocimetry measurements were made on both the upper and lower surface of the model at two chordwise locations (x/c = 0.8 and 0.9) to document the state of the boundary layers near the spanwise blowing slot.

Paschal, Keith B.; Neuhart, Danny H.; Beeler, George B.; Allan, Brian G.

2012-01-01

263

Validation of an empirical model for photosynthetically active radiation  

NASA Astrophysics Data System (ADS)

Knowledge of the photosynthetically active radiation is necessary in different applications dealing with plant physiology, biomass production and natural illumination in greenhouses. Nevertheless, as a result of the absence of widespread measurements of this radiometric flux, it is often calculated as a constant ratio of the broadband solar radiation. This ratio is affected by many parameters. In a previous study, the authors analysed the global horizontal component of this flux. In this work, they validate the model against two independent data sets, one acquired at the same place where the previous model was developed and the other acquired at another location characterised by different climatic conditions. The first one is located at the University of Almería, a seashore location (36.83°N, 2.41°W, 20 m a.m.s.l.), while the second one is located at Granada (37.18°N, 3.58°W, 660 m a.m.s.l.), an inland location. The database includes hourly values of the relevant variables that cover the years 1993 and 1994 in Almería and 1994 and 1995 in Granada. The use of data sets registered in two different climatic conditions allows for the verification of the local independence of the proposed technique. The proposed models can provide estimates of the photosynthetically active radiation with negligible bias and root mean square deviations close to instrumental errors.
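
The baseline approach the study improves upon, treating PAR as a constant fraction of broadband global irradiance, can be written in a few lines (the ratio value here is only an illustrative assumption):

# PAR as a constant fraction of global solar irradiance; the paper's model
# refines this ratio using other measured variables.
import numpy as np

PAR_FRACTION = 0.46            # illustrative ratio of PAR to global irradiance

def par_from_global(global_irradiance_wm2, fraction=PAR_FRACTION):
    """Estimate photosynthetically active radiation (W m-2) from global irradiance."""
    return fraction * np.asarray(global_irradiance_wm2)

hourly_global = [120.0, 350.0, 610.0, 780.0, 640.0]     # W m-2, hypothetical
print(par_from_global(hourly_global))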

Alados, I.; Alados-Arboledas, L.

1999-08-01

264

Validation of an Acoustic Impedance Prediction Model for Skewed Resonators  

NASA Technical Reports Server (NTRS)

An impedance prediction model was validated experimentally to determine the composite impedance of a series of high-aspect ratio slot resonators incorporating channel skew and sharp bends. Such structures are useful for packaging acoustic liners into constrained spaces for turbofan noise control applications. A formulation of the Zwikker-Kosten Transmission Line (ZKTL) model, incorporating the Richards correction for rectangular channels, is used to calculate the composite normalized impedance of a series of six multi-slot resonator arrays with constant channel length. Experimentally, acoustic data was acquired in the NASA Langley Normal Incidence Tube over the frequency range of 500 to 3500 Hz at 120 and 140 dB OASPL. Normalized impedance was reduced using the Two-Microphone Method for the various combinations of channel skew and sharp 90° and 180° bends. Results show that the presence of skew and/or sharp bends does not significantly alter the impedance of a slot resonator as compared to a straight resonator of the same total channel length. ZKTL predicts the impedance of such resonators very well over the frequency range of interest. The model can be used to design arrays of slot resonators that can be packaged into complex geometries heretofore unsuitable for effective acoustic treatment.
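
A sketch of the Two-Microphone Method reduction mentioned above, in the ISO 10534-2 style; the microphone geometry and transfer-function value are placeholders, and this is not the NASA data-reduction code itself:

# From the complex transfer function H12 between two microphones in a
# normal-incidence tube, compute the reflection coefficient and the
# normalized surface impedance of the test sample.
import numpy as np

def normalized_impedance(h12, freq, x1, s, c=343.0):
    k = 2 * np.pi * freq / c                     # wavenumber
    r = (h12 - np.exp(-1j * k * s)) / (np.exp(1j * k * s) - h12) * np.exp(2j * k * x1)
    return (1 + r) / (1 - r)                      # z normalized by rho*c

# hypothetical measurement: mic 1 at x1 = 40 mm from the sample, spacing s = 20 mm
z = normalized_impedance(h12=0.6 - 0.4j, freq=1500.0, x1=0.040, s=0.020)
print(f"normalized impedance: {z.real:.2f} + {z.imag:.2f}j")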

Howerton, Brian M.; Parrott, Tony L.

2009-01-01

265

Validation model for the transient analysis of tightly coupled reactors  

SciTech Connect

Both the static and transient analysis of tightly coupled reactors differ from those of the loosely coupled systems. In these reactors, highly absorbing regions are interspaced with low absorbing regions. That raises questions of the acceptability of diffusion theory approximations. Also, the spectral shapes change drastically throughout the core and can be altered significantly by perturbations. Accurate analysis requires at least two-dimensional, multigroup transport methods. Although such methods can be applied to static cases, for transient analysis they would be almost impossibly expensive. Recently, a transient nodal model accounting for transport corrections has been developed for tightly coupled reactors. In this model, few-group, node-averaged cross sections and discontinuity factors are edited from full-core, higher order reference results such as Monte Carlo or fine-mesh, multigroup, discrete ordinate transport solutions for various conditions expected during transients. Tables of nodal parameters are constructed, and their values as the transient proceeds are found by interpolation. Although the static part of this few-group model can be tested easily by comparing nodal results with the reference transport solution, without a time-dependent transport code (at least a two-dimensional, multigroup, discrete ordinate code), doing the analogous validation for the time-dependent problem is not possible.

Bahadir, T.; Henry, A.F. [Massachusetts Inst. of Technology, Cambridge, MA (United States)

1996-12-31

266

Ptolemy II: Heterogeneous Concurrent Modeling And Design In Java  

Microsoft Academic Search

This document describes the design and implementation of Ptolemy II 2.0.1. Ptolemy II is a set of Java packages supporting heterogeneous, concurrent modeling and design. The focus is on assembly of concurrent components. The key underlying principle in Ptolemy II is the use of well-defined models of computation that govern the interaction between components. A major problem area that

Christopher Hylands; Edward A. Lee; Jie Liu; Xiaojun Liu; Steve Neuendorffer; Yuhong Xiong

2001-01-01

267

Validation of a modified clinical risk score to predict cancer-specific survival for stage II colon cancer  

PubMed Central

Many patients with stage II colon cancer will die of their disease despite curative surgery. Therefore, identification of patients at high risk of poor outcome after surgery for stage II colon cancer is desirable. This study aims to validate a clinical risk score to predict cancer-specific survival in patients undergoing surgery for stage II colon cancer. Patients undergoing surgery for stage II colon cancer in 16 hospitals in the West of Scotland between 2001 and 2004 were identified from a prospectively maintained regional clinical audit database. Overall and cancer-specific survival rates up to 5 years were calculated. A total of 871 patients were included. At 5 years, cancer-specific survival was 81.9% and overall survival was 65.6%. On multivariate analysis, age ≥75 years (hazard ratio (HR) 2.11, 95% confidence interval (CI) 1.57–2.85; P<0.001) and emergency presentation (HR 1.97, 95% CI 1.43–2.70; P<0.001) were independently associated with cancer-specific survival. Age and mode of presentation HRs were added to form a clinical risk score of 0–2. The cancer-specific survival at 5 years for patients with a cumulative score of 0 was 88.7%, score 1 was 78.2%, and score 2 was 65.9%. These results validate a modified simple clinical risk score for patients undergoing surgery for stage II colon cancer. The combination of these two universally documented clinical factors provides a solid foundation for the examination of the impact of additional clinicopathological and treatment factors on overall and cancer-specific survival. PMID:25487740
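
The two-point score itself is trivial to compute; the sketch below applies it to invented patients, using the survival figures quoted in the abstract:

# One point for age >= 75 years, one point for emergency presentation.
def stage2_colon_risk_score(age_years: int, emergency_presentation: bool) -> int:
    return int(age_years >= 75) + int(emergency_presentation)

# 5-year cancer-specific survival reported for each score in the abstract
survival_by_score = {0: 0.887, 1: 0.782, 2: 0.659}

for age, emergency in [(68, False), (79, False), (81, True)]:
    score = stage2_colon_risk_score(age, emergency)
    print(f"age {age}, emergency={emergency}: score {score}, "
          f"reported 5-yr cancer-specific survival {survival_by_score[score]:.1%}")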

Oliphant, Raymond; Horgan, Paul G; Morrison, David S; McMillan, Donald C

2015-01-01

268

Dynamic models and model validation for PEM fuel cells using electrical circuits  

Microsoft Academic Search

This paper presents the development of dynamic models for proton exchange membrane (PEM) fuel cells using electrical circuits. The models have been implemented in MATLAB/SIMULINK and PSPICE environments. Both the double-layer charging effect and the thermodynamic characteristic inside the fuel cell are included in the models. The model responses obtained at steady-state and transient conditions are validated by experimental data
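
A minimal sketch of the kind of equivalent-circuit dynamic model described, with a double-layer capacitance in parallel with a charge-transfer resistance and a series ohmic resistance; all parameter values are illustrative assumptions, not those identified in the paper:

# Step-response of a simple fuel-cell equivalent circuit: the double-layer
# voltage v_c charges through C against the charge-transfer resistance R_act,
# and the cell voltage is E_oc - v_c - i * R_ohm.
E_oc, R_act, R_ohm, C = 1.0, 0.01, 0.005, 2.0     # V, ohm, ohm, F (hypothetical)
dt, t_end = 1e-3, 2.0
v_c = 0.0                                          # voltage across the double layer

v_cell = []
for step in range(int(t_end / dt)):
    t = step * dt
    i_load = 10.0 if t < 1.0 else 25.0             # step change in load current (A)
    dv_c = (i_load - v_c / R_act) / C              # double-layer charging dynamics
    v_c += dt * dv_c
    v_cell.append(E_oc - v_c - i_load * R_ohm)     # cell terminal voltage

print(f"voltage just before step: {v_cell[999]:.3f} V, at end: {v_cell[-1]:.3f} V")

The slow recovery of the terminal voltage after the current step is the double-layer charging effect referred to in the abstract.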

Caisheng Wang; M. Hashem Nehrir; Steven R. Shaw

2005-01-01

269

Modeling of light transmission under heterogeneous forest canopy: model description and validation  

E-print Network

Keywords: light transmission, heterogeneous canopy. [...] assuming homogeneous foliage within the canopy. However, forest canopies are far from homogeneous, which

Boyer, Edmond

270

An evaluation of diagnostic tests and their roles in validating forest biometric models  

Microsoft Academic Search

Model validation is an important part of model development. It is performed to increase the credibility and gain sufficient confidence about a model. This paper evaluated the usefulness of 10 statistical tests, five parametric and five nonparametric, in validating forest biometric models. The five parametric tests are the paired t test, the χ² test, the separate t test, the simultaneous

Yuqing Yang; Robert A. Monserud; Shongming Huang

2004-01-01

271

Modeling Topaz-II system performance  

SciTech Connect

The US acquisition of the Topaz-II in-core thermionic space reactor test system from Russia provides a good opportunity to perform a comparison of the Russian-reported data and the results from computer codes such as MCNP (Ref. 3) and TFEHX (Ref. 4). The comparison study includes both neutronic and thermionic performance analyses. The Topaz II thermionic reactor is modeled with MCNP using actual Russian dimensions and parameters. The computation of the neutronic performance considers several important aspects such as the fuel enrichment and location of the thermionic fuel elements (TFEs) in the reactor core. The neutronic analysis included the calculation of both radial and axial power distributions, which are then used in the TFEHX code for electrical performance. The reactor modeled consists of 37 single-cell TFEs distributed in a 13-cm-radius zirconium hydride block surrounded by 8 cm of beryllium metal reflector. The TFEs use 90% enriched {sup 235}U and molybdenum coated with a thin layer of {sup 184}W for the emitter surface. Electrons emitted are captured by a collector surface with a gap filled with cesium vapor between the collector and emitter surfaces. The collector surface is electrically insulated with alumina. Liquid NaK provides the cooling system for the TFEs. The axial thermal power distribution is obtained by dividing the TFE into 40 axial nodes. Comparison of the true axial power distribution with that produced by electrical heaters was also performed.

Lee, H.H.; Klein, A.C. (Oregon State Univ., Corvallis (United States))

1993-01-01

272

Confirming the Validity of Part II of the National Board Dental Examinations: A Practice Analysis  

Microsoft Academic Search

Successful completion of Part II of the National Board Dental Examinations is a part of the licensure process for dentists. Good testing practice requires that the content of a high stakes examination like Part II be based on a strong relationship between the content and the judgments of practicing dentists on what is important to their practice of dentistry. In

Gene A. Kramer; Laura M. Neumann

273

Traveling with Cognitive Tests: Testing the Validity of a KABC-II Adaptation in India  

ERIC Educational Resources Information Center

The authors evaluated the adequacy of an extensive adaptation of the American Kaufman Assessment Battery for Children, second edition (KABC-II), for 6- to 10-year-old Kannada-speaking children of low socioeconomic status in Bangalore, South India. The adapted KABC-II was administered to 598 children. Subtests showed high reliabilities, the…

Malda, Maike; van de Vijver, Fons J. R.; Srinivasan, Krishnamachari; Transler, Catherine; Sukumar, Prathima

2010-01-01

274

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical "models." These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. [Westinghouse Savannah River Co., Aiken, SC (United States); Chen, F.F.K. [Bechtel National, Inc., San Francisco, CA (United States)

1993-02-01

275

An approach to model validation and model-based prediction -- polyurethane foam case study.  

SciTech Connect

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the ''philosophy'' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating modifying and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a ''model supplement term'' when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapons response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 C, modeling the decomposition is critical to assessing a weapons response. 
In the validation analysis it is indicated that the model tends to ''exaggerate'' the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are created and addressed. Hypothetical problems are used because no guidance was provided concern
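A minimal sketch of the ''model supplement term'' idea described above: a bias correction fitted to validation residuals and added to raw model output. The linear form of the correction, the function names and all numbers are illustrative assumptions, not taken from the report.

```python
import numpy as np

# Hypothetical illustration of a "model supplement" (bias-correction) term:
# the bias is estimated from validation residuals and applied to new predictions.

def fit_supplement(temps_val, y_model_val, y_exp_val):
    """Fit a simple linear bias term delta(T) from validation data (assumed form)."""
    residuals = y_exp_val - y_model_val           # experiment minus model
    coeffs = np.polyfit(temps_val, residuals, 1)  # bias modeled as a + b*T
    return np.poly1d(coeffs)

def corrected_prediction(temps_new, y_model_new, supplement):
    """Apply the supplement term to raw model output."""
    return y_model_new + supplement(temps_new)

# Purely synthetic example values (mass-loss fractions vs. temperature):
temps_val = np.array([300.0, 400.0, 500.0, 600.0])
y_model_val = np.array([0.10, 0.35, 0.70, 0.95])   # simulated
y_exp_val = np.array([0.12, 0.32, 0.60, 0.85])     # measured
delta = fit_supplement(temps_val, y_model_val, y_exp_val)
print(corrected_prediction(np.array([450.0]), np.array([0.52]), delta))
```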

Dowding, Kevin J.; Rutherford, Brian Milne

2003-07-01

276

Validation of the galactic cosmic ray and geomagnetic transmission models  

NASA Technical Reports Server (NTRS)

A very high-momentum resolution particle spectrometer called the Alpha Magnetic Spectrometer (AMS) was flown in the payload bay of the Space Shuttle in a 51.65 degrees x 380-km orbit during the last solar minimum. This spectrometer has provided the first high-statistics data set for galactic cosmic radiation protons and helium, as well as limited spectral data on carbon and oxygen nuclei, in the International Space Station orbit. First measurements of the albedo protons at this inclination were also made. Because of the high momentum resolution and high statistics, the data can be separated as a function of magnetic latitude. A related investigation, the balloon-borne experiment with a superconducting solenoid spectrometer (BESS), has been flown from Lynn Lake, Canada, and has also provided excellent high-resolution data on protons and helium. These two data sets have been used here to study the validity of two galactic cosmic ray models and the geomagnetic transmission function developed from the 1990 geomagnetic reference field model. The predictions of both the CREME96 and NASA/JSC models are in good agreement with the AMS data. The shape of the AMS-measured albedo proton spectrum, up to 2 GeV, is in excellent agreement with previous balloon and satellite observations. A new LIS spectrum was developed that is consistent with both previous and new BESS 3He observations. Because astronaut radiation exposures onboard the ISS will be highest around the time of the solar minimum, these AMS measurements and these models provide important benchmarks for future radiation studies. AMS-02, slated for launch in September 2003, will provide even better momentum resolution and higher statistics data.

Badhwar, G. D.; Truong, A. G.; O'Neill, P. M.; Choutko, V.

2001-01-01

277

Calibration and Validation of Airborne InSAR Geometric Model  

NASA Astrophysics Data System (ADS)

The image registration or geo-coding is a very important step for many applications of airborne interferometric Synthetic Aperture Radar (InSAR), especially for those involving Digital Surface Model (DSM) generation, which requires accurate knowledge of the geometry of the InSAR system, while the trajectory and attitude instabilities of the aircraft introduce severe distortions into the three-dimensional (3-D) geometric model. The 3-D geometric model of an airborne SAR image depends on the SAR processor itself. When working in squinted mode, i.e., with an offset angle (squint angle) of the radar beam from the broadside direction, aircraft motion instabilities may produce distortions in the airborne InSAR geometric relationship which, if not properly compensated for during SAR imaging, may degrade the image registration. The determination of locations in the SAR image depends on the irradiated topography and on exact knowledge of all signal delays: the range delay and chirp delay (adjusted by the radar operator) and internal delays, which are unknown a priori. Hence, in order to obtain reliable results, these parameters must be properly calibrated. An airborne InSAR mapping system has been developed by the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS) to acquire three-dimensional geo-spatial data with high resolution and accuracy. To test the performance of the InSAR system, a Validation/Calibration (Val/Cal) campaign has been carried out in Sichuan province, south-west China, whose results are reported in this paper.

Chunming, Han; Huadong, Guo; Xijuan, Yue; Changyong, Dou; Mingming, Song; Yanbing, Zhang

2014-03-01

278

Bidirectional reflectance function in coastal waters: modeling and validation  

NASA Astrophysics Data System (ADS)

The current operational algorithm for the correction of bidirectional effects from the satellite ocean color data is optimized for typical oceanic waters. However, versions of bidirectional reflectance correction algorithms, specifically tuned for typical coastal waters and other case 2 conditions, are particularly needed to improve the overall quality of those data. In order to analyze the bidirectional reflectance distribution function (BRDF) of case 2 waters, a dataset of typical remote sensing reflectances was generated through radiative transfer simulations for a large range of viewing and illumination geometries. Based on this simulated dataset, a case 2 water focused remote sensing reflectance model is proposed to correct above-water and satellite water leaving radiance data for bidirectional effects. The proposed model is first validated with a one year time series of in situ above-water measurements acquired by collocated multi- and hyperspectral radiometers which have different viewing geometries installed at the Long Island Sound Coastal Observatory (LISCO). Match-ups and intercomparisons performed on these concurrent measurements show that the proposed algorithm outperforms the algorithm currently in use at all wavelengths.

Gilerson, Alex; Hlaing, Soe; Harmel, Tristan; Tonizzo, Alberto; Arnone, Robert; Weidemann, Alan; Ahmed, Samir

2011-11-01

279

Stratospheric Heterogeneous Chemistry and Microphysics: Model Development, Validation and Applications  

NASA Technical Reports Server (NTRS)

The objectives of this project are to: define the chemical and physical processes leading to stratospheric ozone change that involve polar stratospheric clouds (PSCs) and the reactions occurring on the surfaces of PSC particles; study the formation processes, and the physical and chemical properties of PSCs, that are relevant to atmospheric chemistry and to the interpretation of field measurements taken during polar stratosphere missions; develop quantitative models describing PSC microphysics and heterogeneous chemical processes; assimilate laboratory and field data into these models; and calculate the extent of chemical processing on PSCs and the impact of specific microphysical processes on polar composition and ozone depletion. During the course of the project, a new coupled microphysics/physical-chemistry/photochemistry model for stratospheric sulfate aerosols and nitric acid and ice PSCs was developed and applied to analyze data collected during NASA's Arctic Airborne Stratospheric Expedition-II (AASE-II) and other missions. In this model, detailed treatments of multicomponent sulfate aerosol physical chemistry, sulfate aerosol microphysics, polar stratospheric cloud microphysics, PSC ice surface chemistry, as well as homogeneous gas-phase chemistry were included for the first time. In recent studies focusing on AASE measurements, the PSC model was used to analyze specific measurements from an aircraft deployment of an aerosol impactor, FSSP, and NO(y) detector. The calculated results are in excellent agreement with observations for particle volumes as well as NO(y) concentrations, thus confirming the importance of supercooled sulfate/nitrate droplets in PSC formation. The same model has been applied to perform a statistical study of PSC properties in the Northern Hemisphere using several hundred high-latitude air parcel trajectories obtained from Goddard. The rates of ozone depletion along trajectories with different meteorological histories are presently being systematically evaluated to identify the principal relationships between ozone loss and aerosol state. Under this project, we formulated a detailed quantitative model that predicts the multicomponent composition of sulfate aerosols under stratospheric conditions, including sulfuric, nitric, hydrochloric, hydrofluoric and hydrobromic acids. This work defined for the first time the behavior of liquid ternary-system type-1b PSCs. The model also allows the compositions and reactivities of sulfate aerosols to be calculated over the entire range of environmental conditions encountered in the stratosphere (and has been incorporated into a trajectory/microphysics model; see above). Important conclusions derived from this work over the last few years include the following: the HNO3 content of liquid-state aerosols dominates PSCs below about 195 K; the freezing of nitric acid ice from sulfate aerosol solutions is likely to occur within a few degrees K of the water vapor frost point; the uptake and reactions of HCl in liquid aerosols are a critical component of PSC heterogeneous chemistry. In a related application of this work, the inefficiency of chlorine injection into the stratosphere during major volcanic eruptions was explained on the basis of nucleation of sulfuric acid aerosols in rising volcanic plumes leading to the formation of supercooled water droplets on these aerosols, which efficiently scavenge HCl via precipitation.

Turco, Richard P.

1996-01-01

280

VALIDATING COMPLEX CONSTRUCTION SIMULATION MODELS USING 3D VISUALIZATION  

E-print Network

The model for the example was developed using Stroboscope and was visualized and animated using the Dynamic Construction Visualizer.

Kamat, Vineet R.

281

Parental modelling of eating behaviours: Observational validation of the Parental Modelling of Eating Behaviours scale (PARM).  

PubMed

Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. PMID:25111293

Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

2015-03-01

282

Non-Functional Modeling and Validation in Model-Driven Architecture  

Microsoft Academic Search

Software models are, in most cases, considered as functional abstractions of systems. They represent the backbone of transformational processes aimed at code generation. On the other end, modeling is a traditional activity in the field of non-functional validation of software/hardware systems, although non-functional models are founded on different notations (such as Petri Nets) and embed additional information

Vittorio Cortellessa; Antinisca Di Marco; Paola Inverardi

2007-01-01

283

Validation of transport models using additive flux minimization technique  

SciTech Connect

A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
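A conceptual one-dimensional stand-in for the additive flux minimization idea: an extra effective diffusivity is varied until the predicted density profile best matches an "experimental" one. The toy diffusion solver, function names and all numbers are invented for illustration; the actual study uses the FACETS::Core transport solver and the DAKOTA toolkit.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def predicted_profile(d_add, x, source, d_model=1.0, n_edge=1.0):
    """Toy steady-state diffusion solve with an additional effective diffusivity."""
    d_total = d_model + d_add
    dx = x[1] - x[0]
    flux = np.cumsum(source) * dx              # Gamma(x) = integral_0^x S dx'
    grad = -flux / d_total                     # dn/dx = -Gamma / D
    return n_edge - np.cumsum(grad[::-1])[::-1] * dx   # integrate inward from the edge

def mismatch(d_add, x, source, n_exp):
    """Squared error between predicted and 'experimental' density profiles."""
    return float(np.sum((predicted_profile(d_add, x, source) - n_exp) ** 2))

x = np.linspace(0.0, 1.0, 50)
source = np.exp(-5.0 * x)
rng = np.random.default_rng(0)
n_exp = predicted_profile(0.4, x, source) + 0.01 * rng.normal(size=x.size)

result = minimize_scalar(mismatch, bounds=(0.0, 2.0), args=(x, source, n_exp),
                         method="bounded")
print("additional diffusivity that best matches the profile:", result.x)
```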

Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States)]; Groebner, R. J. [General Atomics, San Diego, California 92121 (United States)]; Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States)]; Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)]

2013-10-15

284

Experiments for calibration and validation of plasticity and failure material modeling: 304L stainless steel.  

SciTech Connect

Experimental data for material plasticity and failure model calibration and validation were obtained from 304L stainless steel. Model calibration data were taken from smooth tension, notched tension, and compression tests. Model validation data were provided from experiments using thin-walled tube specimens subjected to path dependent combinations of internal pressure, extension, and torsion.

Lee, Kenneth L.; Korellis, John S.; McFadden, Sam X.

2006-01-01

285

Validation of population-based disease simulation models: a review of concepts and methods  

Microsoft Academic Search

BACKGROUND: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. METHODS: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of

Jacek A Kopec; Philippe Finès; Douglas G Manuel; David L Buckeridge; William M Flanagan; Jillian Oderkirk; Michal Abrahamowicz; Samuel Harper; Behnam Sharif; Anya Okhmatovskaia; Eric C Sayre; M Mushfiqur Rahman; Michael C Wolfson

2010-01-01

286

On calibration and validation of eigendeformation-based multiscale models for failure analysis of heterogeneous systems  

NASA Astrophysics Data System (ADS)

We present a new strategy for calibration and validation of hierarchical multiscale models based on computational homogenization. The proposed strategy hinges on the concept of the experimental simulator repository (SIMEX) which provides the basis for a generic algorithmic framework in calibration and validation of multiscale models. Gradient-based and genetic algorithms are incorporated into the SIMEX framework to investigate the validity of these algorithms in multiscale model calibration. The strategy is implemented using the eigendeformation-based reduced order homogenization (EHM) model and integrated into a commercial finite element package (Abaqus). Ceramic- and polymer-matrix composite problems are analyzed to study the capabilities of the proposed calibration and validation framework.

Oskay, Caglar; Fish, Jacob

2008-07-01

287

Criterion Validity, Severity Cut Scores, and Test-Retest Reliability of the Beck Depression Inventory-II in a University Counseling Center Sample  

ERIC Educational Resources Information Center

The criterion validity of the Beck Depression Inventory-II (BDI-II; A. T. Beck, R. A. Steer, & G. K. Brown, 1996) was investigated by pairing blind BDI-II administrations with the major depressive episode portion of the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I; M. B. First, R. L. Spitzer, M. Gibbon, & J. B. W. Williams,…

Sprinkle, Stephen D.; Lurie, Daphne; Insko, Stephanie L.; Atkinson, George; Jones, George L.; Logan, Arthur R.; Bissada, Nancy N.

2002-01-01

288

Automatic validation of computational models using pseudo-3D spatio-temporal model checking.  

PubMed

Background: Computational models play an increasingly important role in systems biology for generating predictions and in synthetic biology as executable prototypes/designs. For real life (clinical) applications there is a need to scale up and build more complex spatio-temporal multiscale models; these could enable investigating how changes at small scales reflect at large scales and vice versa. Results generated by computational models can be applied to real life applications only if the models have been validated first. Traditional in silico model checking techniques only capture how non-dimensional properties (e.g. concentrations) evolve over time and are suitable for small scale systems (e.g. metabolic pathways). The validation of larger scale systems (e.g. multicellular populations) additionally requires capturing how spatial patterns and their properties change over time, which are not considered by traditional non-spatial approaches. Results: We developed and implemented a methodology for the automatic validation of computational models with respect to both their spatial and temporal properties. Stochastic biological systems are represented by abstract models which assume a linear structure of time and a pseudo-3D representation of space (2D space plus a density measure). Time series data generated by such models is provided as input to parameterised image processing modules which automatically detect and analyse spatial patterns (e.g. cells) and clusters of such patterns (e.g. cellular populations). For capturing how spatial and numeric properties change over time the Probabilistic Bounded Linear Spatial Temporal Logic is introduced. Given a collection of time series data and a formal spatio-temporal specification the model checker Mudi (http://mudi.modelchecking.org) determines probabilistically if the formal specification holds for the computational model or not. Mudi is an approximate probabilistic model checking platform which enables users to choose between frequentist and Bayesian, estimate and statistical hypothesis testing based validation approaches. We illustrate the expressivity and efficiency of our approach based on two biological case studies, namely phase variation patterning in bacterial colony growth and the chemotactic aggregation of cells. Conclusions: The formal methodology implemented in Mudi enables the validation of computational models against spatio-temporal logic properties and is a precursor to the development and validation of more complex multidimensional and multiscale models. PMID:25440773
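A minimal frequentist sketch of the statistical model checking idea underlying this kind of validation: the probability that a bounded temporal property holds is estimated from many stochastic simulation traces. The toy simulator, the property and the threshold below are assumptions; Mudi's PBLSTL logic and its image-processing modules are far richer.

```python
import random

def simulate_cluster_count(steps=20, seed=None):
    """Toy stochastic trace: number of detected cell clusters per time step."""
    rng = random.Random(seed)
    count, trace = 1, []
    for _ in range(steps):
        count = max(0, count + rng.choice([-1, 0, 1, 1]))
        trace.append(count)
    return trace

def property_holds(trace, threshold=5, within=15):
    """Bounded 'eventually': the cluster count reaches the threshold within a time bound."""
    return any(c >= threshold for c in trace[:within])

# Frequentist estimate of the probability that the property holds for the model.
n_runs = 1000
hits = sum(property_holds(simulate_cluster_count(seed=i)) for i in range(n_runs))
print("estimated probability the property holds:", hits / n_runs)
```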

Pârvu, Ovidiu; Gilbert, David

2014-12-01

289

Validation of a model for the cast-film process  

SciTech Connect

We have developed a model of the cast-film process and compared theoretical predictions against experiments on a pilot line. Three polyethylenes with markedly different levels of melt elasticity were used in this evaluation; namely, a high pressure low density polyethylene, LDPE, and two linear low density polyethylenes, LLDPE-1 and LLDPE-2. The final film dimensions of the LDPE were found to be in good agreement with 1-D viscoelastic stationary predictions. Flow field visualization experiments indicate, however, a 2-D velocity field in the airgap between the extrusion die and the chill roll. Taking this observation into account, evolutions of the free surface of the web along the airgap were recorded with LLDPE-2, our least elastic melt. An excellent agreement is found between these measurements and predictions of neck-in and edge bead with 2-D Newtonian stationary simulations. The time-dependent solution, which is based on a linear stability analysis, allows a zone of draw resonance to be identified within the working space of the process, defined by the draw ratio, the Deborah number, and the web aspect ratio. It is predicted that increasing this latter parameter stabilizes the process until an optimum value is reached. Experiments with LLDPE-1 are shown to validate this unique theoretical result, thus allowing the draw ratio to be increased by about 75%.

Chambon, F. [Exxon Chemical Co., Baytown, TX (United States); Ohlsson, S. [Exxon Chemical Europe, Machelen (Belgium); Silagy, D. [Ecole des Mines de Paris, Sophia-Antipolis (France)

1996-12-31

290

A cross-validation deletion–substitution–addition model selection algorithm: Application to marginal structural models  

PubMed Central

The cross-validation deletion–substitution–addition (cvDSA) algorithm is based on data-adaptive estimation methodology to select and estimate marginal structural models (MSMs) for point treatment studies as well as models for conditional means where the outcome is continuous or binary. The algorithm builds and selects models based on user-defined criteria for model selection, and utilizes a loss function-based estimation procedure to distinguish between different model fits. In addition, the algorithm selects models based on cross-validation methodology to avoid “over-fitting” data. The cvDSA routine is an R software package available for download. An alternative R-package (DSA) based on the same principles as the cvDSA routine (i.e., cross-validation, loss function), but one that is faster and with additional refinements for selection and estimation of conditional means, is also available for download. Analyses of real and simulated data were conducted to demonstrate the use of these algorithms, and to compare MSMs where the causal effects were assumed (i.e., investigator-defined), with MSMs selected by the cvDSA. The package was used also to select models for the nuisance parameter (treatment) model to estimate the MSM parameters with inverse-probability of treatment weight (IPTW) estimation. Other estimation procedures (i.e., G-computation and double robust IPTW) are available also with the package. PMID:25505354
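A conceptual Python sketch (not the R cvDSA/DSA packages) of the core idea of loss-based model selection by cross-validation: each candidate specification is scored by a cross-validated loss and the minimizer is selected. The candidate set, covariates and outcome are invented placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                       # covariates (e.g. confounders)
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

candidates = {
    "x0 only":        X[:, [0]],
    "x0 + x1":        X[:, [0, 1]],
    "all covariates": X,
}

losses = {}
for name, design in candidates.items():
    # negative log-likelihood loss, averaged over 5 cross-validation folds
    score = cross_val_score(LogisticRegression(), design, y,
                            cv=5, scoring="neg_log_loss").mean()
    losses[name] = -score

best = min(losses, key=losses.get)
print(losses, "-> selected:", best)
```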

Haight, Thaddeus J.; Wang, Yue; van der Laan, Mark J.; Tager, Ira B.

2014-01-01

291

The African American Acculturation Scale II: Cross-Validation and Short Form.  

ERIC Educational Resources Information Center

Studied African American culture, using a new, shortened, 33-item African American Acculturation Scale (AAAS-33) to assess the scale's validity and reliability. Comparisons between the original form and the AAAS-33 reveal high correlations; however, the longer form may be sensitive to some beliefs, practices, and attitudes not assessed by the short…

Landrine, Hope; Klonoff, Elizabeth A.

1995-01-01

292

Comparing Validity and Reliability in Special Education Title II and IDEA Data  

ERIC Educational Resources Information Center

Previous researchers have found that special education teacher shortages are pervasive and exacerbated by federal policies regarding "highly qualified" teacher requirements. The authors examined special education teacher personnel data from 2 federal data sources to determine if these sources offer a reliable and valid means of…

Steinbrecher, Trisha D.; McKeown, Debra; Walther-Thomas, Chriss

2013-01-01

293

Stress concentration near stiff inclusions: validation of rigid inclusion model and boundary layers by means of photoelasticity  

E-print Network

Photoelasticity is employed to investigate the stress state near stiff rectangular and rhombohedral inclusions embedded in a 'soft' elastic plate. Results show that the singular stress field predicted by the linear elastic solution for the rigid inclusion model can be generated in reality, with great accuracy, within a material. In particular, experiments: (i.) agree with the fact that the singularity is lower for obtuse than for acute inclusion angles; (ii.) show that the singularity is stronger in Mode II than in Mode I (differently from a notch); (iii.) validate the model of rigid quadrilateral inclusion; (iv.) for thin inclusions, show the presence of boundary layers deeply influencing the stress field, so that the limit case of rigid line inclusion is obtained in strong dependence on the inclusion's shape. The introduced experimental methodology opens the possibility of enhancing the design of thin reinforcements and of analyzing complex situations involving interaction between inclusions and defects.

Diego Misseroni; Francesco Dal Corso; Summer Shahzad; Davide Bigoni

2014-04-03

294

Validation of CATHARE Code for Gas-Cooled Reactors: Comparison with E.V.O Experimental Data on Oberhausen II Facility  

SciTech Connect

Extensively validated and qualified for light-water reactor safety studies, the thermal-hydraulics code CATHARE has been adapted to deal also with Gas-Cooled Reactor applications. In order to validate the code for these new applications, CEA (Commissariat a l'Energie Atomique) has initiated an ambitious long-term experimental program. The foreseen experimental facilities range from small-scale loops for physical correlations to component technology and system demonstration loops. In the short term, CATHARE is being validated against existing experimental data, in particular from the German power plant Oberhausen II. Oberhausen II, operated by the German utility E.V.O (Energie Versorgung Oberhausen AG), is a 50-MW(e) direct-cycle helium turbine plant. The power source is a gas burner instead of a nuclear reactor core, but the power conversion system resembles those of the GFR (Gas-cooled Fast Reactor) and other high-temperature reactor concepts. Oberhausen II was operated for more than 25,000 hours between 1974 and 1988. Design specifications, drawings and experimental data have been obtained through the European HTR-E project, offering a unique opportunity to validate CATHARE on a large-scale Brayton cycle. Available measurements of temperature, pressure and mass flow rate throughout the circuit have allowed a very comprehensive thermal-hydraulic description of the plant, in steady-state conditions for design and operating data as well as during transients. First, the paper presents the modeling of the Oberhausen II plant with the CATHARE code, with a complete description of the modeling of each component: the recuperator, a complex gas-to-gas counter-flow heat exchanger; the pre-cooler and inter-cooler, two complex gas-to-water cross-flow heat exchangers; the heater, which is a gas burner; and the two turbines and two compressors. Particular attention is given to the modeling of leakages along the circuit and to the cooling of the first stages of the high-pressure turbine. The modeling of the helium storage tanks used for injecting or removing helium during load following is also described. Then, the results of the CATHARE calculations for four steady-state conditions are compared with E.V.O data: the design nominal point (50 MW(e)), the operating nominal point (30 MW(e)) and two operating partial loads (20 MW(e) and 13 MW(e)). The major differences between the expected design power (50 MW(e)) and the maximum operating power observed in the facility (30 MW(e)) are discussed. The first conclusion is that the calculation results are in good agreement with the experimental data for these four nominal states. Finally, the results of a load-following transient calculation performed with CATHARE are shown in comparison with experimental data. The scenario of this load following is a decrease of the electrical power from 10.6 MW to 7.6 MW due to a decrease of the mass flow rate and pressure levels caused by the removal of helium to the storage tanks. The calculation agrees well with the measured data. (authors)

Bentivoglio, Fabrice; Tauveron, Nicolas [Commissariat a l'Energie Atomique, 17 rue des Martyrs, F-38000, Grenoble (France)

2006-07-01

295

Simplified modeling of the EBR-II control rods  

Microsoft Academic Search

Simplified models of EBR-II control and safety rods have been developed for core modeling under various operational and shutdown conditions. A parametric study was performed on normal worth, high worth, and safety rod type control rods. A summary of worth changes due to individual modeling approximations is tabulated. Worth effects due to structural modeling simplification are negligible. Fuel region homogenization

1995-01-01

296

Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts  

NASA Technical Reports Server (NTRS)

We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

2012-01-01

297

Land-cover change model validation by an ROC method for the Ipswich watershed, Massachusetts, USA  

Microsoft Academic Search

Scientists need a better and larger set of tools to validate land-use change models, because it is essential to know a model’s prediction accuracy. This paper describes how to use the relative operating characteristic (ROC) as a quantitative measurement to validate a land-cover change model. Typically, a crucial component of a spatially explicit simulation model of land-cover change is a
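A generic sketch of the ROC idea for validating a land-cover change model: per-pixel "suitability for change" scores from the model are compared with the observed change/no-change map, and the area under the ROC curve summarizes how well the scores rank the truly changed pixels. The arrays are synthetic placeholders, not the Ipswich watershed data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
observed_change = rng.integers(0, 2, size=10_000)             # 1 = pixel changed
predicted_score = np.clip(0.3 * observed_change +             # deliberately imperfect model
                          rng.normal(0.4, 0.25, size=10_000), 0.0, 1.0)

auc = roc_auc_score(observed_change, predicted_score)
print(f"relative operating characteristic (area under curve) = {auc:.3f}")
```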

R. Gil Pontius Jr; Laura C. Schneider

2001-01-01

298

Science Motivation Questionnaire II: Validation with Science Majors and Nonscience Majors  

ERIC Educational Resources Information Center

From the perspective of social cognitive theory, the motivation of students to learn science in college courses was examined. The students--367 science majors and 313 nonscience majors--responded to the Science Motivation Questionnaire II, which assessed five motivation components: intrinsic motivation, self-determination, self-efficacy, career…

Glynn, Shawn M.; Brickman, Peggy; Armstrong, Norris; Taasoobshirazi, Gita

2011-01-01

299

The Internal Validation of Level II and Level III Respiratory Therapy Examinations. Final Report.  

ERIC Educational Resources Information Center

This project began with the delineation of the roles and functions of respiratory therapy personnel by the American Association for Respiratory Therapy. In Phase II, The Psychological Corporation used this delineation to develop six proficiency examinations, three at each of two levels. One exam at each level was designated for the purpose of the…

Jouett, Michael L.

300

Results of site validation experiments. Volume II. Supporting documents 5 through 14  

SciTech Connect

Volume II contains the following supporting documents: Summary of Geologic Mapping of Underground Investigations; Logging of Vertical Coreholes - ''Double Box'' Area and Exploratory Drift; WIPP High Precision Gravity Survey; Basic Data Reports for Drillholes, Brine Content of Facility Internal Strata; Mineralogical Content of Facility Interval Strata; Location and Characterization of Interbedded Materials; Characterization of Aquifers at Shaft Locations; and Permeability of Facility Interval Strata.

Not Available

1983-01-01

301

Reconceptualizing the learning transfer conceptual framework: empirical validation of a new systemic model  

Microsoft Academic Search

The main purpose of this study was to examine the validity of a new systemic model of learning transfer and thus determine if a more holistic approach to training transfer could better explain the phenomenon. In all, this study confirmed the validity of the new systemic model and suggested that a high performance work system could indeed serve as a

Constantine Kontoghiorghes

2004-01-01

302

Reconceptualizing the Learning Transfer Conceptual Framework: Empirical Validation of a New Systemic Model  

ERIC Educational Resources Information Center

The main purpose of this study was to examine the validity of a new systemic model of learning transfer and thus determine if a more holistic approach to training transfer could better explain the phenomenon. In all, this study confirmed the validity of the new systemic model and suggested that a high performance work system could indeed serve as…

Kontoghiorghes, Constantine

2004-01-01

303

Model validation for robust control of uncertain systems with an integral quadratic constraint  

Microsoft Academic Search

This paper presents a new approach to the model validation problem for a class of uncertain systems in which the uncertainty is described by an integral quadratic constraint. The proposed model validation algorithm is based on the solution to a game-type Riccati differential equation and a set of state equations closely related to a robust Kalman filtering problem.

Andrey V. Savkin; Ian R. Petersen

1996-01-01

304

Techniques for Down-Sampling a Measured Surface Height Map for Model Validation  

NASA Technical Reports Server (NTRS)

This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

Sidick, Erkin

2012-01-01

305

Rotating Blade Trailing-Edge Noise: Experimental Validation of Analytical Model  

E-print Network

Rotating Blade Trailing-Edge Noise: Experimental Validation of Analytical Model Yannick Rozenberg The paper is dealing with the experimental validation of an analytical trailing-edge noise model dedicated is based on a previously published extension of Amiet's trailing-edge noise theory. A blade is split

Paris-Sud XI, Université de

306

Servant Leadership Behaviour Scale: A hierarchical model and test of construct validity  

Microsoft Academic Search

Servant leadership is widely believed to be a multidimensional construct. However, existing measures of servant leadership typically suffer from highly correlated dimensions, raising concerns over discriminant validity. We set out in this study to examine the dimensionality of the hypothesized six-factor Servant Leadership Behaviour Scale (SLBS) and validate a hierarchical model of servant leadership. Using structural equation modelling, convergent and

Sen Sendjayar; Brian Cooper

2011-01-01

307

Validation of computational models in biomechanics  

E-print Network

H B Henninger, S P Reese, A E Anderson. Many recent articles have applied these concepts in an attempt to build credibility and to present them in the context of computational biomechanics. Specifically, the task of model validation

Utah, University of

308

PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II  

SciTech Connect

To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system where it can read magnet current settings for real time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for the efficient convergence on a useable tune. We show the real time comparisons of simulation and experiment and explore the successes and limitations of this close coupled approach.

Chambers, F W; Raymond, B A; Falabella, S; Lee, B S; Richardson, R A; Weir, J T; Davis, H A; Schultze, M E

2005-05-31

309

MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom  

NASA Astrophysics Data System (ADS)

The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as foundation to perform Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard ISS and especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

310

EEG-neurofeedback for optimising performance. II: creativity, the performing arts and ecological validity.  

PubMed

As a continuation of a review of evidence of the validity of cognitive/affective gains following neurofeedback in healthy participants, including correlations in support of the gains being mediated by feedback learning (Gruzelier, 2014a), the focus here is on the impact on creativity, especially in the performing arts including music, dance and acting. The majority of research involves alpha/theta (A/T), sensory-motor rhythm (SMR) and heart rate variability (HRV) protocols. There is evidence of reliable benefits from A/T training with advanced musicians especially for creative performance, and reliable benefits from both A/T and SMR training for novice music performance in adults and in a school study with children with impact on creativity, communication/presentation and technique. Making the SMR ratio training context ecologically relevant for actors enhanced creativity in stage performance, with added benefits from the more immersive training context. A/T and HRV training have benefitted dancers. The neurofeedback evidence adds to the rapidly accumulating validation of neurofeedback, while performing arts studies offer an opportunity for ecological validity in creativity research for both creative process and product. PMID:24239853

Gruzelier, John H

2014-07-01

311

Importance of Sea Ice for Validating Global Climate Models  

NASA Technical Reports Server (NTRS)

Reproduction of current day large-scale physical features and processes is a critical test of global climate model performance. Without this benchmark, prognoses of future climate conditions are at best speculation. A fundamental question relevant to this issue is, which processes and observations are both robust and sensitive enough to be used for model validation and furthermore are they also indicators of the problem at hand? In the case of global climate, one of the problems at hand is to distinguish between anthropogenic and naturally occuring climate responses. The polar regions provide an excellent testing ground to examine this problem because few humans make their livelihood there, such that anthropogenic influences in the polar regions usually spawn from global redistribution of a source originating elsewhere. Concomitantly, polar regions are one of the few places where responses to climate are non-anthropogenic. Thus, if an anthropogenic effect has reached the polar regions (e.g. the case of upper atmospheric ozone sensitivity to CFCs), it has most likely had an impact globally but is more difficult to sort out from local effects in areas where anthropogenic activity is high. Within this context, sea ice has served as both a monitoring platform and sensitivity parameter of polar climate response since the time of Fridtjof Nansen. Sea ice resides in the polar regions at the air-sea interface such that changes in either the global atmospheric or oceanic circulation set up complex non-linear responses in sea ice which are uniquely determined. Sea ice currently covers a maximum of about 7% of the earth's surface but was completely absent during the Jurassic Period and far more extensive during the various ice ages. It is also geophysically very thin (typically <10 m in Arctic, <3 m in Antarctic) compared to the troposphere (roughly 10 km) and deep ocean (roughly 3 to 4 km). Because of these unique conditions, polar researchers regard sea ice as one of the more important features to monitor in terms of heat, mass, and momentum transfer between the air and sea and furthermore, the impact of such responses to global climate.

Geiger, Cathleen A.

1997-01-01

312

A Permutation Method to Assess Heterogeneity in External Validation for Risk Prediction Models  

PubMed Central

The value of a developed prediction model depends on its performance outside the development sample. The key is therefore to externally validate the model on a different but related independent data. In this study, we propose a permutation method to assess heterogeneity in external validation for risk prediction models. The permutation p value measures the extent of homology between development and validation datasets. If p < 0.05, the model may not be directly transported to the external validation population without further revision or updating. Monte-Carlo simulations are conducted to evaluate the statistical properties of the proposed method, and two microarray breast cancer datasets are analyzed for demonstration. The permutation method is easy to implement and is recommended for routine use in external validation for risk prediction models. PMID:25606854
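An illustrative permutation test for heterogeneity between a development and an external validation cohort: the observed drop in discrimination (AUC) on the external cohort is compared with its distribution when cohort labels are permuted. This is a generic sketch of the permutation idea, not the authors' exact statistic; all data below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def auc_gap(X_dev, y_dev, X_val, y_val):
    """Drop in AUC when moving from the development to the validation cohort."""
    model = LogisticRegression().fit(X_dev, y_dev)
    return (roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1]) -
            roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))

def permutation_p(X_dev, y_dev, X_val, y_val, n_perm=500, seed=0):
    """Permutation p value for heterogeneity between the two cohorts."""
    rng = np.random.default_rng(seed)
    observed = auc_gap(X_dev, y_dev, X_val, y_val)
    X_all = np.vstack([X_dev, X_val])
    y_all = np.concatenate([y_dev, y_val])
    n_dev, count = len(y_dev), 0
    for _ in range(n_perm):
        idx = rng.permutation(len(y_all))
        d, v = idx[:n_dev], idx[n_dev:]
        if auc_gap(X_all[d], y_all[d], X_all[v], y_all[v]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
X_dev, X_val = rng.normal(size=(150, 3)), rng.normal(0.3, 1.0, size=(100, 3))
y_dev = (X_dev[:, 0] + rng.normal(size=150) > 0).astype(int)
y_val = (0.5 * X_val[:, 0] + rng.normal(size=100) > 0).astype(int)
print("permutation p value:", permutation_p(X_dev, y_dev, X_val, y_val, n_perm=200))
```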

Wang, Ling-Yi; Lee, Wen-Chung

2015-01-01

313

ShipIR model validation using NATO SIMVEX experiment results  

NASA Astrophysics Data System (ADS)

An infrared field trial has been conducted by a NATO science panel on IR ship signatures, TG-16. This trial was planned, designed and executed for the expressed purpose of the validation of predictive IR ship signature simulations. The details of the trial were dictated by a thoughtful validation methodology, which exploits the concept of "experimental precision." Two governmental defense laboratories, the Norwegian Defence Research Establishment and the US Naval Research Laboratory have used this trial data to perform a validation analysis on the ShipIR IR signature code. This analysis quantifies prediction accuracy of the current versions of the code and identifies specific portions of the code that need to be upgraded to improve prediction accuracy.

Fraedrich, Doug S.; Stark, Espen; Heen, Lars T.; Miller, Craig

2003-09-01

314

Cross-validation pitfalls when selecting and assessing regression and classification models  

PubMed Central

Background We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. Methods We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. Results We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. Conclusions We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error. PMID:24678909
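A short scikit-learn sketch of the two practices discussed above: repeated grid-search cross-validation for parameter tuning, wrapped in an outer (nested) cross-validation loop for assessing the prediction error. The dataset, model and parameter grid are placeholders, not the QSAR data used in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import (GridSearchCV, RepeatedStratifiedKFold,
                                     cross_val_score)
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.001]}

inner_cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=1)
outer_cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=2)

# Repeated grid-search CV selects hyperparameters; the outer repeated CV
# estimates the prediction error of the whole selection procedure.
tuner = GridSearchCV(SVC(), param_grid, cv=inner_cv, scoring="accuracy")
nested_scores = cross_val_score(tuner, X, y, cv=outer_cv)

print("nested CV accuracy: %.3f +/- %.3f"
      % (nested_scores.mean(), nested_scores.std()))
```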

2014-01-01

315

A Formal Algorithm for Verifying the Validity of Clustering Results Based on Model Checking  

PubMed Central

The limitations in general methods to evaluate clustering will remain difficult to overcome if verifying the clustering validity continues to be based on clustering results and evaluation index values. This study focuses on a clustering process to analyze crisp clustering validity. First, we define the properties that must be satisfied by valid clustering processes and model clustering processes based on program graphs and transition systems. We then recast the analysis of clustering validity as the problem of verifying whether the model of clustering processes satisfies the specified properties with model checking. That is, we try to build a bridge between clustering and model checking. Experiments on several datasets indicate the effectiveness and suitability of our algorithms. Compared with traditional evaluation indices, our formal method can not only indicate whether the clustering results are valid but, in the case the results are invalid, can also detect the objects that have led to the invalidity. PMID:24608823

Huang, Shaobin; Cheng, Yuan; Lang, Dapeng; Chi, Ronghua; Liu, Guofeng

2014-01-01

316

Crazy like a fox. Validity and ethics of animal models of human psychiatric disease.  

PubMed

Animal models of human disease play a central role in modern biomedical science. Developing animal models for human mental illness presents unique practical and philosophical challenges. In this article we argue that (1) existing animal models of psychiatric disease are not valid, (2) attempts to model syndromes are undermined by current nosology, (3) models of symptoms are rife with circular logic and anthropomorphism, (4) any model must make unjustified assumptions about subjective experience, and (5) any model deemed valid would be inherently unethical, for if an animal adequately models human subjective experience, then there is no morally relevant difference between that animal and a human. PMID:24534739

Rollin, Michael D H; Rollin, Bernard E

2014-04-01

317

BIOCHEMICAL AND MORPHOLOGICAL VALIDATION OF A RODENT MODEL OF OPIDN  

EPA Science Inventory

The paper describes six years of research designed to validate the use of the rat as a viable alternative to the hen for screening and mechanistic studies of neuropathic OP compounds. To date the results indicate that if morphological rather than behavioral endpoints are used, th...

318

Design of embedded systems: formal models, validation, and synthesis  

Microsoft Academic Search

This paper addresses the design of reactive real-time embedded systems. Such systems are often heterogeneous in implementation technologies and design styles, for example by combining hardware application-specific integrated circuits (ASICs) with embedded software. The concurrent design process for such embedded systems involves solving the specification, validation, and synthesis problems. We review the variety of approaches to these problems that have

Stephen Edwards; Luciano Lavagno; Edward A. Lee; Alberto Sangiovanni-Vincentelli

1997-01-01

319

INTER-MODEL, ANALYTICAL, AND EXPERIMENTAL VALIDATION OF  

E-print Network

The Fort Wayne Project generously aided in providing measured data and corresponding information from the Fort Wayne house, which was used in the experimental validation of RHB.

320

Computational Modeling and Experimental Validation of Aviation Security Procedures  

Microsoft Academic Search

Security of civil aviation has become a major concern in recent years, leading to a variety of protective measures related to airport and aircraft security to be established by regional, national and international authorities. Due to the very nature of natural language, these informal requirements lack precision and are inappropriate for validation and verification of resulting properties

Uwe Glässer; Sarah Rastkar; Mona Vajihollahi

2006-01-01

321

Computational modeling and validation of intraventricular flow in a simple model of the left ventricle  

NASA Astrophysics Data System (ADS)

Simulations of flow inside a laboratory model of the left ventricle are validated against experiments. The simulations employ an immersed boundary-based method for flow modeling, and the computational model of the expanding-contracting ventricle is constructed via image segmentation. A quantitative comparison of the phase-averaged velocity and vorticity fields between the simulation and the experiment shows a reasonable agreement, given the inherent uncertainties in the modeling procedure. Simulations also exhibit a good agreement in terms of time-varying net circulation, as well as clinically important metrics such as flow-wave propagation velocity and its ratio with peak early-wave flow velocity. The detailed and critical assessment of this comparison is used to identify and discuss the key challenges that are faced in such a validation study.

Vedula, Vijay; Fortini, Stefania; Seo, Jung-Hee; Querzoli, Giorgio; Mittal, Rajat

2014-11-01

323

Validation of the integration of CFD and SAS4A/SASSYS-1: Analysis of EBR-II shutdown heat removal test 17  

SciTech Connect

Recent analyses have demonstrated the need to model multidimensional phenomena, particularly thermal stratification in outlet plena, during safety analyses of loss-of-flow transients of certain liquid-metal cooled reactor designs. Therefore, Argonne's reactor systems safety code SAS4A/SASSYS-1 is being enhanced by integrating 3D computational fluid dynamics models of the plena. A validation exercise of the new tool is being performed by analyzing the protected loss-of-flow event demonstrated by the EBR-II Shutdown Heat Removal Test 17. In this analysis, the behavior of the coolant in the cold pool is modeled using the CFD code STAR-CCM+, while the remainder of the cooling system and the reactor core are modeled with SAS4A/SASSYS-1. This paper summarizes the code integration strategy and provides the predicted 3D temperature and velocity distributions inside the cold pool during SHRT-17. The results of the coupled analysis should be considered preliminary at this stage, as the exercise pointed to the need to improve the CFD model of the cold pool tank. (authors)

Thomas, J. W.; Fanning, T. H.; Vilim, R.; Briggs, L. L. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4842 (United States)

2012-07-01

324

Laser-silicon interaction for selective emitter formation in photovoltaics. I. Numerical model and validation  

NASA Astrophysics Data System (ADS)

Laser doping to form selective emitters offers an attractive method to increase the performance of silicon wafer based photovoltaics. However, the effect of processing conditions, such as laser power and travel speed, on molten zone geometry and the phosphorus dopant profile is not well understood. A mathematical model is developed to quantitatively investigate and understand how processing parameters impact the heat and mass transfer and fluid flow during laser doping using continuous wave lasers. Calculated molten zone dimensions and dopant concentration profiles are in good agreement with independent experimental data reported in the literature. The mechanisms for heat (conduction) and mass (convection) transport are examined, which lays the foundation for quantitatively understanding the effect of processing conditions on molten zone geometry and dopant concentration distribution. The validated model and insight into heat and mass transport mechanisms also provide the bases for developing process maps, which are presented in part II. These maps illustrate the effects of output power and travel speed on molten zone geometry, average dopant concentration, dopant profile shape, and sheet resistance.

Blecher, J. J.; Palmer, T. A.; DebRoy, T.

2012-12-01

325

A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection  

Microsoft Academic Search

We review accuracy estimation methods and compare the two most common methods: cross-validation and bootstrap. Recent experimental results on artificial data and theoretical results in restricted settings have shown that for selecting a good classifier from a set of classifiers (model selection), ten-fold cross-validation may be better than the more expensive leave-one-out cross-validation. We report

Ron Kohavi

1995-01-01

326

On-Line Processing of Unidirectional Fiber Composites Using Radiative Heating: II. Radiative Properties, Experimental Validation and Process Parameter Selection  

Microsoft Academic Search

Experimental validation is presented for a detailed thermal model (described in Paper I) for on-line processing of unidirectional fiber composites by surface or volumetric radiative heating. Surface and volumetric radiative properties of unidirectional graphite/epoxy and glass/epoxy are presented: measurements of the complex refractive index of an uncured and cured 3501-6 epoxy resin as a function of wavelength; semi-empirical extinction and

Bih-Cherng Chern; Tess J. Moon; John R. Howell

2002-01-01

327

DISCRETE EVENT MODELING IN PTOLEMY II  

Microsoft Academic Search

This report describes the discrete-event semantics and its implementation in the Ptolemy II software architecture. The discrete-event system representation is appropriate for time-oriented systems such as queueing systems, communication networks, and hardware systems. A key strength in our discrete-event implementation is that simultaneous events are handled systematically and deterministically. A formal and rigorous treatment of this property is given. One
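To illustrate what deterministic handling of simultaneous events can look like in practice, here is a toy event scheduler in Python that breaks timestamp ties with an insertion sequence number. It is only a sketch of the general idea, not Ptolemy II's calendar-queue implementation or its formal semantics; all names are hypothetical.

```python
import heapq
from typing import Callable

class DiscreteEventScheduler:
    """Toy event queue: events with equal timestamps are processed in a
    deterministic order given by an insertion sequence number."""
    def __init__(self):
        self._queue = []       # entries: (time, seq, action)
        self._seq = 0          # tie-breaker for simultaneous events
        self.now = 0.0

    def schedule(self, time: float, action: Callable[[], None]):
        heapq.heappush(self._queue, (time, self._seq, action))
        self._seq += 1

    def run(self, until: float):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

# Example: two events at the same timestamp fire in a fixed, repeatable order.
sched = DiscreteEventScheduler()
sched.schedule(1.0, lambda: print("server A handles request"))
sched.schedule(1.0, lambda: print("server B handles request"))
sched.run(until=2.0)
```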

Lukito Muliadi

328

Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation  

NASA Astrophysics Data System (ADS)

Models of geographical choice behavior have been dominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (conjunctive, disjunctive and lexicographic rule) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of heuristic models are slightly better than those of the multinomial logit models.
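A minimal sketch of the kind of threshold-based screening rules mentioned above (conjunctive and lexicographic) is given below. Attribute names, thresholds, and data are hypothetical and do not reproduce the estimated models or the threshold-heterogeneity extension.

```python
def conjunctive_accept(attributes, thresholds):
    """Accept an alternative only if every attribute meets its threshold."""
    return all(attributes[k] >= t for k, t in thresholds.items())

def lexicographic_choice(alternatives, attribute_order, thresholds):
    """Screen alternatives attribute by attribute, in order of importance,
    keeping only those above the threshold, until one (or none) survives."""
    remaining = list(alternatives)
    for attr in attribute_order:
        passed = [a for a in remaining if a["attrs"][attr] >= thresholds[attr]]
        if passed:
            remaining = passed
        if len(remaining) == 1:
            break
    return remaining

stores = [
    {"name": "store_1", "attrs": {"proximity": 0.9, "variety": 0.4}},
    {"name": "store_2", "attrs": {"proximity": 0.7, "variety": 0.8}},
]
print(lexicographic_choice(stores, ["proximity", "variety"],
                           {"proximity": 0.6, "variety": 0.5}))
```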

Zhu, Wei; Timmermans, Harry

2011-06-01

329

Heterogeneous Concurrent Modeling and Design in Java (Volume 3: Ptolemy II Domains)  

E-print Network

Front-matter residue only; no abstract is recoverable for this record. Title: Heterogeneous Concurrent Modeling and Design in Java (Volume 3: Ptolemy II Domains). Authors listed include Shuvra S. Bhattacharyya, Christopher Brooks, and Haiyang Zheng; acknowledged sponsors include Infineon, Microsoft, National Instruments, and Toyota.

330

Development and validation of a dynamical atmosphere-vegetation-soil HTO transport and OBT formation model.  

PubMed

A numerical model simulating the transport of tritiated water (HTO) in the atmosphere-soil-vegetation system and the accumulation of organically bound tritium (OBT) in vegetative leaves was developed. A characteristic of the model is that, for calculating tritium transport, it incorporates a dynamical atmosphere-soil-vegetation model (SOLVEG-II) that calculates the transport of heat and water and the exchange of CO(2). The processes included for calculating tissue free water tritium (TFWT) in leaves are HTO exchange between canopy air and leaf cellular water, root uptake of aqueous HTO in soil, photosynthetic assimilation of TFWT into OBT, and TFWT formation from OBT through respiration. Tritium fluxes from the last two processes are input to a carbohydrate compartment model in leaves that calculates OBT translocation from leaves and allocation within them, using the photosynthesis and respiration rates in leaves. The developed model was then validated through a simulation of an existing experiment on acute exposure of grape plants to atmospheric HTO. The calculated TFWT concentration in leaves increased soon after the start of HTO exposure, reaching equilibrium with the atmospheric HTO within a few hours, and then rapidly decreased after the end of the exposure. The calculated non-exchangeable OBT amount in leaves increased linearly during the exposure and, after the exposure, decreased rapidly in the daytime and moderately at night. These variations in the calculated TFWT concentrations and OBT amounts, controlled mainly by HTO exchange between canopy air and leaf cellular water and by carbohydrate translocation from leaves, respectively, agreed with the observations within average errors of a factor of two. PMID:21665337

Ota, Masakazu; Nagai, Haruyasu

2011-09-01

331

Validation of models for global irradiance on inclined planes  

Microsoft Academic Search

The accuracy of models to estimate irradiance on inclined planes is tested by comparing the predictions to measurements taken with four instruments of various tilt and azimuth angles in Sede Boqer, Israel. The three models investigated are: the Perez model, Hay's anisotropic model, and the isotropic model. The Perez model is found to perform significantly better than the other two,
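Of the three models compared above, only the isotropic sky model is simple enough to sketch here. The expression below follows the standard Liu-Jordan form of the isotropic model; all input values are illustrative, and the Perez and Hay anisotropic formulations are not reproduced.

```python
import math

def isotropic_tilted_irradiance(beam_h, diffuse_h, tilt_deg, zenith_deg,
                                incidence_deg, albedo=0.2):
    """Global irradiance on a tilted plane under the isotropic sky assumption.

    beam_h, diffuse_h : beam and diffuse irradiance on the horizontal [W/m^2]
    tilt_deg          : surface tilt from horizontal
    zenith_deg        : solar zenith angle
    incidence_deg     : angle of incidence on the tilted surface
    """
    beta = math.radians(tilt_deg)
    cos_z = math.cos(math.radians(zenith_deg))
    cos_i = math.cos(math.radians(incidence_deg))
    rb = max(cos_i, 0.0) / max(cos_z, 1e-3)      # beam geometric factor
    global_h = beam_h + diffuse_h
    return (beam_h * rb
            + diffuse_h * (1 + math.cos(beta)) / 2.0
            + global_h * albedo * (1 - math.cos(beta)) / 2.0)

print(isotropic_tilted_irradiance(600.0, 150.0, tilt_deg=30.0,
                                  zenith_deg=40.0, incidence_deg=25.0))
```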

D. Feuremann; A. Zemel

1992-01-01

332

Deformable gel dosimetry II: experimental validation of DIR-based dose-warping  

NASA Astrophysics Data System (ADS)

Algorithms exist for the deformation of radiotherapy doses based on patient image sets, though these are sometimes contentious because not all such image calculations are constrained by appropriate physical laws. By use of a deformable dosimetric gel phantom, 'DEFGEL', we demonstrate a full 3D experimental validation of a range of publicly available dose deformation algorithms. Spatial accuracy in low contrast areas was assessed using "ghost" fiducial markers (digitally removed from CT images prior to registration) implanted in the phantom. The accuracy with which the different algorithms deform dose was evaluated by comparing doses measured with the deformable phantom to warped planned doses, via 3D γ-analysis. Mean spatial errors ranged from 1.9 mm with a γ3D passing ratio of 95.8% for the original Horn and Schunck algorithm to 3.9 mm with a γ3D passing ratio of 39.9% for the modified demons algorithm.
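As a rough illustration of the γ-analysis used above, the sketch below computes a simplified global 3D gamma pass rate on co-registered regular grids with equal voxel spacing, using a brute-force search within a small window. Thresholds, spacing, and function names are illustrative assumptions; this is not the registration or gamma implementation used by the authors.

```python
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dd_percent=3.0,
                    dta_mm=3.0, search_mm=6.0, dose_threshold=0.1):
    """Simplified global 3D gamma pass rate on co-registered regular grids."""
    norm = dose_ref.max()
    dd = dd_percent / 100.0 * norm
    r = int(round(search_mm / spacing_mm))
    offsets = [(i, j, k) for i in range(-r, r + 1)
                          for j in range(-r, r + 1)
                          for k in range(-r, r + 1)]
    passed, evaluated = 0, 0
    for idx in np.ndindex(dose_ref.shape):
        if dose_ref[idx] < dose_threshold * norm:
            continue            # ignore the low-dose region
        evaluated += 1
        best = np.inf
        for di, dj, dk in offsets:
            p = (idx[0] + di, idx[1] + dj, idx[2] + dk)
            if any(c < 0 or c >= s for c, s in zip(p, dose_eval.shape)):
                continue
            dist2 = (di**2 + dj**2 + dk**2) * spacing_mm**2
            diff2 = (dose_eval[p] - dose_ref[idx])**2
            best = min(best, dist2 / dta_mm**2 + diff2 / dd**2)
        passed += best <= 1.0   # gamma^2 <= 1 means the voxel passes
    return passed / max(evaluated, 1)
```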

Yeo, U. J.; Taylor, M. L.; Supple, J. R.; Smith, R. L.; Kron, T.; Franich, R. D.

2013-06-01

333

Are the binary typology models of alcoholism valid in polydrug abusers?  

PubMed

Objective: To evaluate the dichotomy of type I/II and type A/B alcoholism typologies in opiate-dependent patients with a comorbid alcohol dependence problem (ODP-AP). Methods: The validity assessment process comprised the information regarding the history of alcohol use (internal validity), cognitive-behavioral variables regarding substance use (external validity), and indicators of treatment during 6-month follow-up (predictive validity). Results: ODP-AP subjects classified as type II/B presented an early and much more severe drinking problem and a worse clinical prognosis when considering opiate treatment variables as compared with ODP-AP subjects defined as type I/A. Furthermore, type II/B patients endorse more general positive beliefs and expectancies related to the effect of alcohol and tend to drink heavily across several intra- and interpersonal situations as compared with type I/A patients. Conclusions: These findings confirm two different forms of alcohol dependence, recognized as a low-severity/vulnerability subgroup and a high-severity/vulnerability subgroup, in an opiate-dependent population with a lifetime diagnosis of alcohol dependence. PMID:25372059

Pombo, Samuel; Costa, Nuno F da; Figueira, Maria L

2014-10-31

334

Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--7.  

PubMed

Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well the model reproduces reality). This report describes recommendations for achieving transparency and validation developed by a taskforce appointed by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. Recommendations were developed iteratively by the authors. A nontechnical description--including model type, intended applications, funding sources, structure, intended uses, inputs, outputs, other components that determine function, and their relationships, data sources, validation methods, results, and limitations--should be made available to anyone. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (check accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results with real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest form of validation. Each section of this article contains a number of recommendations that were iterated among the authors, as well as among the wider modeling taskforce, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. PMID:22999134

Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

2012-01-01

335

The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness.  

SciTech Connect

This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal effects. Finally, technology readiness testing demonstrated the integrated elements of the sensor under realistic environments.

Walraven, Jeremy Allen; Blecke, Jill; Baker, Michael Sean; Clemens, Rebecca C.; Mitchell, John Anthony; Brake, Matthew Robert; Epp, David S.; Wittwer, Jonathan W.

2008-10-01

336

Systematic clinical methodology for validating bipolar-II disorder: data in mid-stream from a French national multi-site study (EPIDEP)  

Microsoft Academic Search

Background: This paper presents the methodology and clinical data in mid-stream from a French multi-center study (EPIDEP) in progress on a national sample of patients with DSM-IV major depressive episode (MDE). The aim of EPIDEP is to show the feasibility of validating the spectrum of soft bipolar disorders by practising clinicians. In this report, we focus on bipolar II (BP-II).

Elie G Hantouche; Hagop S Akiskal; Sylvie Lancrenon; Jean-François Allilaire; Daniel Sechter; Jean-Michel Azorin; Marc Bourgeois; Jean-Philippe Fraud; Liliane Châtenet-Duchêne

1998-01-01

337

Validation of a Model of a Resonant Optothermoacoustic Trace Gas Sensor  

E-print Network

Fragmentary record: a model of a resonant optothermoacoustic trace gas sensor is validated by comparison with experiments performed with 0.5% acetylene in nitrogen. The recoverable text also compares the sensor's sensitivity with that of QEPAS when the ambient pressure is low enough (roughly 50 Torr).

Minkoff, Susan E.

338

Numerical modeling and experimental validation of dynamic fracture events along weak planes  

E-print Network

Fragmentary record. Keywords (partially recovered): ...scale simulations, cohesive zone model, validation, verification, Hopkinson bar, photoelasticity. The recoverable abstract text concerns the limitations of entirely empirical design and optimization of structural systems in ballistic and impact applications, and the need, manifested by these experiences, for a simulation/validation-driven experimental design.

Huerta, Antonio

339

A free wake vortex lattice model for vertical axis wind turbines: Modeling, verification and validation  

NASA Astrophysics Data System (ADS)

Since the 1970s several research activities have been carried out on developing aerodynamic models for Vertical Axis Wind Turbines (VAWTs). In order to design large VAWTs of MW scale, more accurate aerodynamic calculation is required to predict their aero-elastic behaviours. In this paper, a 3D free wake vortex lattice model for VAWTs is developed, verified and validated. Comparisons to the experimental results show that the 3D free wake vortex lattice model developed is capable of making an accurate prediction of the general performance and the instantaneous aerodynamic forces on the blades. The comparison between the momentum method and the vortex lattice model shows that free wake vortex models are needed for detailed load calculations and for calculating highly loaded rotors.

Meng, Fanzhong; Schwarze, Holger; Vorpahl, Fabian; Strobel, Michael

2014-12-01

340

Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-7.  

PubMed

Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well it reproduces reality). This report describes recommendations for achieving transparency and validation, developed by a task force appointed by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM). Recommendations were developed iteratively by the authors. A nontechnical description should be made available to anyone, including model type and intended applications; funding sources; structure; inputs, outputs, other components that determine function, and their relationships; data sources; validation methods and results; and limitations. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (check accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results to real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest form of validation. Each section of this paper contains a number of recommendations that were iterated among the authors, as well as the wider modeling task force jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. PMID:22990088

Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

2012-01-01

341

NTHMP Model Validation Workshop, Galveston, TX, Mar. 30 - Apr. 1, 2011: Validation of MOST (Method Of  

E-print Network

Record contains only citation fragments and plot residue. Recoverable citations: (1) "... Using VTCS-2," J. Waterway, Port, Coastal and Ocean Eng., V 121, N 6, 308-316; (2) V. V. Titov and C. E. Synolakis (1998), "Numerical Modeling of Tidal Wave Runup," J. Waterway, Port, Coastal and Ocean Eng., V 124. The remaining numbers are axis-tick residue from a tide-gauge time-series plot (panel labels: earthquake, Pacific Islands, Alaska; gauge: Port Angeles, WA).

Tolkova, Elena

342

A multi-source satellite data approach for modelling Lake Turkana water level: calibration and validation using satellite altimetry data  

NASA Astrophysics Data System (ADS)

Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated in the range up to 4 m between the years 1998-2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins.
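For reference, the two agreement statistics quoted above (Pearson's correlation coefficient and the Nash-Sutcliffe Coefficient of Efficiency) can be computed as in the sketch below. The lake-level numbers are made up for illustration and are not the study's data.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Coefficient of Efficiency (1.0 = perfect fit)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

def pearson_r(observed, simulated):
    """Pearson correlation coefficient between observed and simulated series."""
    return float(np.corrcoef(observed, simulated)[0, 1])

# Hypothetical altimetry-derived and modelled lake levels (metres)
obs = [362.1, 362.4, 362.9, 363.5, 363.2, 362.8]
sim = [362.0, 362.5, 363.1, 363.4, 363.0, 362.7]
print(nash_sutcliffe(obs, sim), pearson_r(obs, sim))
```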

Velpuri, N. M.; Senay, G. B.; Asante, K. O.

2012-01-01

343

Composing Different Models of Computation in Kepler and Ptolemy II  

Microsoft Academic Search

A model of computation (MoC) is a formal abstraction of execution in a computer. There is a need for composing MoCs in e-science. Kepler, which is based on Ptolemy II, is a scientific workflow environment that allows for MoC composition. This paper explains how MoCs are combined in Kepler and Ptolemy II and analyzes which combinations of MoCs are currently

Antoon Goderis; Christopher Brooks; Ilkay Altintas; Edward A. Lee; Carole A. Goble

2007-01-01

344

Modeling and Validation of a Three-Stage Solidification Model for Sprays  

NASA Astrophysics Data System (ADS)

A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
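The three stages described above can be caricatured with a lumped-capacitance energy balance and a freezing progress variable, as in the sketch below. All property values are illustrative and are not cocoa butter data, and the real model resolves nucleation, crystal growth, and spray transport that are omitted here.

```python
import numpy as np

def droplet_three_stage(T0, T_gas, T_freeze, h, area, mass, cp, L_f,
                        dt=1e-3, t_end=5.0):
    """Lumped three-stage history: cooling, isothermal solidification, cooling.

    Returns arrays of time, droplet temperature and freezing progress (0..1).
    """
    t, T, f = [0.0], [T0], [0.0]
    while t[-1] < t_end:
        q = h * area * (T[-1] - T_gas)          # convective heat loss [W]
        if f[-1] < 1.0 and T[-1] <= T_freeze:   # stage 2: latent heat release
            f_new = min(1.0, f[-1] + q * dt / (mass * L_f))
            T_new = T_freeze
        else:                                   # stages 1 and 3: sensible cooling
            f_new = f[-1]
            T_new = T[-1] - q * dt / (mass * cp)
        t.append(t[-1] + dt); T.append(T_new); f.append(f_new)
    return np.array(t), np.array(T), np.array(f)

time, temp, frozen = droplet_three_stage(
    T0=320.0, T_gas=280.0, T_freeze=300.0, h=100.0,
    area=3e-6, mass=5e-7, cp=2000.0, L_f=1.5e5)
print(temp[-1], frozen[-1])
```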

Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

2010-09-01

345

Transient PVT measurements and model predictions for vessel heat transfer. Part II.  

SciTech Connect

Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.

Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.; Evans, Gregory Herbert; Rice, Steven F.

2010-07-01

346

TILTING SATURN. II. NUMERICAL MODEL Douglas P. Hamilton  

E-print Network

Article header residue (received 2003 December 30; accepted 2004 July 15) followed by a fragmentary abstract: "We argue that the gas giants Jupiter and Saturn ... of Saturn's rotation to that of Neptune's orbit. Strong support for this model comes from (1) a near-match ..."

Hamilton, Douglas P.

347

Tilting Saturn II. Numerical Model Douglas P. Hamilton  

E-print Network

Fragmentary abstract: "... Jupiter and Saturn were both formed with their rotation axes nearly perpendicular to their orbital planes ... of Saturn's rotation to that of Neptune's orbit. Strong support for this model comes from i) a near match ..."

Hamilton, Douglas P.

348

Description of a Website Resource for Turbulence Modeling Verification and Validation  

NASA Technical Reports Server (NTRS)

The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

2010-01-01

349

Model Identification and Validation for a Heating System using MATLAB System Identification Toolbox  

NASA Astrophysics Data System (ADS)

This paper proposes a systematic approach to selecting a mathematical model for an industrial heating system by adopting system identification techniques, with the aim of fulfilling the design requirements for the controller. The model identification process begins by collecting real measurement data samples with the aid of the MATLAB System Identification Toolbox. The criteria for selecting the model that best validates the model output against actual data are based upon: a parametric identification technique, picking the best low-order model structure among ARX, ARMAX and BJ, and then applying model estimation and validation tests. Simulated results show that the BJ model is best at providing good estimation and validation according to performance criteria such as final prediction error, loss function, best percentage of model fit, and correlation analysis of the residuals for the output.
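As a minimal illustration of the parametric identification workflow described above, the sketch below fits a first-order ARX model by least squares and reports a normalized-RMSE fit percentage. It stands in for, and is not, the MATLAB System Identification Toolbox; the data are synthetic and every name is a placeholder.

```python
import numpy as np

def fit_arx_1_1(u, y):
    """Least-squares fit of y[k] = -a*y[k-1] + b*u[k-1] (ARX(1,1))."""
    phi = np.column_stack([-y[:-1], u[:-1]])     # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta                                  # [a, b]

def simulate_arx_1_1(theta, u, y0=0.0):
    a, b = theta
    y_hat = np.zeros(len(u)); y_hat[0] = y0
    for k in range(1, len(u)):
        y_hat[k] = -a * y_hat[k - 1] + b * u[k - 1]
    return y_hat

def fit_percent(y, y_hat):
    """Normalized-RMSE fit in percent (100 = perfect reproduction)."""
    return 100.0 * (1 - np.linalg.norm(y - y_hat) / np.linalg.norm(y - y.mean()))

# Synthetic heater-like data: first-order lag driven by a step input.
rng = np.random.default_rng(1)
u = np.ones(200); y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.9 * y[k - 1] + 0.5 * u[k - 1] + 0.01 * rng.standard_normal()
theta = fit_arx_1_1(u, y)
print(theta, fit_percent(y, simulate_arx_1_1(theta, u, y0=y[0])))
```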

Junaid Rabbani, Muhammad; Hussain, Kashan; khan, Asim-ur-Rehman; Ali, Abdullah

2013-12-01

350

Model-Based Verification and Validation of Spacecraft Avionics  

NASA Technical Reports Server (NTRS)

Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

Khan, Mohammed Omair

2012-01-01

351

Biomarker Discovery and Validation for Proteomics and Genomics: Modeling And Systematic Analysis  

E-print Network

Discovery and validation of protein biomarkers with high specificity is the main challenge of current proteomics studies. Different mass spectrometry models are used as shotgun tools for discovery of biomarkers which is usually done on a small...

Atashpazgargari, Esmaeil

2014-08-27

352

Validation of Models for Prediction of BRCA1 and BRCA2 Mutations  

Cancer.gov

Validation of Models for Prediction of BRCA1 and BRCA2 Mutations Giovanni Parmigiani, The Sidney Kimmel Comprehensive Cancer Center and the Department of Biostatistics, Johns Hopkins University Tara Friebel, Center for Clinical Epidemiology and Biostatistics,

353

Validation of and enhancements to an operating-speed-based geometric design consistency evaluation model  

E-print Network

This thesis documents efforts to validate two elements related to an operating-speed-based geometric design consistency evaluation procedure: (1) the speed reduction estimation ability of the model, and (2) assumptions about acceleration...

Collins, Kent Michael

2012-06-07

354

Climatically Diverse Data Set for Flat-Plate PV Module Model Validations (Presentation)  

SciTech Connect

Photovoltaic (PV) module I-V curves were measured at Florida, Colorado, and Oregon locations to provide data for the validation and development of models used for predicting the performance of PV modules.

Marion, B.

2013-05-01

355

Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.  

PubMed

A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. PMID:25361792

Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

2014-10-31

356

Incremental “model-build-test” validation exercise for a 1-D biomedical ultrasonic imaging array  

Microsoft Academic Search

Quantitative validation is critical to the effective utilization of large-scale modeling in advanced biomedical ultrasonic imaging applications. This work describes an incremental “model-build-test” validation exercise centered around a nonproprietary, 5 MHz, 1D linear array design. The step-by-step sequence reported here includes piezoceramic slivers, slivers with matching and slivers with both backing and matching. Furthermore, prior to the fabrication process, all

D. J. Powell; G. L. Wojcik; C. S. Desilets; T. R. Gururaja; K. Guggenberger; S. Sherrit; B. K. Mukherjee

1997-01-01

357

Why test animals to treat humans? On the validity of animal models.  

PubMed

Critics of animal modeling have advanced a variety of arguments against the validity of the practice. The point of one such form of argument is to establish that animal modeling is pointless and therefore immoral. In this article, critical arguments of this form are divided into three types, the pseudoscience argument, the disanalogy argument, and the predictive validity argument. I contend that none of these criticisms currently succeed, nor are they likely to. However, the connection between validity and morality is important, suggesting that critical efforts would be instructive if they addressed it in a more nuanced way. PMID:20934650

Shelley, Cameron

2010-09-01

358

Application of CFD techniques toward the validation of nonlinear aerodynamic models  

NASA Technical Reports Server (NTRS)

Applications of computational fluid dynamics (CFD) methods to determine the regimes of applicability of nonlinear models describing the unsteady aerodynamic responses to aircraft flight motions are described. The potential advantages of computational methods over experimental methods are discussed and the concepts underlying mathematical modeling are reviewed. The economic and conceptual advantages of the modeling procedure over coupled, simultaneous solutions of the gas dynamic equations and the vehicle's kinematic equations of motion are discussed. The modeling approach, when valid, eliminates the need for costly repetitive computation of flow field solutions. For the test cases considered, the aerodynamic modeling approach is shown to be valid.

Schiff, L. B.; Katz, J.

1985-01-01

359

GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose  

NASA Technical Reports Server (NTRS)

This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurements uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

2014-01-01

360

Distinguishing 'new' from 'old' organic carbon in reclaimed coal mine sites using thermogravimetry: II. Field validation  

SciTech Connect

Thermogravimetry was used under laboratory conditions to differentiate 'new' and 'old' organic carbon (C) by using grass litter, coal, and limestone to represent the different C fractions. Thermogravimetric and derivative thermogravimetry curves showed pyrolysis peaks at distinctively different temperatures, with the peak for litter occurring at 270 to 395 °C, for coal at 415 to 520 °C, and for limestone at 700 to 785 °C. To validate this method in a field setting, we studied four reforested coal mine sites in Kentucky representing a chronosequence since reclamation: 0 and 2 years located at Bent Mountain and 3 and 8 years located at the Starfire mine. A nonmined mature (approximately 80 years old) stand at Robinson Forest, Kentucky, was selected as a reference location. Results indicated a general peak increase in the 270 to 395 °C region with increased time, signifying an increase in the 'new' organic matter (OM) fraction. For the Bent Mountain site, the OM fraction increased from 0.03 to 0.095% between years 0 and 2, whereas the Starfire site showed an increase from 0.095 to 1.47% between years 3 and 8. This equates to a C sequestration rate of 2.92 Mg ha⁻¹ yr⁻¹ for 'new' OM in the upper 10-cm layer during the 8 years of reclamation on eastern Kentucky reclaimed coal mine sites. Results suggest that stable isotopes and elemental data can be used as proxy tools for qualifying soil organic C (SOC) changes over time on the reclaimed coal mine sites but cannot be used to determine the exact SOC accumulation rate. However, results suggested that the thermogravimetric and derivative thermogravimetry methods can be used to quantify SOC accumulation and have the potential to be a more reliable, cost-effective, and rapid means to determine the new organic C fraction in mixed geological material, especially in areas dominated by coal and carbonate materials.

Maharaj, S.; Barton, C.D.; Karathanasis, T.A.D.; Rowe, H.D.; Rimmer, S.M. [University of Kentucky, Lexington, KY (United States). Dept. of Forestry

2007-04-15

361

Validation and assessment of integer programming sensor placement models.  

SciTech Connect

We consider the accuracy of predictions made by integer programming (IP) models of sensor placement for water security applications. We have recently shown that IP models can be used to find optimal sensor placements for a variety of different performance criteria (e.g. minimize health impacts and minimize time to detection). However, these models make a variety of simplifying assumptions that might bias the final solution. We show that our IP modeling assumptions are similar to models developed for other sensor placement methodologies, and thus IP models should give similar predictions. However, this discussion highlights that there are significant differences in how temporal effects are modeled for sensor placement. We describe how these modeling assumptions can impact sensor placements.

Uber, James G. (University of Cincinnati, Cincinnati, OH); Hart, William Eugene; Watson, Jean-Paul; Phillips, Cynthia Ann; Berry, Jonathan W.

2005-02-01

362

Differential validation of the US-TEC model  

NASA Astrophysics Data System (ADS)

This paper presents a validation and accuracy assessment of the total electron content (TEC) from US-TEC, a new product presented by the Space Environment Center over the contiguous United States (CONUS). US-TEC is a real-time operational implementation of the MAGIC code and provides TEC maps every 15 min and the line-of-sight electron content between any point within the CONUS and all GPS satellites in view. Validation of TEC is difficult since there are no absolute or true values of TEC. All methods of obtaining TEC, for instance, from GPS, ocean surface monitors (TOPEX), and lightning detectors (FORTE), have challenges that limit their accuracy. GPS data have interfrequency biases; TOPEX also has biases, and data are collected only over the oceans; and FORTE can eliminate biases, but because of the lower operating frequency, the signals suffer greater bending on the rays. Because of the difficulty in obtaining an absolute unbiased TEC measurement, a "differential" accuracy estimate has been performed. The method relies on the fact that uninterrupted GPS data along a particular receiver-satellite link with no cycle slips are very precise. The phase difference (scaled to TEC units) from one epoch to the next can be determined with an accuracy of less than 0.01 TEC units. This fact can be utilized to estimate the uncertainty in the US-TEC vertical and slant path maps. By integrating through US-TEC inversion maps at two different times, the difference in the slant TEC can be compared with the direct phase difference in the original RINEX data file for nine receivers not used in the US-TEC calculations. The results of this study, for the period of April-September 2004, showed an average root mean square error of 2.4 TEC units, which is equivalent to less than 40 cm of signal delay at the GPS L1 frequency. The accuracy estimates from this "differential" method are similar to the results from a companion paper utilizing an "absolute" validation method by comparing with FORTE data.
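The "differential" check described above reduces, in essence, to comparing epoch-to-epoch changes in slant TEC from the maps against the precise (but biased) phase-derived changes, and summarizing the disagreement as an RMS error. A minimal sketch with hypothetical numbers follows; it is not the US-TEC/MAGIC processing chain.

```python
import numpy as np

def differential_rms(model_stec, phase_stec):
    """RMS of the difference between modelled and phase-derived epoch-to-epoch
    changes in slant TEC. The phase-derived series is precise but biased, so
    only its differences are meaningful for the comparison."""
    d_model = np.diff(np.asarray(model_stec, dtype=float))
    d_phase = np.diff(np.asarray(phase_stec, dtype=float))
    return float(np.sqrt(np.mean((d_model - d_phase) ** 2)))

# Hypothetical slant TEC series (TEC units) for one receiver-satellite link
model = [24.1, 24.8, 25.9, 27.2, 28.0]
phase = [10.0, 10.6, 11.8, 13.3, 14.0]   # biased level, precise differences
print(differential_rms(model, phase))
```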

Araujo-Pradere, E. A.; Fuller-Rowell, T. J.; Spencer, P. S. J.; Minter, C. F.

2007-06-01

363

Comparison of Ecological Validity of Learning Disabilities Diagnostic Models  

ERIC Educational Resources Information Center

The purpose of this article is to examine models designed for the determination of a learning disability and compare them to specific criteria to determine whether the given diagnostic process is ecological in nature. The traditional child-centered deficit model (CCD), Relative Achievement Discrepancy model (RAD), and Responsiveness to…

Dean, Vincent J.; Burns, Matthew K.; Grialou, Tina; Varro, Patrick J.

2006-01-01

364

An Experimentally Validated Model of the Paging Drum  

Microsoft Academic Search

The behaviour of the paging drum or disc has a significant impact on the performance of multiprogrammed, paging systems. A study is made of such devices using the three tools of performance evaluation: an analytical model, a simulation model and empirical measurements. The models extend previous studies by including non-exponential arrival times and a comparison is made between the proposed

Colin Adams; Erol Gelenbe; Jean Vicard

1979-01-01

365

Resilience in an ocean model Strategy, implementation and validation  

E-print Network

Fragmentary record: towards exascale climate model implementations, the authors propose to explore the possibility of resilience, i.e. a resilience strategy for scientific programs such as climate models when a failure occurs on a certain MPI process. The rest of the text is introduction residue ("For decades, climate modeling has been benefiting from the computing progress ...").

366

FIELD VALIDATION OF EXPOSURE ASSESSMENT MODELS. VOLUME 2. ANALYSIS  

EPA Science Inventory

This is the second of two volumes describing a series of dual tracer experiments designed to evaluate the PAL-DS model, a Gaussian diffusion model modified to take into account settling and deposition, as well as three other deposition models. In this volume, an analysis of the d...

367

Charge inventory system modeling and validation for unitary air conditioners  

Microsoft Academic Search

Charge inventory, the accounting of distributed fluid mass in closed systems, is important to the modeling of vapor compression systems. Public domain simulation models used to predict the performance of unitary equipment are currently unable to accurately determine charge inventory. Several issues have been identified as sources of error in these models including incomplete internal volume accounting, absent refrigerant-oil diffusion

Todd Michael Harms

2002-01-01

368

Renormalization Group Evolution in the type I + II seesaw model  

E-print Network

We carefully analyze the renormalization group equations in the type I + II seesaw scenario in the extended standard model (SM) and minimal supersymmetric standard model (MSSM). Furthermore, we present analytic formulae of the mixing angles and phases and discuss the RG effect on the different mixing parameters in the type II seesaw scenario. The renormalization group equations of the angles have a contribution which is proportional to the mass squared difference for a hierarchical spectrum. This is in contrast to the inverse proportionality to the mass squared difference in the effective field theory case.

Michael Andreas Schmidt

2007-05-25

369

CheS-Mapper 2.0 for visual validation of (Q)SAR models  

PubMed Central

Background: Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Even though a number of visualization tools for analyzing (Q)SAR information in small molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results: We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: it is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions: Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract: Comparing actual and predicted activity values with CheS-Mapper.

2014-01-01

370

PEP-II vacuum system pressure profile modeling using EXCEL  

SciTech Connect

A generic, adaptable Microsoft EXCEL program to simulate molecular flow in beam line vacuum systems is introduced. Modeling using finite-element approximation of the governing differential equation is discussed, as well as error estimation and program capabilities. The ease of use and flexibility of the spreadsheet-based program is demonstrated. PEP-II vacuum system models are reviewed and compared with analytical models.
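The finite-element treatment of molecular flow mentioned above can be illustrated with a one-dimensional finite-difference analogue: specific conductance, distributed outgassing, and distributed pumping assembled into a linear system for the pressure profile. The sketch below uses NumPy rather than EXCEL, and every parameter value is illustrative rather than a PEP-II number.

```python
import numpy as np

def pressure_profile(length_m, n_nodes, c_spec, q_out, s_pump, p_end):
    """Steady 1-D molecular-flow pressure profile along a beam tube.

    Solves  c * d^2P/dz^2 - s*P + q = 0  with fixed pressures at both ends,
    using a second-order finite-difference discretization.
    c_spec : specific conductance of the tube [l*m/s]
    q_out  : distributed outgassing per unit length [Torr*l/(s*m)]
    s_pump : distributed pumping speed per unit length [l/(s*m)]
    p_end  : (P_left, P_right) boundary pressures [Torr]
    """
    dz = length_m / (n_nodes - 1)
    A = np.zeros((n_nodes, n_nodes)); b = np.full(n_nodes, -q_out)
    for i in range(1, n_nodes - 1):
        A[i, i - 1] = A[i, i + 1] = c_spec / dz**2
        A[i, i] = -2 * c_spec / dz**2 - s_pump
    A[0, 0] = A[-1, -1] = 1.0       # Dirichlet conditions at the pump ports
    b[0], b[-1] = p_end
    return np.linalg.solve(A, b)

profile = pressure_profile(length_m=10.0, n_nodes=101, c_spec=100.0,
                           q_out=1e-6, s_pump=0.5, p_end=(1e-8, 1e-8))
print(profile.max())
```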

Nordby, M.; Perkins, C.

1994-06-01

371

Controlled field study for validation of vadose zone transport models  

SciTech Connect

Prediction of radionuclide migration through soil and groundwater requires models which have been tested under a variety of conditions. Unfortunately, many of the existing models have not been tested in the field, partly because such testing requires accurate and representative data. This report provides the design of a large scale field experiment representative, in terms of surface area and depth of vadose zone, of an actual disposal area. Experiments are proposed which will yield documented data, of sufficient scale, to allow testing of a variety of models including effective media stochastic models and deterministic models. Details of the methodology and procedures to be used in the experiment are presented.

Wierenga, P.J.; Warrick, A.W.; Yeh, T.C. [Arizona Univ., Tucson, AZ (United States); Hills, R.G. [New Mexico State Univ., Las Cruces, NM (United States). Dept. of Mechanical Engineering

1994-08-01

372

Modeling the Object-Oriented Space Through Validated Measures  

NASA Technical Reports Server (NTRS)

In order to truly understand software and the software development process, software measurement must be better understood. A beginning step toward a better understanding of software measurement is the categorization of the measurements by some meaningful taxonomy. The most meaningful taxonomy would capture the basic nature of the object-oriented (O-O) space. The interesting characteristics of object-oriented software offer a starting point for such a categorization of measures. A taxonomy has been developed based on fourteen characteristics of object-oriented software gathered from the literature. This taxonomy allows us to easily see gaps and redundancies in the O-O measures. The taxonomy also clearly differentiates among taxa so that there is no ambiguity as to the taxon to which a measure belongs. The taxonomy has been populated with thirty-two measures that have been validated in the narrow sense of Fenton, using measurement theory with Zuse's augmentation.

Neal, Ralph D.

1996-01-01

373

Predictive Models and Computational Toxicology (II IBAMTOX)  

EPA Science Inventory

EPA's 'virtual embryo' project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

374

Nyala and Bushbuck II: A Harvesting Model.  

ERIC Educational Resources Information Center

Adds a cropping or harvesting term to the animal overpopulation model developed in Part I of this article. Investigates various harvesting strategies that might suggest a solution to the overpopulation problem without actually culling any animals. (ASK)
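A common way to add a harvesting term of the kind described above is to subtract a constant cropping rate from a logistic growth law. The sketch below integrates such a model with forward Euler; the parameters are purely illustrative, and the article's actual model and species data are not reproduced.

```python
import numpy as np

def logistic_with_harvest(P0, r, K, H, dt=0.01, years=30):
    """Integrate dP/dt = r*P*(1 - P/K) - H with forward Euler.

    P0 : initial population, r : intrinsic growth rate [1/yr],
    K  : carrying capacity, H : constant harvest rate [animals/yr].
    """
    steps = int(years / dt)
    P = np.empty(steps + 1); P[0] = P0
    for k in range(steps):
        P[k + 1] = max(P[k] + dt * (r * P[k] * (1 - P[k] / K) - H), 0.0)
    return P

# Harvesting below the critical rate r*K/4 lets the herd settle at a
# sustainable equilibrium instead of growing toward the carrying capacity.
final = logistic_with_harvest(P0=400.0, r=0.3, K=1000.0, H=50.0)
print(final[-1])
```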

Fay, Temple H.; Greeff, Johanna C.

1999-01-01

375

Three-Dimensional Human Head Finite-Element Model Validation Against Two Experimental Impacts  

Microsoft Academic Search

The impact response of a three-dimensional human head model has been determined by simulating two cadaver tests. The objective of this study was to validate a finite-element human head model under different impact conditions by considering intracranial compressibility. The current University Louis Pasteur model was subjected initially to a direct head impact, of short (6 ms) duration, and the simulation

Remy Willinger; Ho-Sung Kang; Baye Diaw

1999-01-01

376

An Experimental Validation for Broadband Power-Line Communication (BPLC) Model  

Microsoft Academic Search

Recently, different models have been proposed for analyzing the broadband power-line communication (BPLC) systems based on transmission-line (TL) theory. In this paper, we make an attempt to validate one such BPLC model with laboratory experiments by comparing the channel transfer functions. A good agreement between the BPLC model based on TL theory and experiments are found for channel frequencies up

Justinian Anatory; Nelson Theethayi; Rajeev Thottappillil; Mussa M. Kissaka; Nerey H. Mvungi

2008-01-01

377

Getting a picture that is both accurate and stable: Situation models and epistemic validation  

Microsoft Academic Search

Text comprehension entails the construction of a situation model that prepares individuals for situated action. In order to meet this function, situation model representations are required to be both accurate and stable. We propose a framework according to which comprehenders rely on epistemic validation to prevent inaccurate information from entering the situation model. Once information has been integrated in the

Sascha Schroeder; Tobias Richter; Inga Hoever

2008-01-01

378

Spatial calibration and temporal validation of flow for regional scale hydrologic modeling  

Technology Transfer Automated Retrieval System (TEKTRAN)

Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such regional scale model is necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...

379

Development, experimental validation and dynamic analysis of a general transient biofilter model  

Microsoft Academic Search

In this study, a general transient biofiltration model, which incorporates general mixing phenomena, oxygen limitation effects, adsorption phenomena and general biodegradation reaction kinetics, is developed. Solutions are presented with and without the assumption of pseudo-steady state for the biofilm leading to approximate and general models, respectively. Solutions of the model are presented and validated with experimental transient data of benzene

S. M. Zarook; A. A. Shaikh; Z. Ansar

1997-01-01

380

A Validation of the Factor Structure of OQ-45 Scores Using Factor Mixture Modeling  

ERIC Educational Resources Information Center

This study investigated the Outcome Questionnaire's (OQ-45) factor structure and demonstrated the use of factor mixture modeling (FMM) for the purpose of score validation. OQ-45 scores did not fit the one-class, one- and three-factor models. Use of FMM to identify a two-class model is detailed. Implications for OQ-45 users are provided.

Kim, Seong-Hyeon; Beretvas, S. Natasha; Sherry, Alissa R.

2010-01-01

381

Numerical implementation and validation of a nonlinear viscoelastic and viscoplastic model for asphalt mixes  

Microsoft Academic Search

This study presents the numerical implementation and validation of a constitutive model for describing the nonlinear behaviour of asphalt mixes. This model incorporates nonlinear viscoelasticity and viscoplasticity to predict the recoverable and irrecoverable responses, respectively. The model is represented in a numerical formulation and implemented in a finite element code using a recursive–iterative algorithm for nonlinear viscoelasticity and the radial

Chien-Wei Huang; Rashid K. Abu Al-Rub; Eyad A. Masad; Dallas N. Little; Gordon D. Airey

2011-01-01

382

Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences  

Microsoft Academic Search

Verification and validation of numerical models of natural systems is impossible. This is because natural systems are never closed and because model results are always non-unique. Models can be confirmed by the demonstration of agreement between observation and prediction, but confirmation is inherently partial. Complete confirmation is logically precluded by the fallacy of affirming the consequent and by incomplete access

Naomi Oreskes; Kristin Shrader-Frechette; Kenneth Belitz

1994-01-01

383

Tsunami Generation by Submarine Mass Failure. I: Modeling, Experimental Validation, and Sensitivity Analyses  

E-print Network

Fragmentary abstract: the work presents a two-dimensional (2D) fully nonlinear potential flow (FNPF) model for tsunami generation by two idealized submarine mass failures ... a simple wavemaker formalism, and prescribed as a boundary condition in the FNPF model. The text breaks off at "Tsunami amplitudes ...".

Grilli, Stéphan T.

384

Model validation for structural dynamic analysis: An approach to the Sandia Structural Dynamics Challenge  

Microsoft Academic Search

The validation of mathematical models constructed for the dynamic analysis of critical structures is a very important, but complex, process. The essential requirement is to provide confirmation, using independent and more reliable data than that presented by the model in question, that the subject model is capable of describing the essential physics of the structure’s behaviour within the required accuracy.

C. Zang; C. W. Schwingshackl; D. J. Ewins

2008-01-01

385

Dicobalt II-II, II-III, and III-III complexes as spectroscopic models for dicobalt enzyme active sites.  

PubMed

A matched set of dinuclear cobalt complexes with II-II, II-III, and III-III oxidation states has been prepared and structurally characterized. In [(bpbp)Co2(O2P(OPh)2)2]n+ (n = 1, 2, or 3; bpbp(-) = 2,6-bis((N,N'-bis-(2-picolyl)amino)-methyl)-4-tertbutylphenolato), the nonbonded Co···Co separations are within the range 3.5906(17) to 3.7081(11) angstroms, and the metal ions are triply bridged by the phenolate oxygen atom of the heptadentate dinucleating ligand and by two diphenylphosphate groups. The overall structures and geometries of the complexes are very similar, with minor variations in metal-ligand bond distances consistent with oxidation state assignments. The Co(II)Co(III) compound is a valence-trapped Robin-Day class II complex. Solid state 31P NMR spectra of the diamagnetic Co(III)Co(III) (3) and paramagnetic Co(II)Co(III) (2) and Co(II)Co(II) (1) complexes show that 31P isotropic shifts broaden and move downfield by about 3000 ppm for each increment in oxidation state. Cyclic voltammetry corroborates the existence of the Co(II)Co(II), Co(II)Co(III), and Co(III)Co(III) species in solution. The redox changes are not reversible on the applied scanning timescales, indicating that chemical changes are associated with oxidation and reduction of the cobalt centers. An investigation of the spectroscopic properties of this series has been carried out for its potential usefulness in analyses of the related spectroscopic properties of the dicobalt metallohydrolases. Principally, magnetic circular dichroism (MCD) has been used to determine the strength of the magnetic exchange coupling in the Co(II)Co(II) complex by analysis of the variable-temperature variable-field (VTVH) intensity behavior of the MCD signal. The series is ideal for the spectroscopic determination of magnetic coupling since it can occur only in the Co(II)Co(II) complex. The Co(II)Co(III) complex contains a nearly isostructural Co(II) ion, but since Co(III) is diamagnetic, the magnetic coupling is switched off, while the spectral features of the Co(II) ion remain. Analysis of the MCD data from the Co(II)Co(III) complex has been undertaken in the theoretical context of a 4T1g ground state of the Co(II) ion, initially in an octahedral ligand field that is split by both geometric distortion and zero-field splitting to form an isolated doublet ground state. The MCD data for the Co(II)Co(II) pair in the [(bpbp)Co2(O2P(OPh)2)2]+ complex were fitted to a model based on weak antiferromagnetic coupling with J = -1.6 cm⁻¹. The interpretation is confirmed by solid state magnetic susceptibility measurements. PMID:18494467

Johansson, Frank B; Bond, Andrew D; Nielsen, Ulla Gro; Moubaraki, Boujemaa; Murray, Keith S; Berry, Kevin J; Larrabee, James A; McKenzie, Christine J

2008-06-16

386

Shape memory polymer filled honeycomb model and experimental validation  

NASA Astrophysics Data System (ADS)

An analytical model predicting the in-plane Young’s and shear moduli of a shape memory polymer filled honeycomb composite is presented. By modeling the composite as a series of rigidly attached beams, the mechanical advantage of the load distributed on each beam by the infill is accounted for. The model is compared to currently available analytical models as well as experimental data. The model correlates extremely well with experimental data for empty honeycomb and when the polymer is above its glass transition temperature. Below the glass transition temperature, rule of mixtures is shown to be more accurate as bending is no longer the dominant mode of deformation. The model is also derived for directions other than the typical x and y allowing interpolation of the stiffness of the composite in any direction.

Beblo, R. V.; Puttmann, J. P.; Joo, J. J.; Reich, G. W.

2015-02-01

387

Radiation model predictions and validation using LDEF satellite data  

NASA Technical Reports Server (NTRS)

Predictions and comparisons with the radiation dose measurements on Long Duration Exposure Facility (LDEF) by thermoluminescent dosimeters were made to evaluate the accuracy of models currently used in defining the ionizing radiation environment for low Earth orbit missions. The calculations include a detailed simulation of the radiation exposure (altitude and solar cycle variations, directional dependence) and shielding effects (three-dimensional LDEF geometry model) so that differences in the predicted and observed doses can be attributed to environment model uncertainties. The LDEF dose data are utilized to assess the accuracy of models describing the trapped proton flux, the trapped proton directionality, and the trapped electron flux.

Armstrong, T. W.; Colborn, B. L.

1993-01-01

388

Probabilistic Models for NLP? Empirical Validity and Technological Viability  

E-print Network

an adequate model for natural language processing (NLP). We provide evidence that the Linguistic approach in modeling the input-output behavior of an adult language user in processing actual, naturally occurring tools for dealing with the core problem in language processing: uncertainty. We explain how uncertainty

Sima'an, Khalil

389

Validation and Verification of LADEE Models and Software  

NASA Technical Reports Server (NTRS)

The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

Gundy-Burlet, Karen

2013-01-01

390

An Experimental Validation of Heat Loss Models for Marine Mammals  

Microsoft Academic Search

Various heat loss models have been used to predict metabolic rates or lower critical temperatures of marine mammals. We have evaluated the accuracy of four models by making detailed measurements of all input parameters, while simultaneously recording the metabolic rate in two resting harp seals (Phoca groenlandica) in ice water. We subtracted respiratory heat loss from metabolic rate and compared

Petter H. Kvadsheim; Anne R. L. Gotaas; Lars P. Folkow; Arnoldus S. Blix

1997-01-01

391

Chemical kinetics parameters and model validation for the gasification of PCEA nuclear graphite  

SciTech Connect

A series of gasification experiments, using two right cylinder specimens (~ 12.7 x 25.4 mm and 25.4 x 25.4 mm) of PCEA nuclear graphite in ambient airflow, measured the total gasification flux at weight losses up to 41.5% and temperatures (893-1015 K) characteristic of those for in-pores gasification Mode (a) and in-pores diffusion-limited Mode (b). The chemical kinetics parameters for the gasification of PCEA graphite are determined using a multi-parameter optimization algorithm from the measurements of the total gasification rate and transient weight loss in the experiments. These parameters are: (i) the pre-exponential rate coefficients and the Gaussian distributions and values of specific activation energies for adsorption of oxygen and desorption of CO gas; (ii) the specific activation energy and pre-exponential rate coefficient for the breakup of stable un-dissociated C(O2) oxygen radicals to form stable (CO) complexes; (iii) the specific activation energy and pre-exponential coefficient for desorption of CO2 gas; and (iv) the initial surface area of reactive free sites per unit mass. This area is consistently 13.5% higher than that for nuclear graphite grades NBG-25 and IG-110 and decreases in inverse proportion to the square root of the initial mass of the graphite specimens in the experiments. Experimental measurements successfully validate the chemical-reaction kinetics model, which calculates continuous Arrhenius curves of the total gasification flux and the production rates of CO and CO2 gases. The model results at different total weight losses agree well with measurements and extend beyond the temperatures in the experiments to the diffusion-limited mode of gasification. Also calculated are the production rates of CO and CO2 gases and their relative contributions to the total gasification rate in the experiments as functions of temperature, for total weight losses of 5% and 10%.
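
For readers unfamiliar with the form of such kinetics, a minimal sketch of how Arrhenius-type step rates combine into a temperature-dependent gasification flux is given below; the rate coefficients and activation energies are placeholders, not the fitted PCEA parameters reported in the record.

```python
import numpy as np

R = 8.314  # J/(mol K)

def arrhenius(k0, Ea, T):
    """Arrhenius rate coefficient; Ea in J/mol, T in K."""
    return k0 * np.exp(-Ea / (R * T))

# Illustrative (not fitted) parameters for three surface steps of carbon gasification:
# O2 adsorption, breakup of C(O2) complexes to form CO, and CO2 desorption.
steps = {
    "adsorption_O2":  (1.0e5, 1.2e5),   # (pre-exponential k0, activation energy Ea)
    "breakup_to_CO":  (5.0e7, 2.0e5),
    "desorption_CO2": (2.0e6, 1.6e5),
}

T = np.linspace(893.0, 1015.0, 5)  # temperature range covered by the experiments, K
rates = {name: arrhenius(k0, Ea, T) for name, (k0, Ea) in steps.items()}
for name, r in rates.items():
    print(f"{name:15s}", np.array2string(r, precision=4))

# A crude total flux limited by the slowest step at each temperature:
total_flux = np.min(np.vstack(list(rates.values())), axis=0)
print("rate-limited total flux (arbitrary units):", np.array2string(total_flux, precision=4))
```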

El-Genk, Mohamed S [University of New Mexico, Albuquerque] [University of New Mexico, Albuquerque; Tournier, Jean-Michel [University of New Mexico, Albuquerque] [University of New Mexico, Albuquerque; Contescu, Cristian I [ORNL] [ORNL

2014-01-01

392

Mesoscale Model Validation using Stable Water Isotopes: The isoWATFLOOD Model  

NASA Astrophysics Data System (ADS)

A methodology to improve mesoscale model validation is developed by calibrating simulations of both water and isotope mass simultaneously. The isoWATFLOOD model simulates changes in oxygen-18 of streamflow and of the hydrological processes contributing to streamflow. The added constraint of matching simulated to measured delta oxygen-18 in streamflow lowers the model's degrees of freedom and generates more physically based model parameterizations. Modelled results are shown to effectively reduce and constrain errors associated with equifinality in streamflow generation, providing a practical new approach for the assessment of mesoscale modelling. The WATFLOOD model is a conceptually based distributed hydrological model used for simulating streamflow on mesoscale watersheds. Given the model's intended application to mesoscale hydrology, it remains crucial to ensure conceptualizations are physically representative of the hydrologic cycle and the natural environment. Building upon the existing flowpath-separation module within WATFLOOD, the capability to simulate changes in oxygen-18 through each component of the hydrological cycle is introduced. Masses of heavy isotope are computed for compartmental storages; compartmental flows transfer flux-weighted portions of isotope mass between storages; and mass outflows from each compartment simultaneously combine to form the resultant channel flow composition. Heavy-isotope compositions are enriched when storages undergo evaporation, owing to the loss of isotopically depleted vapour as described by the well-known Craig & Gordon isotopic fractionation model. The isoWATFLOOD model is forced by oxygen-18 in rain, oxygen-18 in snow, and relative humidity, and requires no additional parameterizations of WATFLOOD. The first mesoscale, continuous simulations of changes in oxygen-18 in streamflow are presented for the remote Fort Simpson basin in the Northwest Territories, Canada and for the heavily populated Grand River Basin in southwestern Ontario, using the EnSim post-processor software. These simulations shed light on watershed 'hot spots' and the dominant hydrological controls and responses inherent to various regions.
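
A minimal sketch of the flux-weighted bookkeeping described above (compartment outflows combining into the channel-flow isotopic composition) is shown below; the compartment names, flows, and delta values are hypothetical, and this is not isoWATFLOOD code.

```python
def mix_delta18o(flows, deltas):
    """Flux-weighted delta-18O of channel flow formed from several
    compartment outflows (e.g. surface runoff, interflow, baseflow)."""
    total = sum(flows)
    if total == 0:
        raise ValueError("no outflow to mix")
    return sum(q * d for q, d in zip(flows, deltas)) / total

# Hypothetical compartment outflows (m^3/s) and their isotopic compositions (permil).
flows  = [12.0, 5.0, 3.0]      # surface runoff, interflow, baseflow
deltas = [-17.5, -15.0, -13.2]

print(f"channel delta-18O = {mix_delta18o(flows, deltas):.2f} permil")
```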

Stadnyk, T.; Kouwen, N.; Edwards, T.; Gibson, J.; Pietroniro, A.

2009-05-01

393

Validated biomechanical model for efficiency and speed of rowing.  

PubMed

The speed of a competitive rowing crew depends on the number of crew members, their body mass, sex and the type of rowing (sweep rowing or sculling). The time-averaged speed is proportional to the rower's body mass to the 1/36th power, to the number of crew members to the 1/9th power and to the physiological efficiency (accounted for by the rower's sex) to the 1/3rd power. The quality of the rowing shell and propulsion system is captured by one dimensionless parameter that takes the mechanical efficiency, the shape and drag coefficient of the shell and the Froude propulsion efficiency into account. We derive the biomechanical equation for the speed of rowing by two independent methods and further validate it by successfully predicting race times. We derive the theoretical upper limit of the Froude propulsion efficiency for low viscous flows. This upper limit is shown to be a function solely of the velocity ratio of blade to boat speed (i.e., it is completely independent of the blade shape), a result that may also be of interest for other repetitive propulsion systems. PMID:25189093
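
The stated proportionalities can be turned into a quick numerical comparison, since all prefactors cancel when two boats are compared as a ratio. The sketch below assumes the scaling v ~ m^(1/36) * n^(1/9) * eta^(1/3) quoted in the abstract; the crew masses are illustrative.

```python
def relative_speed(n_crew, body_mass, efficiency, ref=(1, 75.0, 1.0)):
    """Speed relative to a reference boat, using the stated scaling
    v ~ m^(1/36) * n^(1/9) * eta^(1/3); all prefactors cancel in the ratio."""
    n0, m0, e0 = ref
    return ((body_mass / m0) ** (1 / 36)
            * (n_crew / n0) ** (1 / 9)
            * (efficiency / e0) ** (1 / 3))

# A coxless eight with 90 kg rowers versus a 75 kg single sculler:
print(f"eight vs single: {relative_speed(8, 90.0, 1.0):.3f}x faster")
```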

Pelz, Peter F; Vergé, Angela

2014-10-17

394

Nearshore Tsunami Inundation Model Validation: Toward Sediment Transport Applications  

USGS Publications Warehouse

Model predictions from a numerical model, Delft3D, based on the nonlinear shallow water equations are compared with analytical results and laboratory observations from seven tsunami-like benchmark experiments, and with field observations from the 26 December 2004 Indian Ocean tsunami. The model accurately predicts the magnitude and timing of the measured water levels and flow velocities, as well as the magnitude of the maximum inundation distance and run-up, for both breaking and non-breaking waves. The shock-capturing numerical scheme employed describes well the total decrease in wave height due to breaking, but does not reproduce the observed shoaling near the break point. The maximum water levels observed onshore near Kuala Meurisi, Sumatra, following the 26 December 2004 tsunami are well predicted given the uncertainty in the model setup. The good agreement between the model predictions and the analytical results and observations demonstrates that the numerical solution and wetting and drying methods employed are appropriate for modeling tsunami inundation for breaking and non-breaking long waves. Extension of the model to include sediment transport may be appropriate for long, non-breaking tsunami waves. Using available sediment transport formulations, the sediment deposit thickness at Kuala Meurisi is predicted generally within a factor of 2.

Apotsos, Alex; Buckley, Mark; Gelfenbaum, Guy; Jaffe, Bruce; Vatvani, Deepak

2011-01-01

395

Improving Agent Based Models and Validation through Data Fusion  

PubMed Central

This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census / demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
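
As a toy illustration of the modelling approach (not the authors' ABM), the sketch below runs a minimal agent-based SIR process in which each infectious agent meets a few randomly chosen agents per day, standing in for the logged contact data; all parameters are arbitrary.

```python
import random

random.seed(1)

N, BETA, GAMMA, DAYS = 500, 0.03, 0.1, 60  # agents, per-contact infection prob, recovery prob, horizon
CONTACTS_PER_DAY = 8

state = ["S"] * N
for i in random.sample(range(N), 5):       # seed a few infectious agents
    state[i] = "I"

for day in range(DAYS):
    infectious = [i for i, s in enumerate(state) if s == "I"]
    # Each infectious agent meets a few random others (stand-in for logged contacts).
    for i in infectious:
        for j in random.sample(range(N), CONTACTS_PER_DAY):
            if state[j] == "S" and random.random() < BETA:
                state[j] = "I"
    # Recovery of agents that were already infectious at the start of the day.
    for i in infectious:
        if random.random() < GAMMA:
            state[i] = "R"

print({s: state.count(s) for s in "SIR"})
```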

Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

2011-01-01

396

Nearshore Tsunami Inundation Model Validation: Toward Sediment Transport Applications  

NASA Astrophysics Data System (ADS)

Model predictions from a numerical model, Delft3D, based on the nonlinear shallow water equations are compared with analytical results and laboratory observations from seven tsunami-like benchmark experiments, and with field observations from the 26 December 2004 Indian Ocean tsunami. The model accurately predicts the magnitude and timing of the measured water levels and flow velocities, as well as the magnitude of the maximum inundation distance and run-up, for both breaking and non-breaking waves. The shock-capturing numerical scheme employed describes well the total decrease in wave height due to breaking, but does not reproduce the observed shoaling near the break point. The maximum water levels observed onshore near Kuala Meurisi, Sumatra, following the 26 December 2004 tsunami are well predicted given the uncertainty in the model setup. The good agreement between the model predictions and the analytical results and observations demonstrates that the numerical solution and wetting and drying methods employed are appropriate for modeling tsunami inundation for breaking and non-breaking long waves. Extension of the model to include sediment transport may be appropriate for long, non-breaking tsunami waves. Using available sediment transport formulations, the sediment deposit thickness at Kuala Meurisi is predicted generally within a factor of 2.

Apotsos, Alex; Buckley, Mark; Gelfenbaum, Guy; Jaffe, Bruce; Vatvani, Deepak

2011-11-01

397

An analytical approach to computing biomolecular electrostatic potential. II. Validation and applications  

NASA Astrophysics Data System (ADS)

An ability to efficiently compute the electrostatic potential produced by molecular charge distributions under realistic solvation conditions is essential for a variety of applications. Here, the simple closed-form analytical approximation to the Poisson equation rigorously derived in Part I for idealized spherical geometry is tested on realistic shapes. The effects of mobile ions are included at the Debye-Hückel level. The accuracy of the resulting closed-form expressions for electrostatic potential is assessed through comparisons with numerical Poisson-Boltzmann (NPB) reference solutions on a test set of 580 representative biomolecular structures under typical conditions of aqueous solvation. For each structure, the deviation from the reference is computed for a large number of test points placed near the dielectric boundary (molecular surface). The accuracy of the approximation, averaged over all test points in each structure, is within 0.6 kcal/mol/|e| (approximately kT per unit charge) for all structures in the test set. For 91.5% of the individual test points, the deviation from the NPB potential is within 0.6 kcal/mol/|e|. The deviations from the reference decrease with increasing distance from the dielectric boundary: the approximation is asymptotically exact far away from the source charges. Deviation of the overall shape of a structure from the ideal spherical geometry does not, by itself, appear to necessitate decreased accuracy of the approximation. The largest deviations from the NPB reference are found inside very deep and narrow indentations that occur on the dielectric boundaries of some structures. The dimensions of these pockets of locally highly negative curvature are comparable to the size of a water molecule; the applicability of continuum dielectric models in these regions is discussed. The maximum deviations from the NPB are reduced substantially when the boundary is smoothed by using a larger probe radius (3 Å) to generate the molecular surface. A detailed accuracy analysis is presented for several proteins of various shapes, including lysozyme, whose surface features a functionally relevant region of negative curvature. The proposed analytical model is computationally inexpensive; this strength of the approach is demonstrated by computing and analyzing the electrostatic potential generated by a full capsid of the tobacco ring spot virus at atomic resolution (500 000 atoms). An analysis of the electrostatic potential of the inner surface of the capsid reveals what might be an RNA binding pocket. These results are generated with the modest computational power of a desktop personal computer.
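
The closed-form expression tested in this record is derived in Part I and is not reproduced here. As a hedged stand-in, the sketch below evaluates a textbook Debye-Hückel screened Coulomb potential in kcal/mol per unit charge, which illustrates the kind of inexpensive closed-form evaluation being compared against NPB solutions; the charge, ion radius, and screening parameter are arbitrary.

```python
import math

COULOMB_KCAL = 332.0636  # kcal*Angstrom/(mol*e^2); 1/(4*pi*eps0) in these units

def debye_huckel_phi(q, r, eps_w=78.5, kappa=0.1, a=2.0):
    """Screened Coulomb potential (kcal/mol per unit charge) of a point charge q (in e)
    at distance r (Angstrom), with Debye parameter kappa (1/Angstrom) and ion radius a."""
    screening = math.exp(-kappa * max(r - a, 0.0)) / (1.0 + kappa * a)
    return COULOMB_KCAL * q * screening / (eps_w * r)

# Potential of a +2e charge sampled at a few distances from the source:
for r in (4.0, 8.0, 16.0, 32.0):
    print(f"r = {r:5.1f} A   phi = {debye_huckel_phi(2.0, r):7.3f} kcal/mol/|e|")
```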

Gordon, John C.; Fenley, Andrew T.; Onufriev, Alexey

2008-08-01

398

Animal models of post-traumatic stress disorder: face validity  

PubMed Central

Post-traumatic stress disorder (PTSD) is a debilitating condition that develops in a proportion of individuals following a traumatic event. Despite recent advances, ethical limitations associated with human research impede progress in understanding PTSD. Fortunately, much effort has focused on developing animal models to help study the pathophysiology of PTSD. Here, we provide an overview of animal PTSD models where a variety of stressors (physical, psychosocial, or psychogenic) are used to examine the long-term effects of severe trauma. We emphasize models involving predator threat because they reproduce human individual differences in susceptibility to, and in the long-term consequences of, psychological trauma. PMID:23754973

Goswami, Sonal; Rodríguez-Sierra, Olga; Cascardi, Michele; Paré, Denis

2013-01-01

399

Automatic Specialization of Actor-oriented Models in Ptolemy II  

Microsoft Academic Search

This report presents a series of techniques for automatic specialization of generic component specifications. These techniques allow the transformation of generic component specifications into more compact and efficient ones. We have integrated these techniques into a code generator for Ptolemy II, a software framework for actor-oriented design in Java [15]. Combining automatic code generation with actor specialization enables efficient implementation of models without sacrificing design...

Stephen Neuendorffer; Edward Lee

400

PART II TECHNIQUES PROJECT MODELLING OF THE CORROSION OF  

E-print Network

PART II TECHNIQUES PROJECT: MODELLING OF THE CORROSION OF BINARY ALLOYS. R.A. Jones. ... and temperatures. In this work a neural network method was employed to study how the rate of corrosion of Fe ... in accordance with the literature. 1. Introduction. The atmosphere is the corrosive environment to which alloys ...

Cambridge, University of

401

Bow shock models of ultracompact H II regions  

NASA Technical Reports Server (NTRS)

This paper presents models of ultracompact H II regions as the bow shocks formed by massive stars, with strong stellar winds, moving supersonically through molecular clouds. The morphologies, sizes and brightnesses of observed objects match the models well. Plausible models are provided for the ultracompact H II regions G12.21 - 0.1, G29.96 - 0.02, G34.26 + 0.15, and G43.89 - 0.78. To do this, the equilibrium shape of the wind-blown shell is calculated, assuming momentum conservation. Then the shell is illuminated with ionizing radiation from the central star, radiative transfer for free-free emission through the shell is performed, and the resulting object is visualized at various angles for comparison with radio continuum maps. The model unifies most of the observed morphologies of ultracompact H II regions, excluding only those objects with spherical shells. Ram pressure confinement greatly lengthens the life of ultracompact H II regions, explaining the large number that exist in the Galaxy despite their low apparent kinematic ages.
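
A back-of-the-envelope version of the momentum (ram-pressure) balance that sets the scale of such bow shocks is sketched below, using the standard standoff-distance relation R0 = sqrt(Mdot * v_w / (4 * pi * rho_a * v_*^2)); the stellar-wind and cloud parameters are illustrative, not those fitted to the sources named in the abstract.

```python
import math

# CGS constants
M_SUN = 1.989e33   # g
YEAR = 3.156e7     # s
KM = 1.0e5         # cm
M_H = 1.673e-24    # g
PC = 3.086e18      # cm

def standoff_radius(mdot_msun_yr, v_wind_kms, n_ambient, v_star_kms, mu=1.4):
    """Bow-shock standoff distance (pc) from ram-pressure balance:
    R0 = sqrt( Mdot * v_wind / (4*pi*rho_ambient*v_star^2) )."""
    mdot = mdot_msun_yr * M_SUN / YEAR
    rho = mu * M_H * n_ambient
    r0 = math.sqrt(mdot * v_wind_kms * KM / (4.0 * math.pi * rho * (v_star_kms * KM) ** 2))
    return r0 / PC

# Illustrative O-star wind moving at 10 km/s through a dense molecular clump:
print(f"R0 = {standoff_radius(1e-6, 2000.0, 1e5, 10.0):.4f} pc")
```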

Mac Low, Mordecai-Mark; Van Buren, Dave; Wood, Douglas O. S.; Churchwell, ED

1991-01-01

402

Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study  

PubMed Central

Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
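
The record does not state which form of the kappa statistic was used for the eight assessors; the sketch below shows Fleiss' multi-rater kappa on hypothetical Yes/Unclear/No verdicts for one judgmental domain across six papers, purely to illustrate the concordance calculation.

```python
from collections import Counter

def fleiss_kappa(ratings, categories=("Yes", "Unclear", "No")):
    """Fleiss' kappa for multiple raters; `ratings` is a list of per-item lists of verdicts,
    one verdict per rater (all items rated by the same number of raters)."""
    n_items = len(ratings)
    n_raters = len(ratings[0])
    counts = [[Counter(item)[c] for c in categories] for item in ratings]
    # Per-item observed agreement and overall category proportions.
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1)) for row in counts]
    p_bar = sum(p_i) / n_items
    p_j = [sum(row[j] for row in counts) / (n_items * n_raters) for j in range(len(categories))]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1.0 - p_e)

# Toy verdicts from 8 assessors on one judgmental domain across 6 trial reports:
verdicts = [
    ["Yes"] * 7 + ["Unclear"],
    ["Yes"] * 8,
    ["No"] * 6 + ["Unclear"] * 2,
    ["Unclear"] * 5 + ["No"] * 3,
    ["Yes"] * 4 + ["Unclear"] * 4,
    ["No"] * 8,
]
print(f"Fleiss' kappa = {fleiss_kappa(verdicts):.3f}")
```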

2012-01-01

403

Modeling Fe II Emission and Revised Fe II (UV) Empirical Templates for the Seyfert 1 Galaxy I Zw 1  

NASA Astrophysics Data System (ADS)

We use the narrow-lined broad-line region (BLR) of the Seyfert 1 galaxy, I Zw 1, as a laboratory for modeling the ultraviolet (UV) Fe II 2100-3050 Å emission complex. We calculate a grid of Fe II emission spectra representative of BLR clouds and compare them with the observed I Zw 1 spectrum. Our predicted spectrum for log[nH/(cm-3)] = 11.0, log[ΦH/(cm-2 s-1)] = 20.5, and ξ/(1 km s-1) = 20, using Cloudy and an 830-level model atom for Fe II with energies up to 14.06 eV, gives a better fit to the UV Fe II emission than models with fewer levels. Our analysis indicates (1) the observed UV Fe II emission must be corrected for an underlying Fe II pseudocontinuum; (2) Fe II emission peaks can be misidentified as those of other ions in active galactic nuclei (AGNs) with narrow-lined BLRs, possibly affecting deduced physical parameters; (3) the shape of 4200-4700 Å Fe II emission in I Zw 1 and other AGNs is a relative indicator of narrow-line region (NLR) and BLR Fe II emission; (4) predicted ratios of Lyα, C III], and Fe II emission relative to Mg II λ2800 agree with extinction-corrected observed I Zw 1 fluxes, except for C IV λ1549; (5) the sensitivity of Fe II emission strength to microturbulence ξ casts doubt on existing relative Fe/Mg abundances derived from Fe II (UV)/Mg II flux ratios. Our calculated Fe II emission spectra, suitable for BLRs in AGNs, are available at http://iacs.cua.edu/people/verner/FeII. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 05-26555.

Bruhweiler, F.; Verner, E.

2008-03-01

404

Prevalence of depression and validation of the Beck Depression Inventory-II and the Children's Depression Inventory-Short amongst HIV-positive adolescents in Malawi  

PubMed Central

Introduction There is a remarkable dearth of evidence on mental illness in adolescents living with HIV/AIDS, particularly in the African setting. Furthermore, there are few studies in sub-Saharan Africa validating the psychometric properties of diagnostic and screening tools for depression amongst adolescents. The primary aim of this cross-sectional study was to estimate the prevalence of depression amongst a sample of HIV-positive adolescents in Malawi. The secondary aim was to develop culturally adapted Chichewa versions of the Beck Depression Inventory-II (BDI-II) and Children's Depression Inventory-II-Short (CDI-II-S) and conduct a psychometric evaluation of these measures by evaluating their performance against a structured depression assessment using the Children's Depression Rating Scale, Revised (CDRS-R). Study design Cross-sectional study. Methods We enrolled 562 adolescents, 12–18 years of age, from two large public HIV clinics in central and southern Malawi. Participants completed two self-reports, the BDI-II and CDI-II-S, followed by administration of the CDRS-R by trained clinicians. Sensitivity, specificity and positive and negative predictive values for various BDI-II and CDI-II-S cut-off scores were calculated with receiver operating characteristics analysis. The area under the curve (AUC) was also calculated. Internal consistency was measured by the standardized Cronbach's alpha coefficient, and correlation between self-reports and the CDRS-R by Spearman's correlation. Results Prevalence of depression as measured by the CDRS-R was 18.9%. Suicidal ideation was expressed by 7.1% (40) using the BDI-II. The AUC for the BDI-II was 0.82 (95% CI 0.78–0.89) and for the CDI-II-S was 0.75 (95% CI 0.70–0.80). A score of ≥13 on the BDI-II achieved sensitivity of >80%, and a score of ≥17 had a specificity of >80%. The Cronbach's alpha was 0.80 (BDI-II) and 0.66 (CDI-II-S). The correlation between the BDI-II and CDRS-R was 0.42 (p<0.001) and between the CDI-II-S and CDRS-R was 0.37 (p<0.001). Conclusions This study demonstrates that the BDI-II has sound psychometric properties in an outpatient setting among HIV-positive adolescents in Malawi. The high prevalence of depression amongst HIV-positive Malawian adolescents noted in this study underscores the need for the development of comprehensive services for HIV-positive adolescents. PMID:25085002
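
A sketch of the cut-off analysis described (sensitivity and specificity at candidate cut-offs, plus AUC from the rank-sum identity) is given below on synthetic scores; the distributions are invented and do not reproduce the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: BDI-II-like scores for depressed vs non-depressed adolescents
# (the reference diagnosis plays the role of the CDRS-R).
scores_pos = rng.normal(22, 7, 100).clip(0, 63)
scores_neg = rng.normal(9, 6, 430).clip(0, 63)
scores = np.concatenate([scores_pos, scores_neg])
truth = np.concatenate([np.ones(100, bool), np.zeros(430, bool)])

def sens_spec(cutoff):
    pred = scores >= cutoff
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    return sens, spec

for c in (13, 17):
    s, sp = sens_spec(c)
    print(f"cutoff >= {c}: sensitivity {s:.2f}, specificity {sp:.2f}")

# AUC via the rank-sum (Mann-Whitney U) identity.
ranks = scores.argsort().argsort() + 1
auc = (ranks[truth].sum() - truth.sum() * (truth.sum() + 1) / 2) / (truth.sum() * (~truth).sum())
print(f"AUC = {auc:.3f}")
```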

Kim, Maria H; Mazenga, Alick C; Devandra, Akash; Ahmed, Saeed; Kazembe, Peter N; Yu, Xiaoying; Nguyen, Chi; Sharp, Carla

2014-01-01

405

Empirical Validation of Metrics for Conceptual Models of Data Warehouses  

Microsoft Academic Search

\\u000a Data warehouses (DW), based on the multidimensional modeling, provide companies with huge historical information for the decision\\u000a making process. As these DW’s are crucial for companies in making decisions, their quality is absolutely critical. One of\\u000a the main issues that influences their quality lays on the models (conceptual, logical and physical) we use to design them.\\u000a In the last years,

Manuel A. Serrano; Coral Calero; Juan Trujillo; Sergio Luján-mora; Mario Piattini

2004-01-01

406

Assessing youth who sexually offended: the predictive validity of the ERASOR, J-SOAP-II, and YLS/CMI in a non-Western context.  

PubMed

Recent research suggested that the predictive validity of adult sexual offender risk assessment measures can be affected when used cross-culturally, but there is no published study on the predictive validity of risk assessment measures for youth who sexually offended in a non-Western context. This study compared the predictive validity of three youth risk assessment measures (i.e., the Estimate of Risk of Adolescent Sexual Offense Recidivism [ERASOR], the Juvenile Sex Offender Assessment Protocol-II [J-SOAP-II], and the Youth Level of Service/Case Management Inventory [YLS/CMI]) for sexual and nonviolent recidivism in a sample of 104 male youth who sexually offended within a Singaporean context (M (follow-up) = 1,637 days; SD (follow-up) = 491). Results showed that the ERASOR overall clinical rating and total score significantly predicted sexual recidivism but only the former significantly predicted time to sexual reoffense. All of the measures (i.e., the ERASOR overall clinical rating and total score, the J-SOAP-II total score, as well as the YLS/CMI) significantly predicted nonsexual recidivism and time to nonsexual reoffense for this sample of youth who sexually offended. Overall, the results suggest that the ERASOR appears to be suited for assessing youth who sexually offended in a non-Western context, but the J-SOAP-II and the YLS/CMI have limited utility for such a purpose. PMID:21825111

Chu, Chi Meng; Ng, Kynaston; Fong, June; Teoh, Jennifer

2012-04-01

407

Validation of an EMG-driven, graphically based isometric musculoskeletal model of the cervical spine.  

PubMed

EMG-driven musculoskeletal modeling is a method in which loading on the active and passive structures of the cervical spine may be investigated. A model of the cervical spine exists; however, it has yet to be criterion validated. Furthermore, neck muscle morphometry in this model was derived from elderly cadavers, threatening model validity. Therefore, the overall aim of this study was to modify and criterion validate this preexisting graphically based musculoskeletal model of the cervical spine. Five male subjects with no neck pain participated in this study. The study consisted of three parts. First, subject-specific neck muscle morphometry data were derived by using magnetic resonance imaging. Second, EMG drive for the model was generated from both surface (Drive 1: N=5) and surface and deep muscles (Drive 2: N=3). Finally, to criterion validate the modified model, net moments predicted by the model were compared against net moments measured by an isokinetic dynamometer in both maximal and submaximal isometric contractions with the head in the neutral posture, 20 deg of flexion, and 35 deg of extension. Neck muscle physiological cross sectional area values were greater in this study when compared to previously reported data. Predictions of neck torque by the model were better in flexion (18.2% coefficient of variation (CV)) when compared to extension (28.5% CV) and using indwelling EMG did not enhance model predictions. There were, however, large variations in predictions when all the contractions were compared. It is our belief that further work needs to be done to improve the validity of the modified EMG-driven neck model examined in this study. A number of factors could potentially improve the model with the most promising probably being optimizing various modeling parameters by using methods established by previous researchers investigating other joints of the body. PMID:18532863

Netto, Kevin J; Burnett, Angus F; Green, Jonathon P; Rodrigues, Julian P

2008-06-01

408

A Hardware Model Validation Tool for Use in Complex Space Systems  

NASA Technical Reports Server (NTRS)

One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

2010-01-01

409

Validation of simulation strategies for the flow in a model propeller turbine during a runaway event  

NASA Astrophysics Data System (ADS)

Recent research indicates that the useful life of a turbine can be affected by transient events. This study aims to define and validate strategies for the simulation of the flow within a propeller turbine model in runaway conditions. Using unsteady pressure measurements on two runner blades for validation, different strategies are compared and their results analysed in order to quantify their precision. This paper will focus on justifying the choice of the simulation strategies and on the analysis of preliminary results.

Fortin, M.; Houde, S.; Deschênes, C.

2014-12-01

410

Root zone water quality model (RZWQM2): Model use, calibration and validation  

USGS Publications Warehouse

The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado to illustrate the use of field and laboratory measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa to show a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.

2012-01-01

411

Comparative validation of statistical and dynamical downscaling models on a dense grid in central Europe: temperature  

NASA Astrophysics Data System (ADS)

Minimum and maximum temperatures in two regional climate models and five statistical downscaling models are validated according to a unified set of criteria that have potential relevance for impact assessments: persistence (temporal autocorrelations), spatial autocorrelations, extreme quantiles, skewness, kurtosis, and the degree of fit to observed data on both short and long time scales. The validation is conducted on two dense grids in central Europe as follows: (1) a station network and (2) a grid with a resolution of 10 km. The gridded dataset is not contaminated by artifacts of the interpolation procedure; therefore, we claim that using a gridded dataset as a validation base is a valid approach. The fit to observations on short time scales is equally good for the statistical downscaling (SDS) models and regional climate models (RCMs) in winter, while it is much better for the SDS models in summer. The reproduction of variability on long time scales, expressed as linear trends, is similarly successful for both SDS models and RCMs. Results for other criteria suggest that there is no justification for preferring dynamical models at the expense of statistical models—and vice versa. The non-linear SDS models do not outperform the linear ones.
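
The validation criteria listed above are straightforward to compute for any pair of observed and downscaled series; the sketch below evaluates a few of them (lag-1 autocorrelation, an extreme quantile, skewness, kurtosis, and RMSE) on synthetic daily temperatures, purely as an illustration of the criteria rather than of the study's datasets.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
obs = 10 + 8 * np.sin(np.linspace(0, 4 * np.pi, 730)) + rng.normal(0, 3, 730)  # two "years" of Tmax
sim = obs + rng.normal(0.5, 2.0, 730)                                          # a biased, noisier model

def lag1_autocorr(x):
    return np.corrcoef(x[:-1], x[1:])[0, 1]

criteria = {
    "lag-1 autocorrelation": (lag1_autocorr(obs), lag1_autocorr(sim)),
    "95th percentile":       (np.percentile(obs, 95), np.percentile(sim, 95)),
    "skewness":              (stats.skew(obs), stats.skew(sim)),
    "kurtosis":              (stats.kurtosis(obs), stats.kurtosis(sim)),
}
for name, (o, s) in criteria.items():
    print(f"{name:24s} obs {o:7.3f}   sim {s:7.3f}")

print(f"{'RMSE (daily fit)':24s} {np.sqrt(np.mean((sim - obs) ** 2)):7.3f}")
```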

Huth, Radan; Mikšovský, Jiří; Štěpánek, Petr; Belda, Michal; Farda, Aleš; Chládová, Zuzana; Pišoft, Petr

2014-06-01

412

Some Hamiltonian models of friction II  

SciTech Connect

In the present paper we consider the motion of a very heavy tracer particle in a medium of a very dense, non-interacting Bose gas. We prove that, in a certain mean-field limit, the tracer particle will be decelerated and come to rest somewhere in the medium. Friction is caused by emission of Cerenkov radiation of gapless modes into the gas. Mathematically, a system of semilinear integro-differential equations, introduced in Froehlich et al. ['Some hamiltonian models of friction,' J. Math. Phys. 52(8), 083508 (2011)], describing a tracer particle in a dispersive medium is investigated, and decay properties of the solution are proven. This work is an extension of Froehlich et al. ['Friction in a model of hamiltonian dynamics,' Commun. Math. Phys. 315(2), 401-444 (2012)]; it is an extension because no weak coupling limit for the interaction between tracer particle and medium is assumed. The technical methods used are dispersive estimates and a contraction principle.

Egli, Daniel; Gang Zhou [Institute for Theoretical Physics, ETH Zurich, CH-8093 Zuerich (Switzerland)

2012-10-15

413

Updated Delft Mass Transport model DMT-2: computation and validation  

NASA Astrophysics Data System (ADS)

A number of research centers compute models of mass transport in the Earth's system using primarily K-Band Ranging (KBR) data from the Gravity Recovery And Climate Experiment (GRACE) satellite mission. These models typically consist of a time series of monthly solutions, each of which is defined in terms of a set of spherical harmonic coefficients up to degree 60-120. One such model, the Delft Mass Transport, release 2 (DMT-2), is computed at the Delft University of Technology (The Netherlands) in collaboration with Wuhan University. An updated variant of this model has been produced recently. A unique feature of the computational scheme designed to compute DMT-2 is the preparation of an accurate stochastic description of data noise in the frequency domain using an Auto-Regressive Moving-Average (ARMA) model, which is derived for each particular month. The benefits of such an approach are a proper frequency-dependent data weighting in the data inversion and an accurate variance-covariance matrix of noise in the estimated spherical harmonic coefficients. Furthermore, the data prior to the inversion are subject to an advanced high-pass filtering, which makes use of a spatially-dependent weighting scheme, so that noise is primarily estimated on the basis of data collected over areas with minor mass transport signals (e.g., oceans). On the one hand, this procedure efficiently suppresses noise caused by inaccuracies in satellite orbits; on the other hand, it preserves mass transport signals in the data. Finally, the unconstrained monthly solutions are filtered using a Wiener filter, which is based on estimates of the signal and noise variance-covariance matrices. In combination with a proper data weighting, this noticeably improves the spatial resolution of the monthly gravity models and the associated mass transport models. For instance, the computed solutions allow long-term negative trends to be clearly seen in sufficiently small regions notorious for rapid mass transport losses, such as the Kangerdlugssuaq and Jakobshavn glaciers in the Greenland ice sheet, as well as the Aral Sea in Central Asia. The updated variant of DMT-2 has been extensively tested and compared with alternative models. A number of regions/processes have been considered for that purpose. In particular, this model has been applied to estimate mass variations in Greenland and Antarctica (both total and for individual ice drainage systems), as well as to improve a hydrological model of the Rhine River basin. Furthermore, a time series of degree-1 coefficients has been derived from the DMT-2 model using the method of Swenson et al. (2008). The obtained results are in good agreement both with alternative GRACE-based models and with independent data, which confirms the high quality of the updated variant of DMT-2.
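
The Wiener filtering step can be illustrated with a diagonal toy version in which each coefficient is damped by S/(S+N); the full DMT-2 procedure uses complete signal and noise variance-covariance matrices, so the sketch below (with invented variances) only conveys the principle.

```python
import numpy as np

rng = np.random.default_rng(7)

n_coeff = 200
signal_var = 1.0 / (1.0 + np.arange(n_coeff)) ** 2                    # signal power decaying with degree
noise_var = np.full(n_coeff, 1e-3) * (1 + np.arange(n_coeff) / 20.0)  # noise growing with degree

truth = rng.normal(0.0, np.sqrt(signal_var))
observed = truth + rng.normal(0.0, np.sqrt(noise_var))

# Diagonal Wiener filter: multiply each coefficient by S / (S + N).
wiener_gain = signal_var / (signal_var + noise_var)
filtered = wiener_gain * observed

rmse_raw = np.sqrt(np.mean((observed - truth) ** 2))
rmse_filtered = np.sqrt(np.mean((filtered - truth) ** 2))
print(f"RMSE unfiltered: {rmse_raw:.4f}   RMSE Wiener-filtered: {rmse_filtered:.4f}")
```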

Hashemi Farahani, Hassan; Ditmar, Pavel; Inacio, Pedro; Klees, Roland; Guo, Jing; Guo, Xiang; Liu, Xianglin; Zhao, Qile; Didova, Olga; Ran, Jiangjun; Sun, Yu; Tangdamrongsub, Natthachet; Gunter, Brian; Riva, Ricardo; Steele-Dunne, Susan

2014-05-01

414

Modelling and validation of magnetorheological brake responses using parametric approach  

NASA Astrophysics Data System (ADS)

A magnetorheological brake (MR brake) is an x-by-wire system that performs better than conventional brake systems. The MR brake consists of a rotating disc immersed in magnetorheological fluid (MR fluid) within an enclosure carrying an electromagnetic coil. The applied magnetic field increases the yield strength of the MR fluid, which is used to decrease the speed of the rotating shaft. The purpose of this paper is to develop a mathematical model representing the MR brake together with a test rig. The MR brake model is developed based on the actual torque characteristic and is coupled with the motion of the test rig. Next, experiments are performed using the MR brake test rig to obtain three output responses: angular velocity, torque, and load displacement. Furthermore, the MR brake was subjected to various currents. Finally, the simulation results of the MR brake model are verified against the experimental results.
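
A minimal parametric sketch of the kind of model described (a current-dependent braking torque coupled to the rotational dynamics of a shaft) is given below; the Bingham-style torque law and every coefficient are assumptions for illustration, not the identified parameters of the paper.

```python
def brake_torque(omega, current, c_visc=0.02, k_field=1.5):
    """Assumed braking torque (N*m): viscous term plus a current-dependent field term."""
    return c_visc * omega + k_field * current

def simulate(current, omega0=100.0, inertia=0.05, dt=1e-3, t_end=5.0):
    """Integrate J*domega/dt = -T_brake(omega, i) with forward Euler until the shaft stops."""
    omega, t = omega0, 0.0
    while t < t_end and omega > 0.0:
        omega = max(omega - dt * brake_torque(omega, current) / inertia, 0.0)
        t += dt
    return t, omega

for i_coil in (0.5, 1.0, 2.0):
    t_stop, w = simulate(i_coil)
    status = "stops" if w == 0.0 else "still turning"
    print(f"coil current {i_coil:.1f} A -> {status} at t = {t_stop:.2f} s")
```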

Zainordin, A. Z.; Abdullah, M. A.; Hudha, K.

2013-12-01

415

Modeling dissolved organic carbon in temperate forest soils: TRIPLEX-DOC model development and validation  

NASA Astrophysics Data System (ADS)

Even though dissolved organic carbon (DOC) is the most active carbon (C) cycling in soil organic carbon (SOC) pools, it receives little attention from the global C budget. DOC fluxes are critical to aquatic ecosystem inputs and contribute to the C balance of terrestrial ecosystems, but few ecosystem models have attempted to integrate DOC dynamics into terrestrial C cycling. This study introduces a new process-based model, TRIPLEX-DOC, that is capable of estimating DOC dynamics in forest soils by incorporating both ecological drivers and biogeochemical processes. TRIPLEX-DOC was developed from Forest-DNDC, a biogeochemical model simulating C and nitrogen (N) dynamics, coupled with a new DOC process module that predicts metabolic transformations, sorption/desorption, and DOC leaching in forest soils. The model was validated against field observations of DOC concentrations and fluxes at white pine forest stands located in southern Ontario, Canada. The model was able to simulate seasonal dynamics of DOC concentrations and the magnitudes observed within different soil layers, as well as DOC leaching in the age sequence of these forests. Additionally, TRIPLEX-DOC estimated the effect of forest harvesting on DOC leaching, with a significant increase following harvesting, illustrating that land use change is of critical importance in regulating DOC leaching in temperate forests as an important source of C input to aquatic ecosystems.
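
A single-pool toy version of the DOC balance that such a module tracks (production, linear sorption/desorption, and leaching with drainage) is sketched below; the pools, rate constants, and units are hypothetical and are not taken from TRIPLEX-DOC.

```python
def simulate_doc(days=365, production=0.8, k_sorption=0.02, k_desorption=0.005,
                 drainage_frac=0.01, doc0=50.0, sorbed0=200.0):
    """Daily balance of dissolved (doc) and sorbed (sorbed) organic C in one soil layer
    (units: g C per m^2); leaching removes a fixed fraction of the dissolved pool."""
    doc, sorbed, leached = doc0, sorbed0, 0.0
    for _ in range(days):
        sorb = k_sorption * doc          # transfer to the solid phase
        desorb = k_desorption * sorbed   # release back to solution
        leach = drainage_frac * doc      # export with percolating water
        doc += production + desorb - sorb - leach
        sorbed += sorb - desorb
        leached += leach
    return doc, sorbed, leached

doc, sorbed, leached = simulate_doc()
print(f"after one year: dissolved {doc:.1f}, sorbed {sorbed:.1f}, leached {leached:.1f} g C/m^2")
```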

Wu, H.; Peng, C.; Moore, T. R.; Hua, D.; Li, C.; Zhu, Q.; Peichl, M.; Arain, M. A.; Guo, Z.

2014-05-01

416

Modeling dissolved organic carbon in temperate forest soils: TRIPLEX-DOC model development and validation  

NASA Astrophysics Data System (ADS)

Even though dissolved organic carbon (DOC) is the most active carbon (C) cycling that takes place in soil organic carbon (SOC) pools, it is missing from the global C budget. Fluxes in DOC are critical to aquatic ecosystem inputs and contribute to C balances of terrestrial ecosystems. Only a few ecosystem models have attempted to integrate DOC dynamics into terrestrial C cycling. This study introduces a new process-based model, TRIPLEX-DOC that is capable of estimating DOC dynamics in forest soils by incorporating both ecological drivers and biogeochemical processes. TRIPLEX-DOC was developed from Forest-DNDC, a biogeochemical model simulating C and nitrogen (N) dynamics, coupled with a new DOC process module that predicts metabolic transformations, sorption/desorption, and DOC leaching in forest soils. The model was validated against field observations of DOC concentrations and fluxes at white pine forest stands located in southern Ontario, Canada. The model was able to simulate seasonal dynamics of DOC concentrations and the magnitudes observed within different soil layers, as well as DOC leaching in the age-sequence of these forests. Additionally, TRIPLEX-DOC estimated the effect of forest harvesting on DOC leaching, with a significant increase following harvesting, illustrating that change in land use is of critical importance in regulating DOC leaching in temperate forests as an important source of C input to aquatic ecosystems.