Note: This page contains sample records for the topic ii model validation from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: August 15, 2014.
1

Development of a livestock odor dispersion model: part II. Evaluation and validation.  

PubMed

A livestock odor dispersion model (LODM) was developed to predict odor concentration and odor frequency using routine hourly meteorological data input. The odor concentrations predicted by the LODM were compared with the results obtained from other commercial models (Industrial Source Complex Short-Term model, version 3, CALPUFF) to evaluate its appropriateness. Two sets of field odor plume measurement data were used to validate the model. The model-predicted mean odor concentrations and odor frequencies were compared with those measured. Results show that this model has good performance for predicting odor concentrations and odor frequencies. PMID:21416754

Yu, Zimu; Guo, Huiqing; Laguë, Claude

2011-03-01
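
The record above reports comparing LODM predictions against field plume measurements and against ISC3/CALPUFF output. The abstract does not say which agreement statistics were used; below is a minimal sketch of two measures commonly applied in dispersion-model evaluation, fractional bias (FB) and normalized mean square error (NMSE), with invented paired values:

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = (mean(obs) - mean(pred)) / (0.5 * (mean(obs) + mean(pred)))."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))

def nmse(obs, pred):
    """NMSE = mean((obs - pred)^2) / (mean(obs) * mean(pred))."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

# Hypothetical paired field measurements and model predictions (odor units/m^3)
obs  = [12.0, 8.5, 3.2, 20.1, 6.7]
pred = [10.4, 9.9, 2.8, 17.5, 7.9]
print(f"FB = {fractional_bias(obs, pred):+.3f}, NMSE = {nmse(obs, pred):.3f}")
```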

2

A Wheat Grazing Model for Simulating Grain and Beef Production: Part II—Model Validation  

Microsoft Academic Search

Computer models must be thoroughly evaluated before being used for decision-making. The objective of this paper is to evaluate the ability of a newly developed wheat grazing model to predict fall-winter forage and winter wheat (Triticum aestivum L.) grain yield as well as daily weight gains of steer (Bos taurus) grazing on wheat pasture in Oklahoma. Experimental data of three

X.-C. Zhang; L. A. Hunt; W. A. Phillips; G. Horn; J. Edward; H. Zhang

2008-01-01

3

A physical model of the bidirectional reflectance of vegetation canopies. I - Theory. II - Inversion and validation  

NASA Technical Reports Server (NTRS)

A new physically based analytical model of the bidirectional reflectance of vegetation canopies is derived. The model expresses the bidirectional reflectance field of a semi-infinite canopy as a combination of functions describing (1) the optical properties of the leaves through their single-scattering albedo and their phase function, (2) the average distribution of leaf orientations, and (3) the architecture of the canopy. The model is validated against laboratory and ground-based measurements in the visible and IR spectral regions, taken over two vegetation covers. The intrinsic optical properties of leaves and the information on the geometrical canopy arrangements in space were obtained using an inversion procedure based on a nonlinear optimization technique. Model predictions of bidirectional reflectances obtained using the inversion procedure compare well with actual observations.

Verstraete, Michel M.; Pinty, Bernard; Dickinson, Robert E.

1990-01-01
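
The inversion step described above retrieves leaf optical properties and canopy-geometry information by nonlinear optimization. As a hedged illustration of that general procedure (using a toy Minnaert-style reflectance function, not the actual Verstraete-Pinty-Dickinson model), parameters can be recovered from synthetic observations with a standard least-squares optimizer:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy bidirectional reflectance: a Minnaert-style function of the cosines of
# the sun (mu0) and view (mu) zenith angles, with albedo rho and exponent k.
def toy_brdf(params, mu0, mu):
    rho, k = params
    return rho * mu0**k * mu**(k - 1.0)

rng = np.random.default_rng(0)
mu0 = rng.uniform(0.3, 1.0, 50)          # synthetic illumination geometry
mu  = rng.uniform(0.3, 1.0, 50)          # synthetic viewing geometry
true = (0.25, 0.8)
r_obs = toy_brdf(true, mu0, mu) + rng.normal(0.0, 0.002, 50)  # noisy "data"

# Invert: find the parameters minimizing the model-data residual
fit = least_squares(lambda p: toy_brdf(p, mu0, mu) - r_obs,
                    x0=(0.1, 1.0), bounds=([0.0, 0.1], [1.0, 2.0]))
print("retrieved parameters:", fit.x)    # should be close to (0.25, 0.8)
```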

4

Person Heterogeneity of the BDI-II-C and Its Effects on Dimensionality and Construct Validity: Using Mixture Item Response Models  

ERIC Educational Resources Information Center

This study was to apply the mixed Rasch model to investigate person heterogeneity of Beck Depression Inventory-II-Chinese version (BDI-II-C) and its effects on dimensionality and construct validity. Person heterogeneity was reflected by two latent classes that differ qualitatively. Additionally, person heterogeneity adversely affected the…

Wu, Pei-Chen; Huang, Tsai-Wei

2010-01-01

5

A new 3D finite element model of the IEC 60318-1 artificial ear: II. Experimental and numerical validation  

NASA Astrophysics Data System (ADS)

In part I, the feasibility of using three-dimensional (3D) finite elements (FEs) to model the acoustic behaviour of the IEC 60318-1 artificial ear was studied and the numerical approach compared with classical lumped elements modelling. It was shown that by using a more complex acoustic model that took account of thermo-viscous effects, geometric shapes and dimensions, it was possible to develop a realistic model. This model then had clear advantages in comparison with the models based on equivalent circuits using lumped parameters. In fact results from FE modelling produce a better understanding about the physical phenomena produced inside ear simulator couplers, facilitating spatial and temporal visualization of the sound fields produced. The objective of this study (part II) is to extend the investigation by validating the numerical calculations against measurements on an ear simulator conforming to IEC 60318-1. For this purpose, an appropriate commercially available device is taken and a complete 3D FE model developed for it. The numerical model is based on key dimensional data obtained with a non-destructive x-ray inspection technique. Measurements of the acoustic transfer impedance have been carried out on the same device at a national measurement institute using the method embodied in IEC 60318-1. Having accounted for the actual device dimensions, the thermo-viscous effects inside narrow slots and holes and environmental conditions, the results of the numerical modelling were found to be in good agreement with the measured values.

Bravo, Agustín; Barham, Richard; Ruiz, Mariano; López, Juan Manuel; De Arcas, Guillermo; Alonso, Jesus

2012-12-01

6

SAGE II aerosol data validation based on retrieved aerosol model size distribution from SAGE II aerosol measurements  

NASA Technical Reports Server (NTRS)

Consideration is given to aerosol correlative measurements experiments for the Stratospheric Aerosol and Gas Experiment (SAGE) II, conducted between November 1984 and July 1986. The correlative measurements were taken with an impactor/laser probe, a dustsonde, and an airborne 36-cm lidar system. The primary aerosol quantities measured by the ground-based instruments are compared with those calculated from the aerosol size distributions from SAGE II aerosol extinction measurements. Good agreement is found between the two sets of measurements.

Wang, Pi-Huan; Mccormick, M. P.; Mcmaster, L. R.; Chu, W. P.; Swissler, T. J.; Osborn, M. T.; Russell, P. B.; Oberbeck, V. R.; Livingston, J.; Rosen, J. M.

1989-01-01

7

The validation of biodynamic models  

Microsoft Academic Search

Biodynamic models may: (i) represent understanding of how the body moves (i.e., 'mechanistic models'), (ii) summarise biodynamic measurements (i.e., 'quantitative models'), and (iii) provide predictions of the effects of motion on human health, comfort or performance (i.e., 'effects models'). Model validation may involve consideration of evidence used to derive a model, comparison of the model with alternatives, and a comparison between

Michael J Griffin

2001-01-01

8

Hellfire System Model Validation.  

National Technical Information Service (NTIS)

The hybrid simulation facilities, system modeling, and validation process for a U.S. Army missile development program are discussed. Two fundamental problems in missile system design and development require an accurate, valid, proven computer simulation; ...

R. V. Hupp

1988-01-01

9

Model Validation Status Review  

SciTech Connect

The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.

E.L. Hardin

2001-11-28

10

INACTIVATION OF CRYPTOSPORIDIUM OOCYSTS IN A PILOT-SCALE OZONE BUBBLE-DIFFUSER CONTACTOR - II: MODEL VALIDATION AND APPLICATION  

EPA Science Inventory

The ADR model developed in Part I of this study was successfully validated with experimental data obtained for the inactivation of C. parvum and C. muris oocysts with a pilot-scale ozone-bubble diffuser contactor operated with treated Ohio River water. Kinetic parameters, required...

11

Black Liquor Combustion Validated Recovery Boiler Modeling, Final Year Report, Volume 3: Appendix II, Sections 2 & 3 and Appendix III  

Microsoft Academic Search

This project was initiated in October 1990 with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. Many of these objectives were accomplished

T. M. Grace; W. J. Frederick; M. Salcudean; R. A. Wessel

1998-01-01

12

Model Valid Prediction Period  

Microsoft Academic Search

A new concept, valid prediction period (VPP), is presented here to evaluate model predictability. VPP is defined as the time period when the prediction error first exceeds a pre-determined criterion (i.e., the tolerance level). It depends not only on the instantaneous error growth, but also on the noise level, the initial error, and tolerance level. The model predictability skill is

P. C. Chu

2002-01-01
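
A small sketch of the VPP definition quoted above: the valid prediction period is the first time at which the forecast error crosses a pre-determined tolerance level. The error series and tolerance here are invented for illustration:

```python
import numpy as np

def valid_prediction_period(times, errors, tolerance):
    """Return the first time at which the prediction error exceeds the
    tolerance level; if it never does, the whole window is 'valid'."""
    times, errors = np.asarray(times), np.asarray(errors)
    exceed = np.nonzero(errors > tolerance)[0]
    return times[exceed[0]] if exceed.size else times[-1]

# Hypothetical forecast-error growth (e.g., RMS error vs. lead time in days)
t   = np.arange(0, 10, 0.5)
err = 0.1 * np.exp(0.4 * t)        # toy exponential error growth
print("VPP =", valid_prediction_period(t, err, tolerance=1.0), "days")
```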

13

Macrotransport-solidification kinetics modeling of equiaxed dendritic growth: Part II. Computation problems and validation on INCONEL 718 superalloy castings  

NASA Astrophysics Data System (ADS)

In Part I of the article, a new analytical model that describes solidification of equiaxed dendrites was presented. In this part of the article, the model is used to simulate the solidification of INCONEL 718 superalloy castings. The model was incorporated into a commercial finite-element code, PROCAST. A special procedure called microlatent heat method (MLHM) was used for coupling between macroscopic heat flow and microscopic growth kinetics. A criterion for time-stepping selection in microscopic modeling has been derived in conjunction with MLHM. Reductions in computational (CPU) time up to 90 pct over the classic latent heat method were found by adopting this coupling. Validation of the model was performed against experimental data for an INCONEL 718 superalloy casting. In the present calculations, the model for globulitic dendrite was used. The evolution of fraction of solid calculated with the present model was compared with Scheil’s model and experiments. An important feature in solidification of INCONEL 718 is the detrimental Laves phase. Laves phase content is directly related to the intensity of microsegregation of niobium, which is very sensitive to the evolution of the fraction of solid. It was found that there is a critical cooling rate at which the amount of Laves phase is maximum. The critical cooling rate is not a function of material parameters (diffusivity, partition coefficient, etc.). It depends only on the grain size and solidification time. The predictions generated with the present model are shown to agree very well with experiments.

Nastac, L.; Stefanescu, D. M.

1996-12-01
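
The abstract compares the computed fraction of solid with Scheil's model. For reference, a minimal sketch of the classical Scheil relation, assuming a linear liquidus, no solid-state diffusion, and complete liquid mixing; the numerical values are illustrative, not IN718 data:

```python
import numpy as np

def scheil_fraction_solid(T, Tm, Tliq, k):
    """Scheil solid fraction: fs = 1 - ((Tm - T)/(Tm - Tliq))**(1/(k - 1)),
    assuming a linear liquidus, no back-diffusion in the solid, and
    complete mixing in the liquid."""
    fl = ((Tm - T) / (Tm - Tliq)) ** (1.0 / (k - 1.0))
    return 1.0 - fl

# Illustrative (not IN718-specific) values: pure-solvent melting point Tm,
# alloy liquidus Tliq, partition coefficient k < 1
Tm, Tliq, k = 1450.0, 1336.0, 0.5
for T in np.linspace(Tliq, Tliq - 60.0, 4):
    print(f"T = {T:7.1f} C  ->  fs = {scheil_fraction_solid(T, Tm, Tliq, k):.3f}")
```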

14

Development of a new version of the Liverpool Malaria Model. II. Calibration and validation for West Africa  

PubMed Central

Background: In the first part of this study, an extensive literature survey led to the construction of a new version of the Liverpool Malaria Model (LMM). A new set of parameter settings was provided and a new development of the mathematical formulation of important processes related to the vector population was performed within the LMM. In this part of the study, so far undetermined model parameters are calibrated through the use of data from field studies. The latter are also used to validate the new LMM version, which is furthermore compared against the original LMM version. Methods: For the calibration and validation of the LMM, numerous entomological and parasitological field observations were gathered for West Africa. Continuous and quality-controlled temperature and precipitation time series were constructed using intermittent raw data from 34 weather stations across West Africa. The meteorological time series served as the LMM data input. The skill of LMM simulations was tested for 830 different sets of parameter settings of the undetermined LMM parameters. The model version with the highest skill score in terms of entomological malaria variables was taken as the final setting of the new LMM version. Results: Validation of the new LMM version in West Africa revealed that the simulations compare well with entomological field observations. The new version reproduces realistic transmission rates and simulated malaria seasons are comparable to field observations. Overall the new model version performs much better than the original model. The new model version enables the detection of the epidemic malaria potential at fringes of endemic areas and, more importantly, it is now applicable to the vast area of malaria endemicity in the humid African tropics. Conclusions: A review of entomological and parasitological data from West Africa enabled the construction of a new LMM version. This model version represents a significant step forward in the modelling of a weather-driven malaria transmission cycle. The LMM is now more suitable for use in malaria early warning systems as well as for malaria projections based on climate change scenarios, both in epidemic and endemic malaria areas.

2011-01-01
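
The calibration procedure described in the Methods (score many candidate sets of the undetermined parameters, keep the one with the highest skill) can be sketched generically. Everything below, the toy model, the skill measure, and the observations, is hypothetical; only the loop structure mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(7)
obs = np.array([0.1, 0.4, 0.9, 0.7, 0.2])          # observed seasonal cycle (toy)

def run_model(p1, p2):
    """Stand-in for a model run with one candidate parameter set."""
    return np.clip(p1 * np.array([0.1, 0.5, 1.0, 0.8, 0.3]) + p2, 0.0, 1.0)

# 830 candidate parameter sets, as in the study; skill = negative MSE here
candidates = rng.uniform(0.0, 1.0, size=(830, 2))
skills = [-np.mean((run_model(p1, p2) - obs) ** 2) for p1, p2 in candidates]
best = candidates[int(np.argmax(skills))]
print("best parameter set:", best)
```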

15

Validation of SAGE II NO2 measurements  

NASA Technical Reports Server (NTRS)

The validity of NO2 measurements from the stratospheric aerosol and gas experiment (SAGE) II is examined by comparing the data with climatological distributions of NO2 and by examining the consistency of the observations themselves. The precision at high altitudes is found to be 5 percent, which is also the case at specific low altitudes for certain latitudes where the mixing ratio is 4 ppbv, and the precision is 0.2 ppbv at low altitudes. The autocorrelation distance of the smoothed profile measurement noise is 3-5 km and 10 km for 1-km and 5-km smoothing, respectively. The SAGE II measurements agree with spectroscopic measurements to within 10 percent, and the SAGE measurements are about 20 percent smaller than average limb monitor measurements at the mixing ratio peak. SAGE I and SAGE II measurements are slightly different, but the difference is not attributed to changes in atmospheric NO2.

Cunnold, D. M.; Zawodny, J. M.; Chu, W. P.; Mccormick, M. P.; Pommereau, J. P.; Goutail, F.

1991-01-01

16

Rail Vehicle Dynamics Model Validation.  

National Technical Information Service (NTIS)

The validation of mathematical models of rail vehicle dynamics using test data poses a number of difficult problems, which are addressed in this report. Previous attempts to validate rail vehicle models are reviewed critically, and experience gained in va...

S. E. Shladover; R. L. Hull

1981-01-01

17

A novel spatial and stochastic model to evaluate the within and between farm transmission of classical swine fever virus: II validation of the model.  

PubMed

A new, recently published, stochastic and spatial model for the evaluation of classical swine fever virus (CSFV) spread into Spain has been validated by using several methods. Internal validity, sensitivity analysis, validation using historical data, comparison with other models and experiments on data validity were used to evaluate the overall reliability and consistency of the model. More than 100 modifications in input data and parameters were evaluated. Outputs were obtained after 1000 iterations for each new scenario of the model. As a result, the model was shown to be consistent; the parameters with the greatest influence (>10% change) on the magnitude and duration of the epidemic were the probability of infection by local spread, the time from the infectious state to the onset of clinical signs, the probability of detection based on clinical signs at day t after detection of the index case outside the control and surveillance zones, and the maximum number of farms that could be depopulated at day t. The combination of a within- and between-farm spread model was also shown to give significantly different results than using a purely between-farm spread model. Methods and results presented here were intended to be useful to better understand and apply the model, to identify key parameters for which it will be critical to have good estimates and to provide better support for prevention and control of future CSFV outbreaks. PMID:21899960

Martínez-López, B; Ivorra, B; Ngom, D; Ramos, A M; Sánchez-Vizcaíno, J M

2012-02-24

18

Data Assimilation of Photosynthetic Light-use Efficiency using Multi-angular Satellite Data: II Model Implementation and Validation  

NASA Technical Reports Server (NTRS)

Spatially explicit and temporally continuous estimates of photosynthesis will be of great importance for increasing our understanding of and ultimately closing the terrestrial carbon cycle. Current capabilities to model photosynthesis, however, are limited by the difficulty of representing accurately enough the complexity of the underlying biochemical processes and the numerous environmental constraints imposed upon plant primary production. A potentially powerful alternative to model photosynthesis through these indirect observations is the use of multi-angular satellite data to infer light-use efficiency (e) directly from spectral reflectance properties in connection with canopy shadow fractions. Hall et al. (this issue) introduced a new approach for predicting gross ecosystem production that would allow the use of such observations in a data assimilation mode to obtain spatially explicit variations in e from infrequent polar-orbiting satellite observations, while meteorological data are used to account for the more dynamic responses of e to variations in environmental conditions caused by changes in weather and illumination. In this second part of the study we implement and validate the approach of Hall et al. (this issue) across an ecologically diverse array of eight flux-tower sites in North America using data acquired from the Compact High Resolution Imaging Spectroradiometer (CHRIS) and eddy-flux observations. Our results show significantly enhanced estimates of e and therefore cumulative gross ecosystem production (GEP) over the course of one year at all examined sites. We also demonstrate that e is greatly heterogeneous even across small study areas. Data assimilation and direct inference of GEP from space using a new, proposed sensor could therefore be a significant step towards closing the terrestrial carbon cycle.

Hilker, Thomas; Hall, Forest G.; Tucker, J.; Coops, Nicholas C.; Black, T. Andrew; Nichol, Caroline J.; Sellers, Piers J.; Barr, Alan; Hollinger, David Y.; Munger, J. W.

2012-01-01

19

Validation process of simulation model.  

National Technical Information Service (NTIS)

A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always related to an experimental case. The empirical validation has a residual sense, because the conclusions are based on comparison between ...

M. J. San Isidro Pindado

1997-01-01

20

Verification, validation and accreditation of simulation models  

Microsoft Academic Search

The paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented;

Robert G. Sargent

2000-01-01

21

Internal validation of predictive models  

Microsoft Academic Search

The performance of a predictive model is overestimated when simply determined on the sample of subjects that was used to construct the model. Several internal validation methods are available that aim to provide a more accurate estimate of model performance in new subjects. We evaluated several variants of split-sample, cross-validation and bootstrapping methods with a logistic regression model that included

Ewout W Steyerberg; Frank E Harrell; Gerard J. J. M Borsboom; M. J. C Eijkemans; Yvonne Vergouwe; J. Dik F Habbema

2001-01-01
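
As one concrete variant of the methods this record compares, here is a hedged sketch of Harrell-style bootstrap optimism correction for a logistic regression (synthetic data; the paper's exact variants and performance measures may differ):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Apparent performance on the development sample, minus the average
# "optimism" estimated by refitting the model on bootstrap resamples.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

rng = np.random.default_rng(0)
optimism = []
for _ in range(200):
    idx = rng.integers(0, len(y), len(y))          # bootstrap resample
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    boot_auc = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    orig_auc = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_auc - orig_auc)

print(f"apparent AUC = {apparent:.3f}, "
      f"optimism-corrected AUC = {apparent - np.mean(optimism):.3f}")
```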

22

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results

Robert G. Sargent

1994-01-01

23

Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes

PubMed Central

Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur.

Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

2011-01-01

24

Macrotransport-solidification kinetics modeling of equiaxed dendritic growth: Part II. Computation problems and validation on INCONEL 718 superalloy castings  

Microsoft Academic Search

In Part I of the article, a new analytical model that describes solidification of equiaxed dendrites was presented. In this part of the article, the model is used to simulate the solidification of INCONEL 718 superalloy castings. The model was incorporated into a commercial finite-element code, PROCAST. A special procedure called microlatent heat method (MLHM) was used for coupling between

L. Nastac; D. M. Stefanescu

1996-01-01

25

MEDSLIK-II, a Lagrangian marine surface oil spill model for short-term forecasting - Part 2: Numerical simulations and validations  

NASA Astrophysics Data System (ADS)

In this paper we use MEDSLIK-II, a Lagrangian marine surface oil spill model described in Part 1 (De Dominicis et al., 2013), to simulate oil slick transport and transformation processes for realistic oceanic cases, where satellite or drifting buoys data are available for verification. The model is coupled with operational oceanographic currents, atmospheric analyses winds and remote sensing data for initialization. The sensitivity of the oil spill simulations to several model parameterizations is analyzed and the results are validated using surface drifters, SAR (synthetic aperture radar) and optical satellite images in different regions of the Mediterranean Sea. It is found that the forecast skill of Lagrangian trajectories largely depends on the accuracy of the Eulerian ocean currents: the operational models give useful estimates of currents, but high-frequency (hourly) and high-spatial resolution is required, and the Stokes drift velocity has to be added, especially in coastal areas. From a numerical point of view, it is found that a realistic oil concentration reconstruction is obtained using an oil tracer grid resolution of about 100 m, with at least 100 000 Lagrangian particles. Moreover, sensitivity experiments to uncertain model parameters show that the knowledge of oil type and slick thickness are, among all the others, key model parameters affecting the simulation results. Considering acceptable for the simulated trajectories a maximum spatial error of the order of three times the horizontal resolution of the Eulerian ocean currents, the predictability skill for particle trajectories is from 1 to 2.5 days depending on the specific current regime. This suggests that re-initialization of the simulations is required every day.

De Dominicis, M.; Pinardi, N.; Zodiatis, G.; Archetti, R.

2013-11-01
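
The transport ingredients named above (Eulerian currents, a wind-drag term, Stokes drift, and stochastic spreading over roughly 100 000 particles) can be sketched as a single Lagrangian time step. The coefficients, such as the 3% wind factor and the diffusivity K, are illustrative placeholders, not MEDSLIK-II's calibrated values:

```python
import numpy as np

# One forward-Euler step for surface oil particles: advection by the Eulerian
# current plus a wind-drag term and Stokes drift, plus a random-walk term
# representing horizontal turbulent diffusion.
def step_particles(x, y, u_cur, v_cur, u_wind, v_wind, u_st, v_st,
                   dt=3600.0, wind_factor=0.03, K=10.0, rng=None):
    rng = rng or np.random.default_rng()
    u = u_cur + wind_factor * u_wind + u_st       # total drift velocity (m/s)
    v = v_cur + wind_factor * v_wind + v_st
    sigma = np.sqrt(2.0 * K * dt)                 # random walk for diffusivity K (m^2/s)
    x_new = x + u * dt + rng.normal(0.0, sigma, x.shape)
    y_new = y + v * dt + rng.normal(0.0, sigma, y.shape)
    return x_new, y_new

n = 100_000                                        # particle count, as in the abstract
x = np.zeros(n); y = np.zeros(n)                   # release at the origin (metres)
x, y = step_particles(x, y, u_cur=0.2, v_cur=0.05,
                      u_wind=5.0, v_wind=0.0, u_st=0.02, v_st=0.0)
print("mean drift after 1 h:", x.mean(), y.mean())
```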

26

Fall Risk Assessment for Older Adults: The Hendrich II Model  

Microsoft Academic Search

TARGET POPULATION: The Hendrich II Fall Risk Model is intended to be used in the acute care setting to identify adults at risk for falls. The Model is being validated for further application of the specific risk factors in pediatrics and obstetrical populations. VALIDITY AND RELIABILITY: The Hendrich II Fall Risk Model was validated in a large case control study

Deanna Gray-Miceli

27

Statistical validation of genetic models  

Microsoft Academic Search

Various aspects of statistical validation of genetic models are reviewed. Possible ways of decomposing additive genetic variance over time and ways of comparing predictions at different times are suggested. Ways of circumventing difficulties because of selection are suggested. The uses of cross-validatory techniques are illustrated. Techniques introduced to validate multiple country evaluation are discussed. The analogy of multiple country evaluation

Robin Thompson

2001-01-01

28

Testing and validating environmental models  

USGS Publications Warehouse

Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.

Kirchner, J. W.; Hooper, R. P.; Kendall, C.; Neal, C.; Leavesley, G.

1996-01-01
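
The paper's point that a model should be judged by explicit comparison against available alternatives can be made concrete with a benchmark-relative skill score, 1 - MSE(model)/MSE(alternative). A short sketch with invented streamflow numbers, not data from the paper:

```python
import numpy as np

def skill_score(obs, model, benchmark):
    """Skill relative to an explicit alternative: 1 - MSE(model)/MSE(benchmark).
    Positive values mean the model beats the benchmark; with the observed
    mean as benchmark this reduces to Nash-Sutcliffe efficiency."""
    obs, model, benchmark = map(np.asarray, (obs, model, benchmark))
    return 1.0 - np.mean((obs - model) ** 2) / np.mean((obs - benchmark) ** 2)

# Hypothetical daily streamflow series (m^3/s)
obs   = np.array([3.1, 2.9, 4.5, 6.0, 5.2, 4.1, 3.8])
model = np.array([3.0, 3.2, 4.1, 5.5, 5.6, 4.4, 3.6])
persistence = np.roll(obs, 1)          # yesterday's value as a naive forecast
persistence[0] = obs[0]

print("skill vs. mean       :", skill_score(obs, model, np.full_like(obs, obs.mean())))
print("skill vs. persistence:", skill_score(obs, model, persistence))
```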

29

The influence of synoptic airflow on UK daily precipitation extremes. Part II: regional climate model and E-OBS data validation  

NASA Astrophysics Data System (ADS)

We investigate how well the variability of extreme daily precipitation events across the United Kingdom is represented in a set of regional climate models and the E-OBS gridded data set. Instead of simply evaluating the climatologies of extreme precipitation measures, we develop an approach to validate the representation of physical mechanisms controlling extreme precipitation variability. In part I of this study we applied a statistical model to investigate the influence of the synoptic scale atmospheric circulation on extreme precipitation using observational rain gauge data. More specifically, airflow strength, direction and vorticity are used as predictors for the parameters of the generalised extreme value (GEV) distribution of local precipitation extremes. Here we employ this statistical model for our validation study. In a first step, the statistical model is calibrated against a gridded precipitation data set provided by the UK Met Office. In a second step, the same statistical model is calibrated against 14 ERA40 driven 25 km resolution RCMs from the ENSEMBLES project and the E-OBS gridded data set. Validation indices describing relevant physical mechanisms are derived from the statistical models for observations and RCMs and are compared using pattern standard deviation, pattern correlation and centered pattern root mean squared error as validation measures. The results for the different RCMs and E-OBS are visualised using Taylor diagrams. We show that the RCMs adequately simulate moderately extreme precipitation and the influence of airflow strength and vorticity on precipitation extremes, but show deficits in representing the influence of airflow direction. Also very rare extremes are misrepresented, but this result is afflicted with a high uncertainty. E-OBS shows considerable biases, in particular in regions of sparse data. The proposed approach might be used to validate other physical relationships in regional as well as global climate models.

Maraun, Douglas; Osborn, Timothy J.; Rust, Henning W.

2012-07-01
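
Part I's statistical model lets the GEV parameters depend on airflow predictors; the stationary core of such a model is simply a GEV fit to precipitation block maxima. A hedged sketch with synthetic annual maxima (note scipy's shape-sign convention):

```python
import numpy as np
from scipy.stats import genextreme

# Fit a stationary GEV to annual maxima of daily precipitation (synthetic
# data here; the paper's model additionally conditions the GEV parameters
# on airflow strength, direction and vorticity).
rng = np.random.default_rng(1)
annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=60, random_state=rng)

c, loc, scale = genextreme.fit(annual_max)
print(f"shape={c:.3f} (scipy sign convention: xi = -c), "
      f"loc={loc:.2f}, scale={scale:.2f}")

# 20-year return level: the quantile with annual exceedance probability 1/20
print("20-yr return level:", genextreme.ppf(1.0 - 1.0 / 20.0, c, loc, scale))
```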

30

SASSYS validation with the EBR-II (Experimental Breeder Reactor II) shutdown heat removal tests  

SciTech Connect

SASSYS-1 is a coupled neutronic and thermal-hydraulic code developed for the analysis of transients in liquid-metal-cooled reactors (LMRs). The code is especially suited for evaluating off-normal reactor transients - protected design-basis accidents and unprotected anticipated transients without scram. Because SASSYS is heavily used in support of the Integral Fast Reactor concept and of innovative LMR designs, such as PRISM, a strong validation base for the code must exist. Part of the validation process for SASSYS is analysis of experiments performed on operating reactors, such as the metal-fueled Experimental Breeder Reactor II (EBR-II). During the course of a series of historic whole-plant experiments, EBR-II illustrated key safety features of metal-fueled LMRs. These experiments, the shutdown heat removal tests (SHRTs), culminated in unprotected loss of flow and loss of heat sink transients from full power and flow. Analysis of these and earlier SHRT experiments constitutes a vital part of SASSYS validation, because it facilitates scrutiny of specific SASSYS models and of integrated code capability.

Herzog, J.P. (Argonne National Laboratory, IL (USA))

1989-11-01

31

Black Liquor Combustion Validated Recovery Boiler Modeling, Final Year Report, Volume 2: Appendix I, Section 5, and Appendix II, Section 1  

Microsoft Academic Search

This project was initiated in October 1990 with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. Many of these objectives were accomplished

T. M. Grace; W. J. Frederick; M. Salcudean; R. A. Wessel

1998-01-01

32

SAGE II aerosol validation - Selected altitude measurements, including particle micromeasurements  

NASA Technical Reports Server (NTRS)

The validity of particulate extinction coefficients derived from limb path solar radiance measurements obtained during the Stratospheric Aerosol and Gas Experiment (SAGE) II is tested. The SAGE II measurements are compared with correlative aerosol measurements taken during January 1985, August 1985, and July 1986 with impactors, laser spectrometers, and filter samplers on a U-2 aircraft, an upward pointing lidar on a P-3 aircraft, and balloon-borne optical particle counters. The data for July 29, 1986 are discussed in detail. The aerosol measurements taken on this day at an altitude of 20.5 km produce particulate extinction values which validate the SAGE II values for similar wavelengths.

Oberbeck, Verne R.; Russell, Philip B.; Pueschel, Rudolf F.; Snetsinger, Kenneth G.; Ferry, Guy V.; Livingston, John M.; Rosen, James N.; Osborn, Mary T.; Kritz, Mark A.

1989-01-01

33

Requirements for Validating System Models.  

National Technical Information Service (NTIS)

As computer models become larger and more complex we have to do some hard thinking about how they can be validated, and then how they can become a permanent part of our collective scientific thinking. Until recently, the only form of quantitative express...

M. S. Gussenhoven

1983-01-01

34

MODEL VALIDATION REPORT FOR THE HOUSATONIC RIVER  

EPA Science Inventory

The Model Validation Report will present a comparison of model validation runs to existing data for the model validation period. The validation period spans a twenty year time span to test the predictive capability of the model over a longer time period, similar to that which wil...

35

Statistical validation of system models  

SciTech Connect

It is common practice in system analysis to develop mathematical models for system behavior. Frequently, the actual system being modeled is also available for testing and observation, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of systems when data taken during operation of the system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. An extension of the technique is also suggested, wherein randomness may be included in the mathematical model through the introduction of random variable and random process terms. These terms cause random system behavior that can be compared to the randomness in the bootstrap evaluation of experimental system behavior. In this framework, the stochastic mathematical model can be evaluated. A numerical example is presented to demonstrate the application of the technique.

Barney, P. [Sandia National Labs., Albuquerque, NM (United States); Ferregut, C.; Perez, L.E. [Texas Univ., El Paso, TX (United States); Hunter, N.F. [Los Alamos National Lab., NM (United States); Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States)

1997-01-01
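
A minimal sketch of the bootstrap-based acceptance test described above: estimate a confidence interval for a statistic of the experimental data, then check whether the model's value of the same statistic falls inside it. The data and the acceptance rule's details are illustrative:

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for a statistic of the
    experimental data."""
    rng = rng or np.random.default_rng(0)
    data = np.asarray(data)
    boots = [stat(rng.choice(data, size=data.size, replace=True))
             for _ in range(n_boot)]
    return np.quantile(boots, [alpha / 2, 1.0 - alpha / 2])

# Hypothetical measured responses vs. the same statistic from the
# mathematical model: the model is judged acceptable (for this measure)
# if its statistic falls inside the experimental confidence interval.
measured = np.array([1.02, 0.95, 1.10, 0.99, 1.05, 0.93, 1.08, 1.01])
model_stat = 1.00                              # e.g., model-predicted mean

lo, hi = bootstrap_ci(measured, np.mean)
print(f"95% CI for measured mean: [{lo:.3f}, {hi:.3f}]")
print("model accepted" if lo <= model_stat <= hi else "model rejected")
```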

36

Turbulence Modeling Verification and Validation  

NASA Technical Reports Server (NTRS)

Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. 
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.

Rumsey, Christopher L.

2014-01-01
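
Of the two verification routes the paper names, the method of manufactured solutions is easy to demonstrate on a toy solver. The sketch below (not from the paper) manufactures an exact solution for -u'' = f, solves with second-order central differences, and checks that the observed convergence order is near 2:

```python
import numpy as np

# Method of manufactured solutions (MMS): pick u_exact = sin(pi x), derive
# the forcing f = pi^2 sin(pi x) that makes it an exact solution of
# -u'' = f on (0,1) with u(0)=u(1)=0, then confirm the discrete solution
# converges at the scheme's design order (2).
def solve(n):
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    f = np.pi**2 * np.sin(np.pi * x[1:-1])
    # Tridiagonal system for the central-difference discretization of -u''
    A = (np.diag(2.0 * np.ones(n - 1)) +
         np.diag(-np.ones(n - 2), 1) + np.diag(-np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))   # max-norm error

errors = [solve(n) for n in (16, 32, 64, 128)]
orders = [np.log2(errors[i] / errors[i + 1]) for i in range(3)]
print("observed orders of accuracy:", [f"{p:.2f}" for p in orders])
```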

37

Model Validation with Hybrid Dynamic Simulation  

SciTech Connect

Model validation has been one of the central topics in power engineering studies for years. As model validation aims at obtaining reasonable models to represent actual behavior of power system components, it has been essential to validate models against actual measurements or known benchmark behavior. System-wide model simulation results can be compared with actual recordings. However, it is difficult to construct a simulation case for a large power system such as the WECC system and to narrow down to problematic models in a large system. Hybrid dynamic simulation with its capability of injecting external signals into dynamic simulation enables rigorous comparison of measurements and simulation in a small subsystem of interest. This paper presents such a model validation methodology with hybrid dynamic simulation. Two application examples on generator and load model validation are presented to show the validity of this model validation methodology. This methodology is further extended for automatic model validation and dichotomous subsystem model validation.

Huang, Zhenyu; Kosterev, Dmitry; Guttromson, Ross T.; Nguyen, Tony B.

2006-06-18
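
A toy version of the signal-injection idea described above: drive a candidate dynamic model with a "measured" input, then compare its response with the "measured" output. The first-order model and signals below are synthetic stand-ins, not power-system component models:

```python
import numpy as np

def simulate(u, dt, tau, gain):
    """First-order lag tau*y' = gain*u - y, integrated by forward Euler."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = y[k - 1] + dt * (gain * u[k - 1] - y[k - 1]) / tau
    return y

dt = 0.01
t = np.arange(0.0, 10.0, dt)
u_rec = np.where(t > 1.0, 1.0, 0.0)             # "measured" input: step at t = 1 s
y_rec = simulate(u_rec, dt, tau=0.8, gain=1.5)  # "measured" response (truth)
y_sim = simulate(u_rec, dt, tau=1.0, gain=1.5)  # candidate model with wrong tau

rmse = np.sqrt(np.mean((y_sim - y_rec) ** 2))
print(f"playback-validation RMSE = {rmse:.4f}")  # large error flags a bad model
```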

38

Statistical validation of stochastic models  

SciTech Connect

It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.

Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

1996-12-31

39

Modelling leaching of inorganic Hg(II) in a Scandinavian iron-humus podzol — validation and long-term leaching under various deposition rates  

Microsoft Academic Search

Increasing mercury contents are reported from freshwater systems and fish in northern Europe and North America. Mercury input from soils is a major source with the leaching being affected by increased atmospheric mercury deposition compared to pre-industrial times and by other environmental conditions such as acid rain. The results of a mathematical model-calculation of vertical inorganic Hg(II) leaching in a

K. Schlüter; S. Gäth

1997-01-01

40

MODELLING LEACHING OF INORGANIC Hg(II) IN A SCANDINAVIAN IRON-HUMUS PODZOL - VALIDATION AND LONG-TERM LEACHING UNDER VARIOUS DEPOSITION RATES  

Microsoft Academic Search

Increasing mercury contents are reported from freshwater systems and fish in northern Europe and North America. Mercury input from soils is a major source with the leaching being affected by increased atmospheric mercury deposition compared to pre-industrial times and by other environmental conditions such as acid rain. The results of a mathematical model-calculation of vertical inorganic Hg(II) leaching in a

K. SCHLÜTER; S. GÄTH

1997-01-01

41

Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces  

NASA Technical Reports Server (NTRS)

A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection line modeling and results of several test cases used to validate the capability.

Raiszadeh, Ben; Queen, Eric M.

2002-01-01

42

The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models.  

PubMed

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis. PMID:20676074

Shi, Leming; Campbell, Gregory; Jones, Wendell D; Campagne, Fabien; Wen, Zhining; Walker, Stephen J; Su, Zhenqiang; Chu, Tzu-Ming; Goodsaid, Federico M; Pusztai, Lajos; Shaughnessy, John D; Oberthuer, André; Thomas, Russell S; Paules, Richard S; Fielden, Mark; Barlogie, Bart; Chen, Weijie; Du, Pan; Fischer, Matthias; Furlanello, Cesare; Gallas, Brandon D; Ge, Xijin; Megherbi, Dalila B; Symmans, W Fraser; Wang, May D; Zhang, John; Bitter, Hans; Brors, Benedikt; Bushel, Pierre R; Bylesjo, Max; Chen, Minjun; Cheng, Jie; Cheng, Jing; Chou, Jeff; Davison, Timothy S; Delorenzi, Mauro; Deng, Youping; Devanarayan, Viswanath; Dix, David J; Dopazo, Joaquin; Dorff, Kevin C; Elloumi, Fathi; Fan, Jianqing; Fan, Shicai; Fan, Xiaohui; Fang, Hong; Gonzaludo, Nina; Hess, Kenneth R; Hong, Huixiao; Huan, Jun; Irizarry, Rafael A; Judson, Richard; Juraeva, Dilafruz; Lababidi, Samir; Lambert, Christophe G; Li, Li; Li, Yanen; Li, Zhen; Lin, Simon M; Liu, Guozhen; Lobenhofer, Edward K; Luo, Jun; Luo, Wen; McCall, Matthew N; Nikolsky, Yuri; Pennello, Gene A; Perkins, Roger G; Philip, Reena; Popovici, Vlad; Price, Nathan D; Qian, Feng; Scherer, Andreas; Shi, Tieliu; Shi, Weiwei; Sung, Jaeyun; Thierry-Mieg, Danielle; Thierry-Mieg, Jean; Thodima, Venkata; Trygg, Johan; Vishnuvajjala, Lakshmi; Wang, Sue Jane; Wu, Jianping; Wu, Yichao; Xie, Qian; Yousef, Waleed A; Zhang, Liang; Zhang, Xuegong; Zhong, Sheng; Zhou, Yiming; Zhu, Sheng; Arasappan, Dhivya; Bao, Wenjun; Lucas, Anne Bergstrom; Berthold, Frank; Brennan, Richard J; Buness, Andreas; Catalano, Jennifer G; Chang, Chang; Chen, Rong; Cheng, Yiyu; Cui, Jian; Czika, Wendy; Demichelis, Francesca; Deng, Xutao; Dosymbekov, Damir; Eils, Roland; Feng, Yang; Fostel, Jennifer; Fulmer-Smentek, Stephanie; Fuscoe, James C; Gatto, Laurent; Ge, Weigong; Goldstein, Darlene R; Guo, Li; Halbert, Donald N; Han, Jing; Harris, Stephen C; Hatzis, Christos; Herman, Damir; Huang, Jianping; Jensen, Roderick V; Jiang, Rui; Johnson, Charles D; Jurman, Giuseppe; Kahlert, Yvonne; Khuder, Sadik A; Kohl, Matthias; Li, Jianying; Li, Li; Li, Menglong; Li, Quan-Zhen; Li, Shao; Li, Zhiguang; Liu, Jie; Liu, Ying; Liu, Zhichao; Meng, Lu; Madera, Manuel; Martinez-Murillo, Francisco; Medina, Ignacio; Meehan, Joseph; Miclaus, Kelci; Moffitt, Richard A; Montaner, David; Mukherjee, Piali; Mulligan, George J; Neville, Padraic; Nikolskaya, Tatiana; Ning, Baitang; Page, Grier P; Parker, Joel; Parry, R Mitchell; Peng, Xuejun; Peterson, Ron L; Phan, John H; Quanz, Brian; Ren, Yi; Riccadonna, Samantha; Roter, Alan H; Samuelson, Frank W; Schumacher, Martin M; Shambaugh, Joseph D; Shi, Qiang; Shippy, Richard; Si, Shengzhu; Smalter, Aaron; Sotiriou, Christos; Soukup, Mat; Staedtler, Frank; Steiner, Guido; Stokes, Todd H; Sun, Qinglan; Tan, Pei-Yi; Tang, Rong; Tezak, Zivana; Thorn, Brett; Tsyganova, Marina; Turpaz, Yaron; Vega, Silvia C; Visintainer, Roberto; von Frese, Juergen; Wang, Charles; Wang, Eric; Wang, Junwei; Wang, Wei; Westermann, Frank; Willey, James C; Woods, Matthew; Wu, Shujian; Xiao, Nianqing; Xu, Joshua; Xu, Lei; Yang, Lun; Zeng, Xiao; Zhang, Jialu; Zhang, Li; Zhang, Min; Zhao, Chen; Puri, Raj K; Scherf, Uwe; Tong, Weida; Wolfinger, Russell D

2010-08-01

43

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor 2. Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1992-07-01

44

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor 2. Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1992-01-01

45

Validation for a recirculation model.  

PubMed

Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation. PMID:11318387

LaPuma, P T

2001-04-01
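
The mass-balance core of such a recirculation model can be sketched for the steady-state, well-mixed case: if the recirculated stream is not cleaned of a given contaminant, only the fresh-air fraction removes it. This is a generic sketch under those assumptions, with invented numbers, not the article's Excel model:

```python
# Steady-state, well-mixed mass balance for a booth with recirculation:
# generation G is removed only by the fresh-air fraction (1 - R) of the
# total supply Q, giving C = G / (Q * (1 - R)).
def steady_state_concentration(G_mg_per_min, Q_m3_per_min, R):
    if not 0.0 <= R < 1.0:
        raise ValueError("recirculation fraction must be in [0, 1)")
    return G_mg_per_min / (Q_m3_per_min * (1.0 - R))

G, Q = 500.0, 1000.0                      # mg/min generated, m^3/min supplied
for R in (0.0, 0.5, 0.8, 0.9):
    print(f"R = {R:.1f}  ->  C = {steady_state_concentration(G, Q, R):.2f} mg/m^3")
```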

46

SRVAL. Stock-Recruitment Model VALidation Code  

SciTech Connect

SRVAL is a computer simulation model of the Hudson River striped bass population. It was designed to aid in assessing the validity of curve-fits of the linearized Ricker stock-recruitment model, modified to incorporate multiple-age spawners and to include an environmental variable, to variously processed annual catch-per-unit-effort (CPUE) statistics for a fish population. It is sometimes asserted that curve-fits of this kind can be used to determine the sensitivity of fish populations to such man-induced stresses as entrainment and impingement at power plants. SRVAL was developed to test such assertions and was utilized in testimony written in connection with the Hudson River Power Case (U. S. Environmental Protection Agency, Region II).

Christensen, S.W. [Oak Ridge National Lab., Oak Ridge, TN (United States)

1989-12-07
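
The linearized Ricker curve-fit that SRVAL was built to scrutinize reduces to ordinary least squares on ln(R/S) versus S. A sketch with synthetic stock-recruitment data (the multiple-age-spawner and environmental-variable extensions mentioned above are omitted):

```python
import numpy as np

# Ricker model: R = a * S * exp(-b * S)  =>  ln(R/S) = ln(a) - b * S,
# a straight line that can be fit by ordinary least squares.
rng = np.random.default_rng(42)
a_true, b_true = 4.0, 0.002
S = rng.uniform(100.0, 1500.0, 40)                      # spawning stock (toy CPUE)
R = a_true * S * np.exp(-b_true * S) * rng.lognormal(0.0, 0.3, S.size)

slope, intercept = np.polyfit(S, np.log(R / S), 1)      # OLS on the linear form
print(f"estimated a = {np.exp(intercept):.2f} (true {a_true}), "
      f"estimated b = {-slope:.4f} (true {b_true})")
```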

47

Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part I: Benchmark comparisons of WIMS-D5 and DRAGON cell and control rod parameters with MCNP5  

SciTech Connect

In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which has been progressing slowly during the last ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO₂ rods with an active length of 530 cm. For the reactor physics area, a revision and update of calculation methods and models (cell, supercell and reactor) was recently carried out, covering cell, supercell (control rod) and core calculations. As a validation of the new models some benchmark comparisons were done with Monte Carlo calculations with MCNP5. This paper presents comparisons of cell and supercell benchmark problems based on a slightly idealized model of the Atucha-I core obtained with the WIMS-D5 and DRAGON codes with MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)

Mollerach, R. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina); Leszczynski, F. [Comision Nacional de Energia Atomica, Avenida del Libertador 8250, Buenos Aires 1429 (Argentina); Fink, J. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina)

2006-07-01

48

Validation plan for the German CAMAELEON model  

NASA Astrophysics Data System (ADS)

Engineers and scientists at the US Army's Night Vision and Electronic Sensors Directorate (NVESD) are in the process of evaluating the German CAMAELEON model, a signature evaluation model that was created for use in designing and evaluating camouflage in the visible spectrum and is based on computational vision methodologies. Verification and preliminary validation have been very positive. For this reason, NVESD has planned and is currently in the early execution phase of a more elaborate validation effort using data from an Army field exercise known as DISSTAF-II. The field exercise involved tank gunners, using the currently fielded M1 Abrams tank sights, searching for, targeting, and 'firing on' (i.e., pulling the trigger to mark target location) a variety of foreign and domestic vehicles in realistic scenarios. Data from this field exercise will be combined with results of a laboratory measurement of perceptual target detectabilities. The purpose of the laboratory measurement is to separate modeled effects from unmodeled effects in the field data. In the laboratory, observers will perform a task as similar as possible to that modeled by CAMAELEON. An important feature of these data is that the observers will know where the target is located and will rate the detectability of the targets in a paired comparison experiment utilizing the X-based perceptual experiment testbed developed at the University of Tennessee. For the laboratory measurement the subjects will view exactly the same images as those to be analyzed by CAMAELEON. Three correlations are expected to be especially important. The correlation between perceptual detectability and model predictions will show the accuracy with which the model predicts human performance of the modeled task (rating target detectabilities). The correlation between laboratory and field data will show how well perceived detectability predicts tank gunner target detection in a realistic scenario. Finally, the correlation between model predictions and detection probabilities will show the extent to which the model can actually predict human field performance.

McManamey, James R.

1997-06-01

49

Model Validation with Hybrid Dynamic Simulation  

SciTech Connect

Model validation has been one of the central topics in power engineering studies for years. Because model validation aims at obtaining reasonable models to represent the actual behavior of power system components, it is essential to validate models against actual measurements or known benchmark behavior. System-wide model simulation results can be compared with actual recordings. However, it is difficult to construct a simulation case for a large power system such as the WECC system and to narrow down to problematic models in a large system. Hybrid dynamic simulation, with its capability of injecting external signals into dynamic simulation, enables rigorous comparison of measurements and simulation in a small subsystem of interest. This paper presents such a model validation methodology with hybrid dynamic simulation. Two application examples on generator and load model validation are presented to show the validity of this model validation methodology. The methodology is further extended for automatic model validation and dichotomous subsystem model validation. A few methods to define model quality indices have been proposed to quantify model error for model validation criteria development.
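
As one illustration of a model quality index of the general kind proposed (this particular definition is an assumption made here for illustration, not the authors'), a played-back measurement and the corresponding simulated response can be compared with a normalized RMS error:

    import numpy as np

    # Sketch of a simple model quality index: 1.0 means the simulated
    # subsystem response exactly reproduces the measured response.
    def quality_index(measured, simulated):
        err = np.sqrt(np.mean((measured - simulated) ** 2))
        span = measured.max() - measured.min()
        return 1.0 - err / span

    t = np.linspace(0, 10, 501)
    measured = np.sin(t)                                  # recorded response
    simulated = np.sin(t) + 0.05 * np.random.default_rng(2).normal(size=t.size)
    print(quality_index(measured, simulated))             # close to 1.0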

Huang, Zhenyu; Kosterev, Dmitry; Guttromson, Ross T.; Nguyen, Tony B.

2006-06-22

50

Validation of matrix diffusion modeling  

NASA Astrophysics Data System (ADS)

Crystalline rock has been chosen as the host medium for the repository of highly radioactive spent nuclear fuel in Finland. Radionuclide transport takes place along water-carrying fractures, and matrix diffusion has been identified as an important retarding mechanism that affects the transport of mobile fission and activation products. The model introduced here for matrix diffusion contains a flow channel facing a porous matrix with stagnant water into which tracer molecules advected in the channel can diffuse. In addition, the possibility of a finite depth of the matrix and an initial tracer distribution ('contamination') in the matrix are included in the model. In order to validate the developed matrix diffusion model, a relatively simple measuring system was constructed. Matrix diffusion was illustrated by observing the migration of 0.1 ml KCl pulses in the water flowing through a channel facing a porous matrix made of synthetic fibre felt. Migration of K+ and Cl- ions was monitored by measuring the electrical conductivity of the solution. The experimental system also allowed measurements of the concentration profile inside the porous matrix, but the focus here is on the input and output (breakthrough) pulses. Measurements were performed for two different initial distributions of KCl tracer in the porous matrix. There was excellent agreement between modeling and experimental results, with consistent values for the diffusion coefficient used as the fitting parameter.
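
As a sketch, a standard formulation of such a channel-matrix system can be written as follows; the notation is chosen here for illustration and may differ from the paper's own:

    \frac{\partial c}{\partial t} + v\,\frac{\partial c}{\partial x}
      = \frac{\varepsilon D_m}{b}\left.\frac{\partial c_m}{\partial z}\right|_{z=0},
    \qquad
    \frac{\partial c_m}{\partial t} = D_m\,\frac{\partial^2 c_m}{\partial z^2},

where c(x,t) is the tracer concentration in a channel of half-aperture b carrying water at velocity v, and c_m(z,t) is the concentration in the matrix pore water (porosity ε, pore diffusion coefficient D_m). The coupling conditions are c_m = c at the channel wall z = 0 and, for a matrix of finite depth L, a no-flux condition ∂c_m/∂z = 0 at z = L; a nonzero initial profile c_m(z,0) represents the 'contamination' case mentioned above.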

Voutilainen, M.; Kekäläinen, P.; Hautojärvi, A.; Timonen, J.

51

Assimilation of Geosat Altimetric Data in a Nonlinear Shallow-Water Model of the Indian Ocean by Adjoint Approach. Part II: Some Validation and Interpretation of the Assimilated Results

NASA Technical Reports Server (NTRS)

This paper examines the results of assimilating Geosat sea level variations, relative to the November 1986-November 1988 mean reference, in a nonlinear reduced-gravity model of the Indian Ocean. Data were assimilated during one year starting in November 1986, with the objective of optimizing the initial conditions and the yearly averaged reference surface. The thermocline slope simulated by the model with or without assimilation is validated by comparison with the signal that can be derived from expendable bathythermograph measurements performed in the Indian Ocean at that time. The topography simulated with assimilation in November 1986 is in very good agreement with the hydrographic data. The slopes corresponding to the South Equatorial Current and to the South Equatorial Countercurrent are better reproduced with assimilation than without during the first nine months. The whole circulation of the cyclonic gyre south of the equator is then strongly intensified by assimilation. Another assimilation experiment is run over the following year starting in November 1987. The difference between the two yearly mean surfaces simulated with assimilation is in excellent agreement with Geosat. In the southeastern Indian Ocean, the correction to the yearly mean dynamic topography due to assimilation over the second year is negatively correlated with the one from the year before. This correction is also in agreement with hydrographic data. It is likely that the signal corrected by assimilation is not only due to wind error, because simulations driven by various wind forcings present the same features over the two years. Model simulations run with a prescribed throughflow transport anomaly indicate that assimilation is rather correcting the interior of the model domain for inadequate boundary conditions with the Pacific.

Greiner, Eric; Perigaud, Claire

1996-01-01

52

A tutorial on verification and validation of simulation models  

Microsoft Academic Search

In this tutorial paper we give a general introduction to verification and validation of simulation models, define the various validation techniques, and present a recommended model validation procedure.

Robert G. Sargent

1984-01-01

53

Verification and validation of simulation models  

Microsoft Academic Search

This paper surveys verification and validation of models, especially simulation models in operations research. For verification it discusses 1) general good programming practice (such as modular programming), 2) checking intermediate simulation outputs through tracing and statistical testing per module, 3) statistical testing of final simulation outputs against analytical results, and 4) animation. For validation it discusses 1) obtaining real-world data,

Jack P. C. Kleijnen

1995-01-01

54

Paleoclimate validation of a numerical climate model.  

National Technical Information Service (NTIS)

An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the...

F. J. Schelling H. W. Church B. D. Zak S. L. Thompson

1994-01-01

55

Modeling Input Validation in UML  

Microsoft Academic Search

Security is an integral part of most software systems, but it is not yet considered an explicit part of the development process. Input validation is the most critical aspect of software security that is not covered in the design phase of the software development life-cycle, resulting in many security vulnerabilities. Our objective is to extend UML into a new integrated framework

Pedram Hayati; Nastaran Jafari; S. Mohammad Rezaei; Saeed Sarenche; Vidyasagar Potdar

2008-01-01

56

NASA GSFC CCMC Recent Model Validation Activities  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) holds the largest assembly of state-of-the-art physics-based space weather models developed by the international space physics community. In addition to providing the community easy access to these modern space research models to support science research, another primary goal is to test and validate models for transition from research to operations. In this presentation, we provide an overview of the space science models available at CCMC. Then we focus on the community-wide model validation efforts led by CCMC in all domains of the Sun-Earth system and the internal validation efforts at CCMC to support space weather services/operations provided by its sibling organization, the NASA GSFC Space Weather Center (http://swc.gsfc.nasa.gov). We also discuss our efforts in operational model validation in collaboration with NOAA/SWPC.

Rastaetter, L.; Pulkkinen, A.; Taktakishvili, A.; Macneice, P.; Shim, J. S.; Zheng, Yihua; Kuznetsova, M. M.; Hesse, M.

2012-01-01

57

Traveling with cognitive tests: testing the validity of a KABC-II adaptation in India.  

PubMed

The authors evaluated the adequacy of an extensive adaptation of the American Kaufman Assessment Battery for Children, second edition (KABC-II), for 6- to 10-year-old Kannada-speaking children of low socioeconomic status in Bangalore, South India. The adapted KABC-II was administered to 598 children. Subtests showed high reliabilities, the Cattell-Horn-Carroll model underlying the original KABC-II was largely replicated, and external relations with demographic characteristics and an achievement measure were consistent with expectations. The subtests showed relatively high loadings on the general cognitive factor, presumably because of the high task novelty and, hence, cognitive complexity of the tests for the children. The findings support the suitability and validity of the KABC-II adaptation. The authors emphasize that test adaptations can only be adequate if they meet both judgmental (qualitative) and statistical (quantitative) adaptation criteria. PMID:19745212

Malda, Maike; van de Vijver, Fons J R; Srinivasan, Krishnamachari; Transler, Catherine; Sukumar, Prathima

2010-03-01

58

Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces  

NASA Technical Reports Server (NTRS)

A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
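
A minimal sketch of how a tension-only "flexible line" coupling force between a parachute node and a suspended body might be computed is given below; the stiffness, damping, and function signature are illustrative assumptions, not the actual POST 2 implementation.

    import numpy as np

    # Tension-only line: produces force only when stretched beyond its rest
    # length (a line cannot push). Spring plus damping along the line axis.
    def line_force(x_a, x_b, v_a, v_b, rest_length, k=5000.0, c=50.0):
        d = x_b - x_a
        length = np.linalg.norm(d)
        if length <= rest_length:
            return np.zeros(3)                 # slack line: no force
        u = d / length                         # unit vector from A toward B
        stretch_rate = np.dot(v_b - v_a, u)
        magnitude = k * (length - rest_length) + c * stretch_rate
        return max(magnitude, 0.0) * u         # force on body A, toward B

    # Example: a 10 m line stretched to 12 m pulls body A toward body B.
    print(line_force(np.zeros(3), np.array([0.0, 0.0, 12.0]),
                     np.zeros(3), np.zeros(3), rest_length=10.0))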

Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

2009-01-01

59

SAGE II aerosol data validation - Comparative studies of SAGE II and SAM II data sets  

NASA Technical Reports Server (NTRS)

Data from the Stratospheric Aerosol and Gas Experiment (SAGE II) satellite are compared with data from the Stratospheric Aerosol Measurement (SAM II) satellite. Both experiments produce aerosol extinction profiles by measuring the attenuation of solar radiation during each sunrise and sunset observed by the satellite. The SAGE II obtains profiles at 1.02 microns and three shorter wavelengths, whereas the SAM II measures at a single radiometric channel at 1.0 microns. It is found that the differences between the two sets of data are generally within the error bars associated with each measurement. In addition, the sunrise and sunset data from SAGE II are analyzed.

Yue, G. K.; Mccormick, M. P.; Chu, W. P.; Wang, P. H.; Osborn, M. T.

1989-01-01

60

Validity of geographically modeled environmental exposure estimates.  

PubMed

Geographic modeling is increasingly being used to estimate long-term environmental exposures in epidemiologic studies of chronic disease outcomes. However, without validation against measured environmental concentrations, personal exposure levels, or biologic doses, these models cannot be assumed a priori to be accurate. This article discusses three examples of epidemiologic associations involving exposures estimated using geographic modeling, and identifies important issues that affect geographically modeled exposure assessment in these areas. In air pollution epidemiology, geographic models of fine particulate matter levels have frequently been validated against measured environmental levels, but comparisons between ambient and personal exposure levels have shown only moderate correlations. Estimating exposure to magnetic fields by using geographically modeled distances is problematic because the error is larger at short distances, where field levels can vary substantially. Geographic models of environmental exposure to pesticides, including paraquat, have seldom been validated against environmental or personal levels, and validation studies have yielded inconsistent and typically modest results. In general, the exposure misclassification resulting from geographic models of environmental exposures can be differential and can result in bias away from the null even if non-differential. Therefore, geographic exposure models must be rigorously constructed and validated if they are to be relied upon to produce credible scientific results to inform epidemiologic research. To our knowledge, such models have not yet successfully predicted an association between an environmental exposure and a chronic disease outcome that has eventually been established as causal, and may not be capable of doing so in the absence of thorough validation. PMID:24766059

Chang, Ellen T; Adami, Hans-Olov; Bailey, William H; Boffetta, Paolo; Krieger, Robert I; Moolgavkar, Suresh H; Mandel, Jack S

2014-05-01

61

Validation of the STAFF-5 Computer Model.  

National Technical Information Service (NTIS)

STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature calculating abilities of th...

J. F. Fletcher S. R. Fields

1981-01-01

62

Validation of Software Reliability Models.  

National Technical Information Service (NTIS)

This report presents the results of a study and investigation of software reliability models. In particular, the purpose was to investigate the statistical properties of selected software reliability models, including the statistical properties of the par...

R. E. Schafer J. E. Angus J. F. Alter S. E. Emoto

1979-01-01

63

Tank waste source term inventory validation. Volume II. Letter report  

SciTech Connect

This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for the selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, were arranged in a tabular format and were plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories.

NONE

1995-04-01

64

An approach to validation of thermomechanical models  

SciTech Connect

Thermomechanical models are being developed to support the design of an Exploratory Studies Facility (ESF) and a potential high-level nuclear waste repository at Yucca Mountain, Nevada. These models are used for preclosure design of underground openings, such as access drifts, emplacement drifts, and waste emplacement boreholes; and in support of postclosure issue resolution relating to waste canister performance, disturbance of the hydrological properties of the host rock, and overall system performance assessment. For both design and performance assessment, the purpose of using models in analyses is to better understand and quantify some phenomenon or process. Therefore, validation is an important process that must be pursued in conjunction with the development and application of models. The Site Characterization Plan (SCP) addressed some general aspects of model validation, but no specific approach has, as yet, been developed for either design or performance assessment models. This paper will discuss a proposed process for thermomechanical model validation and will focus on the use of laboratory and in situ experiments as part of the validation process. The process may be generic enough in nature that it could be applied to the validation of other types of models, for example, models of unsaturated hydrologic flow.

Costin, L.S. [Sandia National Labs., Albuquerque, NM (United States); Hardy, M.P.; Brechtel, C.E. [Agapito (J.F.T.) and Associates, Inc., Grand Junction, CO (United States)

1993-08-01

65

Hot gas defrost model development and validation  

Microsoft Academic Search

This paper describes the development, validation, and application of a transient model for predicting the heat and mass transfer effects associated with an industrial air-cooling evaporator during a hot gas defrost cycle. The inputs to the model include the space dry bulb temperature, space humidity, coil geometry, frost thickness, frost density, and hot gas inlet temperature. The model predicts the

N. Hoffenbecker; S. A. Klein; D. T. Reindl

2005-01-01

66

Real-time validation of mechanical models coupling PGD and constitutive relation error  

NASA Astrophysics Data System (ADS)

In this work, we introduce a general framework that enables real-time validation of mechanical models. This framework is based on two main ingredients: (i) the constitutive relation error, which constitutes a convenient and mechanically sound tool for model validation; (ii) a powerful method for model reduction, the proper generalized decomposition, which is used to compute a solution with separated representations and thus to run the validation process quickly. The performance of the proposed approach is illustrated on machining applications.

Bouclier, Robin; Louf, François; Chamoin, Ludovic

2013-10-01

67

SASSYS validation with the EBR-II shutdown heat removal tests  

SciTech Connect

SASSYS is a coupled neutronic and thermal hydraulic code developed for the analysis of transients in liquid metal cooled reactors (LMRs). The code is especially suited for evaluating reactor transients, both protected (design basis) and unprotected (anticipated transient without scram). Because SASSYS is heavily used in support of the IFR concept and of innovative LMR designs, such as PRISM, a strong validation base for the code must exist. Part of the validation process for SASSYS is analysis of experiments performed on operating reactors, such as the metal fueled Experimental Breeder Reactor II (EBR-II). During the course of a series of historic whole-plant experiments, EBR-II illustrated key safety features of metal fueled LMRs. These experiments, the Shutdown Heat Removal Tests (SHRT), culminated in unprotected loss of flow and loss of heat sink transients from full power and flow. Analysis of these and earlier SHRT experiments constitutes a vital part of SASSYS validation, because it facilitates scrutiny of specific SASSYS models and of integrated code capability. 12 refs., 11 figs.

Herzog, J.P. (Argonne National Lab., IL (USA))

1989-01-01

68

Validation of a Lagrangian particle model  

NASA Astrophysics Data System (ADS)

In this paper a custom-developed model of dispersion of pollutants is presented. The proposed approach is based on both a Lagrangian particle model and an urban-scale diagnostic model of the air velocity field. Both models constitute a part of an operational air quality assessment system. The proposed model is validated by comparing its computed results with the results of measurements obtained in a wind tunnel reflecting conditions of the Mock Urban Setting Test (MUST) experiment. Commonly used measures of errors and model concordance are employed and the results obtained are additionally compared with those obtained by other authors for CFD and non-CFD class models. The obtained results indicate that the validity of the model presented in this paper is acceptable.

Brzozowska, Lucyna

2013-05-01

69

Fraction Model II  

NSDL National Science Digital Library

With this tool, students can explore different representations for fractions. They can create a fraction, selecting any numerator or denominator up to 20, and see a model of the fraction as well as its percent and decimal equivalents. For the model, they can choose either a circle, a rectangle, or a set model.

Illuminations, Nctm

2000-01-01

70

Validation of the Hot Strip Mill Model  

SciTech Connect

The Hot Strip Mill Model (HSMM) is an off-line, PC-based software package originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of 5 North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

Richard Shulkosky; David Rosberg; Jerrud Chapman

2005-03-30

71

Validation plan for the German CAMAELEON model  

Microsoft Academic Search

Engineers and scientists at the US Army's Night Vision and Electronic Sensors Directorate (NVESD) are in the process of evaluating the German CAMAELEON model, a signature evaluation model that was created for use in designing and evaluating camouflage in the visible spectrum and is based on computational vision methodologies. Verification and preliminary validation have been very positive. For this reason,

James R. McManamey

1997-01-01

72

Structural system identification: Structural dynamics model validation  

SciTech Connect

Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

Red-Horse, J.R.

1997-04-01

73

Feature extraction for structural dynamics model validation  

SciTech Connect

This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that had to be considered were sensitivity, dimensionality, type of response, and the presence or absence of measurement noise in the response. Furthermore, we illustrate a comparison method of multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must consider not only the sensitivity of the features being used, but also the correlation of the parameters being compared.
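
The outlier-detection idea reduces to scoring a model run's feature vector against the distribution of feature vectors extracted from experiments; in this minimal sketch the feature names and data are synthetic stand-ins.

    import numpy as np

    # A large Mahalanobis distance flags a model run whose features are
    # inconsistent with the experimentally observed feature distribution.
    def mahalanobis(x, data):
        mu = data.mean(axis=0)
        cov = np.cov(data, rowvar=False)
        d = x - mu
        return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

    rng = np.random.default_rng(0)
    experimental = rng.normal(size=(50, 3))   # e.g. RMS level, peak frequency, damping
    model_run = np.array([0.1, -0.2, 0.3])
    print(mahalanobis(model_run, experimental))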

Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

2010-11-08

74

Verification and Validation of the Sparc Model.  

National Technical Information Service (NTIS)

SPARC (SPARC Performs Automated Reasoning in Chemistry) chemical reactivity models were validated on more than 5000 ionization pKas (in the gas phase and in many organic solvents including water as a function of temperature), 1200 carboxylic acid ester hy...

S. H. Hilal S. W. Karickhoff L. A. Carreira

2003-01-01

75

Full-Scale Cookoff Model Validation Experiments  

Microsoft Academic Search

This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release

M A McClelland; M K Rattanapote; E R Heimdahl; W E Erikson; P O Curran; A I Atwood

2003-01-01

76

MELPROG debris meltdown model and validation experiments

Microsoft Academic Search

The MELPROG computer code is being developed to provide mechanistic treatment of Light Water Reactor (LWR) accidents from accident initiation through vessel failure. This paper describes a two-dimensional (r-z) debris meltdown model that is being developed for use in the MELPROG code and discusses validation experiments. Of interest to this study is melt progression in particle beds that can form

S. S. Dosanjh; R. O. Gauntt

1988-01-01

77

Real-time remote scientific model validation  

NASA Technical Reports Server (NTRS)

This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment to an investigator's preflight model. The comparison, performed in real time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team are able to react to their experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of twelve preflight hypotheses associated with the investigator's model, seven were confirmed and five were rejected or compromised.

Frainier, Richard; Groleau, Nicolas

1994-01-01

78

Validation of Hadronic Models in Geant4  

SciTech Connect

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

Koi, Tatsumi; Wright, Dennis H. [Stanford Linear Accelerator Center, Menlo Park, California (United States); Folger, Gunter; Ivantchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai [CERN, Geneva (Switzerland); Heikkinen, Aatos [Helsinki Institute of Physics, Helsinki (Finland); Truscott, Pete; Lei, Fan [QinetiQ, Farnborough (United Kingdom); Wellisch, Hans-Peter [Geneva, (Switzerland)

2007-03-19

79

Validation of Hadronic Models in GEANT4  

SciTech Connect

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott, Peter; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

2007-09-26

80

Perturbations of reactor flow and inlet temperature in EBR-II for reactivity feedback validation  

SciTech Connect

A loss-of-flow (LOF) accident in current LMFBR safety studies is a total loss of pumping power coupled with a failure of the reactor shutdown system (RSS). Another accident of current interest is a loss-of-heat sink for the primary system without involving the RSS. Both types of transients are included in the Shutdown Heat Removal Tests (SHRT) presently being conducted in the Experimental Breeder Reactor II (EBR-II). An initial series of tests in EBR-II was successfully completed in June of 1984 which included investigation of the inherent reactivity feedback characteristics. In these tests, the reactor was perturbed at various power levels by independent variations in primary flow and reactor inlet temperature, and the reactor power and temperature response were measured. The reactor and balance of plant are extensively instrumented and measurements were recorded on a data acquisition system. Results show that the plant responds safely without any external control and confirm that the effects of reduced primary flow or reduced heat rejection are mitigated. More specifically, the data provide a basis for model validation and confidence in predictions of an upcoming series of unprotected LOF and loss-of-heat sink tests. In this paper, the pretest predictions, measured results, and posttest analyses are described in detail. Finally, the application of this feedback model to future EBR-II core loadings and implications for other LMFBR's are discussed.

Mohr, D.; Chang, L.K.

1985-01-01

81

Using airborne laser scanning profiles to validate marine geoid models  

NASA Astrophysics Data System (ADS)

Airborne laser scanning (ALS) is a remote sensing method which utilizes LiDAR (Light Detection And Ranging) technology. The datasets collected are important sources for a large range of scientific and engineering applications. ALS is mostly used to measure terrain surfaces for the compilation of Digital Elevation Models, but it can also be used in other applications. This contribution focuses on the use of an ALS system for measuring sea surface heights and validating gravimetric geoid models over marine areas. This is based on the ALS ability to register echoes of the LiDAR pulse from the water surface. A case study was carried out to analyse the possibilities for validating marine geoid models by using ALS profiles. A test area at the southern shores of the Gulf of Finland was selected for regional geoid validation. ALS measurements were carried out by the Estonian Land Board in spring 2013 at different altitudes and using different scan rates. The one-wavelength Leica ALS50-II laser scanner on board a small aircraft was used to determine the sea level (with respect to the GRS80 reference ellipsoid), which follows roughly the equipotential surface of the Earth's gravity field. For the validation a high-resolution (1'x2') regional gravimetric GRAV-GEOID2011 model was used. This geoid model covers the entire area of Estonia and the surrounding waters of the Baltic Sea. The fit between the geoid model and GNSS/levelling data within the Estonian dry land revealed an RMS of residuals of ±1… ±2 cm. Note that such a fitting validation cannot proceed over marine areas. Therefore, an ALS observation-based methodology was developed to evaluate the GRAV-GEOID2011 quality over marine areas. The accuracy of the acquired ALS dataset was analyzed, and an optimal width of the nadir corridor containing good-quality ALS data was determined. The impact of ALS scan angle range and flight altitude on the obtainable vertical accuracy was investigated as well. The quality of the point cloud was analysed by cross validation between overlapping flight lines and by comparison with tide gauge readings. The comparisons revealed that the ALS-based profiles of sea level heights agree reasonably with the regional geoid model (within the accuracy of the ALS data and after applying corrections due to sea level variations). Thus ALS measurements are suitable for measuring sea surface heights and validating marine geoid models.
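
In outline, the marine validation amounts to differencing ALS-derived sea-surface heights against the geoid height plus the instantaneous sea level and summarizing the residuals; the sketch below uses hypothetical array names and made-up numbers.

    import numpy as np

    # ALS sea-surface heights along a profile (m, vs the GRS80 ellipsoid),
    # geoid heights interpolated from the model at the same points, and a
    # tide gauge reading for the instantaneous sea level above the geoid.
    als_ssh = np.array([17.82, 17.85, 17.80, 17.88, 17.84])
    geoid_n = np.array([17.60, 17.63, 17.58, 17.66, 17.61])
    tide_level = 0.22

    residuals = als_ssh - (geoid_n + tide_level)
    print(f"mean = {residuals.mean()*100:.1f} cm, "
          f"rms = {np.sqrt((residuals**2).mean())*100:.1f} cm")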

Julge, Kalev; Gruno, Anti; Ellmann, Artu; Liibusk, Aive; Oja, Tõnis

2014-05-01

82

HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments  

SciTech Connect

HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

McCann, R.A.; Lowery, P.S.

1987-10-01

83

Model validation in soft systems practice  

SciTech Connect

The concept of 'a model' usually evokes the connotation 'model of part of the real world'. That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering, use the concept in the same way and in addition use models as surrogates for the real world, on which experimentation is cheap. In these fields the key feature of a model is representativeness. In soft systems methodology (SSM) models are not of part of the world; they are only relevant to debate about the real world and are used in a cyclic learning process. The paper shows how the different concepts of validation in classical OR and SSM lead to a way of sharply defining the nature of 'soft OR'. 21 refs.

Checkland, P. [Univ. of Lancaster (United Kingdom)

1995-03-01

84

Predictive Validation of an Influenza Spread Model  

PubMed Central

Background: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

2013-01-01

85

Validation of Space Weather Models at Community Coordinated Modeling Center  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate the deployment of advanced models in forecasting operations. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. CCMC has been leading recent comprehensive modeling challenges under the GEM, CEDAR and SHINE programs. The presentation will focus on experience in carrying out comprehensive and systematic validation of large sets of space weather models.

Kuznetsova, M. M.; Hesse, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Taktakishvili, A.; Chulaki, A.

2011-01-01

86

Construct validation of the health belief model.  

PubMed

A multitrait-multimethod design was employed to assess the construct validity of the Health Belief Model. The data were obtained from a nonrepresentative sample of 85 graduate students at The University of Michigan's School of Public Health. The traits consisted of the respondents' perceptions of: health interest, locus of control, susceptibility to influenza, severity of influenza, benefits provided by a flu shot, and the barriers or costs associated with getting a flu shot. Each trait was measured by three methods: a seven-point Likert scale, a fixed-alternative multiple choice scale, and a vignette. The results indicate that the Health Belief Model variables can be measured with a substantial amount of convergent validity using Likert or multiple choice questionnaire items. With regard to discriminant validity, evidence suggests that subjects' perceptions of barriers and benefits are quite different from their perceptions of susceptibility and severity. Perceptions of susceptibility and severity are substantially but not entirely independent. Perceived benefits and barriers demonstrate a strong negative relationship which suggests the possibility that these two variables represent opposite ends of a single continuum and not separate health beliefs. These preliminary results provide the basis for developing brief health belief scales that may be administered to samples of consumers and providers to assess educational needs. Such needs assessment, in turn, could then be used to tailor messages and programs to meet the particular needs of a client group. PMID:299611

Cummings, K M; Jette, A M; Rosenstock, I M

1978-01-01

87

Bayesian structural equation modeling method for hierarchical model validation  

Microsoft Academic Search

A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level

Xiaomo Jiang; Sankaran Mahadevan

2009-01-01

88

Teaching "Instant Experience" with Graphical Model Validation Techniques  

ERIC Educational Resources Information Center

Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.

Ekstrøm, Claus Thorn

2014-01-01

89

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor II (EBR-II). Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP, have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs. 29 refs., 6 figs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1991-01-01

90

Validating the Beck Depression Inventory-II for Hong Kong Community Adolescents  

ERIC Educational Resources Information Center

The primary purpose of this study was to test the validity of a Chinese version of the Beck Depression Inventory-II (C-BDI-II) for use with Hong Kong community (i.e., nonclinical) adolescents. Based on a randomized triadic split of the data (N = 1460), we conducted exploratory factor analysis on Group 1 (n = 486) and confirmatory factor…

Byrne, Barbara M.; Stewart, Sunita M.; Lee, Peter W. H.

2004-01-01

91

Computer Modeling and Simulation in Support of the Stiff Suspension Active Seismic Isolation for LIGO II  

Microsoft Academic Search

We have developed a very general and versatile approach to model and predict the behavior of the complex isolation systems that are being designed for LIGO II. The code is currently being validated against the Stanford prototype; it has already been validated against the GEO 600 code for their triple pendulum, and we will also be validating it with the Stiff

Brian Lantz; Wensheng Hua; Sam Richman

92

Validity and reliability of SCREEN II (Seniors in the Community: Risk evaluation for eating and nutrition, Version II)  

Microsoft Academic Search

Background: Nutrition risk screening for community-living seniors is of great interest in the health arena. However, to be useful, nutrition risk indices need to be valid and reliable. The following three studies describe construct validation, test-retest and inter-rater reliability of SCREEN II. Methods: Study (1): seniors were recruited from the general community and from a geriatrician's clinic to complete a nutritional assessment and

H H Keller; R Goy; S-L Kane

2005-01-01

93

New metrics for permafrost model validation  

NASA Astrophysics Data System (ADS)

Meteorological data from Arctic regions are historically scarce, due principally to the remote and inhospitable nature of these regions and, consequently, their sparse human habitation compared with more temperate environments. Simulating the future climate of these regions has become a problem of significant importance, as recent projections indicate a high degree of sensitivity to forecasted increases in temperature, as well as the possibility of strong positive feedbacks to the climate system. For these climate projections to be properly constrained, they must be validated through comparison with relevant climate observables in a past time frame. Active layer thickness (ALT) has become a key descriptor of the state of permafrost, in both observation and simulation. As such, it is an ideal metric for model validation as well. A concerted effort to create a database of ALT measurements in Arctic regions culminated in the inception of the Circumpolar Active Layer Measurement (CALM) project over 20 years ago. This paper examines in detail the utility of Alaskan CALM data as a model validation tool. Derivation of ALT data from soil temperature stations and boreholes is also examined, as well as forced numerical modelling of soil temperatures by surface air temperature (SAT) and ground surface temperature (GST). Results indicate that existing individual or repeated borehole temperature logs are generally unsuitable for deriving ALT because of their coarse vertical resolution and their failure to capture the exact timing of maximum annual thaw. However, because of their systematic temporal resolution and comparatively fine vertical resolution, daily soil temperature data compare favourably with the ALT measurements from CALM data. Numerical simulations of subsurface temperatures also agree well with CALM data if forced by GST; results from SAT-forced simulations are less straightforward due to coupling processes, such as snow cover, that complicate heat conduction at the ground surface.
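
A minimal sketch of GST-forced soil temperature modelling follows: one-dimensional heat conduction, with the ALT read off as the deepest annually thawed grid cell. Latent heat, snow cover, and soil heterogeneity are deliberately ignored, and all parameter values are illustrative.

    import numpy as np

    kappa = 1.0e-6               # thermal diffusivity, m^2/s (illustrative)
    dz, nz = 0.1, 100            # 0.1 m grid over a 10 m soil column
    dt = 3600.0                  # 1 h step; kappa*dt/dz**2 = 0.36 < 0.5 (stable)
    T = np.full(nz, -3.0)        # initial ground temperature, deg C
    annual_max = np.full(nz, -np.inf)

    hours = np.arange(365 * 24)
    gst = -3.0 + 12.0 * np.sin(2.0 * np.pi * hours / (365.0 * 24.0))  # forcing

    for h in hours:
        T[0] = gst[h]            # prescribed ground surface temperature
        # explicit finite-difference heat conduction; bottom cell held fixed
        T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        annual_max = np.maximum(annual_max, T)

    thawed = np.nonzero(annual_max > 0.0)[0]
    print(f"ALT ~ {thawed.max() * dz:.1f} m" if thawed.size else "no thaw")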

Stevens, M. B.; Beltrami, H.; Gonzalez-Rouco, J. F.

2012-04-01

94

Validation of uncertainty estimates in hydrologic modelling  

NASA Astrophysics Data System (ADS)

Meaningful characterization of uncertainties affecting conceptual rainfall-runoff (CRR) models remains a challenging research area in the hydrological community. Numerous methods aimed at quantifying the uncertainty in hydrologic predictions have been proposed over the last decades. In most cases, the outcome of such methods takes the form of a predictive interval, computed from a predictive distribution. Regardless of the method used to derive it, it is important to notice that the predictive distribution results from the assumptions made during the inference. Consequently, unsupported assumptions may lead to inadequate predictive distributions, i.e. under- or over-estimated uncertainties. It follows that the estimated predictive distribution must be thoroughly scrutinized ("validated"); as discussed by Hall et al. [2007], "Without validation, calibration is worthless, and so is uncertainty estimation". The aim of this communication is to study diagnostic tools aimed at assessing the reliability of uncertainty estimates. From a methodological point of view, this requires diagnostic approaches that compare a time-varying distribution (the predictive distribution at all times t) to a time series of observations. This is a much more stringent test than validation methods currently used in hydrology, which simply compare two time series (observations and "optimal" simulations). Indeed, standard goodness-of-fit assessments (e.g. using the Nash-Sutcliffe statistic) cannot check whether the predictive distribution is consistent with the observed data. The usefulness of the proposed diagnostic tools will be illustrated with a case study comparing the performance of several uncertainty quantification frameworks. In particular, it will be shown that standard validation approaches (e.g. based on the Nash-Sutcliffe statistic or verifying that about p% of the observations lie within the p% predictive interval) are not able to discriminate competing frameworks whose performance (in terms of uncertainty quantification) is evidently different.
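
One concrete diagnostic of this stringent kind (chosen here for illustration; the communication proposes its own tools) is the probability integral transform: if the predictive distributions are reliable, observations transformed through their own predictive CDFs should be uniform on [0, 1].

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    obs = rng.normal(size=200)                        # observed values
    pred_mean, pred_sd = np.zeros(200), np.ones(200)  # predictive distributions

    u = stats.norm.cdf(obs, loc=pred_mean, scale=pred_sd)  # PIT values
    print(stats.kstest(u, "uniform"))                 # small p-value => miscalibrated

    # The weaker, standard check: coverage of the central 90% interval.
    lo = stats.norm.ppf(0.05, pred_mean, pred_sd)
    hi = stats.norm.ppf(0.95, pred_mean, pred_sd)
    print(np.mean((obs >= lo) & (obs <= hi)))         # should be near 0.90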

Thyer, M.; Engeland, K.; Renard, B.; Kuczera, G.; Franks, S.

2009-04-01

95

Plasma Reactor Modeling and Validation Experiments  

NASA Technical Reports Server (NTRS)

Plasma processing is a key processing step in integrated circuit manufacturing. Low-pressure, high-density plasma reactors are widely used for etching and deposition. The inductively coupled plasma (ICP) source has recently become popular in many processing applications. In order to accelerate equipment and process design, an understanding of the physics and chemistry (particularly plasma power coupling, plasma and processing uniformity, and reaction mechanisms) is important. This understanding is facilitated by comprehensive modeling and simulation, as well as by plasma diagnostics that provide the necessary data for model validation; both are addressed in this presentation. We have developed a complete code for simulating an ICP reactor; the model consists of transport of electrons, ions, and neutrals, Poisson's equation, and Maxwell's equations, along with gas flow and energy equations. Results will be presented for chlorine and fluorocarbon plasmas and compared with data from Langmuir probes, mass spectrometry and FTIR.

Meyyappan, M.; Bose, D.; Hash, D.; Hwang, H.; Cruden, B.; Sharma, S. P.; Rao, M. V. V. S.; Arnold, Jim (Technical Monitor)

2001-01-01

96

Measuring avoidance of pain: validation of the Acceptance and Action Questionnaire II-pain version.  

PubMed

Psychometric research on widely used questionnaires aimed at measuring experiential avoidance of chronic pain has led to inconclusive results. To test the structural validity, internal consistency, and construct validity of a recently developed short questionnaire: the Acceptance and Action Questionnaire II-pain version (AAQ-II-P). Cross-sectional validation study among 388 adult patients with chronic nonspecific musculoskeletal pain admitted for multidisciplinary pain rehabilitation in four tertiary rehabilitation centers in the Netherlands. Cronbach's α was calculated to analyze internal consistency. Principal component analysis was performed to analyze factor structure. Construct validity was analyzed by examining the association between acceptance of pain and measures of psychological flexibility (two scales and sum), pain catastrophizing (three scales and sum), and mental and physical functioning. Interpretation was based on a-priori defined hypotheses. The compound of the seven items of the AAQ-II-P shows a Cronbach's α of 0.87. The single component explained 56.2% of the total variance. Correlations ranged from r=-0.21 to 0.73. Two of the predefined hypotheses were rejected and seven were not rejected. The AAQ-II-P measures a single component and has good internal consistency, and construct validity is not rejected. Thus, the construct validity of the AAQ-II-P sum scores as indicator of experiential avoidance of pain was supported. PMID:24418966
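
For reference, Cronbach's α for a respondents-by-items score matrix can be computed as in this sketch; the synthetic data stand in for the seven AAQ-II-P items.

    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) score matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(4)
    latent = rng.normal(size=(388, 1))                  # one underlying component
    scores = latent + 0.8 * rng.normal(size=(388, 7))   # 7 items, as in the AAQ-II-P
    print(round(cronbach_alpha(scores), 2))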

Reneman, Michiel F; Kleen, Marco; Trompetter, Hester R; Schiphorst Preuper, Henrica R; Köke, Albère; van Baalen, Bianca; Schreurs, Karlein M G

2014-06-01

97

Model-Based Method for Sensor Validation  

NASA Technical Reports Server (NTRS)

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work follows a model-based approach and identifies the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
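
A toy sketch of the ARR idea follows: each relation is a consistency check derived from the system model, and the pattern of violated relations logically implicates sensors. The relations, threshold, and fault-signature table below are hypothetical, not the method's actual formulation.

    # ARR1: sensors s1 and s2 redundantly measure the same quantity.
    # ARR2: the system model says s3 should equal 2*s1 (a stand-in relation).
    THRESHOLD = 0.5

    def evaluate_arrs(s1, s2, s3):
        r1 = abs(s1 - s2) > THRESHOLD
        r2 = abs(s3 - 2.0 * s1) > THRESHOLD
        return (r1, r2)

    # Fault signatures: which ARRs each sensor participates in.
    SIGNATURES = {"s1": (True, True), "s2": (True, False), "s3": (False, True)}

    def diagnose(s1, s2, s3):
        fired = evaluate_arrs(s1, s2, s3)
        if not any(fired):
            return []                       # all readings mutually consistent
        return [name for name, sig in SIGNATURES.items() if sig == fired]

    print(diagnose(1.0, 1.05, 2.1))   # [] -> consistent
    print(diagnose(1.0, 1.05, 3.5))   # ['s3'] -> s3 logically implicated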

Vatan, Farrokh

2012-01-01

98

Modeling Earth Dynamics: Complexity, Uncertainty, and Validation  

NASA Astrophysics Data System (ADS)

28th IUGG Conference on Mathematical Geophysics; Pisa, Italy, 7-11 June 2010; The capabilities and limits of mathematical models applied to a variety of geophysical processes were discussed during the 28th International Conference on Mathematical Geophysics, held in Italy (see the conference Web site (http://cmg2010.pi.ingv.it), which includes abstracts). The conference was organized by the International Union of Geodesy and Geophysics (IUGG) Commission on Mathematical Geophysics (CMG) and the Istituto Nazionale di Geofisica e Vulcanologia and was cosponsored by the U.S. National Science Foundation. The meeting was attended by more than 160 researchers from 26 countries and was dedicated to the theme "Modelling Earth Dynamics: Complexity, Uncertainty, and Validation." Many talks were dedicated to illustration of the complexities affecting geophysical processes. Novel applications of geophysical fluid dynamics were presented, with specific reference to volcanological and subsurface/surface flow processes. In most cases, investigations highlighted the need for multidimensional and multiphase flow models able to describe the nonlinear effects associated with the nonhomogeneous nature of the matter. Fluid dynamic models of atmospheric, oceanic, and environmental systems also illustrated the fundamental role of nonlinear couplings between the different subsystems. Similarly, solid Earth models have made it possible to obtain the first tomographies of the planet; to formulate nonlocal and dynamic damage models of rocks; to investigate statistically the triggering, clustering, and synchronization of faults; and to develop realistic simulators of the planetary dynamo, plate tectonics, and gravity and magnetic fields.

Neri, A.

2010-12-01

99

Boron-10 Lined Proportional Counter Model Validation  

SciTech Connect

The Department of Energy Office of Nuclear Safeguards (NA-241) is supporting the project “Coincidence Counting With Boron-Based Alternative Neutron Detection Technology” at Pacific Northwest National Laboratory (PNNL) for the development of an alternative neutron coincidence counter. The goal of this project is to design, build and demonstrate a boron-lined proportional tube-based alternative system in the configuration of a coincidence counter. This report discusses the validation studies performed to establish the degree of accuracy of the computer modeling methods currently used to simulate the response of boron-lined tubes. This is the precursor to developing models for the uranium neutron coincidence collar under Task 2 of this project.

Lintereur, Azaree T.; Siciliano, Edward R.; Kouzes, Richard T.

2012-06-30

100

Validity  

NSDL National Science Digital Library

In this chapter, the authors will describe the four types of validity: construct validity, content validity, concurrent validity, and predictive validity. Depending on the test and the rationale or purpose for its administration, an understanding of the

Christmann, Edwin P.; Badgett, John L.

2008-11-01

101

DEVELOPMENT OF THE MESOPUFF II DISPERSION MODEL  

EPA Science Inventory

The development of the MESOPUFF II regional-scale air quality model is described. MESOPUFF II is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion and removal of air pollutants from multiple point and area sources at transpor...
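
The record is truncated above, but the puff-superposition idea it names is standard: the concentration at a receptor is the sum of contributions from discrete Gaussian puffs carried along trajectories. A generic textbook sketch, not the MESOPUFF II parameterization itself; all values are placeholders:

```python
import numpy as np

def puff_concentration(q, x, y, z, xc, yc, sig_y, sig_z, H):
    """Ground-reflected Gaussian puff of mass q centred at (xc, yc, H)."""
    norm = q / ((2 * np.pi) ** 1.5 * sig_y ** 2 * sig_z)
    horiz = np.exp(-((x - xc) ** 2 + (y - yc) ** 2) / (2 * sig_y ** 2))
    vert = (np.exp(-((z - H) ** 2) / (2 * sig_z ** 2))
            + np.exp(-((z + H) ** 2) / (2 * sig_z ** 2)))  # image term: ground reflection
    return norm * horiz * vert

# Superpose three puffs advected to 300, 600 and 900 m downwind, growing as
# they travel; receptor at ground level, 500 m downwind of the source.
puffs = [dict(q=100.0, xc=d, yc=0.0, sig_y=50 + 0.1 * d, sig_z=20 + 0.05 * d, H=30.0)
         for d in (300.0, 600.0, 900.0)]
c = sum(puff_concentration(x=500.0, y=0.0, z=0.0, **p) for p in puffs)
print(f"receptor concentration: {c:.3e} (mass units per m^3)")
```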

102

Validation of HEDR models. Hanford Environmental Dose Reconstruction Project  

SciTech Connect

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model predictions with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

1994-01-01

103

Full-Scale Cookoff Model Validation Experiments  

SciTech Connect

This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release of the live data. Predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of unreacted explosive was recovered in the end heated experiment and less than 30 percent recovered in the side heated test.

McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

2003-11-25

104

Constructing and Validating a Decadal Prediction Model  

NASA Astrophysics Data System (ADS)

For the purpose of identifying potential sources of predictability of Scottish mean air temperature (SMAT), a redundancy analysis (RA) was performed to quantitatively assess the predictability of SMAT from North Atlantic SSTs as well as the temporal consistency of this predictability. The RA was performed between the main principal components of North Atlantic SST anomalies and SMAT anomalies for two time periods: 1890-1960 and 1960-2006. The RA models developed using data from the 1890-1960 period were validated using the 1960-2006 period; in a similar way the model developed based on the 1960-2006 period was validated using data from the 1890-1960 period. The results indicate the potential to forecast decadal trends in SMAT for all seasons in the 1960-2006 time period, and for all seasons with the exception of winter in the period 1890-1960, with the best predictability achieved in summer. The statistical models show the best performance when SST anomalies in the European shelf seas (45°N-65°N, 20°W-20°E) rather than those for the SSTs over the entire North Atlantic (30°N-75°N, 80°W-30°E) were used as a predictor. The results of the RA demonstrated that similar SST modes were responsible for predictions in the first and second half of the 20th century, establishing temporal consistency, though with stronger influence in the more recent half. The SST pattern responsible for explaining the largest amount of variance in SMAT was stronger in the second half of the 20th century and showed increasing influence from the area of the North Sea, possibly due to faster sea-surface warming in that region in comparison with the open North Atlantic. The Wavelet Transform (WT), Cross Wavelet Transform (XWT) and Wavelet Coherence (WTC) techniques were used to analyse RA-model-based forecasts of SMAT in the time-frequency domain. Wavelet-based techniques applied to the predicted and observed time series revealed a good performance of the RA models in predicting the frequency variability of the SMAT time series. A better performance was obtained when predicting the SMAT during the period 1960-2006 based on 1890-1960 than vice versa, with the exception of winter 1890-1960. In the same frequency bands and in the same time interval there was high coherence between observed and predicted time series. In particular, winter, spring and summer wavelets in the 8±1.5 year band were highly correlated in both time periods, with higher correlation in 1960-2006 and in summer.
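
Redundancy analysis of this kind reduces both fields to leading modes, regresses predictand modes on predictor modes, and then validates on the withheld period. A heavily simplified stand-in with random placeholder arrays (the EOF truncation, grid sizes, and period split below are illustrative only, not the study's configuration):

```python
import numpy as np

def leading_pcs(field, n):
    """Leading principal components of an (n_time, n_space) anomaly field."""
    anom = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    return u[:, :n] * s[:n], vt[:n]            # PC time series, spatial patterns

rng = np.random.default_rng(1)
sst = rng.standard_normal((116, 200))          # placeholder SST anomaly field
smat = rng.standard_normal((116, 4))           # placeholder seasonal SMAT series

train, test = slice(0, 70), slice(70, 116)     # e.g. 1890-1960 vs 1960-2006
pcs, patterns = leading_pcs(sst[train], n=5)
y_mean = smat[train].mean(axis=0)
coef, *_ = np.linalg.lstsq(pcs, smat[train] - y_mean, rcond=None)

# Project the withheld period onto the training-period patterns, then predict
pcs_test = (sst[test] - sst[train].mean(axis=0)) @ patterns.T
pred = pcs_test @ coef + y_mean
for season in range(4):
    r = np.corrcoef(pred[:, season], smat[test][:, season])[0, 1]
    print(f"season {season}: validation correlation r = {r:+.2f}")
```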

Foss, I.; Woolf, D. K.; Gagnon, A. S.; Merchant, C. J.

2010-05-01

105

Numerical Validation of Quasigeostrophic Ellipsoidal Vortex Model  

NASA Astrophysics Data System (ADS)

In geophysical flows, coherent vortex structures persist for a long time and their interactions dominate the dynamics of geophysical turbulence. Meacham et al. obtained a series of exact unsteady solutions of the quasigeostrophic equation, which represent a uniform ellipsoidal vortex patch embedded in a uniform 3D shear field. Miyazaki et al. derived a Hamiltonian dynamical system describing the interactions of N ellipsoidal vortices, where each coherent vortex was modeled by an ellipsoid of uniform potential vorticity. In this paper, direct numerical simulations based on a Contour Advective Semi-Lagrangian algorithm (CASL) are performed in order to assess the validity of the Hamiltonian model. First, the instability of a tilted spheroid is investigated. A prolate spheroid becomes unstable against the third Legendre mode when the aspect ratio is less than 0.44 and the inclination angle is larger than 0.48. Weakly unstable flatter spheroidal vortices emit thin filaments from their top and bottom, whereas strongly unstable slender spheroidal vortices are broken up into two pieces. Secondly, the interaction of two co-rotating spheroidal vortices on slightly different vertical levels is studied in detail. It is shown that the Hamiltonian model can predict the critical merger distance fairly well. Considerable amounts of energy and enstrophy are dissipated in these events. The correlation between the energy dissipation and the enstrophy dissipation is good, suggesting the existence of a deterministic reset-rule.

Miyazaki, Takeshi; Asai, Akinori; Yamamoto, Masahiro; Fujishima, Shinsuke

2002-11-01

106

Diurnal ocean surface layer model validation  

NASA Technical Reports Server (NTRS)

The diurnal ocean surface layer (DOSL) model at the Fleet Numerical Oceanography Center forecasts the 24-hour change in global sea surface temperature (SST). Validating the DOSL model is a difficult task due to the huge areas involved and the lack of in situ measurements. Therefore, this report details the use of satellite infrared multichannel SST imagery to provide day and night SSTs that can be directly compared to DOSL products. This water-vapor-corrected imagery has the advantages of high thermal sensitivity (0.12 °C), large synoptic coverage (nearly 3000 km across), and high spatial resolution that enables diurnal heating events to be readily located and mapped. Several case studies in the subtropical North Atlantic readily show that DOSL results during extreme heating periods agree very well with satellite-imagery-derived values in terms of the pattern of diurnal warming. The low wind and cloud-free conditions necessary for these events to occur lend themselves well to observation via infrared imagery. Thus, the normally cloud-limited aspects of satellite imagery do not come into play for these particular environmental conditions. The fact that the DOSL model does well in extreme events is beneficial from the standpoint that these cases can be associated with the destruction of the surface acoustic duct. This so-called afternoon effect happens as the afternoon warming of the mixed layer disrupts the sound channel and the propagation of acoustic energy.

Hawkins, Jeffrey D.; May, Douglas A.; Abell, Fred, Jr.

1990-01-01

107

Boron-10 Lined Proportional Counter Model Validation  

SciTech Connect

The decreasing supply of 3He is stimulating a search for alternative neutron detectors; one potential 3He replacement is 10B-lined proportional counters. Simulations are being performed to predict the performance of systems designed with 10B-lined tubes. Boron-10-lined tubes are challenging to model accurately because the neutron capture material is not the same as the signal-generating material. Thus, to simulate the efficiency, the neutron capture reaction products that escape the lining and enter the signal-generating fill gas must be tracked. The tube lining thickness and composition are typically proprietary vendor information, and therefore add additional variables to the system simulation. The modeling methodologies used to predict the neutron detection efficiency of 10B-lined proportional counters were validated by comparing simulated to measured results. The measurements were made with a 252Cf source positioned at several distances from a moderated 2.54-cm diameter 10B-lined tube. Models were constructed of the experimental configurations using the Monte Carlo transport code MCNPX, which is capable of tracking the reaction products from the 10B(n,α) reaction. Several different lining thicknesses and compositions were simulated for comparison with the measured data. This paper presents the results of the evaluation of the experimental and simulated data, and a summary of how the different linings affect the performance of a coincidence counter configuration designed with 10B-lined proportional counters.

Lintereur, Azaree T.; Ely, James H.; Kouzes, Richard T.; Rogers, Jeremy L.; Siciliano, Edward R.

2012-11-18

108

Validity of the California Verbal Learning Test–II in Multiple Sclerosis  

Microsoft Academic Search

Multiple sclerosis (MS) is a disease of the central nervous system where roughly 50% of patients exhibit cognitive impairment. Episodic memory defects are particularly common in MS and the California Verbal Learning Test: 2nd Edition (CVLT-II) was recommended for assessment in MS in a recently published consensus position paper. We investigated the validity of the CVLT-II in 351 MS patients

Shane Stegen; Igor Stepanov; Diane Cookfair; Eben Schwartz; David Hojnacki; Bianca Weinstock-Guttman; Ralph H. B. Benedict

2010-01-01

109

Validating agent based models through virtual worlds.  

SciTech Connect

As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior. Results from our work indicate that virtual worlds have the potential for serving as a proxy in allocating and populating behaviors that would be used within further agent-based modeling studies.

Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina [Sandia National Laboratories, Livermore, CA]; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E. [North Carolina State University, Raleigh, NC]; Bernstein, Jeremy Ray Rhythm [Gaikai, Inc., Aliso Viejo, CA]

2014-01-01

110

Validation of the MetaMax II portable metabolic measurement system.  

PubMed

The purpose of this study was to investigate the validity of the MetaMax II portable metabolic measurement system against the Douglas Bag technique. Nine recreationally active male subjects were included in a validation at 100 W, 10 well-trained male subjects at 200 W, and 10 well-trained males at 250 W and at maximal exercise (volitional fatigue at a mean workload of 325 W). All testing was performed on an electronically braked bicycle at 60 rpm. At 100 W, the influence on MetaMax II measurements of adding a Douglas Bag breathing valve in series with the MetaMax II was investigated. The oxygen uptake measured by the MetaMax II was a mean 0.03 l·min⁻¹ higher at 100 W (p < 0.01), 0.02 l·min⁻¹ lower at 200 W (n.s.), 0.04 l·min⁻¹ higher at 250 W (n.s.), and 0.11 l·min⁻¹ higher at 325 W (p < 0.05). The carbon dioxide excretion measured by the MetaMax II was a mean 0.06 l·min⁻¹ lower at 100 W (p < 0.01), 0.11 l·min⁻¹ lower at 200 W (p < 0.05), 0.03 l·min⁻¹ lower at 250 W (n.s.), and 0.16 l·min⁻¹ lower at 325 W (p < 0.05). The addition of a breathing valve in series with the MetaMax II resulted in a lower breathing frequency, a higher ventilated tidal volume, and an affected gas measurement validity. In conclusion, the MetaMax II was found to be valid for metabolic gas measurements between 100 and at least 250 W. PMID:14986194
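
The per-workload comparisons above are classic paired-differences tests. A minimal sketch of that analysis with scipy, on invented VO2 readings (the study's data are not reproduced):

```python
import numpy as np
from scipy import stats

def paired_comparison(system_a, system_b):
    """Mean bias, SD of the differences, and paired t-test p-value."""
    diff = np.asarray(system_a) - np.asarray(system_b)
    _, p = stats.ttest_rel(system_a, system_b)
    return diff.mean(), diff.std(ddof=1), p

# Invented paired VO2 readings (l/min) for 9 subjects at one workload
metamax = [1.52, 1.48, 1.55, 1.60, 1.47, 1.51, 1.58, 1.49, 1.53]
douglas = [1.49, 1.46, 1.51, 1.57, 1.45, 1.47, 1.55, 1.46, 1.50]
bias, sd, p = paired_comparison(metamax, douglas)
print(f"mean difference {bias:+.3f} l/min (SD {sd:.3f}), p = {p:.4f}")
```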

Larsson, P U; Wadell, K M E; Jakobsson, E J I; Burlin, L U; Henriksson-Larsén, K B

2004-02-01

111

Numerical Validation of Quasigeostrophic Ellipsoidal Vortex Model  

NASA Astrophysics Data System (ADS)

In geophysical flows, coherent vortex structures persist for a long time and their interactions dominate the dynamics of geophysical turbulence. Meacham et al.1,2) obtained a series of exact unsteady solutions of the quasigeostrophic equation, which represent a uniform ellipsoidal vortex patch embedded in a uniform 3D shear field. Miyazaki et al.3,4) have derived a Hamiltonian dynamical system of 3N degrees of freedom, describing the interactions of N ellipsoidal vortices, where each coherent vortex was modeled by an ellipsoid of uniform potential vorticity. The center of vorticity and the angular momentum are conserved, besides the total energy and Casimirs of the system, such as the vortex height and the vortex volume. There are three Poisson-commutable invariants, which is fewer than the number of degrees of freedom for N>=2, and chaotic motions are observed even in a two-body system. In this paper, direct numerical simulations based on a Contour Advective Semi-Lagrangian algorithm (CASL) are performed in order to assess the validity of the Hamiltonian model. First, the instability of a tilted spheroid is investigated. A prolate spheroid becomes unstable against the third Legendre mode when the aspect ratio is less than 0.44 and the inclination angle is larger than 0.48.5) Weakly unstable flatter spheroidal vortices emit thin filaments from their top and bottom, whereas strongly unstable slender spheroidal vortices are broken up into two pieces. Secondly, the interaction of two co-rotating spheroidal vortices on slightly different vertical levels, which plays a key role in the turbulence dynamics, is studied in detail. The Hamiltonian model can predict the critical distance of symmetric mergers very well, except for mergers of vortices on the same horizontal plane. The model gives poorer predictions in asymmetric cases, where vorticity exchange occurs (instead of merger) along the threshold determined by the Hamiltonian model. The slenderer vortex loses half of its original volume, and the flatter vortex expands slightly, absorbing some of the filaments ejected from the slenderer vortex. This is a new dynamical process linked with the energy and enstrophy cascades. Considerable amounts of energy and enstrophy are dissipated in these events. The correlation between the energy dissipation and the enstrophy dissipation is good, suggesting the existence of a simple deterministic reset-rule. 1) S. P. Meacham, et al.: Dyn. Atmos. Oceans 21 (1994) 167. 2) S. P. Meacham, et al.: Phys. Fluids 9 (1997) 2310. 3) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 69 (2000) 3233. 4) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 70 (2001) 1942. 5) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 68 (1999) 2592.

Miyazaki, T.; Fujishima, S.

2002-05-01

112

Concurrent Validity of the Online Version of the Keirsey Temperament Sorter II.  

ERIC Educational Resources Information Center

Data from the Keirsey Temperament Sorter II online instrument and Myers Briggs Type Indicator (MBTI) for 203 college freshmen were analyzed. Positive correlations appeared between the concurrent MBTI and Keirsey measures of psychological type, giving preliminary support to the validity of the online version of Keirsey. (Contains 28 references.)…

Kelly, Kevin R.; Jugovic, Heidi

2001-01-01

113

Concurrent and Predictive Validity of the Phelps Kindergarten Readiness Scale-II  

ERIC Educational Resources Information Center

The purpose of this research was to establish the concurrent and predictive validity of the Phelps Kindergarten Readiness Scale, Second Edition (PKRS-II; L. Phelps, 2003). Seventy-four kindergarten students of diverse ethnic backgrounds enrolled in a northeastern suburban school participated in the study. The concurrent administration of the…

Duncan, Jennifer; Rafter, Erin M.

2005-01-01

114

Validity and utility of bipolar spectrum models.  

PubMed

The bipolar spectrum model suggests that several patient presentations not currently recognized by the DSM warrant consideration as part of a mood disorders continuum. These include hypomania or mania associated with antidepressants; manic symptoms which fall short of the current DSM threshold for hypomania; and depression attended by multiple non-manic markers that are associated with bipolar course. Evidence supporting the inclusion of these groups within the realm of bipolar disorder (BP) is examined. Several diagnostic tools for detecting and characterizing these patient groups are described. Finally, options for altering DSM-IV criteria to allow some of the above patient presentations to be recognized as bipolar are considered. More data on the validity and utility of these alterations would be useful, but limited changes appear warranted now. We describe an additional BP Not Otherwise Specified (BP NOS) example which creates a subthreshold hypomanic analogue to cyclothymia, consistent with existing BP NOS criteria. This change should be accompanied by additional requirements for the assessment and reporting of non-manic bipolar markers. PMID:18199236

Phelps, James; Angst, Jules; Katzow, Jacob; Sadler, John

2008-02-01

115

Design and Development Research: A Model Validation Case  

ERIC Educational Resources Information Center

This is a report of one case of a design and development research study that aimed to validate an overlay instructional design model incorporating the theory of multiple intelligences into instructional systems design. After design and expert-review model validation, the Multiple Intelligence (MI) Design Model, used with an Instructional Systems…

Tracey, Monica W.

2009-01-01

116

What do we mean by validating a prognostic model?  

Microsoft Academic Search

Prognostic models are used in medicine for investigating patient outcome in relation to patient and disease characteristics. Such models do not always work well in practice, so it is widely recommended that they need to be validated. The idea of validating a prognostic model is generally taken to mean establishing that it works satisfactorily for patients other than those

Douglas G. Altman; Patrick Royston

2000-01-01

117

Challenges of Validating Global Assimilative Models of the Ionosphere  

Microsoft Academic Search

This paper addresses the often surprisingly difficult challenges that arise in conceptually simple validations of global models of the ionosphere. AFRL has been tasked with validating the Utah State University GAIM (Global Assimilation of Ionospheric Measurements) model of the ionosphere, which is run in real time by the Air Force Weather Agency. The USU-GAIM model currently assimilates, in addition to

G. J. Bishop; L. F. McNamara; J. A. Welsh; D. T. Decker; C. R. Baker

2008-01-01

118

Wavelet spectrum analysis approach to model validation of dynamic systems  

Microsoft Academic Search

Feature-based validation techniques for dynamic system models could be unreliable for nonlinear, stochastic, and transient dynamic behavior, where the time series is usually non-stationary. This paper presents a wavelet spectral analysis approach to validate a computational model for a dynamic system. Continuous wavelet transform is performed on the time series data for both model prediction and experimental observation using a
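
A sketch of the core idea: transform both the predicted and the observed series with a continuous wavelet transform and compare them in scale space. The FFT-based Morlet implementation below follows the common Torrence-and-Compo-style formulation; the signals and scales are placeholders, not the paper's test cases:

```python
import numpy as np

def morlet_cwt(x, scales, w0=6.0):
    """FFT-based continuous wavelet transform with an analytic Morlet wavelet."""
    n = len(x)
    omega = 2 * np.pi * np.fft.fftfreq(n)
    xf = np.fft.fft(x)
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Fourier-domain Morlet, restricted to positive frequencies (analytic)
        psi = (np.pi ** -0.25 * np.sqrt(2 * np.pi * s)
               * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0))
        out[i] = np.fft.ifft(xf * psi)
    return out

t = np.linspace(0, 10, 512)
observed = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.default_rng(2).standard_normal(512)
predicted = np.sin(2 * np.pi * 1.5 * t)            # model-output stand-in

scales = np.arange(2, 64)
d = np.abs(morlet_cwt(observed, scales)) - np.abs(morlet_cwt(predicted, scales))
print("mean absolute scale-space discrepancy:", np.abs(d).mean())
```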

Xiaomo Jiang; Sankaran Mahadevan

2011-01-01

119

Micromachined accelerometer design, modeling and validation  

SciTech Connect

Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10^-18 Farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 × 10^-9 gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.
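
A rough lumped-parameter sketch of how such a device is analyzed: a proof mass on a suspension deflects under acceleration and changes a parallel-plate sense capacitance. All values are invented placeholders, not the Sandia design parameters; with these numbers a 1-G input shifts the capacitance by roughly an attofarad, consistent with the electronics sensitivity quoted above:

```python
# One-degree-of-freedom lumped model of a capacitive MEMS accelerometer.
# Illustrative placeholder values only.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
m = 200e-12           # proof mass, kg (about 200 x 10^-9 gram, as in the abstract)
k = 500.0             # suspension stiffness, N/m
area = 1e-7           # electrode area, m^2
gap0 = 2e-6           # nominal electrode gap, m

def capacitance_change(a_g):
    """Capacitance change (F) for an input acceleration given in G."""
    x = m * (a_g * 9.81) / k                  # static deflection, m
    return EPS0 * area / (gap0 - x) - EPS0 * area / gap0

for a_g in (1.0, 1e3, 5e4):                   # from 1 G up to 50 k-G
    print(f"{a_g:8.0f} G -> delta C = {capacitance_change(a_g):.3e} F")
```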

Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

1998-04-01

120

Validation of the HOTCHAN code for analyzing the EBR-II core following an unprotected loss of flow  

SciTech Connect

A series of reactor experiments involving unprotected (no scram) loss-of-primary flow (LOF) down to natural convection was successfully conducted in February and April of 1986 on the Experimental Breeder Reactor II (EBR-II). The predicted and measured behavior of a special instrumented assembly, the XX09-fueled INSAT, was compared for the most severe test (SHRT 45) to demonstrate the validation of the thermal-hydraulic code HOTCHAN. The particular test of interest in this paper was initiated at full power by tripping the primary and secondary pumps. These tests were part of the shutdown heat removal tests (SHRT) being conducted in EBR-II. The reactor and balance of plant are extensively instrumented, and measurements were recorded by a data acquisition system. The reactor and plant response confirm predictions that the driver fuel cladding can survive temperatures above the eutectic threshold for the transient following a station blackout without scramming the reactor. In addition, the in-core data provide a firm basis for validation of the Argonne/EBR-II developed HOTCHAN code for analyzing the thermal-hydraulic behavior of specific fuel subassemblies. In this paper the analytical model for HOTCHAN is described as well as its relationship to the NATDEMO code. The predicted behavior of the hottest driver subassembly is also discussed and compared with the INSAT XX09 results.

Mohr, D.; Chang, L.K.; Planchon, H.P.

1988-01-01

121

Techniques and Issues in Agent-Based Modeling Validation  

SciTech Connect

Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABMs differ in many respects in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally, since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

Pullum, Laura L [ORNL; Cui, Xiaohui [New York Institute of Technology (NYIT)

2012-01-01

122

Cross-Validation for Nonlinear Mixed Effects Models  

PubMed Central

Cross-validation is frequently used for model selection in a variety of applications. However, it is difficult to apply cross-validation to mixed effects models (including nonlinear mixed effects models or NLME models) due to the fact that cross-validation requires “out-of-sample” predictions of the outcome variable, which cannot be easily calculated when random effects are present. We describe two novel variants of cross-validation that can be applied to nonlinear mixed effects models. One variant, where out-of-sample predictions are based on post hoc estimates of the random effects, can be used to select the overall structural model. Another variant, where cross-validation seeks to minimize the estimated random effects rather than the estimated residuals, can be used to select covariates to include in the model. We show that these methods produce accurate results in a variety of simulated data sets and apply them to two publicly available population pharmacokinetic data sets.
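
A linear toy version of the first variant described above: observations are held out one at a time within each subject, the subject's random effect is re-estimated post hoc from the remaining observations, and the held-out point is then predicted. The data, the linear form, and the candidate fixed-effect values are all invented for the sketch; the NLME machinery itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented longitudinal data: y = beta * x + b_subject + noise
n_sub, n_obs, beta_true = 20, 8, 2.0
b = rng.normal(0.0, 1.0, n_sub)
x = rng.uniform(0.0, 5.0, (n_sub, n_obs))
y = beta_true * x + b[:, None] + rng.normal(0.0, 0.3, (n_sub, n_obs))

def cv_score(beta):
    """Leave one observation out per subject; the subject's random effect is
    re-estimated post hoc from the remaining observations before predicting."""
    sse = 0.0
    for i in range(n_sub):
        for j in range(n_obs):
            keep = np.arange(n_obs) != j
            b_hat = np.mean(y[i, keep] - beta * x[i, keep])   # post hoc estimate
            sse += (y[i, j] - (beta * x[i, j] + b_hat)) ** 2
    return sse

for beta in (1.5, 2.0, 2.5):   # candidate structural (fixed-effect) values
    print(f"beta = {beta}: CV SSE = {cv_score(beta):.1f}")
```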

Colby, Emily; Bair, Eric

2013-01-01

123

Brief Report: Construct Validity of Two Identity Status Measures: The EIPQ and the EOM-EIS-II  

ERIC Educational Resources Information Center

The present study was designed to examine construct validity of two identity status measures, the Ego Identity Process Questionnaire (EIPQ; J. Adolescence 18 (1995) 179) and the Extended Objective Measure of Ego Identity Status II (EOM-EIS-II; J. Adolescent Res. 1 (1986) 183). Construct validity was operationalized in terms of how identity status…

Schwartz, Seth J.

2004-01-01

124

Validation of Quasigeostrophic Ellipsoidal Vortex Model  

NASA Astrophysics Data System (ADS)

Numerical simulations of decaying three-dimensional quasigeostrophic turbulence indicate that the vorticity field develops coherent vortex structures whose interactions dominate the dynamics of the turbulence. Meacham et al.1,2) obtained exact unsteady solutions representing a uniform ellipsoidal vortex patch embedded in a uniform horizontal strain and vertical shear with uniform background vorticity. Miyazaki et al.3,4) developed turbulence-vortex models based on these solutions, where each vortex was represented by an ellipsoid of uniform potential vorticity. The dynamics of N interacting ellipsoidal vortices are shown to be a Hamiltonian system of 3N degrees of freedom. The center of vorticity and the angular momentum are conserved, besides the total energy and Casimirs of the system, such as the vortex height and the vortex volume. There are three Poisson-commutable invariants, which is fewer than the number of degrees of freedom for N >= 2, and chaotic motions are observed even in a two-body system. In this paper, direct numerical simulations based on a Contour Advective Semi-Lagrangian algorithm (CASL) are performed in order to assess the validity of the Hamiltonian model. First, the instability of a tilted spheroid is investigated. A prolate spheroid becomes unstable against the third Legendre mode when the aspect ratio is less than 0.44 and the inclination angle is larger than 0.48 (Miyazaki et al.5)). Weakly unstable flatter spheroidal vortices emit thin filaments, whereas strongly unstable slender spheroidal vortices are broken up into two pieces. Secondly, the interaction of two co-rotating spheroidal vortices, which plays a key role in the dynamics of quasigeostrophic turbulence, is studied in detail. It is shown that the critical distance of merger is well estimated by the Hamiltonian ellipsoidal vortex model. Considerable amounts of energy and enstrophy are dissipated in these events. The correlation between the energy dissipation and the enstrophy dissipation is good, suggesting the existence of a deterministic reset-rule. If we can define the properties of the vortex (vortices) born after filamentation, break-up, and merger, we can perform a dissipative 'quasi-turbulence simulation'. 1) S. P. Meacham, et al.: Dyn. Atmos. Oceans 21 (1994) 167. 2) S. P. Meacham, et al.: Phys. Fluids 9 (1997) 2310. 3) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 69 (2000) 3233. 4) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 70 (2001) 1942. 5) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 68 (1999) 2592.

Miyazaki, T.; Fujishima, S.

125

The Space Weather Modeling Framework (SWMF): Models and Validation  

NASA Astrophysics Data System (ADS)

In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SE), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics as well as some validation studies.

Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV

126

Semi-Supervised Model Selection Based on Cross-Validation  

Microsoft Academic Search

We propose a new semi-supervised model selection method that is derived by applying the structural risk minimization principle to a recent semi-supervised generalization error bound. This bound that we build on is based on the cross-validation estimate underlying the popular cross-validation model selection heuristic. Thus, the proposed semi-supervised method is closely connected to cross-validation which makes studying these methods side
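
The cross-validation estimate underlying the heuristic is the standard k-fold procedure; a compact sketch of CV-based model selection (the paper's semi-supervised bound is not reproduced; the polynomial family and the data are placeholders):

```python
import numpy as np

def kfold_cv_mse(x, y, degree, k=5):
    """Mean k-fold cross-validation MSE for a polynomial model of given degree."""
    fold_of = np.arange(len(x)) % k
    errs = []
    for fold in range(k):
        tr, te = fold_of != fold, fold_of == fold
        coef = np.polyfit(x[tr], y[tr], degree)
        errs.append(np.mean((np.polyval(coef, x[te]) - y[te]) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, 60)
y = 1 - 2 * x + 3 * x ** 2 + rng.normal(0, 0.2, 60)   # quadratic ground truth

# Select the degree that minimizes the CV estimate of generalization error
best = min(range(1, 7), key=lambda d: kfold_cv_mse(x, y, d))
print("degree selected by cross-validation:", best)
```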

Matti Kaariainen

2006-01-01

127

Nonaxisymmetric turbine end wall design: Part II -- Experimental validation  

SciTech Connect

The Durham Linear Cascade has been redesigned with the nonaxisymmetric profiled end wall described in the first part of this paper, with the aim of reducing the effects of secondary flow. The design intent was to reduce the passage vortex strength and to produce a more uniform exit flow angle profile in the radial direction with less overturning at the wall. The new end wall has been tested in the linear cascade and a comprehensive set of measurements taken. These include traverses of the flow field at a number of axial planes and surface static pressure distributions on the end wall. Detailed comparisons have been made with the CFD design predictions, and also for the results with a planar end wall. In this way an improved understanding of the effects of end wall profiling has been obtained. The experimental results generally agree with the design predictions, showing a reduction in the strength of the secondary flow at the exit and a more uniform flow angle profile. In a turbine stage these effects would be expected to improve the performance of any downstream blade row. There is also a reduction in the overall loss, which was not given by the CFD design predictions. Areas where there are discrepancies between the CFD calculations and measurement are likely to be due to the turbulence model used. Conclusions for how the three-dimensional linear design system should be used to define end wall geometries for improved turbine performance are presented.

Hartland, J.C.; Gregory-Smith, D.G.; Harvey, N.W.; Rose, M.G.

2000-04-01

128

Validations of Computational Weld Models: Comparison of Residual Stresses.  

National Technical Information Service (NTIS)

The objective of this project was to validate the capability of VrWeld to simulate the weld buildup process in two experimental setups. Setup I had a central depression with dimensions of 100 x 100 x 3 mm, while Setup II had a central depression with dime...

J. Goldak

2010-01-01

129

Model for Use of Sociometry to Validate Attitude Measures.  

ERIC Educational Resources Information Center

A study concerning the development and validation of an instrument intended to measure Goal II of quality education is presented. This goal is that quality education should help every child acquire understanding and appreciation of persons belonging to social, cultural and ethnic groups different from his own. The rationale for measurement…

McGuiness, Thomas P.; Stank, Peggy L.

130

Validation of numerical models of ceramic pin grid array packages  

Microsoft Academic Search

Thermal data for devices provided in manufacturers' data sheets are measured under idealized conditions and are not adequate to predict accurately junction temperature under other conditions. A validated model for the device, which can be employed in a variety of environments, is therefore required. This paper reports on the experimental and simulation work carried out to validate the thermal models

M. O'Flaherty; C. Cahill; K. Rodgers; O. Slattery

1997-01-01

131

Model-driven Validation of SystemC Designs  

Microsoft Academic Search

Functional test generation for dynamic validation of current system level designs is a challenging task. Manual test writing or automated random test generation techniques are often used for such validation practices. However, directing tests to particular reachable states of a SystemC model is often difficult, especially when these models are large and complex. In this work, we present

Hiren D. Patel; Sandeep K. Shukla

2007-01-01

132

Model Based Analysis and Validation of Access Control Policies.  

National Technical Information Service (NTIS)

We present a model based approach to describing, analyzing and validating access control policies. Access control policies are described using VDM - a model oriented formal method. Policy descriptions are concise and may be easily manipulated. The structu...

J. S. Fitzgerald J. W. Bryans P. Periorellis

2006-01-01

133

Verification and Validation of RF Environmental Models - Methodology Overview.  

National Technical Information Service (NTIS)

This technical report describes the general methodology behind the validation and verification of the RF environmental models as applied to HWIL Simulation. The different phases of verification including implementation of RF models and propagated RF signa...

A. M. Baird R. B. Goldman W. C. Bryan W. C. Holt F. M. Belrose

1980-01-01

134

Software Cost Estimating Models: A Calibration, Validation, and Comparison.  

National Technical Information Service (NTIS)

This study was a calibration, validation and comparison of four software effort estimation models. The four models evaluated were REVIC, SASET, SEER, and COSTMODL. A historical database was obtained and used as the input data. Two software environments we...
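
Calibration in such studies typically means refitting a parametric effort model to local historical data and scoring it with accuracy metrics such as MMRE and PRED(25). A sketch with an invented database and a generic COCOMO-style form; none of the four models named above (REVIC, SASET, SEER, COSTMODL) is reproduced here:

```python
import numpy as np

# Invented historical database: size (KLOC) and actual effort (person-months)
kloc = np.array([12.0, 45.0, 3.2, 88.0, 21.0, 7.5, 130.0, 54.0])
effort = np.array([38.0, 160.0, 9.0, 390.0, 70.0, 22.0, 650.0, 205.0])

# Calibrate a generic effort = a * KLOC**b form by log-linear least squares
b, log_a = np.polyfit(np.log(kloc), np.log(effort), 1)
a = np.exp(log_a)
pred = a * kloc ** b

# Score the calibrated model with MMRE and PRED(25)
mre = np.abs(pred - effort) / effort
print(f"a = {a:.2f}, b = {b:.2f}")
print(f"MMRE = {mre.mean():.1%}, PRED(25) = {(mre <= 0.25).mean():.0%}")
```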

G. L. Ourada

1991-01-01

135

Motion Capture Experiments for Validating Optimization-Based Human Models  

Microsoft Academic Search

Optimization-based digital human model research has gained significant momentum among various human models. Any task can be formulated as an optimization problem, and the model can predict not only postures but also motions. However, these optimization-based digital human models need validation using experiments. The motion capture system is one of the ways to validate predicted results. This paper summarizes the

Aimee Cloutier; Robyn Boothby; Jingzhou Yang

136

Parameterization of Model Validating Sets for Uncertainty Bound Optimizations  

NASA Technical Reports Server (NTRS)

Given experimental data and a priori assumptions on the nominal model and a linear fractional transformation uncertainty structure, feasible conditions for model validation are given. All unknown but bounded exogenous inputs are assumed to occur at the plant outputs. With the satisfaction of the feasible conditions for model validation, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization can be used as a basis for the development of a systematic way to construct model validating uncertainty models which have a specific linear fractional transformation structure for use in robust control design and analysis. The proposed feasible condition (existence) test and the parameterization are computationally attractive as compared to similar tests currently available.

Lim, K. B.; Giesy, D. P.

1998-01-01

137

Criterion Models and Construct Validity for Criteria.  

National Technical Information Service (NTIS)

Methods pertaining to the measurement of criteria and ascertaining underlying criterion constructs were reviewed. Three criterion measurement models, i.e., the ultimate criterion model, the multiple criterion model, and a 'general' criterion model for the...

L. R. James

1972-01-01

138

Validation of the HOTCHAN code for analyzing the EBR-II driver following loss of flow without scram  

SciTech Connect

A series of experiments involving unprotected (no scram) loss-of-primary flow (LOF) down to natural convection was successfully conducted in February 1985 on the Experimental Breeder Reactor-II (EBR-II). The predicted and measured behavior of a special instrumented assembly, the XX09 fueled INSAT, is compared for the most severe test in the group to demonstrate the validation of the thermal-hydraulic code HOTCHAN. The particular test of interest in this paper was initiated at full power by tripping the primary and secondary pumps. These tests were part of the Shutdown Heat Removal Tests (SHRT) being conducted in EBR-II. The reactor and balance of plant are extensively instrumented and measurements were recorded by a data acquisition system. The reactor and plant response confirm predictions that the driver fuel cladding can survive temperatures above the eutectic threshold for the transient following a station blackout without scramming the reactor. The incore data provide an additional basis for validation of the recently developed HOTCHAN code for analyzing the thermal-hydraulic behavior of specific fuel subassemblies. In this paper the analytical model for HOTCHAN will be described as well as its relationship to the NATDEMO code. The predicted behavior of the hottest driver subassembly is also discussed and compared with XX09 results.

Mohr, D.; Chang, L.K.; Betten, P.R.; Feldman, E.E.; Planchon, H.P.

1987-01-01

139

Multi-terminal Subsystem Model Validation for Pacific DC Intertie  

SciTech Connect

This paper proposes to validate the dynamic model of the Pacific DC Intertie using the concept of hybrid simulation, combining simulation with PMU measurements. The Playback function available in GE PSLF is adopted for hybrid simulation. The feasibility of using the Playback function on a multi-terminal subsystem is demonstrated for the first time. Sensitivity studies are also presented to address common PMU measurement quality problems, i.e., offset noise and time synchronization. Results indicate a good tolerance of the PDCI model generally. It is recommended that requirements should apply to phasor measurements in model validation work to ensure better analysis. Key parameters are identified based on the impact of value changes on model behavior. Two events are employed for preliminary model validation with PMU measurements. Suggestions are made for future PDCI model validation work.

Yang, Bo; Huang, Zhenyu; Kosterev, Dmitry

2008-07-20

140

Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised  

NASA Technical Reports Server (NTRS)

Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.
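
In the scalar case, the existence test amounts to checking, frequency by frequency, whether the residual between data and nominal model can be explained by an admissible uncertainty plus a bounded disturbance. A simplified additive-uncertainty sketch; the paper's LFT machinery and necessary-and-sufficient conditions are not reproduced, and the models and bounds below are invented:

```python
import numpy as np

def validates(y, u, g0, wu, eps):
    """Frequency-wise feasibility test for the additive-uncertainty data model
        y = (g0 + wu * delta) * u + v,   |delta| <= 1,   |v| <= eps.
    The data are consistent with the model set iff, at every frequency, the
    residual can be explained by some admissible pair (delta, v)."""
    resid = np.abs(y - g0 * u)
    return bool(np.all(resid <= np.abs(wu * u) + eps))

w = np.logspace(-1, 2, 50)              # frequency grid, rad/s
g0 = 1.0 / (1j * w + 1.0)               # nominal first-order model
u = np.ones_like(w)                     # unit-amplitude input at each frequency
y = 1.0 / (1j * w + 1.2) + 0.01         # "measured" response with a small offset

print(validates(y, u, g0, wu=0.2 * g0, eps=0.05))   # True: the set validates
```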

Lim, K. B.; Giesy, D. P.

2000-01-01

141

Validation of Numerical Shallow Water Models for Tidal Lagoons  

SciTech Connect

An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.
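
The validation pattern described, comparing a discretized model against an analytical solution while refining resolution, looks like the following in miniature. Here a first-order upwind scheme for 1D advection stands in for the shallow water model; the scheme, grid, and benchmark are placeholders, not the paper's configuration:

```python
import numpy as np

def l2_error(n):
    """L2 error of first-order upwind advection against the exact solution
    on an n-point periodic grid (CFL = 0.4), after advecting to t = 0.5."""
    c, t_end = 1.0, 0.5
    dx = 1.0 / n
    dt = 0.4 * dx / c
    x = np.arange(n) * dx
    u = np.sin(2 * np.pi * x)
    steps = int(round(t_end / dt))
    for _ in range(steps):
        u = u - c * dt / dx * (u - np.roll(u, 1))
    exact = np.sin(2 * np.pi * (x - c * steps * dt))
    return np.sqrt(dx * np.sum((u - exact) ** 2))

# The error should fall roughly in half each time the resolution doubles,
# confirming first-order convergence toward the analytical solution.
for n in (50, 100, 200, 400):
    print(f"n = {n:4d}: L2 error = {l2_error(n):.4e}")
```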

Eliason, D.; Bourgeois, A.

1999-11-01

142

A time-domain approach to model validation  

Microsoft Academic Search

In this paper we offer a novel approach to control-oriented model validation problems. The problem is to decide whether a postulated nominal model with bounded uncertainty is consistent with measured input-output data. Our approach directly uses time-domain input-output data to validate uncertainty models. The algorithms we develop are computationally tractable and reduce to (generally nondifferentiable) convex feasibility problems or to

Kameshwar Poolla; Pramod Khargonekar; Ashok Tikku; James Krause; Krishan Nagpal

1994-01-01

143

Predicting aquaculture-derived benthic organic enrichment: Model validation  

Microsoft Academic Search

A sediment trap validation study was conducted near a commercial sea bass and sea bream fish farm in order to assess the predictive capability of a particle tracking deposition model. The validation procedure consisted of two distinct phases. First, the deposition of particulate waste (i.e., fecal pellets and excess feed) was measured near a single net pen containing 19 tons of

Marko Jusup; Jasminka Klanjš?ek; Donat Petricioli; Tarzan Legovi?

2009-01-01

144

Validation of an Evaluation Model for Learning Management Systems  

ERIC Educational Resources Information Center

This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principal factors, was…

Kim, S. W.; Lee, M. G.

2008-01-01

145

VALIDATION METHODS FOR CHEMICAL EXPOSURE AND HAZARD ASSESSMENT MODELS  

EPA Science Inventory

Mathematical models and computer simulation codes designed to aid in hazard assessment for environmental protection must be verified and validated before they can be used with confidence in a decision-making or priority-setting context. Operational validation, or full-scale testi...

146

ECOLOGICAL MODEL TESTING: VERIFICATION, VALIDATION OR NEITHER?  

EPA Science Inventory

Consider the need to make a management decision about a declining animal population. Two models are available to help. Before a decision is made based on model results, the astute manager or policy maker may ask, "Do the models work?" Or, "Have the models been verified or validat...

147

Systematic approach to verification and validation: High explosive burn models  

Microsoft Academic Search

Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V

Ralph Menikoff; Christina A. Scovel

2012-01-01

148

Considerations for the validation of species-habitat models  

Microsoft Academic Search

The multitude of approaches to wildlife-habitat modeling reflect the broad objectives and goals of various research, management, and conservation programs. Validating models is an often overlooked component of using models effectively and confidently to achieve the desired objectives. Statistical models that attempt to predict the presence or absence of a species are often developed with logistic regression. In this paper,
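
For presence/absence models built with logistic regression, validation usually means checking discrimination on withheld data. A compact sketch with simulated presences, a hand-rolled Newton fit, and a rank-based AUC (data and covariates are invented; this is the generic recipe, not the paper's protocol):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

rng = np.random.default_rng(5)
X = rng.standard_normal((300, 2))                      # two habitat covariates
p = 1 / (1 + np.exp(-(0.5 + 1.5 * X[:, 0] - X[:, 1])))
y = (rng.uniform(size=300) < p).astype(int)            # simulated presence/absence

# Fit logistic regression by Newton iterations on a training split
tr, te = slice(0, 200), slice(200, 300)
A = np.column_stack([np.ones(200), X[tr]])
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-A @ beta))
    w = mu * (1 - mu)
    beta += np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (y[tr] - mu))

scores = np.column_stack([np.ones(100), X[te]]) @ beta
print("held-out AUC:", round(auc(scores, y[te]), 3))
```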

Jennifer M. Psyllakis; Michael P. Gillingham

149

Validation of HEDR models. Hanford Environmental Dose Reconstruction Project.  

National Technical Information Service (NTIS)

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of...

B. A. Napier J. C. Simpson P. W. Eslinger J. V. Ramsdell M. E. Thiede

1994-01-01

150

Validation of Gleams Model for Poultry Litter Management.  

National Technical Information Service (NTIS)

The GLEAMS model was applied to represent 6 fertilizer and broiler litter management practices on Coastal bermudagrass with multiple cuttings for hay at Watkinsville, Georgia. Observed data from a 7 year study were used to validate GLEAMS with comparisons...

W. G. Knisel M. C. Smith S. R. Wilkinson

1995-01-01

151

Validation Methodology for Human Behavior Representation Models.  

National Technical Information Service (NTIS)

The Department of Defense relies Heavily on mathematical models and computer simulations to analyze and acquire new weapon systems. Models and simulations help decision-makers understand the differences between systems and provide insights into the implic...

S. R. Goerger M. L. McGinnis R. P. Darken

2005-01-01

152

Validation of the Archimedes Diabetes Model  

Microsoft Academic Search

The Archimedes diabetes model was validated against controlled trials by repeating in the model the steps taken for the real trials and comparing the results calculated by the model with the results of the trial. Eighteen trials were chosen by an independent advisory committee. Half the trials had been used to help build the model.

DAVID M. EDDY; LEONARD SCHLESSINGER

153

Highlights of Transient Plume Impingement Model Validation and Applications  

NASA Technical Reports Server (NTRS)

This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

Woronowicz, Michael

2011-01-01

154

Validation of climate model output using Bayesian statistical methods  

Microsoft Academic Search

The growing interest in and emphasis on high spatial resolution estimates of future climate has demonstrated the need to apply regional climate models (RCMs) to that problem. As a consequence, the need for validation of these models, an assessment of how well an RCM reproduces a known climate, has also grown. Validation is often performed by comparing RCM output to

Mark A. Snyder; Bruno Sansó; Lisa C. Sloan

2007-01-01

155

HEDR model validation plan. Hanford Environmental Dose Reconstruction Project  

SciTech Connect

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

1993-06-01

156

Gear Windage Modeling Progress - Experimental Validation Status  

NASA Technical Reports Server (NTRS)

In the Subsonic Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems many gearing components operate at high rotational speed (pitch line velocity > 24,000 ft/min). In our testing of high speed helical gear trains at NASA Glenn we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to understand the variables that affect windage and to develop a good experimental database with which to validate the analytical work being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update on the status of these efforts.

Kunz, Rob; Handschuh, Robert F.

2008-01-01

157

A Cartilage Growth Mixture Model With Collagen Remodeling: Validation Protocols  

Microsoft Academic Search

A cartilage growth mixture (CGM) model is proposed to address limitations of a model used in a previous study. New stress constitutive equations for the solid matrix are derived and collagen (COL) remodeling is incorporated into the CGM model by allowing the intrinsic COL material constants to evolve during growth. An analytical validation protocol based on experimental data from a

Stephen M. Klisch; Anna Asanbaeva; Sevan R. Oungoulian; Koichi Masuda; Eugene J.-MA. Thonar; Andrew Davol; Robert L. Sah

2008-01-01

158

Dynamic Modeling and Experimental Validation for Interactive Endodontic Simulation  

Microsoft Academic Search

To facilitate training of endodontic operations, we have developed an interactive virtual environment to simulate endodontic shaping operations. This paper presents methodologies for dynamic modeling, visual/haptic display and model validation of endodontic shaping. We first investigate the forces generated in the course of shaping operations and discuss the challenging issues in their modeling. Based on the special properties and constraints

Min Li; Yun-hui Liu

2007-01-01

159

A method for validation of reference sets in SIMCA modelling  

Microsoft Academic Search

A method for validation of the reference set in Soft Independent Modelling of Class Analogies (SIMCA) is proposed. The reference set is used to build the SIMCA model and the remaining samples are fitted to this model. Thus, it is important that the reference set is representative of the reference class. In this work it is suggested that the reference

Geir Rune Flåten; Bjørn Grung; Olav M. Kvalheim

2004-01-01

160

Modelling and Validation of Response Times in Zoned RAID  

Microsoft Academic Search

We present and validate an enhanced analytical queueing network model of zoned RAID. The model focuses on RAID levels 01 and 5, and yields the distribution of I/O request response time. Whereas our previous work could only support arrival streams of I/O requests of the same type, the model presented here supports heterogeneous streams with a mixture of read

Abigail S. Lebrecht; Nicholas J. Dingle; William J. Knottenbelt

2008-01-01

161

Validation of Model Forecasts of the Ambient Solar Wind (Invited)  

NASA Astrophysics Data System (ADS)

Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

MacNeice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

2009-12-01

162

Experimental Validation of a State Model for PEMFC Auxiliary Control  

Microsoft Academic Search

This work aims to validate experimentally a state model for a proton exchange membrane fuel cell (PEMFC) system control. In this model, the air-supply dynamic is analyzed. The air is assumed to be compressible, and the model takes into account the compressor, the humidification device, the inlet and outlet manifolds, the fuel-cell stack, and the proportional solenoid valve at the

Kodjo Agbossou; Yves Dube; Nadine Hassanaly; K. Pelope Adzakpa; Julien Ramousse

2009-01-01

163

Experimental Validation of a State Model for PEMFC Auxiliaries Control  

Microsoft Academic Search

This work aims to validate experimentally a state model for a proton exchange membrane fuel cell (PEMFC) system control. In this model, the air supply dynamic will be analyzed. The air is assumed to be compressible and the model takes into account the compressor, the humidification device, the inlet and outlet manifolds, the fuel cell stack and the proportional solenoid

K. P. Adzakpa; J. Ramousse; K. Agbossou; Y. Dube; N. Hassanaly; F. Zemmar

2008-01-01

164

Validation of Model Forecasts of the Ambient Solar Wind  

NASA Technical Reports Server (NTRS)

Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

2009-01-01

165

Combustion turbine dynamic model validation from tests  

Microsoft Academic Search

Studies have been conducted on the Alaskan Railbelt System to examine the hydrothermal power system response after the hydroelectric power units at Bradley Lake are installed. The models and data for the generating units for the initial studies were not complete. Typical models were used, but their response appeared to be faster than operating experience suggested. A testing program

L. N. Hannett; Afzal Khan

1993-01-01

166

Model validation in soft systems practice  

Microsoft Academic Search

The concept of "a model" usually evokes the connotation "model of part of the real world". That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering, use the concept in the same way and

Checkland

1995-01-01

167

International Space Station Power System Model Validated  

NASA Technical Reports Server (NTRS)

System Power Analysis for Capability Evaluation (SPACE) is a computer model of the International Space Station's (ISS) Electric Power System (EPS) developed at the NASA Glenn Research Center. This uniquely integrated, detailed model can predict EPS capability, assess EPS performance during a given mission with a specified load demand, conduct what-if studies, and support on-orbit anomaly resolution.

Hojnicki, Jeffrey S.; Delleur, Ann M.

2002-01-01

168

Multi-model ensemble: technique and validation  

NASA Astrophysics Data System (ADS)

In this study, a method of numerical weather prediction by ensemble for the South American region is proposed. This method takes into account combinations of the numerical predictions of various models, assigning greater weight to models that exhibit the best performance. Nine operational numerical models were used to perform this study. The main objective of the study is to obtain a weather forecasting product (short-to-medium range) that combines what is best in each of the nine models used in the study, thus producing more reliable predictions. The proposed method was evaluated during austral summer (December 2012, and January and February 2013) and winter (June, July and August 2013). The results show that the proposed method can significantly improve the results provided by the numerical models, and consequently has promising potential for operational applications in any weather forecasting center.
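The performance-weighted combination described above can be sketched compactly. A minimal illustration with toy numbers, using inverse-RMSE weights as a stand-in for the paper's (unspecified) weighting scheme:

    import numpy as np

    def inverse_rmse_weights(past_forecasts, observations):
        """Weights proportional to 1/RMSE of each model's past forecasts."""
        rmse = np.sqrt(np.mean((past_forecasts - observations) ** 2, axis=1))
        w = 1.0 / rmse
        return w / w.sum()

    # Toy history: three models verified at five times
    past = np.array([[1.0, 2.1, 2.9, 4.2, 5.1],
                     [1.4, 2.5, 3.6, 4.9, 6.0],
                     [0.8, 1.9, 3.1, 3.8, 5.0]])
    obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    w = inverse_rmse_weights(past, obs)
    new_forecasts = np.array([6.0, 7.1, 5.8])  # each model's next forecast
    print("ensemble forecast:", np.dot(w, new_forecasts))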

Rozante, J. R.; Moreira, D. S.; Godoy, R. C. M.; Fernandes, A. A.

2014-05-01

169

Validation of the NATO-standard ship signature model (SHIPIR)  

NASA Astrophysics Data System (ADS)

An integrated naval infrared target, threat and countermeasure simulator (SHIPIR/NTCS) has been developed. The SHIPIR component of the model has been adopted by both NATO and the US Navy as a common tool for predicting the infrared (IR) signature of naval ships in their background. The US Navy has taken a lead role in further developing and validating SHIPIR for use in the Twenty-First Century Destroyer (DD-21) program. As a result, the US Naval Research Laboratory (NRL) has performed an in-depth validation of SHIPIR. This paper presents an overview of SHIPIR, the model validation methodology developed by NRL, and the results of the NRL validation study. The validation consists of three parts: a review of existing validation information, the design, execution, and analysis of a new panel test experiment, and the comparison of experiment with predictions from the latest version of SHIPIR (v2.5). The results show high levels of accuracy in the radiometric components of the model under clear-sky conditions, but indicate the need for more detailed measurement of solar irradiance and cloud model data for input to the heat transfer and in-band sky radiance sub-models, respectively.

Vaitekunas, David A.; Fraedrich, Douglas S.

1999-07-01

170

Design and validation of a multiphase 3D model to simulate tropospheric pollution.  

PubMed

This work presents the Transport and Chemical Aerosol Model (TCAM) formulation and its validation in the frame of the CityDelta-CAFE project. TCAM is a 3D Eulerian multiphase model simulating tropospheric secondary pollution at the mesoscale. It is included in the GAMES (Gas Aerosol Modelling Evaluation System) modelling system, designed to support the analysis of secondary pollution dynamics and to assess the impact of emission control strategies. The validation assessment presented here was performed in the frame of the CityDelta II project over the Milan domain and concerns both gas and aerosol simulations for 1999. Computed and observed patterns of ozone, nitrogen oxides and aerosol were compared and analysed by means of statistical indicators, showing good model performance for both winter and summer pollution regimes. PMID:17963821

Carnevale, Claudio; Decanini, Edoardo; Volta, Marialuisa

2008-02-01

171

Validation of the Poisson Stochastic Radiative Transfer Model  

NASA Technical Reports Server (NTRS)

A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

Zhuravleva, Tatiana; Marshak, Alexander

2004-01-01

172

A glucose-insulin pharmacodynamic surface modeling validation and comparison of metabolic system models  

Microsoft Academic Search

Metabolic system modeling for model-based glycaemic control is becoming increasingly important. Few metabolic system models are clinically validated for both fit to the data and prediction ability. This research introduces a new additional form of pharmaco-dynamic (PD) surface comparison for model analysis and validation. These 3D surfaces are developed for 3 clinically validated models and 1 model with an added

J. Geoffrey Chase; Steen Andreassen; Ulrike Pielmeier; Christopher E. Hann; Kirsten A. McAuley; J. I. Mann

2009-01-01

173

Simulation Validation Through Linear Model Comparison.  

National Technical Information Service (NTIS)

The Manned Flight Simulator at the Naval Air Warfare Center in Patuxent River, MD maintains high fidelity fixed and rotary wing simulation models. The aircraft simulations are utilized for a wide range of activities including flight test support, pilot tr...

K. Balderson; D. P. Gaublomme; J. W. Thomas

1996-01-01

174

VERIFICATION AND VALIDATION OF THE SPARC MODEL  

EPA Science Inventory

Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

175

Crop yield model validation for Cameroon  

NASA Astrophysics Data System (ADS)

A crop simulation model must first be capable of representing the actual performance of crops grown in any region before it can be applied to the prediction of climate variability and change impacts. Simulations of crop productivity by a cropping systems model (CropSyst) for the sub-Saharan Central African region (using Cameroon as the case study) under the current climate were compared with observed yields of maize, sorghum, groundnut, bambara groundnut and soybean from eight sites. The model produced both over- and underestimates, but with a mean percentage difference of only -2.8%, ranging from -0.6% to -4.5%. Based on these results, we judged the CropSyst simulations sufficiently reliable to justify use of the model in assessing crop growth vulnerability to climatic changes in Cameroon and elsewhere.
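The headline statistic, a mean percentage difference between simulated and observed yields, is straightforward to reproduce. A minimal sketch with hypothetical yields:

    import numpy as np

    def mean_percentage_difference(simulated, observed):
        """Mean over sites of 100 * (simulated - observed) / observed."""
        simulated = np.asarray(simulated, dtype=float)
        observed = np.asarray(observed, dtype=float)
        return 100.0 * np.mean((simulated - observed) / observed)

    # Hypothetical yields (t/ha) at eight sites
    obs = [2.1, 3.4, 1.8, 2.9, 2.2, 3.0, 1.5, 2.7]
    sim = [2.0, 3.5, 1.7, 2.8, 2.3, 2.9, 1.5, 2.6]
    print(f"mean percentage difference: {mean_percentage_difference(sim, obs):+.1f}%")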

Tingem, Munang; Rivington, Mike; Bellocchi, Gianni; Colls, Jeremy

2009-05-01

176

How to build valid and credible simulation models  

Microsoft Academic Search

The authors present techniques for building valid and credible simulation models. Ideas to be discussed include the importance of a definitive problem formulation, discussions with subject-matter experts, interacting with the decision-maker on a regular basis, development of a written conceptual model, structured walk-through of the conceptual model, use of sensitivity analysis to determine important model factors, and comparison of model

Averill M. Law; Michael G. McComas

2001-01-01

177

Validating predictions from climate envelope models  

USGS Publications Warehouse

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
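Sensitivity and specificity, the two skill measures used here, follow directly from the presence/absence confusion matrix. A minimal sketch with hypothetical predictions and observations:

    import numpy as np

    def sensitivity_specificity(predicted, actual):
        """Skill of a presence/absence (1/0) classification."""
        predicted = np.asarray(predicted)
        actual = np.asarray(actual)
        tp = np.sum((predicted == 1) & (actual == 1))
        tn = np.sum((predicted == 0) & (actual == 0))
        fn = np.sum((predicted == 0) & (actual == 1))
        fp = np.sum((predicted == 1) & (actual == 0))
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical t2 survey: model predictions vs. observed occurrences
    pred = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
    obs = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
    sens, spec = sensitivity_specificity(pred, obs)
    print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")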

Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romanach, Stephanie S.; Mazzotti, F.

2013-01-01

178

Validating Predictions from Climate Envelope Models  

PubMed Central

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

Watling, James I.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romanach, Stephanie S.

2013-01-01

179

Spatial statistical modeling of shallow landslides—Validating predictions for different landslide inventories and rainfall events  

NASA Astrophysics Data System (ADS)

Statistical models that exploit the correlation between landslide occurrence and geomorphic properties are often used to map the spatial occurrence of shallow landslides triggered by heavy rainfalls. In many landslide susceptibility studies, the true predictive power of the statistical model remains unknown because the predictions are not validated with independent data from other events or areas. This study validates statistical susceptibility predictions with independent test data. The spatial incidence of landslides, triggered by an extreme rainfall in a study area, was modeled by logistic regression. The fitted model was then used to generate susceptibility maps for another three study areas, for which event-based landslide inventories were also available. All the study areas lie in the northern foothills of the Swiss Alps. The landslides had been triggered by heavy rainfall either in 2002 or 2005. The validation was designed such that the first validation study area shared the geomorphology and the second the triggering rainfall event with the calibration study area. For the third validation study area, both geomorphology and rainfall were different. All explanatory variables were extracted for the logistic regression analysis from high-resolution digital elevation and surface models (2.5 m grid). The model fitted to the calibration data comprised four explanatory variables: (i) slope angle (effect of gravitational driving forces), (ii) vegetation type (grassland and forest; root reinforcement), (iii) planform curvature (convergent water flow paths), and (iv) contributing area (potential supply of water). The area under the Receiver Operating Characteristic (ROC) curve (AUC) was used to quantify the predictive performance of the logistic regression model. The AUC values were computed for the susceptibility maps of the three validation study areas (validation AUC), the fitted susceptibility map of the calibration study area (apparent AUC: 0.80) and another susceptibility map obtained for the calibration study area by 20-fold cross-validation (cross-validation AUC: 0.74). The AUC values of the first and second validation study areas (0.72 and 0.69, respectively) and the cross-validation AUC matched fairly well, and all AUC values were distinctly smaller than the apparent AUC. Based on the apparent AUC one would have clearly overrated the predictive performance for the first two validation areas. Rather surprisingly, the AUC value of the third validation study area (0.82) was larger than the apparent AUC. A large part of the third validation study area consists of gentle slopes, and the regression model correctly predicted that no landslides occur in the flat parts. This increased the predictive performance of the model considerably. The predicted susceptibility maps were further validated by summing the predicted susceptibilities for the entire validation areas and by comparing the sums with the observed number of landslides. The sums exceeded the observed counts for all the validation areas. Hence, the logistic regression model generally over-estimated the risk of landslide occurrence. Obviously, a predictive model that is based on static geomorphic properties alone cannot take full account of the complex and time-dependent processes in the subsurface. However, such a model is still capable of distinguishing zones highly or less prone to shallow landslides.
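The workflow sketched in this abstract (a logistic regression fitted to landslide occurrence, with apparent and cross-validated AUC as skill measures) is easy to outline in code. A minimal sketch on synthetic data; the four predictors only mimic the study's variables, and none of the numbers are from the paper:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical explanatory variables: slope angle (deg), forest flag,
    # planform curvature, and contributing area (log scale)
    X = np.column_stack([rng.uniform(0.0, 45.0, n),
                         rng.integers(0, 2, n),
                         rng.normal(0.0, 1.0, n),
                         rng.normal(3.0, 1.0, n)])
    # Synthetic occurrence: landslides favored on steep, unforested slopes
    p = 1.0 / (1.0 + np.exp(-(0.1 * X[:, 0] - 1.5 * X[:, 1] - 3.0)))
    y = rng.binomial(1, p)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
    cv = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         cv=20, scoring="roc_auc").mean()
    print(f"apparent AUC {apparent:.2f}, cross-validated AUC {cv:.2f}")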

von Ruette, Jonas; Papritz, Andreas; Lehmann, Peter; Rickli, Christian; Or, Dani

2011-10-01

180

Validation of geometric models for fisheye lenses  

NASA Astrophysics Data System (ADS)

The paper focuses on the photogrammetric investigation of geometric models for different types of optical fisheye constructions (equidistant, equisolid-angle, stereographic and orthographic projection). These models were implemented and thoroughly tested in a spatial resection and a self-calibrating bundle adjustment. For this purpose, fisheye images were taken with a Nikkor 8 mm fisheye lens on a Kodak DSC 14n Pro digital camera in a hemispherical calibration room. Both the spatial resection and the bundle adjustment resulted in a standard deviation of unit weight of 1/10 pixel with a suitable set of simultaneous calibration parameters introduced into the camera model. The camera-lens combination was treated with all four of the basic models mentioned above. Using the same set of additional lens distortion parameters, the differences between the models can largely be compensated, delivering almost the same precision parameters. The relative object space precision obtained from the bundle adjustment was ca. 1:10 000 of the object dimensions. This value can be considered a very satisfying result, as fisheye images generally have a lower geometric resolution as a consequence of their large field of view and also an inferior imaging quality in comparison to most central perspective lenses.

Schneider, D.; Schwalbe, E.; Maas, H.-G.

181

Validity of microgravity simulation models on earth.  

PubMed

Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect incomplete knowledge of the characteristics inherent to each model. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high rate of water exchange between interstitium and plasma. In contrast, the increase in thoracic blood volume during a space mission is combined with stimulated orthosympathetic tone and lowered urine flow. During bed rest, body tissues are compressed by pressure from gravity, whereas microgravity causes a negative pressure around the body. The differences in renal function between space and experimental models appear to be explained by the physical forces affecting tissues and hemodynamics as well as by the changes secondary to these forces. These differences may help in selecting experimental models to study possible effects of microgravity. PMID:11532704

Regnard, J; Heer, M; Drummer, C; Norsk, P

2001-09-01

182

Global Validation of Linear Model Assumptions  

PubMed Central

An easy-to-implement global procedure for testing the four assumptions of the linear model is proposed. The test can be viewed as a Neyman smooth test and it only relies on the standardized residual vector. If the global procedure indicates a violation of at least one of the assumptions, the components of the global test statistic can be utilized to gain insights into which assumptions have been violated. The procedure can also be used in conjunction with associated deletion statistics to detect unusual observations. Simulation results are presented indicating the sensitivity of the procedure in detecting model violations under a variety of situations, and its performance is compared with three potential competitors, including a procedure based on the Box-Cox power transformation. The procedure is demonstrated by applying it to a new car mileage data set and a water salinity data set that has been used previously to illustrate model diagnostics.
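The global test statistic itself is not reproduced here, but residual-based assumption checking of this kind is easy to sketch. A minimal example using the standardized residual vector of an ordinary least squares fit and generic component checks (not the authors' Neyman smooth test):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 10.0, 200)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, 200)

    # Ordinary least squares fit and the standardized residual vector
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    std_resid = resid / resid.std(ddof=2)

    # Component-style checks: normality, curvature, constant variance
    print("normality p:", stats.normaltest(std_resid).pvalue)
    print("curvature p:", stats.pearsonr(x**2, std_resid)[1])
    print("heteroscedasticity p:",
          stats.pearsonr(X @ beta, np.abs(std_resid))[1])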

Pena, Edsel A.; Slate, Elizabeth H.

2005-01-01

183

Validation of SAGE II aerosol measurements by comparison with correlative sensors  

NASA Technical Reports Server (NTRS)

The SAGE II limb-scanning radiometer carried on the Earth Radiation Budget Satellite functions at wavelengths of 0.385, 0.45, 0.525, and 1.02 microns to identify vertical profiles of aerosol density by atmospheric extinction measurements from cloud tops upward. The data are being validated by correlating the satellite data with data gathered with, e.g., lidar, sunphotometer, and dustsonde instruments. Work thus far has shown that the 1 micron measurements from the ground and satellite are highly correlated and are therefore accurate to within measurement uncertainty.

Swissler, T. J.

1986-01-01

184

Validating Requirements for Fault Tolerant Systems Using Model Checking  

NASA Technical Reports Server (NTRS)

Model checking is shown to be an effective tool in validating the behavior of a fault tolerant embedded spacecraft controller. The case study presented here shows that by judiciously abstracting away extraneous complexity, the state space of the model could be exhaustively searched, allowing critical functional requirements to be validated down to the design level. Abstracting away detail not germane to the problem of interest leaves by definition a partial specification behind. The success of this procedure shows that it is feasible to effectively validate a partial specification with this technique. Three anomalies were found in the system, one of which is an error in the detailed requirements; the other two are missing/ambiguous requirements. Because the method allows validation of partial specifications, it also is an effective methodology towards maintaining fidelity between a co-evolving specification and an implementation.

Schneider, Francis; Easterbrook, Steve M.; Callahan, John R.; Holzmann, Gerard J.

1997-01-01

185

Solution Verification Linked to Model Validation, Reliability, and Confidence  

SciTech Connect

The concepts of Verification and Validation (V&V) can be oversimplified in a succinct manner by saying that 'verification is doing things right' and 'validation is doing the right thing'. In the world of the Finite Element Method (FEM) and computational analysis, it is sometimes said that 'verification means solving the equations right' and 'validation means solving the right equations'. In other words, if one intends to give an answer to the equation '2+2=', then one must run the resulting code to assure that the answer '4' results. However, if the nature of the physics or engineering problem being addressed with this code is multiplicative rather than additive, then even though Verification may succeed (2+2=4 etc), Validation may fail because the equations coded are not those needed to address the real world (multiplicative) problem. We have previously provided a 4-step 'ABCD' quantitative implementation for a quantitative V&V process: (A) Plan the analyses and validation testing that may be needed along the way. Assure that the code[s] chosen have sufficient documentation of software quality and Code Verification (i.e., does 2+2=4?). Perform some calibration analyses and calibration based sensitivity studies (these are not validated sensitivities but are useful for planning purposes). Outline the data and validation analyses that will be needed to turn the calibrated model (and calibrated sensitivities) into validated quantities. (B) Solution Verification: For the system or component being modeled, quantify the uncertainty and error estimates due to spatial, temporal, and iterative discretization during solution. (C) Validation over the data domain: Perform a quantitative validation to provide confidence-bounded uncertainties on the quantity of interest over the domain of available data. (D) Predictive Adequacy: Extend the model validation process of 'C' out to the application domain of interest, which may be outside the domain of available data in one or more planes of multi-dimensional space. Part 'D' should provide the numerical information about the model and its predictive capability such that, given a requirement, an adequacy assessment can be made to determine if more validation analyses or data are needed.

Logan, R W; Nitta, C K

2004-06-16

186

Validation of Orthorectified Interferometric Radar Imagery and Digital Elevation Models  

NASA Technical Reports Server (NTRS)

This work was performed under NASA's Verification and Validation (V&V) Program as an independent check of data supplied by EarthWatch, Incorporated, through the Earth Science Enterprise Scientific Data Purchase (SDP) Program. This document serves as the basis of reporting results associated with validation of orthorectified interferometric radar imagery and digital elevation models (DEM). This validation covers all datasets provided under the first campaign (Central America & Virginia Beach) plus three earlier missions (Indonesia, Red River, and Denver) for a total of 13 missions.

Smith, Charles M.

2004-01-01

187

Validity Generalization of Holland's Hexagonal Model  

ERIC Educational Resources Information Center

Holland's hexagonal model for six occupational groups was tested with data on estimated occupational rewards. Data on rated occupational reward characteristics were available for 148 occupations. Although the hexagonal shape was distorted, the groups were arrayed in the order postulated by Holland.

Toenjes, Carol M.; Borgen, Fred H.

1974-01-01

188

Shuttle Spacesuit: Fabric/LCVG Model Validation.  

National Technical Information Service (NTIS)

Considerable progress has been made in understanding the basic transmission properties of the spacesuit fabric as it is now used in the space program. Some improved details for the physical model for the cooling tube are still required. It is desirable to...

J. W. Wilson; J. Tweed; C. Zeitlin; M. H. Y. Kim; B. M. Anderson; F. A. Cucinotta; J. Ware; A. E. Persans

2004-01-01

189

Discussion of model calibration and validation for transient dynamics simulation.  

SciTech Connect

Model calibration refers to a family of inverse problem-solving numerical techniques used to infer the value of parameters from test data sets. The purpose of model calibration is to optimize parametric or non-parametric models in such a way that their predictions match reality. In structural dynamics an example of calibration is the finite element model updating technology. Our purpose is essentially to discuss calibration in the broader context of model validation. Formal definitions are proposed and the notions of calibration and validation are illustrated using an example of transient structural dynamics that deals with the propagation of a shock wave through a hyper-foam pad. An important distinction that has not been made in finite element model updating and that is introduced here is that parameters of the numerical models or physical tests are categorized into input parameters, calibration variables, controllable and uncontrollable variables. Such classification helps to define model validation goals. Finally, a path forward for validating numerical models is discussed and the relationship with uncertainty assessment is stressed.

Hemez, F. M. (François M.); Doebling, S. W. (Scott W.); Wilson, A. C. (Amanda C.)

2001-01-01

190

Validation of the Greek translation of the obesity-specific Moorehead-Ardelt Quality-of-Life Questionnaire II.  

PubMed

Morbid obesity adversely affects quality of life. The assessment of health-related quality of life (HRQoL) needs specific measuring instruments. The Moorehead-Ardelt Quality-of-Life Questionnaire II (MA II) is an obesity-specific instrument widely used in bariatric surgery. The objective of this study was to translate the MA II into Greek and validate it. The study included the translation of the MA II followed by cross-validation with the Greek version of the 36-item Short Form Health Survey (SF-36) and a Visual Analogue Scale (VAS) in subjects visiting an obesity clinic. Internal consistency was indicated by Cronbach's alpha coefficient and test-retest reliability by the intraclass correlation coefficient (ICC). Construct validity was studied using Pearson's correlations between the MA II, the SF-36 and the VAS. A total of 175 patients were enrolled in the study. Test-retest analysis was applied to 40 patients with a 15-day interval. A very good internal consistency with Cronbach's alpha coefficient of 0.85 was shown. Excellent test-retest reliability was observed with an overall ICC of 0.981. Significant correlations between the Greek MA II and the other instruments as well as of each item of the MA II with the scores of SF-36 and the VAS indicated high construct and convergent validity. A negative correlation between the translated MA II total score and BMI confirmed high clinical validity. The Greek version of the MA II questionnaire has been generated and shown to be valid and reliable in measuring HRQoL in morbidly obese patients before and after bariatric surgery. PMID:22411571
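Cronbach's alpha, the internal-consistency coefficient reported above, is a one-line formula over an item-score matrix. A minimal sketch with hypothetical questionnaire scores (not the study's data):

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical scores of 6 respondents on 4 questionnaire items
    scores = np.array([[3, 4, 3, 4],
                       [2, 2, 3, 2],
                       [4, 5, 4, 4],
                       [1, 2, 1, 2],
                       [3, 3, 4, 3],
                       [5, 4, 5, 5]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")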

Charalampakis, Vasileios; Daskalakis, Markos; Bertsias, Georgios; Papadakis, John A; Melissas, John

2012-05-01

191

On cluster validity for the fuzzy c-means model  

Microsoft Academic Search

Many functionals have been proposed for validation of partitions of object data produced by the fuzzy c-means (FCM) clustering algorithm. We examine the role a subtle but important parameter, the weighting exponent m of the FCM model, plays in determining the validity of FCM partitions. The functionals considered are the partition coefficient and entropy indexes of Bezdek, the Xie-Beni (1991), and extended
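Both Bezdek indexes named in the abstract are simple functions of the fuzzy membership matrix U produced by FCM. A minimal sketch with a hypothetical U (the weighting exponent m influences U upstream, during clustering):

    import numpy as np

    def partition_coefficient(U):
        """Bezdek's partition coefficient: mean squared membership.

        U: membership matrix of shape (c, n), columns sum to 1.
        Values near 1 indicate crisp, well-separated partitions.
        """
        return np.sum(U ** 2) / U.shape[1]

    def partition_entropy(U):
        """Bezdek's partition entropy; smaller is better."""
        return -np.sum(U * np.log(U + 1e-12)) / U.shape[1]

    # Hypothetical memberships for 4 objects in c = 2 clusters
    U = np.array([[0.9, 0.8, 0.2, 0.1],
                  [0.1, 0.2, 0.8, 0.9]])
    print(partition_coefficient(U), partition_entropy(U))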

N. R. Pal; J. C. Bezdek

1995-01-01

192

Testing and Validation of a Low Cost Cystoscopy Teaching Model  

PubMed Central

Objective The objective of this study was to determine whether the use of a low cost cystoscopy model effectively trains residents in cystourethroscopy and to validate the model as a teaching tool. Study Design A randomized, controlled, and evaluator-blinded study was performed. Baseline skills in 29 OB/GYN residents were assessed, using the validated Objective Structured Assessment of Technical Skills (OSATS) checklists for cystourethroscopy, on fresh-frozen cadavers. Residents were randomized to one of two arms, a study arm using the cystoscopy model and a control arm. Repeat OSATS testing was performed. Results The study group demonstrated statistically significant decreases in cystoscope assembly time (p=0.004), and increases in task specific checklist and global rating scale scores (p values <0.0001) compared to the controls. Conclusions Use of the bladder model exhibited validity in enhancing performance and knowledge of cystourethroscopy among OB/GYN residents.

BOWLING, C. Bryce; GREER, W. Jerod; BRYANT, Shannon A.; GLEASON, Jonathan L.; SZYCHOWSKI, Jeff M.; VARNER, R. Edward; HOLLEY, Robert L.; RICHTER, Holly E.

2011-01-01

193

Modeling and Formal Validation of High-Performance Embedded Systems  

Microsoft Academic Search

This paper presents an approach for the modeling and formal validation of high-performance systems. The approach relies on the repetitive model of computation used to express the parallelism of such systems within the Gaspard framework, which is dedicated to the codesign of high-performance system-on-chip. The system descriptions obtained with this model are then projected on the synchronous model of computation.

Abdoulaye Gamatié; Éric Rutten; Huafeng Yu; Pierre Boulet; Jean-luc Dekeyser

2008-01-01

194

Verification and Validation of Agent-based Scientific Simulation Models  

Microsoft Academic Search

Most formalized model verification and validation techniques come from industrial and system engineering for discrete-event system simulations. These techniques are widely used in computational science. The agent-based modeling approach is different from discrete-event modeling approaches largely used in industrial and system engineering in many aspects. Since the agent-based modeling approach has recently become an attractive and

Xiaorong Xiang; Ryan Kennedy; Gregory Madey; Steve Cabaniss

2005-01-01

195

Institutional Effectiveness: A Model for Planning, Assessment & Validation.  

ERIC Educational Resources Information Center

The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

Truckee Meadows Community Coll., Sparks, NV.

196

VALIDATION STUDY OF THE ACUTE BIOTIC LIGAND MODEL FOR SILVER  

Microsoft Academic Search

An important final step in development of an acute biotic ligand model for silver is to validate predictive capabilities of the biotic ligand model developed for fish and invertebrates. To accomplish this, eight natural waters, collected from across North America, were characterized with respect to ionic composition, pH, dissolved organic carbon, and sulfide. Tests were conducted with the cladoceran Ceriodaphnia

Gretchen K. Bielmyer; Martin Grosell; Paul R. Paquin; Rooni Mathews; Kuen B. Wu; Robert C. Santore; Kevin V. Brix

2007-01-01

197

Near-real time validation of an operational hydrographic model  

Microsoft Academic Search

The Irish Marine Institute maintains an operational model of the NE Atlantic from which weekly hydrographic forecasts are published on the institute's web site. A method for the systematic validation of the operational model has been developed, making use of temperature and salinity profile data from ARGO floats, surface water temperature and salinity data from the Irish weather buoys and

H. Cannaby; M. Cure; K. Lyons; G. Nolan

198

Mechanical validation of whole bone composite tibia models  

Microsoft Academic Search

Composite synthetic models of the human tibia have recently become commercially available as substitutes for cadaveric specimens. Their use is justified by the advantages they offer as a substitute for real tibias. The present investigation concentrated on an extensive experimental validation of the mechanical behaviour of the whole bone composite model, compared to human specimens for different loading conditions. The

Luca Cristofolini; Marco Viceconti

2000-01-01

199

Validation of an aeroelastic model of Vestas V39.  

National Technical Information Service (NTIS)

An aeroelastic model of the pitch controlled Vestas V39 wind turbine is validated by comparing simulations with measurements. The comparison is carried out on a statistical basis as well as on a power spectra level, and the model is found to be good compa...

G. C. Larsen; P. Voelund

1998-01-01

200

Validating Finite Element Models of Assembled Shell Structures  

NASA Technical Reports Server (NTRS)

The validation of finite element models of assembled shell elements is presented. The topics include: 1) Problems with membrane rotations in assembled shell models; 2) Penalty stiffness for membrane rotations; 3) Physical stiffness for membrane rotations using shell elements with 6 dof per node; and 4) Connections avoiding rotations.

Hoff, Claus

2006-01-01

201

Web Application Model Recovery for User Input Validation Testing  

Microsoft Academic Search

Unvalidated input is one of the most critical web application security flaws. However, testing the user input validation function is an intellectual and labor-intensive task. We are developing a model-driven framework to help testers accomplish this job in a visual view with guidance. This paper reports our ongoing work. A meta-model of Web application for user

Nuo Li; Mao-zhong Jin; Chao Liu

2007-01-01

202

NUMERICAL MODELING IN REINFORCED CONCRETE AND ITS VALIDATION  

Microsoft Academic Search

The main purpose of this mini symposium is to bring together the experts in the field of numerical modelling and experimental validation in reinforced concrete. Concrete is a composite material that exhibits a heterogeneous internal structure and the study of its damage mechanisms is complex. When reinforced with steel bars, modelling the failure mechanisms becomes even more complicated (1,2). Faced

Rena C. Yu; Giulio Ventura; Gonzalo Ruiz; Jacinto R. Carmona

203

Contaminant transport model validation: The Oak Ridge Reservation  

Microsoft Academic Search

In the complex geologic setting on the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an

R. R. Lee; R. H. Ketelle

1988-01-01

204

Validation of 1-D transport and sawtooth models for ITER  

SciTech Connect

In this paper the authors describe progress on validating a number of local transport models by comparing their predictions with relevant experimental data from a range of tokamaks in the ITER profile database. This database, the testing procedure and results are discussed. In addition a model for sawtooth oscillations is used to investigate their effect in an ITER plasma with alpha-particles.

Connor, J.W.; Turner, M.F. [UKAEA, Culham (United Kingdom)]; Attenberger, S.E.; Houlberg, W.A. [ORNL, Oak Ridge, TN (United States)]; and others

1996-12-31

205

Can Earthquake Loss Models be Validated Using Field Observations?  

Microsoft Academic Search

The occurrence of a damaging earthquake provides an opportunity to compare observed and estimated damage, provided that detailed observations of the earthquake effects are made in the field. A question that arises is whether such comparisons can provide the basis for validation of an earthquake loss model. In order to explore this issue, a case study loss model for the

Helen Crowley; Peter J. Stafford; Julian J. Bommer

2008-01-01

206

Differential Validation of a Path Analytic Model of University Dropout.  

ERIC Educational Resources Information Center

Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Education and Economics Department. A…

Winteler, Adolf

207

Rigorous valid ranges for optimally reduced kinetic models  

SciTech Connect

Reduced chemical kinetic models are often used in place of a detailed mechanism because of the computational expense of solving the complete set of equations describing the reacting system. Mathematical methods for model reduction are usually associated with a nominal set of reaction conditions for which the model is reduced. The important effects of variability in these nominal conditions are often ignored because there is no convenient way to deal with them. In this work, we introduce a method to identify rigorous valid ranges for reduced models; i.e., the reduced models are guaranteed to replicate the full model to within an error tolerance under all conditions in the identified valid range. Previous methods have estimated valid ranges using a limited set of variables (usually temperature and a few species compositions) and cannot guarantee that the reduced model is accurate at all points in the estimated range. The new method is demonstrated by identifying valid ranges for models reduced from the GRI-Mech 3.0 mechanism with 53 species and 325 reactions, and a truncated propane mechanism with 94 species and 505 reactions based on the comprehensive mechanism of Marinov et al. A library of reduced models is also generated for several prespecified ranges composing a desired state space. The use of these reduced models with error control in reacting flow simulations is demonstrated through an Adaptive Chemistry example. By using the reduced models in the simulation only when they are valid, the Adaptive Chemistry solution matches the solution obtained using the detailed mechanism.
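Sampling-based checking illustrates why the rigorous guarantee matters: testing a reduced model at sampled conditions can only suggest a valid range, not certify one. A minimal sketch with toy stand-in models (not actual kinetics):

    import numpy as np

    def suggested_valid_range(full_model, reduced_model, samples, tol):
        """Keep sampled conditions where the reduced model stays within tol.

        Sampling can only suggest a valid range; the method in the paper
        provides a guaranteed one over the whole range.
        """
        err = np.array([abs(full_model(s) - reduced_model(s)) for s in samples])
        return samples[err <= tol]

    # Toy stand-ins for full and reduced chemistry source terms
    full = lambda T: np.exp(-1000.0 / T)
    reduced = lambda T: np.exp(-1000.0 / T) * (1.0 + 0.05 * (T - 1200.0) / 600.0)
    T = np.linspace(900.0, 1800.0, 50)
    ok = suggested_valid_range(full, reduced, T, tol=0.01)
    print(f"within tolerance for T in [{ok.min():.0f}, {ok.max():.0f}] K")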

Oluwole, Oluwayemisi O.; Bhattacharjee, Binita; Tolsma, John E.; Barton, Paul I.; Green, William H. [Department of Chemical Engineering, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

2006-07-15

208

Experiments for foam model development and validation.  

SciTech Connect

A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

2008-09-01

209

Computer thermal modeling for the Salt Block II experiment  

SciTech Connect

The Salt Block II experiment consisted of a cylindrical block of bedded salt which was heated from within by a cylindrical electric heater. It was an extensively instrumented laboratory experiment that served, among other things, as a touchstone against which to measure the validity of a computer thermal model. The thermal model consisted of 282 nodes joined by 572 conductors, and was constructed for use with the CINDA heat transfer code. Both transient and steady-state temperature distributions within the salt were computed for heater power levels of 200, 400, 600, 1000 and 1500 watts. Temperature versus time plots are presented for 23 locations throughout the Block over a 58-day period. Comparisons of the model results and experimental results are shown for both transient and steady-state conditions. The computed steady-state results were used to develop equations describing both the temperature and the temperature derivative as functions of radial location.
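For an idealized homogeneous cylinder heated from within, the steady-state temperature and its radial derivative have the textbook logarithmic form, which is the kind of expression the report fitted. A sketch with illustrative values (not the experiment's coefficients):

    import numpy as np

    def radial_steady_T(r, q, k, L, r_in, T_in):
        """Steady-state conduction in a hollow cylinder heated from within:
        T(r) = T_in - q / (2 pi k L) * ln(r / r_in)."""
        return T_in - q / (2.0 * np.pi * k * L) * np.log(r / r_in)

    # Illustrative values: 1000 W heater, salt conductivity ~5 W/(m K)
    r = np.linspace(0.05, 0.5, 5)
    print(radial_steady_T(r, q=1000.0, k=5.0, L=1.0, r_in=0.05, T_in=120.0))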

George, O.L. Jr.

1980-10-01

210

Antibody modeling assessment II. Structures and models.  

PubMed

To assess the state-of-the-art in antibody structure modeling, a blinded study was conducted. Eleven unpublished Fab crystal structures were used as a benchmark to compare Fv models generated by seven structure prediction methodologies. In the first round, each participant submitted three non-ranked complete Fv models for each target. In the second round, CDR-H3 modeling was performed in the context of the correct environment provided by the crystal structures with CDR-H3 removed. In this report we describe the reference structures and present our assessment of the models. Some of the essential sources of errors in the predictions were traced to the selection of the structure template, both in terms of the CDR canonical structures and VL/VH packing. On top of this, the errors present in the Protein Data Bank structures were sometimes propagated in the current models, which emphasizes the need for a curated structural database devoid of errors. Modeling non-canonical structures, including CDR-H3, remains the biggest challenge for antibody structure prediction. PMID:24633955

Teplyakov, Alexey; Luo, Jinquan; Obmolova, Galina; Malia, Thomas J; Sweet, Raymond; Stanfield, Robyn L; Kodangattil, Sreekumar; Almagro, Juan Carlos; Gilliland, Gary L

2014-08-01

211

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

Ko, Y.-C. [Nuclear Science and Engineering Department, MIT, Cambridge, MA 02139 (United States); Hu, L.-W. [Nuclear Reactor Laboratory, MIT, Cambridge, MA 02139 (United States)], E-mail: lwhu@mit.edu; Olson, Arne P.; Dunn, Floyd E. [RERTR Program, Argonne National Laboratory, Argonne, IL 60439 (United States)

2008-07-15

212

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU.  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

Ko, Y. C.; Hu, L. W.; Olson, A. P.; Dunn, F. E.; Nuclear Engineering Division; MIT

2007-01-01

213

Validation of green tea polyphenol biomarkers in a phase II human intervention trial  

Microsoft Academic Search

Health benefits of green tea polyphenols (GTPs) have been reported in many animal models, but human studies are inconclusive. This is partly due to a lack of biomarkers representing green tea consumption. In this study, GTP components and metabolites were analyzed in plasma and urine samples collected from a phase II intervention trial carried out in 124 healthy adults who

Jia-Sheng Wang; Haitao Luo; Piwen Wang; Lili Tang; Jiahua Yu; Tianren Huang; Stephen Cox; Weimin Gao

2008-01-01

214

Simultaneous model building and validation with uniform designs of experiments  

NASA Astrophysics Data System (ADS)

This article describes an implementation of a particular design of experiment (DoE) plan based upon optimal Latin hypercubes that have certain space-filling and uniformity properties with the goal of maximizing the information gained. The feature emphasized here is the concept of simultaneous model building and model validation plans whose union contains the same properties as the component sets. Two Latin hypercube DoE are constructed simultaneously for use in a meta-modelling context for model building and model validation. The goal is to optimize the uniformity of both sets with respect to space-filling properties of the designs whilst satisfying the key concept that the merged DoE, comprising the union of build and validation sets, has similar space-filling properties. This represents a development of an optimal sampling approach for the first iteration, the initial model building and validation where most information is gained, in order to take full advantage of parallel computing. A permutation genetic algorithm using several genetic operator strategies is implemented in which fitness evaluation is based upon the Audze-Eglais potential energy function, and an example is presented based upon the well-known six-hump camel back function. The relative efficiency of the strategies and the associated computational aspects are discussed with respect to the quality of the designs obtained. The requirement for such design approaches arises from the need for multiple calls to traditionally expensive system and discipline analyses within iterative multi-disciplinary optimisation frameworks.
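The Audze-Eglais potential used for fitness evaluation is a short computation over all point pairs of a design. A minimal sketch that scores random Latin hypercubes with it; a crude random search stands in for the paper's permutation genetic algorithm:

    import numpy as np
    from itertools import combinations

    def audze_eglais(points):
        """Sum of inverse squared pairwise distances; lower is more uniform."""
        return sum(1.0 / np.sum((points[i] - points[j]) ** 2)
                   for i, j in combinations(range(len(points)), 2))

    def random_latin_hypercube(n, k, rng):
        """n points in k dimensions; each coordinate is a permuted level."""
        return np.column_stack([rng.permutation(n) for _ in range(k)]) / (n - 1.0)

    rng = np.random.default_rng(2)
    # Crude random search in place of the paper's permutation GA
    best = min((random_latin_hypercube(10, 2, rng) for _ in range(200)),
               key=audze_eglais)
    print(f"best potential found: {audze_eglais(best):.3f}")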

Narayanan, A.; Toropov, V. V.; Wood, A. S.; Campean, I. F.

2007-07-01

215

Nonequilibrium stage modelling of dividing wall columns and experimental validation  

Microsoft Academic Search

Dealing with complex process units like dividing wall columns pushes the focus on the determination of suitable modelling approaches. For this purpose a nonequilibrium stage model is developed. The successful validation is achieved by an experimental investigation of fatty alcohol mixtures under vacuum conditions at pilot scale. The aim is the recovery of high-purity products. The proposed model predicts the

Christoph Hiller; Christina Buck; Christoph Ehlers; Georg Fieg

2010-01-01

216

Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models  

SciTech Connect

One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from the University of British Columbia for purposes of benchmarking the UBC model against other codes. During discussions on recovery boiler modeling over the course of this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABBCE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.

1997-07-01

217

Using the split Hopkinson pressure bar to validate material models.  

PubMed

This paper discusses the use of the split Hopkinson bar with particular reference to the requirements of materials modelling at QinetiQ, namely to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. To have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock-wave loading. This requires understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress vs. strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One must take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself a valuable validation test of the material model. PMID:25071238
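The classical one-wave SHPB reduction that the authors caution against applying uncritically consists of three standard relations: specimen strain rate from the reflected wave, strain by time integration, and stress from the transmitted wave. A minimal sketch with toy pulse shapes and illustrative bar properties (it assumes the stress equilibrium that the paper notes is often violated for non-metals):

    import numpy as np

    def shpb_classical(eps_r, eps_t, dt, c0, E, A_bar, A_spec, L_spec):
        """Classical (one-wave) SHPB analysis from bar strain-gauge signals.

        eps_r: reflected-wave strain history in the input bar
        eps_t: transmitted-wave strain history in the output bar
        """
        strain_rate = -2.0 * c0 * eps_r / L_spec
        strain = np.cumsum(strain_rate) * dt
        stress = E * (A_bar / A_spec) * eps_t
        return strain, stress, strain_rate

    # Hypothetical steel bars and a small cylindrical specimen
    dt, n = 1e-7, 2000
    t = np.arange(n) * dt
    eps_r = -1e-3 * np.sin(np.pi * t / t[-1]) ** 2   # toy reflected pulse
    eps_t = 0.5e-3 * np.sin(np.pi * t / t[-1]) ** 2  # toy transmitted pulse
    strain, stress, rate = shpb_classical(eps_r, eps_t, dt,
                                          c0=5000.0, E=200e9,
                                          A_bar=3.1e-4, A_spec=7.9e-5,
                                          L_spec=5e-3)
    print(f"peak stress {stress.max()/1e6:.0f} MPa "
          f"at strain {strain[stress.argmax()]:.3f}")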

Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

2014-08-28

218

Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites  

NASA Technical Reports Server (NTRS)

This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

Turner, Travis L.

2001-01-01

219

Electro-thermal modelling of a supercapacitor and experimental validation  

NASA Astrophysics Data System (ADS)

This paper reports on the electro-thermal modelling of a Maxwell supercapacitor (SC), model BMOD0083 with a rated capacitance of 83 F and rated voltage of 48 V. One electrical equivalent circuit was used to model the electrical behaviour whilst another served to simulate the thermal behaviour. The models were designed to predict the SC operating voltage and temperature, by taking the electric current and ambient temperature as input variables. A five-stage iterative method, applied to three experiments, served to obtain the parameter values for each model. The models were implemented in MATLAB-Simulink®, where they interacted to reciprocally provide information. These models were then validated through a number of tests, subjecting the SC to different current and frequency profiles. These tests included the validation of a bank of supercapacitors integrated into an electric microgrid, in a real operating environment. Satisfactory results were obtained from the electric and thermal models, with RMSE values of less than 0.65 V in all validations.

Berrueta, Alberto; San Martín, Idoia; Hernández, Andoni; Ursúa, Alfredo; Sanchis, Pablo
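
The model structure summarized above, an electrical equivalent circuit coupled to a lumped thermal model and validated by RMSE against measurements, can be sketched minimally as follows. The first-order circuit and all parameter values are illustrative assumptions, not the identified BMOD0083 parameters or the paper's full five-stage method:

```python
import numpy as np

def simulate_sc(current, dt, C=83.0, R_s=0.012, R_th=3.0, C_th=600.0, T_amb=25.0):
    """Minimal coupled electro-thermal sketch of a supercapacitor.

    Electrical part: ideal capacitor C in series with resistance R_s.
    Thermal part: lumped node heated by I^2 * R_s losses, with thermal
    resistance R_th to ambient and heat capacity C_th. Parameter values
    are placeholders, not the identified BMOD0083 values.
    """
    v_c, T = 0.0, T_amb
    v_out, temps = [], []
    for i in current:
        v_c += i * dt / C                  # capacitor voltage integration
        T += dt * (i**2 * R_s - (T - T_amb) / R_th) / C_th
        v_out.append(v_c + i * R_s)        # terminal voltage
        temps.append(T)
    return np.array(v_out), np.array(temps)

# Example: 10 A charge for 60 s, then 60 s rest.
dt = 0.1
current = np.concatenate([np.full(600, 10.0), np.zeros(600)])
v, T = simulate_sc(current, dt)
# A validation run would compute np.sqrt(np.mean((v - v_measured) ** 2))
# against a measured voltage trace; RMSE < 0.65 V is the level reported above.
print(v[-1], T[-1])
```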

220

Climate Model Validation Using Spectrally Resolved Shortwave Radiation Measurements  

NASA Astrophysics Data System (ADS)

The climate science community has made significant strides in the development and improvement of Global Climate Models (GCMs) to predict how the Earth's climate system will change over the next several decades. It is crucial to evaluate how well these models reproduce observed climate variability using strict validation techniques to assist the climate modeling community with improving GCM prediction. The ability of climate models to simulate Earth's present-day climate is an initial evaluation of their ability to predict future changes in climate. Models are evaluated in several ways, including model intercomparison projects and comparing model simulations of physical variables with observations of those variables. We are developing new methods for rigorous climate model validation and physical attribution of the cause of model errors using existing, direct measurements of hyperspectral shortwave reflectance. We have also developed a SCIAMACHY-based (Scanning Imaging Absorption Spectrometer for Atmospheric Chartography) hyperspectral shortwave climate validation product and demonstrated its use for validating GCMs. We are also investigating the information content added by using multispectral and hyperspectral data to study climate variability and validate climate models. The goal is to determine whether it is necessary to use data with continuous spectral sampling across the shortwave spectral range, or whether it is sufficient to use a subset of carefully selected spectral bands (e.g. MODIS-like data) to study climate trends and evaluate climate model performance. We are carrying out this activity by comparing the information content within broadband, multispectral (discrete-band sampling), and hyperspectral (high spectral resolution with continuous spectral sampling) data sets. Changes in climate-relevant atmospheric and surface variables impact the spectral, spatial, and temporal variability of Earth-reflected solar radiation (0.3-2.5 µm) through spectrally dependent scattering and absorption processes. Previous studies have demonstrated that highly accurate, hyperspectral (spectrally contiguous and overlapping) measurements of shortwave reflectance are important for monitoring climate variability from space. We are continuing to work to demonstrate that highly accurate, high information content hyperspectral shortwave measurements can be used to detect changes in climate, identify climate variance drivers, and validate GCMs.

Roberts, Y.; Taylor, P. C.; Lukashin, C.; Feldman, D.; Pilewskie, P.; Collins, W.

2013-12-01

221

Criteria for Validating Mouse Models of Psychiatric Diseases  

PubMed Central

Animal models of human diseases are in widespread use for biomedical research. Mouse models with a mutation in a single gene or multiple genes are excellent research tools for understanding the role of a specific gene in the etiology of a human genetic disease. Ideally, the mouse phenotypes will recapitulate the human phenotypes exactly. However, exact matches are rare, particularly in mouse models of neuropsychiatric disorders. This article summarizes the current strategies for optimizing the validity of a mouse model of a human brain dysfunction. We address the common question raised by molecular geneticists and clinical researchers in psychiatry, “what is a ‘good enough’ mouse model”?

Chadman, Kathryn K.; Yang, Mu; Crawley, Jacqueline N.

2010-01-01

222

Validated intraclass correlation statistics to test item performance models.  

PubMed

A new method, with an application program in Matlab code, is proposed for testing item performance models on empirical databases. This method uses data intraclass correlation statistics as expected correlations, to which one compares simple functions of correlations between model predictions and observed item performance. The method rests on a data population model whose validity for the considered data is suitably tested, and it has been verified for three behavioural measure databases. Contrary to usual model selection criteria, this method provides an effective way of testing under-fitting and over-fitting, answering the usually neglected question: "does this model suitably account for these data?" PMID:21287127

Courrieu, Pierre; Brand-D'abrescia, Muriele; Peereman, Ronald; Spieler, Daniel; Rey, Arnaud

2011-03-01
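
A simplified reading of the idea above, using the data's reliability (an intraclass-correlation-style statistic) as the ceiling against which a model's prediction-observation correlation is judged, is sketched below. The formulas and synthetic data are illustrative and do not reproduce the authors' exact statistic:

```python
import numpy as np

def expected_correlation_ceiling(data: np.ndarray) -> float:
    """Estimate the expected model-data correlation from data reliability.

    `data` is a subjects x items matrix of performance scores. The
    variance of observed item means includes sampling noise; a model
    that predicted the true item effects perfectly would be expected to
    correlate with the observed means at roughly sqrt(reliability).
    """
    n_subj = data.shape[0]
    item_var = np.var(data.mean(axis=0), ddof=1)   # variance of observed item means
    noise_var = np.mean(np.var(data, axis=0, ddof=1)) / n_subj
    reliability = max(0.0, 1.0 - noise_var / item_var)
    return float(np.sqrt(reliability))

rng = np.random.default_rng(0)
true_difficulty = rng.normal(size=40)                            # 40 items
data = true_difficulty + rng.normal(scale=1.0, size=(30, 40))    # 30 subjects
predictions = true_difficulty + rng.normal(scale=0.3, size=40)   # some model

r_model = np.corrcoef(predictions, data.mean(axis=0))[0, 1]
print(f"model r = {r_model:.3f}, expected ceiling = {expected_correlation_ceiling(data):.3f}")
```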

223

A Mathematical Model for the Validation of Gene Selection Methods  

Microsoft Academic Search

Gene selection methods aim at determining biologically relevant subsets of genes in DNA microarray experiments. However, their assessment and validation represent a major difficulty since the subset of biologically relevant genes is usually unknown. To solve this problem a novel procedure for generating biologically plausible synthetic gene expression data is proposed. It is based on a proper mathematical model representing

Marco Muselli; Alberto Bertoni; Marco Frasca; Alessandro Beghini; Francesca Ruffino; Giorgio Valentini

2011-01-01

224

Validity of Cardiovascular Risk Prediction Models in Kidney Transplant Recipients  

PubMed Central

Background. Predicting cardiovascular risk is of great interest in renal transplant recipients since cardiovascular disease is the leading cause of mortality. Objective. To conduct a systematic review to assess the validity of cardiovascular risk prediction models in this population. Methods. Five databases were searched (MEDLINE, EMBASE, SCOPUS, CINAHL, and Web of Science) and cohort studies with at least one year of follow-up were included. Variables that described population characteristics, study design, and prognostic performance were extracted. The Quality in Prognostic Studies (QUIPS) tool was used to evaluate bias. Results. Seven studies met the criteria for inclusion, of which five investigated the Framingham risk score and three used a transplant-specific model. Sample sizes ranged from 344 to 23,575, and three studies lacked sufficient event rates to reach confident conclusions. Four studies reported discrimination (as measured by c-statistic), which ranged from 0.701 to 0.75, while only one risk model was both internally and externally validated. Conclusion. The Framingham risk score has underestimated cardiovascular events in renal transplant recipients, but these studies have not been robust. A risk prediction model has been externally validated at least on one occasion, but comprehensive validation in multiple cohorts and impact analysis are recommended before widespread clinical application is advocated.

Stewart, Samuel Alan; Shoker, Ahmed

2014-01-01
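
Since the review above reports discrimination as a c-statistic, a minimal sketch of that computation may help fix ideas. The risk scores and outcomes below are fabricated for illustration:

```python
import numpy as np

def c_statistic(risk_scores, events) -> float:
    """Concordance (c-statistic): the probability that a randomly chosen
    patient who had an event received a higher predicted risk than a
    randomly chosen patient who did not. Ties count one half."""
    risk_scores = np.asarray(risk_scores, float)
    events = np.asarray(events, bool)
    cases, controls = risk_scores[events], risk_scores[~events]
    diff = cases[:, None] - controls[None, :]   # all case-control pairs
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

# Illustrative data: predicted risks and observed events (1 = event).
risks  = [0.02, 0.10, 0.35, 0.08, 0.50, 0.22]
events = [0,    0,    1,    0,    1,    0]
print(c_statistic(risks, events))  # 1.0 here; 0.701-0.75 reported in the review
```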

225

Tuning, Validation, and Uncertainty Estimates for a Sound Exposure Model.  

National Technical Information Service (NTIS)

To tune and validate the performance of an acoustic model, sound signals were transmitted from a calibrated source, at varied mid-range frequencies, and received on a moored acoustic recording package at the edge of Tanner Bank near the Southern Californi...

F. J. Carmody

2011-01-01

226

A Model for Investigating Predictive Validity at Highly Selective Institutions.  

ERIC Educational Resources Information Center

A statistical model for investigating predictive validity at highly selective institutions is described. When the selection ratio is small, one must typically deal with a data set containing relatively large amounts of missing data on both criterion and predictor variables. Standard statistical approaches are based on the strong assumption that…

Gross, Alan L.; And Others

227

Linear Model to Assess the Scale's Validity of a Test  

ERIC Educational Resources Information Center

Wright and Stone proposed three features to assess the quality of the distribution of item difficulties in a test, on the so-called "most probable response map": line, stack, and gap. Once a line is accepted as a design model for a test, gaps and stacks are practically eliminated, producing evidence of the "scale validity" of the test.…

Tristan, Agustin; Vidal, Rafael

2007-01-01

228

MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES  

EPA Science Inventory

This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

229

Validating Work Discrimination and Coping Strategy Models for Sexual Minorities  

ERIC Educational Resources Information Center

The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

2009-01-01

230

Extended Validity of a Linearized Kinematic Model for Optimal Missile Avoidance.

National Technical Information Service (NTIS)

Optimal missile avoidance is analyzed with a two-dimensional linearized kinematic model. It is shown that inclusion of a control-effort penalization term in the payoff function extends the domain of validity of the trajectory linearization. The re...

Y. Rotsztein; J. Shinar

1980-01-01

231

Bibliometric Modeling Processes and the Empirical Validity of Lotka's Law.  

ERIC Educational Resources Information Center

Examines the elements involved in fitting a bibliometric model to empirical data, proposes a consistent methodology for applying Lotka's law, and presents the results of an empirical test of the methodology. The results are discussed in terms of the validity of Lotka's law and the suitability of the proposed methodology. (49 references) (CLB)

Nicholls, Paul Travis

1989-01-01

232

Defect distribution model validation and effective process control  

Microsoft Academic Search

Assumption of the underlying probability distribution is an essential part of effective process control. In this article, we demonstrate how to improve the effectiveness of equipment monitoring and process induced defect control through properly selecting, validating and using the hypothetical distribution models. The testing method is based on probability plotting, which is made possible through order statistics. Since each ordered

Lei Zhong

2003-01-01
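
The testing method named above, probability plotting through order statistics, can be sketched as follows. The exponential hypothesis, the plotting-position formula, and the acceptance threshold are illustrative assumptions, not the article's calibrated procedure:

```python
import numpy as np

def probplot_corr(sample) -> float:
    """Probability-plot correlation check of an exponential hypothesis.

    Sorts the sample (order statistics), assigns plotting positions,
    and correlates the empirical quantiles with exponential quantiles.
    A correlation near 1 supports the hypothesised distribution.
    """
    x = np.sort(np.asarray(sample, float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.5) / n   # simple plotting positions
    q = -np.log(1.0 - p)                  # exponential(1) quantiles
    return float(np.corrcoef(q, x)[0, 1])

rng = np.random.default_rng(1)
defects = rng.exponential(scale=4.0, size=50)   # synthetic defect densities
r = probplot_corr(defects)
# The 0.95 cut-off is an illustrative, uncalibrated threshold.
print(f"r = {r:.3f}; hypothesis {'retained' if r >= 0.95 else 'rejected'}")
```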

233

VALIDATION OF VOID COALESCENCE MODEL FOR DUCTILE DAMAGE.  

SciTech Connect

A model for void coalescence for ductile damage in metals is presented. The basic mechanism is void linking through an instability in the intervoid ligament. The formation probability of void clusters is calculated as a function of cluster size, imposed stress, and strain. Numerical approximations are validated in a 1-D hydrocode.

Tonks, D. L. (Davis L.); Zurek, A. K. (Anna K.); Thissell, W. R. (W. Richards)

2001-01-01

234

VIBROACOUSTIC MODEL VALIDATION FOR A CURVED HONEYCOMB COMPOSITE PANEL  

Microsoft Academic Search

Finite element and boundary element models are developed to investigate the vibroacoustic response of a curved honeycomb composite sidewall panel. Results from vibroacoustic tests conducted in the NASA Langley Structural Acoustic Loads and Transmission facility are used to validate the numerical predictions. The sidewall panel is constructed from a flexible honeycomb core sandwiched between carbon fiber reinforced composite laminate face

Ralph D. Buehrle; Jay H. Robinson; Ferdinand W. Grosveld

235

Validation and Test of a Land Use Plan Design Model.  

National Technical Information Service (NTIS)

The purpose of the project was to test and validate the use of random search technique as an optimization tool in a land use plan design model. It was accomplished through a set of controlled experiments performed with a hypothetical study area. A compute...

K. G. Sinha; J. T. Adamski; A. J. Hartmann

1973-01-01

236

ID Model Construction and Validation: A Multiple Intelligences Case  

ERIC Educational Resources Information Center

This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple intelligences and ID were examined to guide the development

Tracey, Monica W.; Richey, Rita C.

2007-01-01

237

Solar swimming pool heating: Description of a validated model  

SciTech Connect

In the framework of a European Demonstration Programme, co-financed by the CEC and national bodies, a model was developed and validated for open-air swimming pools having a minimal surface of 100 m² and a minimal depth of 0.5 m. The model consists of two parts: the energy balance of the pool and the solar plant. The theoretical background of the energy balance of an open-air swimming pool was found to be poor. Special monitoring campaigns were used to validate the dynamic model using mathematical parameter identification methods. The final model was simplified in order to shorten calculation time and to improve user-friendliness by reducing the input values to the most important ones. The programme is commercially available; however, it requires the hourly meteorological data of a test reference year (TRY) as an input. The users are mainly design engineers.

Haaf, W.; Luboschik, U.; Tesche, B. (IST Energietechnik GmbH, Hauptsitz Wollbach, Kandern (Germany))

1994-07-01

238

How to build valid and credible simulation models  

Microsoft Academic Search

In this tutorial we present techniques for building valid and credible simulation models. Ideas to be discussed include the importance of a definitive problem formulation, discussions with subject-matter experts, interacting with the decision-maker on a regular basis, development of a written assumptions document, structured walk-through of the assumptions document, use of sensitivity analysis to determine important model factors, and comparison

Averill M. Law

2006-01-01

239

How to build valid and credible simulation models  

Microsoft Academic Search

In this tutorial we present techniques for building valid and credible simulation models. Ideas to be discussed include the importance of a definitive problem formulation, discussions with subject-matter experts, interacting with the decision-maker on a regular basis, development of a written assumptions document, structured walk-through of the assumptions document, use of sensitivity analysis to determine important model factors, and comparison

Averill M. Law

2008-01-01

240

How to build valid and credible simulation models  

Microsoft Academic Search

In this tutorial we present techniques for building valid and credible simulation models. Ideas to be discussed include the importance of a definitive problem formulation, discussions with subject-matter experts, interacting with the decision-maker on a regular basis, development of a written assumptions document, structured walk-through of the assumptions document, use of sensitivity analysis to determine important model factors, and comparison

Averill M. Law

2005-01-01

241

Validation of Turandot, a fast processor model for microarchitecture exploration  

Microsoft Academic Search

We describe the results of validating the performance projections from a parameterized trace-driven simulation model of a speculative out-of-order superscalar processor, which has been developed with the objective of acting as a microarchitecture exploration tool. Because of this objective, the model, called Turandot, has been designed to deliver much higher simulation speed than is achieved from detailed (RTL) processor

Mayan Moudgill; Pradip Bose; Jaime H. Moreno

1999-01-01

242

A standardized approach to PV system performance model validation.  

SciTech Connect

PV performance models are used to predict how much energy a PV system will produce at a given location and subject to prescribed weather conditions. These models are commonly used by project developers to choose between module technologies and array designs (e.g., fixed tilt vs. tracking) for a given site or to choose between different geographic locations, and are used by the financial community to establish project viability. Available models can differ significantly in their underlying mathematical formulations and assumptions and in the options available to the analyst for setting up a simulation. Some models lack complete documentation and transparency, which can result in confusion on how to properly set up, run, and document a simulation. Furthermore, the quality and associated uncertainty of the available data upon which these models rely (e.g., irradiance, module parameters, etc.) is often quite variable and frequently undefined. For these reasons, many project developers and other industry users of these simulation tools have expressed concerns related to the confidence they place in PV performance model results. To address this problem, we propose a standardized method for the validation of PV system-level performance models and a set of guidelines for setting up these models and reporting results. This paper describes the basic elements for a standardized model validation process adapted especially for PV performance models, suggests a framework to implement the process, and presents an example of its application to a number of available PV performance models.

Stein, Joshua S.; Jester, Terry (Hudson Clean Energy Partners); Posbic, Jean (BP Solar); Kimber, Adrianne (First Solar); Cameron, Christopher P.; Bourne, Benjamin (SunPower Corporation)

2010-10-01
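
A standardized validation report of the kind proposed above typically reduces to a small set of agreement metrics between modeled and measured energy. This sketch computes normalized mean bias error and RMSE; the metric set and the monthly data are assumptions for illustration, not the paper's prescribed framework:

```python
import numpy as np

def validation_report(measured_kwh, modeled_kwh) -> dict:
    """Agreement metrics for a PV performance model run: mean bias
    error (MBE) and root-mean-square error (RMSE), both normalised by
    the mean measured energy so results are in percent."""
    m = np.asarray(measured_kwh, float)
    p = np.asarray(modeled_kwh, float)
    mbe = np.mean(p - m) / np.mean(m) * 100.0
    rmse = np.sqrt(np.mean((p - m) ** 2)) / np.mean(m) * 100.0
    return {"MBE_%": round(mbe, 2), "RMSE_%": round(rmse, 2)}

# Hypothetical monthly AC energy totals (kWh) for one year.
measured = [410, 450, 520, 600, 640, 660, 655, 630, 560, 490, 420, 390]
modeled  = [400, 460, 515, 615, 650, 648, 660, 620, 570, 480, 430, 385]
print(validation_report(measured, modeled))
```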

243

Finite Element Model Development and Validation for Aircraft Fuselage Structures  

NASA Technical Reports Server (NTRS)

The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.

Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

2000-01-01

244

Validating the BHR RANS model for variable density turbulence  

SciTech Connect

The BHR RANS model is a turbulence model for multi-fluid flows in which density variation plays a strong role in the turbulence processes. In this paper, the authors demonstrate the usefulness of BHR over a wide range of flows that include the effects of shear, buoyancy, and shocks. The results are in good agreement with experimental and DNS data across the entire set of validation cases, with no need to retune model coefficients between cases. The model has potential application to a number of aerospace-related flow problems.

Israel, Daniel M [Los Alamos National Laboratory; Gore, Robert A [Los Alamos National Laboratory; Stalsberg - Zarling, Krista L [Los Alamos National Laboratory

2009-01-01

245

Propeller aircraft interior noise model utilization study and validation  

NASA Technical Reports Server (NTRS)

Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

Pope, L. D.

1984-01-01

246

Propeller aircraft interior noise model utilization study and validation  

NASA Astrophysics Data System (ADS)

Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

Pope, L. D.

1984-09-01

247

Validation of the HOTCHAN code for analyzing the EBR-II core following an unprotected loss of flow  

Microsoft Academic Search

A series of reactor experiments involving unprotected (no scram) loss-of-primary flow (LOF) down to natural convection was successfully conducted in February and April of 1986 on the Experimental Breeder Reactor II (EBR-II). The predicted and measured behavior of a special instrumented assembly, the XX09-fueled INSAT, was compared for the most severe test (SHRT 45) to demonstrate the validation of the

D. Mohr; L. K. Chang; H. P. Planchon

1988-01-01

248

Validation of the HOTCHAN code for analyzing the EBR-II driver following loss of flow without scram  

Microsoft Academic Search

A series of experiments involving unprotected (no scram) loss-of-primary flow (LOF) down to natural convection was successfully conducted in February 1985 on the Experimental Breeder Reactor-II (EBR-II). The predicted and measured behavior of a special instrumented assembly, the XX09 fueled INSAT, is compared for the most severe test in the group to demonstrate the validation of the thermal-hydraulic code HOTCHAN.

D. Mohr; L. K. Chang; P. R. Betten; E. E. Feldman; H. P. Planchon

1987-01-01

249

A process improvement model for software verification and validation  

NASA Technical Reports Server (NTRS)

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

Callahan, John; Sabolish, George

1994-01-01

250

A process improvement model for software verification and validation  

NASA Technical Reports Server (NTRS)

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

Callahan, John; Sabolish, George

1994-01-01

251

Validation of Knowledge Acquisition for Surgical Process Models  

PubMed Central

Objective: Surgical Process Models (SPMs) are models of surgical interventions. The objectives of this study are to validate acquisition methods for Surgical Process Models and to assess the performance of different observer populations. Design: The study examined 180 SPMs of simulated Functional Endoscopic Sinus Surgeries (FESS), recorded with observation software. About 150,000 single measurements in total were analyzed. Measurements: Validation metrics were used for assessing the granularity, content accuracy, and temporal accuracy of structures of SPMs. Results: Differences between live observations and video observations are not statistically significant. Observations performed by subjects with medical backgrounds gave better results than observations performed by subjects with technical backgrounds. Granularity was reconstructed correctly in 90% of cases, content in 91%, and the mean temporal accuracy was 1.8 s. Conclusion: The study shows the validity of video as well as live observations for modeling Surgical Process Models. For routine use, the authors recommend live observations due to their flexibility and effectiveness. If high precision is needed or the SPM parameters are altered during the study, video observations are the preferable approach.

Neumuth, Thomas; Jannin, Pierre; Strauss, Gero; Meixensberger, Juergen; Burgert, Oliver

2009-01-01
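
Of the validation metrics listed above, temporal accuracy is the easiest to make concrete. The sketch below computes a mean boundary deviation between reference and observed activity annotations; the annotation format and activity names are hypothetical:

```python
import numpy as np

def temporal_accuracy(reference: dict, observed: dict) -> float:
    """Mean absolute deviation (seconds) between reference and observed
    activity boundaries, matched by activity label. A toy version of a
    temporal-accuracy metric; the study's full metric set also covers
    granularity and content accuracy."""
    errors = [abs(reference[k][0] - observed[k][0]) +
              abs(reference[k][1] - observed[k][1])
              for k in reference if k in observed]
    return float(np.mean(errors)) / 2.0   # average over start and stop

# Hypothetical activity -> (start_s, stop_s) annotations for one FESS case.
reference = {"incision": (0, 45), "dissection": (45, 300), "closure": (300, 380)}
observed  = {"incision": (2, 44), "dissection": (47, 303), "closure": (301, 382)}
print(round(temporal_accuracy(reference, observed), 2), "s mean deviation")
```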

252

The validation of three human reliability quantification techniques--THERP, HEART and JHEDI: Part II--Results of validation exercise.  

PubMed

This is the second of three papers dealing with the validation of three Human Reliability Assessment (HRA) techniques. The first paper introduced the need for validation, the techniques themselves, and pertinent validation issues. This second paper details the results of the validation study carried out on the human reliability quantification techniques THERP, HEART and JHEDI. The validation study used 30 real Human Error Probabilities (HEPs) and 30 active HRA assessors, 10 per technique. The results were that 23 of the assessors showed a significant correlation between their estimates and the real HEPs, supporting the predictive accuracy of the techniques. Overall precision showed 72% (60-87%) of all HEP estimates to be within a factor of 10 of the true HEPs, with 38% of all estimates being within a factor of three of the true values. When imprecise, the techniques also tended to be pessimistic rather than optimistic. These results lend support to the empirical validity of these three approaches. PMID:9414337

Kirwan, B; Kennedy, R; Taylor-Adams, S; Lambert, B

1997-02-01
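
The precision figures quoted above (share of estimates within a factor of 3 or 10 of the true HEPs, plus correlation with the true values) correspond to simple computations, sketched here on fabricated numbers. Taking the correlation in log space is an assumption on our part, since HEPs span several orders of magnitude:

```python
import numpy as np

def precision_summary(estimated, true) -> dict:
    """Precision measures of the kind used in HRA validation studies:
    the share of estimates within a factor of 3 and of 10 of the true
    HEP, plus the correlation of log-probabilities."""
    e, t = np.asarray(estimated, float), np.asarray(true, float)
    ratio = np.maximum(e / t, t / e)   # symmetric factor-of-N test
    return {
        "within_x3":  float(np.mean(ratio <= 3.0)),
        "within_x10": float(np.mean(ratio <= 10.0)),
        "log_corr":   float(np.corrcoef(np.log10(e), np.log10(t))[0, 1]),
    }

true_heps      = [1e-3, 5e-3, 2e-2, 1e-1, 3e-4]   # fabricated values
estimated_heps = [3e-3, 4e-3, 9e-2, 6e-2, 1e-3]
print(precision_summary(estimated_heps, true_heps))
```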

253

A verification and validation process for model-driven engineering  

NASA Astrophysics Data System (ADS)

Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

Delmas, R.; Pires, A. F.; Polacsek, T.

2013-12-01

254

A Methodical Approach for Developing Valid Human Performance Models of Flight Deck Operations  

Microsoft Academic Search

Validation is critically important when human performance models are used to predict the effect of future system designs on human performance. A model of flight deck operations was validated using a rigorous, iterative, model validation process. The process included the validation of model inputs (task trace and model input parameters), process models (workload, perception, and visual attention) and model outputs

Brian F. Gore; Becky L. Hooey; Nancy Haan; Deborah L. Bakowski; Eric Mahlstedt

255

Validation results of wind diesel simulation model TKKMOD  

NASA Astrophysics Data System (ADS)

The document summarizes the results of the TKKMOD validation procedure. TKKMOD is a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout. The model has been included in the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project Engineering Design Tools for Wind-Diesel Systems (JOUR-0078). The simulation model is utilized for calculation of the long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, energy losses in the system components, diesel fuel consumption, and the number of diesel engine starts. The work has been funded through the Finnish Advanced Energy System R&D Programme (NEMO). The validation has been performed using data from EFI (Norwegian Electric Power Institute), since data from the Finnish reference system are not yet available. The EFI system has a slightly different configuration with similar overall operating principles and approximately the same battery capacity. The validation data set, 394 hours of measured data, is from the first prototype wind-diesel system on the island of Frøya off the Norwegian coast.

Manninen, L. M.

256

Validation of the SUNY Satellite Model in a Meteosat Environment

SciTech Connect

The paper presents a validation of the SUNY satellite-to-irradiance model against four ground-truth stations from the Indian solar radiation network located in and around the province of Rajasthan, India. The SUNY model had initially been developed and tested to process US weather satellite data from the GOES series and has been used as part of the production of the US National Solar Radiation Data Base (NSRDB). Here the model is applied to process data from the European weather satellites Meteosat 5 and 7.

Perez, R.; Schlemmer, J.; Renne, D.; Cowlin, S.; George, R.; Bandyopadhyay, B.

2009-01-01

257

Verifying and Validating Proposed Models for FSW Process Optimization  

NASA Technical Reports Server (NTRS)

This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve it. Studies are ongoing to validate and refine the model of metal flow in the FSW process. Slides show the conventional FSW process, two weld tool designs, and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) microstructure features, (2) flow streamlines, (3) steady-state nature, and (4) grain refinement mechanisms.

Schneider, Judith

2008-01-01

258

Validating and Tuning Physical Models With Multiple Data Sources (Invited)  

NASA Astrophysics Data System (ADS)

The conventional process of validating ionospheric models is to compare model and observations and adjust appropriate parameters to bring them into agreement. To tune or validate a coupled thermosphere ionosphere model requires multiple data sources, since comparison with one source is usually not sufficient to uniquely constrain the model forcing: matching one parameter can be achieved in a variety of ways. For instance, the ionospheric peak height, hmF2, can be raised either by increasing heat sources, and so adjusting the thermal expansion, or by decreasing turbulent mixing. Decreased mixing in the lower thermosphere tends to increase the ratio of neutral atomic oxygen to molecular nitrogen in the mid and upper thermosphere, which increases the scale height and causes the atmosphere to expand. The balance between EUV and high latitude heat sources affects neutral winds, which also play a key role in hmF2. The standalone Field Line Interhemispheric Plasma (FLIP) model has algorithms that can adjust empirical neutral winds and composition to tune the model to match hmF2 and NmF2 measurements from ionosondes and radars. Tuning coupled thermosphere ionosphere models, on the other hand, requires adjusting external forcing within its uncertainties, such as magnetospheric convection for Joule heating, solar EUV fluxes for dissociation, ionization, and heating, and lower atmosphere waves and turbulence. The Coupled Thermosphere Ionosphere Plasmasphere electrodynamics (CTIPe) model has been used with a range of data sources, and compared with FLIP, to determine whether a unique combination of external sources can bring model and data into agreement. Once a coupled model and a standalone ionosphere model have been tuned to match observations, the neutral parameters from the coupled model can be compared with those required in the standalone ionosphere model. The comparison imposes an important constraint on the tuning and contributes to our understanding of the physical processes.

Fuller-Rowell, T. J.; Fedrizzi, M.; Codrescu, M.; Richards, P. G.

2009-12-01

259

Approaches to Validation of Models for Low Gravity Fluid Behavior  

NASA Technical Reports Server (NTRS)

This paper details the authors' experiences with the validation of computer models to predict low-gravity fluid behavior. It reviews the literature of low-gravity fluid behavior as a starting point for developing a baseline set of test cases, and examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues are that: most of the data are described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

2005-01-01

260

Rationality Validation of a Layered Decision Model for Network Defense  

SciTech Connect

We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

Wei, Huaqiang; Alves-Foss, James; Zhang, Du; Frincke, Deb

2007-08-31

261

A model for the separation of cloud and aerosol in SAGE II occultation data  

NASA Technical Reports Server (NTRS)

The Stratospheric Aerosol and Gas Experiment (SAGE) II satellite experiment measures the extinction due to aerosols and thin cloud, at wavelengths of 0.525 and 1.02 micrometers, down to an altitude of 6 km. The wavelength dependence of the extinction due to aerosols differs from that of the extinction due to cloud and is used as the basis of a model for separating these two components. The model is presented and its validation using airborne lidar data, obtained coincident with SAGE II observations, is described. This comparison shows that smaller SAGE II cloud extinction values correspond to the presence of subvisible cirrus cloud in the lidar record. Examples of aerosol and cloud data products obtained using this model to interpret SAGE II upper tropospheric and lower stratospheric data are also shown.

Kent, G. S.; Winker, D. M.; Osborn, M. T.; Skeens, K. M.

1993-01-01
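
The separation principle described above, that aerosol extinction is markedly wavelength-dependent while cloud extinction is nearly wavelength-neutral between 0.525 and 1.02 micrometers, can be caricatured as a two-wavelength ratio test. The threshold and sample values are illustrative only; the actual SAGE II model is more careful than this:

```python
import numpy as np

def classify_extinction(ext_525, ext_1020, ratio_threshold=2.0):
    """Separate aerosol from cloud using the 0.525/1.02 um extinction
    ratio: small aerosol particles extinguish relatively more at the
    shorter wavelength (ratio well above 1), while cloud particles are
    large enough that extinction is nearly wavelength-neutral (ratio
    near 1). The threshold here is an illustrative assumption."""
    ratio = np.asarray(ext_525, float) / np.asarray(ext_1020, float)
    return np.where(ratio >= ratio_threshold, "aerosol", "cloud")

ext_525  = [4.0e-4, 1.1e-3, 2.1e-3]   # km^-1, hypothetical samples
ext_1020 = [1.0e-4, 1.0e-3, 2.0e-3]
print(classify_extinction(ext_525, ext_1020))   # ['aerosol' 'cloud' 'cloud']
```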

262

Finite element modeling for validation of structural damage identification experimentation.  

SciTech Connect

The project described in this report was performed to couple experimental and analytical techniques in the field of structural health monitoring and damage identification. To do this, a finite element model was constructed of a simulated three-story building used for damage identification experiments. The model was used in conjunction with data from the physical structure to research damage identification algorithms. Of particular interest was modeling slip in joints as a function of bolt torque and predicting the smallest change of torque that could be detected experimentally. After being validated with results from the physical structure, the model was used to produce data to test the capabilities of damage identification algorithms. This report describes the finite element model constructed, the results obtained, and proposed future use of the model.

Stinemates, D. W. (Daniel W.); Bennett, J. G. (Joel G.)

2001-01-01

263

Validity and reliability multiple intelligent item using rasch measurement model  

Microsoft Academic Search

This study was undertaken to produce empirical evidence of the validity and reliability of items in a Multiple Intelligences (MI) survey questionnaire, analyzed using the Rasch model for polytomous data with the aid of Winsteps software. The questionnaire was administered to 179 students from Selangor using the MI instrument @ e-MyMICA. The results showed that all the PTMEA Corr values are positive, where an item

Siti Rahayah Ariffin; Bishanani Omar; Anita Isa; Sharida Sharif

2010-01-01

264

Validation of General Circulation Model Radiative Fluxes Using Surface Observations  

Microsoft Academic Search

The surface radiative fluxes of the ECHAM3 General Circulation Model (GCM) with T21, T42, and T106 resolutions have been validated using observations from the Global Energy Balance Archive (GEBA, World Climate Program-Water Project A7). GEBA contains the most comprehensive dataset now available for worldwide instrumentally measured surface energy fluxes. The GCM incoming shortwave radiation at the surface has been

Martin Wild; Atsumu Ohmura; Hans Gilgen; Erich Roeckner

1995-01-01

265

Validation of general circulation model radiative fluxes using surface observations  

Microsoft Academic Search

The surface radiative fluxes of the ECHAM3 General Circulation Model (GCM) with T21, T42, and T106 resolutions have been validated using observations from the Global Energy Balance Archive (GEBA, World Climate Program-Water Project A7). GEBA contains the most comprehensive dataset now available for worldwide instrumentally measured surface energy fluxes. The GCM incoming shortwave radiation at the surface has been compared

M. Wild; A. Ohmura; H. Gilgen

1995-01-01

266

Validating Complex Construction Simulation Models Using 3D Visualization  

Microsoft Academic Search

One of the primary impediments to the use of discrete-event simulation to plan and design construction operations is that decision-makers often do not have the means, the knowledge, and/or the time to check the veracity and validity of simulation models, and thus have little confidence in the results. Visualizing simulated operations in 3D can be of substantial help in the

Vineet R. Kamat; Julio C. Martinez

2003-01-01

267

ENERGETIC MATERIAL RESPONSE IN A COOKOFF MODEL VALIDATION EXPERIMENT  

Microsoft Academic Search

The cookoff experiments described in this paper belong to the small-scale experimental portion of a three-year phased study of the slow cookoff problem. This paper presents the response of three energetic materials in a small-scale cookoff experiment. The experimental effort is being used to validate the cookoff models currently under development by the Department of Energy (DOE). In this phase

A. I. Atwood; P. O. Curran; D. T. Bui; T. L. Boggs; K. B. Lee

268

Validation of detailed thermal hydraulic models used for LMR safety and for improvement of technical specifications  

SciTech Connect

Detailed steady-state and transient coolant temperatures and flow rates from an operating reactor have been used to validate the multiple pin model in the SASSYS-1 liquid metal reactor systems analysis code. This multiple pin capability can be used for explicit calculations of axial and lateral temperature distributions within individual subassemblies. Thermocouples at a number of axial locations and in a number of different coolant sub-channels in the XX09 instrumented subassembly in the EBR-II reactor provided temperature data from the Shutdown Heat Removal Test (SHRT) series. Flow meter data for XX09 and for the overall system are also available from these tests. Results of consistent SASSYS-1 multiple pin analyses for both the SHRT-45 loss-of-flow-without-scram test and the SHRT-17 protected loss-of-flow test agree well with the experimental data, providing validation of the SASSYS-1 code over a wide range of conditions.

Dunn, F.E.

1995-12-31

269

In-Drift Microbial Communities Model Validation Calculations  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

D. M. Jolley

2001-09-24

270

In-Drift Microbial Communities Model Validation Calculation  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

D. M. Jolley

2001-10-31

271

IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS  

SciTech Connect

The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

D.M. Jolley

2001-12-18

272

The two-track model of bereavement questionnaire (TTBQ): development and validation of a relational measure.  

PubMed

The Two-Track Model of Bereavement Questionnaire (TTBQ) was designed to assess response to loss over time. Respondents were 354 persons who completed the 70-item self-report questionnaire constructed in accordance with the Two-Track Model of Bereavement. Track I focuses on the bereaved's biopsychosocial functioning, and Track II concerns the bereaved's ongoing relationship to the range of memories, images, thoughts, and feeling states associated with the deceased. Factor analysis identified 5 factors that accounted for 51% of the variance. In accord with the theoretical and clinical model, 3 factors were primarily associated with the relationship to the deceased (Track II): Active Relational Grieving, Close and Positive Relationship, and Conflictual Relationship; and 2 factors with aspects of functioning (Track I): General Biopsychosocial Functioning and Traumatic Perception of the Loss. Construct and concurrent validity were examined and found satisfactory. Differences by kinship, cause of death, gender, and time elapsed were examined across the 5 factors, the total TTBQ, and the ITG. The new measure is shown to have both construct and concurrent validity. Discussion of the results and implications for the measurement of response to loss concludes the article. PMID:19368062

Rubin, Simon Shimshon; Nadav, Ofri Bar; Malkinson, Ruth; Koren, Dan; Goffer-Shnarch, Moran; Michaeli, Ella

2009-04-01
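
The "5 factors accounting for 51% of the variance" result above can be approximated, for intuition, by looking at the leading eigenvalues of the item correlation matrix. This PCA-style sketch on synthetic 354 x 70 data is not the study's factor-analysis procedure, which involves its own extraction and rotation choices:

```python
import numpy as np

def variance_explained(responses, n_factors=5) -> float:
    """Proportion of total variance carried by the first `n_factors`
    principal axes of the item correlation matrix. A PCA-style
    approximation for illustration only."""
    corr = np.corrcoef(np.asarray(responses, float), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return float(eigvals[:n_factors].sum() / eigvals.sum())

rng = np.random.default_rng(42)
latent = rng.normal(size=(354, 5))        # 5 hypothetical latent factors
loadings = rng.normal(size=(5, 70))       # 70 questionnaire items
responses = latent @ loadings + rng.normal(scale=4.0, size=(354, 70))
print(f"{variance_explained(responses):.0%} of variance in first 5 components")
```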

273

A community diagnostic tool for chemistry climate model validation  

NASA Astrophysics Data System (ADS)

This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

Gettelman, A.; Eyring, V.; Fischer, C.; Shiona, H.; Cionni, I.; Neish, M.; Morgenstern, O.; Wood, S. W.; Li, Z.

2012-09-01

274

A community diagnostic tool for Chemistry Climate Model Validation  

NASA Astrophysics Data System (ADS)

This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The tool can also compute quantitative performance metrics. The initial construction and application is to coupled Chemistry-Climate Models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool is supporting model development as well as quantifying model improvements, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth System. User modifications are encouraged and easy to perform with a minimum of coding.

Gettelman, A.; Eyring, V.; Fischer, C.; Shiona, H.; Cionni, I.; Neish, M.; Morgenstern, O.; Wood, S. W.; Li, Z.

2012-05-01

275

Objective Model Validation and Application of Empirical Liquefaction Models  

Microsoft Academic Search

Empirical Liquefaction Models (ELMs) are the standard approach for predicting the occurrence of soil liquefaction. These models are typically based on in situ index tests, such as the Standard Penetration Test (SPT) and Cone Penetration Test (CPT), and are broadly classified as deterministic and probabilistic models. No objective and quantitative comparison of these models has been published. Similarly, no rigorous procedure has been published

Thomas Oommen; Laurie G. Baise; Richard Vogel

276

Modeling, simulation and validation of 14 DOF full vehicle model  

Microsoft Academic Search

An accurate full vehicle model is required to represent the behavior of the vehicle in order to design vehicle control systems such as yaw control, antiroll control, automated highway systems, etc. There are many vehicle models built for the study of vehicle dynamics, specifically for ride and handling behavior. This paper describes the vehicle model development of the

Joga Dharma Setiawan; Mochamad Safarudin; Amrik Singh

2009-01-01

277

Shoulder model validation and joint contact forces during wheelchair activities  

PubMed Central

Chronic shoulder impingement is a common problem for manual wheelchair users. The loading associated with performing manual wheelchair activities of daily living is substantial and often at a high frequency. Musculoskeletal modeling and optimization techniques can be used to estimate the joint contact forces occurring at the shoulder to assess the soft tissue loading during an activity and to possibly identify activities and strategies that place manual wheelchair users at risk for shoulder injuries. The purpose of this study was to validate an upper extremity musculoskeletal model and apply the model to wheelchair activities for analysis of the estimated joint contact forces. Upper extremity kinematics and handrim wheelchair kinetics were measured over three conditions: level propulsion, ramp propulsion, and a weight relief lift. The experimental data were used as input to a subject-specific musculoskeletal model utilizing optimization to predict joint contact forces of the shoulder during all conditions. The model was validated using a mean absolute error calculation. Model results confirmed that ramp propulsion and weight relief lifts place the shoulder under significantly higher joint contact loading than level propulsion. In addition, they exhibit large superior contact forces that could contribute to impingement. This study highlights the potential impingement risk associated with both the ramp and weight relief lift activities. Level propulsion was shown to have a low relative risk of causing injury, but with consideration of the frequency with which propulsion is performed, this observation is not conclusive.

Morrow, Melissa M.B.; Kaufman, Kenton R.; An, Kai-Nan

2010-01-01

278

Open-Source MFIX-DEM Software for Gas-Solids Flows: Part II - Validation Studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas–solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen

2012-04-01

279

Validation of the WATEQ4 geochemical model for uranium  

SciTech Connect

As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO2(OH)2·H2O), UO2(OH)2, and rutherfordine (UO2CO3) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.
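
For context, a solubility control of the kind discussed here is typically judged via the saturation index, SI = log10(IAP/Ksp). A minimal sketch with hypothetical activity products (not the study's values):

    import math

    def saturation_index(iap, ksp):
        """SI = log10(IAP / Ksp); SI near zero suggests the solid controls solubility."""
        return math.log10(iap / ksp)

    # Hypothetical ion activity product vs. solubility product for a uranyl solid
    print(saturation_index(iap=10**-22.8, ksp=10**-23.0))   # ~0.2 -> near saturation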

Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

1983-09-01

280

Aggregating Validity Indicators Embedded in Conners' CPT-II Outperforms Individual Cutoffs at Separating Valid from Invalid Performance in Adults with Traumatic Brain Injury.  

PubMed

Continuous performance tests (CPT) provide a useful paradigm to assess vigilance and sustained attention. However, few established methods exist to assess the validity of a given response set. The present study examined embedded validity indicators (EVIs) previously found effective at dissociating valid from invalid performance in relation to well-established performance validity tests in 104 adults with TBI referred for neuropsychological testing. Findings suggest that aggregating EVIs increases their signal detection performance. While individual EVIs performed well at their optimal cutoffs, two specific combinations of these five indicators generally produced the best classification accuracy. A CVI-5A score of ≥3 had a specificity of .92-.95 and a sensitivity of .45-.54. At a cutoff of ≥4, the CVI-5B had a specificity of .94-.97 and a sensitivity of .40-.50. The CVI-5s provide a single numerical summary of the cumulative evidence of invalid performance within the CPT-II. Results support the use of a flexible, multivariate approach to performance validity assessment. PMID:24957927
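
A minimal sketch of aggregating embedded validity indicators into a composite count, in the spirit of the CVI-5 approach described above; the scores and cutoffs are hypothetical:

    def evi_count(scores, cutoffs):
        """Number of embedded validity indicators at or beyond their individual cutoffs."""
        return sum(s >= c for s, c in zip(scores, cutoffs))

    def flag_invalid(scores, cutoffs, k):
        """Flag the response set when k or more EVIs are positive (e.g. k = 3 or 4)."""
        return evi_count(scores, cutoffs) >= k

    # Hypothetical binary EVI hits for one examinee against five unit cutoffs
    print(flag_invalid([1, 0, 1, 1, 0], [1, 1, 1, 1, 1], k=3))   # True -> flagged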

Erdodi, Laszlo A; Roth, Robert M; Kirsch, Ned L; Lajiness-O'neill, Renee; Medoff, Brent

2014-08-01

281

Experimental validation of flexible robot arm modeling and control  

NASA Technical Reports Server (NTRS)

Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

Ulsoy, A. Galip

1989-01-01

282

Validation Analysis of the Shoal Groundwater Flow and Transport Model  

SciTech Connect

Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of groundwater withdrawal activities in the area. The conceptual and numerical models were developed based upon regional hydrogeologic investigations conducted in the 1960s, site characterization investigations (including ten wells and various geophysical and geologic studies) at Shoal itself prior to and immediately after the test, and two site characterization campaigns in the 1990s for environmental restoration purposes (including eight wells and a year-long tracer test). The new wells are denoted MV-1, MV-2, and MV-3, and are located to the north-northeast of the nuclear test. The groundwater model was generally lacking data in the north-northeastern area; only HC-1 and the abandoned PM-2 wells existed in this area. The wells provide data on fracture orientation and frequency, water levels, hydraulic conductivity, and water chemistry for comparison with the groundwater model. A total of 12 real-number validation targets were available for the validation analysis, including five values of hydraulic head, three hydraulic conductivity measurements, three hydraulic gradient values, and one angle value for the lateral gradient in radians. In addition, the fracture dip and orientation data provide comparisons to the distributions used in the model and radiochemistry is available for comparison to model output. Goodness-of-fit analysis indicates that some of the model realizations correspond well with the newly acquired conductivity, head, and gradient data, while others do not.
Other tests indicated that additional model realizations may be needed to test if the model input distributions need refinement to improve model performance. This approach (generating additional realizations) was not followed because it was realized that there was a temporal component to the data disconnect: the new head measurements are on the high side of the model distributions, but the heads at the original calibration locations themselves have also increased over time. This indicates that the steady-state assumption of the groundwater model is in error. To test the robustness of the model d
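
A sketch of the kind of dimensionless goodness-of-fit screen one might apply to each model realization against such validation targets; all numbers below are hypothetical:

    import numpy as np

    def normalized_misfit(simulated, observed, scales):
        """Dimensionless absolute misfit of one model realization against field targets,
        each target scaled by a characteristic uncertainty."""
        sim, obs, sc = map(np.asarray, (simulated, observed, scales))
        return np.abs(sim - obs) / sc

    # Hypothetical targets: two heads (m), one log10 conductivity, one gradient
    sim = [1202.5, 1198.0, -2.1, 0.004]
    obs = [1205.0, 1199.5, -2.4, 0.005]
    unc = [5.0, 5.0, 0.5, 0.002]
    print(normalized_misfit(sim, obs, unc).mean())   # screen realizations by mean misfit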

A. Hassan; J. Chapman

2008-11-01

283

Image decomposition as a tool for validating stress analysis models  

NASA Astrophysics Data System (ADS)

It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10^1 to 10^2 image descriptors instead of the 10^5 or 10^6 pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
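
A minimal sketch of the decomposition idea, using low-order 2D Fourier coefficients as image descriptors (Zernike moments would be handled analogously); the fields are synthetic stand-ins:

    import numpy as np

    def fourier_descriptors(field, k=8):
        """Keep only the k x k lowest-frequency 2D Fourier coefficients of a full-field
        map, reducing ~10^5-10^6 pixels to ~10^2 descriptors."""
        return np.fft.fft2(field)[:k, :k].ravel()

    rng = np.random.default_rng(0)
    measured = rng.normal(size=(256, 256))                     # stand-in for a measured strain map
    predicted = measured + 0.05 * rng.normal(size=(256, 256))  # stand-in for an FE prediction

    d_meas = np.abs(fourier_descriptors(measured))
    d_pred = np.abs(fourier_descriptors(predicted))
    print(np.corrcoef(d_meas, d_pred)[0, 1])                   # statistical comparison of descriptors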

Patki, A.; Wang, W.; Mottershead, J.; Patterson, E.

2010-06-01

284

A turbulence model for iced airfoils and its validation  

NASA Technical Reports Server (NTRS)

A turbulence model based on the extension of the algebraic eddy viscosity formulation of Cebeci and Smith developed for two dimensional flows over smooth and rough surfaces is described for iced airfoils and validated for computed ice shapes obtained for a range of total temperatures varying from 28 to -15 F. The validation is made with an interactive boundary layer method which uses a panel method to compute the inviscid flow and an inverse finite difference boundary layer method to compute the viscous flow. The interaction between inviscid and viscous flows is established by the use of the Hilbert integral. The calculated drag coefficients compare well with recent experimental data taken at the NASA-Lewis Icing Research Tunnel (IRT) and show that, in general, the drag increase due to ice accretion can be predicted well and efficiently.

Shin, Jaiwon; Chen, Hsun H.; Cebeci, Tuncer

1992-01-01

285

Validation of chemistry models employed in a particle simulation method  

NASA Technical Reports Server (NTRS)

The chemistry models employed in a statistical particle simulation method, as implemented in the Intel iPSC/860 multiprocessor computer, are validated and applied. Chemical relaxation of five-species air in these reservoirs involves 34 simultaneous dissociation, recombination, and atomic-exchange reactions. The reaction rates employed in the analytic solutions are obtained from Arrhenius experimental correlations as functions of temperature for adiabatic gas reservoirs in thermal equilibrium. Favorable agreement with the analytic solutions validates the simulation when applied to relaxation of O2 toward equilibrium in reservoirs dominated by dissociation and recombination, respectively, and when applied to relaxation of air in the temperature range 5000 to 30,000 K. A flow of O2 over a circular cylinder at high Mach number is simulated to demonstrate application of the method to multidimensional reactive flows.
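
For reference, rate coefficients of the kind described are commonly expressed in modified Arrhenius form, k(T) = A T^n exp(-Ea/RT); a minimal sketch with illustrative (not the paper's) parameters:

    import math

    R = 8.314  # universal gas constant, J/(mol*K)

    def arrhenius_rate(a, n, ea, temp):
        """Modified Arrhenius correlation k(T) = A * T**n * exp(-Ea / (R*T))."""
        return a * temp**n * math.exp(-ea / (R * temp))

    # Illustrative dissociation-reaction parameters at a high reservoir temperature
    print(arrhenius_rate(a=2.0e15, n=-1.5, ea=4.1e5, temp=10_000.0))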

Haas, Brian L.; Mcdonald, Jeffrey D.

1991-01-01

286

Modeling TCP Throughput: A Simple Model and Its Empirical Validation  

Microsoft Academic Search

In this paper we develop a simple analytic characterization of the steady state throughput, as a function of loss rate and round trip time for a bulk transfer TCP flow, i.e., a flow with an unlimited amount of data to send. Unlike the models in [6, 7, 10], our model captures not only the behavior of TCP's fast retransmit mechanism
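
A sketch of the closed-form steady-state throughput expression widely associated with this model, covering both the fast retransmit and timeout regimes; the parameter names are illustrative:

    import math

    def tcp_throughput(p, rtt, t0, wmax, b=2):
        """Steady-state bulk-transfer TCP throughput (packets/s) as a function of loss
        rate p and round-trip time, including fast retransmit and timeout terms."""
        if p <= 0:
            return wmax / rtt
        timeout = t0 * min(1.0, 3.0 * math.sqrt(3.0 * b * p / 8.0)) * p * (1.0 + 32.0 * p * p)
        return min(wmax / rtt, 1.0 / (rtt * math.sqrt(2.0 * b * p / 3.0) + timeout))

    print(tcp_throughput(p=0.01, rtt=0.1, t0=1.0, wmax=20))   # ~70 packets/s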

Jitendra Padhye; Victor Firoiu; Donald F. Towsley; James F. Kurose

1998-01-01

287

Validation of coupled atmosphere-fire behavior models  

SciTech Connect

Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States); Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States); Riggan, P.J. [Forest Service, Riverside, CA (United States)

1998-12-31

288

Validation of High Displacement Piezoelectric Actuator Finite Element Models  

NASA Technical Reports Server (NTRS)

The paper presents the results obtained by using NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) finite element codes to predict doming of the THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) used different methods for modeling piezoelectric effects. In NASTRAN(Registered Trademark), a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS(Registered Trademark) processed the voltage directly using piezoelectric finite elements. The results of finite element models were validated by using the experimental results.
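
One common way to realize the thermal analogy mentioned above (an illustration, not necessarily the paper's exact formulation) is to match the piezoelectric strain to an equivalent thermal strain:

    \epsilon_{\text{piezo}} = d_{31}\,\frac{V}{t_p}, \qquad
    \epsilon_{\text{thermal}} = \alpha\,\Delta T
    \quad\Rightarrow\quad
    \alpha_{\text{equiv}} = \frac{d_{31}}{t_p} \quad \text{with} \quad \Delta T := V,

so that applying the voltage as a nodal "temperature" reproduces the piezoelectric strain in a purely structural-thermal solver.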

Taleghani, B. K.

2000-01-01

289

Coupled Thermal-Chemical-Mechanical Modeling of Validation Cookoff Experiments  

SciTech Connect

The cookoff of energetic materials involves the combined effects of several physical and chemical processes. These processes include heat transfer, chemical decomposition, and mechanical response. The interaction and coupling between these processes influence both the time-to-event and the violence of reaction. The prediction of the behavior of explosives during cookoff, particularly with respect to reaction violence, is a challenging task. To this end, a joint DoD/DOE program has been initiated to develop models for cookoff, and to perform experiments to validate those models. In this paper, a series of cookoff analyses are presented and compared with data from a number of experiments for the aluminized, RDX-based, Navy explosive PBXN-109. The traditional thermal-chemical analysis is used to calculate time-to-event and characterize the heat transfer and boundary conditions. A reaction mechanism based on Tarver and McGuire's work on RDX was adjusted to match the spherical one-dimensional time-to-explosion data. The predicted time-to-event using this reaction mechanism compares favorably with the validation tests. Coupled thermal-chemical-mechanical analysis is used to calculate the mechanical response of the confinement and the energetic material state prior to ignition. The predicted state of the material includes the temperature, stress-field, porosity, and extent of reaction. There is little experimental data for comparison to these calculations. The hoop strain in the confining steel tube gives an estimation of the radial stress in the explosive. The inferred pressure from the measured hoop strain and calculated radial stress agree qualitatively. However, validation of the mechanical response model and the chemical reaction mechanism requires more data. A post-ignition burn dynamics model was applied to calculate the confinement dynamics. The burn dynamics calculations suffer from a lack of characterization of the confinement for the flaw-dominated failure mode experienced in the tests. High-pressure burning rates are needed for more detailed post-ignition studies. Sub-models for chemistry, mechanical response and burn dynamics need to be validated against data from less complex experiments. The sub-models can then be used in integrated analysis for comparison with experimental data taken during integrated tests.

ERIKSON,WILLIAM W.; SCHMITT,ROBERT G.; ATWOOD,A.I.; CURRAN,P.D.

2000-11-27

290

Validation of thermal models for a prototypical MEMS thermal actuator.  

SciTech Connect

This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the polycrystalline silicon test structures, as well as uncontrolled nonuniform changes in this quantity over time and during operation.

Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

2008-09-01

291

Description and validation of the Moorepark Dairy System Model.  

PubMed

A stochastic budgetary simulation model of a dairy farm was developed to allow investigation of the effects of varying biological, technical, and physical processes on farm profitability. The model integrates animal inventory and valuation, milk supply, feed requirement, land and labor utilization, and economic analysis. A key model output is the estimated distribution of farm profitability, which is a function of total receipts from milk, calves, and cull cows less all variable and fixed costs (including an imputed cost for labor). An application of the model was demonstrated by modeling 2 calving patterns: a mean calving date of February 24 (S1) and a mean calving date of January 27 (S2). Monte Carlo simulation was used to determine the influence of variation in milk price, concentrate cost, and silage quality on farm profitability under each scenario. Model validation was conducted by comparing the results from the model against data collected from 21 commercial dairy farms. The net farm profit with S1 was 53,547 euros, and that with S2 was 51,687 euros; the annual EU milk quota was 468,000 kg, and farm size was 40 ha. Monte Carlo simulation showed that the S1 scenario was stochastically dominant over the S2 scenario. Sensitivity analyses showed that farm profit was most sensitive to changes in milk price. The partial coefficients of determination were 99.2, 0.7, and 0.1% for milk price, concentrate cost, and silage quality, respectively, in S1; the corresponding values in S2 were 97.6, 2.3, and 0.1%. Validations of the model showed that it could be used with confidence to study systems of milk production under Irish conditions. PMID:15453512
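
A minimal Monte Carlo sketch of the sensitivity analysis described, with hypothetical input distributions and a deliberately collapsed cost structure (not the model's actual equations):

    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Hypothetical input distributions: milk price (euro/kg) and concentrate cost (euro/t)
    milk_price = rng.normal(0.28, 0.03, n)
    conc_cost = rng.normal(180.0, 15.0, n)

    milk_sold_kg = 468_000        # annual quota from the scenario
    conc_fed_t = 80.0             # hypothetical tonnes of concentrate fed
    other_net_costs = 70_000.0    # hypothetical: all other receipts less costs, collapsed

    profit = milk_price * milk_sold_kg - conc_cost * conc_fed_t - other_net_costs
    print(profit.mean(), profit.std())   # simulated distribution of net farm profit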

Shalloo, L; Dillon, P; Rath, M; Wallace, M

2004-06-01

292

Development and Validation of a 3-Dimensional CFB Furnace Model  

NASA Astrophysics Data System (ADS)

At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation, and heat transfer. Results of laboratory and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of three-dimensional process modeling, providing a chain of knowledge that feeds back into phenomenon research. Knowledge gathered from model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling the combustion and the formation of char and volatiles for various fuel types under CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat, and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window on fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace; they are used together with lateral temperature profiles at the bed and in the upper parts of the furnace to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and the mixing characteristics of fuel and combustion air, which shape the NO furnace profile through reduction and volatile-nitrogen reactions. The paper concludes with a CFB process analysis focused on combustion and NO profiles in pilot- and industrial-scale bituminous coal combustion.

Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

293

Optimization and validation of a micellar electrokinetic chromatographic method for the analysis of several angiotensin-II-receptor antagonists.  

PubMed

We have optimized a micellar electrokinetic capillary chromatographic method for the separation of six angiotensin-II-receptor antagonists (ARA-IIs): candesartan, eprosartan mesylate, irbesartan, losartan potassium, telmisartan, and valsartan. A face-centred central composite design was applied to study the effect of the pH, the molarity of the running buffer, and the concentration of the micelle-forming agent on the separation properties. A combination of the studied parameters permitted the separation of the six ARA-IIs, which was best carried out using a 55-mM sodium phosphate buffer solution (pH 6.5) containing 15 mM of sodium dodecyl sulfate. The same system can also be applied for the quantitative determination of these compounds, but only for the more stable ARA-IIs (candesartan, eprosartan mesylate, losartan potassium, and valsartan). Some system parameters (linearity, precision, and accuracy) were validated. PMID:12564683
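
A sketch of generating the face-centred central composite design points used for the three factors studied (pH, buffer molarity, surfactant concentration), in coded units:

    from itertools import product

    def face_centred_ccd(k=3):
        """Face-centred central composite design in coded units: 2**k factorial corners,
        2k axial points on the cube faces (alpha = 1), and a centre point."""
        corners = list(product((-1, 1), repeat=k))
        axial = []
        for i in range(k):
            for s in (-1, 1):
                point = [0] * k
                point[i] = s
                axial.append(tuple(point))
        return corners + axial + [(0,) * k]

    # Factors (coded): pH, buffer molarity, SDS concentration -> 15 runs for k = 3
    for run in face_centred_ccd():
        print(run)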

Hillaert, S; De Beer, T R M; De Beer, J O; Van den Bossche, W

2003-01-10

294

Organic acid modeling and model validation: Workshop summary. Final report  

SciTech Connect

A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

Sullivan, T.J.; Eilers, J.M.

1992-08-14

295

Organic acid modeling and model validation: Workshop summary  

SciTech Connect

A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

Sullivan, T.J.; Eilers, J.M.

1992-08-14

296

Multiphase Flow Modeling - Validation and Application CRADA MC94-019, Final Report.  

National Technical Information Service (NTIS)

For the development and validation of multiphase flow modeling capability, a cooperative research and development agreement (CRADA) is in effect between Morgantown Energy Technology Center (METC) and Fluent Inc. To validate the Fluent multiphase model, se...

Madhava Syamlal Philip A. Nicoletti

1995-01-01

297

Validation of GOCE densities and evaluation of thermosphere models  

NASA Astrophysics Data System (ADS)

Atmospheric densities from ESA’s GOCE satellite at a mean altitude of 270 km are validated by comparison with predictions from the near real time model HASDM along the GOCE orbit in the time frame 1 November 2009 through 31 May 2012. Except for a scale factor of 1.29, which is due to different aerodynamic models being used in HASDM and GOCE, the agreement is at the 3% (standard deviation) level when comparing daily averages. The models NRLMSISE-00, JB2008 and DTM2012 are compared with the GOCE data. They match at the 10% level, but significant latitude-dependent errors as well as errors with semiannual periodicity are detected. Using the 0.1 Hz sampled data leads to much larger differences locally, and this dataset can be used presently to analyze variations down to scales as small as 150 km.
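
A minimal sketch of the daily-average comparison described above: estimate the scale factor as the mean density ratio and the agreement as its relative spread; the densities below are hypothetical:

    import numpy as np

    def scale_and_spread(goce_daily, hasdm_daily):
        """Mean density ratio (scale factor) and relative spread of daily averages."""
        ratio = np.asarray(hasdm_daily) / np.asarray(goce_daily)
        return ratio.mean(), ratio.std() / ratio.mean()

    # Hypothetical daily-mean densities (kg/m^3)
    goce = [3.1e-11, 3.4e-11, 2.9e-11]
    hasdm = [4.0e-11, 4.35e-11, 3.8e-11]
    print(scale_and_spread(goce, hasdm))   # ratio ~1.29, spread of a few percent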

Bruinsma, S. L.; Doornbos, E.; Bowman, B. R.

2014-08-01

298

Bolted connection modeling and validation through laser-aided testing  

NASA Astrophysics Data System (ADS)

Bolted connections are widely employed in facility structures, such as light masts, transmission poles, and wind turbine towers. The complex connection behavior plays a significant role in the overall dynamic characteristics of a structure. A finite element (FE) modeling study of a bolt-connected square tubular steel beam is presented in this paper. Modal testing was performed in a controlled laboratory condition to validate the FE model, developed for the bolted beam. Two laser Doppler vibrometers were used simultaneously to measure structural vibration. A simplified joint model was proposed to further save computation time for structures with bolted connections. This study is an on-going effort to marshal knowledge associated with detecting damage on facility structures with bolted connections.

Dai, Kaoshan; Gong, Changqing; Smith, Benjiamin

2013-04-01

299

Modeling and Validation of Damped Plexiglas Windows for Noise Control  

NASA Technical Reports Server (NTRS)

Windows are a significant path for structure-borne and air-borne noise transmission in general aviation aircraft. In this paper, numerical and experimental results are used to evaluate damped plexiglas windows for the reduction of structure-borne and air-borne noise transmitted into the interior of an aircraft. In contrast to conventional homogeneous windows, the damped plexiglas windows were fabricated using two or three layers of plexiglas with transparent viscoelastic damping material sandwiched between the layers. Transmission loss and radiated sound power measurements were used to compare different layups of the damped plexiglas windows with uniform windows of the same nominal thickness. This vibro-acoustic test data was also used for the verification and validation of finite element and boundary element models of the damped plexiglas windows. Numerical models are presented for the prediction of radiated sound power for a point force excitation and transmission loss for diffuse acoustic excitation. Radiated sound power and transmission loss predictions are in good agreement with experimental data. Once validated, the numerical models were used to perform a parametric study to determine the optimum configuration of the damped plexiglas windows for reducing the radiated sound power for a point force excitation.

Buehrle, Ralph D.; Gibbs, Gary P.; Klos, Jacob; Mazur, Marina

2003-01-01

300

Development, Verification, and Validation of Multiphase Models for Polydisperse Flows  

SciTech Connect

This report describes in detail the technical findings of the DOE Award entitled 'Development, Verification, and Validation of Multiphase Models for Polydisperse Flows.' The focus was on high-velocity, gas-solid flows with a range of particle sizes. A complete mathematical model was developed based on first principles and incorporated into MFIX. The solid-phase description took two forms: the Kinetic Theory of Granular Flows (KTGF) and Discrete Quadrature Method of Moments (DQMOM). The gas-solid drag law for polydisperse flows was developed over a range of flow conditions using Discrete Numerical Simulations (DNS). These models were verified via examination of a range of limiting cases and comparison with Discrete Element Method (DEM) data. Validation took the form of comparison with both DEM and experimental data. Experiments were conducted in three separate circulating fluidized beds (CFB's), with emphasis on the riser section. Measurements included bulk quantities like pressure drop and elutriation, as well as axial and radial measurements of bubble characteristics, cluster characteristics, solids flux, and differential pressure drops (axial only). Monodisperse systems were compared to their binary and continuous particle size distribution (PSD) counterparts. The continuous distributions examined included Gaussian, lognormal, and NETL-provided data for a coal gasifier.

Christine Hrenya; Ray Cocco; Rodney Fox; Shankar Subramaniam; Sankaran Sundaresan

2011-12-31

301

Low Frequency Eddy Current Benchmark Study for Model Validation  

NASA Astrophysics Data System (ADS)

This paper presents results of an eddy current model validation study. Precise measurements were made using an impedance analyzer to investigate changes in impedance due to Electrical Discharge Machining (EDM) notches in aluminum plates. Each plate contained one EDM notch at an angle of 0, 10, 20, or 30 degrees from the normal of the plate surface. Measurements were made with the eddy current probe both scanning parallel and perpendicular to the notch length. The experimental response from the vertical and oblique notches will be reported and compared to results from different numerical simulation codes.

Mooers, R. D.; Cherry, M. R.; Knopp, J. S.; Aldrin, J. C.; Sabbagh, H. A.; Boehnlein, T. R.

2011-06-01

302

Validity of the essential states model in fullerenes  

NASA Astrophysics Data System (ADS)

Calculations of the dispersion of the second-order hyperpolarizability of C60, C70 and C76 have been performed with a sum-over-states method based on the complete neglect of differential overlap/spectroscopic parametrization (CNDO/S) plus configuration interaction. The third harmonic generation spectra of C60 and C70 thin films have been measured in a wide spectral range, and they are in excellent agreement with calculations. The validity of the essential states model description for fullerenes is discussed in terms of symmetry properties of the molecular system.

Zamboni, Roberto; Taliani, Carlo; Kajzar, Francois; Fanti, Marianna; Orlandi, Giorgio; Zerbetto, Francesco

1995-12-01

303

A validation study of a stochastic model of human interaction  

NASA Astrophysics Data System (ADS)

The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $N \int_{-\infty}^{\infty} \varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as fermions behave in momentum space.
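
A sketch of fitting a Fermi-Dirac form to binned attitude data by nonlinear regression, in the spirit of the analysis described (synthetic data, illustrative parameters):

    import numpy as np
    from scipy.optimize import curve_fit

    def fermi_dirac(x, mu, t):
        """Fermi-Dirac form used as a candidate distribution for attitude positions."""
        return 1.0 / (np.exp((x - mu) / t) + 1.0)

    # Synthetic "observed" proportions over semantic-differential positions
    x = np.linspace(-3.0, 3.0, 25)
    y = fermi_dirac(x, 0.2, 0.8) + 0.02 * np.random.default_rng(1).normal(size=x.size)

    (mu_hat, t_hat), _ = curve_fit(fermi_dirac, x, y, p0=(0.0, 1.0))
    print(mu_hat, t_hat)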

Burchfield, Mitchel Talmadge

304

ON VALIDATION OF SOURCE AND SINK MODELS: PROBLEMS AND POSSIBLE SOLUTIONS  

EPA Science Inventory

The paper discusses solutions for problems relating to validating indoor air quality (IAQ) source and sink models. While model validation remains the weakest part of the entire process of IAQ model development, special problems have made the validation of indoor source and sink mo...

305

Validation study of air-sea gas transfer modeling  

SciTech Connect

Laboratory results have demonstrated the importance of bubble plumes to air-water gas transfer (Asher et al., 1994). Bubble plumes enhance gas transfer by disrupting surface films, by directly transporting a gas, and by the creation of turbulence. Models of bubble gas transfer have been developed by different authors (Atkinson, 1973; Memery and Merlivat, 1985; Woolf and Thorpe, 1991) to determine the magnitude of gas transfer due to bubbles. Laboratory measurements of both the gas transfer rate k_L and the bubble distribution φ in a whitecap simulation tank (WST) have allowed these models to be validated and deficiencies in the theoretical assumptions to be explored. In the WST, each bucket tip simulates a wave breaking event. Important tests of these models include whether they can explain the experimentally determined solubility and Schmidt number dependency of k_L, predict the time varying bubble concentrations, predict the evasion-invasion asymmetry, and predict the fraction of k_L due to bubble plumes. Four different models were tested: a steady state model (Atkinson, 1973), a non-turbulence model with constant bubble radius (Memery and Merlivat, 1985), a turbulence model with constant bubble radius (Woolf and Thorpe, 1991), and a turbulence model with varying bubble radius. All models simulated multiple bucket tip cycles. The two turbulence models were run for sufficient tip cycles to generate a statistically significant number of eddies (>50) for bubbles affected by turbulence (V_B ≤ V_T), found to be at least four tip cycles. The models allowed up to nine gases simultaneously and were run under different conditions of trace and major gas concentrations and partial pressures.

Asher, W.E.; Farley, P.J. [Pacific Northwest Lab., Richland, WA (United States); Leifer, I.S. [Georgia Inst. of Tech., Atlanta, GA (United States)

1995-07-01

306

Exploring terrestrial and atmospheric constraints in land surface model validation  

NASA Astrophysics Data System (ADS)

Simulating land surface processes is important both for applied hydrological forecasting, e.g., of floods and droughts, and for representation of land-atmosphere energy exchanges in numerical weather prediction and climate models. Evaluation of model estimates of terrestrial surface fluxes, such as runoff, as well as fluxes to the atmosphere, such as evapotranspiration, are also needed to ensure the realistic prediction of both moisture and energy fluxes. To this end, we evaluate the performance of a unified land model, ULM, which is a merger of the Noah land surface scheme used in NOAA’s weather prediction and climate models with the Sacramento Soil Moisture Accounting Model, used by the National Weather Service for operational streamflow prediction. Parameter estimation uses multiple objective functions that include minimization of the residual between modeled fluxes and both (i) observed streamflow and (ii) multiple observation-based estimates of evapotranspiration (ET). We describe the selection of a set of reference or benchmarking river basins taken in large part from the MOPEX (Model Parameter Estimation Experiment) and FLUXNET stations, supplemented by surface flux products from the North American Regional Reanalysis.

Livneh, B.; Restrepo, P. J.; Lettenmaier, D. P.

2010-12-01

307

Cross-Cultural Validation of the Beck Depression Inventory-II across U.S. and Turkish Samples  

ERIC Educational Resources Information Center

The purpose of this study was to test the Beck Depression Inventory-II (BDI-II) for factorial invariance across Turkish and U.S. college student samples. The results indicated that (a) a two-factor model has an adequate fit for both samples, thus providing evidence of configural invariance, and (b) there is a metric invariance but "no" sufficient…

Canel-Cinarbas, Deniz; Cui, Ying; Lauridsen, Erica

2011-01-01

308

Some validity issues in the theory and modeling of WPRM  

NASA Astrophysics Data System (ADS)

In many publications, in the recent and older literature on the theory and modeling of wave propagation in random media (WPRM) in ocean and atmospheric environments, serious validity issues arise. The issues discussed will include the following. (1) The validity of using the Markov approximation in theoretical treatments must be established for specific environments before theories based on that approximation can be used. Parameters that are required to be small must be shown to be so before the applicability to specific environments can be assured. (2) Most publications that purport to model the pdf's of intensity do not include the standard "goodness of fit" parameters. The Kolmogorov-Smirnov GoF test emphasizes the center of the distribution, yet the pdf's of WPRM are often characterized by very heavy-tailed distributions where the fitting of that component can be important to understanding the scattering physics. GoF tests that include the full range of received intensity are needed. (3) Signal processors invariably require that the quadrature components of their receptions have Gaussian pdf's. It has been shown by simulations of WPRM that this is often not even close to the case. What significance this effect has on signal processing algorithms is a difficult and unresolved issue.

Ewart, Terry; Henyey, Frank

2002-05-01

309

Validation of prognostic indices using the frailty model.  

PubMed

A major issue when proposing a new prognostic index is its generalisability to daily clinical practice. Validation is therefore required. Most validation techniques assess whether "on average" the results obtained by the prognostic index in classifying patients in a new sample of patients are similar to the results obtained in the construction set. We introduce a new important aspect of the generalisability of a prognostic index: the heterogeneity of the prognostic index risk group hazard ratios over different centers. If substantial variability between centers exists, the prognostic index may have no discriminatory capability in some of the centers. To model such heterogeneity, we use a frailty model including a random center effect and a random prognostic index by center interaction. Statistical inference is based on a Bayesian approach using a Laplacian approximation for the marginal posterior distribution of the variances of the random effects. We investigate different ways to summarize the information available from this marginal posterior distribution. Our approach is applied to a real bladder cancer database for which we demonstrate how to investigate and interpret heterogeneity in prognostic index effect over centers. PMID:18618249

Legrand, C; Duchateau, L; Janssen, P; Ducrocq, V; Sylvester, R

2009-03-01

310

Scuffing modeling and experimental validation in the mixed lubrication regime  

NASA Astrophysics Data System (ADS)

The lubrication breakdown in sliding lubricated contacts is generally recognized as the principal cause of scuffing in the mixed lubrication regime. The models of the breakdown of surface protective films were reviewed. The mechanism of the scuffing failure was analyzed on the basis of the influences of different variables, which included surface roughness, temperature, lubricants, speed, and oxidation. A newly developed scuffing Critical Temperature and Pressure (CTP) model was discussed. An FFT-based transient flash temperature model for general three-dimensional rough surface contacts was developed to determine the interfacial temperature distribution based on actual run-in profiles of the mating surfaces. The present model significantly reduces the computational time for the flash temperature calculation, and it can accommodate simulations of very complicated heat sources and processes. To determine the dominant mechanism for scuffing in the mixed lubrication regime, scuffing experiments were conducted first in high-speed rolling and sliding lubricated contacts on a two-disc machine. The scuffing failure was then investigated and validated through experiments on a special ball-on-disc tester. The scuffing model was developed in a new scuffing map, which indicated that the maximum failure temperature, Tmax, is a transient temperature between the CTP mode and the asperity plastic deformation mode. Scuffing failures for materials of different hardness will undergo a similar process. In general, the harder materials have a higher Tmax than the softer materials. Good agreement was found between the test results and the scuffing model.

Gao, Jianqun

2000-10-01

311

Modeling and Validation of a Propellant Mixer for Controller Design  

NASA Technical Reports Server (NTRS)

A mixing chamber used in rocket engine testing at the NASA Stennis Space Center is modelled by a system of two nonlinear ordinary differential equations. The mixer is used to condition the thermodynamic properties of cryogenic liquid propellant by controlled injection of the same substance in the gaseous phase. The three inputs of the mixer are the positions of the valves regulating the liquid and gas flows at the inlets, and the position of the exit valve regulating the flow of conditioned propellant. Mixer operation during a test requires the regulation of its internal pressure, exit mass flow, and exit temperature. A mathematical model is developed to facilitate subsequent controller designs. The model must be simple enough to lend itself to subsequent feedback controller design, yet its accuracy must be tested against real data. For this reason, the model includes function calls to thermodynamic property data. Some structural properties of the resulting model that pertain to controller design, such as uniqueness of the equilibrium point, feedback linearizability and local stability are shown to hold under conditions having direct physical interpretation. The existence of fixed valve positions that attain a desired operating condition is also shown. Validation of the model against real data is likewise provided.

Richter, Hanz; Barbieri, Enrique; Figueroa, Fernando

2003-01-01

312

Modelling and validation of multiple reflections for enhanced laser welding  

NASA Astrophysics Data System (ADS)

The effects of multiple internal reflections within a laser weld joint as functions of joint geometry and processing conditions have been characterized. A computer-based ray tracing model is used to predict the reflective propagation of laser beam energy focused into the narrow gap of a metal joint for the purpose of predicting the location of melting and coalescence to form a weld. Quantitative comparisons are made between simulation cases. Experimental results are provided for qualitative model validation. This method is proposed as a way to enhance process efficiency and design laser welds which display deep penetration and high depth-to-width aspect ratios without high powered systems or keyhole mode melting.

Milewski, J.; Sklar, E.

1996-05-01

313

Derivation and empirical validation of a refined traffic flow model  

NASA Astrophysics Data System (ADS)

The gas-kinetic foundation of fluid-dynamic traffic equations suggested in previous papers (D. Helbing, Physica A 219 (1995) 375 and 391) is further refined by applying the theory of dense gases and granular materials to the Boltzmann-like traffic model by Paveri-Fontana. It is shown that, despite the phenomenologically similar behaviour of ordinary and granular fluids, the relations for these cannot directly be transferred to vehicular traffic. The dissipative and anisotropic interactions of vehicles as well as their velocity-dependent space requirements lead to a considerably different structure of the macroscopic traffic equations, also in comparison with the previously suggested traffic flow models. As a consequence, the instability mechanisms of emergent density waves are different. Crucial assumptions are validated by empirical traffic data and essential results are illustrated by figures.

Helbing, Dirk

1996-02-01

314

Ultrasonic transducers for cure monitoring: design, modelling and validation  

NASA Astrophysics Data System (ADS)

The finite element method (FEM) has been applied to simulate the ultrasonic wave propagation in a multilayered transducer, expressly designed for high-frequency dynamic mechanical analysis of polymers. The FEM model includes an electro-acoustic (active element) and some acoustic (passive elements) transmission lines. The simulation of the acoustic propagation accounts for the interaction between the piezoceramic and the materials in the buffer rod and backing, and the coupling between the electric and mechanical properties of the piezoelectric material. As a result of the simulations, the geometry and size of the modelled ultrasonic transducer has been optimized and used for the realization of a prototype transducer for cure monitoring. The transducer performance has been validated by measuring the velocity changes during the polymerization of a thermosetting matrix of composite materials.

Lionetto, Francesca; Montagna, Francesco; Maffezzoli, Alfonso

2011-12-01

315

Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting.  

National Technical Information Service (NTIS)

This research has two objectives: to verify and validate the U.S. Army's Forecast and Allocation of Army Recruiting Resources (FAARR) model, and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simu...

G. M. Piskator

1998-01-01

316

Validation of general circulation model radiative fluxes using surface observations  

SciTech Connect

The surface radiative fluxes of the ECHAM3 General Circulation Model (GCM) with T21, T42, and T106 resolutions have been validated using observations from the Global Energy Balance Archive (GEBA, World Climate Program-Water Project A7). GEBA contains the most comprehensive dataset now available for worldwide instrumentally measured surface energy fluxes. The GCM incoming shortwave radiation at the surface has been compared with more than 700 long-term monitoring stations. The ECHAM3 models show a clear tendency to overestimate the global annual-mean incoming shortwave radiation at the surface due to an underestimation of atmospheric absorption. The model-calculated global-mean surface shortwave absorption around 165 W m^-2 is estimated to be too high by 10-15 W m^-2. A similar or higher overestimate is present in several other GCMs. Deficiencies in the clear-sky absorption of the ECHAM3 radiation scheme are proposed as a contributor to the flux discrepancies. A stand-alone validation of the radiation scheme under clear-sky conditions revealed overestimates of up to 50 W m^-2 for daily maximum values of incoming shortwave fluxes. Further, the lack of shortwave absorption by the model clouds is suggested to contribute to the overestimated surface shortwave radiation. There are indications that the incoming longwave radiation at the surface is underestimated in ECHAM3 and other GCMs. This largely offsets the overestimated shortwave flux in the global mean, so that the 102 W m^-2 calculated in ECHAM3 for the surface net radiation is considered to be a realistic value. A common feature of several GCMs is, therefore, a superficially correct simulation of global mean net radiation, as the overestimate in the shortwave balance is compensated by an underestimate in the longwave balance. 41 refs., 14 figs., 5 tabs.

Wild, M.; Oshmura, A.; Gilgen, H. [Swiss Federal Institute of Technology, Zurich (Switzerland)]; and others

1995-05-01

317

A validated predictive model of coronary fractional flow reserve  

PubMed Central

Myocardial fractional flow reserve (FFR), an important index of coronary stenosis, is measured by a pressure sensor guidewire. The determination of FFR, based only on the dimensions (lumen diameters and length) of stenosis and hyperaemic coronary flow with no other ad hoc parameters, is currently not possible. We propose an analytical model derived from conservation of energy, which considers various energy losses along the length of a stenosis, i.e. convective and diffusive energy losses as well as energy loss due to sudden constriction and expansion in lumen area. In vitro (constrictions were created in isolated arteries using symmetric and asymmetric tubes as well as an inflatable occluder cuff) and in vivo (constrictions were induced in coronary arteries of eight swine by an occluder cuff) experiments were used to validate the proposed analytical model. The proposed model agreed well with the experimental measurements. A least-squares fit showed a linear relation (Δp or FFR)_experiment = a(Δp or FFR)_theory + b, where a and b were 1.08 and -1.15 mmHg (r^2 = 0.99) for in vitro Δp, 0.96 and 1.79 mmHg (r^2 = 0.75) for in vivo Δp, and 0.85 and 0.1 (r^2 = 0.7) for FFR. Flow pulsatility and stenosis shape (e.g. eccentricity, exit angle divergence, etc.) had a negligible effect on myocardial FFR, while the entrance effect in a coronary stenosis was found to contribute significantly to the pressure drop. We present a physics-based, experimentally validated analytical model of coronary stenosis, which allows prediction of FFR based on stenosis dimensions and hyperaemic coronary flow with no empirical parameters.
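
A sketch of the reported validation step: a least-squares line relating theory to experiment. The paired values below are hypothetical; the abstract reports a = 0.85 and b = 0.1 for FFR:

    import numpy as np

    # Hypothetical theory-vs-experiment FFR pairs for the least-squares validation line
    ffr_theory = np.array([0.55, 0.68, 0.74, 0.82, 0.91])
    ffr_experiment = np.array([0.58, 0.66, 0.75, 0.80, 0.88])

    a, b = np.polyfit(ffr_theory, ffr_experiment, 1)       # experiment = a * theory + b
    r2 = np.corrcoef(ffr_theory, ffr_experiment)[0, 1] ** 2
    print(a, b, r2)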

Huo, Yunlong; Svendsen, Mark; Choy, Jenny Susana; Zhang, Z.-D.; Kassab, Ghassan S.

2012-01-01

318

Systematic validation of disease models for pharmacoeconomic evaluations. Swiss HIV Cohort Study.  

PubMed

Pharmacoeconomic evaluations are often based on computer models which simulate the course of disease with and without medical interventions. The purpose of this study is to propose and illustrate a rigorous approach for validating such disease models. For illustrative purposes, we applied this approach to a computer-based model we developed to mimic the history of HIV-infected subjects at the greatest risk for Mycobacterium avium complex (MAC) infection in Switzerland. The drugs included as a prophylactic intervention against MAC infection were azithromycin and clarithromycin. We used a homogeneous Markov chain to describe the progression of an HIV-infected patient through six MAC-free states, one MAC state, and death. Probability estimates were extracted from the Swiss HIV Cohort Study database (1993-95) and randomized controlled trials. The model was validated by testing for (1) technical validity, (2) predictive validity, (3) face validity, and (4) modelling process validity. Sensitivity analysis and independent model implementation in DATA (PPS) and self-written Fortran 90 code (BAC) assured technical validity. Agreement between modelled and observed MAC incidence confirmed predictive validity. Modelled MAC prophylaxis at different starting conditions affirmed face validity. Published articles by other authors supported modelling process validity. The proposed validation procedure is a useful approach to improve the validity of the model. PMID:10461580
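
The Markov structure described above lends itself to a compact numerical sketch. Below is a minimal cohort simulation of a homogeneous Markov chain whose eight states mirror the abstract (six MAC-free states, one MAC state, death); the transition probabilities are hypothetical placeholders, not the Swiss HIV Cohort Study estimates.

    import numpy as np

    # States 0-5: MAC-free (increasing severity), 6: MAC infection, 7: death.
    n = 8
    P = np.zeros((n, n))
    for i in range(6):
        P[i, i] = 0.90                 # remain in current MAC-free state
        P[i, min(i + 1, 5)] += 0.06    # progress to the next MAC-free state
        P[i, 6] = 0.03                 # develop MAC infection
        P[i, 7] = 0.01                 # die
    P[6, 6], P[6, 7] = 0.85, 0.15      # MAC state: persist or die
    P[7, 7] = 1.0                      # death is absorbing
    assert np.allclose(P.sum(axis=1), 1.0)

    cohort = np.zeros(n)
    cohort[0] = 1.0                    # whole cohort starts in the first MAC-free state
    for cycle in range(24):            # e.g. 24 monthly cycles
        cohort = cohort @ P
    print("P(in MAC state after 2 years):", round(cohort[6], 3))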

Sendi, P P; Craig, B A; Pfluger, D; Gafni, A; Bucher, H C

1999-08-01

319

Predictive validity of behavioural animal models for chronic pain  

PubMed Central

Rodent models of chronic pain may elucidate pathophysiological mechanisms and identify potential drug targets, but whether they predict clinical efficacy of novel compounds is controversial. Several potential analgesics have failed in clinical trials, in spite of strong animal modelling support for efficacy, but there are also examples of successful modelling. Significant differences in how methods are implemented and results are reported mean that a literature-based comparison between preclinical data and clinical trials will not reveal whether a particular model is generally predictive. Limited reporting of negative outcomes prevents a reliable estimate of the specificity of any model. Animal models tend to be validated with standard analgesics and may be biased towards tractable pain mechanisms. But preclinical publications rarely contain drug exposure data, and drugs are usually given in high doses and as a single administration, which may lead to drug distribution and exposure deviating significantly from clinical conditions. The greatest challenge for predictive modelling is, however, the heterogeneity of the target patient populations, in terms of both symptoms and pharmacology, probably reflecting differences in pathophysiology. In well-controlled clinical trials, a majority of patients shows less than 50% reduction in pain. A model that responds well to current analgesics should therefore predict efficacy only in a subset of patients within a diagnostic group. It follows that successful translation requires several models for each indication, reflecting critical pathophysiological processes, combined with data linking exposure levels with effect on target. LINKED ARTICLES This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4

Berge, Odd-Geir

2011-01-01

320

Climate Model Datasets on Earth System Grid II (ESG II)  

DOE Data Explorer

Earth System Grid (ESG) is a project that combines the power and capacity of supercomputers, sophisticated analysis servers, and datasets on the scale of petabytes. The goal is to provide a seamless distributed environment that allows scientists in many locations to work with large-scale data, perform climate change modeling and simulation, and share results in innovative ways. Though ESG is more about the computing environment than the data, there are several catalogs of data available at the web site that can be browsed or searched. Most of the datasets are restricted to registered users, but several are open to any access. There are catalogs of datasets for: • POP (Parallel Ocean Program) • PCM (Parallel Climate Model) • NARCCAP (North American Regional Climate Change Assessment Program) • CLM (CCSM Community Land Model) • CSIM (CCSM Sea Ice Model) • CCSM POP (modified version of Parallel Ocean Program) • CCSM (Community Climate System Model)

321

Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression  

NASA Astrophysics Data System (ADS)

In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, the model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in Chemometrics and Econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
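
As a concrete illustration of the contrast between LOO and MCCV drawn above, the sketch below implements both for an ordinary least-squares surrogate (the paper uses generalised least squares; plain OLS on synthetic data is substituted here only to keep the example self-contained).

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 50, 3
    X = rng.normal(size=(n, p))
    y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=n)

    def fit_predict(Xtr, ytr, Xte):
        beta, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
        return Xte @ beta

    # Leave-one-out (LOO) validation: n fits, each leaving out one observation.
    loo_err = []
    for i in range(n):
        idx = np.arange(n) != i
        pred = fit_predict(X[idx], y[idx], X[~idx])
        loo_err.append((y[~idx] - pred) ** 2)
    print("LOO MSE:", np.mean(loo_err))

    # Monte Carlo cross validation (MCCV): repeated random 70/30 splits.
    mccv_err = []
    for _ in range(500):
        perm = rng.permutation(n)
        tr, te = perm[:35], perm[35:]
        pred = fit_predict(X[tr], y[tr], X[te])
        mccv_err.append(np.mean((y[te] - pred) ** 2))
    print("MCCV MSE:", np.mean(mccv_err))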

Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

2013-03-01

322

NAIRAS aircraft radiation model development, dose climatology, and initial validation  

NASA Astrophysics Data System (ADS)

The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.
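
The validation criterion applied above reduces to a simple relative-difference check. A minimal sketch, with hypothetical dose-rate values standing in for the tabulated reference data:

    # Relative difference between modelled and measured ambient dose equivalent
    # rates, checked against the ICRU 30% acceptability limit. The values below
    # are hypothetical placeholders, not NAIRAS output or reference data.
    measured = [3.2, 4.5, 5.1, 6.0]    # uSv/h at several cutoff rigidities
    modelled = [3.0, 4.9, 5.6, 6.8]

    for m, c in zip(measured, modelled):
        rel = abs(c - m) / m
        print(f"measured={m}, modelled={c}, diff={rel:.0%}, "
              f"within ICRU limit: {rel <= 0.30}")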

Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

2013-10-01

323

Geophysical Monitoring for Validation of Transient Permafrost Models (Invited)  

NASA Astrophysics Data System (ADS)

Permafrost is a widespread phenomenon at high latitudes and high altitudes and describes the permanently frozen state of the subsurface in lithospheric material. In the context of climate change, both new monitoring and new modelling techniques are required to observe and predict potential permafrost changes, e.g. the warming and degradation which may lead to the liberation of carbon (Arctic) and the destabilisation of permafrost slopes (mountains). Mountain permafrost occurrences in the European Alps are characterised by temperatures only a few degrees below zero and are therefore particularly sensitive to projected climate changes in the 21st century. Traditional permafrost observation techniques are mainly based on thermal monitoring in the vertical and horizontal dimensions, but they provide only weak indications of physical properties such as ice or liquid water content. Geophysical techniques can be used to characterise permafrost occurrences and to monitor their changes, as the physical properties of frozen and unfrozen ground measured by geophysical techniques are markedly different. In recent years, electromagnetic, seismic, and especially electrical methods have been used to continuously monitor permafrost occurrences and to detect long-term changes within the active layer and in the ice content within the permafrost layer. On the other hand, coupled transient thermal/hydraulic models are used to predict the evolution of permafrost occurrences under different climate change scenarios. These models rely on suitable validation data for a certain observation period, which is usually restricted to data sets of ground temperature and active layer depth. Very important initialisation and validation data for permafrost models are, however, ground ice content and unfrozen water content in the active layer. In this contribution we present a geophysical monitoring application to estimate ice and water content and their evolution in time at a permafrost station in the Swiss Alps. These data are then used to validate the coupled mass and energy balance soil model COUP, which is used for long-term projections of the permafrost evolution in the Swiss Alps. For this, we apply the recently developed four-phase model, which is based on simple petrophysical relationships and which uses geoelectric and seismic tomographic data sets as input data. In addition, we use continuously measured electrical resistivity tomography data sets and soil moisture data at daily resolution to compare modelled ice content changes and geophysical observations at high temporal resolution. The results show still large uncertainties in both model approaches regarding the absolute ice content values, but much smaller uncertainties regarding the changes in ice and unfrozen water content. We conclude that this approach is well suited for the analysis of permafrost changes in both model and monitoring studies, even though more effort is needed to obtain in situ ground truth data of ice content and porosity.
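
The four-phase model mentioned above can be sketched from its two petrophysical building blocks: Archie's law, which links bulk resistivity to the liquid water fraction, and a time-average equation, which links P-wave slowness to all four phase fractions, closed by the constraint that the fractions sum to one. The sketch below assumes this commonly cited formulation; all parameter values are hypothetical placeholders.

    import numpy as np

    # Assumed four-phase formulation; parameters are illustrative only.
    rho_w, a, m, n_arch = 100.0, 1.0, 2.0, 2.0          # pore-water resistivity (ohm m), Archie parameters
    v_w, v_i, v_a, v_r = 1500.0, 3500.0, 330.0, 6000.0  # phase P-wave velocities (m/s)
    phi = 0.4                                           # assumed porosity; rock fraction = 1 - phi

    def four_phase(rho_bulk, v_bulk):
        """Return (f_water, f_ice, f_air) from bulk resistivity and P-wave velocity."""
        # Archie's law: rho = rho_w * a * phi**-m * Sw**-n  ->  solve for saturation Sw.
        Sw = (rho_w * a * phi ** -m / rho_bulk) ** (1.0 / n_arch)
        f_w = phi * min(Sw, 1.0)
        # Time-average equation: 1/v = f_w/v_w + f_i/v_i + f_a/v_a + (1 - phi)/v_r.
        rest = 1.0 / v_bulk - f_w / v_w - (1.0 - phi) / v_r
        # Split the remaining pore space (phi - f_w) between ice and air:
        # f_i/v_i + f_a/v_a = rest  and  f_i + f_a = phi - f_w.
        f_a = (rest - (phi - f_w) / v_i) / (1.0 / v_a - 1.0 / v_i)
        f_i = phi - f_w - f_a
        return f_w, f_i, f_a

    print(four_phase(rho_bulk=5000.0, v_bulk=2500.0))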

Hauck, C.; Hilbich, C.; Marmy, A.; Scherler, M.

2013-12-01

324

Development and validation of a realistic head model for EEG  

NASA Astrophysics Data System (ADS)

The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain, and thus an understanding of how current flow within the brain changes as structures are added to the forward model. Validation compares calculations using different forward models with experimental results obtained by stimulating with implanted dipole electrodes. Human brain tissue displays inhomogeneity in electrical conductivity and also displays anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of the brain tissue types is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and computed x-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool, and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients present a unique opportunity to generate sources at known positions in the human brain using the depth electrodes. Known dipolar sources were created inside the human brain at known locations by injecting a weak biphasic current (sub-threshold) between alternate contacts on the depth electrode. The corresponding bioelectric fields (intracranial and scalp EEG) were recorded in patients during the injection of biphasic pulses. The in vivo depth stimulation data provide a direct test of the performance of the forward model. The factors affecting the accuracy of the intracranial measurements are quantified in a precise manner by studying the effects of including different tissue types and anisotropy. The results show that white matter anisotropy is crucial for predicting the electric fields at intracranial locations, thereby affecting the source reconstructions. Accurate modeling of the skull is necessary for accurately predicting the scalp measurements. In sum, with the aid of high-resolution finite element realistic head models it is possible to accurately predict electric fields generated by current sources in the brain, and thus to understand in a precise way the relationship between electromagnetic measurements and neuronal activity at the voxel scale.

Bangera, Nitin Bhalchandra

325

Comparison of LIDAR system performance for alternative single-mode receiver architectures: modeling and experimental validation  

NASA Astrophysics Data System (ADS)

In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct detection, (ii) an optically preamplified PIN receiver, (iii) PIN-based coherent detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally validated detection statistics can be used as part of an end-to-end system model for projecting the rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.

Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.

2013-05-01

326

Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.  

SciTech Connect

A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model that represents experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve the models, and the code's verification are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone, and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

2004-10-01

327

Parallel Measurement and Modeling of Transport in the DARHT II Beamline on ETA II.  

National Technical Information Service (NTIS)

To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT ...

F. W. Chambers; B. A. Raymond; S. Falabella; B. S. Lee; R. A. Richardson; J. T. Weir; H. A. Davis; M. E. Schultze

2005-01-01

328

Improvements to and Validations of the QinetiQ Atmospheric Radiation Model (QARM)  

Microsoft Academic Search

The QinetiQ atmospheric radiation model (QARM) is a comprehensive model of the energetic radiation in the atmosphere. In this paper we report on the improvement and validation activities for this model. The improvements include the implementation of two additional cosmic ray models, a new response matrix, and dose rate and flight dose calculation facilities. Tests/validations of the model have been carried out

Fan Lei; A. Hands; S. Clucas; C. Dyer; P. Truscott

2005-01-01

329

Improvement to and Validations of the QinetiQ Atmospheric Radiation Model (QARM)  

Microsoft Academic Search

The QinetiQ atmospheric radiation model (QARM) is a comprehensive model of the energetic radiation in the atmosphere. In this paper we report on the improvement and validation activities for this model. The improvements include the implementation of two additional cosmic ray models, a new response matrix, and dose rate and flight dose calculation facilities. Tests/validations of the model have been carried out

Fan Lei; Alex Hands; Simon Clucas; Clive Dyer; Pete Truscott

2006-01-01

330

Bioaerosol optical sensor model development and initial validation  

NASA Astrophysics Data System (ADS)

This paper describes the development and initial validation of a bioaerosol optical sensor model. This model was used to help determine design parameters and estimate performance of a new low-cost optical sensor for detecting bioterrorism agents. In order to estimate sensor performance in detecting biowarfare simulants and rejecting environmental interferents, use was made of a previously reported catalog of EEM (excitation/emission matrix) fluorescence cross-section measurements and previously reported multiwavelength-excitation biosensor modeling work. In the present study, the biosensor modeled employs a single high-power 365 nm UV LED source plus an IR laser diode for particle size determination. The sensor has four output channels: IR size channel, UV elastic channel and two fluorescence channels. The sensor simulation was used to select the fluorescence channel wavelengths of 400-450 and 450-600 nm. Using these selected fluorescence channels, the performance of the sensor in detecting simulants and rejecting interferents was estimated. Preliminary measurements with the sensor are presented which compare favorably with the simulation results.
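
The channel-selection step described above amounts to integrating candidate emission bands and comparing the resulting signals. A minimal sketch with a synthetic emission spectrum (all shapes and numbers are hypothetical, not the catalogued EEM data):

    import numpy as np

    wl = np.arange(380, 700)                       # emission wavelength grid (nm)
    # Hypothetical fluorescence emission spectrum (arbitrary units):
    spectrum = np.exp(-0.5 * ((wl - 460) / 40.0) ** 2)

    def band_signal(lo, hi):
        sel = (wl >= lo) & (wl < hi)
        return spectrum[sel].sum()

    ch1 = band_signal(400, 450)                    # fluorescence channel 1
    ch2 = band_signal(450, 600)                    # fluorescence channel 2
    print("channel ratio ch1/ch2:", round(ch1 / ch2, 3))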

Campbell, Steven D.; Jeys, Thomas H.; Eapen, Xuan Le

2007-05-01

331

Passive millimeter-wave imaging model application and validation  

NASA Astrophysics Data System (ADS)

The military use of millimeter wave radiometers has been studied since the 1960s. It is only recently that advances in the technology have made passive millimeter wave (PMMW) systems practical. It is well established that metal targets will have a large contrast ratio versus the background in the millimeter wave (MMW) regime and that atmospheric propagation through clouds, fog, and light rain is possible. The limitations have been the noise figures of the detectors, the size of the systems, and the cost of the systems. Through the advent of millimeter wave monolithic integrated circuit technology, MMW devices are becoming smaller, more sensitive, and less expensive. In addition, many efforts are currently under way to develop PMMW array imaging devices. This renewed interest has likewise brought forth the need for passive millimeter wave system modeling capabilities. To fill this need, Nichols Research Corporation has developed for Eglin AFB a physics-based image synthesis code capable of modeling the dominant effects in the MMW regime. This code has been developed to support the development of the next generation of PMMW seeker systems. This paper will describe the phenomenology of PMMW signatures, the Irma software, validation of the Irma models, and the application of the models to both Air Force and Navy problems.

Blume, Bradley T.; Chenault, David B.

1997-06-01

332

First principles Candu fuel model and validation experimentation  

SciTech Connect

Many modeling projects on nuclear fuel rest on a quantitative understanding of the co-existing phases at various stages of burnup. Since the various fission products have considerably different abilities to chemically associate with oxygen, and the O/M ratio is slowly changing as well, the chemical potential (generally expressed as an equivalent oxygen partial pressure) is a function of burnup. Concurrently, well-recognized small fractions of new phases such as inert gas, noble metals, zirconates, etc. also develop. To further complicate matters, the dominant UO2 fuel phase may be non-stoichiometric and most of the minor phases have a variable composition dependent on temperature and possible contact with the coolant in the event of a sheathing defect. A Thermodynamic Fuel Model to predict the phases in partially burned Candu nuclear fuel containing many major fission products has been under development. This model is capable of handling non-stoichiometry in the UO2 fluorite phase, dilute solution behaviour of significant solute oxides, noble metal inclusions, a second metal solid solution U(Pd-Rh-Ru)3, zirconate and uranate solutions as well as other minor solid phases, and volatile gaseous species. The treatment is a melding of several thermodynamic modeling projects dealing with isolated aspects of this important multi-component system. To simplify the computations, the number of elements has been limited to twenty major representative fission products known to appear in spent fuel. The proportion of elements must first be generated using SCALES-5. Oxygen is inferred from the concentration of the other elements. Provision to study the disposition of very minor fission products is included within the general treatment but these are introduced only on an as-needed basis for a particular purpose. The building blocks of the model are the standard Gibbs energies of formation of the many possible compounds expressed as a function of temperature. To these data are added mixing terms associated with the appearance of the component species in particular phases. To validate the model, coulometric titration experiments relating the chemical potential of oxygen to the moles of oxygen introduced to SIMFUEL are underway. A description of the apparatus to oxidize and reduce samples in a controlled way in a H2/H2O mixture is presented. Existing measurements for irradiated fuel, both defected and non-defected, are also being incorporated into the validation process. (authors)

Corcoran, E.C.; Kaye, M.H.; Lewis, B.J.; Thompson, W.T. [Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Akbari, F. [Atomic Energy of Canada Limited - Chalk River Ontario, Ontario KOJ IJ0 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Higgs, J.D. [Atomic Energy of Canada Limited - 430 Bayside Drive, Saint John, NB E2J 1A8 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Verrall, R.A.; He, Z.; Mouris, J.F. [Atomic Energy of Canada Limited - Chalk River Laboratories, Chalk River Ontario, Ontario KOJ IJ0 (Canada)

2007-07-01

333

Validating clustering of molecular dynamics simulations using polymer models  

PubMed Central

Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers to rigorously test the utility of clustering algorithms for studying biopolymers.
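
A minimal sketch of the clustering step described above, using scikit-learn's SpectralClustering on a precomputed affinity matrix derived from pairwise structural distances; synthetic coordinates stand in for MD conformations.

    import numpy as np
    from sklearn.cluster import SpectralClustering

    rng = np.random.default_rng(1)
    # Synthetic "conformations": two meta-stable states in a toy 10-D space.
    state_a = rng.normal(loc=0.0, scale=0.3, size=(50, 10))
    state_b = rng.normal(loc=2.0, scale=0.3, size=(50, 10))
    frames = np.vstack([state_a, state_b])

    # Pairwise distances -> Gaussian affinity, a common choice for spectral clustering.
    d = np.linalg.norm(frames[:, None, :] - frames[None, :, :], axis=-1)
    affinity = np.exp(-d ** 2 / (2.0 * d.mean() ** 2))

    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)
    print("cluster sizes:", np.bincount(labels))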

2011-01-01

334

Use of Synchronized Phasor Measurements for Model Validation in ERCOT  

NASA Astrophysics Data System (ADS)

This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMUs) have been installed in the ERCOT grid in recent years, phasor data with a resolution of 30 samples per second are being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. The real-time monitoring software RTDMS® enables ERCOT to analyze small-signal stability conditions by monitoring phase angles and oscillations. The recorded phasor data enable ERCOT to validate the existing dynamic models of conventional and/or wind generators.

Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

2013-05-01

335

Validation of a Deterministic Vibroacoustic Response Prediction Model  

NASA Technical Reports Server (NTRS)

This report documents the recently completed effort involving validation of a deterministic theory for the random vibration problem of predicting the response of launch pad structures in the low-frequency range (0 to 50 hertz). Use of the Statistical Energy Analysis (SEA) methods is not suitable in this range. Measurements of launch-induced acoustic loads and subsequent structural response were made on a cantilever beam structure placed in close proximity (200 feet) to the launch pad. Innovative ways of characterizing random, nonstationary, non-Gaussian acoustics are used for the development of a structure's excitation model. Extremely good correlation was obtained between analytically computed responses and those measured on the cantilever beam. Additional tests are recommended to bound the problem to account for variations in launch trajectory and inclination.

Caimi, Raoul E.; Margasahayam, Ravi

1997-01-01

336

A Data set for the Validation of Reflectance Models  

NASA Astrophysics Data System (ADS)

Three mature forest stands in the CHRIS scene of the Järvselja forest test site in Estonia have been selected for the validation of forest reflectance models. CHRIS data are supported by measurements of downward spectral fluxes and airborne reflectance measurements at the test site. Rigorous atmospheric correction of CHRIS data has been performed based on the use of spectral measurements at the test site and AERONET sun-photometer data. Airborne measurements are used for the updating of CHRIS calibration coefficients. The ground truth measurements include data on stand structure - exact positions and breast-height diameter of trees, tree crown dimensions, LAI-2000 data and hemispherical images of tree canopy. Reflectance spectra (400-1050 nm) of leaves and needles, of trunk and branch bark, and of ground vegetation have been measured.

Kuusk, A.; Kuusk, J.; Lang, M.

2008-08-01

337

Using remote sensing for validation of a large scale hydrologic and hydrodynamic model in the Amazon  

NASA Astrophysics Data System (ADS)

We present the validation of the large-scale, catchment-based hydrological MGB-IPH model in the Amazon River basin. In this model, physically-based equations are used to simulate the hydrological processes, such as the Penman-Monteith method to estimate evapotranspiration, or the Moore and Clarke infiltration model. A new feature recently introduced in the model is a 1D hydrodynamic module for river routing. It uses the full Saint-Venant equations and a simple floodplain storage model. River and floodplain geometry parameters are extracted from the SRTM DEM using specially developed GIS algorithms that provide catchment discretization, estimation of river cross-section geometry, and water storage volume variations in the floodplains. The model was forced using satellite-derived daily rainfall TRMM 3B42, calibrated against discharge data, and first validated using daily discharges and water levels from 111 and 69 stream gauges, respectively. Then, we performed a validation against remote sensing derived hydrological products, including (i) monthly Terrestrial Water Storage (TWS) anomalies derived from GRACE, (ii) river water levels derived from ENVISAT satellite altimetry data (212 virtual stations from Santos da Silva et al., 2010) and (iii) a multi-satellite monthly global inundation extent dataset at ~25 x 25 km spatial resolution (Papa et al., 2010). Validation against river discharges shows good performance of the MGB-IPH model. For 70% of the stream gauges, the Nash-Sutcliffe efficiency index (ENS) is higher than 0.6, and at Óbidos, close to the Amazon River outlet, ENS equals 0.9 and the model bias equals -4.6%. The largest errors are located in drainage areas outside Brazil, and we speculate that this is due to the poor quality of rainfall datasets in these poorly monitored and/or mountainous areas. Validation against water levels shows that the model performs well in the major tributaries. For 60% of virtual stations, ENS is higher than 0.6. But, similarly, the largest errors are also located in drainage areas outside Brazil, mostly the Japurá River, and in the lower Amazon River. In the latter, correlation with observations is high but the model underestimates the amplitude of water levels. We also found a large bias between model and ENVISAT water levels, ranging from -3 to -15 m. The model provided TWS in good accordance with GRACE estimates. The ENS value for TWS over the whole Amazon equals 0.93. We also analyzed results in 21 sub-regions of 4 x 4°. ENS is smaller than 0.8 in only 5 areas, and these are found mostly in the northwest part of the Amazon, possibly due to the same errors reported in the discharge results. Flood extent validation is under development, but a previous analysis in the Brazilian part of the Solimões River basin suggests good model performance. The authors are grateful for the financial and operational support from the Brazilian agencies FINEP, CNPq and ANA and from the French observatories HYBAM and SOERE RBV.
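
The Nash-Sutcliffe efficiency used throughout this validation is straightforward to compute; a minimal sketch with hypothetical discharges (not the Amazon gauge data):

    import numpy as np

    def nash_sutcliffe(obs, sim):
        """ENS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
        1 is a perfect fit; 0 means no better than the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    obs = [1000, 1200, 1500, 1300, 1100]   # observed daily discharge (m^3/s)
    sim = [950, 1250, 1400, 1350, 1000]    # simulated daily discharge (m^3/s)
    print(round(nash_sutcliffe(obs, sim), 3))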

Paiva, R. C.; Bonnet, M.; Buarque, D. C.; Collischonn, W.; Frappart, F.; Mendes, C. B.

2011-12-01

338

A cross-validation deletion-substitution-addition model selection algorithm: Application to marginal structural models  

Microsoft Academic Search

The cross-validation deletion–substitution–addition (cvDSA) algorithm is based on data-adaptive estimation methodology to select and estimate marginal structural models (MSMs) for point treatment studies as well as models for conditional means where the outcome is continuous or binary. The algorithm builds and selects models based on user-defined criteria for model selection, and utilizes a loss function-based estimation procedure to distinguish between

Thaddeus J. Haight; Yue Wang; Mark J. van der Laan; Ira B. Tager

2010-01-01

339

A third-generation wave model for coastal regions 1. Model description and validation  

Microsoft Academic Search

A third-generation numerical wave model to compute random, short-crested waves in coastal regions with shallow water and ambient currents (Simulating Waves Nearshore (SWAN)) has been developed, implemented, and validated. The model is based on a Eulerian formulation of the discrete spectral balance of action density that accounts for refractive propagation over arbitrary bathymetry and current fields. It is driven by

N. Booij; R. C. Ris; L. H. Holthuijsen

1999-01-01

340

Distributed hydrological modelling of a Mediterranean mountainous catchment – Model construction and multi-site validation  

Microsoft Academic Search

A multi-site validation approach is necessary to further constrain distributed hydrological models. Such an approach has been tested on the Gardon catchment located in the mountainous Mediterranean zone of southern France using data gathered over a 10 year period on nine internal subcatchments. A spatially distributed hydrological model linked to a Geographical Information System, was developed on the basis of

Roger Moussa; Nanée Chahinian; Claude Bocquillon

2007-01-01

341

Circulation Control Model Experimental Database for CFD Validation  

NASA Technical Reports Server (NTRS)

A 2D circulation control wing was tested in the Basic Aerodynamic Research Tunnel at the NASA Langley Research Center. A traditional circulation control wing employs tangential blowing along the span over a trailing-edge Coanda surface for the purpose of lift augmentation. This model has been tested extensively at the Georgia Tech Research Institute for the purpose of performance documentation at various blowing rates. The current study seeks to expand on the previous work by documenting additional flow-field data needed for validation of computational fluid dynamics. Two jet momentum coefficients were tested during this entry: 0.047 and 0.114. Boundary-layer transition was investigated and turbulent boundary layers were established on both the upper and lower surfaces of the model. Chordwise and spanwise pressure measurements were made, and tunnel sidewall pressure footprints were documented. Laser Doppler Velocimetry measurements were made on both the upper and lower surface of the model at two chordwise locations (x/c = 0.8 and 0.9) to document the state of the boundary layers near the spanwise blowing slot.

Paschal, Keith B.; Neuhart, Danny H.; Beeler, George B.; Allan, Brian G.

2012-01-01

342

Validation of an Acoustic Impedance Prediction Model for Skewed Resonators  

NASA Technical Reports Server (NTRS)

An impedance prediction model was validated experimentally to determine the composite impedance of a series of high-aspect-ratio slot resonators incorporating channel skew and sharp bends. Such structures are useful for packaging acoustic liners into constrained spaces for turbofan noise control applications. A formulation of the Zwikker-Kosten Transmission Line (ZKTL) model, incorporating the Richards correction for rectangular channels, is used to calculate the composite normalized impedance of a series of six multi-slot resonator arrays with constant channel length. Experimentally, acoustic data were acquired in the NASA Langley Normal Incidence Tube over the frequency range of 500 to 3500 Hz at 120 and 140 dB OASPL. Normalized impedance was reduced using the Two-Microphone Method for the various combinations of channel skew and sharp 90° and 180° bends. Results show that the presence of skew and/or sharp bends does not significantly alter the impedance of a slot resonator as compared to a straight resonator of the same total channel length. ZKTL predicts the impedance of such resonators very well over the frequency range of interest. The model can be used to design arrays of slot resonators that can be packaged into complex geometries heretofore unsuitable for effective acoustic treatment.
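
One common way to form the composite impedance of such an array is to combine the individual channel impedances in parallel, weighted by their open-area fractions; the sketch below shows only that combination step (the ZKTL channel impedance itself is not reproduced, the normalization convention is an assumption, and all values are hypothetical).

    # Parallel combination of per-channel normalized impedances, weighted by
    # each channel group's open-area fraction. Hypothetical values throughout.
    channels = [
        # (normalized impedance z = theta + i*chi, open-area fraction)
        (0.8 + 1.2j, 0.03),
        (0.9 + 0.7j, 0.02),
    ]
    total_porosity = sum(sigma for _, sigma in channels)
    admittance = sum(sigma / z for z, sigma in channels)
    z_composite = total_porosity / admittance
    print(f"composite normalized impedance: {z_composite:.3f}")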

Howerton, Brian M.; Parrott, Tony L.

2009-01-01

343

Verification and Validation of Core Mechanical Performance Code ARKAS with IAEA Benchmark Problems, (II)  

Microsoft Academic Search

Verification and validation of the “ARKAS” code, using problems defined in the IWGFR Coordinated Research Programme (CRP) for the comparison of LMFBR Core Mechanics Codes, are discussed. The problems to be used in the verification (code against code) and validation (code against experiment) were defined and calculated by 11 core mechanics codes from 9 countries. The solutions obtained by these

Masatoshi NAKAGAWA

1993-01-01

344

User's guide for the stock-recruitment model validation program. Environmental Sciences Division Publication No. 1985  

SciTech Connect

SRVAL is a FORTRAN IV computer code designed to aid in assessing the validity of curve-fits of the linearized Ricker stock-recruitment model, modified to incorporate multiple-age spawners and to include an environmental variable, to variously processed annual catch-per-unit effort statistics for a fish population. It is sometimes asserted that curve-fits of this kind can be used to determine the sensitivity of fish populations to such man-induced stresses as entrainment and impingement at power plants. The SRVAL code was developed to test such assertions. It was utilized in testimony written in connection with the Hudson River Power Case (US Environmental Protection Agency, Region II). This testimony was recently published as a NUREG report. Here, a user's guide for SRVAL is presented.
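
For reference, the Ricker stock-recruitment model referred to above takes the form R = aS·exp(-bS), which linearizes to ln(R/S) = ln(a) - bS; an environmental variable enters as an additional linear term. A minimal least-squares sketch of that linearized fit on synthetic data (all coefficients hypothetical):

    import numpy as np

    rng = np.random.default_rng(2)
    S = rng.uniform(100, 1000, size=40)        # spawner abundance
    E = rng.normal(size=40)                    # environmental variable
    a_true, b_true, c_true = 4.0, 0.002, 0.3   # hypothetical parameters
    R = a_true * S * np.exp(-b_true * S + c_true * E
                            + rng.normal(scale=0.1, size=40))

    # Linearized Ricker: ln(R/S) = ln(a) - b*S + c*E, fit by least squares.
    y = np.log(R / S)
    X = np.column_stack([np.ones_like(S), S, E])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("ln(a), -b, c =", np.round(coef, 4))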

Christensen, S.W.; Kirk, B.L.; Goodyear, C.P.

1982-06-01

345

Forces in the Shoulder Joint: on validation of musculoskeletal shoulder models  

Microsoft Academic Search

Detailed information about muscle forces in the human musculoskeletal system is in high demand for several applications. Unfortunately, the measurement of muscle forces in vivo is hardly possible. To date, musculoskeletal models are the best alternative to the direct measurement of these forces. A major concern in musculoskeletal modeling is, however, model validity. To validate a model we need to compare its predictions

A. Asadi Nikooyan

2011-01-01

346

An evaluation of diagnostic tests and their roles in validating forest biometric models  

Microsoft Academic Search

Model validation is an important part of model development. It is performed to increase the credibility of a model and gain sufficient confidence in it. This paper evaluated the usefulness of 10 statistical tests, five parametric and five nonparametric, in validating forest biometric models. The five parametric tests are the paired t test, the χ² test, the separate t test, the simultaneous
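
As an illustration of the simplest of these tests, a paired t test compares observed values with model predictions made for the same units; a minimal sketch using scipy (hypothetical numbers):

    from scipy import stats

    # Observed vs model-predicted values on the same plots (hypothetical data):
    observed = [12.1, 15.3, 9.8, 20.4, 17.6, 14.2]
    predicted = [11.5, 16.0, 10.2, 19.1, 18.3, 13.7]

    t_stat, p_value = stats.ttest_rel(observed, predicted)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
    # A large p-value gives no evidence of systematic bias between model and data.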

Yuqing Yang; Robert A. Monserud; Shongming Huang

2004-01-01

347

Validation model for Raman based skin carotenoid detection.  

PubMed

Raman spectroscopy holds promise as a rapid, objective, non-invasive optical method for the detection of carotenoid compounds in human tissue in vivo. Carotenoids are of interest due to their functions as antioxidants and/or optical absorbers of phototoxic light at deep blue and near-UV wavelengths. In the macular region of the human retina, carotenoids may prevent or delay the onset of age-related tissue degeneration. In human skin, they may help prevent premature skin aging, and are possibly involved in the prevention of certain skin cancers. Furthermore, since carotenoids exist in high concentrations in a wide variety of fruits and vegetables, and are routinely taken up by the human body through the diet, skin carotenoid levels may serve as an objective biomarker for fruit and vegetable intake. Before the Raman method can be accepted as a widespread optical alternative for carotenoid measurements, direct validation studies are needed to compare it with the gold standard of high performance liquid chromatography. This is because the tissue Raman response is in general accompanied by a host of other optical processes which have to be taken into account. In skin, the most prominent is strongly diffusive, non-Raman scattering, leading to relatively shallow light penetration of the blue/green excitation light required for resonant Raman detection of carotenoids. Also, sizable light attenuation exists due to the combined absorption from collagen, porphyrin, hemoglobin, and melanin chromophores, and additional fluorescence is generated by collagen and porphyrins. In this study, we investigate for the first time the direct correlation of in vivo skin tissue carotenoid Raman measurements with subsequent chromatography-derived carotenoid concentrations. As the tissue site we use heel skin, in which the stratum corneum layer thickness exceeds the light penetration depth; this site is free of optically confounding chromophores, can easily be optically accessed for in vivo RRS measurement, and can easily be removed for subsequent biochemical measurements. Excellent correlation (coefficient R=0.95) is obtained for this tissue site, which could serve as a model site for scaled-up future validation studies of large populations. The obtained results provide proof that resonance Raman spectroscopy is a valid non-invasive objective methodology for the quantitative assessment of carotenoid antioxidants in human skin in vivo. PMID:20678465

Ermakov, Igor V; Gellermann, Werner

2010-12-01

348

Simplified modeling of the EBR-II control rods  

Microsoft Academic Search

Simplified models of EBR-II control and safety rods have been developed for core modeling under various operational and shutdown conditions. A parametric study was performed on normal worth, high worth, and safety rod type control rods. A summary of worth changes due to individual modeling approximations is tabulated. Worth effects due to structural modeling simplification are negligible. Fuel region homogenization

1995-01-01

349

Calibration and Validation of Airborne InSAR Geometric Model  

NASA Astrophysics Data System (ADS)

The image registration or geo-coding is a very important step for many applications of airborne interferometric Synthetic Aperture Radar (InSAR), especially for those involving Digital Surface Model (DSM) generation, which requires accurate knowledge of the geometry of the InSAR system. The trajectory and attitude instabilities of the aircraft, however, introduce severe distortions into the three-dimensional (3-D) geometric model. The 3-D geometric model of an airborne SAR image depends on the SAR processor itself. When working in squinted mode, i.e., with an offset angle (squint angle) of the radar beam from the broadside direction, aircraft motion instabilities may produce distortions in the airborne InSAR geometric relationship which, if not properly compensated for during SAR imaging, may damage the image registration. The determination of locations in the SAR image depends on the irradiated topography and on exact knowledge of all signal delays: the range delay and chirp delay (adjusted by the radar operator) and internal delays which are unknown a priori. Hence, in order to obtain reliable results, these parameters must be properly calibrated. An airborne InSAR mapping system has been developed by the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS) to acquire three-dimensional geo-spatial data with high resolution and accuracy. To test the performance of the InSAR system, a Validation/Calibration (Val/Cal) campaign has been carried out in Sichuan province, south-west China, whose results are reported in this paper.

Chunming, Han; Huadong, Guo; Xijuan, Yue; Changyong, Dou; Mingming, Song; Yanbing, Zhang

2014-03-01

350

Validation of a New Rainbow Model Over the Hawaiian Islands  

NASA Astrophysics Data System (ADS)

A new realistic model of the rainbow has been developed at the CNRM. It is based on the Airy theory. The main entry parameters are the droplet size distribution, the angle of the sun above the horizon, the temperature of the droplets, and the wavelength. The island of Hawaii seems to be a perfect place for the validation of the rainbow model, not only because of its famous rainbows but also because of the convenient ring road along the coast. The older, lower islands offer more frequent viewing opportunities, owing to the proximity of clear sky to heavy rainfall. Both Oahu and Kauai, as well as the western part of Maui, have coastal roads that offer good access to rainbows. The best time to view rainbows is when the sun angle is lowest, in other words near the winter solstice. [Figure 1: map of mean annual rainfall for the islands of Kauai and Oahu, developed from the 2011 Rainfall Atlas of Hawaii; the base period of the statistics is 1978-2007. Figure 2: moisture zone map by Gon et al. (1998); blue areas are wet, green areas are mesic, yellow areas are dry.]

Ricard, J. L.; Adams, P. L.; Barckike, J.

2012-12-01

351

Validating the topographic climatology logic of the MTCLIM model  

SciTech Connect

The topographic climatology logic of the MTCLIM model was validated by comparing modeled air temperatures with remotely sensed, thermal infrared (TIR) surface temperatures from three Daedalus Thematic Mapper Simulator scenes. The TIR data were taken in 1990 near Sisters, Oregon, as part of the NASA OTTER project. The original air temperature calculation method was modified for the spatial context of this study. After stratifying by canopy closure and relative solar loading, r^2 values of 0.74, 0.89, and 0.97 were obtained for the March, June, and August scenes, respectively, using the modified air temperature algorithm. Consistently lower coefficients of determination were obtained using the original air temperature algorithm on the same data: r^2 values of 0.70, 0.52, and 0.66 for the March, June, and August samples, respectively. The difficulties of comparing screen-height air temperatures with remotely sensed surface temperatures are discussed, and several ideas for follow-on studies are suggested.

Glassy, J.M.; Running, S.W. [Univ. of Montana, Missoula, MT (United States)

1995-06-01

352

Narrowband VLF observations as validation of Plasmaspheric model  

NASA Astrophysics Data System (ADS)

PLASMON is a European Union FP7 project which will use observations of whistlers and field line resonances to construct a data-assimilative model of the plasmasphere. This model will be validated by comparison with electron precipitation data derived from narrowband VLF observations of subionospheric propagation from the AARDDVARK network. A VLF receiver on Marion Island, located at 46.9° S 37.1° E (L = 2.60), is able to observe the powerful NWC transmitter in Australia over a 1.4 < L < 3.0 path which passes exclusively over the ocean. The signal is thus very strong and exhibits an excellent signal-to-noise ratio. Data from the UltraMSK narrowband VLF receiver on Marion Island are used to examine evidence of particle precipitation along this path, thereby inferring the rate at which electrons are scattered into the bounce loss cone. This path covers a small range of L-values, so there is little ambiguity in the source of any perturbations. Perturbations detected on the path during geomagnetic storms should predominantly be responses to energetic electron precipitation processes occurring inside the plasmasphere. Comparisons will be made to preliminary plasmaspheric results from the PLASMON project.

Collier, Andrew; Clilverd, Mark; Rodger, C. J.; Delport, Brett; Lichtenberger, János

2012-07-01

353

Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model  

NASA Technical Reports Server (NTRS)

This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line-of-sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.
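
One common convention for such skill scores is a mean-squared-error score relative to a reference forecast, SS = 1 - MSE_model/MSE_ref; whether this matches the paper's exact definition is an assumption. A minimal sketch with hypothetical wind speeds and a constant climatological baseline as the reference:

    import numpy as np

    def mse(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return np.mean((a - b) ** 2)

    observed = [420, 480, 630, 550, 470, 440]    # hypothetical solar wind speeds (km/s)
    predicted = [430, 500, 580, 560, 490, 450]   # hypothetical model forecasts
    reference = [450] * 6                        # constant climatological baseline

    skill = 1.0 - mse(observed, predicted) / mse(observed, reference)
    print(f"skill score: {skill:.2f}")           # 1 = perfect, 0 = no better than baseline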

MacNeice, Peter

2009-01-01

354

A numerical model on transient, two-dimensional flow and heat transfer in He II  

NASA Astrophysics Data System (ADS)

A new numerical model is developed to study the unique features of flow and heat transfer in superfluid helium, or He II. The model, called the simplified model, is derived from the original two-fluid model. It consists of a conventional continuity equation, a momentum equation for the total fluid in the form of a modified Navier-Stokes equation, and an energy equation in the form of the conventional temperature-based energy equation, in which the heat flux due to Gorter-Mellink internal convection is properly incorporated. To verify the validity of the simplified model, its analytical results are compared with those of the original two-fluid model in the analysis of one-dimensional heat transfer in a vertical He II duct heated at the bottom boundary. To demonstrate the capability of the present model for multi-dimensional problems, a two-dimensional analysis is performed for internal-convection heat transfer in an He II pool with one of the walls partially heated. The two-dimensional results obtained by the present model are also compared with those of the modified two-dimensional model of Ramadan and Witt.
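
For reference, the Gorter-Mellink closure alluded to above is conventionally written as a cubic relation between the counterflow heat flux and the temperature gradient; a one-dimensional sketch in LaTeX (the paper's exact coefficients are not reproduced here):

    % Gorter-Mellink internal convection in He II: the heat flux q obeys a
    % cubic law rather than Fourier's law (1-D sketch).
    \[
      \frac{dT}{dx} = -f(T)\, q^{3}
      \qquad\Longrightarrow\qquad
      q = -\left(\frac{1}{f(T)}\left|\frac{dT}{dx}\right|\right)^{1/3}
          \operatorname{sgn}\!\left(\frac{dT}{dx}\right),
    \]
    % where f(T) is the temperature-dependent He II heat-conductivity function.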

Kitamura, T.; Shiramizu, K.; Fujimoto, N.; Rao, Y. F.; Fukuda, K.

355

Using Laboratory Magnetospheres to Develop and Validate Space Weather Models  

NASA Astrophysics Data System (ADS)

Reliable space weather predictions can be used to plan satellite operations, predict radio outages, and protect the electrical transmission grid. While direct observation of the solar corona and satellite measurements of the solar wind give warnings of possible subsequent geomagnetic activity, more accurate and reliable models of how solar fluxes affect the Earth's space environment are needed. Recent developments in laboratory magnetic dipoles have yielded well-confined high-beta plasmas with intense energetic electron belts similar to magnetospheres. With plasma diagnostics spanning from global to small spatial scales and user-controlled experiments, these devices can be used to study current issues in space weather such as fast particle excitation and rapid depolarization events. In levitated dipole experiments, which remove the collisional loss along field lines that normally dominates laboratory dipole plasmas, slow radial convection processes can be observed. We describe ongoing experiments and investigations that (i) control interchange mixing through application of vorticity injection, (ii) make whole-plasma, high-speed images of turbulent plasma dynamics, (iii) simulate nonlinear gyrokinetic dynamics of bounded driven dipole plasma, and (iv) compare laboratory plasma measurements and global convection models. [Figure: photographs of the LDX and CTX laboratory magnetospheres; trapped plasma and energetic particles are created and studied with a variety of imaging diagnostics, including multiple probes for simultaneous measurements of plasma structures and turbulent mixing.]

Mauel, M. E.; Garnier, D.; Kesner, J.

2012-12-01

356

Experimental validation of 2D profile photoresist shrinkage model  

NASA Astrophysics Data System (ADS)

For many years, lithographic resolution has been the main obstacle in allowing the pace of transistor densification to meet Moore's Law. For the 32 nm node and beyond, new lithography techniques will be used, including immersion ArF (iArF) lithography and extreme ultraviolet lithography (EUVL). As in the past, these techniques will use new types of photoresists with the capability to print smaller feature widths and pitches. These smaller feature sizes will also require the use of thinner layers of photoresists, such as under 100 nm. In previous papers, we focused on ArF and iArF photoresist shrinkage. We evaluated the magnitude of shrinkage for both R&D and mature resists as a function of chemical formulation, lithographic sensitivity, scanning electron microscope (SEM) beam condition, and feature size. Shrinkage results were determined by the well accepted methodology described in SEMATECH's CD-SEM Unified Specification. In other associated works, we first developed a 1-D model for resist shrinkage for the bottom linewidth and then a 2-D profile model that accounted for shrinkage of all aspects of a trapezoidal profile along a given linescan. A fundamental understanding of the phenomenology of the shrinkage trends was achieved, including how the shrinkage behaves differently for different sized and shaped features. In the 1-D case, calibration of the parameters to describe the photoresist material and the electron beam was all that was required to fit the models to real shrinkage data, as long as the photoresist was thick enough that the beam could not penetrate the entire layer of resist. The later 2-D model included improvements for solving the CD shrinkage in thin photoresists, which is now of great interest for upcoming realistic lithographic processing to explore the change in resist profile with electron dose and to predict the influence of initial resist profile on shrinkage characteristics. The 2-D model also included shrinkage due to both the primary electron beam directly impacting the profile and backscattered electrons from the electron beam impacting the surrounding substrate. This dose from backscattering was shown to be an important component in the resist shrinkage process, such that at lower beam energies, it dominates linewidth shrinkage. In this work, results from a previous paper will be further explored with numerically simulated results and compared to experimental results to validate the model. With these findings, we can demonstrate the state of readiness of these models for predicting the shrinkage characteristics of photoresist measurements and estimating the errors in calculating the original CD from the shrinkage trend.
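
A common way to estimate the original CD from repeated measurements is to fit an exponential decay to the measured widths as a function of measurement number and extrapolate back to zero dose; the sketch below shows only that fitting step on synthetic data (it is not the authors' 2-D profile model, which also resolves backscatter and profile shape):

    import numpy as np
    from scipy.optimize import curve_fit

    def shrink(n, cd_inf, amp, k):
        """Exponential shrinkage trend: CD after n repeated measurements."""
        return cd_inf + amp * np.exp(-n / k)

    # Hypothetical repeated CD-SEM measurements of one feature (nm):
    n_meas = np.arange(1, 11)
    cd = np.array([48.2, 46.9, 45.9, 45.2, 44.6, 44.2, 43.9, 43.7, 43.5, 43.4])

    popt, _ = curve_fit(shrink, n_meas, cd, p0=[43.0, 6.0, 3.0])
    cd0 = shrink(0, *popt)     # extrapolated "unshrunk" CD at zero dose
    print(f"estimated original CD: {cd0:.1f} nm")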

Bunday, Benjamin; Cordes, Aaron; Self, Andy; Ferry, Lorena; Danilevsky, Alex

2011-03-01

357

Nonparametric model validations for hidden Markov models with applications in financial econometrics  

PubMed Central

We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.
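
As a simplified illustration of the quantity being tested, the sketch below forms a kernel estimate of the transition density p(y | x) from consecutive observations and compares it pointwise with a parametric candidate (a Gaussian AR(1) here). The paper's simultaneous confidence envelope is substantially more involved; everything below is an assumed toy setup.

    import numpy as np
    from scipy.stats import gaussian_kde, norm

    rng = np.random.default_rng(0)
    x = np.empty(2000)
    x[0] = 0.0
    for t in range(1, x.size):                   # simulate a Gaussian AR(1) series
        x[t] = 0.6 * x[t - 1] + rng.normal(scale=1.0)

    pairs = np.vstack([x[:-1], x[1:]])           # (x_t, x_{t+1}) pairs
    joint_kde = gaussian_kde(pairs)              # kernel estimate of f(x, y)
    marg_kde = gaussian_kde(x[:-1])              # kernel estimate of f(x)

    def transition_density(xv, yv):
        # Kernel estimate of p(y | x) = f(x, y) / f(x)
        return joint_kde([[xv], [yv]])[0] / marg_kde([xv])[0]

    # Pointwise comparison with the parametric AR(1) transition density
    xv, yv = 0.5, 0.8
    print(transition_density(xv, yv), norm.pdf(yv, loc=0.6 * xv, scale=1.0))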

Zhao, Zhibiao

2011-01-01

358

Mitigation of turbidity currents in reservoirs with passive retention systems: validation of CFD modeling  

NASA Astrophysics Data System (ADS)

Sediment deposition by continuous turbidity currents may affect eco-environmental river dynamics in natural reservoirs and hinder the maneuverability of bottom discharge gates in dam reservoirs. In recent years, innovative techniques have been proposed to enforce the deposition of turbidity further upstream in the reservoir (and away from the dam), namely the use of solid and permeable obstacles such as water jet screens, geotextile screens, etc. The main objective of this study is to validate a computational fluid dynamics (CFD) code applied to the simulation of the interaction between a turbidity current and a passive retention system designed to induce sediment deposition. To accomplish the proposed objective, laboratory tests were conducted in which a simple obstacle configuration was subjected to the passage of currents with different initial sediment concentrations. The experimental data were used to build benchmark cases to validate the 3D CFD software ANSYS-CFX. Sensitivity tests of mesh design, turbulence models and discretization requirements were performed. The validation consisted in comparing experimental and numerical results, involving instantaneous and time-averaged sediment concentrations and velocities. In general, good agreement between the numerical and the experimental values is achieved when: i) realistic outlet conditions are specified, ii) channel roughness is properly calibrated, iii) two-equation k-ε turbulence models are employed, and iv) a fine mesh is employed near the bottom boundary. Acknowledgements: This study was funded by the Portuguese Foundation for Science and Technology through the project PTDC/ECM/099485/2008. The first author thanks Professor Moitinho de Almeida from ICIST and all members of the project and of the Fluvial Hydraulics group of CEHIDRO for their assistance.

Ferreira, E.; Alves, E.; Ferreira, R. M. L.

2012-04-01

359

Alaska North Slope Tundra Travel Model and Validation Study  

SciTech Connect

The Alaska Department of Natural Resources (DNR), Division of Mining, Land, and Water manages cross-country travel, typically associated with hydrocarbon exploration and development, on Alaska's arctic North Slope. This project is intended to provide natural resource managers with objective, quantitative data to assist decision making regarding opening of the tundra to cross-country travel. DNR designed standardized, controlled field trials, with baseline data, to investigate the relationships between winter exploration vehicle treatments and the independent variables of ground hardness, snow depth, and snow slab thickness, as they relate to the dependent variables of active layer depth, soil moisture, and photosynthetically active radiation (a proxy for plant disturbance). Changes in the dependent variables were used as indicators of tundra disturbance. Two main tundra community types were studied: Coastal Plain (wet graminoid/moist sedge shrub) and Foothills (tussock). DNR constructed four models to address physical soil properties: two models for each main community type, one predicting change in depth of active layer and a second predicting change in soil moisture. DNR also investigated the limited potential management utility of using soil temperature, the amount of photosynthetically active radiation (PAR) absorbed by plants, and changes in microphotography as tools for the identification of disturbance in the field. DNR operated under the assumption that changes in the abiotic factors of active layer depth and soil moisture drive alteration in tundra vegetation structure and composition. Statistically significant differences in depth of active layer, soil moisture at a 15 cm depth, soil temperature at a 15 cm depth, and the absorption of photosynthetically active radiation were found among treatment cells and among treatment types. The models were unable to thoroughly investigate the interacting role between snow depth and disturbance due to a lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However, the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance from cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally, DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

Harry R. Bader; Jacynthe Guimond

2006-03-01

360

The role of global cloud climatologies in validating numerical models  

NASA Technical Reports Server (NTRS)

Reliable estimates of the components of the surface radiation budget are important in studies of ocean-atmosphere interaction, land-atmosphere interaction, ocean circulation and in the validation of radiation schemes used in climate models. The methods currently under consideration must necessarily make certain assumptions regarding both the presence of clouds and their vertical extent. Because of the uncertainties in assumed cloudiness, all these methods involve perhaps unacceptable uncertainties. Here, a theoretical framework that avoids the explicit computation of cloud fraction and the location of cloud base in estimating the surface longwave radiation is presented. Estimates of the global surface downward fluxes and the oceanic surface net upward fluxes were made for four months (April, July, October and January) in 1985 to 1986. These estimates are based on a relationship between cloud radiative forcing at the top of the atmosphere and the surface obtained from a general circulation model. The radiation code is the version used in the UCLA/GLA general circulation model (GCM). The longwave cloud radiative forcing at the top of the atmosphere as obtained from Earth Radiation Budget Experiment (ERBE) measurements is used to compute the forcing at the surface by means of the GCM-derived relationship. This, along with clear-sky fluxes from the computations, yields maps of the downward longwave fluxes and net upward longwave fluxes at the surface. The calculated results are discussed and analyzed. The results are consistent with current meteorological knowledge and explainable on the basis of previous theoretical and observational work; therefore, it can be concluded that this method is applicable as one of the ways to obtain the surface longwave radiation fields from currently available satellite data.
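
A minimal sketch of the flux reconstruction described above, under assumed placeholder coefficients: the surface longwave cloud forcing is obtained from the TOA forcing through a (here linear) GCM-derived relationship and added to the computed clear-sky flux. The coefficients a and b are illustrative stand-ins, not the UCLA/GLA GCM regression.

    def surface_downward_lw(f_clear_sky, crf_toa, a=0.0, b=1.4):
        # Downward LW flux at the surface (W m^-2):
        #   f_clear_sky - computed clear-sky downward LW flux at the surface
        #   crf_toa     - LW cloud radiative forcing at TOA (e.g., from ERBE)
        #   a, b        - placeholder regression coefficients, NOT the GCM-derived ones
        crf_surface = a + b * crf_toa        # TOA-to-surface forcing relationship
        return f_clear_sky + crf_surface

    print(surface_downward_lw(f_clear_sky=310.0, crf_toa=25.0))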

Harshvardhan

1991-01-01

361

PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II  

SciTech Connect

To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components, this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and to the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system, where it can read magnet current settings for real-time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for efficient convergence on a usable tune. We show real-time comparisons of simulation and experiment and explore the successes and limitations of this close-coupled approach.

Chambers, F W; Raymond, B A; Falabella, S; Lee, B S; Richardson, R A; Weir, J T; Davis, H A; Schultze, M E

2005-05-31

362

Validation of Community Models: Identifying Events in Space Weather Model Timelines  

NASA Technical Reports Server (NTRS)

I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.
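
A hedged sketch of the kind of event detection described above: smooth the solar wind speed time line, then flag slow-to-fast transitions where the smoothed speed rises through a threshold and stays there. The threshold, window, and dwell values are assumptions for illustration, not the documented algorithm.

    import numpy as np

    def detect_speed_transitions(speed_kms, threshold=500.0, window=25, dwell=12):
        # Flag slow-to-fast transitions: the smoothed speed rises through
        # `threshold` and stays above it for at least `dwell` samples.
        kernel = np.ones(window) / window
        smooth = np.convolve(speed_kms, kernel, mode="same")   # running mean
        above = smooth >= threshold
        events = []
        for i in range(1, len(above) - dwell):
            if not above[i - 1] and above[i:i + dwell].all():
                events.append(i)
        return events

    # Hypothetical hourly speeds: slow wind plus one high-speed stream
    t = np.arange(600)
    speed = 380.0 + 220.0 * np.exp(-0.5 * ((t - 300) / 40.0) ** 2)
    print(detect_speed_transitions(speed))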

MacNeice, Peter

2009-01-01

363

Using Process Algebra to Validate Behavioral Aspects of Object-Oriented Models  

Microsoft Academic Search

We present in this paper a rigorous and automated approach for the behavioral validation of control software systems. This approach relies on metamodeling, model transformations and process algebra, and combines semi-formal object-oriented models with formal validation. We perform the validation of behavioral aspects of object-oriented models by using a projection into a well-defined formal technical space (Finite State

Alban Rasse; Jean-marc Perronne; Pierre-alain Muller; Bernard Thirion

2005-01-01

364

Model of the Expansion of H II Region RCW 82  

NASA Astrophysics Data System (ADS)

This paper aims to resolve the problem of formation of young objects observed in the RCW 82 H II region. In the framework of a classical trigger model the estimated time of fragmentation is larger than the estimated age of the H II region. Thus the young objects could not have formed during the dynamical evolution of the H II region. We propose a new model that helps resolve this problem. This model suggests that the H II region RCW 82 is embedded in a cloud of limited size that is denser than the surrounding interstellar medium. According to this model, when the ionization-shock front leaves the cloud it causes the formation of an accelerating dense gas shell. In the accelerated shell, the effects of the Rayleigh-Taylor (R-T) instability dominate and the characteristic time of the growth of perturbations with the observed magnitude of about 3 pc is 0.14 Myr, which is less than the estimated age of the H II region. The total time t_Σ, which is the sum of the expansion time of the H II region to the edge of the cloud, the time of the R-T instability growth, and the free fall time, is estimated as 0.44 < t_Σ < 0.78 Myr. We conclude that the young objects in the H II region RCW 82 could be formed as a result of the R-T instability with subsequent fragmentation into large-scale condensations.

Krasnobaev, K. V.; Tagirova, R. R.; Kotova, G. Yu.

2014-05-01

365

Predicted Ligands for the Human Urotensin-II G Protein-Coupled Receptor with Some Experimental Validation.  

PubMed

Human Urotensin-II (U-II) is the most potent mammalian vasoconstrictor known. Thus, a U-II antagonist would be of therapeutic value in a number of cardiovascular disorders. Here, we describe our work on the prediction of the structure of the human U-II receptor (hUT2R) using the GEnSeMBLE (GPCR Ensemble of Structures in Membrane BiLayer Environment) complete sampling Monte Carlo method. With the validation of our predicted structures, we designed a series of new potential antagonists predicted to bind more strongly than known ligands. Next, we carried out R-group screening to suggest a new ligand predicted to bind with 7 kcal mol(-1) better energy than 1-{2-[4-(2-bromobenzyl)-4-hydroxypiperidin-1-yl]ethyl}-3-(thieno[3,2-b]pyridin-7-yl)urea, the designed antagonist predicted to have the highest affinity for the receptor. Some of these predictions were tested experimentally, validating the computational results. Using the pharmacophore generated from the predicted structure for hUT2R bound to ACT-058362, we carried out virtual screening based on this binding site. The most potent hit compounds identified contained a 2-(phenoxymethyl)-1,3,4-thiadiazole core, with the best derivative exhibiting an IC50 value of 0.581 µM against hUT2R when tested in vitro. Our efforts identified a new scaffold as a potential new lead structure for the development of novel hUT2R antagonists, and the computational methods used could find more general applicability to other GPCRs. PMID:24989481

Kim, Soo-Kyung; Goddard, William A; Yi, Kyu Yang; Lee, Byung Ho; Lim, Chae Jo; Trzaskowski, Bartosz

2014-08-01

366

Multi-Model Validation in the Chesapeake Bay Region in June 2010.  

National Technical Information Service (NTIS)

In this paper, we discuss the validation of water level and current predictions from three coastal hydrodynamic models and document the resource and operational requirements for each modeling system. The ADvanced CIRCulation Model (ADCIRC), the Navy Coast...

C. A. Blain; G. A. Jacobs; M. K. Cambazoglu; P. Y. Chu; R. S. Linzell

2013-01-01

367

Validation of qualitative models of genetic regulatory networks by model checking: analysis of the nutritional stress response in Escherichia coli  

Microsoft Academic Search

Motivation: The modeling and simulation of genetic regulatory networks have created the need for tools for model validation. The main challenges of model validation are the achievement of a match between the precision of model predictions and experimental data, as well as the efficient and reliable comparison of the predictions and observations. Results: We present an approach towards

Grégory Batt; Delphine Ropers; Hidde De Jong; Johannes Geiselmann; Radu Mateescu; Michel Page; Dominique Schneider

2005-01-01

368

Validation of a finite element model of pediatric patient-specific mandible  

Microsoft Academic Search

A finite element (FE) model of pediatric patient-specific mandible is presented as a component of craniofacial surgery planning to predict more precisely the complex biomechanical reactions under mechanical loading. Such model needs to be validated prior to application. The FE model validation, however, is a challenge since invasive tests and measurements on a pediatric patient are prohibited. This study developed

L. Zhao; P. K. Patel; G. E. O. Widera; G. F. Harris

2003-01-01

369

Validation of hydrometeor occurrence predicted by the ECMWF model using millimeter wave radar data  

Microsoft Academic Search

Validation of hydrometeor prediction by global models is an important issue as it pertains to the accuracy of climate predictions. In this study we use data from a continuously operating millimeter wave radar at a research site in north central Oklahoma, USA to validate output from the operational ECMWF forecast model. We demonstrate that the ECMWF model shows good overall

Gerald G. Mace; Christian Jakob; Kenneth P. Moran

1998-01-01

371

Experiments for calibration and validation of plasticity and failure material modeling: 304L stainless steel.  

SciTech Connect

Experimental data for material plasticity and failure model calibration and validation were obtained from 304L stainless steel. Model calibration data were taken from smooth tension, notched tension, and compression tests. Model validation data were provided from experiments using thin-walled tube specimens subjected to path dependent combinations of internal pressure, extension, and torsion.

Lee, Kenneth L.; Korellis, John S.; McFadden, Sam X.

2006-01-01

372

Validation of population-based disease simulation models: a review of concepts and methods  

Microsoft Academic Search

BACKGROUND: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. METHODS: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of

Jacek A Kopec; Philippe Finès; Douglas G Manuel; David L Buckeridge; William M Flanagan; Jillian Oderkirk; Michal Abrahamowicz; Samuel Harper; Behnam Sharif; Anya Okhmatovskaia; Eric C Sayre; M Mushfiqur Rahman; Michael C Wolfson

2010-01-01

373

Verification and validation of a SSM model dedicated to mode handling of flexible manufacturing systems  

Microsoft Academic Search

This paper focuses on verification and validation of a model dedicated to mode handling of flexible manufacturing systems (FMSs). This model is specified using the synchronous formalism of safe state machines (SSMs). The rigorous semantics that characterize this formalism enable formal verification mechanisms ensuring determinism and dependability. A structured framework for verification and validation of the model dedicated to

Nadia Hamani; Nathalie Dangoumau; Etienne Craye

2009-01-01

374

Calibration and validation of a three-dimensional subsurface irrigation hydrology model  

Microsoft Academic Search

An enhanced subsurface irrigation hydrology model, developed by Buyuktas & Wallender (Journal of Irrigation and Drainage Engineering, ASCE 128(3): 71–81), is calibrated and validated using 2 years of data collected in a field in Broadview Water District in California, USA. The first year data is used to calibrate the model, while the second year data is used for model validation.

D. Buyuktas; W. W. Wallender; R. C. Soppe; J. E. Ayars; B. Sivakumar

2004-01-01

375

Validation analysis of probabilistic models of dietary exposure to food additives  

Microsoft Academic Search

The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above ‘true’ additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for

M. B. Gilsenan; R. L. Thompson; J. Lambe; M. J. Gibney

2003-01-01

376

Modeling local paleoclimates and validation in the southwest United States  

SciTech Connect

In order to evaluate the spatial and seasonal variations of paleoclimate in the southwest US, a local climate model (LCM) is developed that computes modern and 18,000 yr B.P. (18 ka) monthly temperature and precipitation from a set of independent variables. Independent variables include: terrain elevation, insolation, CO2 concentration, January and July winds, and January and July sea-surface temperatures. Solutions are the product of a canonical regression function which is calibrated using climate data from 641 stations in AZ, CA, CO, NM, NV, and UT in the National Weather Service Cooperative observer network. Validation of the LCM, using climate data at 98 climate stations from the period 1980-1984, indicates no significant departures of LCM solutions from climate data. LCM solutions of modern and 18 ka climate are computed at a 15 km spacing over a rectangular domain extending 810 km east, 360 km west, 225 km north and 330 km south of the approximate location of Yucca Mt., NV. Solutions indicate mean annual temperature was 5°C cooler at 18 ka and mean annual precipitation increased 68%. The annual cycle of temperature and precipitation at 18 ka was amplified, with summers about 1°C cooler and 71% drier, and winters about 11°C colder and 35% wetter than the modern. Model results compare quite reasonably with proxy paleoclimate estimates from glacial deposits, pluvial lake deposits, pollen records, ostracode records and packrat midden records from the southwest US. However, bias (+5°C to +10°C) is indicated for LCM solutions of summer temperatures at 18 ka.

Stamm, J.F.

1992-01-01

377

Income Determination Input Output Model (IDIOM) II: A User's Manual.  

National Technical Information Service (NTIS)

This manual provides documentation on the theory, use, operation, and interpretation of IDIOM II--Income Determination Input-Output Model--an interactive, inter-industry, national-regional policy evaluation model. The model, which is based on an 86-indust...

D. A. Updegrove

1977-01-01

378

REACTIVE PLUME MODEL--RPM-II: USER'S GUIDE  

EPA Science Inventory

The Reactive Plume Model (RPM-II) is a computerized model used primarily for estimating short-term concentrations of primary and secondary pollutants resulting from point-source emissions. Two main features of the model are (1) its chemical kinetic mechanism, which explicitly sol...

379

Towards an artificial model for Photosystem II: a manganese(II,II) dimer covalently linked to ruthenium(II) tris-bipyridine via a tyrosine derivative.  

PubMed

In order to model the individual electron transfer steps from the manganese cluster to the photooxidized sensitizer P680+ in Photosystem II (PS II) in green plants, the supramolecular complex 4 has been synthesized. In this complex, a ruthenium(II) tris-bipyridine type photosensitizer has been linked to a manganese(II) dimer via a substituted L-tyrosine, which bridges the manganese ions. The trinuclear complex 4 was characterized by electron paramagnetic resonance (EPR) and electrospray ionization mass spectrometry (ESI-MS). The excited state lifetime of the ruthenium tris-bipyridine moiety in 4 was found to be about 110 ns in acetonitrile. Using flash photolysis in the presence of an electron acceptor (methylviologen), it was demonstrated that in the supramolecular complex 4 an electron was transferred from the excited state of the ruthenium tris-bipyridine moiety to methylviologen, forming a methylviologen radical and a ruthenium(III) tris-bipyridine moiety. Next, the Ru(III) species retrieved the electron from the manganese(II/II) dimer in an intramolecular electron transfer reaction with a rate constant kET > 1.0 x 10(7) s(-1), generating a manganese(II/III) oxidation state and regenerating the ruthenium(II) photosensitizer. This is the first example of intramolecular electron transfer in a supramolecular complex, in which a manganese dimer is covalently linked to a photosensitizer via a tyrosine unit, in a process which mimics the electron transfer on the donor side of PS II. PMID:10714701

Sun, L; Raymond, M K; Magnuson, A; LeGourriérec, D; Tamm, M; Abrahamsson, M; Kenéz, P H; Mårtensson, J; Stenhagen, G; Hammarström, L; Styring, S; Akermark, B

2000-01-15

380

Autoimmunity to type II collagen an experimental model of arthritis  

PubMed Central

We have found that intradermal injection of native type II collagen extracted from human, chick or rat cartilage induces an inflammatory arthritis in approximately 40% of rats of several strains whether complete Freund's adjuvant or incomplete Freund's adjuvant is used. Type I or III collagen extracted from skin, cartilage proteoglycans and alpha1(II) chains were incapable of eliciting arthritis, as was type II collagen injected without adjuvant. The disease is a chronic proliferative synovitis, resembling adjuvant arthritis in rats and rheumatoid arthritis in humans. Native type II collagen modified by limited pepsin digestion still produces arthritis, suggesting that type-specific determinants residing in the helical region of the molecule are responsible for the induction of disease. Since homologous type II collagen emulsified in oil without bacterial preparations regularly causes the disease, this new animal model of arthritis represents a unique example of experimentally-inducible autoimmunity to a tissue component.

1977-01-01

381

Test cell modeling and optimization for FPD-II  

SciTech Connect

The Fusion Power Demonstration, Configuration II (FPD-II), will be a DT burning tandem mirror facility with thermal barriers, designed as the next step engineering test reactor (ETR) to follow the tandem mirror ignition test machines. Current plans call for FPD-II to be a multi-purpose device. For approximately the first half of its lifetime, it will operate as a high-Q ignition machine designed to reach or exceed engineering break-even and to demonstrate the technological feasibility of tandem mirror fusion. The second half of its operation will focus on the evaluation of candidate reactor blanket designs using a neutral beam driven test cell inserted at the midplane of the 90 m long cell. This machine, called FPD-II+T, uses an insert configuration similar to that used in the MFTF-α+T study. The modeling and optimization of FPD-II+T are the topic of the present paper.

Haney, S.W.; Fenstermacher, M.E.

1985-04-10

382

Modeling and experimental validation of unsteady impinging flames  

SciTech Connect

This study reports on a joint experimental and analytical study of premixed laminar flames impinging onto a plate at controlled temperature, with special emphasis on the study of periodically oscillating flames. Six types of flame structures were found, based on parametric variations of nozzle-to-plate distance (H), jet velocity (U), and equivalence ratio (φ). They were classified as conical, envelope, disc, cool central core, ring, and side-lifted flames. Of these, the disc, cool central core, and envelope flames were found to oscillate periodically, with frequency and sound pressure levels increasing with Re and decreasing with nozzle-to-plate distance. The unsteady behavior of these flames was modeled using the formulation derived by Durox et al. [D. Durox, T. Schuller, S. Candel, Proc. Combust. Inst. 29 (2002) 69-75] for the cool central core flames, where the convergent burner acts as a Helmholtz resonator, driven by an external pressure fluctuation dependent on a velocity fluctuation at the burner mouth after a convective time delay τ. Based on this model, the present work shows that τ = [Re(2j tanh⁻¹((2δω + (1+N)jω² − jω₀²)/(2δω + (1−N)jω² − jω₀²))) + 2πK]/ω, i.e., there is a relation between oscillation frequency (ω), burner acoustic characteristics (ω₀, δ), and time delay τ, not explicitly dependent on N, the flame-flow normalized interaction coefficient [D. Durox, T. Schuller, S. Candel, Proc. Combust. Inst. 29 (2002) 69-75], because ∂τ/∂N = 0. Based on flame motion and noise analysis, K was found to physically represent the integer number of perturbations on the flame surface or the number of coherent structures on the impinging jet. Additionally, assuming that τ = βH/U, where H is the nozzle-to-plate distance and U is the mean jet velocity, it is shown that β_Disc = 1.8, β_CCC = 1.03, and β_Env = 1.0. A physical analysis of the proportionality constant β showed that for the disc flames, τ corresponds to the ratio between H and the velocity of the coherent structures. In the case of envelope and cool central core flames, τ corresponds to the ratio between H and the mean jet velocity. The predicted frequency fits the experimental data, supporting the validity of the mathematical modeling, empirical formulation, and assumptions made.
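
For readers checking a measured (ω, τ) pair against this relation, the following numerical form of the reconstructed expression may help; variable names follow the abstract and the sample values are arbitrary.

    import cmath
    import math

    def time_delay(omega, omega0, delta, N, K):
        # tau = [Re(2j atanh(r)) + 2*pi*K] / omega, with
        # r = (2*delta*omega + (1+N)*j*omega^2 - j*omega0^2)
        #     / (2*delta*omega + (1-N)*j*omega^2 - j*omega0^2)
        num = 2 * delta * omega + (1 + N) * 1j * omega**2 - 1j * omega0**2
        den = 2 * delta * omega + (1 - N) * 1j * omega**2 - 1j * omega0**2
        return ((2j * cmath.atanh(num / den)).real + 2 * math.pi * K) / omega

    # Arbitrary illustrative values (angular frequencies in rad/s)
    print(time_delay(omega=2 * math.pi * 230.0, omega0=2 * math.pi * 250.0,
                     delta=40.0, N=0.3, K=1))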

Fernandes, E.C.; Leandro, R.E. [Center for Innovation, Technology and Policy Research, Mechanical Engineering Department, Instituto Superior Tecnico, Av. Rovisco Pais, 1049-001 Lisboa Codex (Portugal)

2006-09-15

383

ON-BOARD PREDICTION OF POWER CONSUMPTION IN AUTOMOBILE ACTIVE SUSPENSION SYSTEMS—II: VALIDATION AND PERFORMANCE EVALUATION  

Microsoft Academic Search

The focus of this part of the paper is on validation and performance evaluation. The indirect (standard) and novel direct predictors of part I, which use time-recursive realisations and no leading indicators, are critically compared by using the non-linear active suspension system model. The results, constituting the first known comparison between indirect and direct schemes, show similar performance with a

R. Ben Mrad; S. D. Fassois; J. A. Levitt; B. I. Bachrach

1996-01-01

384

Comparing Validity and Reliability in Special Education Title II and IDEA Data  

ERIC Educational Resources Information Center

Previous researchers have found that special education teacher shortages are pervasive and exacerbated by federal policies regarding "highly qualified" teacher requirements. The authors examined special education teacher personnel data from 2 federal data sources to determine if these sources offer a reliable and valid means of…

Steinbrecher, Trisha D.; McKeown, Debra; Walther-Thomas, Chriss

2013-01-01

385

The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters  

Microsoft Academic Search

We validate the accuracy and precision of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) and radial velocities (RVs), by comparing these estimates for selected members of three globular clusters (M 13, M 15, and M 2) and two open clusters (NGC 2420

Young Sun Lee; Timothy C. Beers; Thirupathi Sivarani; Jennifer A. Johnson; Deokkeun An; Ronald Wilhelm; Carlos Allende Prieto; Lars Koesterke; Paola Re Fiorentin; Coryn A. L. Bailer-Jones; John E. Norris; Brian Yanny; Constance Rockosi; Heidi J. Newberg; Kyle M. Cudworth; Kaike Pan

2008-01-01

386

The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters  

Microsoft Academic Search

The authors validate the performance and accuracy of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) by comparing derived overall metallicities and radial velocities from selected likely members of three globular clusters (M 13, M 15, and M 2) and two open clusters

Y. S. Lee; T. C. Beers; T. Sivarani; J. A. Johnson; D. An; R. Wilhelm; C. Allende Prieto; L. Koesterke; P. Re Fiorentin; C. A. L. Bailer-Jones; J. E. Norris

2007-01-01

387

Family Emotional Involvement and Criticism Scale (FEICS): II. Reliability and Validity Studies  

Microsoft Academic Search

This article is a report on a replication study of the reliability and validity of the Family Emotional Involvement and Criticism Scale (FEICS). A sample of 928 people (a 63% return rate for 1,480 mailed questionnaires) participated in the study. The sample was drawn from patients in a Family Medicine practice. FEICS consists of 14 items: 7 items assess Perceived

Cleveland G. Shields; Peter Franks; Jeffrey J. Harp; Thomas L. Campbell; Susan H. McDaniel

1994-01-01

388

Effects of Risperidone on Aberrant Behavior in Persons with Developmental Disabilities: II. Social Validity Measures.  

ERIC Educational Resources Information Center

Consumer satisfaction and social validity were measured during a double-blind, placebo-controlled evaluation of risperidone in treating aberrant behaviors of persons with developmental disabilities. A survey showed all 17 caregivers felt participation was positive. Community members (n=52) also indicated that when on medication, the 5 participants…

McAdam, David B.; Zarcone, Jennifer R.; Hellings, Jessica; Napolitano, Deborah A.; Schroeder, Stephen R.

2002-01-01

389

EXPERIMENTAL VALIDATION OF A STRUCTURAL HEALTH MONITORING METHODOLOGY: PART II. NOVELTY DETECTION ON A GNAT AIRCRAFT  

Microsoft Academic Search

This paper concerns the second phase of an experimental validation programme for a structural health monitoring methodology based on novelty detection. This phase seeks to apply one of the methods considered in the first stage of the work on a more realistic structure, namely the wing of a Gnat aircraft, as opposed to the previously investigated laboratory structure. The novelty

G. Manson; K. Worden; D. Allman

2003-01-01

390

The African American Acculturation Scale II: Cross-Validation and Short Form.  

ERIC Educational Resources Information Center

Studied African American culture, using a new, shortened, 33-item African American Acculturation Scale (AAAS-33) to assess the scale's validity and reliability. Comparisons between the original form and AAAS-33 reveal high correlations, however, the longer form may be sensitive to some beliefs, practices, and attitudes not assessed by the short…

Landrine, Hope; Klonoff, Elizabeth A.

1995-01-01

391

Asymmetric Gepner models II. Heterotic weight lifting  

NASA Astrophysics Data System (ADS)

A systematic study of "lifted" Gepner models is presented. Lifted Gepner models are obtained from standard Gepner models by replacing one of the N=2 building blocks and the E8 factor by a modular isomorphic N=0 model on the bosonic side of the heterotic string. The main result is that after this change three-family models occur abundantly, in sharp contrast to ordinary Gepner models. In particular, more than 250 new and unrelated moduli spaces of three-family models are identified. We discuss the occurrence of fractionally charged particles in these spectra.

Gato-Rivera, B.; Schellekens, A. N.

2011-05-01

392

Simplified Risk Model Version II (SRM-II) Structure and Application  

SciTech Connect

The Simplified Risk Model Version II (SRM-II) is a quantitative tool for efficiently evaluating the risk from Department of Energy waste management activities. Risks evaluated include human safety and health and environmental impact. Both accidents and normal, incident-free operation are considered. The risk models are simplifications of more detailed risk analyses, such as those found in environmental impact statements, safety analysis reports, and performance assessments. However, wherever possible, conservatisms in such models have been removed to obtain best estimate results. The SRM-II is used to support DOE complex-wide environmental management integration studies. Typically such studies involve risk predictions covering the entire waste management program, including such activities as initial storage, handling, treatment, interim storage, transportation, and final disposal.

S. A. Eide; T. E. Wierman

1999-08-01

393

Simplified Risk Model Version II (SRM-II) Structure and Application  

SciTech Connect

The Simplified Risk Model Version II (SRM-II) is a quantitative tool for efficiently evaluating the risk from Department of Energy waste management activities. Risks evaluated include human safety and health and environmental impact. Both accidents and normal, incident-free operation are considered. The risk models are simplifications of more detailed risk analyses, such as those found in environmental impact statements, safety analysis reports, and performance assessments. However, wherever possible, conservatisms in such models have been removed to obtain best estimate results. The SRM-II is used to support DOE complex-wide environmental management integration studies. Typically such studies involve risk predictions covering activities such as initial storage, handling, treatment, interim storage, transportation, and final disposal.

Eide, Steven Arvid; Wierman, Thomas Edward

1999-08-01

394

USU GAIM: Validation of the Ionospheric Forecasting Model (IFM) Using the TOPEX TEC Measurements  

NASA Astrophysics Data System (ADS)

As a part of the validation program in the USU GAIM project, a newly improved Ionospheric Forecasting Model (IFM) was systematically validated using a large database of TOPEX TEC measurements. The TOPEX data used for the validation cover the period from August 1992 to March 2003, and the total number of 18-second averaged data points is close to 11 million. This model validation work covers a wide range of seasonal (winter, summer, equinox) and solar (low, medium, and high F10.7) conditions as well as all UT variations. The validation results indicate that the features of the spatial distribution of the IFM TEC are systematically consistent with those of the TOPEX TEC. The differences between the IFM TEC and the TOPEX TEC are within 20% for almost all locations and conditions. In many conditions, the differences are even below 10%. This validation work further proves the validity of the IFM for ionospheric assimilation in the USU GAIM project.

Zhu, L.; Schunk, R. W.; Jee, G.; Scherliess, L.; Sojka, J. J.; Thompson, D. C.

2004-12-01

395

Science Motivation Questionnaire II: Validation with Science Majors and Nonscience Majors  

ERIC Educational Resources Information Center

From the perspective of social cognitive theory, the motivation of students to learn science in college courses was examined. The students--367 science majors and 313 nonscience majors--responded to the Science Motivation Questionnaire II, which assessed five motivation components: intrinsic motivation, self-determination, self-efficacy, career…

Glynn, Shawn M.; Brickman, Peggy; Armstrong, Norris; Taasoobshirazi, Gita

2011-01-01

396

World War II Logistic Principles at the Operational Level of War: Are They Valid Today.  

National Technical Information Service (NTIS)

Current doctrine for logistic support of joint operations at the operational level of war is used as a means to analyze the historic logistic lessons learned in WW II, both in the Pacific and European Theaters of Operations. The logistic lessons learned d...

J. A. Brown

1993-01-01

397

MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom  

NASA Astrophysics Data System (ADS)

The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation to perform Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard the ISS and especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

398

Importance of Sea Ice for Validating Global Climate Models  

NASA Technical Reports Server (NTRS)

Reproduction of present-day large-scale physical features and processes is a critical test of global climate model performance. Without this benchmark, prognoses of future climate conditions are at best speculation. A fundamental question relevant to this issue is, which processes and observations are both robust and sensitive enough to be used for model validation, and are they also indicators of the problem at hand? In the case of global climate, one of the problems at hand is to distinguish between anthropogenic and naturally occurring climate responses. The polar regions provide an excellent testing ground to examine this problem because few humans make their livelihood there, such that anthropogenic influences in the polar regions usually spawn from global redistribution of a source originating elsewhere. Concomitantly, polar regions are one of the few places where responses to climate are non-anthropogenic. Thus, if an anthropogenic effect has reached the polar regions (e.g., the case of upper atmospheric ozone sensitivity to CFCs), it has most likely had an impact globally, but is more difficult to sort out from local effects in areas where anthropogenic activity is high. Within this context, sea ice has served as both a monitoring platform and a sensitivity parameter of polar climate response since the time of Fridtjof Nansen. Sea ice resides in the polar regions at the air-sea interface, such that changes in either the global atmospheric or oceanic circulation set up complex non-linear responses in sea ice which are uniquely determined. Sea ice currently covers a maximum of about 7% of the earth's surface but was completely absent during the Jurassic Period and far more extensive during the various ice ages. It is also geophysically very thin (typically <10 m in the Arctic, <3 m in the Antarctic) compared to the troposphere (roughly 10 km) and deep ocean (roughly 3 to 4 km). Because of these unique conditions, polar researchers regard sea ice as one of the more important features to monitor in terms of heat, mass, and momentum transfer between the air and sea, and in terms of the impact of such responses on global climate.

Geiger, Cathleen A.

1997-01-01

399

A Process for Verifying and Validating Requirements for Fault Tolerant Systems Using Model Checking  

NASA Technical Reports Server (NTRS)

Model checking is shown to be an effective tool in validating the behavior of a fault tolerant embedded spacecraft controller. The case study presented here shows that by judiciously abstracting away extraneous complexity, the state space of the model could be exhaustively searched, allowing critical functional requirements to be validated down to the design level.

Schneider, F.; Easterbrook, S.; Callahan, J.; Holzmann, G.; Reinholtz, W.; Ko, A.; Shahabuddin, M.

1999-01-01

400

Reconceptualizing the Learning Transfer Conceptual Framework: Empirical Validation of a New Systemic Model  

ERIC Educational Resources Information Center

The main purpose of this study was to examine the validity of a new systemic model of learning transfer and thus determine if a more holistic approach to training transfer could better explain the phenomenon. In all, this study confirmed the validity of the new systemic model and suggested that a high performance work system could indeed serve as…

Kontoghiorghes, Constantine

2004-01-01

401

Techniques for Down-Sampling a Measured Surface Height Map for Model Validation  

NASA Technical Reports Server (NTRS)

This software allows one to down-sample a measured surface map for model validation without introducing any re-sampling errors, while also eliminating existing measurement noise and measurement errors. The two new techniques in this software tool can be used in all optical model validation processes involving large space optical surfaces.
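
One simple way to achieve both properties, shown below purely as an illustration of the general idea: down-sample by integer block averaging, so the coarse grid is an exact decimation of the measured grid (no interpolation, hence no re-sampling error) and the block mean suppresses uncorrelated measurement noise. The actual two techniques in the software are not documented here.

    import numpy as np

    def downsample_block_mean(height_map, factor):
        # Average non-overlapping factor x factor blocks; the output grid is an
        # exact integer decimation of the input, so no interpolation is needed.
        ny, nx = height_map.shape
        if ny % factor or nx % factor:
            raise ValueError("map dimensions must be divisible by the factor")
        blocks = height_map.reshape(ny // factor, factor, nx // factor, factor)
        return blocks.mean(axis=(1, 3))          # noise is reduced by the mean

    surface = np.random.default_rng(1).normal(size=(512, 512))  # noisy height map
    print(downsample_block_mean(surface, 8).shape)              # -> (64, 64)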

Sidick, Erkin

2012-01-01

402

The Validity of the Job Characteristics Model: A Review and Meta-Analysis.  

ERIC Educational Resources Information Center

Assessed the validity of Hackman and Oldham's Job Characteristics Model by conducting a comprehensive review of nearly 200 relevant studies on the model as well as by applying meta-analytic procedures to much of the data. Available correlational results were reasonably valid and support the multidimensionality of job characteristics and their…

Fried, Yitzhak; Ferris, Gerald R.

1987-01-01

403

A method and application of multi-scale validation in spatial land use models  

Microsoft Academic Search

The majority of the large number of existing land use models lack a proper validation, often because of data problems. Moreover, despite recognition of the necessity to incorporate a multi-scale analysis, scale dependencies are normally not considered during validation. In this paper, a multi-scale land use change modelling framework, conversion of land use and its effects (CLUE), is calibrated for

Kasper Kok; Andrew Farrow; A. Veldkamp; Peter H. Verburg

2001-01-01

404

Revisiting the JDL data fusion model II  

Microsoft Academic Search

This paper suggests refinements and extensions of the JDL Data Fusion Model, the standard process model used for a multiplicity of community purposes. However, this Model has not been reviewed in accordance with (a) the dynamics of world events and (b) the changes, discoveries, and new methods in both the data fusion research and development community and related IT technologies.

James Llinas; Christopher Bowman; Galina Rogova; Alan Steinberg; E. Waltz; F. White

2004-01-01

405

System-analytical modelling—Part II  

Microsoft Academic Search

System-analytical modelling (SAM) shows that plants possess two types of biological time that alternate during the annual cycle of plant development. The alternation of these biotimes and the process of yield formation are described by the previously derived information principle. The elaborated model of wheat agroecosystems is characterized by the best theoretically attainable accuracy. The model serves as a basis

Yuri B. Kirsta

2006-01-01

406

Schizosaccharomyces pombe and its Ni(II)-insensitive mutant GA1 in Ni(II) uptake from aqueous solutions: a biodynamic model.  

PubMed

In the present study, Ni(II) uptake from aqueous solution by living cells of the Schizosaccharomyces pombe haploid 972 with h(-) mating type and a Ni(II)-insensitive mutant GA1 derived from 972 was investigated at various initial glucose and Ni(II) concentrations. A biodynamic model was developed to predict the unsteady and steady-state phases of the uptake process. Gompertz growth and uptake process parameters were optimized to predict the maximum growth rate μ_m and the process metric C_r, the remaining Ni(II) content in the aqueous solution. The simulated overall metal uptake values were found to be in acceptable agreement with experimental results. The model validation was done through regression statistics and uncertainty and sensitivity analyses. To gain insight into the phenomenon of Ni(II) uptake by wild-type and mutant S. pombe, probable active and passive metal transport mechanisms in yeast cells were discussed in view of the simulation results. The present work revealed the potential of mutant GA1 to remove Ni(II) cations from aqueous media. The results obtained provided new insights for understanding the combined effect of biosorption and bioaccumulation processes for metal removal and offered a possibility for the use of growing mutant S. pombe cells in bioremediation. PMID:24752843
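
As an illustration of one ingredient of such a biodynamic model, the sketch below fits a Gompertz curve (in the common Zwietering parameterization, assumed here) to hypothetical growth data to recover the maximum specific growth rate μ_m. The data are invented; the paper couples growth to Ni(II) uptake and optimizes both jointly.

    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, A, mu_m, lam):
        # Zwietering-form Gompertz curve: ln(N/N0) versus time t, with
        # asymptote A, maximum specific growth rate mu_m, and lag time lam
        return A * np.exp(-np.exp(mu_m * np.e / A * (lam - t) + 1.0))

    t_h = np.array([0, 4, 8, 12, 16, 20, 24, 30, 36, 48], dtype=float)
    ln_ratio = np.array([0.02, 0.10, 0.45, 1.10, 1.80, 2.30, 2.60, 2.80, 2.88, 2.92])

    (A, mu_m, lam), _ = curve_fit(gompertz, t_h, ln_ratio, p0=(3.0, 0.2, 5.0))
    print(f"mu_m = {mu_m:.3f} 1/h, lag = {lam:.1f} h, capacity A = {A:.2f}")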

Sayar, Nihat Alpagu; Durmaz-Sam, Selcen; Kazan, Dilek; Sayar, Ahmet Alp

2014-08-01

407

Contributions to the validation of the CJS model for granular materials  

NASA Astrophysics Data System (ADS)

Behavior model validation in the field of geotechnics is addressed, with the objective of showing the advantages and limits of the CJS (Cambou Jafari Sidoroff) behavior model for granular materials. Several levels are addressed: theoretical analysis of the CJS model to establish its consistency and basic capabilities; implementation (followed by verification against other programs) of a finite element computation code (FINITEL) to integrate this model and prepare it for complex applications; and validation of the resulting code/model structure by comparing its results to those of experiments on nonhomogeneous problems (shallow foundations).

Elamrani, Khadija

1992-07-01

408

Understanding and Using the Implicit Association Test: II. Method Variables and Construct Validity  

Microsoft Academic Search

The Implicit Association Test (IAT) assesses relative strengths of four associations involving two pairs of contrasted concepts (e. g., male-female and family-career). In four studies, analyses of data from 11 Web IATs, averaging 12,000 respondents per data set, supported the following conclusions: (a) sorting IAT trials into subsets does not yield conceptually distinct measures; (b) valid IAT measures can be

Brian A. Nosek; Anthony G. Greenwald; Mahzarin R. Banaji

2005-01-01

409

Using Structural Equation Modeling To Test for Differential Reliability and Validity: An Empirical Demonstration.  

ERIC Educational Resources Information Center

Demonstrates empirically a structural equation modeling technique for group comparison of reliability and validity. Data, which are from a study of 495 mothers' attitudes toward pregnancy, have a one-factor measurement model and three sets of subpopulation comparisons. (SLD)

Raines-Eudy, Ruth

2000-01-01

410

ShipIR model validation using NATO SIMVEX experiment results  

NASA Astrophysics Data System (ADS)

An infrared field trial has been conducted by a NATO science panel on IR ship signatures, TG-16. This trial was planned, designed and executed for the expressed purpose of the validation of predictive IR ship signature simulations. The details of the trial were dictated by a thoughtful validation methodology, which exploits the concept of "experimental precision." Two governmental defense laboratories, the Norwegian Defence Research Establishment and the US Naval Research Laboratory have used this trial data to perform a validation analysis on the ShipIR IR signature code. This analysis quantifies prediction accuracy of the current versions of the code and identifies specific portions of the code that need to be upgraded to improve prediction accuracy.

Fraedrich, Doug S.; Stark, Espen; Heen, Lars T.; Miller, Craig

2003-09-01

411

Validation of cell-based fluorescence assays: practice guidelines from the ICSH and ICCS - part II - preanalytical issues.  

PubMed

Good laboratory practice requires that laboratories using flow cytometry and other cell-based fluorescence assay technologies validate all assays, which when in clinical practice may pass through regulatory review processes using criteria often defined with a soluble analyte in plasma or serum samples in mind. Recently the U.S. Food and Drug Administration (FDA) has entered into a public dialogue in the U.S. regarding their regulatory interest in laboratory developed tests (LDTs) or so-called "home brew" assays performed in clinical laboratories. The absence of well-defined guidelines for validation of cell-based assays using fluorescence detection has thus become a subject of concern for the International Council for Standardization of Haematology (ICSH) and the International Clinical Cytometry Society (ICCS). Accordingly, a group of over 40 international experts in the areas of test development, test validation, and clinical practice of a variety of assay types using flow cytometry and/or morphologic image analysis were invited to develop a set of practical guidelines useful to in vitro diagnostic (IVD) innovators, clinical laboratories, regulatory scientists, and laboratory inspectors. The focus of the group was restricted to fluorescence reporter reagents, although some common principles are shared by immunohistochemistry or immunocytochemistry techniques and noted where appropriate. The work product of this two-year effort is the content of this special issue of this journal, which is published as 5 separate articles, this being Validation of Cell-based Fluorescence Assays: Practice Guidelines from the ICSH and ICCS - Part II - Preanalytical Issues. PMID:24022851

Davis, Bruce H; Dasgupta, Amar; Kussick, Steven; Han, Jin-Yeong; Estrellado, Annalee

2013-01-01

412

Large-eddy simulation of flow past urban-like surfaces: A model validation study  

NASA Astrophysics Data System (ADS)

Accurate prediction of atmospheric boundary layer (ABL) flow and its interaction with urban surfaces is critical for understanding the transport of momentum and scalars within and above cities. This, in turn, is essential for predicting the local climate and pollutant dispersion patterns in urban areas. Large-eddy simulation (LES) explicitly resolves the large-scale turbulent eddy motions and, therefore, can potentially provide improved understanding and prediction of flows inside and above urban canopies. This study focuses on developing and validating an LES framework to simulate flow past urban-like surfaces. In particular, large-eddy simulations were performed of flow past an infinitely long two-dimensional (2D) building and an array of 3D cubic buildings. An immersed boundary (IB) method was employed to simulate both the 2D and 3D buildings. Four subgrid-scale (SGS) models, including (i) the traditional Smagorinsky model, (ii) the Lagrangian dynamic model, (iii) the Lagrangian scale-dependent dynamic model, and (iv) the modulated gradient model, were evaluated using the 2D building case. The simulated velocity streamlines and the vertical profiles of the mean velocities and variances were compared with experimental results. The modulated gradient model shows the best overall agreement with the experimental results among the four SGS models. In particular, the flow recirculation, the reattachment position and the vertical profiles are accurately reproduced with a grid resolution of Nx × Ny × Nz = 160 × 40 × 160 (nx × nz = 13 × 16 covering the block). After validating the LES framework with the 2D building case, it was further applied to simulate a boundary-layer flow past a 3D building array. A regular, aligned building array with seven rows of cubic buildings was simulated. The building spacings in the streamwise and spanwise directions were both equal to the building height. A developed turbulent boundary-layer flow was used as the incoming flow. The results were compared with wind tunnel experimental data. Good agreement was observed between the LES results and the experimental data in the vertical profiles of the mean velocities and velocity variances at different positions within the building array.

Cheng, Wai Chi; Porté-Agel, Fernando

2013-04-01

413

A population pharmacokinetic model for docetaxel (Taxotere): model building and validation.  

PubMed

A sparse sampling strategy (3 samples per patient, 521 patients) was implemented in 22 Phase II studies of docetaxel (Taxotere) at the first treatment cycle for a prospective population pharmacokinetic evaluation. In addition to the 521 Phase II patients, 26 data-rich patients from Phase I studies were included in the analysis. NONMEM analysis of an index set of 280 patients demonstrated that docetaxel clearance (CL) is related to alpha 1-acid glycoprotein (AAG) level, hepatic function (HEP), age (AGE), and body surface area (BSA). The index-set population model prediction of CL was compared to that of a naive predictor (NP) using a validation set of 267 patients. Qualitatively, the dependence of CL on AAG, AGE, BSA, and HEP seen in the index-set population model was supported in the validation set. Quantitatively, for the validation-set patients overall, the performance (bias, precision) of the model was good (7 and 21%, respectively), although not better than that of the NP. However, in all the subpopulations with decreased CL, the model performed better than the NP; the more the CL differed from the population average, the better the performance. For example, in the subpopulation of patients with AAG levels > 2.27 g/L (n = 26), the bias and precision of the model predictions were 24 and 32%, vs. 53 and 53%, respectively, for the NP. The prediction of CL using the model was better than that of the NP in 73% of the patients. The population model was redetermined using the whole population of 547 patients, and a new covariate, albumin plasma level, was found to be a significant predictor in addition to those found previously. In the final model, HEP, AAG, and BSA are the main predictors of docetaxel CL. PMID:8875345
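The model-versus-naive-predictor comparison above rests on two summary statistics, bias and precision. Below is a hedged sketch of one common way to compute them, as mean and mean-absolute percent prediction errors; the exact error definitions in the paper may differ, and the example numbers are placeholders, not data from the study.

```python
# Hedged sketch of a bias/precision comparison: percent prediction errors
# for a population-model predictor versus a naive (population-mean)
# predictor of clearance. Definitions follow the common pe = (pred - obs)/obs
# form; they are an assumption, not necessarily the paper's exact metric.
import numpy as np

def bias_precision(pred, obs):
    """Return (bias %, precision %) as mean and mean-absolute percent error."""
    pe = 100.0 * (np.asarray(pred) - np.asarray(obs)) / np.asarray(obs)
    return pe.mean(), np.abs(pe).mean()

# Example: observed clearances vs model and naive-predictor estimates (L/h)
obs = np.array([30.0, 18.0, 42.0, 25.0])
model_pred = np.array([28.0, 20.0, 40.0, 27.0])
naive_pred = np.full_like(obs, obs.mean())   # NP predicts the mean for everyone
print(bias_precision(model_pred, obs))
print(bias_precision(naive_pred, obs))
```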

Bruno, R; Vivier, N; Vergniol, J C; De Phillips, S L; Montay, G; Sheiner, L B

1996-04-01

414

Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation  

NASA Astrophysics Data System (ADS)

Models of geographical choice behavior have been predominantly based on rational choice models, which assume that decision makers are utility maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments, in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior that incorporates principles of bounded rationality. We extend three classical heuristic rules (the conjunctive, disjunctive, and lexicographic rules) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, in the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models perform best for all the decisions modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of the heuristic models are slightly better than those of the multinomial logit models.
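For concreteness, the three classical rules that the study extends can be sketched as below; the attribute tuples, thresholds, and importance ordering are illustrative assumptions, not the paper's estimated parameters (which additionally allow the thresholds to vary across individuals).

```python
# Minimal sketches of the three classical heuristic decision rules named
# above. Thresholds and the attribute ordering are placeholder values.

def conjunctive(attrs, thresholds):
    """Accept an alternative only if EVERY attribute meets its threshold."""
    return all(a >= t for a, t in zip(attrs, thresholds))

def disjunctive(attrs, thresholds):
    """Accept an alternative if ANY attribute meets its threshold."""
    return any(a >= t for a, t in zip(attrs, thresholds))

def lexicographic(alternatives, order):
    """Pick the alternative scoring highest on the most important
    attribute; break ties with the next attribute in `order`."""
    return max(alternatives, key=lambda attrs: tuple(attrs[i] for i in order))

# Example: two shops rated on (attractiveness, proximity)
shops = [(0.8, 0.4), (0.6, 0.9)]
print(conjunctive(shops[0], thresholds=(0.5, 0.5)))  # False: proximity fails
print(disjunctive(shops[0], thresholds=(0.5, 0.5)))  # True: attractiveness passes
print(lexicographic(shops, order=[0, 1]))            # (0.8, 0.4)
```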

Zhu, Wei; Timmermans, Harry

2011-06-01

415

Transient PVT measurements and model predictions for vessel heat transfer. Part II.  

SciTech Connect

Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing those models and validating them using the data sets. Our models are intended to describe the high-speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving, and flow branches. They fall into three categories: (1) network flow models, in which flow paths are modeled as one-dimensional flows and vessels as single control volumes; (2) CFD (Computational Fluid Dynamics) models, in which flow in and between vessels is modeled in three dimensions; and (3) coupled network/CFD models, in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO as our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to capture important physics the models cannot resolve. Here we describe how the vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and demonstrates, for the first time, that flow and heat transfer in vessels can be modeled directly without the need for correlations.
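To make the "vessel as a single control volume" idea behind network flow models concrete, the sketch below integrates an ideal-gas vessel blowing down through a choked orifice with forward Euler. The isothermal-gas assumption, fixed discharge coefficient, and all numerical values are illustrative; this is not the NETFLOW formulation, which (as the abstract notes) adds heat-transfer and friction correlations.

```python
# Minimal single-control-volume blowdown sketch: one vessel, one choked
# orifice, ideal gas held isothermal. All parameters are placeholders.
import math

gamma, R, T = 1.4, 287.0, 300.0      # air properties, isothermal at 300 K
V, A, Cd = 0.1, 1e-5, 0.8            # vessel volume (m^3), orifice area (m^2), Cd
P, P_back, dt = 10e6, 1e5, 0.01      # initial/back pressure (Pa), time step (s)

crit = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
t = 0.0
while P > 2.0 * P_back:              # stay (approximately) in the choked regime
    mdot = Cd * A * P * math.sqrt(gamma / (R * T)) * crit  # choked mass flow
    m = P * V / (R * T)              # ideal-gas mass currently in the vessel
    P = (m - mdot * dt) * R * T / V  # update pressure from the new mass
    t += dt
print(f"vessel reached {P / 1e5:.1f} bar after {t:.1f} s")
```

A real network flow code replaces the single orifice with a graph of 1D flow paths between many such control volumes, which is why the correlation quality discussed above matters so much.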

Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.; Evans, Gregory Herbert; Rice, Steven F.

2010-07-01

416

Modeling the Photoionized Interface in Blister H II Regions  

NASA Astrophysics Data System (ADS)

We present a grid of photoionization models for the emission from photoevaporative interfaces between the ionized gas and the molecular cloud in blister H II regions. For the density profiles of the emitting gas in the models, we use a general power-law form calculated for photoionized, photoevaporative flows by Bertoldi. We find that the spatial emission-line profiles depend on the incident flux, the shape of the ionizing continuum, and the elemental abundances. In particular, we find that the peak emissivities of the [S II] and [N II] lines are more sensitive to the elemental abundances than are the total line intensities. The diagnostics obtained from the grid of models can be used in conjunction with high spatial resolution data to infer the properties of ionized interfaces in blister H II regions. As an example, we consider a location at the tip of an ``elephant trunk'' structure in M16 (the Eagle Nebula) and show how narrowband Hubble Space Telescope Wide Field Planetary Camera 2 (HST WFPC2) images constrain the H II region properties. We present a photoionization model that explains the ionization structure and emission from the interface seen in these high spatial resolution data.
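As a toy illustration of why the density profile shapes the spatial line profiles, the sketch below evaluates a generic power-law density run away from the ionization front and the n_e^2 scaling of recombination emissivity. The functional form, exponent, and scales are placeholder assumptions, not Bertoldi's calculated flow solution or the paper's fitted values.

```python
# Illustrative power-law density profile n(z) = n0 * (1 + z/z0)**(-p) and
# the n_e**2 scaling of recombination-line emissivity. Placeholder values.
import numpy as np

z = np.linspace(0.0, 0.1, 200)       # distance from the ionization front (pc)
n0, z0, p = 1e4, 0.01, 2.0           # peak density (cm^-3), scale length, exponent
n_e = n0 * (1.0 + z / z0) ** (-p)    # assumed power-law photoevaporative profile
emissivity = n_e ** 2                # recombination emission scales as n_e^2
print(f"density falls to {n_e[-1]:.0f} cm^-3 at z = {z[-1]:.2f} pc")
```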

Sankrit, Ravi; Hester, J. Jeff

2000-06-01

417

Linear Relations in Time Series Models. II  

ERIC Educational Resources Information Center

An asymptotic theory is developed for a new time series model introduced in TM 502 289. An algorithm for computing estimates of the parameters of this time series model is given, and it is shown that these estimators are asymptotically efficient in that they have the same asymptotic distribution as the maximum likelihood estimators. (Author/RC)

Rennie, Robert R.; Villegas, C.

1976-01-01

418

System modeling and simulation at EBR-II  

SciTech Connect

The codes being developed and verified using EBR-II data are NATDEMO, DSNP, and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code, previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the EBR-II core, including power, temperature, control-rod-movement reactivity effects, and flow, and is used primarily to model reactivity-induced power transients. The Dynamic Simulator for Nuclear Power Plants (DSNP) allows a whole-plant, thermal-hydraulic simulation using specific component and system models called from libraries. It has been used to simulate flow-coastdown transients, reactivity insertion events, and balance-of-plant perturbations.

Dean, E.M.; Lehto, W.K.; Larson, H.A.

1986-01-01

419

The solar Ca II H profile computed with theoretical models  

NASA Astrophysics Data System (ADS)

The calibration in absolute flux units (erg s^-1 m^-2 A^-1) of the Ca II resonance profiles in late-type stars is a difficult task that has been solved in different ways. In 1976, Ayres proposed the radiative equilibrium (RE) photospheric model, a method essentially consisting of fitting the observed far-wing profiles with computed fluxes. This method was applied to a set of Ca II H profiles observed at ESO and tested on the Sun in 1984. Results published by Castelli in 1988 showed that, with an RE model and without any increase in temperature in the upper layers, the Ca II H profiles computed in non-local thermodynamic equilibrium (with both partial and complete redistribution) do not show remarkable differences from the profile computed in local thermodynamic equilibrium. The paper discusses the comparison of the observed flux with that computed employing LTE and RE theoretical models, as well as LTE profiles.
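The far-wing fitting idea amounts to finding a scale factor that maps an observed (relative) wing profile onto the computed absolute flux. Below is a minimal least-squares sketch of that step; the arrays, wavelength window, and closed-form fit are illustrative assumptions, not the actual numerics of the method.

```python
# Hedged sketch of wing-fitting calibration: the least-squares scale factor
# c mapping an observed relative far-wing profile onto a computed
# absolute-flux profile. Data points below are synthetic placeholders.
import numpy as np

def calibration_factor(observed, computed):
    """Scale c minimizing ||c * observed - computed||^2 over the wing window."""
    observed = np.asarray(observed, dtype=float)
    computed = np.asarray(computed, dtype=float)
    return float(observed @ computed) / float(observed @ observed)

# Example with synthetic wing points (arbitrary relative/absolute units)
obs_wing = np.array([0.95, 0.90, 0.84, 0.80])
model_wing = np.array([1.90e6, 1.81e6, 1.69e6, 1.60e6])
print(f"calibration factor: {calibration_factor(obs_wing, model_wing):.3e}")
```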

Castelli, Fiorella