These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Filament winding cylinders. II - Validation of the process model  

NASA Technical Reports Server (NTRS)

Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

1990-01-01

2

Fatigue crack growth under variable-amplitude loading: Part II - Code development and model validation  

E-print Network

Fatigue crack growth under variable-amplitude loading: Part II - Code development and model validation ... 2001; accepted 12 February 2001. Abstract: A state-space model of fatigue crack growth has been ... information for code development and validates the state-space model with fatigue test data for different types ...

Ray, Asok

3

External validation of the SAPS II, APACHE II and APACHE III prognostic models in South England: a multicentre study  

Microsoft Academic Search

Objective. External validation of three prognostic models in adult intensive care patients in South England. Design. Prospective cohort study. Setting. Seventeen intensive care units (ICU) in the South West Thames Region in South England. Patients and participants. Data of 16,646 patients were analysed. Interventions. None. Measurements and results. We compared directly the predictive accuracy of three prognostic models (SAPS II,

Dieter H. Beck; Gary B. Smith; John V. Pappachan; Brian Millar

2003-01-01
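
The study in record 3 above compares the predictive accuracy of prognostic scores against observed outcomes. A common ingredient of such external validation is discrimination, usually summarized by the area under the ROC curve; the sketch below computes it via the rank-sum identity. This is a generic illustration under our own assumptions (tie-free scores, hypothetical variable names), not the study's actual analysis code.

```python
import numpy as np

def auc(predicted_risk, died):
    """Discrimination of a prognostic model: the probability that a
    randomly chosen non-survivor received a higher predicted risk than
    a randomly chosen survivor (Mann-Whitney / rank-sum identity;
    ties are not handled specially in this sketch)."""
    risk = np.asarray(predicted_risk, dtype=float)
    died = np.asarray(died, dtype=bool)
    ranks = risk.argsort().argsort() + 1           # 1-based ranks
    n_pos, n_neg = died.sum(), (~died).sum()
    return (ranks[died].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# e.g. auc([0.10, 0.40, 0.35, 0.80], [0, 0, 1, 1]) -> 0.75
```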

4

Validating simulation models  

Microsoft Academic Search

In this paper we give a general introduction to model validation, define the various validation techniques, discuss conceptual and operational validity, and present a recommended model validation procedure.

Robert G. Sargent

1983-01-01

5

Capnography and the bain circuit II: Validation of a computer model  

Microsoft Academic Search

Validation of a computer model is described. The behavior of this model is compared both with mechanical ventilation of a test lung in a laboratory setup that uses a washout method and with manual ventilation. A comparison is also made with results obtained from a volunteer breathing spontaneously through a Bain circuit and with results published in the literature. This

Jan E. W. Beneken; Nikolaus Gravenstein; Samsun Lampotang; Jan J. van der Aa; Joachim S. Gravenstein

1987-01-01

6

Person Heterogeneity of the BDI-II-C and Its Effects on Dimensionality and Construct Validity: Using Mixture Item Response Models  

ERIC Educational Resources Information Center

This study applied the mixed Rasch model to investigate person heterogeneity of the Beck Depression Inventory-II-Chinese version (BDI-II-C) and its effects on dimensionality and construct validity. Person heterogeneity was reflected by two latent classes that differ qualitatively. Additionally, person heterogeneity adversely affected the…

Wu, Pei-Chen; Huang, Tsai-Wei

2010-01-01

7

Validation of simulation models  

Microsoft Academic Search

This is a tutorial paper on validation of simulation models. Included in this tutorial are what is meant by validation, the problem dependent characteristics of simulation model validation, descriptions of the various validation techniques and their use and a discussion on the statistics used in validation techniques (but not the detailed statistical tests themselves).

Robert G. Sargent

1979-01-01

8

Black liquor combustion validated recovery boiler modeling: Final year report. Volume 2 (Appendices I, section 5 and II, section 1)  

SciTech Connect

This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 2 contains the last section of Appendix I, Radiative heat transfer in kraft recovery boilers, and the first section of Appendix II, The effect of temperature and residence time on the distribution of carbon, sulfur, and nitrogen between gaseous and condensed phase products from low temperature pyrolysis of kraft black liquor.

Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

1998-08-01

9

Air filtration with moisture and frosting phase changes in fiberglass insulation—II. Model validation  

Microsoft Academic Search

A numerical simulation, employing a local-volume-averaging formulation, was validated for each flow direction (exfiltration/infiltration), based on the experimental temperature and moisture accumulation results obtained in Part I of this study. The predicted results compare well with the measured temperature profiles throughout the insulation slab for both air exfiltration and infiltration. The comparison for the moisture accumulation profiles is generally reasonable,

D. R. Mitchell; Y.-X. Tao; R. W. Besant

1995-01-01

10

Simulation model verification and validation  

Microsoft Academic Search

Verification and validation of simulation models are discussed. The different approaches to deciding model validity are described; how model verification and validation relate to the model development process is specified; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; ways to document results are given; and a recommended validation procedure is presented

Robert G. Sargent

1991-01-01

11

Simulation model verification and validation  

Microsoft Academic Search

This paper discusses verification and validation of simulation models. The different approaches to deciding model validity are described; how model verification and validation relate to the model development process is specified; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; ways to document results are given; and a recommended validation procedure is

Robert G. Sargent

1991-01-01

12

Test Method for Boom Suspension Influence on Spray Distribution, Part II: Validation and Use of a Spray Distribution Model  

E-print Network

Test Method for Boom Suspension Influence on Spray Distribution, Part II: Validation and Use of a Spray Distribution Model ... behaviour at farm level. The concept is for the sprayer to travel over a bump, to measure boom movements ...

Paris-Sud XI, Université de

13

Assessing Wildlife Habitat Value of New England Salt Marshes: II. Model Testing and Validation  

EPA Science Inventory

We test a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. Assessment scores ranged f...

14

MEDSLIK-II, a Lagrangian marine oil spill model for short-term forecasting - Part 2: Numerical simulations and validations  

NASA Astrophysics Data System (ADS)

In this paper we use MEDSLIK-II, a Lagrangian marine oil spill model described in Part 1 of this paper (De Dominicis et al., 2013), to simulate oil slick transport and transformation processes for realistic oceanic cases where satellite or drifting buoy data are available for verification. The model is coupled with operational oceanographic currents, winds from atmospheric analyses and remote-sensing data for initialization. The sensitivity of the oil spill simulations to several model parameterizations is analyzed and the results are validated using surface drifters and SAR (Synthetic Aperture Radar) images in different regions of the Mediterranean Sea. It is found that the forecast skill of Lagrangian trajectories largely depends on the accuracy of the Eulerian ocean currents: the operational models give useful estimates of currents, but high-frequency (hourly) and high spatial resolution is required, and the Stokes drift velocity often has to be added, especially in coastal areas. From a numerical point of view, it is found that a realistic oil concentration reconstruction is obtained using an oil tracer grid resolution of about 100 m, with at least 100 000 Lagrangian particles. Moreover, sensitivity experiments with uncertain model parameters show that the oil type and the slick thickness are, among all the others, the key model parameters affecting the simulation results. Taking as acceptable a maximum spatial error in the simulated trajectories of the order of three times the horizontal resolution of the Eulerian ocean currents, the predictability skill for particle trajectories is 1 to 2.5 days depending on the specific current regime. This suggests that re-initialization of the simulations is required every day.

De Dominicis, M.; Pinardi, N.; Zodiatis, G.; Archetti, R.

2013-03-01
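
The trajectory-skill criterion in record 14 above (a forecast is considered acceptable until the spatial error exceeds about three times the horizontal resolution of the Eulerian currents) maps directly onto code. A minimal sketch, assuming both tracks are sampled at common times and already projected to a local Cartesian frame in metres:

```python
import numpy as np

def predictability_horizon(sim_xy, obs_xy, times, grid_res_m, factor=3.0):
    """First time at which the separation between a simulated Lagrangian
    trajectory and an observed drifter track exceeds `factor` times the
    resolution of the Eulerian current field. Returns None if the error
    never exceeds the threshold within the record."""
    sep = np.linalg.norm(np.asarray(sim_xy) - np.asarray(obs_xy), axis=1)
    over = np.nonzero(sep > factor * grid_res_m)[0]
    return times[over[0]] if over.size else None
```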

15

Error characterization of the Gaia astrometric solution. II. Validating the covariance expansion model  

NASA Astrophysics Data System (ADS)

Context. To use the data in the future Gaia catalogue it is important to have accurate estimates of the statistical uncertainties and correlations of the errors in the astrometric data given in the catalogue. Aims: In a previous paper we derived a mathematical model for computing the covariances of the astrometric data based on series expansions and a simplified attitude description. The aim of the present paper is to determine to what extent this model provides an accurate representation of the expected random errors in the astrometric solution for Gaia. Methods: We simulate the astrometric core solution by making least-squares solutions of the astrometric parameters for one million stars and the attitude parameters for a five-year mission, using nearly one billion simulated elementary observations for a total of 26 million unknowns. Two cases are considered: one in which all stars have the same magnitude, and another with 30% brighter and 70% fainter stars. The resulting astrometric errors are statistically compared with the model predictions. Results: In all cases considered, and within the statistical uncertainties of the numerical experiments (typically below 0.4%), the theoretically calculated variances and covariances are consistent with the simulations. To achieve this it is however necessary to expand the covariances to at least third or fourth order, and to apply a (theoretically motivated and derived) "fudge factor" in the kinematographic model. Conclusions: The model provides a feasible method to estimate the covariance of arbitrary astrometric data, accurate enough for most applications, and as such it should be available as part of the user's interface to the Gaia catalogue. A main assumption in the current model is that the observational errors are uncorrelated (e.g., photon noise), and further studies are needed on how correlated modelling errors, in particular in the attitude, can be taken into account.

Holl, B.; Lindegren, L.; Hobbs, D.

2012-07-01

16

Verifying and validating simulation models  

Microsoft Academic Search

This paper discusses verification and validation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1996-01-01

17

Verifying and validating simulation models  

Microsoft Academic Search

This paper discusses verification and validation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1996-01-01

18

Verification validation: model verification and validation  

Microsoft Academic Search

In this paper we outline practical techniques and guidelines for verifying and validating simulation models. The goal of verification and validation is a model that is accurate when used to predict the performance of the real-world system that it represents, or to predict the difference in performance between two scenarios or two model configurations. The process of verifying and validating

John S. Carson II

2002-01-01

19

Reliability and validity: Part II.  

PubMed

Determining measurement reliability and validity involves complex processes. There is usually room for argument about most instruments. It is important that the researcher clearly describes the processes upon which she based the decision to use a particular instrument, and presents the available evidence showing that the instrument is reliable and valid for the current purposes. In some cases, the researcher may need to conduct pilot studies to obtain evidence upon which to decide whether the instrument is valid for a new population or a different setting. In all cases, the researcher must present a clear and complete explanation for the choices she has made regarding reliability and validity. The consumer must then judge the degree to which the researcher has provided adequate and theoretically sound rationale. Although I have tried to touch on most of the important concepts related to measurement reliability and validity, it is beyond the scope of this column to be exhaustive. There are textbooks devoted entirely to specific measurement issues if readers require more in-depth knowledge. PMID:15182122

Davis, Debora Winders

2004-01-01

20

Groundwater Model Validation  

SciTech Connect

Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence-building and long-term iterative process (Hassan, 2004a); model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). A hierarchical approach is proposed to make this determination. This approach is based on computing five measures or metrics and following a decision tree to determine whether a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study, assuming field data either consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model's performance. Standard statistical tests are used to evaluate these measures, with the results indicating that they are appropriate measures for evaluating model realizations. The use of validation data to constrain model input parameters is shown for the second case study using Markov chain Monte Carlo sampling within a Bayesian framework. The approach shows great potential to be helpful in the validation process and in incorporating prior knowledge with new field data to derive posterior distributions for both model input and output.

Ahmed E. Hassan

2006-01-24
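
Record 20 above scores stochastic realizations against validation data and runs the counts through a decision tree. A minimal sketch of the first step, with RMSE standing in for the paper's five metrics (the threshold and the 80% acceptance rule are illustrative assumptions, not Hassan's actual criteria):

```python
import numpy as np

def acceptable_fraction(realizations, validation_data, tol):
    """Fraction of stochastic model realizations judged acceptable.
    `realizations` has shape (n_realizations, n_locations); here a
    realization is accepted when its RMSE against the validation data
    falls below `tol`."""
    errors = np.asarray(realizations) - np.asarray(validation_data)
    rmse = np.sqrt(np.mean(errors ** 2, axis=1))
    return float(np.mean(rmse < tol))

def model_adequate(realizations, validation_data, tol, required=0.8):
    # one illustrative branch of the decision tree: demand that at
    # least `required` of the realizations conform with the new data
    return acceptable_fraction(realizations, validation_data, tol) >= required
```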

21

Base Flow Model Validation  

NASA Technical Reports Server (NTRS)

A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes relevant to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the likelihood of success, and a wind tunnel facility was used for the tests. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, within a building-block approach to validation in which cold, non-reacting test data were used first, followed by more complex reacting base flow cases.

Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

2011-01-01

22

Black liquor combustion validated recovery boiler modeling: Final year report. Volume 3 (Appendices II, sections 2--3 and III)  

SciTech Connect

This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 3 contains the following appendix sections: Formation and destruction of nitrogen oxides in recovery boilers; Sintering and densification of recovery boiler deposits: laboratory data and a rate model; and Experimental data on rates of particulate formation during char bed burning.

Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

1998-08-01

23

Black Liquor Combustion Validated Recovery Boiler Modeling, Final Year Report, Volume 3: Appendix II, Sections 2 & 3 and Appendix III  

SciTech Connect

This project was initiated in October 1990 with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. Many of these objectives were accomplished at the end of the first five years and documented in a comprehensive report on that work (DOE/CE/40936-T3, 1996). A critical review of recovery boiler modeling, carried out in 1995, concluded that further enhancements of the model were needed to make reliable predictions of key output variables. In addition, there was a need for sufficient understanding of fouling and plugging processes to allow model outputs to be interpreted in terms of the effect on plugging and fouling. As a result, the project was restructured and reinitiated at the end of October 1995, and was completed in June 1997. The entire project is now complete and this report summarizes all of the work done on the project since it was restructured. The key tasks to be accomplished under the restructured project were to (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes; (2) Validate the enhanced furnace models, so that users can have confidence in the results; (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler; and (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the U.S. kraft pulp industry.

T.M. Grace, W.J. Frederick, M. Salcudean, R.A. Wessel

1998-08-01

24

Development of a new version of the Liverpool Malaria Model. II. Calibration and validation for West Africa  

PubMed Central

Background In the first part of this study, an extensive literature survey led to the construction of a new version of the Liverpool Malaria Model (LMM). A new set of parameter settings was provided and a new development of the mathematical formulation of important processes related to the vector population was performed within the LMM. In this part of the study, previously undetermined model parameters are calibrated using data from field studies. The latter are also used to validate the new LMM version, which is furthermore compared against the original LMM version. Methods For the calibration and validation of the LMM, numerous entomological and parasitological field observations were gathered for West Africa. Continuous and quality-controlled temperature and precipitation time series were constructed using intermittent raw data from 34 weather stations across West Africa. The meteorological time series served as the LMM data input. The skill of LMM simulations was tested for 830 different settings of the undetermined LMM parameters. The model version with the highest skill score in terms of entomological malaria variables was taken as the final setting of the new LMM version. Results Validation of the new LMM version in West Africa revealed that the simulations compare well with entomological field observations. The new version reproduces realistic transmission rates, and simulated malaria seasons are comparable to field observations. Overall the new model version performs much better than the original model. The new model version enables the detection of the epidemic malaria potential at the fringes of endemic areas and, more importantly, it is now applicable to the vast area of malaria endemicity in the humid African tropics. Conclusions A review of entomological and parasitological data from West Africa enabled the construction of a new LMM version. This model version represents a significant step forward in the modelling of a weather-driven malaria transmission cycle. The LMM is now more suitable for use in malaria early warning systems as well as for malaria projections based on climate change scenarios, both in epidemic and endemic malaria areas. PMID:21410939

2011-01-01
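
Record 24 above calibrates the model by scoring 830 candidate parameter settings and keeping the best. A minimal brute-force sketch of that idea; `model(params, forcing)` and the 1 - NRMSE skill metric are placeholders, not the LMM's actual interface or skill score:

```python
import itertools
import numpy as np

def skill(sim, obs):
    """Illustrative skill score: 1 - normalized RMSE (higher is better)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sqrt(np.mean((sim - obs) ** 2)) / (np.std(obs) + 1e-12)

def calibrate(model, observations, forcing, param_grid):
    """Run the model for every combination in `param_grid` (a dict of
    parameter name -> list of candidate values) and return the
    highest-scoring (skill, parameter set) pair."""
    candidates = (dict(zip(param_grid, values))
                  for values in itertools.product(*param_grid.values()))
    return max(((skill(model(p, forcing), observations), p)
                for p in candidates), key=lambda pair: pair[0])
```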

25

A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part II - Validation and localization analysis  

NASA Astrophysics Data System (ADS)

We study the mechanical failure of cemented granular materials (e.g., sandstones) using a constitutive model based on breakage mechanics for grain crushing and damage mechanics for cement fracture. The theoretical aspects of this model are presented in Part I: Tengattini et al. (2014), A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables, Part I - Theory (Journal of the Mechanics and Physics of Solids, 10.1016/j.jmps.2014.05.021). In this Part II we investigate the constitutive and structural responses of cemented granular materials through analyses of Boundary Value Problems (BVPs). The multiple failure mechanisms captured by the proposed model enable the behavior of cemented granular rocks to be well reproduced for a wide range of confining pressures. Furthermore, through comparison of the model predictions and experimental data, the micromechanical basis of the model provides improved understanding of failure mechanisms of cemented granular materials. In particular, we show that grain crushing is the predominant inelastic deformation mechanism under high pressures while cement failure is the relevant mechanism at low pressures. Over an intermediate pressure regime a mixed mode of failure mechanisms is observed. Furthermore, the micromechanical roots of the model allow the effects on localized deformation modes of various initial microstructures to be studied. The results obtained from both the constitutive responses and BVP solutions indicate that the proposed approach and model provide a promising basis for future theoretical studies on cemented granular materials.

Das, Arghya; Tengattini, Alessandro; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

2014-10-01

26

Verification and validation: verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2003-01-01

27

Data Assimilation of Photosynthetic Light-use Efficiency using Multi-angular Satellite Data: II Model Implementation and Validation  

NASA Technical Reports Server (NTRS)

Spatially explicit and temporally continuous estimates of photosynthesis will be of great importance for increasing our understanding of, and ultimately closing, the terrestrial carbon cycle. Current capabilities to model photosynthesis, however, are limited by the difficulty of representing with sufficient accuracy the complexity of the underlying biochemical processes and the numerous environmental constraints imposed upon plant primary production. A potentially powerful alternative for modeling photosynthesis is the use of multi-angular satellite data to infer light-use efficiency (e) directly from spectral reflectance properties in connection with canopy shadow fractions. Hall et al. (this issue) introduced a new approach for predicting gross ecosystem production that would allow the use of such observations in a data assimilation mode to obtain spatially explicit variations in e from infrequent polar-orbiting satellite observations, while meteorological data are used to account for the more dynamic responses of e to variations in environmental conditions caused by changes in weather and illumination. In this second part of the study we implement and validate the approach of Hall et al. (this issue) across an ecologically diverse array of eight flux-tower sites in North America using data acquired from the Compact High Resolution Imaging Spectroradiometer (CHRIS) and eddy-flux observations. Our results show significantly enhanced estimates of e, and therefore of cumulative gross ecosystem production (GEP), over the course of one year at all examined sites. We also demonstrate that e is highly heterogeneous even across small study areas. Data assimilation and direct inference of GEP from space using a new, proposed sensor could therefore be a significant step towards closing the terrestrial carbon cycle.

Hilker, Thomas; Hall, Forest G.; Tucker, J.; Coops, Nicholas C.; Black, T. Andrew; Nichol, Caroline J.; Sellers, Piers J.; Barr, Alan; Hollinger, David Y.; Munger, J. W.

2012-01-01

28

Hydrogen peroxide metabolism and sensing in human erythrocytes: a validated kinetic model and reappraisal of the role of peroxiredoxin II.  

PubMed

Hydrogen peroxide (H2O2) metabolism in human erythrocytes has been thoroughly investigated, but unclear points persist. By integrating the available data into a mathematical model that accurately represents the current understanding and comparing computational predictions to observations we sought to (a) identify inconsistencies in present knowledge, (b) propose resolutions, and (c) examine their functional implications. The systematic confrontation of computational predictions with experimental observations of the responses of intact erythrocytes highlighted the following important discrepancy. The high rate constant (10^7-10^8 M^-1 s^-1) for H2O2 reduction determined for purified peroxiredoxin II (Prx2) and the high abundance of this protein indicate that under physiological conditions it consumes practically all the H2O2. However, this is inconsistent with extensive evidence that Prx2's contribution to H2O2 elimination is comparable to that of catalase. Models modified such that Prx2's effective peroxidase activity is just 10^5 M^-1 s^-1 agree nearly quantitatively with extensive experimental observations. This low effective activity is probably due to a strong but readily reversible inhibition of Prx2's peroxidatic activity in intact cells, implying that the main role of Prx2 in human erythrocytes is not to eliminate peroxide substrates. Simulations of the responses to physiological H2O2 stimuli highlight that a design combining abundant Prx2 with a low effective peroxidase activity spares NADPH while improving potential signaling properties of the Prx2/thioredoxin/thioredoxin reductase system. PMID:24952139

Benfeitas, Rui; Selvaggio, Gianluca; Antunes, Fernando; Coelho, Pedro M B M; Salvador, Armindo

2014-09-01
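
The quantitative crux of record 28 above is that the effective rate constant assigned to Prx2 decides whether it dominates H2O2 clearance or merely matches catalase. A toy pseudo-first-order sketch of two parallel sinks makes the point; every number here is illustrative, not a fitted value from the paper:

```python
import numpy as np

def h2o2_decay(h2o2_0, prx2_conc, k_cat_eff, k_prx2, t_end, n=200):
    """H2O2 consumed by two parallel pseudo-first-order sinks: catalase
    (lumped effective rate k_cat_eff, s^-1) and Prx2 (second-order
    k_prx2, M^-1 s^-1, times its concentration). Analytic solution of
    d[H2O2]/dt = -(k_cat_eff + k_prx2*[Prx2]) [H2O2]."""
    t = np.linspace(0.0, t_end, n)
    k_total = k_cat_eff + k_prx2 * prx2_conc
    return t, h2o2_0 * np.exp(-k_total * t)

# with k_prx2 ~ 1e8 M^-1 s^-1 Prx2 overwhelms catalase; at an effective
# 1e5 M^-1 s^-1 the two sinks become comparable, as the data require
_, fast = h2o2_decay(1e-6, prx2_conc=4e-4, k_cat_eff=2e2, k_prx2=1e8, t_end=1e-3)
_, slow = h2o2_decay(1e-6, prx2_conc=4e-4, k_cat_eff=2e2, k_prx2=1e5, t_end=1e-3)
```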

29

Validation of performance assessment models  

SciTech Connect

The purpose of model validation in a low-level waste site performance assessment is to increase confidence in predictions of the migration and fate of future releases from the wastes. Unlike the process of computer code verification, model validation is a site-specific process that requires site-specific data. This paper provides an overview of the topic of model validation and describes the general approaches, strategies, and limitations of model validation being considered by various researchers concerned with the subject.

Bergeron, M.P.; Kincaid, C.T.

1991-11-01

30

Model Valid Prediction Period  

NASA Astrophysics Data System (ADS)

A new concept, valid prediction period (VPP), is presented here to evaluate model predictability. VPP is defined as the time period when the prediction error first exceeds a pre-determined criterion (i.e., the tolerance level). It depends not only on the instantaneous error growth, but also on the noise level, the initial error, and the tolerance level. The model predictability skill is then represented by a single scalar, VPP. The longer the VPP, the higher the model predictability skill. A theoretical framework based on the backward Fokker-Planck equation is developed to determine the probability density function (pdf) of VPP. Verification of a Gulf of Mexico nowcast/forecast model is used as an example to demonstrate the usefulness of VPP. Power law scaling is found in the mean square error of displacement between drifting buoy and model trajectories (both at 50 m depth). The pdf of VPP is asymmetric with a long and broad tail on the higher value side, which suggests long-term predictability. The calculations demonstrate that the long-term (extremely long, such as 50-60 day) predictability is not an "outlier" and shares the same statistical properties as the short-term predictions. References: Chu, P.C., L.M. Ivanov, and C.W. Fan, Backward Fokker-Planck equation for determining model predictability with unknown initial error distribution. J. Geophys. Res., in press, 2002. Chu, P.C., L.M. Ivanov, T.M. Margolina, and O.V. Melnichenko, On probabilistic stability of an atmospheric model to various amplitude perturbations. J. Atmos. Sci., in press, 2002. Chu, P.C., L.M. Ivanov, L. Kantha, O.V. Melnichenko, and Y.A. Poberezhny, The long-term correlations and power decay law in model prediction skill. Geophys. Res. Lett., in press, 2002.

Chu, P. C.

2002-12-01
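
The VPP definition in record 30 above is simple enough to state as code. A minimal sketch, assuming the forecast error has already been evaluated on a common time axis:

```python
import numpy as np

def valid_prediction_period(times, errors, tolerance):
    """Valid prediction period: the first time at which the prediction
    error exceeds the pre-determined tolerance level (None if the error
    stays within tolerance for the whole record)."""
    idx = np.nonzero(np.asarray(errors) > tolerance)[0]
    return times[idx[0]] if idx.size else None

# an empirical pdf of VPP, as studied in the record above, follows from
# applying this to an ensemble of error series and histogramming:
# vpps = [valid_prediction_period(t, e, tol) for e in ensemble_errors]
```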

31

Verification, validation and accreditation of simulation models  

Microsoft Academic Search

The paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented;

Robert G. Sargent

2000-01-01

32

Validation and verification of simulation models  

Microsoft Academic Search

This paper discusses validation and verification of simulation models. The different approaches to deciding model validity are presented; how model validation and verification relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1999-01-01

33

Verification and validation of simulation models  

Microsoft Academic Search

This paper discusses verification and validation of simulation models. The different approaches to deciding model validity are described; how model verification and validation relate to the model development process is specified; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1994-01-01

34

Validation and verification of simulation models  

Microsoft Academic Search

This paper discusses validation and verification of simulation models. The different approaches to deciding model validity are presented; how model validation and verification relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented

Robert G. Sargent

1999-01-01

35

Ab initio structural modeling of and experimental validation for Chlamydia trachomatis protein CT296 reveal structural similarity to Fe(II) 2-oxoglutarate-dependent enzymes  

SciTech Connect

Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur.

Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

2012-02-13

36

Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes  

PubMed Central

Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur. PMID:21965559

Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

2011-01-01

37

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described, a graphical paradigm that relates verification and validation to the model development process is presented, and various validation techniques are defined. Conceptual model validity, model verification, operational validity, and data validity are discussed and a way to document results is

Robert G. Sargent

2011-01-01

38

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2010-01-01

39

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2005-01-01

40

Validation and Verification of Simulation Models  

Microsoft Academic Search

In this paper we discuss validation and verification of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate validation and verification to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2004-01-01

41

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results

Robert G. Sargent

1994-01-01

42

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2007-01-01

43

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2003-01-01

44

Model-Based Sensor Fault Detection and Isolation System for Unmanned Ground Vehicles: Experimental Validation (part II)  

Microsoft Academic Search

This paper presents implementation details of a model-based sensor fault detection and isolation system (SFDIS) applied to unmanned ground vehicles (UGVs). Structural analysis, applied to the nonlinear model of the UGV, is used to build the residual generation module; this is followed by a residual evaluation module capable of detecting single and multiple sensor faults, as detailed in part I (Monteriu et

Andrea Monteriu; Prateek Asthan; Kimon P. Valavanis; Sauro Longhi

2007-01-01

45

MEDSLIK-II, a Lagrangian marine surface oil spill model for short-term forecasting - Part 2: Numerical simulations and validations  

NASA Astrophysics Data System (ADS)

In this paper we use MEDSLIK-II, a Lagrangian marine surface oil spill model described in Part 1 (De Dominicis et al., 2013), to simulate oil slick transport and transformation processes for realistic oceanic cases, where satellite or drifting buoy data are available for verification. The model is coupled with operational oceanographic currents, winds from atmospheric analyses and remote sensing data for initialization. The sensitivity of the oil spill simulations to several model parameterizations is analyzed and the results are validated using surface drifters, SAR (synthetic aperture radar) and optical satellite images in different regions of the Mediterranean Sea. It is found that the forecast skill of Lagrangian trajectories largely depends on the accuracy of the Eulerian ocean currents: the operational models give useful estimates of currents, but high-frequency (hourly) and high spatial resolution is required, and the Stokes drift velocity has to be added, especially in coastal areas. From a numerical point of view, it is found that a realistic oil concentration reconstruction is obtained using an oil tracer grid resolution of about 100 m, with at least 100 000 Lagrangian particles. Moreover, sensitivity experiments with uncertain model parameters show that the oil type and the slick thickness are, among all the others, the key model parameters affecting the simulation results. Taking as acceptable a maximum spatial error in the simulated trajectories of the order of three times the horizontal resolution of the Eulerian ocean currents, the predictability skill for particle trajectories is 1 to 2.5 days depending on the specific current regime. This suggests that re-initialization of the simulations is required every day.

De Dominicis, M.; Pinardi, N.; Zodiatis, G.; Archetti, R.

2013-11-01

46

Predicting germination in semi-arid wildland seedbeds II. Field validation of wet thermal-time models  

Microsoft Academic Search

Accurate prediction of germination for species used in semi-arid land revegetation would support selection of plant materials for specific climatic conditions and sites. Wet thermal-time models predict germination time by summing progress toward germination of subpopulation percentages as a function of temperature across intermittent wet periods or within single wet periods. Wet periods may be defined by any reasonable seedbed water

Jennifer K. Rawlins; Bruce A. Roundy; Dennis Egget; Nathan Cline
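
Record 46 above describes wet thermal-time models that accumulate thermal time only while the seedbed is wet. A minimal sketch of that accumulation rule; the base temperature, the degree-hour requirement, and the hourly time step are illustrative assumptions, not the paper's fitted subpopulation parameters:

```python
import numpy as np

def wet_thermal_time_index(temps_c, is_wet, t_base=0.0, theta_req=60.0):
    """Accumulate degree-hours above `t_base` only during wet periods
    and return the index of the first time step at which the total
    meets the germination requirement `theta_req` (None if never)."""
    temps = np.asarray(temps_c, dtype=float)
    wet = np.asarray(is_wet, dtype=bool)
    gain = np.where(wet, np.maximum(temps - t_base, 0.0), 0.0)
    theta = np.cumsum(gain)                      # degree-hours, hourly data
    hit = np.nonzero(theta >= theta_req)[0]
    return int(hit[0]) if hit.size else None
```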

47

Thermospheric dynamics during September 18-19, 1984. II - Validation of the NCAR thermospheric general circulation model  

NASA Technical Reports Server (NTRS)

The winds, temperatures, and densities predicted by the thermospheric GCM are compared with measurements from the Equinox Transition Study of September 17-24, 1984. Agreement between predictions and observation is good in many respects. The quiet day observations contain a strong semidiurnal wind variation which is mainly due to upward-propagating tides. The storm day wind behavior is significantly different and includes a surge of equatorward winds due to a global propagating disturbance associated with the storm onset. A quantitative statistical comparison of the predicted and measured winds indicates that the equatorward winds in the model are weaker than the observed winds, particularly during storm times. A quiet day phase anomaly in the measured F region winds which is not reproduced by the model suggests the occurrence of an important unmodeled interaction between upward propagating semidiurnal tides and high-latitude effects.

Crowley, G.; Emery, B. A.; Roble, R. G.; Carlson, H. C., Jr.; Salah, J. E.

1989-01-01

48

Validation and verification of simulation models  

Microsoft Academic Search

This paper discusses validation and verification of simulation models. The different approaches to deciding model validity are presented; how model validation and verification relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1992-01-01

49

A N-dynamics Model for Predicting N-behavior Subject to Environmentally Friendly Fertilization Practices: II - Numerical Model and Model Validation  

Microsoft Academic Search

Nitrogen dynamics in the soil under the condition of environmentally friendly fertilization practices (EFFPs) is described by a comprehensive N-dynamics model. The model (first paper of this series, Transport in Porous Media 31(3) (1998), 249–274) is different from other models in its capability of simulating the special phenomena related to the application of EFFPs. In this paper, a finite difference

Fuli Wang; Jacob Bear; Avi Shaviv

1998-01-01

50

Testing and validating environmental models  

USGS Publications Warehouse

Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.

Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

1996-01-01

51

TUTORIAL: Validating biorobotic models  

NASA Astrophysics Data System (ADS)

Some issues in neuroscience can be addressed by building robot models of biological sensorimotor systems. What we can conclude from building models or simulations, however, is determined by a number of factors in addition to the central hypothesis we intend to test. These include the way in which the hypothesis is represented and implemented in simulation, how the simulation output is interpreted, how it is compared to the behaviour of the biological system, and the conditions under which it is tested. These issues will be illustrated by discussing a series of robot models of cricket phonotaxis behaviour.

Webb, Barbara

2006-09-01

52

A dynamic model of some malaria-transmitting anopheline mosquitoes of the Afrotropical region. II. Validation of species distribution and seasonal variations  

PubMed Central

Background The first part of this study aimed to develop a model for Anopheles gambiae s.l. with separate parametrization schemes for Anopheles gambiae s.s. and Anopheles arabiensis. The characterizations were constructed based on literature from the past decades. This part of the study focuses on the model’s ability to separate the mean state of the two species of the An. gambiae complex in Africa. The model is also evaluated with respect to capturing the temporal variability of An. arabiensis in Ethiopia. Before conclusions and guidance based on models can be made, models need to be validated. Methods The model used in this paper is described in part one (Malaria Journal 2013, 12:28). For the validation of the model, a database of 5,935 points on the presence of An. gambiae s.s. and An. arabiensis was constructed. An additional 992 points were collected on the presence of An. gambiae s.l. These data were used to assess whether the model could recreate the spatial distribution of the two species. The dataset is made available in the public domain. This is followed by a case study from Madagascar where the model’s ability to recreate the relative fraction of each species is investigated. In the last section the model’s ability to reproduce the temporal variability of An. arabiensis in Ethiopia is tested. The model was compared with data from four papers and one field survey covering two years. Results Overall, the model gives a realistic representation of seasonal and year-to-year variability in mosquito densities in Ethiopia. The model is also able to describe the distribution of An. gambiae s.s. and An. arabiensis in sub-Saharan Africa. This implies the model can be used for seasonal and long-term predictions of changes in the burden of malaria. Before models can be used to improve human health, or to guide which interventions should be applied where, there is a need to understand the system of interest; validation is an important part of this process. It is also found that one of the main mechanisms separating An. gambiae s.s. and An. arabiensis is the availability of hosts: humans and cattle. Climate plays a secondary, but still important, role. PMID:23442727

2013-01-01

53

Statistical validation of system models  

SciTech Connect

It is common practice in system analysis to develop mathematical models for system behavior. Frequently, the actual system being modeled is also available for testing and observation, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of systems when data taken during operation of the system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. An extension of the technique is also suggested, wherein randomness may be included in the mathematical model through the introduction of random variable and random process terms. These terms cause random system behavior that can be compared to the randomness in the bootstrap evaluation of experimental system behavior. In this framework, the stochastic mathematical model can be evaluated. A numerical example is presented to demonstrate the application of the technique.

Barney, P. [Sandia National Labs., Albuquerque, NM (United States); Ferregut, C.; Perez, L.E. [Texas Univ., El Paso, TX (United States); Hunter, N.F. [Los Alamos National Lab., NM (United States); Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States)

1997-01-01

54

Turbulence Modeling Verification and Validation  

NASA Technical Reports Server (NTRS)

Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification ensures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation.
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
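As a concrete illustration of the verification step, the sketch below applies the method of manufactured solutions to a second-order finite-difference discretization of a 1-D Poisson problem: a solution is chosen, the matching source term is derived analytically, and the observed order of accuracy is checked against the expected order of two. The problem, the manufactured solution, and the grids are our own choices for illustration and are unrelated to any particular RANS code.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0, 2nd-order central differences."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                # interior grid points
    # Manufactured solution u(x) = sin(pi x) gives f(x) = pi^2 sin(pi x).
    f = np.pi ** 2 * np.sin(np.pi * x)
    A = (np.diag(np.full(n, 2.0))
         + np.diag(np.full(n - 1, -1.0), 1)
         + np.diag(np.full(n - 1, -1.0), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))  # max-norm error

# A correct implementation should show the error falling ~4x per grid doubling.
errors = {n: solve_poisson(n) for n in (16, 32, 64, 128)}
ns = sorted(errors)
for a, b in zip(ns, ns[1:]):
    order = np.log2(errors[a] / errors[b])
    print(f"n={b:4d}  error={errors[b]:.2e}  observed order={order:.2f}")
```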

Rumsey, Christopher L.

2014-01-01

55

Is the Rey 15-Item Memory Test II (Rey II) a valid symptom validity test?: comparison with the TOMM.  

PubMed

The Rey 15-Item Memory Test II (Rey II) is a revised version of the original Rey Memory Test and is used as a measure of test-taking effort. In the present study, the concurrent validity of the Rey II was examined by comparing Rey II test scores to a well-established measure of symptom validity, the Test of Memory Malingering (TOMM). Retrospective chart review was conducted using the records of 60 veterans who were referred for outpatient neuropsychological testing and suspected of possible symptom exaggeration. Results of the study suggest that when compared to the TOMM, the Qualitative, as opposed to the Quantitative, scoring method of the Rey II was more discriminative, but showed both positive and negative predictive power that was unacceptably low, falling at .62 and .64, respectively. Clinical implications are discussed. PMID:19023746
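The predictive-power figures quoted above are simple ratios from a 2x2 classification table. The sketch below shows the computation; the counts are hypothetical, chosen only so that the ratios land near the values reported, and are not the study's data.

```python
def predictive_power(tp, fp, tn, fn):
    """Positive and negative predictive power from a 2x2 table."""
    ppp = tp / (tp + fp)  # P(criterion positive | test positive)
    npp = tn / (tn + fn)  # P(criterion negative | test negative)
    return ppp, npp

# Hypothetical counts against the TOMM criterion (illustrative only).
ppp, npp = predictive_power(tp=13, fp=8, tn=23, fn=13)
print(f"PPP = {ppp:.2f}, NPP = {npp:.2f}")  # -> PPP = 0.62, NPP = 0.64
```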

Whitney, Kriscinda A; Hook, Julie N; Steiner, Amy R; Shepard, Polly H; Callaway, Stephanie

2008-01-01

56

Model Validation with Hybrid Dynamic Simulation  

SciTech Connect

Model validation has been one of the central topics in power engineering studies for years. As model validation aims at obtaining reasonable models to represent actual behavior of power system components, it has been essential to validate models against actual measurements or known benchmark behavior. System-wide model simulation results can be compared with actual recordings. However, it is difficult to construct a simulation case for a large power system such as the WECC system and to narrow down to problematic models in a large system. Hybrid dynamic simulation with its capability of injecting external signals into dynamic simulation enables rigorous comparison of measurements and simulation in a small subsystem of interest. This paper presents such a model validation methodology with hybrid dynamic simulation. Two application examples on generator and load model validation are presented to show the validity of this model validation methodology. This methodology is further extended for automatic model validation and dichotomous subsystem model validation.
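A minimal sketch of the playback idea behind hybrid dynamic simulation: a recorded boundary signal is injected into a small model of the subsystem, and the simulated output is compared against the recording. The first-order lag model and the synthetic signals below are illustrative stand-ins for the paper's power-system components.

```python
import numpy as np

# Placeholder recordings: an injected boundary signal and the measured response.
t = np.linspace(0.0, 10.0, 1001)
u_meas = 1.0 + 0.1 * np.sin(2 * np.pi * 0.5 * t)        # injected signal
y_meas = 1.0 + 0.1 * np.sin(2 * np.pi * 0.5 * t - 0.4)  # recorded response

def simulate(tau, u, t):
    """First-order lag y' = (u - y) / tau, integrated with forward Euler."""
    y = np.empty_like(u)
    y[0] = u[0]
    dt = t[1] - t[0]
    for k in range(len(t) - 1):
        y[k + 1] = y[k] + dt * (u[k] - y[k]) / tau
    return y

# Inject the measured boundary signal into the model, then score the mismatch.
y_sim = simulate(tau=0.15, u=u_meas, t=t)
rmse = np.sqrt(np.mean((y_sim - y_meas) ** 2))
print(f"playback RMSE = {rmse:.4f}")
```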

Huang, Zhenyu; Kosterev, Dmitry; Guttromson, Ross T.; Nguyen, Tony B.

2006-06-18

57

Statistical validation of stochastic models  

SciTech Connect

It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.

Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

1996-12-31

58

Black Liquor Combustion Validated Recovery Boiler Modeling, Final Year Report, Volume 2: Appendix I, Section 5, and Appendix II, Section 1  

SciTech Connect

This project was initiated in October 1990 with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. Many of these objectives were accomplished at the end of the first five years and documented in a comprehensive report on that work (DOE/CE/40936-T3, 1996). A critical review of recovery boiler modeling, carried out in 1995, concluded that further enhancements of the model were needed to make reliable predictions of key output variables. In addition, there was a need for sufficient understanding of fouling and plugging processes to allow model outputs to be interpreted in terms of the effect on plugging and fouling. As a result, the project was restructured and reinitiated at the end of October 1995, and was completed in June 1997. The entire project is now complete and this report summarizes all of the work done on the project since it was restructured. The key tasks to be accomplished under the restructured project were to (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes; (2) Validate the enhanced furnace models, so that users can have confidence in the results; (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler; and (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the U.S. kraft pulp industry.

T.M. Grace, W.J. Frederick, M. Salcudean, R.A. Wessel

1998-08-01

59

Model validation with hybrid dynamic simulation  

Microsoft Academic Search

Model validation has been one of the central topics in power engineering studies for years. As model validation aims at obtaining reasonable models to represent actual behavior of power system components, it has been essential to validate models against actual measurements or known benchmark behavior. System-wide model simulation results can be compared with actual recordings. However, it is difficult to

Zhenyu Huang; M. Kosterev; Ross T. Guttromson; Tony B. Nguyen

2006-01-01

60

(Validity of environmental transfer models)  

SciTech Connect

BIOMOVS (BIOspheric MOdel Validation Study) is an international cooperative study initiated in 1985 by the Swedish National Institute of Radiation Protection to test models designed to calculate the environmental transfer and bioaccumulation of radionuclides and other trace substances. The objective of the symposium and workshop was to synthesize results obtained during Phase 1 of BIOMOVS (the first five years of the study) and to suggest new directions that might be pursued during Phase 2 of BIOMOVS. The travelers were an instrumental part of the development of BIOMOVS. This symposium allowed the travelers to present a review of past efforts at model validation and a synthesis of current activities and to refine ideas concerning future development of models and data for assessing the fate, effect, and human risks of environmental contaminants. R. H. Gardner also visited the Free University, Amsterdam, and the National Institute of Public Health and Environmental Protection (RIVM) in Bilthoven to confer with scientists about current research in theoretical ecology and the use of models for estimating the transport and effect of environmental contaminants and to learn about the European efforts to map critical loads of acid deposition.

Blaylock, B.G.; Hoffman, F.O.; Gardner, R.H.

1990-11-07

61

Verification, validation, and accreditation: verification, validation, and accreditation of simulation models  

Microsoft Academic Search

This paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented;

Robert G. Sargent

2000-01-01

62

Verification and Validation of Simulation Model  

E-print Network

Lecture-slide excerpts (recoverable content only): Verification — e.g., checking that a simulation model of open networks with exponential interarrival time distributions is consistent with known analytic results. Validation — the model should be "good enough" (subjective); seek …

Shihada, Basem

63

A Discussion on Experimental Model Validation  

Microsoft Academic Search

Model validation is essential in modeling and simulation. It "finalizes" the modeling process and provides the basis for reliable experiments with the model, and thus for gaining trustworthy insights into the system under study. Diverse techniques have been developed addressing different needs and are used during different phases in the modeling and simulation life cycle. Experimental model validation depends on

Stefan Leye; Jan Himmelspach; Adelinde M. Uhrmacher

2009-01-01

64

SRVAL. Stock-Recruitment Model VALidation Code  

SciTech Connect

SRVAL is a computer simulation model of the Hudson River striped bass population. It was designed to aid in assessing the validity of curve-fits of the linearized Ricker stock-recruitment model, modified to incorporate multiple-age spawners and to include an environmental variable, to variously processed annual catch-per-unit-effort (CPUE) statistics for a fish population. It is sometimes asserted that curve-fits of this kind can be used to determine the sensitivity of fish populations to such man-induced stresses as entrainment and impingement at power plants. SRVAL was developed to test such assertions and was utilized in testimony written in connection with the Hudson River Power Case (U. S. Environmental Protection Agency, Region II).

Christensen, S.W. [Oak Ridge National Lab., Oak Ridge, TN (United States)

1989-12-07

65

Obstructive lung disease models: what is valid?  

PubMed

Use of disease simulation models has led to scrutiny of model methods and demand for evidence that models credibly simulate health outcomes. We sought to describe recent obstructive lung disease simulation models and their validation. Medline and EMBASE were used to identify obstructive lung disease simulation models published from January 2000 to June 2006. Publications were reviewed to assess model attributes and four types of validation: first-order (verification/debugging), second-order (comparison with studies used in model development), third-order (comparison with studies not used in model development), and predictive validity. Six asthma and seven chronic obstructive pulmonary disease models were identified. Seven (54%) models included second-order validation, typically by comparing observed outcomes to simulations of source study cohorts. Seven (54%) models included third-order validation, in which modeled outcomes were usually compared qualitatively for agreement with studies independent of the model. Validation endpoints included disease prevalence, exacerbation, and all-cause mortality. Validation was typically described as acceptable, despite near-universal absence of criteria for judging adequacy of validation. Although over half of recent obstructive lung disease simulation models report validation, inconsistencies in validation methods and lack of detailed reporting make assessing adequacy of validation difficult. For simulation modeling to be accepted as a tool for evaluating clinical and public health programs, models must be validated to credibly simulate health outcomes of interest. Defining the required level of validation and providing guidance for quantitative assessment and reporting of validation are important future steps in promoting simulation models as practical decision tools. PMID:19353353

Ferdinands, Jill M; Mannino, David M

2008-12-01

66

Model Validation with Hybrid Dynamic Simulation  

SciTech Connect

Model validation has been one of the central topics in power engineering studies for years. As model validation aims at obtaining reasonable models to represent actual behavior of power system components, it has been essential to validate models against actual measurements or known benchmark behavior. System-wide model simulation results can be compared with actual recordings. However, it is difficult to construct a simulation case for a large power system such as the WECC system and to narrow down to problematic models in a large system. Hybrid dynamic simulation with its capability of injecting external signals into dynamic simulation enables rigorous comparison of measurements and simulation in a small subsystem of interest. This paper presents such a model validation methodology with hybrid dynamic simulation. Two application examples on generator and load model validation are presented to show the validity of this model validation methodology. This methodology is further extended for automatic model validation and dichotomous subsystem model validation. A few methods to define model quality indices have been proposed to quantify model error for model validation criteria development.

Huang, Zhenyu; Kosterev, Dmitry; Guttromson, Ross T.; Nguyen, Tony B.

2006-06-22

67

Model validation based on residuals analysis method  

Microsoft Academic Search

A model validation approach based on residuals analysis is presented for uncertain systems with unmodelled dynamics. Because of unmodelling errors, the residual signal is induced not only by noise but also by the unmodelling error. Therefore, under open-loop conditions, model validation is first recast as a hypothesis test, and a new residual estimation method is proposed. Through analyzing

Zong Qun; Dou Liqian; Sun Liankun; Liu Wenjing

2008-01-01

68

Proposed Modifications to the Conceptual Model of Coaching Efficacy and Additional Validity Evidence for the Coaching Efficacy Scale II-High School Teams  

ERIC Educational Resources Information Center

The purpose of this study was to determine whether theoretically relevant sources of coaching efficacy could predict the measures derived from the Coaching Efficacy Scale II-High School Teams (CES II-HST). Data were collected from head coaches of high school teams in the United States (N = 799). The analytic framework was a multiple-group…

Myers, Nicholas; Feltz, Deborah; Chase, Melissa

2011-01-01

69

Testing ecological models: the meaning of validation  

Microsoft Academic Search

The ecological literature reveals considerable confusion about the meaning of validation in the context of simulation models. The confusion arises as much from semantic and philosophical considerations as from the selection of validation procedures. Validation is not a procedure for testing scientific theory or for certifying the ‘truth’ of current scientific understanding, nor is it a required activity of every

Edward J. Rykiel

1996-01-01

70

Model Validation and the Modelica Language  

Microsoft Academic Search

Model validation is a crucial aspect of the develop- ment of any dynamic system that uses computer aided engineering (CAE). The ease of applying model validation techniques is dependent on the structure of the model, and this is often dependent on the CAE tool used. The Modelica language is both well-structured, and independent of any CAE tool. As such it

Richard Dorling

71

COMPOSABLE SIMULATION MODELS AND THEIR FORMAL VALIDATION  

E-print Network

Thesis excerpt (Claudia Szabo, B.Eng., "Politehnica"): … as well as the verification and validation of the composed model. Using a component-connector paradigm and simulation, shared models are reused and assembled in various combinations to meet different user

Teo, Yong-Meng

72

Validation of PEP-II Resonantly Excited Turn-by-Turn BPM Data  

SciTech Connect

For optics measurement and modeling of the PEP-II electron (HER) and positron (LER) storage rings, we have been doing well with MIA [1], which requires analyzing turn-by-turn Beam Position Monitor (BPM) data that are resonantly excited at the horizontal, vertical, and longitudinal tunes. However, in anticipation that certain BPM buttons and even pins in the PEP-II IR region would be missing for the run starting in January 2007, we had been developing a data validation process to reduce the effect of degraded BPM data accuracy on PEP-II optics measurement and modeling. Besides the routine process for ranking BPM noise level through data correlation among BPMs with a singular-value decomposition (SVD), we could also check BPM data symplecticity by comparing the invariant ratios. Results from PEP-II measurement will be presented.
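The SVD-based noise ranking mentioned above can be sketched as follows, assuming turn-by-turn readings form a (turns x BPMs) matrix in which correlated beam motion lives in a few dominant singular vectors; the residual after removing those modes serves as a per-BPM noise score. Matrix shapes, the rank cutoff, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def rank_bpm_noise(data, n_modes=4):
    """Score each BPM by the residual after removing dominant SVD modes.

    data: (n_turns, n_bpms) turn-by-turn readings.
    """
    centered = data - data.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    # The leading modes capture correlated (resonantly excited) beam motion.
    coherent = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes, :]
    return (centered - coherent).std(axis=0)   # high value -> noisy BPM

# Placeholder data: 1024 turns, 20 BPMs, one artificially noisy channel.
rng = np.random.default_rng(2)
turns = np.arange(1024)
beam = np.sin(2 * np.pi * 0.22 * turns)        # tune-line betatron motion
data = np.outer(beam, rng.uniform(0.5, 1.5, 20)) + rng.normal(0, 0.01, (1024, 20))
data[:, 7] += rng.normal(0, 0.2, 1024)         # a bad BPM

print("noisiest BPM index:", int(np.argmax(rank_bpm_noise(data))))
```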

Yan, Yiton T.; Cai, Yunhai; Colocho, William; Decker, Franz-Josef; /SLAC

2007-06-28

73

A tutorial on verification and validation of simulation models  

Microsoft Academic Search

In this tutorial paper we give a general introduction to verification and validation of simulation models, define the various validation techniques, and present a recommended model validation procedure.

Robert G. Sargent

1984-01-01

74

An expository on verification and validation of simulation models  

Microsoft Academic Search

In this expository paper we give a general introduction to verification and validation of simulation models, define the various validation techniques, and present a recommended model validation procedure.

Robert G. Sargent

1985-01-01

75

An overview of verification and validation of simulation models  

Microsoft Academic Search

We give a general introduction to verification and validation of simulation models, define the various validation techniques, and present a recommended model validation procedure in this overview paper.

Robert G. Sargent

1987-01-01

76

A tutorial on validation and verification of simulation models  

Microsoft Academic Search

We give a general introduction to validation and verification of simulation models, define the various validation techniques, and present a recommended model validation procedure in this tutorial paper.

Robert G. Sargent

1988-01-01

77

Verification and validation of simulation models  

Microsoft Academic Search

This paper surveys verification and validation of models, especially simulation models in operations research. For verification it discusses 1) general good programming practice (such as modular programming), 2) checking intermediate simulation outputs through tracing and statistical testing per module, 3) statistical testing of final simulation outputs against analytical results, and 4) animation. For validation it discusses 1) obtaining real-world data,

Jack P. C. Kleijnen

1995-01-01

78

Verifying and validating a simulation model  

Microsoft Academic Search

This paper presents the verification and validation (V&V) of a simulation model, with emphasis on possible modification. Based on the analysis, a new framework is proposed, and new terms are defined. An example is employed to demonstrate how the framework and related terms are used in verifying and validating an existing model.

Anbin Hu; Ye San; Zicai Wang

2001-01-01

79

Verifying and validating a simulation model  

Microsoft Academic Search

This paper presents the verification and validation of a simulation model with the emphasis on possible modification. Based on the analysis, a new framework is proposed, and new terms are defined. An example is employed to demonstrate how the framework and related terms are used in verifying and validating an existing model

Anbin Hu; Ye San; Zicai Wang

2001-01-01

80

Inert doublet model and LEP II limits  

SciTech Connect

The inert doublet model is a minimal extension of the standard model introducing an additional SU(2) doublet with new scalar particles that could be produced at accelerators. While there exists no LEP II analysis dedicated to these inert scalars, the absence of a signal within searches for supersymmetric neutralinos can be used to constrain the inert doublet model. This translation, however, requires some care because of the different properties of the inert scalars and the neutralinos. We investigate what restrictions an existing DELPHI Collaboration study of neutralino pair production can put on the inert scalars and discuss the result in connection with dark matter. We find that although an important part of the inert doublet model parameter space can be excluded by the LEP II data, the lightest inert particle still constitutes a valid dark matter candidate.

Lundstroem, Erik; Gustafsson, Michael; Edsjoe, Joakim [Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); INFN, Sezione di Padova, Department of Physics 'Galileo Galilei', Via Marzolo 8, I-35131, Padua (Italy) and Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden)

2009-02-01

81

Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces  

NASA Technical Reports Server (NTRS)

A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
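A toy sketch of the line-force idea: two point masses (parachute and payload) coupled by a tension-only spring-damper line, integrated with semi-implicit Euler. This is our own minimal illustration of force interaction between bodies through flexible lines, not the POST 2 implementation; every constant below is invented.

```python
import numpy as np

m = np.array([20.0, 100.0])     # masses: parachute, payload (kg)
x = np.array([0.0, -10.0])      # vertical positions (m), parachute on top
v = np.array([0.0, 0.0])        # velocities (m/s)
L0, k, c = 10.0, 5.0e3, 50.0    # line natural length, stiffness, damping
g, dt = -9.81, 1e-4

for _ in range(int(2.0 / dt)):  # simulate 2 seconds
    stretch = (x[0] - x[1]) - L0
    rate = v[0] - v[1]
    # A flexible line carries tension but cannot push.
    tension = max(0.0, k * stretch + c * rate)
    f = np.array([-tension, +tension]) + m * g      # line force + weight
    f[0] += -0.5 * 1.225 * 30.0 * v[0] * abs(v[0])  # crude parachute drag
    v += dt * f / m
    x += dt * v

print(f"line stretch after 2 s: {(x[0] - x[1]) - L0:.3f} m")
```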

Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

2009-01-01

82

On validation of multibody musculoskeletal models.  

PubMed

We review the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need for justification that the models are adequate representations of the systems they simulate. The need for a consistent terminology and established standards is identified and knowledge from fields with a more progressed state-of-the-art in verification and validation is introduced. A number of practical steps for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is hoped that a more structured approach to model validation can help to improve the credibility of musculoskeletal models. PMID:22468460

Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper; Rasmussen, John

2012-02-01

83

Verification and validation: some approaches and paradigms for verifying and validating simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. The different approaches to deciding model validity are described, two different paradigms that relate verification and validation to the model development process are presented, the use of graphical data statistical references for operational validity is discussed, and a recommended procedure for model validation is given.

Robert G. Sargent

2001-01-01

84

Algorithm for model validation: theory and applications.  

PubMed

Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability. PMID:17420476
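The iterative build-up of trust can be caricatured in a few lines: each validation experiment contributes a step whose size grows with the experiment's novelty and whose sign depends on how well the model predicted the outcome. The update rule below is our own toy rendering of that idea, not the authors' formula.

```python
def update_trust(trust, quality, novelty, gain=1.0):
    """Toy trust update: novel experiments move trust more than redundant ones.

    quality: how well the model predicted the observation (0..1).
    novelty: how different the experiment is from earlier ones (0..1).
    """
    step = gain * novelty * (quality - 0.5)   # good predictions raise trust
    return max(0.0, min(1.0, trust + step))

trust = 0.5
for quality, novelty in [(0.9, 1.0), (0.85, 0.6), (0.8, 0.1), (0.3, 0.9)]:
    trust = update_trust(trust, quality, novelty)
    print(f"quality={quality:.2f} novelty={novelty:.2f} -> trust={trust:.2f}")
```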

Sornette, D; Davis, A B; Ide, K; Vixie, K R; Pisarenko, V; Kamm, J R

2007-04-17

85

Description and validation of realistic and structured endourology training model  

PubMed Central

Purpose: The aim of the present study was to validate a model of training which combines the use of non-biological and ex vivo biological bench models, as well as the modelling of urological injuries for endourological treatment in a porcine animal model. Material and Methods: A total of 40 participants took part in this study. The duration of the activity was 16 hours. The model of training was divided into 3 levels: level I, concerning the acquisition of basic theoretical knowledge; level II, involving practice with the bench models; and level III, concerning practice in the porcine animal model. First, trainees practiced on animals without an induced injury model (ureteroscopy, management of guide wires and catheters under fluoroscopic control) and later practiced on the lithiasic animal model. During the activity, an evaluation of face and content validity was conducted, as well as a construct validation comparing trainees with experts. Evolution of the variables during the course within each group was analysed using Student’s t test for paired samples, while comparisons between groups were performed using Student’s t test for unpaired samples. Results: The assessments of face and content validity were satisfactory. The construct validation “within one trainee” showed statistically significant differences between the first and last time the trainees performed the tasks in the animal model, mainly in the knowledge-of-procedure and holmium laser lithotripsy categories. At the beginning of level III, there were also statistically significant differences between the trainees’ scores and the experts’ scores. Conclusions: This realistic endourology training model allows the acquisition of knowledge and of technical and non-technical skills, as evidenced by its face, content and construct validity. Structured use of bench models (biological and non-biological) and animal-model simulators increases basic endourological skills. PMID:25374928
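A minimal sketch of the two comparisons named in the methods, using SciPy's paired and unpaired t tests; the scores are invented placeholders, not the study's ratings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
first_session = rng.normal(2.8, 0.6, 40)                 # invented trainee scores
last_session = first_session + rng.normal(0.7, 0.4, 40)  # same trainees, later
experts = rng.normal(4.2, 0.4, 12)                       # invented expert scores

# Within-trainee change across the course: Student's t test for paired samples.
t_paired, p_paired = stats.ttest_rel(first_session, last_session)
# Trainees at the start of level III vs experts: unpaired t test.
t_unpaired, p_unpaired = stats.ttest_ind(first_session, experts)
print(f"paired:   t={t_paired:.2f}, p={p_paired:.4f}")
print(f"unpaired: t={t_unpaired:.2f}, p={p_unpaired:.4f}")
```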

Soria, Federico; Morcillo, Esther; Sanz, Juan Luis; Budia, Alberto; Serrano, Alvaro; Sanchez-Margallo, Francisco M

2014-01-01

86

Factorial validity and measurement invariance across intelligence levels and gender of the Overexcitabilities Questionnaire-II (OEQ-II).  

PubMed

The concept of overexcitability, derived from Dabrowski's theory of personality development, offers a promising approach for the study of the developmental dynamics of giftedness. The present study aimed at (a) examining the factorial structure of the Overexcitabilities Questionnaire-II scores (OEQ-II) and (b) testing measurement invariance of these scores across intelligence and gender. A sample of 641 Dutch-speaking adolescents from 11 to 15 years old, 363 girls and 278 boys, participated in this study. Results showed that a model without cross-loadings did not fit the data well (using confirmatory factor analysis), whereas a factor model in which all cross-loadings were included yielded fit statistics that were in support of the factorial structure of the OEQ-II scores (using exploratory structural equation modeling). Furthermore, our findings supported the assumption of (partial) strict measurement invariance of the OEQ-II scores across intelligence levels and across gender. Such levels of measurement invariance allow valid comparisons between factor means and factor relationships across groups. In particular, the gifted group scored significantly higher on intellectual and sensual overexcitability (OE) than the nongifted group, girls scored higher on emotional and sensual OE than boys, and boys scored higher on intellectual and psychomotor OE than girls. PMID:24079958

Van den Broeck, Wim; Hofmans, Joeri; Cooremans, Sven; Staels, Eva

2014-03-01

87

Ecological Validity of the Conners' Continuous Performance Test II in a School-Based Sample  

ERIC Educational Resources Information Center

The ecological validity of the Conners' Continuous Performance Test II (CPT-II) was examined using a sample of 206 first- and second-grade children. Children's CPT-II scores were correlated with observations of inattentive/hyperactive behavior during CPT-II administration, observations of children's behavior during analogue academic task,…

Weis, Robert; Totten, Sara J.

2004-01-01

88

Hot gas defrost model development and validation  

Microsoft Academic Search

This paper describes the development, validation, and application of a transient model for predicting the heat and mass transfer effects associated with an industrial air-cooling evaporator during a hot gas defrost cycle. The inputs to the model include the space dry bulb temperature, space humidity, coil geometry, frost thickness, frost density, and hot gas inlet temperature. The model predicts the

N. Hoffenbecker; S. A. Klein; D. T. Reindl

2005-01-01

89

Model validation: Cooling-tower performance  

Microsoft Academic Search

The purpose of the fill performance validation project is to examine the accuracy of the cooling tower computer models and fill performance data that have recently been made available through EPRI. This project compares actual full scale tower performance test results to those predicted by the tower models. The cooling tower models used in this project include: FACTS/FACTR, developed by

P. B. Miller; G. L. Starnes

1989-01-01

90

Some approaches and paradigms for verifying and validating simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. The different approaches to deciding model validity are described, two different paradigms that relate verification and validation to the model development process are presented, the use of graphical data statistical references for operational validity is discussed, and a recommended procedure for model validation is given

Robert G. Sargent

2001-01-01

91

Validation of a Lagrangian particle model  

NASA Astrophysics Data System (ADS)

In this paper a custom-developed model of dispersion of pollutants is presented. The proposed approach is based on both a Lagrangian particle model and an urban-scale diagnostic model of the air velocity field. Both models constitute a part of an operational air quality assessment system. The proposed model is validated by comparing its computed results with the results of measurements obtained in a wind tunnel reflecting conditions of the Mock Urban Setting Test (MUST) experiment. Commonly used measures of errors and model concordance are employed and the results obtained are additionally compared with those obtained by other authors for CFD and non-CFD class models. The obtained results indicate that the validity of the model presented in this paper is acceptable.
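Typical error and concordance measures used to validate dispersion models (fractional bias, normalized mean square error, fraction within a factor of two, and correlation) can be computed as below. The formulas follow common practice in the dispersion-modelling literature, and the concentrations are placeholders, so this is an illustration of the metric family rather than the paper's exact evaluation.

```python
import numpy as np

def validation_metrics(obs, pred):
    """Common dispersion-model performance measures."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    fac2 = np.mean((pred >= 0.5 * obs) & (pred <= 2.0 * obs))
    r = np.corrcoef(obs, pred)[0, 1]
    return {"FB": fb, "NMSE": nmse, "FAC2": fac2, "R": r}

# Placeholder concentrations at sampler locations (e.g., wind-tunnel points).
obs = np.array([1.2, 0.8, 2.5, 0.3, 1.9, 0.6])
pred = np.array([1.0, 0.9, 2.1, 0.5, 2.4, 0.4])
print(validation_metrics(obs, pred))
```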

Brzozowska, Lucyna

2013-05-01

92

A Validation Model for the DSR Protocol  

Microsoft Academic Search

This paper presents a validation model for the Dynamic Source Routing (DSR) protocol. This model is based on a formal specification of the protocol. It also provides a verification technique to verify the protocol against the IETF DSR draft requirements (1) as well as a testing technique for the generation of a set of scenarios to check the conformance of

Ana R. Cavalli; Cyril Grepet; Stéphane Maag; Vincent Tortajada

2004-01-01

93

Numerical model representation and validation strategies  

SciTech Connect

This paper describes model representation and validation strategies for use in numerical tools that define models in terms of topology, geometry, or topography. Examples of such tools include Computer-Assisted Engineering (CAE), Computer-Assisted Manufacturing (CAM), Finite Element Analysis (FEA), and Virtual Environment Simulation (VES) tools. These tools represent either physical objects or conceptual ideas using numerical models for the purpose of posing a question, performing a task, or generating information. Dependence on these numerical representations requires that models be precise, consistent across different applications, and verifiable. This paper describes a strategy for ensuring precise, consistent, and verifiable numerical model representations in a topographic framework. The main assertion put forth is that topographic model descriptions are more appropriate for numerical applications than topological or geometrical descriptions. A topographic model verification and validation methodology is presented.

Dolin, R.M.; Hefele, J.

1997-10-01

94

Statistical validation of physical system models  

SciTech Connect

It is common practice in applied mechanics to develop mathematical models for mechanical system behavior. Frequently, the actual physical system being modeled is also available for testing, and sometimes the test data are used to help identify the parameters of the mathematical model. However, no general-purpose technique exists for formally, statistically judging the quality of a model. This paper suggests a formal statistical procedure for the validation of mathematical models of physical systems when data taken during operation of the physical system are available. The statistical validation procedure is based on the bootstrap, and it seeks to build a framework where a statistical test of hypothesis can be run to determine whether or not a mathematical model is an acceptable model of a physical system with regard to user-specified measures of system behavior. The approach to model validation developed in this study uses experimental data to estimate the marginal and joint confidence intervals of statistics of interest of the physical system. These same measures of behavior are estimated for the mathematical model. The statistics of interest from the mathematical model are located relative to the confidence intervals for the statistics obtained from the experimental data. These relative locations are used to judge the accuracy of the mathematical model. A numerical example is presented to demonstrate the application of the technique.

Paez, T.L.; Barney, P. [Sandia National Lab., Albuquerque, NM (United States); Hunter, N.F. [Los Alamos National Lab., NM (United States); Ferregut, C.; Perez, L.E. [Univ. of Texas, El Paso, TX (United States). FAST Center for Structural Integrity of Aerospace Systems

1996-10-01

95

Validation of computational models in biomechanics.  

PubMed

The topics of verification and validation have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. Verification and validation are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science, these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed verification and validation as they pertain to traditional solid and fluid mechanics, it is the intent of this paper to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed, with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the verification and validation process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

Henninger, H B; Reese, S P; Anderson, A E; Weiss, J A

2010-01-01

96

Validating instrument models through the calibration process  

NASA Astrophysics Data System (ADS)

The performance of modern IR instruments is becoming so good that meeting science requirements requires an accurate instrument model be used throughout the design and development process. The huge cost overruns on recent major programs indicate that the design and cost models being used to predict performance have lagged behind anticipated performance. Tuning these models to accurately reflect the true performance of target instruments requires a modeling process that has been developed over several instruments and validated by careful calibration. The process of developing a series of Engineering Development Models is often used on longer duration programs to achieve this end. The accuracy of the models and their components has to be validated by a carefully planned calibration process, preferably considered in the instrument design. However, a good model does not satisfy all the requirements to bring acquisition programs under control. Careful detail in the specification process and a similar, validated model on the government side will also be required. This paper discusses the model development process and calibration approaches used to verify and update the models of several new instruments, including Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and Far Infrared Spectroscopy of the Troposphere (FIRST).

Bingham, G. E.; Tansock, J. J.

2006-08-01

97

Structural system identification: Structural dynamics model validation  

SciTech Connect

Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

Red-Horse, J.R.

1997-04-01

98

Feature extraction for structural dynamics model validation  

SciTech Connect

This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered include sensitivity, dimensionality, type of response, and presence or absence of measurement noise in the response. Furthermore, we illustrate a comparison method for multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must not only consider the sensitivity of the features being used, but also the correlation of the parameters being compared.
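A sketch of the Mahalanobis-distance comparison of feature vectors: repeated experiments define a mean and covariance in feature space, and a simulation's feature vector is judged by its distance from that distribution. The feature set, data, and any threshold are illustrative assumptions, not the study's choices.

```python
import numpy as np

def features(signal):
    """Simple response features: RMS, peak, and a kurtosis-like ratio."""
    rms = np.sqrt(np.mean(signal ** 2))
    return np.array([rms, np.max(np.abs(signal)), np.mean(signal ** 4) / rms ** 4])

def mahalanobis(x, mean, cov):
    d = x - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

rng = np.random.default_rng(4)
# Feature vectors from repeated experiments (placeholder response data).
exp_feats = np.array([features(rng.normal(0, 1.0, 512)) for _ in range(30)])
mean, cov = exp_feats.mean(axis=0), np.cov(exp_feats, rowvar=False)

sim_feats = features(rng.normal(0, 1.2, 512))   # simulation-output features
# A large distance relative to the experimental scatter flags model discrepancy.
d = mahalanobis(sim_feats, mean, cov)
print(f"Mahalanobis distance of simulation features: {d:.2f}")
```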

Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

2010-11-08

99

Validation of Space Weather Models at Community Coordinated Modeling Center  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) is a multiagency partnership, which aims at the creation of next generation space weather models. The CCMC's goal is to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate advanced models' deployment in forecasting operations. The CCMC conducts unbiased model testing and validation and evaluates model readiness for operational environments. The presentation will demonstrate the recent progress in CCMC metrics and validation activities.

Kuznetsova, M. M.; Pulkkinen, A.; Rastaetter, L.; Hesse, M.; Chulaki, A.; Maddox, M.

2011-01-01

100

Validity of the Mania Subscale of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II).  

ERIC Educational Resources Information Center

A study tested the validity of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II) for determining the presence of mania (bipolar disorder) in 22 individuals with severe mental retardation. Results found the mania subscale to be internally consistent and able to be used to classify manic and control subjects accurately. (Author/CR)

Matson, Johnny L.; Smiroldo, Brandi B.

1997-01-01

101

VALIDATION OF IMPROVED 3D ATR MODEL  

SciTech Connect

A full-core Monte Carlo based 3D model of the Advanced Test Reactor (ATR) was previously developed. [1] An improved 3D model has been developed by the International Criticality Safety Benchmark Evaluation Project (ICSBEP) to eliminate homogeneity of fuel plates of the old model, incorporate core changes into the new model, and to validate against a newer, more complicated core configuration. This new 3D model adds capability for fuel loading design and azimuthal power peaking studies of the ATR fuel elements.

Soon Sam Kim; Bruce G. Schnitzler

2005-11-01

102

On the Validity of Climate Models  

NASA Astrophysics Data System (ADS)

We object to contributor Kevin Corbett's assertions, in his article "On award to Crichton" (Eos, 87(43), 464, 2006), that "Too often now, models are taken as data and their results taken as fact, when the accuracy of the models in predicting even short-term effects is poor and the fundamental validity for most climate models is opaque...." Corbett cites (among other references) our Eos article "Coupled climate model appraisal: A benchmark for future studies", implying that our findings support his remarks. In fact, our evaluation of model simulations relative to observational data leads us to very different conclusions.

Phillips, Thomas; AchutaRao, Krishna; Bader, David; Covey, Curtis; Gleckler, Peter; Sperber, Kenneth; Taylor, Karl

2007-03-01

103

Diagnostic reasoning model validation in digestive endoscopy  

Microsoft Academic Search

The development of a computer-assisted diagnostic system in digestive endoscopy requires an understanding of the reasoning process of endoscopists. The aim of this study is to validate a reasoning model and a knowledge base previously defined. Eight endoscopists participated in a diagnostic test including 5 video-sequences and using a

J. M. Cauvin; C. Le Guillou; B. Solaiman; M. Robaszkiewicz; H. Gouerou; C. Roux

2001-01-01

104

Structural system identification: Structural dynamics model validation  

Microsoft Academic Search

Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical

Red-Horse

1997-01-01

105

Feature Extraction for Structural Dynamics Model Validation  

Microsoft Academic Search

This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical

Mayuko Nishio; Francois Hemez; Keith Worden; Nobuo Takeda; Charles Farrar

2010-01-01

106

Verification validation and accreditation of simulation models  

Microsoft Academic Search

This paper presents guidelines for conducting verification, validation and accreditation (VV&A) of simulation models. Fifteen guiding principles are introduced to help the researchers, practitioners and managers better comprehend what VV&A is all about. The VV&A activities are described in the modeling and simulation life cycle. A taxonomy of more than 77 V&V techniques is provided to assist simulationists

Osman Balci

1997-01-01

107

Verification, Validation And Accreditation Of Simulation Models  

Microsoft Academic Search

This paper presents guidelines for conducting verification, validation and accreditation (VV&A) of simulation models. Fifteen guiding principles are introduced to help the researchers, practitioners and managers better comprehend what VV&A is all about. The VV&A activities are described in the modeling and simulation life cycle. A taxonomy of more than 77 V&V techniques is provided to assist simulationists in selecting

Osman Balci

1997-01-01

108

A Hierarchical Systems Approach to Model Validation  

NASA Astrophysics Data System (ADS)

Existing approaches to the question of how climate models should be evaluated tend to rely on either philosophical arguments about the status of models as scientific tools, or on empirical arguments about how well runs from a given model match observational data. These have led to quantitative approaches expressed in terms of model bias or forecast skill, and ensemble approaches where models are assessed according to the extent to which the ensemble brackets the observational data. Unfortunately, such approaches focus the evaluation on models per se (or more specifically, on the simulation runs they produce) as though the models can be isolated from their context. Such an approach may overlook a number of important aspects of the use of climate models: - the process by which models are selected and configured for a given scientific question. - the process by which model outputs are selected, aggregated and interpreted by a community of expertise in climatology. - the software fidelity of the models (i.e. whether the running code is actually doing what the modellers think it's doing). - the (often convoluted) history that begat a given model, along with the modelling choices long embedded in the code. - variability in the scientific maturity of different model components within a coupled system. These omissions mean that quantitative approaches cannot assess whether a model produces the right results for the wrong reasons, or conversely, the wrong results for the right reasons (where, say, the observational data are problematic, or the model is configured to be unlike the earth system for a specific reason). Hence, we argue that it is a mistake to think that validation is a post-hoc process to be applied to an individual "finished" model, to ensure it meets some criteria for fidelity to the real world. We are therefore developing a framework for model validation that extends current approaches down into the detailed codebase and the processes by which the code is built and tested; and up into the broader scientific context in which models are selected and used to explore theories and test hypotheses. By taking software testing into account, we can build up a picture of the day-to-day practices by which modellers make small changes to the model and test the effect of such changes, both in isolated sections of code, and on the climatology of a full model. By taking the broader scientific context into account, we examine how features of the entire scientific enterprise improve (or impede) model validity, from the collection of observational data, creation of theories, use of these theories to develop models, choices for which model and which model configuration to use, choices for how to set up the runs, and interpretation of the results. Our approach cannot quantify model validity, but it can provide a systematic account of how the detailed practices involved in the development and use of climate models contribute to the quality of modelling systems and the scientific enterprise that they support. By making the relationships between these practices and model quality more explicit, we expect to identify specific strengths and weaknesses of the modelling systems, particularly with respect to structural uncertainty in the models, and better characterize the "unknown unknowns".

Easterbrook, S. M.

2011-12-01

109

Crystallographic model validation: from diagnosis to healing.  

PubMed

Model validation has evolved from a passive final gatekeeping step to an ongoing diagnosis and healing process that enables significant improvement of accuracy. A recent phase of active development was spurred by the worldwide Protein Data Bank requiring data deposition and establishing Validation Task Force committees, by strong growth in high-quality reference data, by new speed and ease of computations, and by an upswing of interest in large molecular machines and structural ensembles. Progress includes automated correction methods, concise and user-friendly validation reports for referees and on the PDB websites, extension of error correction to RNA and error diagnosis to ligands, carbohydrates, and membrane proteins, and a good start on better methods for low resolution and for multiple conformations. PMID:24064406

Richardson, Jane S; Prisant, Michael G; Richardson, David C

2013-10-01

110

Crystallographic Model Validation: from Diagnosis to Healing  

PubMed Central

Model validation has evolved from a passive final gatekeeping step to an ongoing diagnosis and healing process that enables significant improvement of accuracy. A recent phase of active development was spurred by the worldwide Protein Data Bank requiring data deposition and establishing Validation Task Force committees, by strong growth in high-quality reference data, by new speed and ease of computations, and by an upswing of interest in large molecular machines and structural ensembles. Progress includes automated correction methods, concise and user-friendly validation reports for referees and on the PDB websites, extension of error correction to RNA and error diagnosis to ligands, carbohydrates, and membrane proteins, and a good start on better methods for low resolution and for multiple conformations. PMID:24064406

Richardson, Jane S.; Prisant, Michael G.; Richardson, David C.

2013-01-01

111

Solar Sail Model Validation from Echo Trajectories  

NASA Technical Reports Server (NTRS)

The NASA In-Space Propulsion program has been engaged in a project to increase the technology readiness of solar sails. Recently, these efforts came to fruition in the form of several software tools to model solar sail guidance, navigation and control. Furthermore, solar sails are one of five technologies competing for the New Millennium Program Space Technology 9 flight demonstration mission. The historic Echo 1 and Echo 2 balloons were comprised of aluminized Mylar, which is the near-term material of choice for solar sails. Both spacecraft, but particularly Echo 2, were in low Earth orbits with characteristics similar to the proposed Space Technology 9 orbit. Therefore, the Echo balloons are excellent test cases for solar sail model validation. We present the results of studies of Echo trajectories that validate solar sail models of optics, solar radiation pressure, shape and low-thrust orbital dynamics.

Heaton, Andrew F.; Brickerhoff, Adam T.

2007-01-01

112

Using Model Checking to Validate AI Planner Domain Models  

NASA Technical Reports Server (NTRS)

This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.
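
As a rough illustration of what such a translation buys (a hypothetical toy transition system, not the HSTS domain models or the SMV/Spin/Murphi encodings themselves), the core of explicit-state model checking is an exhaustive reachability search that either proves a safety property or returns a counterexample trace:

```python
# Minimal sketch of explicit-state safety checking, in the spirit of the
# planner-domain experiments described above. The transition system and
# property below are invented for illustration.
from collections import deque

def check_safety(initial, transitions, is_bad):
    """Breadth-first search over reachable states; returns a counterexample
    trace to a bad state, or None if the safety property holds."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if is_bad(state):
            return trace  # counterexample: path from initial to bad state
        for nxt in transitions(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Toy planner-like domain: (task, resource) states; the safety property is
# that a task is never 'active' while its resource is 'offline'.
def transitions(state):
    task, res = state
    moves = []
    if task == "idle":
        moves.append(("active", res))
    if res == "online":
        moves.append((task, "offline"))
    else:
        moves.append((task, "online"))
    return moves

trace = check_safety(("idle", "online"), transitions,
                     lambda s: s == ("active", "offline"))
print("counterexample:", trace)  # a fault the checker finds automatically
```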

Penix, John; Pecheur, Charles; Havelund, Klaus

1999-01-01

113

Feature selective validation (FSV) for validation of computational electromagnetics (CEM). part II- assessment of FSV performance  

Microsoft Academic Search

The feature selective validation (FSV) method has been proposed as a technique to allow the objective, quantified comparison of data for, inter alia, validation of computational electromagnetics. In the companion paper …

Antonio Orlandi; Alistair P. Duffy; Bruce Archambeault; Giulio Antonini; Dawn E. Coleby; Samuel Connor

2006-01-01

114

Concepts of Model Verification and Validation  

SciTech Connect

Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety-related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Code developers perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on the fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for highly accurate numerical models for making predictions to s…

Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

2004-10-30

115

HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments  

SciTech Connect

HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

McCann, R.A.; Lowery, P.S.

1987-10-01

116

Paleoclimate validation of a numerical climate model  

SciTech Connect

An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented.

Schelling, F.J. [Sandia National Labs., Las Vegas, NV (United States); Church, H.W.; Zak, B.D. [Sandia National Labs., Albuquerque, NM (United States); Thompson, S.L. [National Center for Atmospheric Research, Boulder, CO (United States)

1994-12-31

117

Paleoclimate validation of a numerical climate model  

SciTech Connect

An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE`s Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented.

Schelling, F.J. [Sandia National Laboratories, Las Vegas, NV (United States); Church, H.W.; Zak, B.D. [Sandia National Labs., Albuquerque, NM (United States); Thompson, S.L. [National Center for Atmospheric Research, Boulder, CO (United States)

1994-04-01

118

Derivation and Validation of a Prognostic Model for Pulmonary Embolism  

PubMed Central

Rationale: An objective and simple prognostic model for patients with pulmonary embolism could be helpful in guiding initial intensity of treatment. Objectives: To develop a clinical prediction rule that accurately classifies patients with pulmonary embolism into categories of increasing risk of mortality and other adverse medical outcomes. Methods: We randomly allocated 15,531 inpatient discharges with pulmonary embolism from 186 Pennsylvania hospitals to derivation (67%) and internal validation (33%) samples. We derived our prediction rule using logistic regression with 30-day mortality as the primary outcome, and patient demographic and clinical data routinely available at presentation as potential predictor variables. We externally validated the rule in 221 inpatients with pulmonary embolism from Switzerland and France. Measurements: We compared mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. Main Results: The prediction rule is based on 11 simple patient characteristics that were independently associated with mortality and stratifies patients with pulmonary embolism into five severity classes, with 30-day mortality rates of 0–1.6% in class I, 1.7–3.5% in class II, 3.2–7.1% in class III, 4.0–11.4% in class IV, and 10.0–24.5% in class V across the derivation and validation samples. Inpatient death and nonfatal complications were ≤1.1% among patients in class I and ≤1.9% among patients in class II. Conclusions: Our rule accurately classifies patients with pulmonary embolism into classes of increasing risk of mortality and other adverse medical outcomes. Further validation of the rule is important before its implementation as a decision aid to guide the initial management of patients with pulmonary embolism. PMID:16020800
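
As an illustration of the derivation/validation workflow described above (synthetic data; the predictors, coefficients, and class boundaries are placeholders, not the published rule), a logistic regression rule can be derived on one sample and used to stratify another into severity classes:

```python
# Illustrative sketch only: derive a logistic regression risk rule on a
# derivation sample, then stratify an internal validation sample into five
# risk classes. Data and thresholds are synthetic, not the published rule.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(15000, 11))          # 11 predictors, as in the rule
y = (X @ rng.normal(size=11) + rng.logistic(size=15000) > 2).astype(int)

# 67% derivation / 33% internal validation split, as in the abstract
X_der, X_val, y_der, y_val = train_test_split(X, y, test_size=0.33,
                                              random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_der, y_der)
risk = model.predict_proba(X_val)[:, 1]

# Stratify the validation sample into five severity classes by risk quintile
classes = np.digitize(risk, np.quantile(risk, [0.2, 0.4, 0.6, 0.8]))
for c in range(5):
    print(f"class {c + 1}: 30-day mortality {y_val[classes == c].mean():.1%}")
```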

Aujesky, Drahomir; Obrosky, D. Scott; Stone, Roslyn A.; Auble, Thomas E.; Perrier, Arnaud; Cornuz, Jacques; Roy, Pierre-Marie; Fine, Michael J.

2005-01-01

119

O`ahu Grid Study: Validation of Grid Models  

E-print Network

O`ahu Grid Study: Validation of Grid Models. Prepared for the U.S. Department of Energy Office. [Only table-of-contents fragments were extracted, including sections on Model Validation and the Dynamic Data Model.]

120

Spread-spectrum ranging multipath model validation  

Microsoft Academic Search

Spread-spectrum ranging multipath model validation results are presented. Previously published theoretical results are compared with data obtained from bench-testing using a multichannel satellite simulator. Results are presented for standard or wide-correlator (i.e., 1 chip early-to-late correlator spacing) and narrow-correlator (i.e., 0.1 chip) GPS C/A-code architectures as well as standard P-code. The close agreement of the bench data and theoretical results …

M. S. Braasch; M. F. DiBenedetto

2001-01-01

121

MODEL VALIDATION VIA UNCERTAINTY PROPAGATION AND DATA TRANSFORMATIONS  

E-print Network

Model verification and validation are the primary methods for building confidence in modeling and simulation accuracy. Model verification is the assessment of the solution accuracy of a mathematical model. Model validation, on the other hand, … associated with experiments. Furthermore, deterministic simulations for model validation do not consider …

Chen, Wei

122

Validation and Verification of Tsunami Numerical Models  

NASA Astrophysics Data System (ADS)

In the aftermath of the 26 December, 2004 tsunami, several quantitative predictions of inundation for historic events were presented at international meetings differing substantially from the corresponding well-established paleotsunami measurements. These significant differences attracted press attention, reducing the credibility of all inundation modeling efforts. Without exception, the predictions were made using models that had not been benchmarked. Since an increasing number of nations are now developing tsunami mitigation plans, it is essential that all numerical models used in emergency planning be subjected to validation—the process of ensuring that the model accurately solves the parent equations of motion—and verification—the process of ensuring that the model represents geophysical reality. Here, we discuss analytical, laboratory, and field benchmark tests with which tsunami numerical models can be validated and verified. This is a continuous process; even proven models must be subjected to additional testing as new knowledge and data are acquired. To date, only a few existing numerical models have met current standards, and these models remain the only choice for use for real-world forecasts, whether short-term or long-term. Short-term forecasts involve data assimilation to improve forecast system robustness and this requires additional benchmarks, also discussed here. This painstaking process may appear onerous, but it is the only defensible methodology when human lives are at stake. Model standards and procedures as described here have been adopted for implementation in the U.S. tsunami forecasting system under development by the National Oceanic and Atmospheric Administration, they are being adopted by the Nuclear Regulatory Commission of the U.S. and by the appropriate subcommittees of the Intergovernmental Oceanographic Commission of UNESCO.

Synolakis, C. E.; Bernard, E. N.; Titov, V. V.; Kânoğlu, U.; González, F. I.

2008-12-01

123

Validation of Space Weather Models at Community Coordinated Modeling Center  

NASA Technical Reports Server (NTRS)

The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate the deployment of advanced models in forecasting operations. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. CCMC has been leading recent comprehensive modeling challenges under the GEM, CEDAR and SHINE programs. The presentation will focus on experience in carrying out comprehensive and systematic validation of large sets of space weather models.

Kuznetsova, M. M.; Hesse, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Taktakishvili, A.; Chulaki, A.

2011-01-01

124

VALIDITY OF THE STANDARD CROSS-CORRELATION TEST FOR MODEL STRUCTURE VALIDATION  

E-print Network

(In)validation of the assumption that the model structure is rich enough to contain the true system … The standard test … is itself valid only under exactly those assumptions it is meant to verify. As a result …

Douma, Sippe G.; Van den Hof, Paul

125

Wide-area dynamic model validation using FNET measurements  

Microsoft Academic Search

Model validation is a highly recommended practice that should be performed periodically since the system model is very important for ensuring reliable and economic power system operation. Currently, field data are used to validate a specific model or an entire system. An FNET-based model validation procedure is proposed in this paper. It uses wide-area measurements to correct the response found …

Lang Chen; Penn N. Markham; Yilu Liu

2012-01-01

126

Systematic Verification, Validation and Calibration of Traffic Simulation Models  

E-print Network

This paper attempts to provide traffic model developers and users with a framework for the verification, validation and calibration of traffic models. Examples are provided to illustrate the model verification and validation …

Hellinga, Bruce

127

An Independent Validation of Vulnerability Discovery Models  

E-print Network

Having a precise vulnerability discovery model (VDM) would provide a useful quantitative insight to assess software security. Thus far, several models have been proposed with some evidence supporting their goodness-of-fit. In this work we describe an independent validation of the applicability of six existing VDMs in seventeen releases of the three popular browsers Firefox, Google Chrome and Internet Explorer. We have collected five different kinds of data sets based on different definitions of a vulnerability. We introduce two quantitative metrics, goodness-of-fit entropy and goodness-of-fit quality, to analyze the impact of vulnerability data sets on the stability as well as quality of VDMs in the software life cycle. The experimental results show that the "confirmed-by-vendors' advisories" data sets apparently yield more stable and better results for VDMs, and that the performance of the s-shaped logistic model (AML) seems superior overall. Meanwhile, the Anderson thermodynamic model (AT) is ind…
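
To illustrate the kind of goodness-of-fit exercise the study performs (synthetic counts, and a generic three-parameter logistic curve that stands in for, but is not necessarily identical to, the AML parameterization), an S-shaped VDM can be fitted and scored as follows:

```python
# A generic S-shaped (logistic) fit of cumulative vulnerability counts over
# time. The data are synthetic and the parameterization is a stand-in for
# an s-shaped VDM, not the exact AML form from the paper.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Cumulative vulnerabilities: saturation K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

months = np.arange(1, 37)
observed = logistic(months, K=120, r=0.3, t0=18) \
    + np.random.default_rng(1).normal(0, 3, 36)   # noisy synthetic counts

params, _ = curve_fit(logistic, months, observed, p0=[100, 0.1, 12])
residuals = observed - logistic(months, *params)
print("fitted K, r, t0:", np.round(params, 3))
print("RMSE:", np.sqrt(np.mean(residuals ** 2)).round(3))
```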

Nguyen, Viet Hung

2012-01-01

128

Validation of Computational Models in Biomechanics  

PubMed Central

The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

2010-01-01

129

Computational modeling and validation for hypersonic inlets  

NASA Technical Reports Server (NTRS)

Hypersonic inlet research activity at NASA is reviewed. The basis is the experimental tests performed with three inlets: the NASA-Lewis Mach 5, the McDonnell Douglas Mach 12, and the NASA-Langley Mach 18. Both 3-D parabolized Navier-Stokes and Navier-Stokes codes were used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes in conjunction with the experimental data are helping to develop a clearer knowledge of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.

Povinelli, Louis A.

1990-01-01

130

Plasma Reactor Modeling and Validation Experiments  

NASA Technical Reports Server (NTRS)

Plasma processing is a key processing step in integrated circuit manufacturing. Low pressure, high density plasma reactors are widely used for etching and deposition. The inductively coupled plasma (ICP) source has recently become popular in many processing applications. In order to accelerate equipment and process design, an understanding of the physics and chemistry is important, particularly plasma power coupling, plasma and processing uniformity, and reaction mechanisms. This understanding is facilitated by comprehensive modeling and simulation, as well as by plasma diagnostics that provide the necessary data for model validation; both are addressed in this presentation. We have developed a complete code for simulating an ICP reactor; the model consists of transport of electrons, ions, and neutrals, Poisson's equation, and Maxwell's equations, along with gas flow and energy equations. Results will be presented for chlorine and fluorocarbon plasmas and compared with data from Langmuir probe, mass spectrometry and FTIR measurements.

Meyyappan, M.; Bose, D.; Hash, D.; Hwang, H.; Cruden, B.; Sharma, S. P.; Rao, M. V. V. S.; Arnold, Jim (Technical Monitor)

2001-01-01

131

DEVELOPMENT OF THE MESOPUFF II DISPERSION MODEL  

EPA Science Inventory

The development of the MESOPUFF II regional-scale air quality model is described. MESOPUFF II is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion and removal of air pollutants from multiple point and area sources at transpor...

132

Validating Simulation Models: A General Framework and Four Applied Examples  

E-print Network

This paper provides a framework for discussing the empirical validation of simulation models of market phenomena, in particular of agent-based computational economics models … the "assurance" (programming verification and empirical validation) of AB models, introducing a five-step process …

Tesfatsion, Leigh

133

Model Validation of Power System Components Using Hybrid Dynamic Simulation  

SciTech Connect

Hybrid dynamic simulation, with its capability of injecting external signals into dynamic simulation, opens the traditional dynamic simulation loop for interaction with actual field measurements. This simulation technique enables rigorous comparison between simulation results and actual measurements and model validation of individual power system components within a small subsystem. This paper uses a real example of generator model validation to illustrate the procedure and validity of the component model validation methodology using hybrid dynamic simulation. Initial model calibration has also been carried out to show how model validation results would be used to improve component models.
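
A minimal sketch of the comparison step this methodology enables is shown below; the "measured" and "simulated" signals are synthetic placeholders, and the error metrics are generic choices rather than the paper's specific criteria:

```python
# Sketch of the comparison step in hybrid dynamic simulation: a measured
# signal (e.g., a PMU voltage at the point of interconnection) is played
# into the simulator, and the simulated response is scored against the
# measured response. Signals here are synthetic placeholders.
import numpy as np

t = np.linspace(0, 10, 1001)                      # 10 s window, 100 Hz
measured = 1.0 + 0.05 * np.exp(-0.5 * t) * np.sin(2 * np.pi * 0.70 * t)
simulated = 1.0 + 0.06 * np.exp(-0.4 * t) * np.sin(2 * np.pi * 0.68 * t)

# Root-mean-square error and peak mismatch as simple validation metrics;
# large values would point at model parameters needing calibration.
rmse = np.sqrt(np.mean((simulated - measured) ** 2))
peak = np.max(np.abs(simulated - measured))
print(f"RMSE = {rmse:.4f} pu, peak mismatch = {peak:.4f} pu")
```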

Huang, Zhenyu; Nguyen, Tony B.; Kosterev, Dmitry; Guttromson, Ross T.

2008-05-31

134

Model Validation of Power System Components Using Hybrid Dynamic Simulation  

SciTech Connect

Hybrid dynamic simulation, with its capability of injecting external signals into dynamic simulation, opens the traditional dynamic simulation loop for interaction with actual field measurements. This simulation technique enables rigorous comparison between simulation results and actual measurements and model validation of individual power system components within a small subsystem. This paper uses a real example of generator model validation to illustrate the procedure and validity of the component model validation methodology using hybrid dynamic simulation. Initial model calibration has also been carried out to show how model validation results would be used to improve component models.

Huang, Zhenyu; Nguyen, Tony B.; Kosterev, Dmitry; Guttromson, Ross T.

2006-05-21

135

Model-Based Method for Sensor Validation  

NASA Technical Reports Server (NTRS)

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).
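
The flavor of reasoning over analytical redundancy relations can be sketched as follows (a deliberately tiny, hypothetical example with three redundant sensors, far simpler than the spacecraft systems targeted by the method):

```python
# Minimal sketch of model-based sensor validation via analytical redundancy
# relations (ARRs). Three sensors nominally measure the same quantity; each
# ARR compares a pair, and the pattern of violated ARRs logically isolates
# the faulty sensor. The setup is invented for illustration.
def diagnose(a, b, c, tol=0.1):
    r_ab = abs(a - b) > tol   # ARR over sensors A and B
    r_bc = abs(b - c) > tol   # ARR over sensors B and C
    r_ac = abs(a - c) > tol   # ARR over sensors A and C
    signatures = {
        (False, False, False): "all sensors consistent",
        (True, False, True): "sensor A faulty",
        (True, True, False): "sensor B faulty",
        (False, True, True): "sensor C faulty",
    }
    return signatures.get((r_ab, r_bc, r_ac), "multiple faults or model error")

print(diagnose(5.0, 5.02, 4.98))   # all consistent
print(diagnose(6.3, 5.02, 4.98))   # A disagrees with B and C -> A faulty
```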

Vatan, Farrokh

2012-01-01

136

Boron-10 Lined Proportional Counter Model Validation  

SciTech Connect

The Department of Energy Office of Nuclear Safeguards (NA-241) is supporting the project “Coincidence Counting With Boron-Based Alternative Neutron Detection Technology” at Pacific Northwest National Laboratory (PNNL) for the development of an alternative neutron coincidence counter. The goal of this project is to design, build and demonstrate a boron-lined proportional tube-based alternative system in the configuration of a coincidence counter. This report discusses the validation studies performed to establish the degree of accuracy of the computer modeling methods currently used to simulate the response of boron-lined tubes. This is the precursor to developing models for the uranium neutron coincidence collar under Task 2 of this project.

Lintereur, Azaree T.; Siciliano, Edward R.; Kouzes, Richard T.

2012-06-30

137

Document Degradation Models and a Methodology for Degradation Model Validation  

Microsoft Academic Search

Printing, photocopying and scanning processes degrade the image quality of a document. Although research in document understanding started in the sixties, only two document degradation models have been proposed thus far. Furthermore, no attempts have been made to rigorously validate them. In …

Tapas Kanungo

1996-01-01

138

Validation of the Korean version Moorehead-Ardelt quality of life questionnaire II  

PubMed Central

Purpose: Disease-specific quality of life (QoL) instruments are important for investigating the effects of weight loss with higher sensitivity. The Moorehead-Ardelt quality of life questionnaire II (MA-II) is widely used because it is simple and has been validated in several languages. The aims of the present study were to translate the MA-II into Korean and to validate it against the EuroQol-5 dimension (EQ-5D), the obesity-related problems scale (OP-scale), and the impact of weight on quality of life-lite (IWQoL-Lite). Methods: The study design was a multicenter, cross-sectional survey that included postoperative patients. The validation procedure comprised a translation-back-translation procedure, a pilot study, and a field study. The instruments measuring QoL included the MA-II, EQ-5D, OP-scale, and IWQoL-Lite. Reliability was checked through internal consistency using Cronbach alpha coefficients. Construct validity was assessed by the Spearman rank correlation between the 6 domains of the MA-II and the EQ-5D, OP-scale, and 5 domains of the IWQoL-Lite. Results: The Cronbach alpha of the MA-II was 0.763, confirming internal consistency. The total score of the MA-II was significantly correlated with all other instruments: EQ-5D, OP-scale, and IWQoL-Lite. The IWQoL-Lite showed the strongest correlation with the MA-II (ρ = 0.623, P < 0.001), followed by the OP-scale (ρ = 0.588, P < 0.001) and the EQ-5D (ρ = 0.378, P < 0.01). Conclusion: The Korean version of the MA-II is a valid instrument for measuring obesity-specific QoL. The present study confirmed that the MA-II has good reliability and validity and is also simple to administer. Thus, the MA-II can provide a sensitive and accurate estimate of QoL in obesity patients. PMID:25368853

Lee, Yeon Ji; Song, Hyun Jin; Oh, Sung-Hee; Kwon, Jin Won; Moon, Kon-Hak; Park, Joong-Min; Lee, Sang Kuon

2014-01-01

139

Dynamic model validation for compliance with NERC standards  

Microsoft Academic Search

This paper focuses on a few different aspects of the dynamic model validation process. It briefly describes the ongoing development of NERC standards that would require the validation and periodic re-validation of dynamic simulation models, and then discusses the capabilities and shortfalls of the different approaches that might be applied to comply with such requirements. Finally, the paper presents some …

Leonardo T. G. Lima

2009-01-01

140

VALIDATING COMPLEX CONSTRUCTION SIMULATION MODELS USING 3D VISUALIZATION  

E-print Network

… and/or the time to check the veracity and the validity of simulation models and thus have little confidence in the results. Visualizing simulated operations in 3D can be of substantial help in the verification, validation …

Kamat, Vineet R.

141

Validation of simulation models: The weak/missing link  

Microsoft Academic Search

The validation of the simulation model is generally acknowledged as an integral part of a simulation project. There is, however, no general agreement on how simulation models should be verified and there is often confusion as to the difference between validation and verification. In this paper we first set forth a framework for verification and validation as well as some

Stewart V. Hoover; Ronald F. Perry

1984-01-01

142

Validation of HEDR models. Hanford Environmental Dose Reconstruction Project  

SciTech Connect

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

1994-05-01

143

Synchronous modeling and validation of priority inheritance schedulers  

E-print Network

… be simulated and validated together with the software. In a previous paper [1], we proposed such a translation … Keywords: Embedded systems, Simulation, Scheduling, Formal Verification, Architecture Description Languages.

Paris-Sud XI, Université de

144

Parallel TCP Sockets: Simple Model, Throughput and Validation  

E-print Network

The model is validated by simulations and Internet measurements; the latter validates the model in cases when analysis … The use of parallel TCP sockets is a generic "hack" to improve the throughput attained by TCP for bulk data transfers by opening …

145

Diurnal ocean surface layer model validation  

NASA Technical Reports Server (NTRS)

The diurnal ocean surface layer (DOSL) model at the Fleet Numerical Oceanography Center forecasts the 24-hour change in global sea surface temperature (SST). Validating the DOSL model is a difficult task due to the huge areas involved and the lack of in situ measurements. Therefore, this report details the use of satellite infrared multichannel SST imagery to provide day and night SSTs that can be directly compared to DOSL products. This water-vapor-corrected imagery has the advantages of high thermal sensitivity (0.12 C), large synoptic coverage (nearly 3000 km across), and high spatial resolution that enables diurnal heating events to be readily located and mapped. Several case studies in the subtropical North Atlantic readily show that DOSL results during extreme heating periods agree very well with satellite-imagery-derived values in terms of the pattern of diurnal warming. The low wind and cloud-free conditions necessary for these events to occur lend themselves well to observation via infrared imagery. Thus, the normally cloud-limited aspects of satellite imagery do not come into play for these particular environmental conditions. The fact that the DOSL model does well in extreme events is beneficial from the standpoint that these cases can be associated with the destruction of the surface acoustic duct. This so-called afternoon effect happens as the afternoon warming of the mixed layer disrupts the sound channel and the propagation of acoustic energy.

Hawkins, Jeffrey D.; May, Douglas A.; Abell, Fred, Jr.

1990-01-01

146

Validation of the Sexual Assault Symptom Scale II (SASS II) Using a Panel Research Design  

ERIC Educational Resources Information Center

To examine the utility of a self-report scale of sexual assault trauma, 223 female victims were interviewed with the 43-item Sexual Assault Symptom Scale II (SASS II) at 1, 3, 7, 11, and 15 months postassault. Factor analyses using principal-components extraction with an oblimin rotation yielded 7 common factors with 31 items. The internal…

Ruch, Libby O.; Wang, Chang-Hwai

2006-01-01

147

A nonlinear functional approach to LFT model validation  

Microsoft Academic Search

Model validation provides a useful means of assessing the ability of a model to account for a specific experimental observation, and has application to modeling, identification and fault detection. In this paper, we consider a new approach to the model validation problem by deploying quadratic functionals, and more generally nonlinear functionals, to specify noise and dynamical perturbation sets. Specifically, we

Geir Dullerud; Roy Smith

2002-01-01

148

Validating agent based models through virtual worlds.  

SciTech Connect

As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior. Results from our work indicate that virtual worlds have the potential for serving as a proxy in allocating and populating behaviors that would be used within further agent-based modeling studies.

Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina [Sandia National Laboratories, Livermore, CA]; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E. [North Carolina State University, Raleigh, NC]; Bernstein, Jeremy Ray Rhythm [Gaikai, Inc., Aliso Viejo, CA]

2014-01-01

149

Lidar validation of SAGE II aerosol measurements after the 1991 Mount Pinatubo eruption  

E-print Network

… the possibility of filling the vertical gaps using lidar data. We compare every coincident backscattering measurement (at a wavelength of 0.694 µm) from two lidars, at Mauna Loa, Hawaii (19.5°N, 155.6°W) …

Robock, Alan

150

Geochemistry Model Validation Report: Material Degradation and Release Model  

SciTech Connect

The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

H. Stockman

2001-09-28

151

Validation of Decision Models for an Autonomous Earth Surveillance Satellite  

Microsoft Academic Search

This paper presents a novel application of Validation and Verification to autonomous decision-making. We discuss the relevance of validating formal decision models for an autonomous Earth surveillance satellite, in order to prove that all possible on-line decisions are constrained to a given set of validation properties. We distinguish safety and liveness properties depending on whether properties refer to the satellite's …

Florent Teichteil-Königsbuch; Christel Seguin; Cédric Pralet

152

External validation of a Cox prognostic model: principles and methods  

PubMed Central

Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
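
As a sketch of level (1) validation on the discrimination side (hypothetical published coefficients and synthetic validation data; the `lifelines` package supplies Harrell's concordance index), assessing a published prognostic index in an external sample can look like this:

```python
# Sketch of level-1 external validation as described above: only the
# published prognostic index (linear predictor) is needed to assess
# discrimination in the validation sample. Coefficients and data are
# hypothetical placeholders.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(42)
n = 500
X = rng.normal(size=(n, 3))                    # three published predictors
published_beta = np.array([0.7, -0.4, 0.2])    # coefficients "from the paper"
pi = X @ published_beta                        # prognostic index

# Synthetic validation outcomes: higher PI -> higher hazard -> shorter time
time = rng.exponential(scale=np.exp(-pi))
event = rng.random(n) < 0.7                    # roughly 30% censoring

# Harrell's C: concordance between survival times and predicted risk.
# concordance_index expects higher scores to mean longer survival,
# so the prognostic index is negated.
c = concordance_index(time, -pi, event_observed=event)
print(f"validation-sample C-index: {c:.3f}")
```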

2013-01-01

153

Design and Development Research: A Model Validation Case  

ERIC Educational Resources Information Center

This is a report of one case of a design and development research study that aimed to validate an overlay instructional design model incorporating the theory of multiple intelligences into instructional systems design. After design and expert review model validation, The Multiple Intelligence (MI) Design Model, used with an Instructional Systems…

Tracey, Monica W.

2009-01-01

154

What do we mean by validating a prognostic model?  

Microsoft Academic Search

SUMMARY Prognostic models are used in medicine for investigating patient outcome in relation to patient and disease characteristics. Such models do not always work well in practice, so it is widely recommended that they need to be validated. The idea of validating a prognostic model is generally taken to mean establishing that it works satisfactorily for patients other than those

Douglas G. Altman; Patrick Royston

2000-01-01

155

Collaborative Infrastructure for Test-Driven Scientific Model Validation  

E-print Network

One of the pillars of the modern scientific method is model validation: comparing a scientific model's predictions against empirical observations. Today, a scientist …
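
A minimal sketch of the test-driven idea (toy model, invented data and tolerance, not the paper's infrastructure) is to phrase each model-versus-observation comparison as a unit-test-style assertion:

```python
# Hypothetical sketch of test-driven model validation: each comparison of a
# model's prediction against empirical data is expressed as an assertion,
# so the "test suite" doubles as a validation record.
import math

def model_predict(stimulus: float) -> float:
    """Toy scientific model: predicted response to a stimulus."""
    return 2.0 * math.log1p(stimulus)

EMPIRICAL_DATA = [(1.0, 1.4), (3.0, 2.8), (7.0, 4.1)]  # (stimulus, observed)

def test_model_matches_observations(tolerance: float = 0.2) -> None:
    for stimulus, observed in EMPIRICAL_DATA:
        predicted = model_predict(stimulus)
        assert abs(predicted - observed) <= tolerance, (
            f"prediction {predicted:.2f} vs observation {observed:.2f} "
            f"at stimulus {stimulus}")

test_model_matches_observations()
print("model passes all validation tests")
```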

Aldrich, Jonathan

156

A High-Performance Approach to Model Calibration and Validation  

E-print Network

A new model validation approach … algorithms to fit cognitive models to human performance data. The efficiency, accuracy, and non- … A set of alternative knowledge-based cognitive architectures (ACT-R, Soar/Epic, DCOG, and iGEN) were … Keywords: model validation, cognitive models, behavior moderators, genetic algorithms.
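
A compact sketch of genetic-algorithm calibration against human performance data (toy model and synthetic data, not the architectures or fitting pipeline evaluated in the paper) might look like this:

```python
# Hedged sketch: calibrating model parameters to human performance data
# with a simple genetic algorithm. The "cognitive model" and data are toys.
import random

random.seed(0)
HUMAN_DATA = [(1, 0.9), (2, 1.7), (3, 2.2)]   # (trial, mean response time)

def model(params, trial):
    a, b = params
    return a * trial + b                        # toy cognitive model

def fitness(params):
    # negative sum of squared errors: larger is better
    return -sum((model(params, t) - rt) ** 2 for t, rt in HUMAN_DATA)

def mutate(params, scale=0.1):
    return tuple(p + random.gauss(0, scale) for p in params)

population = [(random.uniform(0, 2), random.uniform(0, 2)) for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                   # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]

best = max(population, key=fitness)
print("calibrated parameters:", [round(p, 3) for p in best])
```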

Ritter, Frank

157

Techniques and Issues in Agent-Based Modeling Validation  

SciTech Connect

Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABM has many differences in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally, since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

Pullum, Laura L [ORNL; Cui, Xiaohui [New York Institute of Technology (NYIT)

2012-01-01

158

Experimental validation of new software technology  

E-print Network

When to apply a new technology in an organization is a critical decision for every software development organization. Earlier work defines a set of methods that the research community uses when a new technology …

Zelkowitz, Marvin V.

159

Statistical Validation of Normal Tissue Complication Probability Models  

SciTech Connect

Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
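
The validation scheme described (nested, i.e. double, cross-validation around an L1-penalized model, plus a permutation test for significance) can be sketched with scikit-learn on synthetic data; the features and outcome below are placeholders, not the xerostomia data:

```python
# Sketch of the scheme above on synthetic data: an L1-penalized (LASSO)
# logistic model assessed by nested (double) cross-validation with AUC
# scoring, plus a permutation test for the significance of performance.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                     permutation_test_score)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))            # candidate dose/clinical features
y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

lasso = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear"),
    {"C": np.logspace(-2, 2, 9)}, scoring="roc_auc", cv=5)   # inner loop

outer_auc = cross_val_score(lasso, X, y, scoring="roc_auc", cv=5)  # outer loop
print("nested-CV AUC: %.3f +/- %.3f" % (outer_auc.mean(), outer_auc.std()))

score, _, pvalue = permutation_test_score(
    lasso, X, y, scoring="roc_auc", cv=5, n_permutations=100, random_state=0)
print("permutation test: AUC %.3f, p = %.3f" % (score, pvalue))
```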

Xu, Chengjian (E-mail: c.j.xu@umcg.nl); Schaaf, Arjen van der; Veld, Aart A. van't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]; Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)]

2012-09-01

160

System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report  

SciTech Connect

The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
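
The report's headline metric, annualized prediction error, is straightforward to compute; below is a sketch with synthetic hourly series standing in for SAM output and field measurements:

```python
# Sketch of the annualized prediction error between modeled and measured PV
# generation. The hourly series here are synthetic placeholders, not SAM
# output or the report's field data.
import numpy as np

rng = np.random.default_rng(7)
measured_kwh = np.clip(rng.normal(50, 20, 8760), 0, None)   # hourly, one year
modeled_kwh = measured_kwh * 1.02 + rng.normal(0, 2, 8760)  # slight model bias

annualized_error = (modeled_kwh.sum() - measured_kwh.sum()) / measured_kwh.sum()
print(f"annualized prediction error: {annualized_error:+.1%}")
```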

Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

2013-12-01

161

Model validation and verification of large and complex space structures  

Microsoft Academic Search

In this paper two case studies of model validation and verification of large and complex space structures are presented. The first case study focuses on experience gained in performing model validation and verification of the mated Russian Mir Space Station and United States Space Shuttle. This study makes use of dynamic test data taken during the STS-81 flight associated with …

David C. Zimmerman

2000-01-01

162

Model Validation of Power System Components Using Hybrid Dynamic Simulation  

Microsoft Academic Search

Hybrid dynamic simulation, with its capability of injecting external signals into dynamic simulation, opens the traditional dynamic simulation loop for interaction with actual field measurements. This simulation technique enables rigorous comparison between simulation results and actual measurements and model validation of individual power system components within a small subsystem. This paper uses a real example of generator model validation to …

Zhenyu Huang; Tony B. Nguyen; Dmitry Kosterev; Ross T. Guttromson

2006-01-01

163

A Formal Validation Model for the Netconf Protocol  

Microsoft Academic Search

Netconf is a protocol proposed by the IETF that defines a set of operations for network configuration. One of the main issues of Netconf is to define operations such as validate and commit, which currently lack a clear description and an information model. We propose in this paper a model for validation based on XML schema trees. By using …
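
As a generic illustration of schema-based validation of a configuration document (using W3C XML Schema via lxml, not the paper's own schema-tree formalism; the element names are invented):

```python
# Generic illustration: validate an XML configuration fragment against a
# schema. This uses W3C XML Schema via lxml as a stand-in; the paper's
# validation model is its own formalism over schema trees.
from lxml import etree

schema_doc = etree.XML(b"""
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="interface">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:element name="mtu" type="xs:positiveInteger"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>""")
schema = etree.XMLSchema(schema_doc)

candidate = etree.XML(b"<interface><name>eth0</name><mtu>1500</mtu></interface>")
print("valid:", schema.validate(candidate))   # True
bad = etree.XML(b"<interface><name>eth0</name><mtu>-1</mtu></interface>")
print("valid:", schema.validate(bad))         # False: mtu not positive
```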

Sylvain Hallé; Rudy Deca; Omar Cherkaoui; Roger Villemaire; Daniel Puche

2004-01-01

164

LFT uncertain model validation with time and frequency domain measurements  

Microsoft Academic Search

We study a model validation problem pertaining to linear fractional transform (LFT) uncertainty models. We extend previous validation approaches, based upon either time or frequency measurements, to one using simultaneously time and frequency domain data. We show that this problem can be reduced to two independent convex feasibility tests, each of which corresponds to the time or frequency domain data

Demin Xu; Zhang Ren; Guoxiang Gu; Jie Chen

1997-01-01

165

A nonlinear functional approach to LFT model validation  

Microsoft Academic Search

Abstract Model validation provides a useful means of assessing the ability of a model to account for a specific experimental observation, and has application to modeling, identification and fault detection. In this paper, we consider a new approach to the model validation problem by deploying quadratic functionals, and more generally nonlinear functionals, to specify noise and dynamical perturbation sets. Specifically,

Geir Dullerud; Roy Smith

166

Molecular modeling of cobalt(II) hyaluronate.  

PubMed

Structural data for complexes of hyaluronic acid and 3d metal(II) ions of the fourth period of the periodic table are lacking. A combined QM/MM method was used to solve the structure of the first coordination sphere around the cobalt(II) ion. Some available experimental data were compared with the results obtained via computation and were found to be in good agreement. Our results open the way for using molecular modeling to solve the structure of other metal(II) hyaluronates. PMID:16023623

Tratar Pirc, Elizabeta; Zidar, Jernej; Bukovec, Peter; Hodoscek, Milan

2005-09-01

167

Parameterization of Model Validating Sets for Uncertainty Bound Optimizations  

NASA Technical Reports Server (NTRS)

Given experimental data and a priori assumptions on the nominal model and a linear fractional transformation uncertainty structure, feasible conditions for model validation are given. All unknown but bounded exogenous inputs are assumed to occur at the plant outputs. With the satisfaction of the feasible conditions for model validation, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization can be used as a basis for the development of a systematic way to construct model validating uncertainty models which have specific linear fractional transformation structure for use in robust control design and analysis. The proposed feasible condition (existence) test and the parameterization are computationally attractive as compared to similar tests currently available.

Lim, K. B.; Giesy, D. P.

1998-01-01

168

Instrumental Conditioning II: Modeling Action Selection  

E-print Network

Lecture material for PSY/NEU338 (Animal learning and decision making: psychological, computational and neural perspectives) on how to model action selection in instrumental conditioning; includes an experimental condition with no action selection, in which subjects only indicate the side the `computer' has chosen.

Niv, Yael

169

Multi-terminal Subsystem Model Validation for Pacific DC Intertie  

SciTech Connect

This paper proposes to validate the dynamic model of the Pacific DC Intertie with the concept of hybrid simulation by combining simulation with PMU measurements. The Playback function available in GE PSLF is adopted for hybrid simulation. The feasibility of using the Playback function on a multi-terminal subsystem is demonstrated for the first time. Sensitivity studies are also presented to address common PMU measurement quality problems, i.e., offset noise and time synchronization errors. Results indicate a generally good tolerance of the PDCI model. It is recommended that quality requirements be applied to phasor measurements in model validation work to ensure better analysis. Key parameters are identified based on the impact of parameter changes on model behavior. Two events are employed for preliminary model validation with PMU measurements. Suggestions are made for future PDCI model validation work.

Yang, Bo; Huang, Zhenyu; Kosterev, Dmitry

2008-07-20

170

Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised  

NASA Technical Reports Server (NTRS)

Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

Lim, K. B.; Giesy, D. P.

2000-01-01
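
As a flavor of the existence test described above, the following minimal numpy sketch checks feasibility for the simplest special case, an additive uncertainty model G = G0 + W*delta with |delta| <= 1 plus bounded measurement noise, frequency by frequency. The nominal plant, weight, and bounds are invented for illustration and are not from the paper.

import numpy as np

# Frequency grid and assumed nominal model / uncertainty weight.
omega = np.logspace(-1, 2, 200)                     # rad/s
G0 = 1.0 / (1j * omega + 1.0)                       # nominal plant (assumed)
W = 0.2 * (1j * omega + 0.1) / (1j * omega + 10.0)  # uncertainty weight (assumed)
noise_bound = 0.01                                  # exogenous disturbance bound

# Synthesize "measured" frequency-response data consistent with the model.
rng = np.random.default_rng(0)
delta = complex(rng.uniform(-1, 1), rng.uniform(-1, 1))
delta /= max(1.0, abs(delta))                       # enforce |delta| <= 1
G_meas = G0 + W * delta + noise_bound * rng.uniform(-1, 1, omega.size)

# Existence test: the data are explainable by some admissible (delta, noise)
# pair at a frequency iff the residual stays within the combined bound.
residual = np.abs(G_meas - G0)
feasible = residual <= np.abs(W) + noise_bound
print(f"model-validating set nonempty at {feasible.mean():.0%} of frequencies")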

171

Verification And Validation Of Simulation Models  

Microsoft Academic Search

The Hierarchical Modeling and Simulation System (HI-MASS) is a prototype modeling and simulation system that supports modeling based on the Hierarchical Control Flow Graph Model paradigm and simulation execution using a sequential synchronous simulation algorithm. The prototype is an object-oriented, C++-based system designed for a Unix environment and implemented using freely available software tools. Models are specified using

Douglas G. Fritz; Robert G. Sargent; Thorsten Daum

1995-01-01

172

Validating Simulation Models: A General Framework and Four Applied Examples  

Microsoft Academic Search

This paper provides a framework for discussing the empirical validation of simulation models of market phenomena, in particular of agent-based computational economics models. Such validation is difficult, perhaps because of their complexity; moreover, simulations can prove existence, but not in general necessity. The paper highlights the Energy Modeling Forum's benchmarking studies as an exemplar for simulators. A market of competing

Robert Ernest Marks

2007-01-01

173

A Process Improvement Model for Software Verification and Validation  

Microsoft Academic Search

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on

John Callahan; George Sabolish

1994-01-01

174

Validating Models for Disease Detection Using Twitter Todd Bodnar  

E-print Network

This work (Todd Bodnar, Pennsylvania State University) develops models that estimate influenza prevalence using Twitter, and then validates them with purpose-designed tests. Keywords: data mining, regression, machine learning, Twitter.

Salathé, Marcel

175

VALIDATION METHODS FOR CHEMICAL EXPOSURE AND HAZARD ASSESSMENT MODELS  

EPA Science Inventory

Mathematical models and computer simulation codes designed to aid in hazard assessment for environmental protection must be verified and validated before they can be used with confidence in a decision-making or priority-setting context. Operational validation, or full-scale testi...

176

Validation of MOST (Method Of Splitting Tsunami) Numerical Model. NTHMP Model Validation Workshop, Galveston TX, Mar. 30 - Apr. 01, 2011

E-print Network

Validation of the MOST (Method Of Splitting Tsunami) numerical model, presented by Elena Tolkova, NOAA Center for Tsunami Research. Benchmarks include run-up on a vertical wall (lab benchmark 4, slides 13-17) and run-up on a conical island (lab benchmark 5, slides 18-21). The MOST numerical model was designed to solve the shallow water equations (SWE) efficiently by reducing

Tolkova, Elena

177

Validating Computational Cognitive Process Models across Multiple Timescales  

NASA Astrophysics Data System (ADS)

Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

2010-12-01

178

Separation of ibuprofen, codeine phosphate, their degradation products and impurities by capillary electrophoresis. II. Validation.  

PubMed

A micellar electrokinetic chromatography method for the determination of ibuprofen and codeine phosphate hemihydrate and their degradation products and impurities in a commercial tablet formulation has been validated. The validation has been performed according to the International Conference on Harmonisation's guidance on the validation of analytical methods, and tests of selectivity, linearity, accuracy, precision, detection limit, quantitation limit, robustness and range were performed to determine the suitability of the method. It was possible to use the fractional factorial design model from the optimisation of the method to draw conclusions about its robustness. The results confirm that the method is highly suitable for its intended purpose. PMID:9882139

Stubberud, K P; Aström, O

1998-11-20

179

Modeling of a Foamed Emulsion Bioreactor: II. Model Parametric Sensitivity  

E-print Network

The sensitivity of a conceptual model of a foamed emulsion bioreactor (FEBR) used for the control of toluene vapors in air was examined. Model parametric sensitivity studies showed which parameters affect the removal

Kan, Eunsung

180

Validation of Transient Cooling Modeling for Hypersonic Application  

E-print Network

Validation of transient cooling modeling for hypersonic application, by Nicolas Gascoin and co-authors including Youssoufi Touré, Université d'Orléans, 18000 Bourges, France. DOI: 10.2514/1.26022.

Boyer, Edmond

181

Sediment Transport Model Validation in Lake Michigan  

Microsoft Academic Search

A multiple sediment type, three-dimensional hydrodynamic and sediment transport model was applied to Lake Michigan to simulate conditions during the Spring 2000 resuspension event. Model predictions were compared to data gathered by the EEGLE project including turbidity and downward mass flux. The model predictions for turbidity compared well to observed data, especially in capturing the distinctive peaks in turbidity due

Mary P. Cardenas; David J. Schwab; Brian J. Eadie; Nathan Hawley; Barry M. Lesht

2005-01-01

182

Validation of the Archimedes Diabetes Model  

Microsoft Academic Search

The Archimedes diabetes model was validated against randomized controlled trials by repeating in the model the steps taken for the real trials and comparing the results calculated by the model with the results of the trial. Eighteen trials were chosen by an independent advisory committee. Half the trials had been used to help build the model

DAVID M. EDDY; LEONARD SCHLESSINGER

183

Statistical Validation of Engineering and Scientific Models: Background  

SciTech Connect

A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.

Hills, Richard G.; Trucano, Timothy G.

1999-05-01
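
A minimal sketch of the Monte Carlo propagation-of-uncertainty step discussed above, applied to the free response of a damped spring-mass system; the parameter distributions, initial condition, and prediction time are illustrative assumptions rather than values from the tutorial.

import numpy as np

# Sample uncertain parameters of m x'' + c x' + k x = 0 (assumed normals).
rng = np.random.default_rng(1)
n = 5000
m = rng.normal(1.0, 0.02, n)   # mass [kg]
c = rng.normal(0.3, 0.03, n)   # damping [N s/m]
k = rng.normal(4.0, 0.20, n)   # stiffness [N/m]

# Analytic underdamped free response with x(0) = x0, x'(0) = 0.
x0, t = 0.1, 2.0
zeta = c / (2.0 * np.sqrt(k * m))
wn = np.sqrt(k / m)
wd = wn * np.sqrt(1.0 - zeta**2)
x = x0 * np.exp(-zeta * wn * t) * (np.cos(wd * t)
    + zeta / np.sqrt(1.0 - zeta**2) * np.sin(wd * t))

# Propagated uncertainty in the prediction at t = 2 s.
print(f"mean x = {x.mean():.4f} m, 95% interval = "
      f"({np.quantile(x, 0.025):.4f}, {np.quantile(x, 0.975):.4f}) m")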

184

SAGE III Aerosol Extinction Validation in the Arctic Winter: Comparisons with SAGE II and POAM III  

NASA Technical Reports Server (NTRS)

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm, except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020-nm extinction ratio shows a consistent bias of approx. 30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.

Thomason, L. W.; Poole, L. R.; Randall, C. E.

2007-01-01

185

HEDR model validation plan. Hanford Environmental Dose Reconstruction Project  

SciTech Connect

The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

1993-06-01

186

Highlights of Transient Plume Impingement Model Validation and Applications  

NASA Technical Reports Server (NTRS)

This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

Woronowicz, Michael

2011-01-01

187

Validating orbitally-tuned age models  

NASA Astrophysics Data System (ADS)

Orbitally-tuned timescales play an important role in many studies in paleoclimatology and integrated stratigraphy. A reliable test for the validity of stand-alone astronomically tuned timescales has, however, not yet been established. Shackleton et al. (1995) suggested that precession amplitude modulation by eccentricity is the best criterion available for a successful tuning. However, Huybers & Aharonson (2010) oppose this approach and "conclude that the presence of eccentricity-like amplitude modulation in precession-filtered records does not support the accuracy of orbitally tuned time scales". We discuss some approaches to circumvent the potential problem of frequency modulations during the tuning process, thereby allowing the use of amplitude modulations for timescale evaluation. This method is discussed using a geological dataset.

Zeeden, Christian; Lourens, Lucas

2013-04-01
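
The amplitude-modulation criterion debated above can be sketched numerically: band-pass a proxy record around the precession band, extract its envelope with a Hilbert transform, and compare the envelope against eccentricity. The synthetic record and band edges below are assumptions chosen purely for illustration.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Synthetic 2 Myr proxy: a ~23 kyr precession carrier modulated by a toy
# ~100 kyr eccentricity cycle, plus noise (all values illustrative).
dt = 1.0                                   # sample step [kyr]
t = np.arange(0.0, 2000.0, dt)
ecc = 0.5 * (1.0 + 0.8 * np.sin(2 * np.pi * t / 100.0))
rng = np.random.default_rng(2)
proxy = ecc * np.sin(2 * np.pi * t / 23.0) + 0.2 * rng.standard_normal(t.size)

# Band-pass the precession band (periods ~18-28 kyr) and take the envelope.
nyq = 0.5 / dt
b, a = butter(4, [(1 / 28.0) / nyq, (1 / 18.0) / nyq], btype="band")
prec = filtfilt(b, a, proxy)
envelope = np.abs(hilbert(prec))

# A tuning that preserved true amplitude modulation should correlate
# strongly with eccentricity.
r = np.corrcoef(envelope, ecc)[0, 1]
print(f"correlation of precession envelope with eccentricity: r = {r:.2f}")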

188

Circumplex Structure and Personality Disorder Correlates of the Interpersonal Problems Model (IIP-C): Construct Validity and Clinical Implications  

ERIC Educational Resources Information Center

This study assessed the construct validity of the circumplex model of the Inventory of Interpersonal Problems (IIP-C) in Norwegian clinical and nonclinical samples. Structure was examined by evaluating the fit of the circumplex model to data obtained by the IIP-C. Observer-rated personality disorder criteria (DSM-IV, Axis II) were used as external…

Monsen, Jon T.; Hagtvet, Knut A.; Havik, Odd E.; Eilertsen, Dag E.

2006-01-01

189

Translation, Adaptation and Validation of a Portuguese Version of the Moorehead-Ardelt Quality of Life Questionnaire II.  

PubMed

The prevalence of obesity has increased worldwide. An assessment of the impact of obesity on health-related quality of life (HRQoL) requires specific instruments. The Moorehead-Ardelt Quality of Life Questionnaire II (MA-II) is a widely used instrument to assess HRQoL in morbidly obese patients. The objective of this study was to translate and validate a Portuguese version of the MA-II. The study included forward and backward translations of the original MA-II. The reliability of the Portuguese MA-II was estimated using the internal consistency and test-retest methods. For validation purposes, the Spearman's rank correlation coefficient was used to evaluate the correlation between the Portuguese MA-II and the Portuguese versions of two other questionnaires, the 36-item Short Form Health Survey (SF-36) and the Impact of Weight on Quality of Life-Lite (IWQOL-Lite). One hundred and fifty morbidly obese patients were randomly assigned to test the reliability and validity of the Portuguese MA-II. Good internal consistency was demonstrated by a Cronbach's alpha coefficient of 0.80, and a very good agreement in terms of test-retest reliability was recorded, with an overall intraclass correlation coefficient (ICC) of 0.88. The total sums of MA-II scores and each item of MA-II were significantly correlated with all domains of SF-36 and IWQOL-Lite. A statistically significant negative correlation was found between the MA-II total score and BMI. Moreover, age, gender and surgical status were independent predictors of MA-II total score. A reliable and valid Portuguese version of the MA-II was produced, thus enabling the routine use of MA-II in the morbidly obese Portuguese population. PMID:24817428

Maciel, João; Infante, Paulo; Ribeiro, Susana; Ferreira, André; Silva, Artur C; Caravana, Jorge; Carvalho, Manuel G

2014-11-01
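
For reference, the internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute; the sketch below uses a simulated 150 x 6 response matrix as a stand-in for real MA-II questionnaire data.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

# Simulated data: a latent true score plus item-level noise.
rng = np.random.default_rng(3)
true_score = rng.normal(0.0, 0.3, size=(150, 1))
responses = true_score + rng.normal(0.0, 0.2, size=(150, 6))
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")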

190

Empirical validation of computer models for passive-solar residences  

NASA Astrophysics Data System (ADS)

The theoretical underpinnings for experimental validation of thermodynamic models of passive solar buildings are given. Computer algorithms for such validation are discussed. Models for passive solar buildings are essentially incapable of validation in the classical sense. This is principally due to the fact that buildings are exposed to excitations which have insufficient frequency content to permit estimation of the coefficients in all but the most rudimentary models. One can, however, generate a set of possible models which explain the measured data. Unfortunately, while all models in the set may equally well track the measured data, the coefficients may vary significantly within the set. When used to estimate auxiliary energy consumption by simulation, models within the set may predict substantially different consumptions.

Sebald, A. V.

1983-06-01

191

CMOS Transistor Mismatch Model valid from Weak to Strong Inversion  

E-print Network

A CMOS transistor mismatch model valid from weak to strong inversion is presented. Characterization of NMOS and PMOS transistors for 30 different geometries has been performed with this continuous model. Knowledge of transistor mismatch is crucial for precision analog design, and using very reduced transistor geometries produces

Barranco, Bernabe Linares

192

Dynamic Modeling and Experimental Validation for Interactive Endodontic Simulation  

Microsoft Academic Search

To facilitate training of endodontic operations, we have developed an interactive virtual environment to simulate endodontic shaping operations. This paper presents methodologies for dynamic modeling, visual/haptic display and model validation of endodontic shaping. We first investigate the forces generated in the course of shaping operations and discuss the challenging issues in their modeling. Based on the special properties and constraints

Min Li; Yun-hui Liu

2007-01-01

193

Foundation Heat Exchanger Model and Design Tool Development and Validation  

E-print Network

Development and validation of a foundation heat exchanger model and design tool. Associated publications include: Fan, D., S. Rees, J. Spitler (2013), a dynamic thermal network approach to the modeling of foundation heat exchangers (DOI: 10.1080/10789669.2013.774887); and an application of dynamic thermal networks to the modelling of foundation heat exchangers, Building Simulation 2011, Sydney.

194

Validation of Erosion Modeling: Physical and Numerical

E-print Network

ABSTRACT: The overall intent of this research is to develop numerical models of erosion of levees and dams, validated against physical experiments in a geotechnical centrifuge. The erosion is modeled in detail, from beginning to end, that is, from the time

Franklin, W. Randolph

195

Heuristic Verification and Validation of Software Process Simulation Models  

Microsoft Academic Search

We illustrate the use of heuristic algorithms to improve the verification and validation of software process simulation models. To use this approach, an optimization problem is formulated to guide a heuristic search algorithm that will attempt to locate configurations of the system that yield surprising results. These surprising results often help the modeler to identify flaws in the model logic

Wayne Wakeland; Stephen Shervais; David Raffo

196

Modelling and Validation of Response Times in Zoned RAID  

Microsoft Academic Search

We present and validate an enhanced analytical queueing network model of zoned RAID. The model focuses on RAID levels 01 and 5, and yields the distribution of I/O request response time. Whereas our previous work could only support arrival streams of I/O requests of the same type, the model presented here supports heterogeneous streams with a mixture of read

Abigail S. Lebrecht; Nicholas J. Dingle; William J. Knottenbelt

2008-01-01

197

Combustion turbine dynamic model validation from tests  

Microsoft Academic Search

Studies have been conducted on the Alaskan Railbelt System to examine the hydrothermal power system response after the hydroelectric power units at Bradley Lake are installed. The models and data for the generating units for the initial studies were not complete. Typical models were used, but their response appeared to be faster than judged by operating experience. A testing program

L. N. Hannett; Afzal Khan

1993-01-01

198

International Space Station Power System Model Validated  

NASA Technical Reports Server (NTRS)

System Power Analysis for Capability Evaluation (SPACE) is a computer model of the International Space Station's (ISS) Electric Power System (EPS) developed at the NASA Glenn Research Center. This uniquely integrated, detailed model can predict EPS capability, assess EPS performance during a given mission with a specified load demand, conduct what-if studies, and support on-orbit anomaly resolution.

Hojnicki, Jeffrey S.; Delleur, Ann M.

2002-01-01

199

Multi-model ensemble: technique and validation  

NASA Astrophysics Data System (ADS)

In this study, a method of numerical weather prediction by ensemble for the South American region is proposed. This method takes into account combinations of the numerical predictions of various models, assigning greater weight to models that exhibit the best performance. Nine operational numerical models were used to perform this study. The main objective of the study is to obtain a weather forecasting product (short-to-medium range) that combines what is best in each of the nine models used in the study, thus producing more reliable predictions. The proposed method was evaluated during austral summer (December 2012, and January and February 2013) and winter (June, July and August 2013). The results show that the proposed method can significantly improve the results provided by the numerical models and consequently has promising potential for operational applications in any weather forecasting center.

Rozante, J. R.; Moreira, D. S.; Godoy, R. C. M.; Fernandes, A. A.

2014-10-01
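
A minimal sketch of the weighting idea above, in which each member model receives a weight inversely proportional to its historical RMSE; the observations, member biases, and forecasts are synthetic placeholders, not the nine operational models used in the study.

import numpy as np

# Training period: 90 days of observations and 4 member-model hindcasts.
rng = np.random.default_rng(4)
truth = rng.normal(25.0, 3.0, 90)
biases = np.array([0.5, -1.0, 2.0, 0.2])
members = truth + biases[:, None] + rng.normal(0.0, 1.0, (4, 90))

# Skill-based weights: lower RMSE -> larger weight (normalized to sum to 1).
rmse = np.sqrt(((members - truth) ** 2).mean(axis=1))
weights = (1.0 / rmse) / (1.0 / rmse).sum()

# Combine fresh member forecasts into the ensemble prediction.
forecasts = np.array([24.0, 22.5, 27.1, 24.8])
print(f"weights = {np.round(weights, 3)}, ensemble = {weights @ forecasts:.2f}")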

200

Validation of a Conceptual Assessment Tool in E&M II  

E-print Network

As part of an ongoing project to investigate student learning in upper-division electrodynamics (E&M II), the PER research group at the University of Colorado Boulder has developed a tool to assess student conceptual understanding: the CURrENT (Colorado UppeR-division ElectrodyNamics Test). The result is an open-ended post-test diagnostic with 6 multi-part questions, an optional 3-question pretest, and an accompanying grading rubric. This instrument is motivated in part by our faculty-consensus learning goals, and is intended to help measure the effectiveness of transformed pedagogy. In addition, it provides insights into student thinking and student difficulties in the covered topical areas. In this paper, we present preliminary measures of the validity and reliability of the instrument and scoring rubric. These include expert validation and student interviews, inter-rater reliability measures, and classical test statistics.

Ryan, Qing X; Baily, Charles; Pollock, Steven J

2014-01-01

201

Approaches for evaluating veterinary epidemiological models: verification, validation and limitations.  

PubMed

The evaluation of models of the spread and control of animal diseases is crucial if these models are to be used to inform decisions about the control or management of such diseases. Two key steps in the evaluation of epidemiological models are model verification and model validation. Verification is the demonstration that a computer-driven model is operating correctly, and conforms to its intended design. Validation refers to the process of determining how well a model corresponds to the system that it is intended to represent. For a veterinary epidemiological model, validation would address such issues as how well the model represents the dynamics of the disease in question in the population to which this model is applied, and how well the model represents the application of different measures for disease control. Just as the development of epidemiological models is a subjective, continuous process, subject to change and refinement, so too is the evaluation of models. The purpose of model evaluation is not to demonstrate that a model is a 'true' or 'accurate' representation of a system, but to subject it to sufficient scrutiny so that it may be used with an appropriate degree of confidence to aid decision-making. To facilitate model verification and validation, epidemiological modellers should clearly state the purpose, assumptions and limitations of a model; provide a detailed description of the conceptual model; document those steps already taken to test the model; and thoroughly describe the data sources and the process used to produce model input parameters from those data. PMID:21961221

Reeves, A; Salman, M A; Hill, A E

2011-08-01

202

Towards better clinical prediction models: seven steps for development and an ABCD for validation.  

PubMed

Clinical prediction models provide risk estimates for the presence of disease (diagnosis) or an event in the future course of disease (prognosis) for individual patients. Although publications that present and evaluate such models are becoming more frequent, the methodology is often suboptimal. We propose that seven steps should be considered in developing prediction models: (i) consideration of the research question and initial data inspection; (ii) coding of predictors; (iii) model specification; (iv) model estimation; (v) evaluation of model performance; (vi) internal validation; and (vii) model presentation. The validity of a prediction model is ideally assessed in fully independent data, where we propose four key measures to evaluate model performance: calibration-in-the-large, or the model intercept (A); calibration slope (B); discrimination, with a concordance statistic (C); and clinical usefulness, with decision-curve analysis (D). As an application, we develop and validate prediction models for 30-day mortality in patients with an acute myocardial infarction. This illustrates the usefulness of the proposed framework to strengthen the methodological rigour and quality for prediction models in cardiovascular research. PMID:24898551

Steyerberg, Ewout W; Vergouwe, Yvonne

2014-08-01
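
The A, B, and C measures proposed above can be computed directly on a validation sample. The statsmodels-based sketch below uses simulated outcomes and linear predictors as stand-ins for the myocardial infarction example; decision-curve analysis (D) is omitted for brevity.

import numpy as np
import statsmodels.api as sm

# Simulated external validation set: lp is the model's linear predictor
# (log-odds); the true outcome process is deliberately miscalibrated.
rng = np.random.default_rng(5)
lp = rng.normal(-2.0, 1.0, 1000)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.2 + 0.8 * lp))))

# A: calibration-in-the-large -- intercept with lp as a fixed offset.
fit_a = sm.GLM(y, np.ones_like(lp), family=sm.families.Binomial(),
               offset=lp).fit()

# B: calibration slope -- logistic regression of outcome on lp.
fit_b = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)

# C: concordance -- P(random event outranks random non-event), ties = 1/2.
pos, neg = lp[y == 1], lp[y == 0]
diff = pos[:, None] - neg[None, :]
c_stat = np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

print(f"A = {fit_a.params[0]:.2f}, B = {fit_b.params[1]:.2f}, C = {c_stat:.2f}")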

203

Validation of the Poisson Stochastic Radiative Transfer Model  

NASA Technical Reports Server (NTRS)

A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure, the cloud aspect ratio, is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it is shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

Zhuravleva, Tatiana; Marshak, Alexander

2004-01-01

204

Validating Predictions from Climate Envelope Models  

PubMed Central

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species. PMID:23717452

Watling, James I.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romanach, Stephanie S.

2013-01-01
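
The two headline metrics above reduce to confusion-matrix bookkeeping on presence/absence data; a short sketch follows, with a synthetic occupancy vector and an arbitrary 85% model agreement rate.

import numpy as np

def sens_spec(observed, predicted):
    """Sensitivity and specificity for 0/1 presence-absence arrays."""
    tp = np.sum((observed == 1) & (predicted == 1))
    fn = np.sum((observed == 1) & (predicted == 0))
    tn = np.sum((observed == 0) & (predicted == 0))
    fp = np.sum((observed == 0) & (predicted == 1))
    return tp / (tp + fn), tn / (tn + fp)

# Synthetic t2 occupancy and a model that agrees 85% of the time.
rng = np.random.default_rng(6)
obs = rng.binomial(1, 0.3, 500)
pred = np.where(rng.random(500) < 0.85, obs, 1 - obs)
sens, spec = sens_spec(obs, pred)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")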

205

Spatial statistical modeling of shallow landslides—Validating predictions for different landslide inventories and rainfall events  

NASA Astrophysics Data System (ADS)

Statistical models that exploit the correlation between landslide occurrence and geomorphic properties are often used to map the spatial occurrence of shallow landslides triggered by heavy rainfalls. In many landslide susceptibility studies, the true predictive power of the statistical model remains unknown because the predictions are not validated with independent data from other events or areas. This study validates statistical susceptibility predictions with independent test data. The spatial incidence of landslides, triggered by an extreme rainfall in a study area, was modeled by logistic regression. The fitted model was then used to generate susceptibility maps for another three study areas, for which event-based landslide inventories were also available. All the study areas lie in the northern foothills of the Swiss Alps. The landslides had been triggered by heavy rainfall either in 2002 or 2005. The validation was designed such that the first validation study area shared the geomorphology and the second the triggering rainfall event with the calibration study area. For the third validation study area, both geomorphology and rainfall were different. All explanatory variables were extracted for the logistic regression analysis from high-resolution digital elevation and surface models (2.5 m grid). The model fitted to the calibration data comprised four explanatory variables: (i) slope angle (effect of gravitational driving forces), (ii) vegetation type (grassland and forest; root reinforcement), (iii) planform curvature (convergent water flow paths), and (iv) contributing area (potential supply of water). The area under the Receiver Operating Characteristic (ROC) curve (AUC) was used to quantify the predictive performance of the logistic regression model. The AUC values were computed for the susceptibility maps of the three validation study areas (validation AUC), the fitted susceptibility map of the calibration study area (apparent AUC: 0.80) and another susceptibility map obtained for the calibration study area by 20-fold cross-validation (cross-validation AUC: 0.74). The AUC values of the first and second validation study areas (0.72 and 0.69, respectively) and the cross-validation AUC matched fairly well, and all AUC values were distinctly smaller than the apparent AUC. Based on the apparent AUC one would have clearly overrated the predictive performance for the first two validation areas. Rather surprisingly, the AUC value of the third validation study area (0.82) was larger than the apparent AUC. A large part of the third validation study area consists of gentle slopes, and the regression model correctly predicted that no landslides occur in the flat parts. This increased the predictive performance of the model considerably. The predicted susceptibility maps were further validated by summing the predicted susceptibilities for the entire validation areas and by comparing the sums with the observed number of landslides. The sums exceeded the observed counts for all the validation areas. Hence, the logistic regression model generally overestimated the risk of landslide occurrence. Obviously, a predictive model that is based on static geomorphic properties alone cannot take full account of the complex and time-dependent processes in the subsurface. However, such a model is still capable of distinguishing zones that are more or less prone to shallow landslides.

von Ruette, Jonas; Papritz, Andreas; Lehmann, Peter; Rickli, Christian; Or, Dani

2011-10-01
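
A compact sketch of the calibrate-then-transfer design above: fit a logistic susceptibility model in one area, then compare apparent, 20-fold cross-validated, and independent-area AUC values. The four predictors echo those listed in the abstract, but all data and coefficients are simulated assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score

def make_area(n, seed):
    """Synthetic study area: four terrain predictors and landslide labels."""
    r = np.random.default_rng(seed)
    X = np.column_stack([r.normal(30, 10, n),    # slope angle [deg]
                         r.binomial(1, 0.5, n),  # forest (1) vs grassland (0)
                         r.normal(0, 1, n),      # planform curvature
                         r.lognormal(5, 1, n)])  # contributing area [m^2]
    logit = -4 + 0.08 * X[:, 0] - 0.9 * X[:, 1] - 0.5 * X[:, 2] + 0.001 * X[:, 3]
    return X, r.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_cal, y_cal = make_area(4000, 10)   # calibration area
X_val, y_val = make_area(4000, 11)   # independent validation area

model = LogisticRegression(max_iter=1000).fit(X_cal, y_cal)
auc_app = roc_auc_score(y_cal, model.predict_proba(X_cal)[:, 1])
auc_cv = cross_val_score(LogisticRegression(max_iter=1000), X_cal, y_cal,
                         cv=20, scoring="roc_auc").mean()
auc_ext = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"apparent AUC {auc_app:.2f}, cross-validation AUC {auc_cv:.2f}, "
      f"independent-area AUC {auc_ext:.2f}")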

206

Shuttle Spacesuit: Fabric/LCVG Model Validation  

NASA Technical Reports Server (NTRS)

A detailed spacesuit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the spacesuit are critical to estimation of exposures and assessing the risk to the astronaut on EVA. Past evaluations of spacesuit shielding properties assumed the basic fabric lay-up (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and Liquid Cooling and Ventilation Garment (LCVG) could be homogenized as a single layer, overestimating the protective properties over 60 percent of the fabric area. The present spacesuit model represents the inhomogeneous distributions of LCVG materials (mainly the water filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the spacesuit's protection properties.

Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

2001-01-01

207

Epileptic transitions: model predictions and experimental validation.  

PubMed

The essence of epilepsy is that a patient displays (long) periods of normal EEG activity (i.e., nonepileptiform) intermingled occasionally with epileptiform paroxysmal activity. The mechanisms of transition between these two types of activity are not well understood. To provide more insight into the dynamics of the neuronal networks leading to seizure generation, the authors developed a computational model of thalamocortical circuits based on relevant patho(physiologic) data. The model exhibits bistability, i.e., it features two operational states, ictal and interictal, that coexist. The transitions between these two states occur according to a Poisson process. An alternative scenario for transitions can be a random walk of network parameters that ultimately leads to a paroxysmal discharge. Predictions of the bistable computational model are compared with experimental results from different types of epilepsy. PMID:16357634

Suffczynski, Piotr; Lopes da Silva, Fernando; Parra, Jaime; Velis, Demetrios; Kalitzin, Stiliyan

2005-10-01
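
The Poisson transition statistics noted above imply exponentially distributed dwell times in each state. The toy two-state simulation below illustrates the idea; the mean dwell times are invented, not fitted to EEG.

import numpy as np

# Assumed mean dwell times (illustrative only).
rng = np.random.default_rng(8)
mean_dwell = {"interictal": 300.0, "ictal": 20.0}   # seconds

t, state, n_seizures = 0.0, "interictal", 0
while t < 3600.0:                       # one simulated hour
    t += rng.exponential(mean_dwell[state])
    if state == "interictal":           # leaving interictal = seizure onset
        n_seizures += 1
    state = "ictal" if state == "interictal" else "interictal"

print(f"{n_seizures} ictal transitions in one simulated hour")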

208

ESEEM analysis of multi-histidine Cu(II)-coordination in model complexes, peptides, and amyloid-β.

PubMed

We validate the use of ESEEM to predict the number of (14)N nuclei coupled to a Cu(II) ion by the use of model complexes and two small peptides with well-known Cu(II) coordination. We apply this method to gain new insight into less explored aspects of Cu(II) coordination in amyloid-β (Aβ). Aβ has two coordination modes of Cu(II) at physiological pH. A controversy has existed regarding the number of histidine residues coordinated to the Cu(II) ion in component II, which is dominant at high pH (≈8.7) values. Importantly, with an excess amount of Zn(II) ions, as is the case in brain tissues affected by Alzheimer's disease, component II becomes the dominant coordination mode, as Zn(II) selectively substitutes component I bound to Cu(II). We confirm that component II only contains single histidine coordination, using ESEEM and a set of model complexes. The ESEEM experiments carried out on systematically (15)N-labeled peptides reveal that, in component II, His 13 and His 14 are more favored as equatorial ligands compared to His 6. Revealing molecular level details of subcomponents in metal ion coordination is critical in understanding the role of metal ions in Alzheimer's disease etiology. PMID:25014537

Silva, K Ishara; Michael, Brian C; Geib, Steven J; Saxena, Sunil

2014-07-31

209

Reduced-complexity modeling of braided rivers: Assessing model performance by sensitivity analysis, calibration, and validation  

NASA Astrophysics Data System (ADS)

This paper addresses an important question of modeling stream dynamics: How may numerical models of braided stream morphodynamics be rigorously and objectively evaluated against a real case study? Using simulations from the Cellular Automaton Evolutionary Slope and River (CAESAR) reduced-complexity model (RCM) of a 33 km reach of a large gravel bed river (the Tagliamento River, Italy), this paper aims to (i) identify a sound strategy for calibration and validation of RCMs, (ii) investigate the effectiveness of multiperformance model assessments, and (iii) assess the potential of using CAESAR at mesospatial and mesotemporal scales. The approach used has three main steps: first sensitivity analysis (using a screening method and a variance-based method), then calibration, and finally validation. This approach allowed us to analyze 12 input factors initially and then to focus calibration only on the factors identified as most important. Sensitivity analysis and calibration were performed on a 7.5 km subreach, using a hydrological time series of 20 months, while validation was carried out on the whole 33 km study reach over a period of 8 years (2001-2009). CAESAR was able to reproduce the macromorphological changes of the study reach and gave good results for annual bed load sediment estimates, which turned out to be consistent with measurements in other large gravel bed rivers, but showed a poorer performance in reproducing the characteristics of the braided channel (e.g., braiding intensity). The approach developed in this study can be effectively applied in other similar RCM contexts, allowing the use of RCMs not only in an explorative manner but also in obtaining quantitative results and scenarios.

Ziliani, L.; Surian, N.; Coulthard, T. J.; Tarantola, S.

2013-12-01
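
A bare-bones version of the screening step above, in the spirit of elementary-effects methods: perturb each input factor one at a time from random base points and rank factors by the mean absolute change in output. The toy_model function and the factor names and ranges are stand-ins, not actual CAESAR inputs.

import numpy as np

rng = np.random.default_rng(9)
names = ["sediment_tau", "lateral_erosion", "max_erode", "flow_coeff"]
lo = np.array([0.01, 1e-5, 0.001, 0.1])   # assumed lower bounds
hi = np.array([0.10, 1e-3, 0.020, 1.0])   # assumed upper bounds

def toy_model(x):
    """Placeholder for a full morphodynamic run (e.g., bed load output)."""
    return x[0] ** 0.5 + 50.0 * x[1] + 5.0 * x[2] + 0.05 * x[3] ** 2

# One-at-a-time perturbations (10% of each range) from 50 random bases.
effects = np.zeros(len(names))
for _ in range(50):
    base = lo + rng.random(4) * (hi - lo)
    y0 = toy_model(base)
    for i in range(4):
        pert = base.copy()
        pert[i] = min(hi[i], pert[i] + 0.1 * (hi[i] - lo[i]))
        effects[i] += abs(toy_model(pert) - y0)
effects /= 50.0

for name, e in sorted(zip(names, effects), key=lambda p: -p[1]):
    print(f"{name:16s} mean |effect| = {e:.4f}")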

210

Validation of nuclear models used in space radiation shielding applications  

SciTech Connect

A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.

Norman, Ryan B., E-mail: Ryan.B.Norman@nasa.gov [NASA Langley Research Center, Hampton, VA 23681 (United States)]; Blattnig, Steve R. [NASA Langley Research Center, Hampton, VA 23681 (United States)]

2013-01-15
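
The interval treatment of experimental uncertainty described above can be sketched as follows: a model-data discrepancy is counted only where the prediction falls outside the measurement's uncertainty interval. The 3600 synthetic cross sections below mirror the size of the database, but every value is a placeholder.

import numpy as np

# Synthetic experimental database: values, interval half-widths, predictions.
rng = np.random.default_rng(11)
exp = rng.uniform(50.0, 500.0, 3600)            # measured cross sections [mb]
unc = 0.08 * exp                                # reported uncertainty [mb]
model = exp * rng.normal(1.0, 0.12, exp.size)   # model predictions [mb]

# Interval residual: zero whenever the prediction lies inside the interval.
outside = np.maximum(0.0, np.abs(model - exp) - unc)
rel = outside / exp

print(f"cumulative (mean) metric = {rel.mean():.3f}, "
      f"median metric = {np.median(rel):.3f}")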

211

Development and validation of a disease model for postmenopausal osteoporosis  

Microsoft Academic Search

Summary: This article describes the development of a model for postmenopausal osteoporosis (PMO) based on Swedish data that is easily adaptable to other countries. Introduction: The aims of the study were to develop and validate a model to describe the current/future burden of PMO in different national settings. Methods: For validation purposes, the model was developed using Swedish data and provides estimates from

A. Gauthier; J. A. Kanis; M. Martin; J. Compston; F. Borgström; C. Cooper; E. McCloskey

2011-01-01

212

An Analysis Tool for Markovian Traffic Model Validation  

Microsoft Academic Search

Even though the problem of developing adequate models for self-similar data traffic is relatively old, it still remains unresolved. One of the limitations of the existing models is their lack of general applicability. Many traffic models are validated using a single trace only, such that the developed model may well fit the traffic of this trace, but not necessarily of

Rachid El Abdouni Khayari; Axel Lehmann; Markus Siegle

213

Verification and Validation of Agent-based Scientific Simulation Models  

Microsoft Academic Search

Most formalized model verification and validation techniques come from industrial and system engineering for discrete-event system simulations. These techniques are widely used in computational science. The agent-based modeling approach is different from discrete-event modeling approaches largely used in industrial and system engineering in many aspects. Since the agent-based modeling approach has recently become an attractive and

Xiaorong Xiang; Ryan Kennedy; Gregory Madey; Steve Cabaniss

2005-01-01

214

TEST STRATEGY FOR AERO-ENGINE STRUCTURAL DYNAMIC MODEL VALIDATION  

Microsoft Academic Search

Finite Element Models have been used for many years to predict the structural dynamic behaviour of aero-engines and to influence their design accordingly. The potential savings anticipated by such methods can only be realised if predictions are reliable; incorrect predictions might mislead the designers with serious consequences. Currently, a reliable validation of an aero-engine assembly model is only feasible

J. V. Garcia; D. J. Ewins

215

A Formal Approach to Empirical Dynamic Model Optimization and Validation  

NASA Technical Reports Server (NTRS)

A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.

Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

2014-01-01
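
A small sketch of the max-min estimation idea above: choose parameters that maximize the smallest margin of requirement compliance across several datasets. The first-order model, tolerance, and data here are illustrative assumptions, not the F-16 short-period example.

import numpy as np
from scipy.optimize import minimize

# Two synthetic datasets from a step response y = a*(1 - exp(-t/tau)).
t = np.linspace(0.0, 5.0, 25)
rng = np.random.default_rng(12)
datasets = [1.0 * (1.0 - np.exp(-t / 0.8)) + rng.normal(0.0, 0.01, t.size)
            for _ in range(2)]
tol = 0.05   # admissible prediction error for every requirement

def smallest_margin(p):
    """Smallest compliance margin over all validation requirements."""
    a, tau = p
    pred = a * (1.0 - np.exp(-t / tau))
    return min(tol - np.max(np.abs(pred - d)) for d in datasets)

# Maximize the smallest margin (minimize its negative).
res = minimize(lambda p: -smallest_margin(p), x0=[0.5, 0.5],
               method="Nelder-Mead")
a_hat, tau_hat = res.x
print(f"a = {a_hat:.3f}, tau = {tau_hat:.3f}, "
      f"smallest margin = {smallest_margin(res.x):.4f}")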

216

Validation of Available Approaches for Numerical Bird Strike Modeling Tools  

Microsoft Academic Search

This paper investigates the bird strike phenomenon in order to validate available numerical models through experimental tests and simulation tools. It describes how to use the currently available test data while exerting caution as to their reliability. The information is then used to evaluate the performance of the different modeling options currently available. The evaluation is based on five criteria

M-A Lavoie; A. Gakwaya; M. Nejad Ensan; D. G. Zimcik

217

Bioaerosol optical sensor model development and initial validation  

Microsoft Academic Search

This paper describes the development and initial validation of a bioaerosol optical sensor model. This model was used to help determine design parameters and estimate performance of a new low-cost optical sensor for detecting bioterrorism agents. In order to estimate sensor performance in detecting biowarfare simulants and rejecting environmental interferents, use was made of a previously reported catalog of EEM

Steven D. Campbell; Thomas H. Jeys; Xuan Le Eapen

2007-01-01

218

Modelling the human pharyngeal airway: validation of numerical simulations

E-print Network

Keywords: obstructive sleep apnea syndrome. Since the 1990s, biomechanical modelling of the human upper airway has received a growing interest, since it allows a better understanding of the biomechanical properties of the upper airway (geometry, rheology). This makes such models of interest to improve the quality

Lagrée, Pierre-Yves

219

Modelling the human pharyngeal airway: validation of numerical simulations

E-print Network

Biomechanical modelling of the human upper airway has received a growing interest since it allows a better understanding of the biomechanical properties of the upper airway (geometry, rheology). This makes such models of interest to improve

Payan, Yohan

220

Modeling and Validating Hybrid Systems Using VDM and Mathematica  

E-print Network

Modeling and Validating Hybrid Systems Using VDM and Mathematica (Bernhard K. Aichernig et al.). The computer algebra system Mathematica can be used to model and simulate both aspects of such systems: the control logic and the physics involved. A new Mathematica package emulating VDM-SL has been developed.

221

Literature-derived bioaccumulation models for earthworms: Development and validation  

Microsoft Academic Search

Estimation of contaminant concentrations in earthworms is a critical component in many ecological risk assessments. Without site-specific data, literature-derived uptake factors or models are frequently used. Although considerable research has been conducted on contaminant transfer from soil to earthworms, most studies focus on only a single location. External validation of transfer models has not been performed. The authors developed a

G. W. II Suter; J. J. Beauchamp; R. A. Efroymson

1999-01-01

222

Validating Finite Element Models of Assembled Shell Structures  

NASA Technical Reports Server (NTRS)

The validation of finite element models of assembled shell elements is presented. The topics include: 1) Problems with membrane rotations in assembled shell models; 2) Penalty stiffness for membrane rotations; 3) Physical stiffness for membrane rotations using shell elements with 6 dof per node; and 4) Connections avoiding rotations.

Hoff, Claus

2006-01-01

223

Modeling HIV Immune Response and Validation with Clinical Data  

E-print Network

A system of ordinary differential equations is formulated to describe the pathogenesis of HIV infection, wherein certain important features are incorporated, including stimulation by antigens other than HIV. A stability analysis illustrates the capability of this model

Banks, H. T.; Davidian, M.

224

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory. (author)

Ko, Y.-C. [Nuclear Science and Engineering Department, MIT, Cambridge, MA 02139 (United States); Hu, L.-W. [Nuclear Reactor Laboratory, MIT, Cambridge, MA 02139 (United States)], E-mail: lwhu@mit.edu; Olson, Arne P.; Dunn, Floyd E. [RERTR Program, Argonne National Laboratory, Argonne, IL 60439 (United States)

2008-07-15

225

Experimentally validated HERG pharmacophore models as cardiotoxicity prediction tools.  

PubMed

The goal of this study was to design, experimentally validate, and apply a virtual screening workflow to identify novel hERG channel blockers. The hERG channel is an important antitarget in drug development since cardiotoxic risks remain as a major cause of attrition. A ligand-based pharmacophore model collection was developed and theoretically validated. The seven most complementary and suitable models were used for virtual screening of in-house and commercially available compound libraries. From the hit lists, 50 compounds were selected for experimental validation through bioactivity assessment using patch clamp techniques. Twenty compounds inhibited hERG channels expressed in HEK 293 cells with IC50 values ranging from 0.13 to 2.77 μM, attesting to the suitability of the models as cardiotoxicity prediction tools in a preclinical stage. PMID:25148533

Kratz, Jadel M; Schuster, Daniela; Edtbauer, Michael; Saxena, Priyanka; Mair, Christina E; Kirchebner, Julia; Matuszczak, Barbara; Baburin, Igor; Hering, Steffen; Rollinger, Judith M

2014-10-27

226

The Validation of Climate Models: The Development of Essential Practice  

NASA Astrophysics Data System (ADS)

It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its investigation. This serves not only the scientific method, but the communication of the results of that scientific investigation to other scientists and to those with a stake in those scientific results. It sets a standard, which is essential practice for simulation science with societal ramifications.

Rood, R. B.

2011-12-01

227

Concurrent Models II (ECEMA 2000 lecture slides)

E-print Network

Lecture slides on concurrent models in VHDL; the recoverable content is an excerpt of the IEEE std_logic_1164 package declaration:

PACKAGE std_logic_1164 IS
    TYPE std_ulogic IS ( 'U',  -- Uninitialized
                         'X',  -- Forcing Unknown
                         '0',  -- Forcing 0
                         '1',  -- Forcing 1
                         'Z',  -- High Impedance
                         'W',  -- Weak Unknown
                         'L',  -- Weak 0
                         'H',  -- Weak 1
                         '-'   -- Don't care
                       );
    TYPE std_ulogic_vector IS ARRAY ( NATURAL RANGE <> ) OF std_ulogic;
    FUNCTION resolved ( s : std_ulogic_vector ) RETURN std_ulogic;

Aboulhamid, El Mostapha

228

Making Validated Educational Models Central in Preschool Standards  

Microsoft Academic Search

This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education-and-care programs that are available to all 3- and 4-year-olds. An educational model is a coherent body of program practices, curriculum content, program and child assessment, and teacher training. Educational models are meant to contribute to all

Lawrence J. Schweinhart

229

LANDMARK PAPER REPRISE - A TUTORIAL ON VERIFICATION AND VALIDATION OF SIMULATION MODELS  

Microsoft Academic Search

In this tutorial paper we give a general introduction to verification and validation of simulation models, define the various validation techniques, and present a recommended model validation procedure.

Robert G. Sargent

2007-01-01

230

Validation techniques of agent based modelling for geospatial simulations  

NASA Astrophysics Data System (ADS)

One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance, geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and are applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

Darvishi, M.; Ahmadi, G.

2014-10-01

231

Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models  

SciTech Connect

One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from University of British Columbia for purposes of benchmarking the UBC model against other codes. In the course of discussions on recovery boiler modeling over the course of this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABBCE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.

1997-07-01

232

Angular Distribution Models for Top-of-Atmosphere Radiative Flux Estimation from the Clouds and the Earth's Radiant Energy System Instrument on the Tropical Rainfall Measuring Mission Satellite. Part II; Validation  

NASA Technical Reports Server (NTRS)

Top-of-atmosphere (TOA) radiative fluxes from the Clouds and the Earth's Radiant Energy System (CERES) are estimated from empirical angular distribution models (ADMs) that convert instantaneous radiance measurements to TOA fluxes. This paper evaluates the accuracy of CERES TOA fluxes obtained from a new set of ADMs developed for the CERES instrument onboard the Tropical Rainfall Measuring Mission (TRMM). The uncertainty in regional monthly mean reflected shortwave (SW) and emitted longwave (LW) TOA fluxes is less than 0.5 W/sq m, based on comparisons with TOA fluxes evaluated by direct integration of the measured radiances. When stratified by viewing geometry, TOA fluxes from different angles are consistent to within 2% in the SW and 0.7% (or 2 W/sq m) in the LW. In contrast, TOA fluxes based on ADMs from the Earth Radiation Budget Experiment (ERBE) applied to the same CERES radiance measurements show a 10% relative increase with viewing zenith angle in the SW and a 3.5% (9 W/sq m) decrease with viewing zenith angle in the LW. Based on multiangle CERES radiance measurements, 1° regional instantaneous TOA flux errors from the new CERES ADMs are estimated to be 10 W/sq m in the SW and 3.5 W/sq m in the LW. The errors show little or no dependence on cloud phase, cloud optical depth, and cloud infrared emissivity. An analysis of cloud radiative forcing (CRF) sensitivity to differences between ERBE and CERES TRMM ADMs, scene identification, and directional models of albedo as a function of solar zenith angle shows that ADM and clear-sky scene identification differences can lead to an 8 W/sq m root-mean-square (rms) difference in 1° daily mean SW CRF and a 4 W/sq m rms difference in LW CRF. In contrast, monthly mean SW and LW CRF differences reach 3 W/sq m. CRF is found to be relatively insensitive to differences between the ERBE and CERES TRMM directional models.
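
The radiance-to-flux step at the heart of an ADM can be sketched compactly. A minimal illustration, assuming hypothetical anisotropic-factor values rather than actual CERES ADM tables:

```python
import numpy as np

def toa_flux_from_radiance(radiance_w_m2_sr, anisotropic_factor):
    """Convert a TOA radiance (W m^-2 sr^-1) to a flux (W m^-2).

    ADM-based flux estimation uses F = pi * L / R, where R is the
    scene- and angle-dependent anisotropic factor taken from an
    empirical angular distribution model (R = 1 for a Lambertian scene).
    """
    return np.pi * radiance_w_m2_sr / anisotropic_factor

# Hypothetical example: the same radiance interpreted under two ADMs.
L = 80.0                  # observed SW radiance, W m^-2 sr^-1
for R in (0.95, 1.10):    # illustrative anisotropic factors only
    print(f"R = {R:.2f} -> F = {toa_flux_from_radiance(L, R):.1f} W m^-2")
```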

Loeb, N. G.; Loukachine, K.; Wielicki, B. A.; Young, D. F.

2003-01-01

233

Verification and Validation of Model-Based Autonomous Systems  

NASA Technical Reports Server (NTRS)

This paper presents a three year project (FY99 to FY01) on the verification and validation of model based autonomous systems. The topics include: 1) Project Profile; 2) Model-Based Autonomy; 3) The Livingstone MIR; 4) MPL2SMV; 5) Livingstone to SMV Translation; 6) Symbolic Model Checking; 7) From Livingstone Models to SMV Models; 8) Application In-Situ Propellant Production; 9) Closed-Loop Verification Principle; 10) Livingstone PathFinder (LPF); 11) Publications and Presentations; and 12) Future Directions. This paper is presented in viewgraph form.

Pecheur, Charles; Koga, Dennis (Technical Monitor)

2001-01-01

234

Climate Model Validation Using Spectrally Resolved Shortwave Radiation Measurements  

NASA Astrophysics Data System (ADS)

The climate science community has made significant strides in the development and improvement of Global Climate Models (GCMs) to predict how the Earth's climate system will change over the next several decades. It is crucial to evaluate how well these models reproduce observed climate variability using strict validation techniques to assist the climate modeling community with improving GCM prediction. The ability of climate models to simulate Earth's present-day climate is an initial evaluation of their ability to predict future changes in climate. Models are evaluated in several ways, including model intercomparison projects and comparing model simulations of physical variables with observations of those variables. We are developing new methods for rigorous climate model validation and physical attribution of the cause of model errors using existing, direct measurements of hyperspectral shortwave reflectance. We have also developed a SCIAMACHY-based (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography) hyperspectral shortwave climate validation product to demonstrate using the product to validate GCMs. We are also investigating the information content added by using multispectral and hyperspectral data to study climate variability and validate climate models. The goal is to determine if it is necessary to use data with continuous spectral sampling across the shortwave spectral range, or if it is sufficient to use a subset of carefully selected spectral bands (e.g. MODIS-like data) to study climate trends and evaluate climate model performance. We are carrying out this activity by comparing the information content within broadband, multispectral (discrete-band sampling), and hyperspectral (high spectral resolution with continuous spectral sampling) data sets. Changes in climate-relevant atmospheric and surface variables impact the spectral, spatial, and temporal variability of Earth-reflected solar radiation (0.3-2.5 μm) through spectrally dependent scattering and absorption processes. Previous studies have demonstrated that highly accurate, hyperspectral (spectrally contiguous and overlapping) measurements of shortwave reflectance are important for monitoring climate variability from space. We are continuing to work to demonstrate that highly accurate, high information content hyperspectral shortwave measurements can be used to detect changes in climate, identify climate variance drivers, and validate GCMs.

Roberts, Y.; Taylor, P. C.; Lukashin, C.; Feldman, D.; Pilewskie, P.; Collins, W.

2013-12-01

235

Bayesian inference method for model validation and confidence extrapolation  

Microsoft Academic Search

This paper presents a Bayesian-hypothesis-testing-based methodology for model validation and confidence extrapolation under uncertainty, using limited test data. An explicit expression of the Bayes factor is derived for the interval hypothesis testing. The interval method is compared with the Bayesian point null hypothesis testing approach. The Bayesian network with Markov Chain Monte Carlo simulation and Gibbs sampling is explored for
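
A minimal sketch of the interval-hypothesis Bayes factor idea described above, assuming a Gaussian likelihood and prior; the numbers and the simple Monte Carlo scheme are illustrative, not the paper's derivation:

```python
import numpy as np

rng = np.random.default_rng(0)

def interval_bayes_factor(y, sigma, eps, tau, n_draws=200_000):
    """Monte Carlo Bayes factor for H0: |delta| <= eps vs H1: |delta| > eps.

    The model error delta has prior N(0, tau^2); the observed discrepancy
    y between model and test data is N(delta, sigma^2). The Bayes factor
    is the ratio of the prior-weighted likelihood averaged inside the
    validation interval to that averaged outside it.
    """
    delta = rng.normal(0.0, tau, n_draws)              # draws from the prior
    like = np.exp(-0.5 * ((y - delta) / sigma) ** 2)   # likelihood up to a constant
    inside = np.abs(delta) <= eps
    m0 = like[inside].mean()    # estimates p(y | H0)
    m1 = like[~inside].mean()   # estimates p(y | H1)
    return m0 / m1

# Hypothetical numbers: discrepancy 0.4, measurement sigma 0.5,
# validation tolerance eps 0.3, prior spread tau 1.0.
print(interval_bayes_factor(0.4, 0.5, 0.3, 1.0))
```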

Xiaomo Jiang; Sankaran Mahadevan

2009-01-01

236

Multiterminal subsystem model validation for pacific DC intertie  

Microsoft Academic Search

This paper proposes to validate the dynamic model of the Pacific DC Intertie using the concept of hybrid simulation, combining simulation with PMU measurements. The playback function available in GE PSLF is adopted for hybrid simulation. The feasibility of using the playback function on a multi-terminal subsystem is demonstrated for the first time. Sensitivity studies are also presented as a result of

Bo Yang; Zhenyu Huang; Dmitry Kosterev

2008-01-01

237

Linear Logic Validation and Hierarchical Modeling for Interactive Storytelling Control  

E-print Network

Linear Logic Validation and Hierarchical Modeling for Interactive Storytelling Control. We validate scenario and behavior with respect to predefined rules established by the designer, addressing Interactive Storytelling (IS) control using Linear Logic (LL). Then we present a "situation-based" hierarchical scenario

Paris-Sud XI, Université de

238

A Model for Investigating Predictive Validity at Highly Selective Institutions.  

ERIC Educational Resources Information Center

A statistical model for investigating predictive validity at highly selective institutions is described. When the selection ratio is small, one must typically deal with a data set containing relatively large amounts of missing data on both criterion and predictor variables. Standard statistical approaches are based on the strong assumption that…

Gross, Alan L.; And Others

239

A Formal Validation Model for the Netconf Sylvain Halle1  

E-print Network

for the network configuration operations. Its validate operation checks the configuration syntactically and semantically. By using an existing logical formalism called TQL [3], we express important semantic dependencies. Other work has proposed other approaches: some frameworks under development consist in enriching a UML model

Villemaire, Roger

240

Regions of validity for some rough surface scattering models  

Microsoft Academic Search

The objective is to examine the regions of validity that apply to the use of various models for describing rough surface scattering. The first area to be examined is how a small slope condition allows an integral representation of the physical optics (PO) cross section to be formed that does not require the high frequency geometric optics (GO) restriction. Next,

Robert J. Papa; John F. Lennon

1988-01-01

241

VALIDATION OF ACOUSTIC MODELS OF AUDITORY NEURAL PROSTHESES  

PubMed Central

Acoustic models have been used in numerous studies over the past thirty years to simulate the percepts elicited by auditory neural prostheses. In these acoustic models, incoming signals are processed the same way as in a cochlear implant speech processor. The percepts that would be caused by electrical stimulation in a real cochlear implant are simulated by modulating the amplitude of either noise bands or sinusoids. Despite their practical usefulness these acoustic models have never been convincingly validated. This study presents a tool to conduct such validation using subjects who have a cochlear implant in one ear and have near perfect hearing in the other ear, allowing for the first time a direct perceptual comparison of the output of acoustic models to the stimulation provided by a cochlear implant.

Svirsky, Mario A.; Ding, Nai; Sagi, Elad; Tan, Chin-Tuan; Fitzgerald, Matthew; Glassman, E. Katelyn; Seward, Keena; Neuman, Arlene C.

2014-01-01

242

Solar swimming pool heating: Description of a validated model  

SciTech Connect

In the framework of a European Demonstration Programme, co-financed by CEC and national bodies, a model was elaborated and validated for open-air swimming pools having a minimal surface of 100 m² and a minimal depth of 0.5 m. The model consists of two parts, the energy balance of the pool and the solar plant. The theoretical background of the energy balance of an open-air swimming pool was found to be poor. Special monitoring campaigns were used to validate the dynamic model using mathematical parameter identification methods. The final model was simplified in order to shorten calculation time and to improve the user-friendliness by reducing the input values to the most important ones. The programme is commercially available. However, it requires the hourly meteorological data of a test reference year (TRY) as an input. The users are mainly designing engineers.
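
To illustrate the pool-side energy balance such a model rests on, here is a minimal sketch; all coefficients (absorptance, convection, evaporation) are assumed round numbers, not the validated programme's values:

```python
import numpy as np

# Minimal sketch of an open-air pool energy balance (all coefficients
# hypothetical, not those of the validated programme described above).
RHO_W, CP_W = 1000.0, 4186.0   # water density (kg/m^3), heat capacity (J/kg/K)

def pool_temperature_step(T, dt, area, depth, G_solar, T_air,
                          h_conv=10.0, alpha=0.85, e_evap=150.0):
    """Advance pool temperature T (deg C) by dt seconds.

    Gains:  absorbed solar radiation alpha * G_solar (W/m^2).
    Losses: convection h_conv * (T - T_air) and a crude constant
            evaporation term e_evap (W/m^2).
    """
    q_net = alpha * G_solar - h_conv * (T - T_air) - e_evap   # W/m^2
    dT = q_net * area * dt / (RHO_W * CP_W * area * depth)
    return T + dT

T = 20.0
for hour in range(24):  # one synthetic day of hourly meteorological data
    G = max(0.0, 800.0 * np.sin(np.pi * (hour - 6) / 12))  # toy irradiance
    T = pool_temperature_step(T, 3600.0, area=100.0, depth=1.5,
                              G_solar=G, T_air=22.0)
print(f"pool temperature after one day: {T:.1f} C")
```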

Haaf, W.; Luboschik, U.; Tesche, B. (IST Energietechnik GmbH, Hauptsitz Wollbach, Kandern (Germany))

1994-07-01

243

Analysis of Experimental Data for High Burnup PWR Spent Fuel Isotopic Validation - Vandellos II Reactor  

SciTech Connect

This report is one of several recent NUREG/CR reports documenting benchmark-quality radiochemical assay data and the use of the data to validate computer code predictions of isotopic composition for spent nuclear fuel, and to establish the uncertainty and bias associated with code predictions. The experimental data analyzed in the current report were acquired from a high-burnup fuel program coordinated by Spanish organizations. The measurements included extensive actinide and fission product data of importance to spent fuel safety applications, including burnup credit, decay heat, and radiation source terms. Six unique spent fuel samples from three uranium oxide fuel rods were analyzed. The fuel rods had a 4.5 wt % ²³⁵U initial enrichment and were irradiated in the Vandellos II pressurized water reactor operated in Spain. The burnups of the fuel samples range from 42 to 78 GWd/MTU. The measurements were used to validate the two-dimensional depletion sequence TRITON in the SCALE computer code system.

Ilas, Germina [ORNL; Gauld, Ian C [ORNL

2011-01-01

244

Validating and Calibrating Agent-Based Models: A Case Study  

Microsoft Academic Search

In this paper we deal with some validation and calibration experiments on a modified version of the Complex Adaptive Trivial System (CATS) model proposed in Gallegati et al. (2005, Journal of Economic Behavior and Organization, 56, 489-512). The CATS model has been extensively used to replicate a large number of scaling-type stylized facts with a remarkable degree of precision. For

Carlo Bianchi; Pasquale Cirillo; Mauro Gallegati; Pietro A. Vagliasindi

2007-01-01

245

Modelling ice melting processes: numerical and experimental validation  

Microsoft Academic Search

Purpose – This work is devoted to the experimental analysis, numerical modelling and validation of ice melting processes. Design/methodology/approach – The thermally coupled incompressible Navier-Stokes equations including water density inversion and isothermal phase-change phenomena are assumed as the governing equations of the problem. A fixed-mesh finite element formulation is proposed for the numerical solution of such a model. In particular, this

Marcela Cruchaga; Diego Celentano

2007-01-01

246

Experimentally validated finite element model of electrocaloric multilayer ceramic structures  

NASA Astrophysics Data System (ADS)

A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.
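
A minimal sketch of this kind of transient conduction model, with the electrocaloric effect entering as a volumetric source term; geometry, material constants and the source magnitude are assumptions, and the radiative and convective terms of the actual model are omitted here:

```python
import numpy as np

nx = ny = 50
dx = 1e-4                     # grid spacing (m)
alpha = 1e-6                  # thermal diffusivity (m^2/s)
rho_cp = 2.5e6                # volumetric heat capacity (J/m^3/K)
dt = 0.2 * dx**2 / alpha      # stable explicit time step
T = np.full((nx, ny), 25.0)   # initial temperature (deg C)

q_ec = 5e6                    # electrocaloric source while field is on (W/m^3)
for step in range(500):
    source = q_ec if step * dt < 0.1 else 0.0   # field on for the first 100 ms
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T += dt * (alpha * lap + source / rho_cp)
    T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 25.0   # isothermal boundary
print(f"peak temperature rise: {T.max() - 25.0:.2f} K")
```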

Smith, N. A. S.; Rokosz, M. K.; Correia, T. M.

2014-07-01

247

Validation and upgrading of physically based mathematical models  

NASA Technical Reports Server (NTRS)

The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

Duval, Ronald

1992-01-01

248

Validating the BHR RANS model for variable density turbulence  

SciTech Connect

The BHR RANS model is a turbulence model for multi-fluid flows in which density variation plays a strong role in the turbulence processes. In this paper they demonstrate the usefulness of BHR over a wide range of flows which include the effects of shear, buoyancy, and shocks. The results are in good agreement with experimental and DNS data across the entire set of validation cases, with no need to retune model coefficients between cases. The model has potential application to a number of aerospace related flow problems.

Israel, Daniel M [Los Alamos National Laboratory; Gore, Robert A [Los Alamos National Laboratory; Stalsberg - Zarling, Krista L [Los Alamos National Laboratory

2009-01-01

249

Propeller aircraft interior noise model utilization study and validation  

NASA Technical Reports Server (NTRS)

Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

Pope, L. D.

1984-01-01

250

Time domain model validation of a nonlinear block-oriented structure  

NASA Astrophysics Data System (ADS)

A crystal detector is studied as a valid candidate reference element for the phase calibration of the large-signal network analyzer (LSNA) for modulated excitations. However, one cannot use a crystal detector straightforwardly as a phase reference element, since a crystal detector inherently introduces phase distortions. Hence, the identification and validation of a parametric black-box model of the detector is needed. In this work, a nonlinear feedback model for a crystal detector is constructed: the model contains a low-pass filter in the feedforward path and a Wiener system in the feedback loop. The model is estimated from baseband data and needs to be validated for use with high-frequency signals: the model needs to be extrapolated to RF frequencies, which is quite an arduous task. The validation of the extrapolated model is performed using two approaches: (i) the RF narrow band modulated signal is applied to the extrapolated model structure and the output signal is computed; (ii) the physical representation of the model structure is translated into its differential equation and by means of this equation, the low-frequency output envelope is computed for RF input signals. The second approach requires a slight modification of the extracted model compared to the model used in the first approach, with respect to the function describing the static nonlinear behavior. The deviation between the modeled output envelope and the measured output envelope is evaluated for both approaches. The method (ii) that computes the output of the detector in the time domain by means of solving a differential equation that characterizes the identified nonlinear feedback model gives the overall best results for predicting the magnitude and the phase of the detector output spectrum and the amplitude behavior of the time domain output waveform. The mean deviation in magnitude between the modeled and measured envelope equals 2.6 dB. This approach significantly outperforms the first method (i) as the mean deviation between the phase of the modeled and measured envelope equals 8.2°.

Gommé, Liesbeth; Rolain, Yves; Schoukens, Johan; Pintelon, Rik

2009-10-01

251

Challenges of Validating Global Assimilative Models of the Ionosphere  

NASA Astrophysics Data System (ADS)

This paper addresses the often surprisingly difficult challenges that arise in conceptually simple validations of global models of the ionosphere. AFRL has been tasked with validating the Utah State University GAIM (Global Assimilation of Ionospheric Measurements) model of the ionosphere, which is run in real time by the Air Force Weather Agency. The USU-GAIM model currently assimilates, in addition to the voluminous GPS TEC data, in situ densities from DMSP satellites, UV radiances from SSUSI sensors on the DMSP satellites, and vertical profiles provided by a limited number of Digisondes. AFRL has performed a large number of USU-GAIM validations, using as ground truth values of foF2 and M(3000)F2 from non-assimilated ionograms, the in situ electron density at ~400 km provided by CHAMP, and the vertical TEC provided over ocean areas by TOPEX and JASON. USU GAIM runs at AFRL in about one-third real time. For validations against ionogram characteristics, AFRL usually works with a full month of GAIM and Digisonde data, which takes ~10 days to run. The long run times make it difficult to address essential "what if" scenarios, except for limited time intervals. Compounded with the problem of long run times is the fact that the UV observations are from a satellite that is only very rarely in near conjunction with the ground-truth satellites such as CHAMP and JASON, or near ground-based ionosondes. Exacerbating this problem even further is the fact that the most reliable assimilated UV data is from the evening equatorial ionosphere. It is often not possible to obtain useful ionogram characteristics for the evening equatorial ionosphere because of the occurrence of irregularities that lead to spread F echoes on the ionograms. We will discuss the impact of these various challenges on the lessons that can be learned from validation studies of global ionospheric models.

Bishop, G. J.; McNamara, L. F.; Welsh, J. A.; Decker, D. T.; Baker, C. R.

2008-12-01

252

Model selection and validation of extreme distribution by goodness-of-fit test based on conditional position  

NASA Astrophysics Data System (ADS)

In Extreme Value Theory, an important aspect of model extrapolation is modelling the extreme behavior, because the choice of extreme value distribution affects the prediction that is about to be made. Thus, model validation, known as a Goodness-of-fit (GoF) test, is necessary. In this study, GoF tests were used to fit the Generalized Extreme Value (GEV) Type-II model against the simulated observed values. The location (μ), scale (σ) and shape (ξ) parameters were estimated by Maximum Likelihood. The critical values based on conditional points were developed by Monte Carlo simulation, and the powers of the tests were identified by a power study. Data distributed according to the GEV Type-II distribution were used to test whether the critical values developed are able to confirm the fit between the GEV Type-II model and the data. To confirm the fit, the GoF test statistic should be smaller than the critical value.
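
The general recipe (fit by maximum likelihood, build critical values by Monte Carlo, compare the observed statistic) can be sketched as follows. The Kolmogorov-Smirnov statistic and all sample sizes here are illustrative stand-ins for the study's conditional-position statistics, and scipy's sign convention (c = -ξ, so c < 0 is the heavy-tailed Type-II case) is assumed:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def mc_critical_value(shape, loc, scale, n, n_sim=500, alpha=0.05):
    """Monte Carlo critical value for a KS-type GoF statistic under a GEV model.

    Simulate from the fitted GEV, refit each replicate by maximum
    likelihood, record the statistic; the (1 - alpha) quantile is the
    critical value.
    """
    d_stats = []
    for _ in range(n_sim):
        x = stats.genextreme.rvs(shape, loc, scale, size=n, random_state=rng)
        fit = stats.genextreme.fit(x)
        d_stats.append(stats.kstest(x, stats.genextreme.cdf, args=fit).statistic)
    return np.quantile(d_stats, 1 - alpha)

# Hypothetical data simulated from a Frechet-type (GEV Type-II) model.
data = stats.genextreme.rvs(-0.3, loc=10, scale=2, size=100, random_state=rng)
c, loc, scale = stats.genextreme.fit(data)            # ML estimates
d_obs = stats.kstest(data, stats.genextreme.cdf, args=(c, loc, scale)).statistic
d_crit = mc_critical_value(c, loc, scale, len(data))
print(f"D = {d_obs:.3f}, critical value = {d_crit:.3f}, fit ok: {d_obs < d_crit}")
```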

Abidin, Nahdiya Zainal; Adam, Mohd Bakri

2014-09-01

253

A process improvement model for software verification and validation  

NASA Technical Reports Server (NTRS)

We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

Callahan, John; Sabolish, George

1994-01-01

255

The TIGGE Model Validation Portal: An Improvement In Data Interoperability  

NASA Astrophysics Data System (ADS)

The THORPEX Interactive Grand Global Ensemble (TIGGE), a major component of the World Weather Research Programme, was created to help foster and accelerate the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. A key element of this effort is the ability of weather researchers to perform model forecast validation, a statistical procedure by which observational data is used to evaluate how well a numerical model forecast performs as a function of forecast time and model fields. The current methods available for obtaining model forecast verification data can be time-consuming. For example, a user may need to obtain observational, in-situ, and model forecast data from multiple providers and sources in order to carry out the verification process. In most cases, the user is required to download a set of data covering a larger domain and over a longer period of time than is necessary for the user's research. The data preparation challenge is exacerbated if the requested data sets are provided in inconsistent formats, requiring the user to convert the multiple datasets into a preferred common data format. The TIGGE model validation portal, a new product developed for the NCAR Research Data Archive (RDA), strives to solve this data interoperability problem by bringing together and providing observational, model forecast, and in-situ data into a single data package, and in a common data format. Developed to help augment TIGGE research and facilitate researchers' ability to validate TIGGE model forecasts, the portal allows users to submit a delayed-mode data request for the observational and model parameters of their choosing. Additionally, users have the option of requesting a temporal and spatial subset from the global dataset to fit their research needs. This convenience saves both time and storage resources, and allows users to focus their efforts on model verification and research.
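
Once observations and forecasts arrive matched and in a common format, the verification step itself is short. A sketch with synthetic data standing in for a delivered forecast/observation package:

```python
import numpy as np

def verify_forecasts(forecasts, observations):
    """Bias and RMSE of a forecast as a function of lead time.

    forecasts:    array (n_cases, n_lead_times) of model values
    observations: array (n_cases, n_lead_times) of matched observed values
    """
    err = forecasts - observations
    bias = err.mean(axis=0)
    rmse = np.sqrt((err ** 2).mean(axis=0))
    return bias, rmse

# Synthetic stand-in for a matched data package (values hypothetical).
rng = np.random.default_rng(2)
obs = 15 + 5 * rng.standard_normal((200, 8))            # e.g. 2 m temperature
fcst = obs + 0.5 + rng.standard_normal((200, 8)) * np.linspace(1, 3, 8)
bias, rmse = verify_forecasts(fcst, obs)
for lead, (b, r) in enumerate(zip(bias, rmse), start=1):
    print(f"day {lead}: bias {b:+.2f} K, rmse {r:.2f} K")
```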

Cram, T.; Schuster, D. C.; Wilcox, H.; Worley, S. J.

2011-12-01

256

Empirical Validation of the Thermal Model of a Passive Solar Test Cell  

E-print Network

A methodology of validation must be applied, including the verification of the numerical model, allowing one to really validate the optimized model. Keywords: building thermal simulation; model validation

Paris-Sud XI, Université de

257

A verification and validation process for model-driven engineering  

NASA Astrophysics Data System (ADS)

Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

Delmas, R.; Pires, A. F.; Polacsek, T.

2013-12-01

258

Robust cross-validation of linear regression QSAR models.  

PubMed

A quantitative structure-activity relationship (QSAR) model is typically developed to predict the biochemical activity of untested compounds from the compounds' molecular structures. "The gold standard" of model validation is the blindfold prediction when the model's predictive power is assessed from how well the model predicts the activity values of compounds that were not considered in any way during the model development/calibration. However, during the development of a QSAR model, it is necessary to obtain some indication of the model's predictive power. This is often done by some form of cross-validation (CV). In this study, the concepts of the predictive power and fitting ability of a multiple linear regression (MLR) QSAR model were examined in the CV context allowing for the presence of outliers. Commonly used predictive power and fitting ability statistics were assessed via Monte Carlo cross-validation when applied to percent human intestinal absorption, blood-brain partition coefficient, and toxicity values of saxitoxin QSAR data sets, as well as three known benchmark data sets with known outlier contamination. It was found that (1) a robust version of MLR should always be preferred over the ordinary-least-squares MLR, regardless of the degree of outlier contamination and that (2) the model's predictive power should only be assessed via robust statistics. The Matlab and java source code used in this study is freely available from the QSAR-BENCH section of www.dmitrykonovalov.org for academic use. The Web site also contains the java-based QSAR-BENCH program, which could be run online via java's Web Start technology (supporting Windows, Mac OSX, Linux/Unix) to reproduce most of the reported results or apply the reported procedures to other data sets. PMID:18826208
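
A minimal sketch of the comparison the study describes: Monte Carlo cross-validation of ordinary versus robust regression on outlier-contaminated data, summarized with a robust error statistic. The synthetic descriptors and the use of scikit-learn's HuberRegressor are assumptions, not the paper's exact estimator:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 150, 10
X = rng.standard_normal((n, p))              # stand-in molecular descriptors
y = X @ rng.uniform(-1, 1, p) + 0.3 * rng.standard_normal(n)
y[:10] += rng.choice([-8.0, 8.0], size=10)   # outlier-contaminated activities

def mc_cv_error(model, n_splits=100):
    """Monte Carlo CV: repeated random train/test splits; the robust
    median absolute prediction error summarizes each split."""
    errs = []
    for seed in range(n_splits):
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2,
                                              random_state=seed)
        pred = model.fit(Xtr, ytr).predict(Xte)
        errs.append(np.median(np.abs(yte - pred)))
    return float(np.mean(errs))

print("OLS  :", round(mc_cv_error(LinearRegression()), 3))
print("Huber:", round(mc_cv_error(HuberRegressor(max_iter=500)), 3))
```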

Konovalov, Dmitry A; Llewellyn, Lyndon E; Vander Heyden, Yvan; Coomans, Danny

2008-10-01

259

WASTES II model storage requirements benchmark testing  

SciTech Connect

A study was conducted to benchmark results obtained from using the Waste System Transportation and Economic Simulation - Version II (WASTES II) model against information published in the ''Spent Fuel Storage Requirements'' report (DOE/RL-84-1). The WASTES model was developed by PNL for use in evaluating the spent-fuel storage and transportation requirements and costs for the US Department of Energy (DOE). The ''Spent Fuel Storage Requirements'' report is issued annually by the DOE and provides both historical/projected spent fuel inventory data and storage requirements data based on information supplied directly from utilities. The objective of this study is to compare the total inventory and storage requirements documented in the ''Spent Fuel Storage Requirements'' report with similar data that results from use of the WASTES model. Three differences have been identified as a result of benchmark testing. Two minor differences are present in the total inventory projected and the equivalent metric tons of uranium of spent fuel requiring storage. These differences result from the way reinserted spent fuel is handled and the methods used to calculate mass equivalents. A third difference is found in the storage requirements for the case that uses intra-utility transshipment. This discrepancy is due to the Oyster Creek reactor, which is shown to not require additional storage in the Spent Fuel Storage Requirements report, even though there is no destination reactor of the same type within its utility. The discrepancy was corrected soon after the 1984 ''Spent Fuel Storage Requirements'' report was issued and does not appear in more recent documents (DOE/RL-85-2).

Shay, M.R.; Walling, R.C.; Altenhofen, M.K.

1986-09-01

260

Development of a Validated Model of Ground Coupling  

SciTech Connect

A research program at Brookhaven National Laboratory (BNL) studies ground coupling, the use of the earth as a heat source/sink or storage element for solar heat pump space conditioning systems. This paper outlines the analytical and experimental research to date toward the development of an experimentally validated model of ground coupling and, based on experimental results from December 1978 to September 1979, explores the sensitivity of present model predictions to variations in thermal conductivity and other factors. Ways in which the model can be further refined are discussed.

Metz, P.D.

1980-01-01

261

Verifying and Validating Proposed Models for FSW Process Optimization  

NASA Technical Reports Server (NTRS)

This slide presentation reviews Friction Stir Welding (FSW) and attempts to model the process in order to optimize and improve it. Studies are ongoing to validate and refine the model of metal flow in the FSW process. Slides show the conventional FSW process, two weld tool designs, and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include: (1) microstructure features, (2) flow streamlines, (3) steady-state nature, and (4) grain refinement mechanisms.

Schneider, Judith

2008-01-01

262

Concurrent Validity of the WISC-IV and DAS-II in Children with Autism Spectrum Disorder  

ERIC Educational Resources Information Center

Cognitive assessments are used for a variety of research and clinical purposes in children with autism spectrum disorder (ASD). This study establishes concurrent validity of the Wechsler Intelligence Scales for Children-fourth edition (WISC-IV) and Differential Ability Scales-second edition (DAS-II) in a sample of children with ASD with a broad…

Kuriakose, Sarah

2014-01-01

263

Open-Source MFIX-DEM Software for Gas-Solids Flows: Part II - Validation Studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas–solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen

2012-04-01

264

Open-source MFIX-DEM software for gas-solids flows: Part II Validation studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen [National Energy Technology Laboratory (NETL); Garg, Rahul [National Energy Technology Laboratory (NETL); Galvin, Janine [National Energy Technology Laboratory (NETL); Pannala, Sreekanth [ORNL

2012-01-01

265

Seine estuary modelling and AirSWOT measurements validation  

NASA Astrophysics Data System (ADS)

In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using hydrodynamic modelling of the Seine estuary. In this context, field measurements will be collected by different teams such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), and ADES (Data Access Groundwater). These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: the Seine River and its estuarine area, and the English Channel. These two simulations are currently being improved by testing different roughness coefficients and adding tributary inflows. Groundwater contributions will also be introduced (TUGOm development in progress). The model outputs will be validated using GPMR tide gauge data and measurements from the Topex/Poseidon and Jason-1/-2 altimeters for the year 2007.

Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

2013-04-01

266

Reuse of a Formal Model for Requirements Validation  

NASA Technical Reports Server (NTRS)

This paper reports experience from how a project engaged in the process of requirements analysis for evolutionary builds can reuse the formally specified design model produced for a similar, earlier project in the same domain. Two levels of reuse are described here. First, a formally specified generic design model was generated on one project to systematically capture the design commonality in a set of software monitors on board a spacecraft. These monitors periodically check for faults and invoke recovery software when needed. The paper summarizes the use of the design model to validate the software design of the various monitors on that first project. Secondly, the paper describes how the formal design model created for the first project was reused on a second, subsequent project. The model was reused to validate the evolutionary requirements for the second project's software monitors, which were being developed in a series of builds. Some mismatches due to the very different architectures on the two projects suggested changes to make the model more generic. In addition, several advantages to the reuse of the first project's formal model on the second project are reported.

Lutz, Robyn R.

1997-01-01

267

Simulation model verification and validation: increasing the users' confidence  

Microsoft Academic Search

This paper sets simulation model verification and validation (V&V) in the context of the process of performing a simulation study. Various different forms of V&V need to take place depending on the stage that has been reached. Since the phases of a study are performed in an iterative manner, so too are the various forms of V&V. A number of

Stewart Robinson

1997-01-01

268

Validating Complex Construction Simulation Models Using 3D Visualization  

Microsoft Academic Search

One of the primary impediments in the use of discrete-event simulation to plan and design construction operations is that decision-makers often do not have the means, the knowledge, and\/or the time to check the veracity and the validity of simulation models and thus have little confidence in the results. Visualizing simulated operations in 3D can be of substantial help in the

Vineet R. Kamat; Julio C. Martinez

2003-01-01

269

Test Exploration and Validation Using Transaction Level Models  

Microsoft Academic Search

The complexity of the test infrastructure and test strategies in systems-on-chip approaches the complexity of the functional design space. This paper presents test design space exploration and validation of test strategies and schedules using transaction level models (TLMs). Since many aspects of testing involve the transfer of a significant amount of test stimuli and responses, the communication-centric view of TLMs

Michael A. Kochte; Christian G. Zoellin; Michael E. Imhof; Rauf Salimi Khaligh; Martin Radetzki; Hans-joachim Wunderlich; Stefano Di Carlo; Paolo Prinetto

2009-01-01

270

Aggregating validity indicators embedded in Conners' CPT-II outperforms individual cutoffs at separating valid from invalid performance in adults with traumatic brain injury.  

PubMed

Continuous performance tests (CPT) provide a useful paradigm to assess vigilance and sustained attention. However, few established methods exist to assess the validity of a given response set. The present study examined embedded validity indicators (EVIs) previously found effective at dissociating valid from invalid performance in relation to well-established performance validity tests in 104 adults with TBI referred for neuropsychological testing. Findings suggest that aggregating EVIs increases their signal detection performance. While individual EVIs performed well at their optimal cutoffs, two specific combinations of these five indicators generally produced the best classification accuracy. At ≥3, the CVI-5A had a specificity of .92-.95 and a sensitivity of .45-.54. At ≥4, the CVI-5B had a specificity of .94-.97 and sensitivity of .40-.50. The CVI-5s provide a single numerical summary of the cumulative evidence of invalid performance within the CPT-II. Results support the use of a flexible, multivariate approach to performance validity assessment. PMID:24957927
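
The aggregation logic is simple to state: count failed embedded indicators and flag a protocol when the count meets a cutoff. A sketch on synthetic data (the per-indicator failure rates below are assumptions, not the study's):

```python
import numpy as np

def classification_accuracy(n_evis_failed, invalid, cutoff):
    """Sensitivity/specificity of an aggregated validity index at a cutoff.

    n_evis_failed: int array, number of embedded validity indicators
                   (0-5) failed by each examinee
    invalid:       bool array, criterion status from well-established
                   performance validity tests
    cutoff:        flag the record when n_evis_failed >= cutoff
    """
    flagged = n_evis_failed >= cutoff
    sens = flagged[invalid].mean()        # true positives / all invalid
    spec = (~flagged[~invalid]).mean()    # true negatives / all valid
    return sens, spec

# Synthetic illustration only: 5 binary EVIs, 104 examinees.
rng = np.random.default_rng(4)
invalid = rng.random(104) < 0.3
p_fail = np.where(invalid, 0.55, 0.08)    # assumed per-EVI failure rates
n_failed = rng.binomial(5, p_fail)
for cutoff in (3, 4):
    sens, spec = classification_accuracy(n_failed, invalid, cutoff)
    print(f">= {cutoff} of 5 EVIs: sensitivity {sens:.2f}, specificity {spec:.2f}")
```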

Erdodi, Laszlo A; Roth, Robert M; Kirsch, Ned L; Lajiness-O'neill, Renee; Medoff, Brent

2014-08-01

271

Multicomponent aerosol dynamics model UHMA: model development and validation  

Microsoft Academic Search

A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles

H. Korhonen; K. E. J. Lehtinen; M. Kulmala

2004-01-01

272

The two-track model of bereavement questionnaire (TTBQ): development and validation of a relational measure.  

PubMed

The Two-Track Model of Bereavement Questionnaire (TTBQ) was designed to assess response to loss over time. Respondents were 354 persons who completed the 70-item self-report questionnaire constructed in accordance with the Two-Track Model of Bereavement. Track I focuses on the bereaved's biopsychosocial functioning and Track II concerns the bereaved's ongoing relationship to the range of memories, images, thoughts, and feeling states associated with the deceased. Factor analysis identified 5 factors that accounted for 51% of the variance explained. In accord with the theoretical and clinical model, 3 factors were primarily associated with the relationship to the deceased (Track II): Active Relational Grieving, Close and Positive Relationship, and Conflictual Relationship; and 2 factors with aspects of functioning (Track I): General Biopsychosocial Functioning and Traumatic Perception of the Loss. Construct and concurrent validity were examined and were found satisfactory. Differences by kinship, cause of death, gender, and time elapsed were examined across the 5 factors, the total TTBQ, and the ITG. The new measure is shown to have both construct and concurrent validity. Discussions of the results and implications for the measurement of response to loss conclude the article. PMID:19368062

Rubin, Simon Shimshon; Nadav, Ofri Bar; Malkinson, Ruth; Koren, Dan; Goffer-Shnarch, Moran; Michaeli, Ella

2009-04-01

273

Validation of the PESTLA Model: Evaluation of the Validation Statuses of the Pesticide Leaching Models PRZM-1, LEACHP, GLEAMS and PELMO.  

National Technical Information Service (NTIS)

The validation statuses of the pesticide leaching models PRZM-1, LEACHP, GLEAMS and PELMO were assessed by literature study. The required range of validity included all the situations in which pesticides are applied in Dutch agriculture and horticulture. ...

H. van den Bosch, J. J. T. I. Boesten

1995-01-01

274

Shoulder model validation and joint contact forces during wheelchair activities  

PubMed Central

Chronic shoulder impingement is a common problem for manual wheelchair users. The loading associated with performing manual wheelchair activities of daily living is substantial and often at a high frequency. Musculoskeletal modeling and optimization techniques can be used to estimate the joint contact forces occurring at the shoulder to assess the soft tissue loading during an activity and to possibly identify activities and strategies that place manual wheelchair users at risk for shoulder injuries. The purpose of this study was to validate an upper extremity musculoskeletal model and apply the model to wheelchair activities for analysis of the estimated joint contact forces. Upper extremity kinematics and handrim wheelchair kinetics were measured over three conditions: level propulsion, ramp propulsion, and a weight relief lift. The experimental data were used as input to a subject-specific musculoskeletal model utilizing optimization to predict joint contact forces of the shoulder during all conditions. The model was validated using a mean absolute error calculation. Model results confirmed that ramp propulsion and weight relief lifts place the shoulder under significantly higher joint contact loading than level propulsion. In addition, they exhibit large superior contact forces that could contribute to impingement. This study highlights the potential impingement risk associated with both the ramp and weight relief lift activities. Level propulsion was shown to have a low relative risk of causing injury, but with consideration of the frequency with which propulsion is performed, this observation is not conclusive. PMID:20840833
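
The validation metric named in the abstract, mean absolute error between model-estimated and reference traces, takes only a few lines; the force curves below are hypothetical:

```python
import numpy as np

def mean_absolute_error(model_force, measured_force):
    """Validation metric comparing model-estimated and reference time
    series (here: joint contact force over one propulsion cycle)."""
    model_force = np.asarray(model_force)
    measured_force = np.asarray(measured_force)
    return np.mean(np.abs(model_force - measured_force))

# Hypothetical normalized force traces over one cycle (% body weight).
t = np.linspace(0, 1, 101)
measured = 40 + 25 * np.sin(np.pi * t) ** 2
modeled = measured + np.random.default_rng(5).normal(0, 3, t.size)
print(f"MAE = {mean_absolute_error(modeled, measured):.1f} %BW")
```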

Morrow, Melissa M.B.; Kaufman, Kenton R.; An, Kai-Nan

2010-01-01

275

Validation of the WATEQ4 geochemical model for uranium  

SciTech Connect

As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

1983-09-01

276

Statistical validation of high-dimensional models of growing networks  

NASA Astrophysics Data System (ADS)

The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of time necessary for the analysis of the complete data.
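
A one-parameter toy version of penalized likelihood maximization for a growing network, with a degree-kernel exponent standing in for the paper's high-dimensional models; the penalty weight is an assumption:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)

def simulate_attachments(n_steps, theta_true):
    """Grow a network one edge at a time with attachment kernel k^theta;
    record, for every event, the degree vector and the chosen node."""
    degrees = [1, 1]               # start from a single edge
    events = []
    for _ in range(n_steps):
        k = np.array(degrees, dtype=float)
        p = k ** theta_true
        p /= p.sum()
        chosen = rng.choice(len(degrees), p=p)
        events.append((k, chosen))
        degrees[chosen] += 1
        degrees.append(1)          # the new node enters with degree 1
    return events

def neg_penalized_loglik(theta, events, lam=0.01):
    """Negative log-likelihood of the observed attachments plus a
    quadratic penalty that stabilizes high-variance estimates."""
    ll = 0.0
    for k, chosen in events:
        w = k ** theta
        ll += np.log(w[chosen] / w.sum())
    return -ll + lam * theta ** 2

events = simulate_attachments(500, theta_true=1.0)
res = minimize_scalar(neg_penalized_loglik, bounds=(0.0, 3.0),
                      args=(events,), method="bounded")
print(f"estimated attachment exponent: {res.x:.2f}")
```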

Medo, Matúš

2014-03-01

277

Experimental validation of flexible robot arm modeling and control  

NASA Technical Reports Server (NTRS)

Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

Ulsoy, A. Galip

1989-01-01

278

Split D differential probe model validation using an impedance analyzer  

NASA Astrophysics Data System (ADS)

Benchmark and validation studies are presented that quantify the accuracy of computational models. An important factor in these studies is the ability to compare simulated impedance results with experimental data. In a majority of differential benchmark studies, the data acquisition is handled by a commercial eddy current instrument, which allows only a relative comparison of the data. In this study a novel data acquisition system allows for the collection of impedance data for differential probes. Details about the data collection, experimental procedure, model construction, and data comparison will be presented.

Mooers, Ryan D.; Knopp, Jeremy S.; Aldrin, John C.; Sathish, Shamachary

2014-02-01

279

Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models  

E-print Network

We use simulated SN Ia samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and the bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: 120 low-redshift (z training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input - w_recovered) ranging from -0.005 +/- 0.012 to -0.024 +/- 0.010. These biases a...

Mosher, J; Kessler, R; Astier, P; Marriner, J; Betoule, M; Sako, M; El-Hage, P; Biswas, R; Pain, R; Kuhlmann, S; Regnault, N; Frieman, J A; Schneider, D P

2014-01-01

280

Verification, validation, and certification of modeling and simulation applications  

Microsoft Academic Search

Certifying that a large-scale complex modeling and simulation (M&S) application can be used for a set of specific purposes is an onerous task, which involves complex evaluation processes throughout the entire M&S development life cycle. The evaluation processes consist of verification and validation activities, quality assurance, assessment of qualitative and quantitative elements, assessments by subject matter experts, and integration of

Osman Balci

2003-01-01

281

Quantitative validation of a deformable cortical surface model  

NASA Astrophysics Data System (ADS)

Accurate reconstruction of the human cerebral cortex from magnetic resonance (MR) images is important for brain morphometric analysis, image-guided surgery, and functional mapping. Previously, we have implemented a cortical surface reconstruction method that employs fuzzy segmentation, isosurfaces and deformable surface models. The accuracy of the fuzzy segmentation has been well-studied using simulated brain images. However, global quantitative validation of the cortical surface model has not been feasible due to the lack of a true representation of the cortical surface. In this paper, we have alternatively validated the deformable surface model used in one cortical surface reconstruction method by using a metasphere computational phantom. A metasphere is a mathematically defined three-dimensional (3-D) surface that has convolutions similar to the cortex. We simulated 500 image volumes using metaspheres with various numbers and degrees of convolutions. Different levels of Gaussian noise were also incorporated. Quantification of the differences between the reconstructed surfaces and the true metasphere surfaces provides a measure of the deformable model accuracy in relation to the properties of the modeled object and data quality.
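
A phantom in this spirit can be generated parametrically; the sinusoidal radius modulation below is an illustrative stand-in for the study's metasphere definition:

```python
import numpy as np

def metasphere(n_theta=128, n_phi=256, base_radius=1.0, amp=0.15, freq=6):
    """Generate a metasphere-like phantom: a sphere whose radius is
    modulated to create cortex-style convolutions. The exact functional
    form used in the cited study is not reproduced here."""
    theta = np.linspace(0, np.pi, n_theta)          # polar angle
    phi = np.linspace(0, 2 * np.pi, n_phi)          # azimuth
    th, ph = np.meshgrid(theta, phi, indexing="ij")
    r = base_radius * (1 + amp * np.sin(freq * th) * np.sin(freq * ph))
    x = r * np.sin(th) * np.cos(ph)
    y = r * np.sin(th) * np.sin(ph)
    z = r * np.cos(th)
    return x, y, z

x, y, z = metasphere()
print("surface grid:", x.shape)  # vertices of a ground-truth test surface
```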

Yu, Daphne N.; Xu, Chenyang; Rettmann, Maryam E.; Pham, Dzung L.; Prince, Jerry L.

2000-06-01

282

Concepts and Validation of the ESA MASTER Model  

NASA Astrophysics Data System (ADS)

MASTER-2005 is the new orbital debris reference model of the European Space Agency. It was developed by a team led by the Institute of Aerospace Systems. The model is based on the simulation of events and processes that lead to the generation of orbital debris. The majority of the debris generation mechanisms implemented in MASTER have been reviewed in the course of the project. The validation for debris objects larger than 1 mm was based on observation data gathered by the TIRA, Goldstone and Haystack radars and the ESA Space Debris Telescope (ESA-SDT). The PROOF-2005 validation tool has been used to simulate detections of orbital debris based on the analysis of geometrical and instrument parameters. The simulation results gathered using the observation scenario were compared with the actual observations. In this paper, the results of this population generation mechanism will be presented. New ESA-SDT data was used to further refine the simulation of the GEO object population. In MASTER-2001, in addition to the known fragmentations of the Ekran-2 satellite and the Titan Transtage, 11 artificial breakups had been introduced in order to bring PROOF simulations into alignment with measurement data. Using additional ESA-SDT observation data, the assumptions concerning the number, magnitude, time and position of the artificial breakups were reviewed and corrected. Small particle validation was performed based on returned space hardware impact data. The solar arrays of the Hubble Space Telescope returned by the Space Shuttle on missions STS-61 and

Oswald, M.; Stabroth, S.; Wiedemann, C.; Klinkrad, H.; Vörsmann, P.

283

Model calibration and validation of an impact test simulation  

SciTech Connect

This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.
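Since the surrogate-modelling step is the part most readers will want to reproduce, here is a minimal sketch of the idea: fit a cheap quadratic response surface to a handful of Design-of-Experiments runs and use it in place of the expensive finite element code. The variables (foam thickness, drop height), the DOE points, and all response values below are hypothetical.

```python
import numpy as np

# Hypothetical DOE: peak acceleration from FE runs over (foam thickness mm, drop height cm)
X = np.array([[6, 30], [6, 60], [12, 30], [12, 60], [9, 45],
              [9, 30], [9, 60], [6, 45], [12, 45]], float)
y = np.array([410., 760., 255., 470., 380., 300., 560., 520., 330.])  # invented responses

def features(X):
    # Full quadratic response surface in two variables
    t, h = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(t), t, h, t * h, t ** 2, h ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# The surrogate now stands in for the FE code, e.g. to predict a fourth setup:
x_new = np.array([[10.0, 50.0]])
print(f"predicted peak acceleration: {float(features(x_new) @ coef):.0f}")
```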

Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

2001-01-01

284

Modeling TCP Throughput: A Simple Model and Its Empirical Validation  

Microsoft Academic Search

In this paper we develop a simple analytic characterization of the steady state throughput, as a function of loss rate and round trip time for a bulk transfer TCP flow, i.e., a flow with an unlimited amount of data to send. Unlike the models in [6, 7, 10], our model captures not only the behavior of TCP's fast retransmit mechanism
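For reference, the closed form this abstract refers to, widely known as the PFTK model, gives steady-state throughput as a function of loss rate p, round-trip time RTT, and retransmission timeout T0. A minimal sketch, ignoring the receiver-window cap that the full model also includes:

```python
import math

def tcp_throughput(mss, rtt, p, t0, b=2):
    """Approximate steady-state TCP bulk-transfer throughput (bytes/s).

    Closed-form model from Padhye et al. (1998): the inverse-square-root-p
    law extended with the effect of retransmission timeouts.
      mss : maximum segment size (bytes)
      rtt : round-trip time (s)
      p   : steady-state loss probability
      t0  : initial retransmission timeout (s)
      b   : packets acknowledged per ACK (2 with delayed ACKs)
    """
    if p <= 0:
        raise ValueError("loss rate must be positive")
    denom = (rtt * math.sqrt(2 * b * p / 3)
             + t0 * min(1.0, 3 * math.sqrt(3 * b * p / 8)) * p * (1 + 32 * p ** 2))
    return mss / denom

# Example: 1460-byte segments, 100 ms RTT, 1% loss, 1 s timeout
print(f"{tcp_throughput(1460, 0.1, 0.01, 1.0):,.0f} bytes/s")
```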

Jitendra Padhye; Victor Firoiu; Donald F. Towsley; James F. Kurose

1998-01-01

285

Modeling Topaz-II system performance  

SciTech Connect

The US acquisition of the Topaz-II in-core thermionic space reactor test system from Russia provides a good opportunity to perform a comparison of the Russian reported data and the results from computer codes such as MCNP (Ref. 3) and TFEHX (Ref. 4). The comparison study includes both neutronic and thermionic performance analyses. The Topaz-II thermionic reactor is modeled with MCNP using actual Russian dimensions and parameters. The computation of the neutronic performance considers several important aspects such as the fuel enrichment and the location of the thermionic fuel elements (TFEs) in the reactor core. The neutronic analysis includes the calculation of both radial and axial power distributions, which are then used in the TFEHX code for electrical performance. The reactor modeled consists of 37 single-cell TFEs distributed in a 13-cm-radius zirconium hydride block surrounded by 8 cm of beryllium metal reflector. The TFEs use 90% enriched U-235 and molybdenum coated with a thin layer of W-184 for the emitter surface. Electrons emitted are captured by a collector surface, with a gap filled with cesium vapor between the collector and emitter surfaces. The collector surface is electrically insulated with alumina. Liquid NaK provides the cooling system for the TFEs. The axial thermal power distribution is obtained by dividing the TFE into 40 axial nodes. Comparison of the true axial power distribution with that produced by electrical heaters was also performed.
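As an illustration of the axial discretization mentioned above, the sketch below splits one TFE into 40 axial nodes and distributes its thermal power over them. The chopped-cosine shape and all dimensions are assumptions made only for illustration; in the study the profile comes from the MCNP calculation.

```python
import numpy as np

# Illustrative only: discretize an assumed chopped-cosine axial power shape
# into 40 nodes, the way an MCNP-computed profile would be binned in practice.
L = 0.375           # active emitter length (m) -- hypothetical value
P_tfe = 1.0         # total thermal power of one TFE (normalized)
n_nodes = 40
edges = np.linspace(0.0, L, n_nodes + 1)
z = 0.5 * (edges[:-1] + edges[1:])            # node midpoints

Le = 1.15 * L                                  # extrapolated length (assumed)
shape = np.cos(np.pi * (z - L / 2) / Le)       # chopped cosine, peak at midplane
node_power = P_tfe * shape / shape.sum()       # nodes sum to the TFE power

print(f"axial peaking factor: {node_power.max() * n_nodes / P_tfe:.3f}")
```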

Lee, H.H.; Klein, A.C. (Oregon State Univ., Corvallis (United States))

1993-01-01

286

Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression  

PubMed Central

Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them); homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments on the two organisms (remission validity). The relevance of this framework is then discussed regarding various animal models of depression. PMID:22738250

2011-01-01

287

Validation of coupled atmosphere-fire behavior models  

SciTech Connect

Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States); Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States); Riggan, P.J. [Forest Service, Riverside, CA (United States)

1998-12-31

288

Experimental Validation of Thermal Model of Counter Flow Microchannel Heat Exchangers Subjected to External Heat Flux  

Microsoft Academic Search

The effect of uniform external heat flux on the effectiveness of counter flow microchannel heat exchangers is experimentally studied in this article for validating an existing thermal model. The model validated in this study is a one dimensional model previously developed by the same authors. The model is validated to be independent of microchannel profile, hydraulic diameter, and heat capacity

Bobby Mathew; Hisham Hegab

2013-01-01

289

Validation of High Displacement Piezoelectric Actuator Finite Element Models  

NASA Technical Reports Server (NTRS)

The paper presents the results obtained by using the NASTRAN® and ANSYS® finite element codes to predict doming of the THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN® and ANSYS® used different methods for modeling piezoelectric effects. In NASTRAN®, a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS® processed the voltage directly using piezoelectric finite elements. The results of the finite element models were validated against the experimental results.

Taleghani, B. K.

2000-01-01

290

Rapid target gene validation in complex cancer mouse models using re-derived embryonic stem cells.  

PubMed

Human cancers modeled in Genetically Engineered Mouse Models (GEMMs) can provide important mechanistic insights into the molecular basis of tumor development and enable testing of new intervention strategies. The inherent complexity of these models, with often multiple modified tumor suppressor genes and oncogenes, has hampered their use as preclinical models for validating cancer genes and drug targets. In our newly developed approach for the fast generation of tumor cohorts we have overcome this obstacle, as exemplified for three GEMMs; two lung cancer models and one mesothelioma model. Three elements are central for this system; (i) The efficient derivation of authentic Embryonic Stem Cells (ESCs) from established GEMMs, (ii) the routine introduction of transgenes of choice in these GEMM-ESCs by Flp recombinase-mediated integration and (iii) the direct use of the chimeric animals in tumor cohorts. By applying stringent quality controls, the GEMM-ESC approach proves to be a reliable and effective method to speed up cancer gene assessment and target validation. As proof-of-principle, we demonstrate that MycL1 is a key driver gene in Small Cell Lung Cancer. PMID:24401838

Huijbers, Ivo J; Bin Ali, Rahmen; Pritchard, Colin; Cozijnsen, Miranda; Kwon, Min-Chul; Proost, Natalie; Song, Ji-Ying; de Vries, Hilda; Badhai, Jitendra; Sutherland, Kate; Krimpenfort, Paul; Michalak, Ewa M; Jonkers, Jos; Berns, Anton

2014-02-01

291

Rapid target gene validation in complex cancer mouse models using re-derived embryonic stem cells  

PubMed Central

Human cancers modeled in Genetically Engineered Mouse Models (GEMMs) can provide important mechanistic insights into the molecular basis of tumor development and enable testing of new intervention strategies. The inherent complexity of these models, with often multiple modified tumor suppressor genes and oncogenes, has hampered their use as preclinical models for validating cancer genes and drug targets. In our newly developed approach for the fast generation of tumor cohorts we have overcome this obstacle, as exemplified for three GEMMs; two lung cancer models and one mesothelioma model. Three elements are central for this system; (i) The efficient derivation of authentic Embryonic Stem Cells (ESCs) from established GEMMs, (ii) the routine introduction of transgenes of choice in these GEMM-ESCs by Flp recombinase-mediated integration and (iii) the direct use of the chimeric animals in tumor cohorts. By applying stringent quality controls, the GEMM-ESC approach proves to be a reliable and effective method to speed up cancer gene assessment and target validation. As proof-of-principle, we demonstrate that MycL1 is a key driver gene in Small Cell Lung Cancer. PMID:24401838

Huijbers, Ivo J; Bin Ali, Rahmen; Pritchard, Colin; Cozijnsen, Miranda; Kwon, Min-Chul; Proost, Natalie; Song, Ji-Ying; Vries, Hilda; Badhai, Jitendra; Sutherland, Kate; Krimpenfort, Paul; Michalak, Ewa M; Jonkers, Jos; Berns, Anton

2014-01-01

292

Deviatoric constitutive model: domain of strain rate validity  

SciTech Connect

A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for utilization in the solution of IBVPs that are of interest to those working in the field of shock physics. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.

Zocher, Marvin A [Los Alamos National Laboratory

2009-01-01

293

Validation of thermal models for a prototypical MEMS thermal actuator.  

SciTech Connect

This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the polycrystalline silicon test structures, as well as uncontrolled nonuniform changes in this quantity over time and during operation.

Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

2008-09-01

294

Validated models for predicting skin penetration from different vehicles.  

PubMed

The permeability of a penetrant through skin is controlled by the properties of the penetrants and the mixture components, which in turn relate to the molecular structures. Despite the well-investigated models for compound permeation through skin, the effect of vehicles and mixture components has not received much attention. The aim of this Quantitative Structure Activity Relationship (QSAR) study was to develop a statistically validated model for the prediction of skin permeability coefficients of compounds dissolved in different vehicles. Furthermore, the model can help with the elucidation of the mechanisms involved in the permeation process. With this goal in mind, the skin permeability of four different penetrants, each blended in 24 different solvent mixtures, was determined from diffusion cell studies using porcine skin. The resulting 96 kp values were combined with a previous dataset of 288 kp data for QSAR analysis. Stepwise regression analysis was used for the selection of the most significant molecular descriptors and development of several regression models. The selected QSAR employed two penetrant descriptors (Wiener topological index and total lipole moment), the boiling point of the solvent, and the difference between the melting point of the penetrant and the melting point of the solvent. The QSAR was validated internally, using a leave-many-out procedure, giving a mean absolute error of 0.454 for the log kp value of the test set. PMID:20816954
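A minimal sketch of the validation scheme described above: a linear QSAR on four descriptors scored with a leave-many-out procedure. The descriptor values and coefficients are synthetic; only the workflow (fit, repeatedly hold out a subset, report MAE on log kp) mirrors the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import ShuffleSplit, cross_val_score

rng = np.random.default_rng(0)

# Hypothetical descriptor matrix standing in for the four selected variables:
# Wiener index, total lipole, solvent boiling point, MPt(penetrant) - MPt(solvent)
X = rng.normal(size=(96, 4))
log_kp = X @ np.array([-0.4, -0.2, 0.3, -0.1]) + rng.normal(scale=0.4, size=96)

# Leave-many-out validation: repeatedly hold out ~25% and score MAE on log kp
lmo = ShuffleSplit(n_splits=50, test_size=0.25, random_state=0)
mae = -cross_val_score(LinearRegression(), X, log_kp,
                       scoring="neg_mean_absolute_error", cv=lmo)
print(f"leave-many-out MAE on log kp: {mae.mean():.3f}")
```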

Ghafourian, Taravat; Samaras, Eleftherios G; Brooks, James D; Riviere, Jim E

2010-12-23

295

A proposal to use geoid slope validation lines to validate models of geoid change  

NASA Astrophysics Data System (ADS)

The United States National Geodetic Survey (NGS) has embarked on a ten year project called GRAV-D (Gravity for the Redefinition of the American Vertical Datum). The purpose of this project is to replace the current official vertical datum, NAVD 88 (the North American Vertical Datum of 1988), with a geopotential reference system based on a new survey of the gravity field and a gravimetric geoid. As part of GRAV-D, the National Geodetic Survey will develop a set of “geoid slope validation lines” at various locations of the country. These lines will be surveys designed to independently measure the slope of the geoid to provide a check against both the data and theory used to create the final gravimetric geoid which will be used in the geopotential reference system. The first of these lines is proposed to be established in the Autumn of 2011 in the west central region of Texas. The survey will be approximately 300 kilometers long, consisting of GPS, geodetic leveling, deflections of the vertical, and surface absolute and relative gravity measurements, including the use of relative meters for low-to-high surface gradient determination. This region was chosen for many reasons, including the availability of GRAV-D airborne gravity over the area, its relatively low elevation (a maximum orthometric height of 220 meters), its geoid slope (a few decimeters over 300 km according to the latest high-resolution models), lack of significant topographic relief, lack of large forestation, availability of good roads, clarity of weather and lack of large water crossings. Further lines are planned in the out-years, in more difficult areas, though their locations are not yet determined. Although the original intent of these lines was to serve as calibrations against geoid modeling data and theory, there may be additional uses relevant to geoid monitoring. A gap is being anticipated between the GRACE and GRACE-Follow On missions. GRACE has shown a quantifiable change (millimeters per year) in the geoid over parts of North America. As such, the GRAV-D project contains plans to monitor geoid change. However, without GRACE, some method of modeling geoid change and then testing that model must be developed. It is proposed, therefore, that as NGS develops more “geoid slope validation lines”, some consideration be given to placing one or more of them in areas of known, ongoing geoid change. Re-surveying of these lines would yield a direct, independent look at actual geoid change along the line. The sparseness and linear nature of such lines would not allow them to be used to directly create a continental model of geoid change, but they could stand as in-situ validations of models of geoid change coming from, say, a model of mantle and glacial dynamics.

Smith, D. A.

2010-12-01

296

Volumetric Intraoperative Brain Deformation Compensation: Model Development and Phantom Validation  

PubMed Central

During neurosurgery, nonrigid brain deformation may affect the reliability of tissue localization based on preoperative images. To provide accurate surgical guidance in these cases, preoperative images must be updated to reflect the intraoperative brain. This can be accomplished by warping these preoperative images using a biomechanical model. Due to the possible complexity of this deformation, intraoperative information is often required to guide the model solution. In this paper, a linear elastic model of the brain is developed to infer volumetric brain deformation associated with measured intraoperative cortical surface displacement. The developed model relies on known material properties of brain tissue, and does not require further knowledge about intraoperative conditions. To provide an initial estimation of volumetric model accuracy, as well as determine the model’s sensitivity to the specified material parameters and surface displacements, a realistic brain phantom was developed. Phantom results indicate that the linear elastic model significantly reduced localization error due to brain shift, from >16 mm to under 5 mm, on average. In addition, though in vivo quantitative validation is necessary, preliminary application of this approach to images acquired during neocortical epilepsy cases confirms the feasibility of applying the developed model to in vivo data. PMID:22562728

DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.

2012-01-01

297

Modeling of a Foamed Emulsion Bioreactor: I. Model Development and Experimental Validation  

E-print Network

In the current paper, a diffusion and reaction model of the foamed emulsion bioreactor (FEBR) is presented and discussed. The model considers the fate of the volatile pollutant in the emulsion that constitutes the liquid films

298

MODELLING FIRE IN TUNNELS: A LARGE SCALE VALIDATED TWO STEPS MODELLING METHOD  

E-print Network

KEYWORDS: CFD modelling, real fire experiment. INTRODUCTION (excerpt): Fire in tunnels can generate dramatic consequences. ABSTRACT (excerpt): Fire is a quite common phenomenon in tunnels, and being able to model its consequences with a good

Paris-Sud XI, Université de

299

Convergent validity of the Beck depression inventory-II with the reynolds adolescent depression scale in psychiatric inpatients.  

PubMed

The Beck Depression Inventory-II (BDI-II; Beck, Steer, & Brown, 1996) and the Reynolds Adolescent Depression Scale (RADS; Reynolds, 1987) were administered to 56 female and 44 male psychiatric inpatients whose ages ranged from 12 to 17 years old. The Cronbach coefficient alphas for the BDI-II and RADS were, respectively, .92 and .91, and indicated comparably high levels of internal consistency. The correlation between the BDI-II and RADS total scores was .84, p < .001. Binormal receiver-operating-characteristic analyses indicated that both instruments were comparably effective in differentiating inpatients who were and were not diagnosed with a major depressive disorder; the areas under the ROC curves for the BDI-II and RADS were, respectively, .78 and .76. The results (a) indicate that the BDI-II and the RADS have similar psychometric characteristics and (b) support the convergent validity of the BDI-II for assessing self-reported depression in adolescent inpatients. PMID:12146814
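The two statistics that carry this abstract, coefficient alpha and the area under the ROC curve, are straightforward to reproduce. A sketch on synthetic item-level data (all numbers invented; only the computations mirror the study):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def cronbach_alpha(items):
    """items: (n_subjects, n_items) array of item scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=100)                                    # latent severity
items = trait[:, None] + rng.normal(scale=1.0, size=(100, 21))  # 21 BDI-II-like items

print(f"alpha = {cronbach_alpha(items):.2f}")

# How well the total score separates diagnosed from non-diagnosed patients
diagnosed = (trait + rng.normal(scale=0.8, size=100)) > 0
print(f"AUC = {roc_auc_score(diagnosed, items.sum(axis=1)):.2f}")
```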

Krefetz, David G; Steer, Robert A; Gulab, Nazli A; Beck, Aaron T

2002-06-01

300

Contaminant transport model validation: The Oak Ridge Reservation  

SciTech Connect

In the complex geologic setting on the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data to be used in a groundwater flow model validation experiment. Characterization studies included the performance of an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values to describe the flow field in detail. Following aquifer testing, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near-field were used in model calibration to predict tracer arrival time and concentration in the far-field. Despite the extensive aquifer testing, initial modeling inaccurately predicted tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decay in tracer movement. Evaluation of the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the resolution required, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs.

Lee, R.R.; Ketelle, R.H.

1988-09-01

301

Multicomponent aerosol dynamics model UHMA: model development and validation  

NASA Astrophysics Data System (ADS)

A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.
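To make the sectional bookkeeping concrete, here is a minimal sketch of one fixed-grid step in the spirit of the hybrid method: grow each section by condensation, then redistribute its number concentration between the two bracketing fixed bins so that particle number and volume are both conserved. The grid, growth factor, and initial distribution are illustrative, not UHMA's actual scheme.

```python
import numpy as np

v_grid = np.logspace(-27, -21, 30)                         # fixed bin volumes (m^3)
N = np.exp(-0.5 * ((np.log10(v_grid) + 25) / 0.5) ** 2)    # number per bin

growth = 1.3                  # volume growth factor from condensation this step
v_new = v_grid * growth

N_next = np.zeros_like(N)
for vi, ni in zip(v_new, N):
    j = np.searchsorted(v_grid, vi) - 1
    if j >= len(v_grid) - 1:          # grew past the grid: keep in the last bin
        N_next[-1] += ni
        continue
    # Split so that both number and total volume are conserved:
    # f*v[j] + (1-f)*v[j+1] == vi
    f = (v_grid[j + 1] - vi) / (v_grid[j + 1] - v_grid[j])
    N_next[j] += f * ni
    N_next[j + 1] += (1 - f) * ni

assert np.isclose(N_next.sum(), N.sum())   # particle number conserved
```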

Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

2004-05-01

302

Multicomponent aerosol dynamics model UHMA: model development and validation  

NASA Astrophysics Data System (ADS)

A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H2SO4-H2O and ternary H2SO4-NH3-H2O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

2004-01-01

303

Image quality assessment in digital mammography: part II. NPWE as a validated alternative for contrast detail analysis  

NASA Astrophysics Data System (ADS)

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to threshold calculated detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
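A sketch of the radially symmetric NPWE detectability integral underlying this method, for a disc of diameter d and contrast C. The analytic MTF, NNPS, and eye-filter forms below are stand-in assumptions; in the study these quantities are measured from phantom images.

```python
import numpy as np
from scipy.special import j1
from scipy.integrate import simpson

d, C = 0.25, 0.05                  # disc diameter (mm), image contrast
f = np.linspace(1e-4, 10, 4000)    # spatial frequency (cycles/mm)

# Task function: contrast times the Fourier transform of a disc
S = C * (np.pi * d**2 / 4) * 2 * j1(np.pi * d * f) / (np.pi * d * f)
MTF = np.exp(-f / 2.5)                         # assumed detector MTF
NNPS = 1e-6 * (1 + 0.5 / (f + 0.05))           # assumed normalized NPS (mm^2)
E = f**1.3 * np.exp(-0.35 * f**2)              # eye filter (Burgess form, units illustrative)

# d'^2 = [int S^2 MTF^2 E^2 2*pi*f df]^2 / [int S^2 MTF^2 E^4 NNPS 2*pi*f df]
num = simpson(S**2 * MTF**2 * E**2 * 2 * np.pi * f, x=f) ** 2
den = simpson(S**2 * MTF**2 * E**4 * NNPS * 2 * np.pi * f, x=f)
print(f"d' = {np.sqrt(num / den):.2f}")
```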

Monnin, P.; Marshall, N. W.; Bosmans, H.; Bochud, F. O.; Verdun, F. R.

2011-07-01

304

CSC6870 Computer Graphics II Geometric Modeling  

E-print Network

Lecture slides covering geometric modeling: fundamental shapes and primitives (points/vertices; curves: lines, polylines; surfaces: triangle meshes, splines) and curve types (polynomials, Lagrange interpolation, Hermite curves, Bezier curves, B-splines, NURBS, subdivision schemes).
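Of the curve types listed in these slides, Bezier curves have the most compact evaluation scheme; a short de Casteljau example:

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t via repeated linear interpolation."""
    pts = np.asarray(control_points, float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Cubic Bezier in the plane
ctrl = [(0, 0), (1, 2), (3, 3), (4, 0)]
print(de_casteljau(ctrl, 0.5))   # point at the mid-parameter: [2.    1.875]
```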

Hua, Jing

305

Bolted connection modeling and validation through laser-aided testing  

NASA Astrophysics Data System (ADS)

Bolted connections are widely employed in facility structures, such as light masts, transmission poles, and wind turbine towers. The complex connection behavior plays a significant role in the overall dynamic characteristics of a structure. A finite element (FE) modeling study of a bolt-connected square tubular steel beam is presented in this paper. Modal testing was performed in a controlled laboratory condition to validate the FE model, developed for the bolted beam. Two laser Doppler vibrometers were used simultaneously to measure structural vibration. A simplified joint model was proposed to further save computation time for structures with bolted connections. This study is an on-going effort to marshal knowledge associated with detecting damage on facility structures with bolted connections.

Dai, Kaoshan; Gong, Changqing; Smith, Benjiamin

2013-04-01

306

Validation of GOCE densities and evaluation of thermosphere models  

NASA Astrophysics Data System (ADS)

Atmospheric densities from ESA’s GOCE satellite at a mean altitude of 270 km are validated by comparison with predictions from the near real time model HASDM along the GOCE orbit in the time frame 1 November 2009 through 31 May 2012. Except for a scale factor of 1.29, which is due to different aerodynamic models being used in HASDM and GOCE, the agreement is at the 3% (standard deviation) level when comparing daily averages. The models NRLMSISE-00, JB2008 and DTM2012 are compared with the GOCE data. They match at the 10% level, but significant latitude-dependent errors as well as errors with semiannual periodicity are detected. Using the 0.1 Hz sampled data leads to much larger differences locally, and this dataset can be used presently to analyze variations down to scales as small as 150 km.

Bruinsma, S. L.; Doornbos, E.; Bowman, B. R.

2014-08-01

307

Cross-Cultural Validation of the Beck Depression Inventory-II across U.S. and Turkish Samples  

ERIC Educational Resources Information Center

The purpose of this study was to test the Beck Depression Inventory-II (BDI-II) for factorial invariance across Turkish and U.S. college student samples. The results indicated that (a) a two-factor model has an adequate fit for both samples, thus providing evidence of configural invariance, and (b) there is a metric invariance but "no" sufficient…

Canel-Cinarbas, Deniz; Cui, Ying; Lauridsen, Erica

2011-01-01

308

Modeling and Validation of Damped Plexiglas Windows for Noise Control  

NASA Technical Reports Server (NTRS)

Windows are a significant path for structure-borne and air-borne noise transmission in general aviation aircraft. In this paper, numerical and experimental results are used to evaluate damped plexiglas windows for the reduction of structure-borne and air-borne noise transmitted into the interior of an aircraft. In contrast to conventional homogeneous windows, the damped plexiglas windows were fabricated using two or three layers of plexiglas with transparent viscoelastic damping material sandwiched between the layers. Transmission loss and radiated sound power measurements were used to compare different layups of the damped plexiglas windows with uniform windows of the same nominal thickness. This vibro-acoustic test data was also used for the verification and validation of finite element and boundary element models of the damped plexiglas windows. Numerical models are presented for the prediction of radiated sound power for a point force excitation and transmission loss for diffuse acoustic excitation. Radiated sound power and transmission loss predictions are in good agreement with experimental data. Once validated, the numerical models were used to perform a parametric study to determine the optimum configuration of the damped plexiglas windows for reducing the radiated sound power for a point force excitation.

Buehrle, Ralph D.; Gibbs, Gary P.; Klos, Jacob; Mazur, Marina

2003-01-01

309

Low Frequency Eddy Current Benchmark Study for Model Validation  

NASA Astrophysics Data System (ADS)

This paper presents results of an eddy current model validation study. Precise measurements were made using an impedance analyzer to investigate changes in impedance due to Electrical Discharge Machining (EDM) notches in aluminum plates. Each plate contained one EDM notch at an angle of 0, 10, 20, or 30 degrees from the normal of the plate surface. Measurements were made with the eddy current probe both scanning parallel and perpendicular to the notch length. The experimental response from the vertical and oblique notches will be reported and compared to results from different numerical simulation codes.

Mooers, R. D.; Cherry, M. R.; Knopp, J. S.; Aldrin, J. C.; Sabbagh, H. A.; Boehnlein, T. R.

2011-06-01

310

Comparison for biosorption modeling of heavy metals (Cr (III), Cu (II), Zn (II)) adsorption from wastewater by carrot residues  

Microsoft Academic Search

The removal of chromium (III), copper (II) and zinc (II) from aqueous solution by adsorption on carrot residues (CR) was studied. Biosorption of chromium (III), copper (II) and zinc (II) on CR were compared. It was shown that CR has high metal removal efficiency. The Freundlich and Langmuir models can describe the adsorption equilibrium of chromium (III), copper (II) and
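The two isotherms named in the abstract are one-liners to fit. A sketch with invented equilibrium data (Ce is the residual metal concentration, qe the uptake per gram of carrot residue):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data: dissolved metal Ce (mg/L) vs. uptake qe (mg/g)
Ce = np.array([5., 10., 25., 50., 100., 200.])
qe = np.array([4.8, 8.1, 13.9, 18.2, 21.5, 23.1])

def langmuir(C, qmax, KL):
    # Monolayer adsorption with saturation capacity qmax
    return qmax * KL * C / (1 + KL * C)

def freundlich(C, KF, n):
    # Empirical power-law isotherm
    return KF * C ** (1 / n)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[25, 0.05])
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[2, 2])
print(f"Langmuir:   qmax={qmax:.1f} mg/g, KL={KL:.3f} L/mg")
print(f"Freundlich: KF={KF:.2f}, n={n:.2f}")
```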

B. Nasernejad; T. Esslam Zadeh; B. Bonakdar Pour; M. Esmaail Bygi; A. Zamani

2005-01-01

311

Evaluation and cross-validation of Environmental Models  

NASA Astrophysics Data System (ADS)

Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a Commission of professional experts appointed by an established International Union or Association (e.g. IAGA for Geomagnetism and Aeronomy, ...) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given, indicating that different values for the Earth radius have been employed in different data processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more of them to be uncovered by careful, independent examination and benchmarking. The meter prototype, the standard unit of length, was adopted on 20 May 1875 during the Diplomatic Conference of the Meter and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent wild, uncontrolled dissemination of pseudo Environmental Models and Standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the marketplace have been built consistently with the same units system, and that they are based on identical definitions for the coordinate systems, etc. Therefore, preliminary analyses should be carried out under the control and authority of an established international professional Organization or Association, before any final political decision is made by ISO to select a specific Environmental Model, like for example IGRF and DGRF. Of course, Commissions responsible for checking the consistency of definitions, methods and algorithms for data processing might consider delegating specific tasks (e.g. bench-marking the technical tools, the calibration procedures, the methods of data analysis, and the software algorithms employed in building the different types of models, as well as their usage) to private, intergovernmental or international organizations/agencies (e.g. NASA, ESA, AGU, EGU, COSPAR, ...); eventually, the latter should report conclusions to the Commission members appointed by IAGA or any established authority like IUGG.

Lemaire, Joseph

312

STRUCTURAL VALIDATION OF SYSTEM DYNAMICS AND AGENT-BASED SIMULATION MODELS  

E-print Network

Author: Hassan Qudrat (hassanq@yorku.ca). KEYWORDS: Simulation; System Dynamics; Structural Validity. ABSTRACT (excerpt): ... the system dynamics modeling 'repertoire'. An illustration of a set of six tests for structural validity of both

Tesfatsion, Leigh

313

An automated technique to support the verification and validation of simulation models  

Microsoft Academic Search

Simulation modeling requires model validation and verification to ensure that computed results are worth being considered. While we cannot expect a magic solution to the general problem, automated techniques for particular aspects of validation and verification are feasible. In this paper, we propose a technique to deduce model properties automatically from simulation runs performed for verification and validation and to

Samuel K. Klock; Peter Kemper

2010-01-01

314

A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.  

ERIC Educational Resources Information Center

A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures pertaining to the model are discussed: measures of relationship, reliability, validity (content, criterion-oriented, and construct validation), and item analysis. The decision model is presented in…

Edmonston, Leon P.; Randall, Robert S.

315

Experimental validation of a numerical model for subway induced vibrations  

NASA Astrophysics Data System (ADS)

This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

Gupta, S.; Degrande, G.; Lombaert, G.

2009-04-01

316

Validating a spatially distributed hydrological model with soil morphology data  

NASA Astrophysics Data System (ADS)

Spatially distributed models are popular tools in hydrology claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation is often carried out only on discharge time series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for inputs of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km2 catchment in Switzerland with moderate topography and artificial drainage. We translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons. This resulted in a data set with high spatial coverage on which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to observed groundwater levels and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the groundwater level predictions were not accurate enough to be used for the prediction of saturated areas. Groundwater level dynamics were not adequately reproduced and the predicted spatial saturation patterns did not correspond to those estimated from the soil map. Our results indicate that an accurate prediction of the groundwater level dynamics of the shallow groundwater in our catchment that is subject to artificial drainage would require a model that better represents processes at the boundary between the unsaturated and the saturated zone. However, data needed for such a more detailed model are not generally available. This severely hampers the practical use of such models despite their usefulness for scientific purposes.

Doppler, T.; Honti, M.; Zihlmann, U.; Weisskopf, P.; Stamm, C.

2014-09-01

317

Defect distribution model validation and effective process control  

NASA Astrophysics Data System (ADS)

Assumption of the underlying probability distribution is an essential part of effective process control. In this article, we demonstrate how to improve the effectiveness of equipment monitoring and process-induced defect control through properly selecting, validating and using hypothetical distribution models. The testing method is based on probability plotting, which is made possible through order statistics. Since each ordered sample data point has a cumulative probability associated with it, which is calculated as a function of sample size, the validity of the assumption is readily judged by the linearity of the ordered sample data versus the deviate predicted by the assumed statistical model from the cumulative probability. A comparison is made between normal and lognormal distributions to illustrate how dramatically the distribution model can affect the control limit setting. Examples presented include defect data collected with the SP1 dark-field inspection tool on a variety of deposited and polished metallic and dielectric films. We find that the defect count distribution is in most cases approximately lognormal. We show that the normal distribution is an inadequate assumption, as clearly indicated by the non-linearity of the probability plots. Misuse of the normal distribution leads to an overly optimistic process control limit, typically 50% tighter than suggested by the lognormal distribution. The inappropriate control limit setting consequently results in an excursion rate at a level too high to be manageable. The lognormal distribution is a valid assumption because it is positively skewed, which adequately accounts for the fact that a defect count distribution typically has a long tail. In essence, use of the lognormal distribution is a suggestion that the long tail be treated as part of the process entitlement (capability) instead of process excursion. The adjustment of the expected process entitlement is reflected and quantified by the skewness of the lognormal distribution, yielding a more realistic estimate (defect count control limit). It is of particular importance to use a validated probability distribution when the sample size is small. A statistical process control (SPC) chart is generally constructed on the assumption of normality of the underlying population. Although this assumption is not true, as discussed above, the sample average will follow a normal distribution regardless of the underlying distribution, according to the central limit theorem. However, this practice requires a large sample, which is sometimes impractical, especially in the stage of process development and yield ramp-up, when the process control limit is, and has to be, a moving target, enabling rapid and constant yield learning with a minimal amount of production interruption and/or resource reallocation. In this work, we demonstrate that a validated statistical model such as the lognormal distribution allows us to monitor progress in a quantifiable and measurable way, and to tighten the control limits smoothly and systematically. To do so, we use the verified model to make a deduction about the expected defect count at a predetermined deviate, say 3σ. The estimation error, or range, is a function of sample variation, sample size, and the confidence level at which the estimation is being made. If we choose a fixed sample size and confidence level, the defectivity performance is explicitly defined and gauged by the estimate and the estimate error.
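The validity test described above, judging linearity on a probability plot and then setting the control limit from the fitted lognormal, can be sketched in a few lines. Data and parameters below are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
defects = rng.lognormal(mean=3.0, sigma=0.6, size=60)   # wafer defect counts

# Probability plot: the correlation of ordered data with model quantiles
# judges the fit (probplot returns (slope, intercept, r) as its second tuple)
r_norm = stats.probplot(defects, dist="norm")[1][2]
r_lognorm = stats.probplot(np.log(defects), dist="norm")[1][2]
print(f"fit correlation: normal {r_norm:.3f}, lognormal {r_lognorm:.3f}")

# Upper control limit at the 3-sigma-equivalent quantile (99.865th percentile)
mu, sigma = np.log(defects).mean(), np.log(defects).std(ddof=1)
ucl_lognormal = np.exp(mu + 3 * sigma)
ucl_normal = defects.mean() + 3 * defects.std(ddof=1)
print(f"UCL: lognormal {ucl_lognormal:.0f} vs. normal {ucl_normal:.0f}")
```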

Zhong, Lei

2003-07-01

318

Assessing uncertainty in pollutant wash-off modelling via model validation.  

PubMed

Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size which hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies. PMID:25169872
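A minimal sketch contrasting the two validation procedures on a small synthetic wash-off dataset: leave-one-out (LOO) versus Monte Carlo cross validation (MCCV), which repeatedly resamples a random hold-out set. All data are invented; only the procedure mirrors the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

rng = np.random.default_rng(3)
# Hypothetical small dataset: rainfall intensity, duration -> pollutant load
X = rng.uniform([20, 5], [120, 90], size=(15, 2))
y = 0.04 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(scale=0.5, size=15)

model = LinearRegression()
loo = -cross_val_score(model, X, y, cv=LeaveOneOut(),
                       scoring="neg_mean_absolute_error")
mccv = -cross_val_score(model, X, y,
                        cv=ShuffleSplit(n_splits=200, test_size=0.3, random_state=0),
                        scoring="neg_mean_absolute_error")
print(f"LOO MAE  : {loo.mean():.3f}")
print(f"MCCV MAE : {mccv.mean():.3f} (spread {mccv.std():.3f})")
```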

Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

2014-11-01

319

Predictive validity of behavioural animal models for chronic pain  

PubMed Central

Rodent models of chronic pain may elucidate pathophysiological mechanisms and identify potential drug targets, but whether they predict clinical efficacy of novel compounds is controversial. Several potential analgesics have failed in clinical trials, in spite of strong animal modelling support for efficacy, but there are also examples of successful modelling. Significant differences in how methods are implemented and results are reported means that a literature-based comparison between preclinical data and clinical trials will not reveal whether a particular model is generally predictive. Limited reports on negative outcomes prevents reliable estimate of specificity of any model. Animal models tend to be validated with standard analgesics and may be biased towards tractable pain mechanisms. But preclinical publications rarely contain drug exposure data, and drugs are usually given in high doses and as a single administration, which may lead to drug distribution and exposure deviating significantly from clinical conditions. The greatest challenge for predictive modelling is, however, the heterogeneity of the target patient populations, in terms of both symptoms and pharmacology, probably reflecting differences in pathophysiology. In well-controlled clinical trials, a majority of patients shows less than 50% reduction in pain. A model that responds well to current analgesics should therefore predict efficacy only in a subset of patients within a diagnostic group. It follows that successful translation requires several models for each indication, reflecting critical pathophysiological processes, combined with data linking exposure levels with effect on target. LINKED ARTICLES This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4 PMID:21371010

Berge, Odd-Geir

2011-01-01

320

Literature-derived bioaccumulation models for earthworms: Development and validation  

SciTech Connect

Estimation of contaminant concentrations in earthworms is a critical component in many ecological risk assessments. Without site-specific data, literature-derived uptake factors or models are frequently used. Although considerable research has been conducted on contaminant transfer from soil to earthworms, most studies focus on only a single location. External validation of transfer models has not been performed. The authors developed a database of soil and tissue concentrations for nine inorganic and two organic chemicals. Only studies that presented total concentrations in departed earthworms were included. Uptake factors and simple and multiple regression models of natural-log-transformed concentrations of each analyte in soil and earthworms were developed using data from 26 studies. These models were then applied to data from six additional studies. Estimated and observed earthworm concentrations were compared using nonparametric Wilcoxon signed-rank tests. Relative accuracy and quality of different estimation methods were evaluated by calculating the proportional deviation of the estimate from the measured value. With the exception of Cr, significant, single-variable (e.g., soil concentration) regression models were fit for each analyte. Inclusion of soil Ca improved model fits for Cd and Pb. Soil pH only marginally improved model fits. The best general estimates of chemical concentrations in earthworms were generated by simple ln-ln regression models for As, Cd, Cu, Hg, Mn, Pb, Zn, and polychlorinated biphenyls. No method accurately estimated Cr or Ni in earthworms. Although multiple regression models including pH generated better estimates for a few analytes, in general, the predictive utility gained by incorporating environmental variables was marginal.

Sample, B.E.; Suter, G.W. II; Beauchamp, J.J.; Efroymson, R.A.

1999-09-01

321

Systematic validation of disease models for pharmacoeconomic evaluations. Swiss HIV Cohort Study.  

PubMed

Pharmacoeconomic evaluations are often based on computer models which simulate the course of disease with and without medical interventions. The purpose of this study is to propose and illustrate a rigorous approach for validating such disease models. For illustrative purposes, we applied this approach to a computer-based model we developed to mimic the history of HIV-infected subjects at the greatest risk for Mycobacterium avium complex (MAC) infection in Switzerland. The drugs included as a prophylactic intervention against MAC infection were azithromycin and clarithromycin. We used a homogenous Markov chain to describe the progression of an HIV-infected patient through six MAC-free states, one MAC state, and death. Probability estimates were extracted from the Swiss HIV Cohort Study database (1993-95) and randomized controlled trials. The model was validated testing for (1) technical validity (2) predictive validity (3) face validity and (4) modelling process validity. Sensitivity analysis and independent model implementation in DATA (PPS) and self-written Fortran 90 code (BAC) assured technical validity. Agreement between modelled and observed MAC incidence confirmed predictive validity. Modelled MAC prophylaxis at different starting conditions affirmed face validity. Published articles by other authors supported modelling process validity. The proposed validation procedure is a useful approach to improve the validity of the model. PMID:10461580

Sendi, P P; Craig, B A; Pfluger, D; Gafni, A; Bucher, H C

1999-08-01

322

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical models.'' These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in single loop'' design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. (Westinghouse Savannah River Co., Aiken, SC (United States)); Chen, F.F.K. (Bechtel National, Inc., San Francisco, CA (United States))

1993-01-01

323

Simplified modeling of the EBR-II control rods  

Microsoft Academic Search

Simplified models of the Experimental Breeder Reactor II (EBR-II) control and safety rods have been developed to facilitate core modeling under various operational and shutdown conditions. A parametric study that addressed modeling approximations individually was performed on normal-worth (NW), high-worth (HW), and safety-rod (SR)-type control rods with consideration for worth changes due to (a) axial geometry simplifications, (b) increased axial

1995-01-01

324

NAIRAS aircraft radiation model development, dose climatology, and initial validation  

NASA Astrophysics Data System (ADS)

The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.

Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

2013-10-01

325

Geophysical Monitoring for Validation of Transient Permafrost Models (Invited)  

NASA Astrophysics Data System (ADS)

Permafrost is a widespread phenomenon at high latitudes and high altitudes and describes the permanently frozen state of the subsurface in lithospheric material. In the context of climate change, both, new monitoring and modelling techniques are required to observe and predict potential permafrost changes, e.g. the warming and degradation which may lead to the liberation of carbon (Arctic) and the destabilisation of permafrost slopes (mountains). Mountain permafrost occurrences in the European Alps are characterised by temperatures only a few degrees below zero and are therefore particularly sensitive to projected climate changes in the 21st century. Traditional permafrost observation techniques are mainly based on thermal monitoring in vertical and horizontal dimension, but they provide only weak indications of physical properties such as ice or liquid water content. Geophysical techniques can be used to characterise permafrost occurrences and to monitor their changes as the physical properties of frozen and unfrozen ground measured by geophysical techniques are markedly different. In recent years, electromagnetic, seismic but especially electrical methods have been used to continuously monitor permafrost occurrences and to detect long-term changes within the active layer and regarding the ice content within the permafrost layer. On the other hand, coupled transient thermal/hydraulic models are used to predict the evolution of permafrost occurrences under different climate change scenarios. These models rely on suitable validation data for a certain observation period, which is usually restricted to data sets of ground temperature and active layer depth. Very important initialisation and validation data for permafrost models are, however, ground ice content and unfrozen water content in the active layer. In this contribution we will present a geophysical monitoring application to estimate ice and water content and their evolution in time at a permafrost station in the Swiss Alps. These data are then used to validate the coupled mass and energy balance soil model COUP, which is used for long-term projections of the permafrost evolution in the Swiss Alps. For this, we apply the recently developed 4-phase model, which is based on simple petrophysical relationships and which uses geoelectric and seismic tomographic data sets as input data.. In addition, we use continuously measured electrical resistivity tomography data sets and soil moisture data in daily resolution to compare modelled ice content changes and geophysical observations in high temporal resolution. The results show still large uncertainties in both model approaches regarding the absolute ice content values, but much smaller uncertainties regarding the changes in ice and unfrozen water content. We conclude that this approach is well suited for the analysis of permafrost changes in both, model and monitoring studies, even though more efforts are needed for obtaining in situ ground truth data of ice content and porosity.

Hauck, C.; Hilbich, C.; Marmy, A.; Scherler, M.

2013-12-01

326

Method for generating exact Bianchi type II cosmological models  

SciTech Connect

A method for generating exact Bianchi type II cosmological models with a perfect fluid distribution of matter is presented. Two new classes of Bianchi type II solutions have been generated from Lorenz's solution (D. Lorenz, Phys. Lett. A 79, 19 (1980)). A detailed study of physical and kinematic properties of one of them has been carried out.

Hajj-Boutros, J.

1986-06-01

327

PIV validation of blood-heart valve leaflet interaction modelling.  

PubMed

The aim of this study was to validate the 2D computational fluid dynamics (CFD) results of a moving heart valve based on a fluid-structure interaction (FSI) algorithm with experimental measurements. Firstly, a pulsatile laminar flow through a monoleaflet valve model with a stiff leaflet was visualized by means of Particle Image Velocimetry (PIV). The inflow data sets were applied to a CFD simulation including blood-leaflet interaction. The measurement section with a fixed leaflet was enclosed into a standard mock loop in series with a Harvard Apparatus Pulsatile Blood Pump, a compliance chamber and a reservoir. Standard 2D PIV measurements were made at a frequency of 60 bpm. Average velocity magnitude results of 36 phase-locked measurements were evaluated at every 10 degrees of the pump cycle. For the CFD flow simulation, a commercially available package from Fluent Inc. was used in combination with inhouse developed FSI code based on the Arbitrary Lagrangian-Eulerian (ALE) method. Then the CFD code was applied to the leaflet to quantify the shear stress on it. Generally, the CFD results are in agreement with the PIV evaluated data in major flow regions, thereby validating the FSI simulation of a monoleaflet valve with a flexible leaflet. The applicability of the new CFD code for quantifying the shear stress on a flexible leaflet is thus demonstrated. PMID:17674341

Kaminsky, R; Dumont, K; Weber, H; Schroll, M; Verdonck, P

2007-07-01

328

Simplified modeling of the EBR-II control rods  

Microsoft Academic Search

Simplified models of EBR-II control and safety rods have been developed for core modeling under various operational and shutdown conditions. A parametric study was performed on normal worth, high worth, and safety rod type control rods. A summary of worth changes due to individual modeling approximations is tabulated. Worth effects due to structural modeling simplification are negligible. Fuel region homogenization

1995-01-01

329

Long-term ELBARA-II Assistance to SMOS Land Product and Algorithm Validation at the Valencia Anchor Station (MELBEX Experiment 2010-2013)  

NASA Astrophysics Data System (ADS)

The main activity of the Valencia Anchor Station (VAS) is currently now to support the validation of SMOS (Soil Moisture and Ocean Salinity) Level 2 and 3 land products (soil moisture, SM, and vegetation optical depth, TAU). With this aim, the European Space Agency (ESA) has provided the Climatology from Satellites Group of the University of Valencia with an ELBARA-II microwave radiometer under a loan agreement since September 2009. During this time, brightness temperatures (TB) have continuously been acquired, except during normal maintenance or minor repair interruptions. ELBARA-II is an L-band dual-polarization radiometer with two channels (1400-1418 MHz, 1409-1427 MHz). It is continuously measuring over a vineyard field (El Renegado, Caudete de las Fuentes, Valencia) from a 15 m platform with a constant protocol for calibration and angular scanning measurements with the aim to assisting the validation of SMOS land products and the calibration of the L-MEB (L-Band Emission of the Biosphere) -basis for the SMOS Level 2 Land Processor- over the VAS validation site. One of the advantages of using the VAS site is the possibility of studying two different environmental conditions along the year. While the vine cycle extends mainly between April and October, during the rest of the year the area remains under bare soil conditions, adequate for the calibration of the soil model. The measurement protocol currently running has shown to be robust during the whole operation time and will be extended in time as much as possible to continue providing a long-term data set of ELBARA-II TB measurements and retrieved SM and TAU. This data set is also showing to be useful in support of SMOS scientific activities: the VAS area and, specifically the ELBARA-II site, offer good conditions to control the long-term evolution of SMOS Level 2 and Level 3 land products and interpret eventual anomalies that may obscure sensor hidden biases. In addition, SM and TAU that are currently retrieved from the ELBARA-II TB data by inversion of the L-MEB model, can also be compared to the Level 2 and Level 3 SMOS products. L-band ELBARA-II measurements provide area-integrated estimations of SM and TAU that are much more representative of the soil and vegetation conditions at field scale than ground measurements (from capacitive probes for SM and destructive measurements for TAU). For instance, Miernecki et al., (2012) and Wigneron et al. (2012) showed that very good correlations could be obtained from TB data and SM retrievals obtained from both SMOS and ELBARA-II over the 2010-2011 time period. The analysis of the quality of these correlations over a long time period can be very useful to evaluate the SMOS measurements and retrieved products (Level 2 and 3). The present work that extends the analysis over almost 4 years now (2010-2013) emphasizes the need to (i) maintain the long-time record of ELBARA-II measurements (ii) enhance as much as possible the control over other parameters, especially, soil roughness (SR), vegetation water content (VWC) and surface temperature, to interpret the retrieved results obtained from both SMOS and ELBARA-II instruments.

Lopez-Baeza, Ernesto; Wigneron, Jean-Pierre; Schwank, Mike; Miernecki, Maciej; Kerr, Yann; Casal, Tania; Delwart, Steven; Fernandez-Moran, Roberto; Mecklenburg, Susanne; Coll Pajaron, M. Amparo; Salgado Hernanz, Paula

330

Improvement to and Validations of the QinetiQ Atmospheric Radiation Model (QARM)  

Microsoft Academic Search

The QinetiQ atmospheric radiation model (QARM) is a comprehensive model of the energetic radiation in the atmosphere. In this paper we report on the improvement and validation activities for this model. The improvements include the implementation of two additional cosmic ray models, new response matrix, dose rate and flight dose calculation facilities. Tests\\/validations of the model have been carried out

Fan Lei; Alex Hands; Simon Clucas; Clive Dyer; Pete Truscott

2006-01-01

331

Bioaerosol optical sensor model development and initial validation  

NASA Astrophysics Data System (ADS)

This paper describes the development and initial validation of a bioaerosol optical sensor model. This model was used to help determine design parameters and estimate performance of a new low-cost optical sensor for detecting bioterrorism agents. In order to estimate sensor performance in detecting biowarfare simulants and rejecting environmental interferents, use was made of a previously reported catalog of EEM (excitation/emission matrix) fluorescence cross-section measurements and previously reported multiwavelength-excitation biosensor modeling work. In the present study, the biosensor modeled employs a single high-power 365 nm UV LED source plus an IR laser diode for particle size determination. The sensor has four output channels: IR size channel, UV elastic channel and two fluorescence channels. The sensor simulation was used to select the fluorescence channel wavelengths of 400-450 and 450-600 nm. Using these selected fluorescence channels, the performance of the sensor in detecting simulants and rejecting interferents was estimated. Preliminary measurements with the sensor are presented which compare favorably with the simulation results.

Campbell, Steven D.; Jeys, Thomas H.; Eapen, Xuan Le

2007-04-01

332

Richards model revisited: validation by and application to infection dynamics.  

PubMed

Ever since Richards proposed his flexible growth function more than half a century ago, it has been a mystery that this empirical function has made many incredible coincidences with real ecological or epidemic data even though one of its parameters (i.e., the exponential term) does not seem to have clear biological meaning. It is therefore a natural challenge to mathematical biologists to provide an explanation of the interesting coincidences and a biological interpretation of the parameter. Here we start from a simple epidemic SIR model to revisit Richards model via an intrinsic relation between both models. Especially, we prove that the exponential term in the Richards model has a one-to-one nonlinear correspondence to the basic reproduction number of the SIR model. This one-to-one relation provides us an explicit formula in calculating the basic reproduction number. Another biological significance of our study is the observation that the peak time is approximately just a serial interval after the turning point. Moreover, we provide an explicit relation between final outbreak size, basic reproduction number and the peak epidemic size which means that we can predict the final outbreak size shortly after the peak time. Finally, we introduce a constraint in Richards model to address over fitting problem observed in the existing studies and then apply our method with constraint to conduct some validation analysis using the data of recent outbreaks of prototype infectious diseases such as Canada 2009 H1N1 outbreak, GTA 2003 SARS outbreak, Singapore 2005 dengue outbreak, and Taiwan 2003 SARS outbreak. Our new formula gives much more stable and precise estimate of model parameters and key epidemic characteristics such as the final outbreak size, the basic reproduction number, and the turning point, compared with earlier simulations without constraints. PMID:22889641

Wang, Xiang-Sheng; Wu, Jianhong; Yang, Yong

2012-11-21

333

Statistically-based Validation of Computer Simulation Models in Traffic Operations and Management  

Microsoft Academic Search

The process of model validation is crucial for the use of computer simulation models intransportation policy, planning and operations. The obstacles that must be overcome andthe issues that must be treated in performing a validation are laid out here. We describe ageneral process that emphasizes five essential ingredients for validation: context, data,uncertainty, feedback, and prediction. We use a test-bed to

Jerome Sacks; Nagui M. Rouphail; B. Brian Park

334

Development, validation and application of numerical space environment models  

NASA Astrophysics Data System (ADS)

Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the Vlasov description of plasma is carried out using the Vlasiator model. The test shows that the Vlasov equation for plasma in six-dimensionsional phase space is solved correctly b! y Vlasiator, that results are obtained beyond those of the magnetohydrodynamic (MHD) description of plasma and that global magnetospheric simulations using a hybrid-Vlasov model are feasible on current hardware. For the first time four global magnetospheric models using the MHD description of plasma (BATS-R-US, GUMICS, OpenGGCM, LFM) are run with identical solar wind input and the results compared to observations in the ionosphere and outer magnetosphere. Based on the results of the global magnetospheric MHD model GUMICS a hypothesis is formulated for a new mechanism of plasmoid formation in the Earth's magnetotail.

Honkonen, Ilja

2013-10-01

335

Validation of a Transient Rotating Reference Frame CFD Model  

NASA Astrophysics Data System (ADS)

Uni-directional impulse turbines are used for the extraction of wave energy by converting oscillating air flow generated by waves into uni-directional rotational energy. Due to the inconsistent nature of ocean waves, airflow within an OWC is bi-directional and inherently transient. Such complex fluid dynamics require a varying rotor RPM incorporated in the CFD simulation to adequately resolve the flow field during turbine startups and changing air flow direction. The software Numeca is used to introduce a user defined function which defines a varying rotor rpm in a three dimensional transient viscous simulation of air flow through a uni-directional turbine. A scaled turbine prototype is used in a wind tunnel to measure the rotors RPM and Torque. Additionally, a radial pressure profile is developed in front and behind of the rotor blades. The experimental data is used to validate the accuracy of this varying rotating reference frame CFD model.

Khoungui, Othmane; Ladd, Justin; Velez, Carlos

2011-11-01

336

Utilizing Chamber Data for Developing and Validating Climate Change Models  

NASA Technical Reports Server (NTRS)

Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, data from chambers was found to overestimate responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD, etc) and this limits what can be measured accurately. Chambers can be used to measure canopy level energy balance under controlled conditions and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in such a manner to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

Monje, Oscar

2012-01-01

337

A psychophysiological causal model of pain report validity  

Microsoft Academic Search

The validity of the pain report is vitally important but difficult to assess because pain is a personal experience. Human laboratory research affords an opportunity to investigate validity because one can measure the consistency and sensitivity of pain ratings produced in response to known stimuli. This article presents 2 levels of evidence characterizing the validity of the pain report measure.

C. Richard Chapman; Gary W. Donaldson; Yoshio Nakamura; Robert C. Jacobson; David H. Bradshaw; Jonathan Gavrin

2002-01-01

338

Traveling with Cognitive Tests: Testing the Validity of a KABC-II Adaptation in India  

ERIC Educational Resources Information Center

The authors evaluated the adequacy of an extensive adaptation of the American Kaufman Assessment Battery for Children, second edition (KABC-II), for 6- to 10-year-old Kannada-speaking children of low socioeconomic status in Bangalore, South India. The adapted KABC-II was administered to 598 children. Subtests showed high reliabilities, the…

Malda, Maike; van de Vijver, Fons J. R.; Srinivasan, Krishnamachari; Transler, Catherine; Sukumar, Prathima

2010-01-01

339

Computational modeling and experimental validation of the Legionella and Coxiella virulence-related  

E-print Network

Computational modeling and experimental validation of the Legionella and Coxiella virulence (received for review September 5, 2012) Legionella and Coxiella are intracellular pathogens that use to the identification and ex- perimental validation of 20 effectors from Legionella pneumophila, Legionella longbeachae

Pupko, Tal

340

PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II  

SciTech Connect

To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system where it can read magnet current settings for real time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for the efficient convergence on a useable tune. We show the real time comparisons of simulation and experiment and explore the successes and limitations of this close coupled approach.

Chambers, F W; Raymond, B A; Falabella, S; Lee, B S; Richardson, R A; Weir, J T; Davis, H A; Schultze, M E

2005-05-31

341

Dynamic models and model validation for PEM fuel cells using electrical circuits  

Microsoft Academic Search

This paper presents the development of dynamic models for proton exchange membrane (PEM) fuel cells using electrical circuits. The models have been implemented in MATLAB\\/SIMULINK and Pspice environments. Both the double-layer charging effect and the thermodynamic characteristic inside the fuel cell are included in the models. The model responses obtained at steady-state and transient conditions are validated by experimental data

Caisheng Wang; M. H. Nehrir; S. Shaw

2005-01-01

342

Dynamic models and model validation for PEM fuel cells using electrical circuits  

Microsoft Academic Search

This paper presents the development of dynamic models for proton exchange membrane (PEM) fuel cells using electrical circuits. The models have been implemented in MATLAB\\/SIMULINK and PSPICE environments. Both the double-layer charging effect and the thermodynamic characteristic inside the fuel cell are included in the models. The model responses obtained at steady-state and transient conditions are validated by experimental data

Caisheng Wang; M. Hashem Nehrir; Steven R. Shaw

2005-01-01

343

An approach to model validation and model-based prediction -- polyurethane foam case study.  

SciTech Connect

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the ''philosophy'' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating modifying and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a ''model supplement term'' when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapons response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 C, modeling the decomposition is critical to assessing a weapons response. 
In the validation analysis it is indicated that the model tends to ''exaggerate'' the effect of temperature changes when compared to the experimental results. The data, however, are too few and to restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are created and addressed. Hypothetical problems are used because no guidance was provided concern

Dowding, Kevin J.; Rutherford, Brian Milne

2003-07-01

344

Circulation Control Model Experimental Database for CFD Validation  

NASA Technical Reports Server (NTRS)

A 2D circulation control wing was tested in the Basic Aerodynamic Research Tunnel at the NASA Langley Research Center. A traditional circulation control wing employs tangential blowing along the span over a trailing-edge Coanda surface for the purpose of lift augmentation. This model has been tested extensively at the Georgia Tech Research Institute for the purpose of performance documentation at various blowing rates. The current study seeks to expand on the previous work by documenting additional flow-field data needed for validation of computational fluid dynamics. Two jet momentum coefficients were tested during this entry: 0.047 and 0.114. Boundary-layer transition was investigated and turbulent boundary layers were established on both the upper and lower surfaces of the model. Chordwise and spanwise pressure measurements were made, and tunnel sidewall pressure footprints were documented. Laser Doppler Velocimetry measurements were made on both the upper and lower surface of the model at two chordwise locations (x/c = 0.8 and 0.9) to document the state of the boundary layers near the spanwise blowing slot.

Paschal, Keith B.; Neuhart, Danny H.; Beeler, George B.; Allan, Brian G.

2012-01-01

345

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical ``models.`` These component models are functionally integrated to represent the plant. With today`s low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in ``single loop`` design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. [Westinghouse Savannah River Co., Aiken, SC (United States); Chen, F.F.K. [Bechtel National, Inc., San Francisco, CA (United States)

1993-02-01

346

Model of the Expansion of H II Region RCW 82  

NASA Astrophysics Data System (ADS)

This paper aims to resolve the problem of formation of young objects observed in the RCW 82 H II region. In the framework of a classical trigger model the estimated time of fragmentation is larger than the estimated age of the H II region. Thus the young objects could not have formed during the dynamical evolution of the H II region. We propose a new model that helps resolve this problem. This model suggests that the H II region RCW 82 is embedded in a cloud of limited size that is denser than the surrounding interstellar medium. According to this model, when the ionization-shock front leaves the cloud it causes the formation of an accelerating dense gas shell. In the accelerated shell, the effects of the Rayleigh-Taylor (R-T) instability dominate and the characteristic time of the growth of perturbations with the observed magnitude of about 3 pc is 0.14 Myr, which is less than the estimated age of the H II region. The total time t ?, which is the sum of the expansion time of the H II region to the edge of the cloud, the time of the R-T instability growth, and the free fall time, is estimated as 0.44 < t ? < 0.78 Myr. We conclude that the young objects in the H II region RCW 82 could be formed as a result of the R-T instability with subsequent fragmentation into large-scale condensations.

Krasnobaev, K. V.; Tagirova, R. R.; Kotova, G. Yu.

2014-05-01

347

Modeling of copper(II) and zinc(II) extraction from chloride media with Kelex 100  

SciTech Connect

The extraction of copper(II) and zinc(II) from acidic chloride solutions with protonated Kelex 100 (HL) was studied and the extraction isotherms were determined for systems containing individual metal ions and their mixtures. A chemical model was proposed and verified. It considers the coextraction of the following species: MCl{sub 4}(H{sub 2}L){sub 2}, MCl{sub 4}(H{sub 2}L){sub 2}{center_dot}HCl, MCl{sub 3}(H{sub 2}L), ML{sub 2}, and H{sub 2}L{center_dot}HCl. Zinc(II) is extracted as the metal ion pairs, while copper(II) can be extracted as the metal ion pair and the chelate. The model can be used to predict the effect of experimental conditions on extraction and coextraction of the metal ions considered.

Bogacki, M.B.; Zhivkova, S.; Kyuchoukov, G.; Szymanowski, J.

2000-03-01

348

Validation of the galactic cosmic ray and geomagnetic transmission models.  

PubMed

A very high-momentum resolution particle spectrometer called the Alpha Magnetic Spectrometer (AMS) was flown in the payload bay of the Space Shuttle in a 51.65 degrees x 380-km orbit during the last solar minimum. This spectrometer has provided the first high statistics data set for galactic cosmic radiation protons, and helium, as well as limited spectral data on carbon and oxygen nuclei in the International Space Station orbit. First measurements of the albedo protons at this inclination were also made. Because of the high-momentum resolution and high statistics, the data can be separated as a function of magnetic latitude. A related investigation, the balloon borne experiment with a superconducting solenoid spectrometer (BESS), has been flown from Lynn Lake, Canada and has also provided excellent high-resolution data on protons and helium. These two data sets have been used here to study the validity of two galactic cosmic ray models and the geomagnetic transmission function developed from the 1990 geomagnetic reference field model. The predictions of both the CREME96 and NASA/JSC models are in good agreement with the AMS data. The shape of the AMS measured albedo proton spectrum, up to 2 GeV, is in excellent agreement with the previous balloon and satellite observations. A new LIS spectrum was developed that is consistent with both previous and new BESS 3He observations. Because the astronaut radiation exposures onboard ISS will be highest around the time of the solar minimum, these AMS measurements and these models provide important benchmarks for future radiation studies. AMS-02 slated for launch in September 2003, will provide even better momentum resolution and higher statistics data. PMID:11855419

Badhwar, G D; Truong, A G; O'Neill, P M; Choutko, V

2001-06-01

349

Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model  

NASA Technical Reports Server (NTRS)

This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict Solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line of sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.

MacNeice, Peter

2009-01-01

350

Bidirectional reflectance function in coastal waters: modeling and validation  

NASA Astrophysics Data System (ADS)

The current operational algorithm for the correction of bidirectional effects from the satellite ocean color data is optimized for typical oceanic waters. However, versions of bidirectional reflectance correction algorithms, specifically tuned for typical coastal waters and other case 2 conditions, are particularly needed to improve the overall quality of those data. In order to analyze the bidirectional reflectance distribution function (BRDF) of case 2 waters, a dataset of typical remote sensing reflectances was generated through radiative transfer simulations for a large range of viewing and illumination geometries. Based on this simulated dataset, a case 2 water focused remote sensing reflectance model is proposed to correct above-water and satellite water leaving radiance data for bidirectional effects. The proposed model is first validated with a one year time series of in situ above-water measurements acquired by collocated multi- and hyperspectral radiometers which have different viewing geometries installed at the Long Island Sound Coastal Observatory (LISCO). Match-ups and intercomparisons performed on these concurrent measurements show that the proposed algorithm outperforms the algorithm currently in use at all wavelengths.

Gilerson, Alex; Hlaing, Soe; Harmel, Tristan; Tonizzo, Alberto; Arnone, Robert; Weidemann, Alan; Ahmed, Samir

2011-11-01

351

Narrowband VLF observations as validation of Plasmaspheric model  

NASA Astrophysics Data System (ADS)

PLASMON is a European Union FP7 project which will use observations of whistlers and field line resonances to construct a data assimilative model of the plasmasphere. This model will be validated by comparison with electron precipitation data derived from narrowband VLF observations of subionospheric propagation from the AARDDVARK network. A VLF receiver on Marion Island, located at 46.9° S 37.1° E (L = 2.60), is able to observe the powerful NWC transmitter in Australia over a 1.4 < L < 3.0 path which passes exclusively over the ocean. The signal is thus very strong and exhibits an excellent signal-to-noise ratio. Data from the UltraMSK narrowband VLF receiver on Marion Island are used to examine evidence of particle precipitation along this path, thereby inferring the rate at which electrons are scattered into the bounce loss cone. This path covers a small range of L-values so that there is little ambiguity in the source of any peturbations. Perturbations detected on the path during geomagnetic storms should predominantly be responses to energetic electron precipitation processes occurring inside the plasmasphere. Comparisons will be made to preliminary plasmaspheric results from the PLASMON project.

Collier, Andrew; Clilverd, Mark; Rodger, C. J.; Delport, Brett; Lichtenberger, János

2012-07-01

352

Microbial dynamics in the marine ecosystem model ERSEM II with decoupled carbon assimilation and nutrient uptake  

NASA Astrophysics Data System (ADS)

A description of the improvements in the microbial complex of the dynamical simulation model ERSEM is given and the results for a 130-box setup of the North Sea are discussed. The improvements affecting the microbial food web are the decoupling of the carbon assimilation from the nutrient uptake dynamics in the phytoplankton groups and the incorporation of nutrient uptake dynamics in the bacterioplankton. Simulation results of ERSEM II are presented and discussed in comparison with those of ERSEM I and validation data. Based on model results and emergent properties of the system it was possible to conclude that ERSEM II is able to model the full range of food webs from a system dominated by the microbial loop in the relatively oligotrophic offshore areas to a system dominated by the omnivorous food web in the eutrophic continental coastal area.

Baretta-Bekker, J. G.; Baretta, J. W.; Ebenhöh, W.

1997-12-01

353

Nonparametric model validations for hidden Markov models with applications in financial econometrics  

PubMed Central

We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

Zhao, Zhibiao

2011-01-01

354

Alaska North Slope Tundra Travel Model and Validation Study  

SciTech Connect

The Alaska Department of Natural Resources (DNR), Division of Mining, Land, and Water manages cross-country travel, typically associated with hydrocarbon exploration and development, on Alaska's arctic North Slope. This project is intended to provide natural resource managers with objective, quantitative data to assist decision making regarding opening of the tundra to cross-country travel. DNR designed standardized, controlled field trials, with baseline data, to investigate the relationships present between winter exploration vehicle treatments and the independent variables of ground hardness, snow depth, and snow slab thickness, as they relate to the dependent variables of active layer depth, soil moisture, and photosynthetically active radiation (a proxy for plant disturbance). Changes in the dependent variables were used as indicators of tundra disturbance. Two main tundra community types were studied: Coastal Plain (wet graminoid/moist sedge shrub) and Foothills (tussock). DNR constructed four models to address physical soil properties: two models for each main community type, one predicting change in depth of active layer and a second predicting change in soil moisture. DNR also investigated the limited potential management utility in using soil temperature, the amount of photosynthetically active radiation (PAR) absorbed by plants, and changes in microphotography as tools for the identification of disturbance in the field. DNR operated under the assumption that changes in the abiotic factors of active layer depth and soil moisture drive alteration in tundra vegetation structure and composition. Statistically significant differences in depth of active layer, soil moisture at a 15 cm depth, soil temperature at a 15 cm depth, and the absorption of photosynthetically active radiation were found among treatment cells and among treatment types. The models were unable to thoroughly investigate the interacting role between snow depth and disturbance due to a lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. 
This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taking during the modeling and validation phases of this project.

Harry R. Bader; Jacynthe Guimond

2006-03-01

355

Mitigation of turbidity currents in reservoirs with passive retention systems: validation of CFD modeling  

NASA Astrophysics Data System (ADS)

Sediment deposition by continuous turbidity currents may affect eco-environmental river dynamics in natural reservoirs and hinder the maneuverability of bottom discharge gates in dam reservoirs. In recent years, innovative techniques have been proposed to enforce the deposition of turbidity further upstream in the reservoir (and away from the dam), namely, the use of solid and permeable obstacles such as water jet screens , geotextile screens, etc.. The main objective of this study is to validate a computational fluid dynamics (CFD) code applied to the simulation of the interaction between a turbidity current and a passive retention system, designed to induce sediment deposition. To accomplish the proposed objective, laboratory tests were conducted where a simple obstacle configuration was subjected to the passage of currents with different initial sediment concentrations. The experimental data was used to build benchmark cases to validate the 3D CFD software ANSYS-CFX. Sensitivity tests of mesh design, turbulence models and discretization requirements were performed. The validation consisted in comparing experimental and numerical results, involving instantaneous and time-averaged sediment concentrations and velocities. In general, a good agreement between the numerical and the experimental values is achieved when: i) realistic outlet conditions are specified, ii) channel roughness is properly calibrated, iii) two equation k - ? models are employed iv) a fine mesh is employed near the bottom boundary. Acknowledgements This study was funded by the Portuguese Foundation for Science and Technology through the project PTDC/ECM/099485/2008. The first author thanks the assistance of Professor Moitinho de Almeida from ICIST and to all members of the project and of the Fluvial Hydraulics group of CEHIDRO.

Ferreira, E.; Alves, E.; Ferreira, R. M. L.

2012-04-01

356

Validation of transport models using additive flux minimization technique  

SciTech Connect

A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.

Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States)]; Groebner, R. J. [General Atomics, San Diego, California 92121 (United States)]; Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States)]; Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)]

2013-10-15
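
The core of the technique above is an optimization over an additional effective diffusivity. A toy version follows (mine, not the FACETS/DAKOTA implementation; the slab geometry, source strength and the "experimental" profile are invented for illustration). It recovers the extra diffusivity needed for a predicted steady-state profile to match a target profile.

    import numpy as np
    from scipy.optimize import minimize_scalar

    L, S, n_edge = 1.0, 2.0, 0.5        # slab width, uniform source, edge density
    x = np.linspace(0.0, L, 101)

    def predicted_profile(d_model, d_add):
        """Steady density for -D n'' = S with n(0) = n(L) = n_edge, where the
        total diffusivity is the model value plus the additive term."""
        d_total = d_model + d_add
        return n_edge + S * x * (L - x) / (2.0 * d_total)

    # Synthetic "experiment": the true transport needs 0.3 more diffusivity
    # than the bare model provides.
    n_exp = predicted_profile(d_model=1.0, d_add=0.3)

    def mismatch(d_add):
        return np.sum((predicted_profile(1.0, d_add) - n_exp) ** 2)

    result = minimize_scalar(mismatch, bounds=(0.0, 2.0), method="bounded")
    print(f"recovered additional diffusivity: {result.x:.3f}")   # ~0.3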

357

Criterion Validity, Severity Cut Scores, and Test-Retest Reliability of the Beck Depression Inventory-II in a University Counseling Center Sample  

ERIC Educational Resources Information Center

The criterion validity of the Beck Depression Inventory-II (BDI-II; A. T. Beck, R. A. Steer, & G. K. Brown, 1996) was investigated by pairing blind BDI-II administrations with the major depressive episode portion of the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I; M. B. First, R. L. Spitzer, M. Gibbon, & J. B. W. Williams,…

Sprinkle, Stephen D.; Lurie, Daphne; Insko, Stephanie L.; Atkinson, George; Jones, George L.; Logan, Arthur R.; Bissada, Nancy N.

2002-01-01

358

Verification & Validation Of An Agent-Based Forest Fire Simulation Model  

Microsoft Academic Search

In this paper, we present the verification and validation of an agent-based model of forest fires. We use a combination of a Virtual Overlay Multi-Agent System (VOMAS) validation scheme with Fire Weather Index (FWI) to validate the forest fire Simulation. FWI is based on decades of real forest fire data and is now regarded as a standard index for fire

Muaz A. Niazi; Amir Hussain; Qasim Siddique; Mario Kolberg

359

Verification & validation of an agent-based forest fire simulation model  

Microsoft Academic Search

In this paper, we present the verification and validation of an agent-based model of forest fires. We use a combination of a Virtual Overlay Multi-Agent System (VOMAS) validation scheme with Fire Weather Index (FWI) to validate the forest fire Simulation. FWI is based on decades of real forest fire data and is now regarded as a standard index for fire

Muaz A. Niazi; Qasim Siddique; Amir Hussain; Mario Kolberg

2010-01-01

360

Validation of Community Models: Identifying Events in Space Weather Model Timelines  

NASA Technical Reports Server (NTRS)

I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.

MacNeice, Peter

2009-01-01
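
Both processing steps described above reduce to short signal-processing routines. Here is a schematic sketch; the thresholds, window length and synthetic data are my assumptions, not the documented algorithms.

    import numpy as np

    def detect_speed_transitions(speed, slow=400.0, fast=500.0):
        """Return indices where the wind leaves the slow regime (< slow km/s)
        and subsequently exceeds the fast threshold (> fast km/s)."""
        events, armed = [], False
        for i, v in enumerate(speed):
            if v < slow:
                armed = True           # we have been in slow wind
            elif armed and v > fast:
                events.append(i)       # slow-to-fast transition completed
                armed = False
        return events

    def smooth_polarity(polarity, window=25):
        """Boxcar-average the +/-1 polarity series and re-threshold, removing
        high-frequency flips so sector-scale changes stand out."""
        kernel = np.ones(window) / window
        return np.sign(np.convolve(polarity, kernel, mode="same"))

    # Hypothetical hourly data.
    rng = np.random.default_rng(0)
    n = 500
    speed = 350 + 250 * (np.sin(np.linspace(0, 6 * np.pi, n)) > 0.5) \
            + 20 * rng.standard_normal(n)
    polarity = np.sign(np.sin(np.linspace(0, 4 * np.pi, n))
                       + 0.3 * rng.standard_normal(n))

    print("transition indices:", detect_speed_transitions(speed)[:5])
    print("smoothed polarity sample:", smooth_polarity(polarity)[:10])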

361

An independent verification and validation of the Future Theater Level Model conceptual model  

SciTech Connect

This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examination of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

1994-08-01

362

Real-time infrared signature model validation for hardware-in-the-loop simulations  

Microsoft Academic Search

Techniques and tools for validation of real-time infrared target signature models are presented. The model validation techniques presented in this paper were developed for hardware-in-the-loop (HWIL) simulations at the U.S. Army Missile Command's Research, Development, and Engineering Center. Real-time target model validation is a required deliverable to the customer of a HWIL simulation facility and is a critical part of

Jeffrey S. Sanders; Trina Peters

1997-01-01

363

Ground target infrared signature model validation for real-time hardware-in-the-loop simulations  

Microsoft Academic Search

Techniques and tools for validation of real-time infrared target signature models are presented. The model validation techniques presented in this paper were developed for hardware-in-the-loop (HWIL) simulations at the U.S. Army Missile Command's Research, Development, and Engineering Center. Real-time target model validation is a required deliverable to the customer of a HWIL simulation facility and is a critical part of

Jeffrey S. Sanders; Jeremy B. Rodgers; Ahmed A. Siddique

1998-01-01

364

Revisiting the JDL data fusion model II  

Microsoft Academic Search

This paper suggests refinements and extensions of the JDL Data Fusion Model, the standard process model used for a multiplicity of community purposes. However, this Model has not been reviewed in accordance with (a) the dynamics of world events and (b) the changes, discoveries, and new methods in both the data fusion research and development community and related IT technologies.

James Llinas; Christopher Bowman; Galina Rogova; Alan Steinberg; E. Waltz; F. White

2004-01-01

365

Validation and Application of Concentrated Cesium Eluate Physical Property Models  

SciTech Connect

This work had two objectives: (1) to verify the mathematical equations developed for the physical properties of concentrated cesium eluate solutions against experimental test results obtained with simulated feeds; and (2) to estimate the physical properties of the radioactive AW-101 cesium eluate at saturation using the validated models. The River Protection Project (RPP) Hanford Waste Treatment and Immobilization Plant (WTP) is currently being built to extract radioisotopes from the vast inventory of Hanford tank wastes and immobilize them in a silicate glass matrix for eventual disposal at a geological repository. The baseline flowsheet for the pretreatment of supernatant liquid wastes includes removal of cesium using regenerative ion-exchange resins. The loaded cesium ion-exchange columns will be eluted with nitric acid, nominally at 0.5 molar, and the resulting eluate solution will be concentrated in a forced-convection evaporator to reduce the storage volume and to recover the acid for reuse. The reboiler pot is initially charged with a concentrated nitric acid solution and kept under a controlled vacuum during feeding so that the pot contents boil at 50 degrees Celsius. The liquid level in the pot is maintained constant by controlling both the feed and boilup rates. The feeding will continue with no bottoms removal until the solution in the pot reaches the target endpoint of 80 percent saturation with respect to any one of the major salt species present.

Choi, A.S.

2004-03-18

366

Experiments for calibration and validation of plasticity and failure material modeling: 304L stainless steel.  

SciTech Connect

Experimental data for material plasticity and failure model calibration and validation were obtained from 304L stainless steel. Model calibration data were taken from smooth tension, notched tension, and compression tests. Model validation data were provided from experiments using thin-walled tube specimens subjected to path dependent combinations of internal pressure, extension, and torsion.

Lee, Kenneth L.; Korellis, John S.; McFadden, Sam X.

2006-01-01

367

Validation and Model Selection: Three Similarity Measures  

E-print Network

…and descriptive models that attempt to track dynamic historical phenomena. Both types require verification. Descriptive models require validation against historical data as well. More broadly, we can think of a process

Tesfatsion, Leigh

368

Modeling and Validation of Fuel Cell Water Dynamics using Neutron Imaging  

E-print Network

We developed and validated a low-order model of the liquid water and gas dynamics within the gas diffusion layer (GDL), as part of a dynamic lumped-parameter fuel cell model. The electrochemical power generation

Stefanopoulou, Anna

369

Validation and Verification Techniques for Simulation Based Model: A theoretical outlook  

Microsoft Academic Search

This paper outlines different approaches to validate and verify simulation models. It discusses how models relate to validation and verification techniques and important aspects of their use. Modeling and simulation is a powerful and effective technique for approaching and analyzing complex systems. Most decisions are based on computer-generated data and simulation methods. These decisions are directly

Pratiksha Saxena

370

Validation analysis of probabilistic models of dietary exposure to food additives  

Microsoft Academic Search

The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above ‘true’ additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for

M. B. Gilsenan; R. L. Thompson; J. Lambe; M. J. Gibney

2003-01-01

371

Validation of a vibration and electric model of honeycomb panels equipped with piezoelectric patches  

E-print Network

…homogenization of honeycombs and shell models of piezoelectric patches. These models are validated experimentally. The considered honeycomb is shown to be significantly viscoelastic, and local bending effects

Paris-Sud XI, Université de

372

A Validity-Based Model for the Evaluation of a Criterion-Referenced Test.  

ERIC Educational Resources Information Center

This paper describes a model for the evaluation and approval of a test battery for compliance with a midwestern state law mandating criterion-referenced testing of specific objectives. Standards specifying that the test scores must demonstrate content validity and criterion-related validity form the foundation of the model. The model also…

Schattgen, Sharon; And Others

373

Validation of population-based disease simulation models: a review of concepts and methods  

Microsoft Academic Search

BACKGROUND: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. METHODS: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of

Jacek A Kopec; Philippe Finès; Douglas G Manuel; David L Buckeridge; William M Flanagan; Jillian Oderkirk; Michal Abrahamowicz; Samuel Harper; Behnam Sharif; Anya Okhmatovskaia; Eric C Sayre; M Mushfiqur Rahman; Michael C Wolfson

2010-01-01

374

Models for Proton-coupled Electron Transfer in Photosystem II  

Microsoft Academic Search

The coupling of proton and electron transfers is a key part of the chemistry of photosynthesis. The oxidative side of photosystem II (PS II) in particular seems to involve a number of proton-coupled electron transfer (PCET) steps in the S-state transitions. This mini-review presents an overview of recent studies of PCET model systems in the authors' laboratory. PCET is defined

James M. Mayer; Ian J. Rhile; Frank B. Larsen; Elizabeth A. Mader; Todd F. Markle; Antonio G. DiPasquale

2006-01-01

375

Modeling local paleoclimates and validation in the southwest United States  

SciTech Connect

In order to evaluate the spatial and seasonal variations of paleoclimate in the southwest US, a local climate model (LCM) is developed that computes modern and 18,000 yr B.P. (18 ka) monthly temperature and precipitation from a set of independent variables. Independent variables include: terrain elevation, insolation, CO2 concentration, January and July winds, and January and July sea-surface temperatures. Solutions are the product of a canonical regression function which is calibrated using climate data from 641 stations in AZ, CA, CO, NM, NV, and UT in the National Weather Service Cooperative observer network. Validation of the LCM, using climate data at 98 climate stations from the period 1980-1984, indicates no significant departures of LCM solutions from climate data. LCM solutions of modern and 18 ka climate are computed at a 15 km spacing over a rectangular domain extending 810 km east, 360 km west, 225 km north and 330 km south of the approximate location of Yucca Mt., NV. Solutions indicate mean annual temperature was 5°C cooler at 18 ka and mean annual precipitation increased 68%. The annual cycle of temperature and precipitation at 18 ka was amplified, with summers about 1°C cooler and 71% drier, and winters about 11°C colder and 35% wetter than the modern. Model results compare quite reasonably with proxy paleoclimate estimates from glacial deposits, pluvial lake deposits, pollen records, ostracode records and packrat midden records from the southwest US. However, a bias (+5°C to +10°C) is indicated for LCM solutions of summer temperatures at 18 ka.

Stamm, J.F.

1992-01-01
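
The calibrate-then-validate pattern above can be shown compactly. In the sketch below, the predictors, coefficients and noise levels are invented (the real LCM uses a canonical regression over the variables listed in the abstract); a regression is fitted on a 641-station calibration set and checked for significant departures on an independent 98-station set.

    import numpy as np

    rng = np.random.default_rng(1)

    def design_matrix(n):
        """Hypothetical predictors: elevation (km), relative insolation,
        CO2 (ppm), July SST anomaly (deg C), plus an intercept column."""
        return np.column_stack([
            rng.uniform(0.0, 3.0, n),
            rng.uniform(0.8, 1.2, n),
            rng.uniform(340.0, 360.0, n),
            rng.uniform(-2.0, 2.0, n),
            np.ones(n),
        ])

    true_beta = np.array([-6.0, 10.0, 0.01, 1.5, 12.0])  # invented coefficients

    # Calibration: fit on 641 stations.
    X_cal = design_matrix(641)
    T_cal = X_cal @ true_beta + rng.normal(0.0, 0.5, 641)
    beta, *_ = np.linalg.lstsq(X_cal, T_cal, rcond=None)

    # Validation: check for departures at 98 independent stations.
    X_val = design_matrix(98)
    T_val = X_val @ true_beta + rng.normal(0.0, 0.5, 98)
    residuals = T_val - X_val @ beta
    print(f"validation bias = {residuals.mean():+.3f} deg C, "
          f"RMSE = {np.sqrt((residuals ** 2).mean()):.3f} deg C")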

376

Effects of Risperidone on Aberrant Behavior in Persons with Developmental Disabilities: II. Social Validity Measures.  

ERIC Educational Resources Information Center

Consumer satisfaction and social validity were measured during a double-blind, placebo-controlled evaluation of risperidone in treating aberrant behaviors of persons with developmental disabilities. A survey showed all 17 caregivers felt participation was positive. Community members (n=52) also indicated that when on medication, the 5 participants…

McAdam, David B.; Zarcone, Jennifer R.; Hellings, Jessica; Napolitano, Deborah A.; Schroeder, Stephen R.

2002-01-01

377

The African American Acculturation Scale II: Cross-Validation and Short Form.  

ERIC Educational Resources Information Center

Studied African American culture, using a new, shortened, 33-item African American Acculturation Scale (AAAS-33) to assess the scale's validity and reliability. Comparisons between the original form and AAAS-33 reveal high correlations, however, the longer form may be sensitive to some beliefs, practices, and attitudes not assessed by the short…

Landrine, Hope; Klonoff, Elizabeth A.

1995-01-01

378

Comparing Validity and Reliability in Special Education Title II and IDEA Data  

ERIC Educational Resources Information Center

Previous researchers have found that special education teacher shortages are pervasive and exacerbated by federal policies regarding "highly qualified" teacher requirements. The authors examined special education teacher personnel data from 2 federal data sources to determine if these sources offer a reliable and valid means of…

Steinbrecher, Trisha D.; McKeown, Debra; Walther-Thomas, Chriss

2013-01-01

379

Modeling Heart Rate Regulation--Part II: Parameter Identification and Analysis  

E-print Network

In the first part of this study we introduced a 17-parameter model that can predict heart rate regulation during postural change … to adequately represent the observed heart rate response. In part I and in previous work (Olufsen et al. 2006

Olufsen, Mette Sofie

380

Results of site validation experiments. Volume II. Supporting documents 5 through 14  

SciTech Connect

Volume II contains the following supporting documents: Summary of Geologic Mapping of Underground Investigations; Logging of Vertical Coreholes - ''Double Box'' Area and Exploratory Drift; WIPP High Precision Gravity Survey; Basic Data Reports for Drillholes, Brine Content of Facility Internal Strata; Mineralogical Content of Facility Interval Strata; Location and Characterization of Interbedded Materials; Characterization of Aquifers at Shaft Locations; and Permeability of Facility Interval Strate.

Not Available

1983-01-01

381

Analysis of Experimental Data for High Burnup PWR Spent Fuel Isotopic Validation - Vandellos II Reactor  

Microsoft Academic Search

This report is one of the several recent NUREG\\/CR reports documenting benchmark-quality radiochemical assay data and the use of the data to validate computer code predictions of isotopic composition for spent nuclear fuel, to establish the uncertainty and bias associated with code predictions. The experimental data analyzed in the current report were acquired from a high-burnup fuel program coordinated by

Germina Ilas; Ian C Gauld

2011-01-01

382

Gravitropic responses of the Avena coleoptile in space and on clinostats. II. Is reciprocity valid?  

NASA Technical Reports Server (NTRS)

Experiments were undertaken to determine if the reciprocity rule is valid for gravitropic responses of oat coleoptiles in the acceleration region below 1 g. The rule predicts that the gravitropic response should be proportional to the product of the applied acceleration and the stimulation time. Seedlings were cultivated on 1 g centrifuges and transferred to test centrifuges to apply a transverse g-stimulation. Since responses occurred in microgravity, the uncertainties about the validity of clinostat simulation of weightlessness were avoided. Plants at two stages of coleoptile development were tested. Plant responses were obtained using time-lapse video recordings that were analyzed after the flight. Stimulus intensities and durations were varied and ranged from 0.1 to 1.0 g and from 2 to 130 min, respectively. For threshold g-doses the reciprocity rule was obeyed. The threshold dose was of the order of 55 g·s and 120 g·s, respectively, for the two groups of plants investigated. Reciprocity was also studied at bending responses ranging from just above the detectable level to about 10 degrees. The validity of the rule could not be confirmed for higher g-doses, chiefly because the data were more variable. It was investigated whether the uniformity of the overall response data increased when the gravitropic dose was defined as (g^m × t) with m-values different from unity. This was not the case, and the reciprocity concept is therefore valid also in the hypogravity region. The concept of gravitropic dose, the product of the transverse acceleration and the stimulation time, is also well-defined in the acceleration region studied. With the same hardware, tests were done on earth where responses occurred on clinostats. The results did not contradict the reciprocity rule, but scatter in the data was large.

Johnsson, A.; Brown, A. H.; Chapman, D. K.; Heathcote, D.; Karlsson, C.

1995-01-01
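
As a worked illustration of the g-dose concept (only the 120 g·s threshold is taken from the record; the rest is arithmetic), the dose is the product of acceleration and time:

    D = g · t,  so  t = D / g = (120 g·s) / (0.1 g) = 1200 s ≈ 20 min,

while the same dose is reached in only 120 s at 1.0 g. Reciprocity holds exactly when such trades of acceleration against stimulation time leave the bending response unchanged.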

383

EEG-neurofeedback for optimising performance. II: creativity, the performing arts and ecological validity.  

PubMed

As a continuation of a review of evidence of the validity of cognitive/affective gains following neurofeedback in healthy participants, including correlations in support of the gains being mediated by feedback learning (Gruzelier, 2014a), the focus here is on the impact on creativity, especially in the performing arts including music, dance and acting. The majority of research involves alpha/theta (A/T), sensory-motor rhythm (SMR) and heart rate variability (HRV) protocols. There is evidence of reliable benefits from A/T training with advanced musicians especially for creative performance, and reliable benefits from both A/T and SMR training for novice music performance in adults and in a school study with children with impact on creativity, communication/presentation and technique. Making the SMR ratio training context ecologically relevant for actors enhanced creativity in stage performance, with added benefits from the more immersive training context. A/T and HRV training have benefitted dancers. The neurofeedback evidence adds to the rapidly accumulating validation of neurofeedback, while performing arts studies offer an opportunity for ecological validity in creativity research for both creative process and product. PMID:24239853

Gruzelier, John H

2014-07-01

384

Land-cover change model validation by an ROC method for the Ipswich watershed, Massachusetts, USA  

Microsoft Academic Search

Scientists need a better and larger set of tools to validate land-use change models, because it is essential to know a model’s prediction accuracy. This paper describes how to use the relative operating characteristic (ROC) as a quantitative measurement to validate a land-cover change model. Typically, a crucial component of a spatially explicit simulation model of land-cover change is a

R. Gil Pontius Jr; Laura C. Schneider

2001-01-01
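
For the ROC measurement itself, a minimal sketch follows (synthetic maps; in actual use the score is the model's per-cell suitability for change and the truth is the observed change map).

    import numpy as np

    def roc_auc(score, truth):
        """AUC via the rank-sum identity: the probability that a randomly
        chosen changed cell scores higher than an unchanged one."""
        score, truth = np.asarray(score), np.asarray(truth, dtype=bool)
        order = np.argsort(score)
        ranks = np.empty(len(score))
        ranks[order] = np.arange(1, len(score) + 1)
        n_pos, n_neg = truth.sum(), (~truth).sum()
        return (ranks[truth].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    rng = np.random.default_rng(2)
    suitability = rng.random(10_000)              # model score per grid cell
    changed = rng.random(10_000) < 0.2 + 0.6 * suitability ** 2  # synthetic truth
    print(f"AUC = {roc_auc(suitability, changed):.3f}")  # 0.5 random, 1.0 perfect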

385

Transient PVT measurements and model predictions for vessel heat transfer. Part II.  

SciTech Connect

Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.

Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.; Evans, Gregory Herbert; Rice, Steven F.

2010-07-01
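
As a cartoon of the network-flow category described above (vessels as single control volumes, flow paths as one-dimensional resistances), the toy below transfers isothermal ideal gas between two vessels through a tube with an invented square-root pressure-drop relation. It is my own simplification, not NETFLOW.

    import numpy as np

    R, T = 287.0, 300.0        # J/(kg K), K  (air, isothermal assumption)
    V1, V2 = 0.010, 0.050      # vessel volumes, m^3
    C = 1.0e-6                 # lumped tube conductance, kg/(s Pa^0.5) (invented)

    def step(m1, m2, dt):
        """Advance vessel masses one step; flow ~ sqrt(dP) through the tube."""
        p1, p2 = m1 * R * T / V1, m2 * R * T / V2
        dp = p1 - p2
        mdot = C * np.sign(dp) * np.sqrt(abs(dp))  # crude frictional-path law
        return m1 - mdot * dt, m2 + mdot * dt

    m1, m2, dt = 0.12, 0.06, 0.01                  # kg, kg, s
    for _ in range(20000):                         # integrate 200 s
        m1, m2 = step(m1, m2, dt)
    p1, p2 = m1 * R * T / V1, m2 * R * T / V2
    print(f"final pressures: {p1/1e5:.3f} bar vs {p2/1e5:.3f} bar")  # equalized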

386

Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition  

NASA Technical Reports Server (NTRS)

Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

Ewing, Anthony; Adams, Charles

2004-01-01

387

Validation of speciation techniques: a study of chlorozincate(II) ionic liquids.  

PubMed

The speciation of chlorozincate(II) ionic liquids, prepared by mixing 1-octyl-3-methylimidazolium chloride, [C(8)mim]Cl, and zinc(II) chloride in various molar ratios, χ(ZnCl(2)), was investigated using Raman spectroscopy and differential scanning calorimetry; the Gutmann acceptor number, which is a quantitative measure of Lewis acidity, was also determined as a function of the composition. These results were combined with literature data to define the anionic speciation; in the neat liquid phase, the existence of Cl(-), [ZnCl(4)](2-), [Zn(2)Cl(6)](2-), [Zn(3)Cl(8)](2-), and [Zn(4)Cl(10)](2-) anions was confirmed. From two chlorozincate(II) ionic liquids with [C(2)mim](+) cations (χ(ZnCl(2)) = 0.33 and χ(ZnCl(2)) = 0.50), crystals have been obtained, revealing the structures of [C(2)mim](2)[ZnCl(4)] and [C(2)mim](2)[Zn(2)Cl(6)] forming three-dimensional hydrogen-bond networks. The compound [C(2)mim](2){Zn(4)Cl(10)} was crystallized from the χ(ZnCl(2)) = 0.75 composition, showing an open-framework structure, with the first example of zinc in a trigonal-bipyramidal chloride coordination. Reinvestigation of the electrospray ionization mass spectrometry of these systems demonstrated that it is an unreliable technique to study liquid-phase speciation. PMID:21545101

Estager, Julien; Nockemann, Peter; Seddon, Kenneth R; Swadźba-Kwaśny, Małgorzata; Tyrrell, Sophie

2011-06-01

388

Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria  

NASA Astrophysics Data System (ADS)

Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) calibration of the seismic intensity attenuation model using macroseismic observations and maps from past earthquakes in Algeria; and (2) calculation of country-specific vulnerability modifiers using past damage observations in the country. The Benouar (1994) ground-motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to vulnerability indexes 10% to 40% larger for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated using "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for the client market portfolio align with the repeatability of such catastrophe losses in the country. The validation process also included collaboration between Aon Benfield and its client in order to account for the insurance market penetration in Algeria, estimated at approximately 5%. We therefore believe that the applied approach led to an earthquake model for Algeria that is scientifically sound and reliable on the one hand, and market- and client-oriented on the other.

Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

2012-04-01
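
The quoted rebuilding cost factors translate a damage-grade distribution into an expected loss. A small sketch follows; only the cost factors come from the abstract, while the grade probabilities are hypothetical.

    # EMS-98 damage grades and the rebuilding cost factors quoted above.
    rebuild_cost_factor = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

    def mean_damage_ratio(grade_probs):
        """Expected repair cost as a fraction of replacement value, given the
        probability of each damage grade (grade 0 = no damage, zero cost)."""
        return sum(p * rebuild_cost_factor.get(g, 0.0)
                   for g, p in grade_probs.items())

    # Hypothetical distribution for one building class at a given intensity.
    probs = {0: 0.40, 1: 0.25, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.03}
    print(f"mean damage ratio = {mean_damage_ratio(probs):.3f}")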

389

Importance of Sea Ice for Validating Global Climate Models  

NASA Technical Reports Server (NTRS)

Reproduction of current day large-scale physical features and processes is a critical test of global climate model performance. Without this benchmark, prognoses of future climate conditions are at best speculation. A fundamental question relevant to this issue is, which processes and observations are both robust and sensitive enough to be used for model validation and furthermore are they also indicators of the problem at hand? In the case of global climate, one of the problems at hand is to distinguish between anthropogenic and naturally occuring climate responses. The polar regions provide an excellent testing ground to examine this problem because few humans make their livelihood there, such that anthropogenic influences in the polar regions usually spawn from global redistribution of a source originating elsewhere. Concomitantly, polar regions are one of the few places where responses to climate are non-anthropogenic. Thus, if an anthropogenic effect has reached the polar regions (e.g. the case of upper atmospheric ozone sensitivity to CFCs), it has most likely had an impact globally but is more difficult to sort out from local effects in areas where anthropogenic activity is high. Within this context, sea ice has served as both a monitoring platform and sensitivity parameter of polar climate response since the time of Fridtjof Nansen. Sea ice resides in the polar regions at the air-sea interface such that changes in either the global atmospheric or oceanic circulation set up complex non-linear responses in sea ice which are uniquely determined. Sea ice currently covers a maximum of about 7% of the earth's surface but was completely absent during the Jurassic Period and far more extensive during the various ice ages. It is also geophysically very thin (typically <10 m in Arctic, <3 m in Antarctic) compared to the troposphere (roughly 10 km) and deep ocean (roughly 3 to 4 km). Because of these unique conditions, polar researchers regard sea ice as one of the more important features to monitor in terms of heat, mass, and momentum transfer between the air and sea and furthermore, the impact of such responses to global climate.

Geiger, Cathleen A.

1997-01-01

390

Assessment of the Validity of the Double Superhelix Model for Reconstituted High Density Lipoproteins  

PubMed Central

For several decades, the standard model for high density lipoprotein (HDL) particles reconstituted from apolipoprotein A-I (apoA-I) and phospholipid (apoA-I/HDL) has been a discoidal particle ~100 Å in diameter and the thickness of a phospholipid bilayer. Recently, Wu et al. (Wu, Z., Gogonea, V., Lee, X., Wagner, M. A., Li, X. M., Huang, Y., Undurti, A., May, R. P., Haertlein, M., Moulin, M., Gutsche, I., Zaccai, G., Didonato, J. A., and Hazen, S. L. (2009) J. Biol. Chem. 284, 36605–36619) used small angle neutron scattering to develop a new model they termed double superhelix (DSH) apoA-I that is dramatically different from the standard model. Their model possesses an open helical shape that wraps around a prolate ellipsoidal type I hexagonal lyotropic liquid crystalline phase. Here, we used three independent approaches, molecular dynamics, EM tomography, and fluorescence resonance energy transfer spectroscopy (FRET) to assess the validity of the DSH model. (i) By using molecular dynamics, two different approaches, all-atom simulated annealing and coarse-grained simulation, show that initial ellipsoidal DSH particles rapidly collapse to discoidal bilayer structures. These results suggest that, compatible with current knowledge of lipid phase diagrams, apoA-I cannot stabilize hexagonal I phase particles of phospholipid. (ii) By using EM, two different approaches, negative stain and cryo-EM tomography, show that reconstituted apoA-I/HDL particles are discoidal in shape. (iii) By using FRET, reconstituted apoA-I/HDL particles show a 28–34 Å intermolecular separation between terminal domain residues 40 and 240, a distance that is incompatible with the dimensions of the DSH model. Therefore, we suggest that, although novel, the DSH model is energetically unfavorable and not likely to be correct. Rather, we conclude that all evidence supports the likelihood that reconstituted apoA-I/HDL particles, in general, are discoidal in shape. PMID:20974855

Jones, Martin K.; Zhang, Lei; Catte, Andrea; Li, Ling; Oda, Michael N.; Ren, Gang; Segrest, Jere P.

2010-01-01

391

Reconceptualizing the Learning Transfer Conceptual Framework: Empirical Validation of a New Systemic Model  

ERIC Educational Resources Information Center

The main purpose of this study was to examine the validity of a new systemic model of learning transfer and thus determine if a more holistic approach to training transfer could better explain the phenomenon. In all, this study confirmed the validity of the new systemic model and suggested that a high performance work system could indeed serve as…

Kontoghiorghes, Constantine

2004-01-01

392

Contribution to a dynamic wind turbine model validation from a wind farm islanding experiment  

Microsoft Academic Search

Measurements from an islanding experiment on the Rejsby Hede wind farm, Denmark, are used for the validation of the dynamic model of grid-connected, stall-controlled wind turbines equipped with induction generators. The simulated results are found to be in good agreement with the measurements and possible discrepancies are explained. The work with the wind turbine model validation relates to the dynamic

J. K. Pedersen; K. O. Helgelsen-Pedersen; N. Kjølstad Poulsen; V. Akhmatov; A. Hejde Nielsen

2003-01-01

393

New methods for estimation, modeling and validation of dynamical systems using automatic differentiation  

E-print Network

A dissertation by Daniel Todd Griffith, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Doctor of Philosophy, December 2004 (Major Subject: Aerospace Engineering), presenting new methods for estimation, modeling and validation of dynamical systems using automatic differentiation.

Griffith, Daniel Todd

2005-02-17

394

Validation and verification of the simulation model of a photolithography process in semiconductor manufacturing  

Microsoft Academic Search

Simulation modeling provides an effective and powerful approach for capturing and analyzing complex manufacturing systems. More and more decisions are based on computer generated data derived from simulation. The strength of these decisions is a direct function of the validity of this data. Thus the need for efficient and objective methods to verify and validate simulation models is greater than

Nirupama Nayani; Mansooreh Mollaghasemi

1998-01-01

395

Why test animals to treat humans? On the validity of animal models  

Microsoft Academic Search

Critics of animal modeling have advanced a variety of arguments against the validity of the practice. The point of one such form of argument is to establish that animal modeling is pointless and therefore immoral. In this article, critical arguments of this form are divided into three types, the pseudoscience argument, the disanalogy argument, and the predictive validity argument. I

Cameron Shelley

2010-01-01

396

Validation of a biomechanical heart model using animal data with acute myocardial infarction  

E-print Network

In this paper, we validate a biomechanical heart model with animal data with acute myocardial infarction. The experimental data consisted of animal data obtained with a farm pig of 25 kg.

Paris-Sud XI, Université de

397

Techniques for Down-Sampling a Measured Surface Height Map for Model Validation  

NASA Technical Reports Server (NTRS)

This software allows one to down-sample a measured surface height map for model validation without introducing re-sampling errors, while also eliminating existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

Sidick, Erkin

2012-01-01

398

LFT uncertain model validation with time- and frequency-domain measurements  

Microsoft Academic Search

In this paper the authors study a model validation problem pertaining to linear fractional uncertainty models. They extend previous validation approaches, based upon either time or frequency measurements, to one that uses time- and frequency-domain data simultaneously. They show that this problem can be reduced to two independent convex feasibility tests, each of which corresponds to the time- or frequency-domain

Demin Xu; Zhang Ren; Guoxiang Gu; Jie Chen

1999-01-01

399

Field validation of the DNDC model for greenhouse gas emissions in East Asian cropping systems  

E-print Network

Field validation of the DNDC model for greenhouse gas emissions in East Asian cropping systems (Global Biogeochemical Cycles) … for estimating CH4 emissions from regional and/or national rice fields. For instance, Bachelet et al. [1995

400

Sociodemography of borderline personality disorder (PD): a comparison with Axis II PDs and psychiatric symptom disorders convergent validation.  

PubMed

A theoretical objective of the present meta-analysis, based upon data derived from a previously reported review (Taub, 1995), was to empirically test two inductive hypotheses regarding educational background and social class across different criteria for the DSM-III diagnosis of borderline personality disorder (PD). A secondary purpose was to determine whether comorbidity of borderline PD with other Axis II PDs would significantly delineate socioeducational variables. Across 7/8 pairwise contrasts representing five studies, the distribution of Hollingshead-Redlich (H-R) social classes II-IV in borderline PD (N = 326) significantly exceeded that in 457 diagnostic controls with Axis II PDs and psychiatric symptom disorders. Although average differences, as well as interactions reflected by values of the H-R two-factor scale, attained statistical significance, these were less consistent in magnitude and direction than outcomes yielded by the distribution of social classes. For the borderline PD diagnosis, the inductive hypotheses were substantiated by findings of significantly advanced scholastic achievement, as well as the younger age of most cohorts versus diagnostic controls with Axis II PDs and psychiatric symptom disorders, and in pairwise contrasts of outpatients with hospitalized cohorts. Comorbidity of the borderline PD diagnosis was associated with significantly lower social class, scholastic achievement and, to a lesser extent, more severe psychopathology. Evidence for predominantly convergent validation relative to the socioeducational variables was substantiated by comparisons with (a) cohorts selected by criteria of the DSM-III-R, Gunderson's DIB and Borderline Personality Scale; (b) Norwegian females admitted to Gaustad Hospital; and (c) patients with the DSM-III diagnosis of borderline PD attending an outpatient clinic in Norway. PMID:9003963

Taub, J M

1996-11-01

401

NGC1300 dynamics - II. The response models  

NASA Astrophysics Data System (ADS)

We study the stellar response in a spectrum of potentials describing the barred spiral galaxy NGC1300. These potentials have been presented in a previous paper and correspond to three different assumptions as regards the geometry of the galaxy. For each potential we consider a wide range of Ωp pattern speed values. Our goal is to discover the geometries and the Ωp supporting specific morphological features of NGC1300. For this purpose we use the method of response models. In order to compare the images of NGC1300 with the density maps of our models, we define a new index which is a generalization of the Hausdorff distance. This index helps us to find out quantitatively which cases reproduce specific features of NGC1300 in an objective way. Furthermore, we construct alternative models following a Schwarzschild-type technique. By this method we vary the weights of the various energy levels, and thus the orbital contribution of each energy, in order to minimize the differences between the response density and that deduced from the surface density of the galaxy, under certain assumptions. We find that the models corresponding to Ωp ~ 16 and 22 km s-1 kpc-1 are able to reproduce efficiently certain morphological features of NGC1300, with each one having its advantages and drawbacks. Based on observations collected at the European Southern Observatory, Chile: programme ESO 69.A-0021.

Kalapotharakos, C.; Patsis, P. A.; Grosbøl, P.

2010-10-01

402

Dicobalt II-II, II-III, and III-III complexes as spectroscopic models for dicobalt enzyme active sites.  

PubMed

A matched set of dinuclear cobalt complexes with II-II, II-III, and III-III oxidation states have been prepared and structurally characterized. In [(bpbp)Co2(O2P(OPh)2)2]n+ (n = 1, 2, or 3; bpbp(-) = 2,6-bis((N,N'-bis-(2-picolyl)amino)-methyl)-4-tertbutylphenolato), the nonbonded Co...Co separations are within the range 3.5906(17) to 3.7081(11) angstroms, and the metal ions are triply bridged by the phenolate oxygen atom of the heptadentate dinucleating ligand and by two diphenylphosphate groups. The overall structures and geometries of the complexes are very similar, with minor variations in metal-ligand bond distances consistent with oxidation state assignments. The CoIICoIII compound is a valence-trapped Robin-Day class II complex. Solid state 31P NMR spectra of the diamagnetic CoIIICoIII (3) and paramagnetic CoIICoIII (2) and CoIICoII (1) complexes show that 31P isotropic shifts broaden and move downfield by about 3000 ppm for each increment in oxidation state. Cyclic voltammetry corroborates the existence of the CoIICoII, CoIICoIII, and CoIIICoIII species in solution. The redox changes are not reversible in the applied scanning timescales, indicating that chemical changes are associated with oxidation and reduction of the cobalt centers. An investigation of the spectroscopic properties of this series has been carried out for its potential usefulness in analyses of the related spectroscopic properties of the dicobalt metallohydrolases. Principally, magnetic circular dichroism (MCD) has been used to determine the strength of the magnetic exchange coupling in the CoIICoII complex by analysis of the variable-temperature variable-field (VTVH) intensity behavior of the MCD signal. The series is ideal for the spectroscopic determination of magnetic coupling since it can occur only in the CoIICoII complex. The CoIICoIII complex contains a nearly isostructural CoII ion, but since CoIII is diamagnetic, the magnetic coupling is switched off, while the spectral features of the CoII ion remain. Analysis of the MCD data from the CoIICoIII complex has been undertaken in the theoretical context of a 4T1g ground-state of the CoII ion, initially in an octahedral ligand field that is split by both geometric distortion and zero-field splitting to form an isolated doublet ground state. The MCD data for the CoIICoII pair in the [(bpbp)Co2(O2P(OPh)2)2]+ complex were fitted to a model based on weak antiferromagnetic coupling with J = -1.6 cm-1. The interpretation is confirmed by solid state magnetic susceptibility measurements. PMID:18494467

Johansson, Frank B; Bond, Andrew D; Nielsen, Ulla Gro; Moubaraki, Boujemaa; Murray, Keith S; Berry, Kevin J; Larrabee, James A; McKenzie, Christine J

2008-06-16

403

Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.  

PubMed

This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. PMID:24076304

López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

2013-11-01

404

Model Validation Lessons Learned: A Case Study at Oak Ridge National Laboratory.  

National Technical Information Service (NTIS)

A groundwater flow and contaminant transport model validation study was performed to determine the applicability of typical groundwater flow models for performance assessment of proposed waste disposal facilities at Oak Ridge, Tennessee. Standard practice...

R. H. Ketelle, R. R. Lee, J. M. Bownds, T. A. Rizk

1989-01-01

405

Development of a Hydraulic Manipulator Servoactuator Model: Simulation and Experimental Validation  

E-print Network

In this paper, modelling and identification of a hydraulic servoactuator system is presented, including leakage and load dynamics. System parameters are identified based on a high-performance hydraulic

Papadopoulos, Evangelos

406

Model Validation for Nonlinear and Time-Varying Systems: Improved Bounds using the S-Procedure  

E-print Network

[Figure: a generic LFT model structure, based on the linear fractional transformation (LFT).] The linear transfer functions Pij are known. At this point, we may define our model (in)validation optimization problem: given the upper LFT

Cambridge, University of

407

The Adriatic Sea ecosystem seasonal cycle: Validation of a three-dimensional numerical model  

Microsoft Academic Search

A three-dimensional coupled biogeochemical-circulation numerical model was implemented in the Adriatic Sea. The biogeochemical part of the model is a development of the European Seas Regional Ecosystem Model (ERSEM II), while the circulation model is the Adriatic Sea implementation of the Princeton Ocean Model (POM). The model was run under climatological monthly varying atmospheric and river runoff forcing in order

L. Polimene; N. Pinardi; M. Zavatarelli; S. Colella

2006-01-01

408

Open-system respirometry in intensive aquaculture: model validation and application to red drum (Sciaenops ocellatus)  

E-print Network

The need for information on actual metabolic rates of fish in aquacultural production systems led to development of a methodology for open-system respirometry. Central to this methodology is a model proposing that oxygen-uptake rates of fish kept at high... model, via an inflow of oxygen-deficient water simulating fish metabolism. B. To biologically validate the open-system respirometry model, via an energy-expenditure comparison study. 2. To apply the validated model, in estimating fish metabolism...

Oborny, Edmund Lee

2012-06-07

409

Validation of a vertical progression porcine burn model.  

PubMed

A major potential goal of burn therapy is to limit progression of partial- to full-thickness burns. To better test therapies, the authors developed and validated a vertical progression porcine burn model in which partial-thickness burns treated with an occlusive dressing convert to full-thickness burns that heal with scarring and wound contraction. Forty contact burns were created on the backs and flanks of two young swine using a 150 g aluminum bar preheated to 70°C, 80°C, or 90°C for 20 or 30 seconds. The necrotic epidermis was removed and the burns were covered with a polyurethane occlusive dressing. Burns were photographed at 1, 24, and 48 hours as well as at 7, 14, 21, and 28 days postinjury. Full-thickness biopsies were obtained at 1, 4, 24, and 48 hours as well as at 7 and 28 days. The primary outcomes were presence of deep contracted scars and wound area 28 days after injury. Secondary outcomes were depth of injury, reepithelialization, and depth of scars. Data were compared across burn conditions using analysis of variance and χ² tests. Eight replicate burns were created with the aluminum bar using each of the following temperature/contact-time combinations: 70/20, 70/30, 80/20, 80/30, and 90/20. The percentages of burns healing with contracted scars were 70/20, 0%; 70/30, 25%; 80/20, 50%; 80/30, 75%; and 90/20, 100% (P = .05). Wound areas at 28 days by injury condition were 70/20, 8.1 cm²; 70/30, 7.8 cm²; 80/20, 6.6 cm²; 80/30, 4.9 cm²; and 90/20, 4.8 cm² (P = .007). Depth of injury judged by depth of endothelial damage for the 80/20 and 80/30 burns at 1 hour was 36% and 60% of the dermal thickness, respectively. The depth of injury to the endothelial cells 1 hour after injury was inversely correlated with the degree of scar area (Pearson's correlation r = -.71, P < .001). Exposure of porcine skin to an aluminum bar preheated to 80°C for 20 or 30 seconds results initially in a partial-thickness burn that, when treated with an occlusive dressing, progresses to a full-thickness injury and heals with significant scarring and wound contracture. PMID:21841494

Singer, Adam J; Hirth, Douglas; McClain, Steve A; Crawford, Laurie; Lin, Fubao; Clark, Richard A F

2011-01-01
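
The across-condition comparison reported above can be reproduced schematically: with 8 replicate burns per condition, the quoted scar percentages imply counts of 0, 2, 4, 6 and 8, which can be arranged in a contingency table and tested. This is a sketch, not the authors' actual analysis, so the resulting p-value need not match the one reported.

    from scipy.stats import chi2_contingency

    conditions = ["70/20", "70/30", "80/20", "80/30", "90/20"]
    scarred    = [0, 2, 4, 6, 8]            # 0%, 25%, 50%, 75%, 100% of 8
    unscarred  = [8 - s for s in scarred]

    table = [scarred, unscarred]            # 2 x 5 contingency table
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")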

410

Validation of the integration of CFD and SAS4A/SASSYS-1: Analysis of EBR-II shutdown heat removal test 17  

SciTech Connect

Recent analyses have demonstrated the need to model multidimensional phenomena, particularly thermal stratification in outlet plena, during safety analyses of loss-of-flow transients of certain liquid-metal cooled reactor designs. Therefore, Argonne's reactor systems safety code SAS4A/SASSYS-1 is being enhanced by integrating 3D computational fluid dynamics models of the plena. A validation exercise of the new tool is being performed by analyzing the protected loss-of-flow event demonstrated by the EBR-II Shutdown Heat Removal Test 17. In this analysis, the behavior of the coolant in the cold pool is modeled using the CFD code STAR-CCM+, while the remainder of the cooling system and the reactor core are modeled with SAS4A/SASSYS-1. This paper summarizes the code integration strategy and provides the predicted 3D temperature and velocity distributions inside the cold pool during SHRT-17. The results of the coupled analysis should be considered preliminary at this stage, as the exercise pointed to the need to improve the CFD model of the cold pool tank. (authors)

Thomas, J. W.; Fanning, T. H.; Vilim, R.; Briggs, L. L. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4842 (United States)

2012-07-01

411

Dynamical dark matter. II. An explicit model  

NASA Astrophysics Data System (ADS)

In a recent paper [K. R. Dienes and B. Thomas, Phys. Rev. D 85, 083523 (2012)], we introduced “dynamical dark matter,” a new framework for dark-matter physics, and outlined its underlying theoretical principles and phenomenological possibilities. Unlike most traditional approaches to the dark-matter problem which hypothesize the existence of one or more stable dark-matter particles, our dynamical dark-matter framework is characterized by the fact that the requirement of stability is replaced by a delicate balancing between cosmological abundances and lifetimes across a vast ensemble of individual dark-matter components. This setup therefore collectively produces a time-varying cosmological dark-matter abundance, and the different dark-matter components can interact and decay throughout the current epoch. While the goal of our previous paper was to introduce the broad theoretical aspects of this framework, the purpose of the current paper is to provide an explicit model of dynamical dark matter and demonstrate that this model satisfies all collider, astrophysical, and cosmological constraints. The results of this paper therefore constitute an “existence proof” of the phenomenological viability of our overall dynamical dark-matter framework, and demonstrate that dynamical dark matter is indeed a viable alternative to the traditional paradigm of dark-matter physics. Dynamical dark matter must therefore be considered alongside other approaches to the dark-matter problem, particularly in scenarios involving large extra dimensions or string theory in which there exist large numbers of particles which are neutral under standard-model symmetries.

Dienes, Keith R.; Thomas, Brooks

2012-04-01

412

Theoretical H II region models - The effects of stellar atmosphere models  

NASA Technical Reports Server (NTRS)

Several grids of theoretical H II region models are computed by photoionization modeling in order to determine the extent to which the choice of the ionizing stellar atmosphere model affects the calibration of emission-line diagnostic diagrams of Evans and Dopita (1985) and the semiempirical H II region abundance sequence calibration of Evans and Dopita. Emission-line diagnostic diagrams are presented and compared for model nebulae ionized by Hummer and Mihalas (1970) unblanketed LTE atmospheres, Kurucz (1979) line-blanketed LTE atmospheres, Mihalas (1972) unblanketed non-LTE, and a truncated blackbody spectrum. The models demonstrate that for solar nebular and atmospheric abundances, there are only minor differences between H II models ionized by the Hummer and Mihalas atmospheres. The unblanketed non-LTE stellar atmosphere models of Mihalas and truncated blackbody spectra are shown to be unsuitable for general H II region modelling.

Evans, I. N.

1991-01-01

413

Some Hamiltonian models of friction II  

SciTech Connect

In the present paper we consider the motion of a very heavy tracer particle in a medium of a very dense, non-interacting Bose gas. We prove that, in a certain mean-field limit, the tracer particle will be decelerated and come to rest somewhere in the medium. Friction is caused by emission of Cerenkov radiation of gapless modes into the gas. Mathematically, a system of semilinear integro-differential equations, introduced in Froehlich et al. ['Some hamiltonian models of friction,' J. Math. Phys. 52(8), 083508 (2011)], describing a tracer particle in a dispersive medium is investigated, and decay properties of the solution are proven. This work is an extension of Froehlich et al. ['Friction in a model of hamiltonian dynamics,' Commun. Math. Phys. 315(2), 401-444 (2012)]; it is an extension because no weak coupling limit for the interaction between tracer particle and medium is assumed. The technical methods used are dispersive estimates and a contraction principle.

Egli, Daniel; Gang Zhou [Institute for Theoretical Physics, ETH Zurich, CH-8093 Zurich (Switzerland)]

2012-10-15

414

The quantum HMF model: II. Bosons  

NASA Astrophysics Data System (ADS)

We study the thermodynamics of quantum particles with long-range interactions at T = 0. Specifically, we generalize the Hamiltonian mean-field (HMF) model to the case of bosons. We consider the Hartree approximation, which becomes exact in a proper thermodynamic limit N → +∞ with a coupling constant k ~ 1/N. The equilibrium configurations are solutions of the mean-field Schrödinger equation with a cosine interaction. We show that the homogeneous phase, which is unstable in the classical regime, becomes stable in the quantum regime. The homogeneous phase is stabilized by the Heisenberg uncertainty principle. This takes place through a second-order phase transition where the control parameter is the normalized Planck constant. The homogeneous phase is unstable for ℏ < ℏ_c ≡ 1/√π and stable for ℏ > ℏ_c. The inhomogeneous phase is stable for ℏ < ℏ_c and disappears for ℏ > ℏ_c. We point out analogies between the bosonic HMF model and the concept of boson stars in astrophysics. We also discuss the differences between bosons and fermions as regards the thermodynamic limit, the order of the phase transition, and the form of the density profiles.
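
A minimal numerical sketch of the Hartree ground-state computation described above, using imaginary-time split-step evolution on a ring; units with k = M = 1 are assumed, so ℏ_c = 1/√π ≈ 0.564, and the grid size and step counts are illustrative rather than tuned.

```python
import numpy as np

N = 256
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dtheta = theta[1] - theta[0]
kgrid = 2.0 * np.pi * np.fft.fftfreq(N, d=dtheta)    # integer wavenumbers on the ring

def magnetization(hbar, steps=20000, dtau=1e-3):
    """Order parameter b of the self-consistent Hartree ground state."""
    rng = np.random.default_rng(0)
    psi = 1.0 + 0.01 * rng.standard_normal(N)        # perturbed homogeneous state
    psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dtheta)
    for _ in range(steps):
        rho = np.abs(psi)**2
        A = np.sum(np.cos(theta) * rho) * dtheta     # cosine-interaction potential:
        B = np.sum(np.sin(theta) * rho) * dtheta     # Phi = -(A cos(th) + B sin(th))
        psi = psi * np.exp((A * np.cos(theta) + B * np.sin(theta)) * dtau / hbar)
        psi = np.fft.ifft(np.exp(-0.5 * hbar * kgrid**2 * dtau) * np.fft.fft(psi))
        psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dtheta)
    rho = np.abs(psi)**2
    A = np.sum(np.cos(theta) * rho) * dtheta
    B = np.sum(np.sin(theta) * rho) * dtheta
    return np.hypot(A, B)                            # b > 0: inhomogeneous phase

for hbar in (0.3, 0.5, 0.7):                         # hbar_c = 1/sqrt(pi) ~ 0.564
    print(f"hbar = {hbar:.2f}: b = {magnetization(hbar):.3f}")
```

Below ℏ_c the iteration should settle onto a clustered (b > 0) profile; above it, the perturbation decays back toward the homogeneous state.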

Chavanis, Pierre-Henri

2011-08-01

415

PART II TECHNIQUES PROJECT MODELLING OF THE CORROSION OF BINARY ALLOYS

E-print Network

of corrosion depending on a number of factors including the composition and surface condition of the metal … and temperatures. In this work a neural network method was employed to study how the rate of corrosion of Fe
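
A sketch of the kind of neural-network fit the excerpt describes: regressing a corrosion rate on alloy composition and temperature. The data below are entirely synthetic placeholders built from an invented rate law, since the project's actual inputs are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Features: (alloying-element fraction, temperature in K); both ranges invented.
X = rng.uniform([0.0, 300.0], [1.0, 900.0], size=(200, 2))
# Invented rate law: corrosion slows with alloy content, speeds up with temperature.
y = 5.0 * np.exp(-3.0 * X[:, 0]) * np.exp(-2000.0 / X[:, 1])
y = y + 0.01 * rng.standard_normal(200)              # measurement noise

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)
print("predicted rate at 20% alloy, 600 K:", model.predict([[0.2, 600.0]])[0])
```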

Cambridge, University of

416

Bow shock models of ultracompact H II regions  

NASA Technical Reports Server (NTRS)

This paper presents models of ultracompact H II regions as the bow shocks formed by massive stars, with strong stellar winds, moving supersonically through molecular clouds. The morphologies, sizes and brightnesses of observed objects match the models well. Plausible models are provided for the ultracompact H II regions G12.21 - 0.1, G29.96 - 0.02, G34.26 + 0.15, and G43.89 - 0.78. To do this, the equilibrium shape of the wind-blown shell is calculated, assuming momentum conservation. Then the shell is illuminated with ionizing radiation from the central star, radiative transfer for free-free emission through the shell is performed, and the resulting object is visualized at various angles for comparison with radio continuum maps. The model unifies most of the observed morphologies of ultracompact H II regions, excluding only those objects with spherical shells. Ram pressure confinement greatly lengthens the life of ultracompact H II regions, explaining the large number that exist in the Galaxy despite their low apparent kinematic ages.
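
The momentum-conservation step fixes the shell's standoff radius: the shell sits where the stellar-wind ram pressure balances the ram pressure of the oncoming medium. A short sketch with round illustrative parameters (not values fit to the four sources named above):

```python
import numpy as np

# Balance Mdot*v_w / (4*pi*R0**2) = rho_a * v_star**2  and solve for R0.
Msun_per_yr = 1.989e33 / 3.156e7       # g/s
mdot   = 1.0e-6 * Msun_per_yr          # wind mass-loss rate (illustrative O star)
v_w    = 2.0e8                         # wind speed: 2000 km/s in cm/s
n_H    = 1.0e5                         # ambient molecular-cloud density, cm^-3
rho_a  = n_H * 2.34e-24                # g/cm^3, mean mass per hydrogen with helium
v_star = 1.0e6                         # stellar speed: 10 km/s in cm/s

R0 = np.sqrt(mdot * v_w / (4.0 * np.pi * rho_a * v_star**2))
print(f"standoff radius R0 ~ {R0:.2e} cm ~ {R0 / 3.086e18:.3f} pc")
```

With these numbers R0 comes out at a few hundredths of a parsec, consistent with the sizes typically quoted for ultracompact H II regions.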

Mac Low, Mordecai-Mark; Van Buren, Dave; Wood, Douglas O. S.; Churchwell, Ed

1991-01-01

417

Social Validity of a Positive Behavior Interventions and Support Model  

ERIC Educational Resources Information Center

As more schools turn to positive behavior interventions and support (PBIS) to address students' academic and behavioral problems, there is an increased need to adequately evaluate these programs for social relevance. The present study used social validation measures to evaluate a statewide PBIS initiative. Active consumers of the program were…

Miramontes, Nancy Y.; Marchant, Michelle; Heath, Melissa Allen; Fischer, Lane

2011-01-01

418

BIOCHEMICAL AND MORPHOLOGICAL VALIDATION OF A RODENT MODEL OF OPIDN  

EPA Science Inventory

The paper describes six years of research designed to validate the use of the rat as a viable alternative to the hen for screening and mechanistic studies of neuropathic OP compounds. To date the results indicate that if morphological rather than behavioral endpoints are used, th...