Note: This page contains sample records for the topic ii model validation from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: November 12, 2013.
1

Development, calibration and validation of a greenhouse tomato growth model: II. Field calibration and validation  

Microsoft Academic Search

The tomato crop growth model described in part I of this study has been parameterized on the basis of the results of experiments conducted in greenhouses in the northern Negev of Israel. Iterative procedures were applied to derive the parameters for the functional responses of various processes to temperature, radiation intensity and carbon dioxide concentration. The model was subsequently validated

E. Dayan; H. van Keulen; J. W. Jones; I. Zipori; D. Shmuel; H. Challa

1993-01-01

2

Anaerobic digestion model No. 1-based distributed parameter model of an anaerobic reactor: II. Model validation.  

PubMed

In this study, an ADM1-based distributed parameter model was validated using experimental results obtained in a laboratory-scale 10 L UASB reactor. Sensitivity analysis of the model parameters was used to select four parameters for estimation by a numerical procedure while other parameters were accepted from ADM1 benchmark simulations. The parameter estimation procedure used measurements of liquid phase components obtained at different sampling points in the reactor and under different operating conditions. Model verification used real-time fluorescence-based measurements of chemical oxygen demand and volatile fatty acids at four sampling locations in the reactor. Overall, the distributed parameter model was able to describe the distribution of liquid phase components in the reactor and adequately simulated the effect of external recirculation on degradation efficiency. The model can be used in the design, analysis and optimization of UASB reactors. PMID:17889525
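
A hedged sketch of the estimation step described above: a sensitivity analysis selects a few parameters, which are then fitted to concentration profiles measured along the reactor. The decay-law "model", parameter names, and data below are invented placeholders, not the authors' ADM1 implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical: COD measured at several relative heights in the reactor (kg/m^3).
heights = np.array([0.2, 0.4, 0.6, 0.8])
measured_cod = np.array([4.8, 3.1, 2.0, 1.6])

def simulate_cod(params, z):
    """Placeholder reactor model: first-order decay of COD along the height.
    A real ADM1-based distributed parameter model would be solved here."""
    cod_in, k = params
    return cod_in * np.exp(-k * z)

def residuals(params):
    return simulate_cod(params, heights) - measured_cod

# Fit only the sensitive parameters; the rest stay at ADM1 benchmark values.
fit = least_squares(residuals, x0=[5.0, 1.0], bounds=([0.0, 0.0], [20.0, 10.0]))
print("estimated inlet COD and decay rate:", fit.x)
```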

Tartakovsky, B; Mu, S J; Zeng, Y; Lou, S J; Guiot, S R; Wu, P

2007-09-21

3

Large-scale Validation of AMIP II Land-surface Simulations: Preliminary Results for Ten Models  

SciTech Connect

This report summarizes initial findings of a large-scale validation of the land-surface simulations of ten atmospheric general circulation models that are entries in phase II of the Atmospheric Model Intercomparison Project (AMIP II). This validation is conducted by AMIP Diagnostic Subproject 12 on Land-surface Processes and Parameterizations, which is focusing on putative relationships between the continental climate simulations and the associated models' land-surface schemes. The selected models typify the diversity of representations of land-surface climate that are currently implemented by the global modeling community. The current dearth of global-scale terrestrial observations makes exacting validation of AMIP II continental simulations impractical. Thus, selected land-surface processes of the models are compared with several alternative validation data sets, which include merged in-situ/satellite products, climate reanalyses, and off-line simulations of land-surface schemes that are driven by observed forcings. The aggregated spatio-temporal differences between each simulated process and a chosen reference data set then are quantified by means of root-mean-square error statistics; the differences among alternative validation data sets are similarly quantified as an estimate of the current observational uncertainty in the selected land-surface process. Examples of these metrics are displayed for land-surface air temperature, precipitation, and the latent and sensible heat fluxes. It is found that the simulations of surface air temperature, when aggregated over all land and seasons, agree most closely with the chosen reference data, while the simulations of precipitation agree least. In the latter case, there also is considerable inter-model scatter in the error statistics, with the reanalyses estimates of precipitation resembling the AMIP II simulations more closely than the chosen reference data. In aggregate, the simulations of land-surface latent and sensible heat fluxes appear to occupy intermediate positions between these extremes, but the existing large observational uncertainties in these processes make this a provisional assessment. In all selected processes as well, the error statistics are found to be sensitive to season and latitude sector, confirming the need for finer-scale analyses, which are also in progress.
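
A minimal sketch of the two aggregate metrics this report describes: an RMSE of a simulated field against a chosen reference, and the analogous difference between alternative references as a proxy for observational uncertainty. All arrays are synthetic stand-ins; the subproject's actual aggregation and area weighting are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(288.0, 5.0, size=(12, 64, 128))         # monthly T2m, synthetic
simulation = reference + rng.normal(0.5, 1.0, reference.shape)
alt_reference = reference + rng.normal(0.0, 0.8, reference.shape)

def rmse(a, b):
    """Root-mean-square difference aggregated over time and space."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

model_error = rmse(simulation, reference)         # simulation vs chosen reference
obs_uncertainty = rmse(alt_reference, reference)  # spread among validation data sets
print(f"model RMSE = {model_error:.2f} K, observational uncertainty ~ {obs_uncertainty:.2f} K")
```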

Phillips, T J; Henderson-Sellers, A; Irannejad, P; McGuffie, K; Zhang, H

2005-12-01

4

Dynamics and control of towed underwater vehicle system, part II: model validation and turn maneuver optimization  

Microsoft Academic Search

This paper presents a validation of a three-dimensional dynamics model of a towed underwater vehicle system and discusses an application of the model to improve the performance of the system during a turn maneuver. The model was validated by comparing its results to experimental sea trial data, as well as to results from another independently developed simulation. The dynamics model

C Lambert; M Nahon; B Buckham; M Seto

2003-01-01

5

Use of ISLSCP II data to intercompare and validate the terrestrial net primary production in a land surface model coupled to a general circulation model  

Microsoft Academic Search

Using the global terrestrial NPP and climate data from International Satellite Land Surface Climatology Project Initiative II (ISLSCP II) and additional NPP data, we validated the NPP simulations and explored the relationship between NPP and climate variation in a global two-way coupled model AVIM-GOALS. The strength of this study is that the global simulations produced will enhance interactive climate and

Li Dan; Jinjun Ji; Yong He

2007-01-01

6

A wheat grazing model for simulating grain and beef production: Part II - model validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

Model evaluation is a prerequisite to its adoption and successful application. The objective of this paper is to evaluate the ability of a newly developed wheat grazing model to predict fall-winter forage and grain yields of winter wheat (Triticum aestivum L.) as well as daily weight gains per steer...

7

Solvent-refined-coal (SRC) process: reactivity of SRC-II liquefaction products. Interim report No. 69. [Experiments for model validation]

Microsoft Academic Search

Data from experimental studies of the reactivities of SRC-II liquefaction products, in the absence of coal, are analyzed. A quantitative validation of essential features of the new kinetic model for SRC-II coal-liquefaction is obtained by very good agreements between model predicted and measured yields. Validation of the proposed zero order rate mechanism for SRC conversion is tentative in this work,

C. P. P. Singh; N. L. Carr

1982-01-01

8

A new 3D finite element model of the IEC 60318-1 artificial ear: II. Experimental and numerical validation  

NASA Astrophysics Data System (ADS)

In part I, the feasibility of using three-dimensional (3D) finite elements (FEs) to model the acoustic behaviour of the IEC 60318-1 artificial ear was studied and the numerical approach compared with classical lumped elements modelling. It was shown that by using a more complex acoustic model that took account of thermo-viscous effects, geometric shapes and dimensions, it was possible to develop a realistic model. This model then had clear advantages in comparison with the models based on equivalent circuits using lumped parameters. In fact, results from FE modelling provide a better understanding of the physical phenomena produced inside ear simulator couplers, facilitating spatial and temporal visualization of the sound fields produced. The objective of this study (part II) is to extend the investigation by validating the numerical calculations against measurements on an ear simulator conforming to IEC 60318-1. For this purpose, an appropriate commercially available device is taken and a complete 3D FE model developed for it. The numerical model is based on key dimensional data obtained with a non-destructive x-ray inspection technique. Measurements of the acoustic transfer impedance have been carried out on the same device at a national measurement institute using the method embodied in IEC 60318-1. Having accounted for the actual device dimensions, the thermo-viscous effects inside narrow slots and holes and environmental conditions, the results of the numerical modelling were found to be in good agreement with the measured values.

Bravo, Agustín; Barham, Richard; Ruiz, Mariano; López, Juan Manuel; De Arcas, Guillermo; Alonso, Jesus

2012-12-01

9

Harris Corporation, Harris 1600-02 Model II, Harris Interactive COBOL, Version 107.1 (Validation).  

National Technical Information Service (NTIS)

This Validation Summary Report (VSR) for the Harris Interactive COBOL Compiler Version 107.1 (ECOS Version 106.1) provides a consolidated summary of the results obtained from the validation of the subject compiler against the 1974 COBOL Standard (X3.23-19...

1981-01-01

10

The validation of biodynamic models  

Microsoft Academic Search

Biodynamic models may: (i) represent understanding of how the body moves (i.e., 'mechanistic models'), (ii) summarise biodynamic measurements (i.e., 'quantitative models'), and (iii) provide predictions of the effects of motion on human health, comfort or performance (i.e., 'effects models'). Model validation may involve consideration of evidence used to derive a model, comparison of the model with alternatives, and a comparison between

Michael J Griffin

2001-01-01

11

Verifying and validating simulation models  

Microsoft Academic Search

This paper discusses verification and validation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process is discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; and a recommended procedure is presented.

Robert G. Sargent

1996-01-01

12

Solvent-refined-coal (SRC) process: reactivity of SRC-II liquefaction products. Interim report No. 69. [Experiments for model validation]

SciTech Connect

Data from experimental studies of the reactivities of SRC-II liquefaction products, in the absence of coal, are analyzed. A quantitative validation of essential features of the new kinetic model for SRC-II coal-liquefaction is obtained by very good agreements between model predicted and measured yields. Validation of the proposed zero order rate mechanism for SRC conversion is tentative in this work, however, because the concentration of SRC in the feed slurry changed very little. This shows that the new kinetic model provides a reasonable representation of the overall intrinsic process of SRC-II coal liquefaction and hence, it can be used to explore the regions of process conditions far away from those obtained under normal SRC-II operations. The new kinetic model for SRC-II coal liquefaction is based on conversion of SRC-II liquefaction products in the absence of coal. The data set consists of seven consecutive experiments in which the unfiltered coal solution (UFCS) from one experiment is fed to the next. Because the same material is used continuously, apart from the removal of light products, the experiments are termed passes, where the pass number represents the number of times the UFCS is used as feed. Modifications to the normal SRC-II process scheme required to obtain liquefied coal products and those for their reactions in the absence of feed coal are simulated in detail to make the necessary modifications in the kinetic model. Since the experiments analyzed in this work were carried out in the absence of coal, this validation shows that the new kinetic model provides a reasonable representation of the intrinsic process in SRC-II coal liquefaction.

Singh, C.P.P.; Carr, N.L.

1982-05-01

13

Assessing Wildlife Habitat Value of New England Salt Marshes: II. Model Testing and Validation  

EPA Science Inventory

We test a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. Assessment scores ranged f...

14

A new integrated CFD modelling approach towards air-assisted orchard spraying—Part II: Validation for different sprayer types  

Microsoft Academic Search

A computational fluid dynamics (CFD) model to simulate airflow from air-assisted orchard sprayers through pear canopies was validated for three different sprayers; single-fan (Condor V), two-fan (Duoprop) and four-fan sprayers (AirJet Quatt). The first two sprayers are widely used in Belgium and the latter one is a new design. Validation experiments were carried out in an experimental orchard (pcfruit, Velm,

A. Melese Endalew; C. Debaer; N. Rutten; J. Vercammen; M. A. Delele; H. Ramon; B. M. Nicolaï; P. Verboven

2010-01-01

15

Groundwater Model Validation  

SciTech Connect

Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence-building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation data to constrain model input parameters is shown for the second case study using a Bayesian approach known as Markov Chain Monte Carlo. The approach shows great potential to be helpful in the validation process and in incorporating prior knowledge with new field data to derive posterior distributions for both model input and output.
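
As a conceptual sketch only of the acceptance-counting step for stochastic realizations (the full approach uses five metrics and a decision tree): count the realizations whose deviation from the validation data stays below a threshold. The error measure, threshold, and data below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
validation_heads = np.array([102.3, 98.7, 101.1, 99.4])    # new field data (m)

# Suppose each stochastic realization predicts heads at the same four wells.
realizations = rng.normal(100.0, 2.0, size=(200, 4))

# Accept a realization if its RMS deviation from the validation data is small.
rms_dev = np.sqrt(np.mean((realizations - validation_heads) ** 2, axis=1))
acceptable = rms_dev < 2.5                                 # invented threshold

print(f"{acceptable.sum()} of {len(realizations)} realizations acceptable "
      f"({acceptable.mean():.0%}); compare against a pre-set sufficiency criterion")
```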

Ahmed E. Hassan

2006-01-24

16

Water-coupled carbon dioxide microchannel gas cooler for heat pump water heaters: Part II - Model development and validation  

Microsoft Academic Search

An experimental and analytical study on the performance of a compact, microchannel water-carbon dioxide (CO2) gas cooler was conducted. The experimental results addressed in Part I of this study are used here in Part II to develop an analytical model, utilizing a segmented approach to account for the steep gradients in the thermodynamic and transport properties of supercritical CO2. The

Brian M. Fronk; Srinivas Garimella

2011-01-01

17

Black liquor combustion validated recovery boiler modeling: Final year report. Volume 3 (Appendices II, sections 2-3 and III)  

SciTech Connect

This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 3 contains the following appendix sections: Formation and destruction of nitrogen oxides in recovery boilers; Sintering and densification of recovery boiler deposits: laboratory data and a rate model; and Experimental data on rates of particulate formation during char bed burning.

Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

1998-08-01

18

Development of a new version of the Liverpool Malaria Model. II. Calibration and validation for West Africa  

PubMed Central

Background In the first part of this study, an extensive literature survey led to the construction of a new version of the Liverpool Malaria Model (LMM). A new set of parameter settings was provided and a new development of the mathematical formulation of important processes related to the vector population was performed within the LMM. In this part of the study, so far undetermined model parameters are calibrated through the use of data from field studies. The latter are also used to validate the new LMM version, which is furthermore compared against the original LMM version. Methods For the calibration and validation of the LMM, numerous entomological and parasitological field observations were gathered for West Africa. Continuous and quality-controlled temperature and precipitation time series were constructed using intermittent raw data from 34 weather stations across West Africa. The meteorological time series served as the LMM data input. The skill of LMM simulations was tested for 830 different sets of parameter settings of the undetermined LMM parameters. The model version with the highest skill score in terms of entomological malaria variables was taken as the final setting of the new LMM version. Results Validation of the new LMM version in West Africa revealed that the simulations compare well with entomological field observations. The new version reproduces realistic transmission rates and simulated malaria seasons are comparable to field observations. Overall the new model version performs much better than the original model. The new model version enables the detection of the epidemic malaria potential at fringes of endemic areas and, more importantly, it is now applicable to the vast area of malaria endemicity in the humid African tropics. Conclusions A review of entomological and parasitological data from West Africa enabled the construction of a new LMM version. This model version represents a significant step forward in the modelling of a weather-driven malaria transmission cycle. The LMM is now more suitable for the use in malaria early warning systems as well as for malaria projections based on climate change scenarios, both in epidemic and endemic malaria areas.
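
The calibration loop described (test many candidate parameter settings, keep the one with the highest skill score against field observations) has the generic shape sketched below; the toy model function, skill score, and data are stand-ins, not the LMM.

```python
import numpy as np

rng = np.random.default_rng(2)
observed = rng.uniform(0.0, 1.0, 12)              # e.g. monthly entomological index

def run_malaria_model(params):
    """Placeholder for one run of a weather-driven transmission model."""
    a, b = params
    t = np.arange(12)
    return np.clip(a * np.sin(2.0 * np.pi * t / 12.0) + b, 0.0, None)

def skill(sim, obs):
    """Higher is better; here simply the negative mean absolute error."""
    return -np.mean(np.abs(sim - obs))

# Exhaustively score candidate parameter settings and keep the best one.
candidates = [(a, b) for a in np.linspace(0.1, 1.0, 10)
                      for b in np.linspace(0.0, 1.0, 10)]
best = max(candidates, key=lambda p: skill(run_malaria_model(p), observed))
print("selected parameter setting:", best)
```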

2011-01-01

19

Minet validation study using EBR-II transient data  

SciTech Connect

The MINET code, developed to simulate large and complex thermal-hydraulic systems, such as balance of plant, was used to simulate an EBR-II test transient. MINET calculations agreed well with measured parameters, confirming the validity of the basic MINET methodology and key component models.

Van Tuyle, G.J.

1984-01-01

20

Rail Vehicle Dynamics Model Validation.  

National Technical Information Service (NTIS)

The validation of mathematical models of rail vehicle dynamics using test data poses a number of difficult problems, which are addressed in this report. Previous attempts to validate rail vehicle models are reviewed critically, and experience gained in va...

S. E. Shladover; R. L. Hull

1981-01-01

21

Characteristic time model validation  

NASA Astrophysics Data System (ADS)

An experimental program for validation of the semi-empirical Characteristic Time Model (CTM) is described. A two-dimensional turbulent shear layer is generated in the experimental test section using a two-stream, vertically downflowing wind tunnel with a flat pre-filming airblast atomizer fitted along its centerline. This facility simulates the shear layer around the recirculation zone found in the primary zone of a gas turbine combustor. Experimental results are used to investigate CTM parameters for turbulent mixing and droplet lifetime and to examine current finite difference modeling techniques. Global mixing times evaluated at the origin of the shear layer and defined in terms of geometric macroscale and a reference velocity are compared with the locally measured values of turbulent mixing time. The results demonstrate that these global times, as defined for the CTM, do in fact accurately represent the events occurring on a local scale, as hypothesized. Modifications to the mixing time parameter to improve existing correlations are proposed. Due to restrictions imposed by the facility and instrumentation, validation of the droplet lifetime parameter was not possible. Measurements were restricted to mean spray diameters. These data and others demonstrate that current correlations for Sauter mean diameter do not adequately account for changes in atomizer geometry or liquid properties.

Tallio, K. V.; Prior, J. C., Jr.; Mellor, A. M.

1988-09-01

22

Verification, validation and accreditation of simulation models  

Microsoft Academic Search

The paper discusses verification, validation, and accreditation of simulation models. The different approaches to deciding model validity are presented; how model verification and validation relate to the model development process are discussed; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are described; ways to document results are given; a recommended procedure is presented;

Robert G. Sargent

2000-01-01

23

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results

Robert G. Sargent

1994-01-01

24

Verification and validation of simulation models  

Microsoft Academic Search

In this paper we discuss verification and validation of simulation models. Four different approaches to deciding model validity are described; two different paradigms that relate verification and validation to the model development process are presented; various validation techniques are defined; conceptual model validity, model verification, operational validity, and data validity are discussed; a way to document results is given; a

Robert G. Sargent

2003-01-01

25

Ecological reality and model validation  

SciTech Connect

Definitions of model realism and model validation are developed. Ecological and mathematical arguments are then presented to show that model equations which explicitly treat ecosystem processes can be systematically improved such that greater realism is attained and the condition of validity is approached. Several examples are presented.

Cale, W.G. Jr.; Shugart, H.H.

1980-01-01

26

Ab Initio Structural Modeling of and Experimental Validation for Chlamydia trachomatis Protein CT296 Reveal Structural Similarity to Fe(II) 2-Oxoglutarate-Dependent Enzymes  

PubMed Central

Chlamydia trachomatis is a medically important pathogen that encodes a relatively high percentage of proteins with unknown function. The three-dimensional structure of a protein can be very informative regarding the protein's functional characteristics; however, determining protein structures experimentally can be very challenging. Computational methods that model protein structures with sufficient accuracy to facilitate functional studies have had notable successes. To evaluate the accuracy and potential impact of computational protein structure modeling of hypothetical proteins encoded by Chlamydia, a successful computational method termed I-TASSER was utilized to model the three-dimensional structure of a hypothetical protein encoded by open reading frame (ORF) CT296. CT296 has been reported to exhibit functional properties of a divalent cation transcription repressor (DcrA), with similarity to the Escherichia coli iron-responsive transcriptional repressor, Fur. Unexpectedly, the I-TASSER model of CT296 exhibited no structural similarity to any DNA-interacting proteins or motifs. To validate the I-TASSER-generated model, the structure of CT296 was solved experimentally using X-ray crystallography. Impressively, the ab initio I-TASSER-generated model closely matched (2.72-Å Cα root mean square deviation [RMSD]) the high-resolution (1.8-Å) crystal structure of CT296. Modeled and experimentally determined structures of CT296 share structural characteristics of non-heme Fe(II) 2-oxoglutarate-dependent enzymes, although key enzymatic residues are not conserved, suggesting a unique biochemical process is likely associated with CT296 function. Additionally, functional analyses did not support prior reports that CT296 has properties shared with divalent cation repressors such as Fur.

Kemege, Kyle E.; Hickey, John M.; Lovell, Scott; Battaile, Kevin P.; Zhang, Yang; Hefty, P. Scott

2011-01-01

27

Validation of performance assessment models.  

National Technical Information Service (NTIS)

The purpose of model validation in a low-level waste site performance assessment is to increase confidence in predictions of the migration and fate of future releases from the wastes. Unlike the process of computer code verification, model validation is a...

M. P. Bergeron; C. T. Kincaid

1991-01-01

28

Verification and Validation of Models.  

National Technical Information Service (NTIS)

The paper surveys verification and validation of models, especially simulation models in operations research. For verification it discusses (1) general good programming practice (such as modular programming), (2) checking intermediate simulation outputs t...

J. P. C. Kleijnen

1993-01-01

29

Macrotransport-solidification kinetics modeling of equiaxed dendritic growth: Part II. Computation problems and validation on INCONEL 718 superalloy castings  

Microsoft Academic Search

In Part I of the article, a new analytical model that describes solidification of equiaxed dendrites was presented. In this part of the article, the model is used to simulate the solidification of INCONEL 718 superalloy castings. The model was incorporated into a commercial finite-element code, PROCAST. A special procedure called microlatent heat method (MLHM) was used for coupling between

L. Nastac; D. M. Stefanescu

1996-01-01

30

Fall Risk Assessment for Older Adults: The Hendrich II Model  

Microsoft Academic Search

TARGET POPULATION: The Hendrich II Fall Risk Model is intended to be used in the acute care setting to identify adults at risk for falls. The Model is being validated for further application of the specific risk factors in pediatrics and obstetrical populations. VALIDITY AND RELIABILITY: The Hendrich II Fall Risk Model was validated in a large case control study

Deanna Gray-Miceli

31

Mechanical behavior of glass/epoxy tubes under combined static loading. Part II: Validation of FEA progressive damage model  

Microsoft Academic Search

Experimental results from a series of biaxial static tests of E-Glass/Epoxy tubular specimens [±45]2 were compared successfully with numerical predictions from thick shell FE calculations. Stress analysis was performed in a progressive damage sense consisting of layer piece-wise linear elastic behavior, simulating lamina anisotropic non-linear constitutive equations, failure mode-dependent criteria and property degradation strategies. The effect of accurate modeling of

Alexandros E. Antoniou; Christoph Kensche; Theodore P. Philippidis

2009-01-01

32

ON PREDICTION AND MODEL VALIDATION  

SciTech Connect

Quantification of prediction uncertainty is an important consideration when using mathematical models of physical systems. This paper proposes a way to incorporate "validation data" in a methodology for quantifying uncertainty of the mathematical predictions. The report outlines a theoretical framework.

M. MCKAY; R. BECKMAN; K. CAMPBELL

2001-02-01

33

Coupling an Advanced Land Surface Hydrology Model with the Penn State NCAR MM5 Modeling System. Part II: Preliminary Model Validation  

Microsoft Academic Search

A number of short-term numerical experiments conducted by the Penn State-NCAR fifth-generation Mesoscale Model (MM5) coupled with an advanced land surface model, alongside the simulations coupled with a simple slab model, are verified with observations. For clear sky day cases, the MM5 model gives reasonable estimates of radiation forcing at the surface with solar radiation being slightly overestimated probably due

Fei Chen; Jimy Dudhia

2001-01-01

34

WILDFIRE RISK MODEL VALIDATION  

Microsoft Academic Search

Field samples (n = 128) collected during summer 2001 were used to create a series of fuel load and wildfire risk models. Field data collected during the summer 2002 field season (n = 370) were used to determine the accuracy of these models, as well as to refine and rebuild the existing models. Using this new data, we created an

Ben McMahan; Keith T. Weber

35

TUTORIAL: Validating biorobotic models  

NASA Astrophysics Data System (ADS)

Some issues in neuroscience can be addressed by building robot models of biological sensorimotor systems. What we can conclude from building models or simulations, however, is determined by a number of factors in addition to the central hypothesis we intend to test. These include the way in which the hypothesis is represented and implemented in simulation, how the simulation output is interpreted, how it is compared to the behaviour of the biological system, and the conditions under which it is tested. These issues will be illustrated by discussing a series of robot models of cricket phonotaxis behaviour.

Webb, Barbara

2006-09-01

36

Validation of the Biglan model  

Microsoft Academic Search

The empirical validity of the Biglan model of academic disciplines is supported by the results of this study. Examples are provided to illustrate how the systematic use of this model could enhance the quality of research on university faculty members and the academic administration of institutions of higher learning.

John C. Smart; Charles F. Elton

1982-01-01

37

Tank waste source term inventory validation. Volume II. Letter report.  

National Technical Information Service (NTIS)

This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for selection of 11 radionuclides a...

1995-01-01

38

ADEOS-II calibration and validation plan  

Microsoft Academic Search

ADEOS-II is the second Japanese global observation satellite, following ADEOS; it is intended to improve the satellite-based global change observation system and to obtain Earth observation data sets for elucidation of the global water and energy cycle, carbon cycle, stratospheric ozone depletion and so on. For these mission objectives, five remote sensing instruments are onboard ADEOS-II, which was launched on December 14, 2002.

T. Igarashi; N. Matsuura

2003-01-01

39

Validation assessment model for atmospheric retrievals  

Microsoft Academic Search

A linear mathematical error model for the assessment of validation activity of atmospheric retrievals is presented. The purpose of the validation activity is to assess the actual in-orbit performance of the validated remote sensing system by comparing its measurements to relevant validating data sets. The validating system samples volumes of the atmosphere at times and locations that are different

Nikita Pougatchev; Gail Bingham; Joel Cardon; Karen St. Germain; Stephen Mango; Joe Tansock; Vladimir Zavyalov; Stanislav Kireev; David Tobin

2006-01-01

40

ADAPT model: Model use, calibration and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

This paper presents an overview of the Agricultural Drainage and Pesticide Transport (ADAPT) model and a case study to illustrate the calibration and validation steps for predicting subsurface tile drainage and nitrate-N losses from an agricultural system. The ADAPT model is a daily time step field ...

41

Comparison of MCMI-II and 16PF validity scales.  

PubMed

We administered the Millon Clinical Multiaxial Inventory-II (MCMI-II; Millon, 1987) and the Sixteen Personality Factors Inventory (16PF; Cattell, Eber, & Tatsuoka, 1970) to 131 outpatients in marital therapy and tested the correlation between the validity scales of the two instruments. The results indicated that MCMI-II Disclosure and Debasement scales were positively correlated with the 16PF Fake-Bad scale and negatively correlated with the 16PF Fake-Good scale. The MCMI-II Desirability scale was significantly correlated with the 16PF Fake-Good scale. PMID:7722863

Grossman, L S; Craig, R J

1995-04-01

42

The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models  

PubMed Central

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.

2012-01-01

43

(Validity of environmental transfer models)  

SciTech Connect

BIOMOVS (BIOspheric MOdel Validation Study) is an international cooperative study initiated in 1985 by the Swedish National Institute of Radiation Protection to test models designed to calculate the environmental transfer and bioaccumulation of radionuclides and other trace substances. The objective of the symposium and workshop was to synthesize results obtained during Phase 1 of BIOMOVS (the first five years of the study) and to suggest new directions that might be pursued during Phase 2 of BIOMOVS. The travelers were an instrumental part of the development of BIOMOVS. This symposium allowed the travelers to present a review of past efforts at model validation and a synthesis of current activities and to refine ideas concerning future development of models and data for assessing the fate, effect, and human risks of environmental contaminants. R. H. Gardner also visited the Free University, Amsterdam, and the National Institute of Public Health and Environmental Protection (RIVM) in Bilthoven to confer with scientists about current research in theoretical ecology and the use of models for estimating the transport and effect of environmental contaminants and to learn about the European efforts to map critical loads of acid deposition.

Blaylock, B.G.; Hoffman, F.O.; Gardner, R.H.

1990-11-07

44

Calibration and validation of an ASM3-based steady-state model for activated sludge systems--part II: Prediction of phosphorus removal.  

PubMed

An ASM3-based steady-state model which can be used for estimating the average nitrogen-removal, sludge-production and phosphorus-removal rates of different biological phosphorus-removing systems (AAO, UCT, intermittent processes) is developed. It considers the wastewater composition, the oxygen and nitrate input in the anaerobic compartment and the interaction between biological phosphorus removal and denitrification for different operating conditions. The model is calibrated and validated with data from a number of long-term pilot and full-scale experiments for Swiss municipal wastewater. The steady-state model is adequate for a comparison of different BPR process configurations or for a first estimation of the nutrient-removal efficiency. It allows the plant performance and key parameters to be determined very quickly. Excel spreadsheets of the model for different flow schemes are available from the corresponding author. PMID:11358304

Koch, G; Kühni, M; Rieger, L; Siegrist, H

2001-06-01

45

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor 2. Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1992-01-01

46

Validation for a recirculation model.  

PubMed

Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation. PMID:11318387
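
A minimal steady-state sketch of the kind of mass balance the abstract refers to, for a single well-mixed zone with a recirculation fraction and a filter on the recirculated stream; the variable names and the filter-capture assumption are mine, not the published model's.

```python
def steady_state_concentration(emission_mg_min, supply_m3_min,
                               recirc_fraction, filter_efficiency):
    """Well-mixed single-zone mass balance at steady state.

    Contaminant leaves either with exhausted air (fraction 1-R of the supply)
    or by capture in the filter on the recirculated stream (R * eta).
    """
    effective_removal = supply_m3_min * (
        (1.0 - recirc_fraction) + recirc_fraction * filter_efficiency)
    return emission_mg_min / effective_removal  # mg/m^3

# Example: raising recirculation from 0% to 70% with a 60%-efficient filter
# raises the predicted indoor concentration, as the abstract anticipates.
for r in (0.0, 0.7):
    c = steady_state_concentration(50.0, 300.0, r, 0.60)
    print(f"recirculation {r:.0%}: {c:.3f} mg/m^3")
```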

LaPuma, P T

2001-04-01

47

Bayesian validation assessment of multivariate computational models  

Microsoft Academic Search

Multivariate model validation is a complex decision-making problem involving comparison of multiple correlated quantities, based upon the available information and prior knowledge. This paper presents a Bayesian risk-based decision method for validation assessment of multivariate predictive models under uncertainty. A generalized likelihood ratio is derived as a quantitative validation metric based on Bayes’ theorem and Gaussian distribution assumption of errors
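
In the simplest univariate case, a likelihood ratio under the Gaussian error assumption reduces to the sketch below; the data, error standard deviation, and the fixed-bias alternative are illustrative assumptions, and the paper's multivariate, correlated formulation is not reproduced.

```python
import numpy as np
from scipy.stats import norm

# Differences between model predictions and measurements (illustrative).
errors = np.array([0.12, -0.05, 0.30, 0.18, -0.02])
sigma = 0.2    # assumed error standard deviation
delta = 0.25   # assumed practically significant bias for the alternative

# H0: model valid, errors ~ N(0, sigma^2); H1: errors ~ N(delta, sigma^2).
log_bf = (norm.logpdf(errors, 0.0, sigma).sum()
          - norm.logpdf(errors, delta, sigma).sum())
print(f"Bayes factor for 'model valid': {np.exp(log_bf):.2f} (>1 favors H0)")
```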

Xiaomo Jiang; Sankaran Mahadevan

2008-01-01

48

Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part I: Benchmark comparisons of WIMS-D5 and DRAGON cell and control rod parameters with MCNP5  

SciTech Connect

In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which has been progressing slowly during the last ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure-vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. For the reactor physics area, a revision and update of calculation methods and models (cell, supercell and reactor) was recently carried out, covering cell, supercell (control rod) and core calculations. As a validation of the new models, some benchmark comparisons were made with Monte Carlo calculations using MCNP5. This paper presents comparisons of cell and supercell benchmark problems based on a slightly idealized model of the Atucha-I core, obtained with the WIMS-D5 and DRAGON codes, with MCNP5 results. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, and more symmetric than Atucha-II. Cell parameters compared include cell k-infinity, relative power levels of the different rings of fuel rods, and some two-group macroscopic cross sections. Supercell comparisons include supercell k-infinity changes due to the control rods (tubes) of steel and hafnium. (authors)
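
Code-to-code comparisons of cell k-infinity such as these are conventionally reported as reactivity differences in pcm; a small sketch of that standard conversion (the k values below are made up, not results from the paper).

```python
def reactivity_diff_pcm(k_code, k_reference):
    """Reactivity difference (1/k_ref - 1/k_code) expressed in pcm (1e-5)."""
    return (1.0 / k_reference - 1.0 / k_code) * 1e5

# Hypothetical cell k-infinity values, for illustration only.
k_mcnp5, k_wims = 1.10250, 1.10410
print(f"WIMS-D5 vs MCNP5: {reactivity_diff_pcm(k_wims, k_mcnp5):+.0f} pcm")
```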

Mollerach, R. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina); Leszczynski, F. [Comision Nacional de Energia Atomica, Avenida del Libertador 8250, Buenos Aires 1429 (Argentina); Fink, J. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina)

2006-07-01

49

Testing ecological models: the meaning of validation  

Microsoft Academic Search

The ecological literature reveals considerable confusion about the meaning of validation in the context of simulation models. The confusion arises as much from semantic and philosophical considerations as from the selection of validation procedures. Validation is not a procedure for testing scientific theory or for certifying the ‘truth’ of current scientific understanding, nor is it a required activity of every

Edward J. Rykiel

1996-01-01

50

Model validation using experimental watershed data  

Technology Transfer Automated Retrieval System (TEKTRAN)

Experimental watersheds are an invaluable resource for model development and validation. These watersheds allow us to develop and evaluate hydrological models and test them in a variety of climates and ecosystems. Validation efforts of the Simultaneous Heat and Water (SHAW) model are presented for a ...

51

Proposed Modifications to the Conceptual Model of Coaching Efficacy and Additional Validity Evidence for the Coaching Efficacy Scale II-High School Teams  

ERIC Educational Resources Information Center

The purpose of this study was to determine whether theoretically relevant sources of coaching efficacy could predict the measures derived from the Coaching Efficacy Scale II-High School Teams (CES II-HST). Data were collected from head coaches of high school teams in the United States (N = 799). The analytic framework was a multiple-group…

Myers, Nicholas; Feltz, Deborah; Chase, Melissa

2011-01-01

53

Verification and validation of simulation models  

Microsoft Academic Search

This paper surveys verification and validation of models, especially simulation models in operations research. For verification it discusses 1) general good programming practice (such as modular programming), 2) checking intermediate simulation outputs through tracing and statistical testing per module, 3) statistical testing of final simulation outputs against analytical results, and 4) animation. For validation it discusses 1) obtaining real-world data,

Jack P. C. Kleijnen

1995-01-01

54

Internal validation of models with several interventions.  

PubMed

In cost-effectiveness analyses, models are used typically to synthesize the best available data and/or extrapolate beyond clinical trial data. Ideally, models should be validated both internally and externally. The purpose of this paper is to suggest a test for internal validation of models where several interventions for the same clinical indication are compared. To the best of our knowledge, such a specific test does not yet exist. There are four versions of the test, which consider the relationship between incremental downstream costs and effects in the case of a single or several endpoints. We apply two versions of the validation test to published cost-effectiveness analyses of physical activity programs and demonstrate internal validity of the model in one study and lack of internal validity of the model in the other study. PMID:23124683

Gandjour, Afschin; Gafni, Amiram

2012-11-03

55

Validation of models: statistical techniques and data availability  

Microsoft Academic Search

This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability, three situations are distinguished: (i) no data; (ii) only output data; and (iii) both input and output data. In case (i), with no real data, the analysts can still experiment with the simulation model to obtain simulated data; such an
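
For situation (ii), with only real output data available, a two-sample comparison of real and simulated outputs is a natural first test; a sketch with synthetic numbers (the choice of a Welch t-test here is mine, one of several techniques such a survey would cover).

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
real_output = rng.normal(10.0, 2.0, 50)         # observed system responses
simulated_output = rng.normal(10.4, 2.0, 200)   # model-generated responses

# Situation (ii): only output data available -> compare the two samples.
stat, p_value = ttest_ind(real_output, simulated_output, equal_var=False)
print(f"Welch t = {stat:.2f}, p = {p_value:.3f}; "
      "a small p suggests the simulated output distribution differs from reality")
```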

J. P. C. Kleijnen

1999-01-01

56

Ground-water models cannot be validated  

Microsoft Academic Search

Ground-water models are embodiments of scientific hypotheses. As such, the models cannot be proven or validated, but only tested and invalidated. However, model testing and the evaluation of predictive errors lead to improved models and a better understanding of the problem at hand. In applying ground-water models to field problems, errors arise from conceptual deficiencies, numerical errors, and inadequate parameter

Leonard F. Konikow; John D. Bredehoeft

1992-01-01

57

Definition and validation of model transformations  

Microsoft Academic Search

With model transformations becoming more widely used, there is an increasing need for approaches focussing on a systematic development of model transformations. Although a number of approaches for specifying model transformations exist, none of them focusses on systematically validating model transformations with respect to termination and confluence. Termination and confluence ensure that a model transformation always produces a unique result.

Jochen Malte Küster

2006-01-01

58

Experimental Models for Validating Technology  

Microsoft Academic Search

Experimentation helps determine the effectiveness of proposed theories and methods. However, computer science has not developed a concise taxonomy of methods for demonstrating the validity of new techniques. Experimentation is a crucial part of attribute evaluation and can help determine whether methods used in accordance with some theory during product development will result in software being as effective as necessary.

Marvin V. Zelkowitz; Dolores R. Wallace

1998-01-01

59

Validation of Hadronic Models in Geant4.  

National Technical Information Service (NTIS)

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented...

D. H. Wright; G. Folger; T. Koi; V. Ivantchenko

2006-01-01

60

Statistical validation of system models.  

National Technical Information Service (NTIS)

It is common practice in system analysis to develop mathematical models for system behavior. Frequently, the actual system being modeled is also available for testing and observation, and sometimes the test data are used to help identify the parameters of...

P. Barney; C. Ferregut; L. E. Perez; N. F. Hunter; T. L. Paez

1997-01-01

61

Nonlinear model validation using correlation tests  

Microsoft Academic Search

New higher order correlation tests which use model residuals combined with system inputs and outputs are presented to check the validity of a general class of nonlinear models. The new method is illustrated by testing both simple and complex nonlinear system models.
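
Tests of this family check that sample correlations of the residuals with themselves, the input, and simple nonlinear input transforms stay inside the 95% band ±1.96/√N; the sketch below illustrates the idea on white-noise data and is not the authors' full set of higher-order tests.

```python
import numpy as np

def xcorr(x, y, max_lag=10):
    """Normalized sample cross-correlation of x and y at lags 0..max_lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return np.array([np.dot(x[:n - k], y[k:]) / n for k in range(max_lag + 1)])

rng = np.random.default_rng(4)
u = rng.normal(size=500)              # input sequence
residuals = rng.normal(size=500)      # model residuals (white if the model is valid)

bound = 1.96 / np.sqrt(len(u))        # 95% confidence band
tests = {"phi_ee":  xcorr(residuals, residuals),   # residual autocorrelation
         "phi_ue":  xcorr(u, residuals),           # input-residual correlation
         "phi_u2e": xcorr(u ** 2, residuals)}      # a simple higher-order test

for name, phi in tests.items():
    vals = phi[1:] if name == "phi_ee" else phi    # skip lag 0 of autocorrelation
    ok = np.all(np.abs(vals) <= bound)
    print(f"{name}: {'inside' if ok else 'OUTSIDE'} the +/-{bound:.3f} band")
```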

S. A. BILLINGS; Q. M. ZHU

1994-01-01

62

Local thermal seeing modeling validation through observatory measurements  

NASA Astrophysics Data System (ADS)

Dome and mirror seeing are critical effects influencing the optical performance of ground-based telescopes. Computational Fluid Dynamics (CFD) can be used to obtain the refractive index field along a given optical path and calculate the corresponding image quality utilizing optical modeling tools. This procedure is validated using measurements from the Keck II and CFHT telescopes. CFD models of the Keck II and CFHT observatories on the Mauna Kea summit have been developed. The detailed models resolve all components that can influence the flow pattern through turbulence generation or heat release. Unsteady simulations generate time records of velocity and temperature fields from which the refractive index field at a given wavelength and turbulence parameters are obtained. At Keck II, Cn2 and l0 (the inner scale of turbulence) were monitored along a 63 m path sensitive primarily to turbulence around the top ring of the telescope tube. For validation, these parameters were derived from temperature and velocity fluctuations obtained from CFD simulations. At CFHT, dome seeing has been inferred from a database that includes telescope-delivered Image Quality (IQ). For this case, CFD simulations were run for specific orientations of the telescope with respect to the incoming wind, wind speeds and outside air temperatures. For validation, temperature fluctuations along the optical beam from the CFD are converted into refractive index variations and corresponding Optical Path Differences (OPD), then into Point Spread Functions (PSF) that are ultimately compared to the record of IQ.
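
The temperature-to-OPD step can be sketched with the standard optical refractivity of air, (n - 1) ≈ 77.6e-6 · P/T (P in hPa, T in K); the path geometry, pressure, and fluctuation statistics below are illustrative assumptions, not values from the Keck II or CFHT models.

```python
import numpy as np

def refractive_index(p_hpa, t_kelvin):
    """Approximate optical refractivity of air: n - 1 ~ 77.6e-6 * P/T."""
    return 1.0 + 77.6e-6 * p_hpa / t_kelvin

rng = np.random.default_rng(5)
n_cells, ds = 630, 0.1            # 630 steps of 0.1 m ~ a 63 m optical path
p_hpa = 615.0                     # assumed summit-level ambient pressure (hPa)
t_mean = 275.0                    # assumed mean air temperature (K)
t_fluct = rng.normal(0.0, 0.2, n_cells)   # simulated temperature fluctuations (K)

# Integrate the index perturbation along the path to get an OPD in meters.
dn = refractive_index(p_hpa, t_mean + t_fluct) - refractive_index(p_hpa, t_mean)
opd = np.sum(dn * ds)
print(f"OPD from thermal fluctuations: {opd * 1e9:.1f} nm")
```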

Vogiatzis, Konstantinos; Otarola, Angel; Skidmore, Warren; Travouillon, Tony; Angeli, George

2012-09-01

63

Real-time validation of mechanical models coupling PGD and constitutive relation error  

NASA Astrophysics Data System (ADS)

In this work, we introduce a general framework that enables to perform real-time validation of mechanical models. This framework is based on two main ingredients: (i) the constitutive relation error which constitutes a convenient and mechanically sound tool for model validation; (ii) a powerful method for model reduction, the proper generalized decomposition, which is used to compute a solution with separated representations and thus to run the validation process quickly. Performances of the proposed approach are illustrated on machining applications.

Bouclier, Robin; Louf, François; Chamoin, Ludovic

2013-10-01

64

Empirical assessment of model validity  

SciTech Connect

The metabolism of amino acids is far more complicated than a 1- to 2-pool model. Yet, these simple models have been extensively used with many different isotopically labeled tracers to study protein metabolism. A tracer of leucine and measurement of leucine kinetics has been a favorite choice for following protein metabolism. However, administering a leucine tracer and following it in blood will not adequately reflect the complex multi-pool nature of the leucine system. Using the tracer enrichment of the ketoacid metabolite of leucine, alpha-ketoisocaproate (KIC), to reflect intracellular events of leucine was an important improvement. Whether this approach is adequate to follow accurately leucine metabolism in vivo or not has not been tested. From data obtained using simultaneous administration of leucine and KIC tracers, we developed a 10-pool model of the in vivo leucine-KIC and bicarbonate kinetic system. Data from this model were compared with conventional measurements of leucine kinetics. The results from the 10-pool model agreed best with the simplified approach using a leucine tracer and measurement of KIC enrichment.

Wolfe, R.R. (Shriners Burns Institute, Galveston, TX (USA))

1991-05-01

65

Fundamentals of population pharmacokinetic modelling: validation methods.  

PubMed

Population pharmacokinetic modelling is widely used within the field of clinical pharmacology as it helps to define the sources and correlates of pharmacokinetic variability in target patient populations and their impact upon drug disposition; and population pharmacokinetic modelling provides an estimation of drug pharmacokinetic parameters. This method's defined outcome aims to understand how participants in population pharmacokinetic studies are representative of the population as opposed to the healthy volunteers or highly selected patients in traditional pharmacokinetic studies. This review focuses on the fundamentals of population pharmacokinetic modelling and how the results are evaluated and validated. This review defines the common aspects of population pharmacokinetic modelling through a discussion of the literature describing the techniques and placing them in the appropriate context. The concept of validation, as applied to population pharmacokinetic models, is explored focusing on the lack of consensus regarding both terminology and the concept of validation itself. Population pharmacokinetic modelling is a powerful approach where pharmacokinetic variability can be identified in a target patient population receiving a pharmacological agent. Given the lack of consensus on the best approaches in model building and validation, sound fundamentals are required to ensure the selected methodology is suitable for the particular data type and/or patient population. There is a need to further standardize and establish the best approaches in modelling so that any model created can be systematically evaluated and the results relied upon. PMID:22799590

Sherwin, Catherine M T; Kiang, Tony K L; Spigarelli, Michael G; Ensom, Mary H H

2012-09-01

66

Model Selection Via Multifold Cross Validation  

Microsoft Academic Search

A natural extension of the simple leave-one-out cross validation (CV) method is to allow the deletion of more than one observations. In this article, several notions of the multifold cross validation (MCV) method have been discussed. In the context of variable selection under a linear regression model, we show that the delete-d MCV criterion is asymptotically equivalent to the well
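
A sketch of delete-d multifold cross-validation for variable selection in a linear model: repeatedly hold out d of n observations, fit by least squares on the rest, and average the held-out squared error per candidate feature subset. Data and subset choices are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
n, d, n_splits = 60, 15, 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=n)   # only feature 0 matters

def mcv_mse(cols):
    """Average held-out MSE over random delete-d splits for feature subset cols."""
    errs = []
    for _ in range(n_splits):
        test = rng.choice(n, size=d, replace=False)
        train = np.setdiff1d(np.arange(n), test)
        A = np.column_stack([np.ones(len(train)), X[np.ix_(train, cols)]])
        beta, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        A_test = np.column_stack([np.ones(d), X[np.ix_(test, cols)]])
        errs.append(np.mean((A_test @ beta - y[test]) ** 2))
    return np.mean(errs)

print("delete-d CV MSE, subset [0]:      ", round(mcv_mse([0]), 3))
print("delete-d CV MSE, subset [0, 1, 2]:", round(mcv_mse([0, 1, 2]), 3))
```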

Ping Zhang

1993-01-01
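
As an illustration of the delete-d idea discussed above, a minimal sketch of multifold cross-validation for linear variable selection (data, candidate models, and split counts are all synthetic choices):

    # Delete-d multifold CV: repeatedly hold out d observations, fit each candidate
    # linear model on the rest, and score by held-out squared prediction error.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 60, 20
    X = rng.normal(size=(n, 3))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=n)  # X[:,2] inactive

    def cv_score(cols, n_splits=200):
        errs = []
        for _ in range(n_splits):
            test = rng.choice(n, size=d, replace=False)
            train = np.setdiff1d(np.arange(n), test)
            beta, *_ = np.linalg.lstsq(X[train][:, cols], y[train], rcond=None)
            errs.append(np.mean((y[test] - X[test][:, cols] @ beta) ** 2))
        return float(np.mean(errs))

    candidates = [list(c) for r in (1, 2, 3) for c in itertools.combinations(range(3), r)]
    best = min(candidates, key=cv_score)
    print("selected predictors:", best)   # the truly active ones are [0, 1]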

67

A Sampling Model for Validity  

Microsoft Academic Search

A multifacet sampling model, based on generalizability theory, is developed for the measurement of dispositional attributes. Dispositions are defined in terms of universes of observations, and the value of the disposition is given by the universe score, the mean over the universe defining the disposition. Observed scores provide estimates of universe scores, and errors of measurement are introduced in order

Michael T. Kane

1982-01-01

68

Validation of the Hot Strip Mill Model  

SciTech Connect

The Hot Strip Mill Model (HSMM) is an off-line, PC-based software package originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the task of enhancing and validating the technology. With the support of five North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

Richard Shulkosky; David Rosberg; Jerrud Chapman

2005-03-30

69

Linear Model Selection by Cross-validation  

Microsoft Academic Search

We consider the problem of selecting a model having the best predictive ability among a class of linear models. The popular leave-one-out cross-validation method, which is asymptotically equivalent to many other model selection methods such as the Akaike information criterion (AIC), the Cp, and the bootstrap, is asymptotically inconsistent in the sense that the probability of selecting the model with

Jun Shao

1993-01-01

70

Validation plan for the German CAMAELEON model  

Microsoft Academic Search

Engineers and scientists at the US Army's Night Vision and Electronic Sensors Directorate (NVESD) are in the process of evaluating the German CAMAELEON model, a signature evaluation model that was created for use in designing and evaluating camouflage in the visible spectrum and is based on computational vision methodologies. Verification and preliminary validation have been very positive. For this reason,

James R. McManamey

1997-01-01

71

Minimal type II seesaw model  

SciTech Connect

We propose a minimal type II seesaw model by introducing only one right-handed neutrino besides the SU(2)_L triplet Higgs to the standard model. In the usual type II seesaw models with several right-handed neutrinos, the contributions of the right-handed neutrinos and the triplet Higgs to the CP asymmetry, which stems from the decay of the lightest right-handed neutrino, are proportional to their respective contributions to the light neutrino mass matrix. However, in our minimal type II seesaw model, this CP asymmetry is just given by the one-loop vertex correction involving the triplet Higgs, even though the contribution of the triplet Higgs does not dominate the light neutrino masses. For illustration, the Fritzsch-type lepton mass matrices are considered.
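
For orientation, the generic type II seesaw mass relation can be written (in a common convention; normalization factors vary between papers, and this form is not taken from the record above):

    m_\nu \;=\; m_\nu^{\mathrm{II}} + m_\nu^{\mathrm{I}}
          \;=\; Y_\Delta v_\Delta \;-\; \frac{v^2}{2}\, Y_\nu M_R^{-1} Y_\nu^{T},

where v_\Delta is the triplet vacuum expectation value, Y_\Delta the triplet Yukawa matrix, v the standard Higgs vacuum expectation value, Y_\nu the Dirac Yukawa couplings, and M_R the right-handed neutrino mass matrix. In the minimal model of this record, M_R contains a single right-handed neutrino, so the type I term has rank one and the triplet contribution is needed to give mass to more than one light neutrino.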

Gu Peihong; Zhang He; Zhou Shun [Institute of High Energy Physics, Chinese Academy of Sciences, P.O. Box 918 (4), Beijing 100049 (China)]

2006-10-01

72

On validation and invalidation of biological models  

PubMed Central

Background Very frequently the same biological system is described by several, sometimes competing mathematical models. This usually creates confusion around their validity, i.e., which one is correct. However, this is unnecessary since validity of a model cannot be established; model validation is actually a misnomer. In principle the only statement that one can make about a system model is that it is incorrect, i.e., invalid, a fact which can be established given appropriate experimental data. Nonlinear models of high dimension and with many parameters are impossible to invalidate through simulation and as such the invalidation process is often overlooked or ignored. Results We develop different approaches for showing how competing ordinary differential equation (ODE) based models of the same biological phenomenon containing nonlinearities and parametric uncertainty can be invalidated using experimental data. We first emphasize the strong interplay between system identification and model invalidation and we describe a method for obtaining a lower bound on the error between candidate model predictions and data. We then turn to model invalidation and formulate a methodology for discrete-time and continuous-time model invalidation. The methodology is algorithmic and uses Semidefinite Programming as the computational tool. It is emphasized that trying to invalidate complex nonlinear models through exhaustive simulation is not only computationally intractable but also inconclusive. Conclusion Biological models derived from experimental data can never be validated. In fact, in order to understand biological function one should try to invalidate models that are incompatible with available data. This work describes a framework for invalidating both continuous and discrete-time ODE models based on convex optimization techniques. The methodology does not require any simulation of the candidate models; the algorithms presented in this paper have a worst case polynomial time complexity and can provide an exact answer to the invalidation problem.
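
The invalidation logic can be sketched as follows: a candidate model is invalidated if even the best-fitting admissible parameters miss the data by more than the assumed measurement-error bound. The paper's Semidefinite Programming machinery provides guaranteed bounds; this sketch substitutes generic nonlinear optimization, which gives no such guarantee:

    # Invalidation sketch: minimize the model-data misfit over the admissible
    # parameter set; if the minimum still exceeds the error bound, the model
    # cannot explain the data. Data and bounds are hypothetical.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    t_data = np.array([0.0, 1.0, 2.0, 4.0])
    y_data = np.array([1.0, 0.6, 0.9, 1.5])      # hypothetical measurements
    noise_bound = 0.05                           # assumed bound on mean squared error

    def predict(theta):
        # Candidate model: first-order decay dy/dt = -k*y with k in [0, 5].
        k = theta[0]
        sol = solve_ivp(lambda t, y: [-k * y[0]], (0.0, 4.0), [1.0], t_eval=t_data)
        return sol.y[0]

    def misfit(theta):
        return float(np.mean((predict(theta) - y_data) ** 2))

    res = minimize(misfit, x0=[0.5], bounds=[(0.0, 5.0)])
    print(f"best achievable misfit: {res.fun:.3f}")
    if res.fun > noise_bound:
        print("model invalidated: no admissible parameter reproduces the data")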

Anderson, James; Papachristodoulou, Antonis

2009-01-01

73

Numerical model representation and validation strategies  

SciTech Connect

This paper describes model representation and validation strategies for use in numerical tools that define models in terms of topology, geometry, or topography. Examples of such tools include Computer-Assisted Engineering (CAE), Computer-Assisted Manufacturing (CAM), Finite Element Analysis (FEA), and Virtual Environment Simulation (VES) tools. These tools represent either physical objects or conceptual ideas using numerical models for the purpose of posing a question, performing a task, or generating information. Dependence on these numerical representations requires that models be precise, consistent across different applications, and verifiable. This paper describes a strategy for ensuring precise, consistent, and verifiable numerical model representations in a topographic framework. The main assertion put forth is that topographic model descriptions are more appropriate for numerical applications than topological or geometrical descriptions. A topographic model verification and validation methodology is presented.

Dolin, R.M.; Hefele, J.

1997-10-01

74

Uncertainty Modeling via Frequency Domain Model Validation.  

National Technical Information Service (NTIS)

The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much att...

M. R. Waszak D. Andrisani

1999-01-01

75

Systematic Independent Validation of Inner Heliospheric Models  

NASA Astrophysics Data System (ADS)

This presentation is the first in a series which will provide independent validation of community models of the outer corona and inner heliosphere. In this work we establish a set of measures to be used in validating this group of models. We use these procedures to generate a comprehensive set of results from the Wang-Sheeley-Arge (WSA) model which will be used as a baseline, or reference, against which to compare all other models. We also run a test of the validation procedures by applying them to a small set of results produced by the ENLIL Magnetohydrodynamic (MHD) model. In future presentations we will validate other models currently hosted by the Community Coordinated Modeling Center (CCMC), including a comprehensive validation of the ENLIL model. The WSA model is widely used to model the solar wind, and is used by a number of agencies to predict solar wind conditions at Earth as much as four days into the future. Because it is so important to both the research and space weather forecasting communities, it is essential that its performance be measured systematically and independently. In this paper we offer just such an independent and systematic validation. We report skill scores for the model's predictions of wind speed and IMF polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests line-of-sight magnetograms. For this study we generated model results for monthly magnetograms from the National Solar Observatory (SOLIS), Mount Wilson Observatory and the GONG network, spanning the Carrington rotation range from 1650 to 2068. We compare the influence of the different magnetogram sources, performance at quiet and active times, and estimate the effect of different empirical wind speed tunings. We also consider the ability of the WSA model to identify sharp transitions in wind speed from slow to fast wind. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of MHD models under development for use in forecasting.
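
A hedged sketch of the kind of skill score such a validation might report (all arrays are synthetic stand-ins, not WSA output, and the baseline choice is an assumption):

    # MSE-based skill score relative to a climatological baseline:
    # 1 = perfect prediction, <= 0 = no better than the baseline.
    import numpy as np

    rng = np.random.default_rng(1)
    observed = 400.0 + 150.0 * rng.random(100)          # km/s, in situ stand-in
    predicted = observed + rng.normal(0.0, 30.0, 100)   # model output stand-in
    reference = np.full(100, observed.mean())           # baseline: mean climatology

    mse_model = np.mean((predicted - observed) ** 2)
    mse_ref = np.mean((reference - observed) ** 2)
    skill = 1.0 - mse_model / mse_ref
    print(f"skill score vs climatological baseline: {skill:.2f}")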

MacNeice, P. J.; Taktakishvili, A.

2008-12-01

76

Oil spill impact modeling: development and validation.  

PubMed

A coupled oil fate and effects model has been developed for the estimation of impacts to habitats, wildlife, and aquatic organisms resulting from acute exposure to spilled oil. The physical fates model estimates the distribution of oil (as mass and concentrations) on the water surface, on shorelines, in the water column, and in the sediments, accounting for spreading, evaporation, transport, dispersion, emulsification, entrainment, dissolution, volatilization, partitioning, sedimentation, and degradation. The biological effects model estimates exposure of biota of various behavior types to floating oil and subsurface contamination, resulting percent mortality, and sublethal effects on production (somatic growth). Impacts are summarized as areas or volumes affected, percent of populations lost, and production foregone because of a spill's effects. This paper summarizes existing information and data used to develop the model, model algorithms and assumptions, validation studies, and research needs. Simulation of the Exxon Valdez oil spill is presented as a case study and validation of the model. PMID:15511105

French-McCay, Deborah P

2004-10-01

77

Feature extraction for structural dynamics model validation  

SciTech Connect

This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered include sensitivity, dimensionality, type of response, and the presence or absence of measurement noise in the response. Furthermore, we illustrate a method for comparing multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must consider not only the sensitivity of the features being used, but also the correlation of the parameters being compared.
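
A minimal sketch of the outlier-detection idea named above, with generic signal statistics standing in for the paper's response features:

    # Compare a feature vector from a model run against the distribution of
    # feature vectors from experiments via the Mahalanobis distance.
    import numpy as np

    rng = np.random.default_rng(2)

    def features(signal):
        # Generic response features: RMS level, peak amplitude, dominant FFT bin.
        return np.array([np.sqrt(np.mean(signal ** 2)),
                         np.max(np.abs(signal)),
                         float(np.argmax(np.abs(np.fft.rfft(signal))))])

    experiments = np.array([features(rng.normal(size=256)) for _ in range(50)])
    simulation = features(1.5 * rng.normal(size=256))   # stand-in for a model output

    mu = experiments.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(experiments, rowvar=False))
    diff = simulation - mu
    # Large distance => the simulated response is inconsistent with the data.
    print("Mahalanobis distance:", float(np.sqrt(diff @ cov_inv @ diff)))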

Hemez, Francois [Los Alamos National Laboratory]; Farrar, Charles [Los Alamos National Laboratory]; Park, Gyuhae [Los Alamos National Laboratory]; Nishio, Mayuko [UNIV OF TOKYO]; Worden, Keith [UNIV OF SHEFFIELD]; Takeda, Nobuo [UNIV OF TOKYO]

2010-11-08

78

UNCERTAINTY MODELING VIA FREQUENCY DOMAIN MODEL VALIDATION  

Microsoft Academic Search

The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much attention in the literature. The influence of the level of accuracy of the uncertainty model on closed loop performance has received

Martin R. Waszak

1999-01-01

79

Computer thermal modeling for the Salt Block II experiment  

Microsoft Academic Search

The Salt Block II experiment consisted of a cylindrical block of bedded salt which was heated from within by a cylindrical electric heater. It was an extensively instrumented laboratory experiment that served, among other things, as a touchstone against which to measure the validity of a computer thermal model. The thermal model consisted of 282 nodes joined by 572 conductors,

O. L. Jr

1980-01-01
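
The node-and-conductor idea described above scales down to a toy sketch (three nodes standing in for the 282-node model; all values invented):

    # Lumped thermal network: nodes with heat capacities joined by conductors,
    # marched forward with an explicit time step.
    import numpy as np

    C = np.array([500.0, 800.0, 800.0])      # node heat capacities (J/K), invented
    G = {(0, 1): 2.0, (1, 2): 1.5}           # conductors between node pairs (W/K)
    Q = np.array([50.0, 0.0, 0.0])           # internal heater power (W)
    T = np.full(3, 300.0)                    # initial temperatures (K)

    dt = 1.0                                 # s, well inside the explicit stability limit
    for _ in range(36000):                   # march 10 hours
        flow = np.zeros(3)
        for (i, j), g in G.items():
            q = g * (T[i] - T[j])            # heat flowing from node i to node j
            flow[i] -= q
            flow[j] += q
        T += dt * (Q + flow) / C
        T[2] = 300.0                         # outer node held at ambient (boundary)
    print("temperatures (K):", T.round(1))   # analytic steady state: 358.3, 333.3, 300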

80

VALIDATION OF IMPROVED 3D ATR MODEL  

SciTech Connect

A full-core Monte Carlo based 3D model of the Advanced Test Reactor (ATR) was previously developed. [1] An improved 3D model has been developed by the International Criticality Safety Benchmark Evaluation Project (ICSBEP) to eliminate the homogenized fuel-plate representation of the old model, to incorporate core changes into the new model, and to validate against a newer, more complicated core configuration. This new 3D model adds capability for fuel loading design and azimuthal power peaking studies of the ATR fuel elements.

Soon Sam Kim; Bruce G. Schnitzler

2005-11-01

81

Full-Scale Cookoff Model Validation Experiments.  

National Technical Information Service (NTIS)

This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data w...

A. A. McClelland A. I. Atwood P. O. Curran E. R. Heimdahl M. K. Rattanapote

2004-01-01

82

Verification and Validation of the Sparc Model.  

National Technical Information Service (NTIS)

SPARC (SPARC Performs Automated Reasoning in Chemistry) chemical reactivity models were validated on more than 5000 ionization pKas (in the gas phase and in many organic solvents including water as a function of temperature), 1200 carboxylic acid ester hy...

S. H. Hilal S. W. Karickhoff L. A. Carreira

2003-01-01

83

MELPROG debris meltdown model and validation experiments  

Microsoft Academic Search

The MELPROG computer code is being developed to provide mechanistic treatment of Light Water Reactor (LWR) accidents from accident initiation through vessel failure. This paper describes a two-dimensional (r-z) debris meltdown model that is being developed for use in the MELPROG code and discusses validation experiments. Of interest to this study is melt progression in particle beds that can form

S. S. Dosanjh; R. O. Gauntt

1988-01-01

84

Construct Validation of the Health Belief Model  

Microsoft Academic Search

A multitrait-multimethod design was employed to assess the construct validity of the Health Belief Model. The data were obtained from a nonrepresentative sample of 85 graduate students at The University of Michigan's School of Public Health. The traits consisted of the respondents' perceptions of: health interest, locus of control, susceptibility to influenza, severity of influenza, benefits provided by a

K. Michael Cummings; Alan M. Jette; Irwin M. Rosenstock

1978-01-01

85

Regimes of validity for balanced models  

NASA Astrophysics Data System (ADS)

Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

Gent, Peter R.; McWilliams, James C.

1983-07-01

86

Verification, validation and accreditation of simulation models  

Microsoft Academic Search

This paper presents guidelines for conducting verification, validation and accreditation (VV&A) of simulation models. Fifteen guiding principles are introduced to help the researchers, practitioners and managers better comprehend what VV&A is all about. The VV&A activities are described in the modeling and simulation life cycle. A taxonomy of more than 77 V&V techniques is provided to assist simulationists

Osman Balci

1997-01-01

87

Validation of Hadronic Models in GEANT4  

SciTech Connect

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

Koi, Tatsumi; Wright, Dennis H. (SLAC); Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai (CERN); Heikkinen, Aatos (Helsinki Inst. of Phys.); Truscott, Peter; Lei, Fan (QinetiQ); Wellisch, Hans-Peter

2007-09-26

88

Validation of Hadronic Models in Geant4  

SciTech Connect

Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

Koi, Tatsumi; Wright, Dennis H. [Stanford Linear Accelerator Center, Menlo Park, California (United States)]; Folger, Gunter; Ivantchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai [CERN, Geneva (Switzerland)]; Heikkinen, Aatos [Helsinki Institute of Physics, Helsinki (Finland)]; Truscott, Pete; Lei, Fan [QinetiQ, Farnborough (United Kingdom)]; Wellisch, Hans-Peter [Geneva (Switzerland)]

2007-03-19

89

A Hierarchical Systems Approach to Model Validation  

NASA Astrophysics Data System (ADS)

Existing approaches to the question of how climate models should be evaluated tend to rely on either philosophical arguments about the status of models as scientific tools, or on empirical arguments about how well runs from a given model match observational data. These have led to quantitative approaches expressed in terms of model bias or forecast skill, and ensemble approaches where models are assessed according to the extent to which the ensemble brackets the observational data. Unfortunately, such approaches focus the evaluation on models per se (or more specifically, on the simulation runs they produce) as though the models can be isolated from their context. Such an approach may overlook a number of important aspects of the use of climate models: - the process by which models are selected and configured for a given scientific question. - the process by which model outputs are selected, aggregated and interpreted by a community of expertise in climatology. - the software fidelity of the models (i.e. whether the running code is actually doing what the modellers think it's doing). - the (often convoluted) history that begat a given model, along with the modelling choices long embedded in the code. - variability in the scientific maturity of different model components within a coupled system. These omissions mean that quantitative approaches cannot assess whether a model produces the right results for the wrong reasons, or conversely, the wrong results for the right reasons (where, say, the observational data are problematic, or the model is configured to be unlike the earth system for a specific reason). Hence, we argue that it is a mistake to think that validation is a post-hoc process to be applied to an individual "finished" model, to ensure it meets some criteria for fidelity to the real world. We are therefore developing a framework for model validation that extends current approaches down into the detailed codebase and the processes by which the code is built and tested; and up into the broader scientific context in which models are selected and used to explore theories and test hypotheses. By taking software testing into account, we can build up a picture of the day-to-day practices by which modellers make small changes to the model and test the effect of such changes, both in isolated sections of code, and on the climatology of a full model. By taking the broader scientific context into account, we examine how features of the entire scientific enterprise improve (or impede) model validity, from the collection of observational data, creation of theories, use of these theories to develop models, choices for which model and which model configuration to use, choices for how to set up the runs, and interpretation of the results. Our approach cannot quantify model validity, but it can provide a systematic account of how the detailed practices involved in the development and use of climate models contribute to the quality of modelling systems and the scientific enterprise that they support. By making the relationships between these practices and model quality more explicit, we expect to identify specific strengths and weaknesses of the modelling systems, particularly with respect to structural uncertainty in the models, and better characterize the "unknown unknowns".

Easterbrook, S. M.

2011-12-01

90

Concepts of Model Verification and Validation  

SciTech Connect

Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety-related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Developers of computer programs perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. 
Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for highly accurate numerical models for making predictions to s

B.H.Thacker; S.W.Doebling; F.M.Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

2004-10-30

91

Validity of the Sleep Subscale of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II)  

ERIC Educational Resources Information Center

Currently there are no available sleep disorder measures for individuals with severe and profound intellectual disability. We, therefore, attempted to establish the external validity of the "Diagnostic Assessment for the Severely Handicapped-II" (DASH-II) sleep subscale by comparing daily observational sleep data with the responses of direct care…

Matson, Johnny L.; Malone, Carrie J.

2006-01-01

92

Social anxiety and fear of negative evaluation: construct validity of the BFNE-II.  

PubMed

The Brief Fear of Negative Evaluation Scale [BFNE; Leary, M. R. (1983). A brief version of the Fear of Negative Evaluation Scale. Personality and Social Psychology Bulletin, 9, 371-375] is a self-report measure designed to assess fear of negative evaluation, a characteristic feature of social anxiety disorders [Rapee, R. M., & Heimberg, R. G. (1997). A cognitive-behavioral model of anxiety in social phobia. Behaviour Research and Therapy, 35, 741-756]. Recent psychometric assessments have suggested that a 2-factor model is most appropriate, with the first factor comprising the straightforwardly worded items and the second factor comprising the reverse-worded items [Carleton, R. N., McCreary, D., Norton, P. J., & Asmundson, G. J. G. (in press-a). The Brief Fear of Negative Evaluation Scale, Revised. Depression & Anxiety; Rodebaugh, T. L., Woods, C. M., Thissen, D. M., Heimberg, R. G., Chambless, D. L., & Rapee, R. M. (2004). More information from fewer questions: the factor structure and item properties of the original and brief fear of negative evaluation scale. Psychological Assessment, 2, 169-181; Weeks, J. W., Heimberg, R. G., Fresco, D. M., Hart, T. A., Turk, C. L., Schneier, F. R., et al. (2005). Empirical validation and psychometric evaluation of the Brief Fear of Negative Evaluation Scale in patients with social anxiety disorder. Psychological Assessment, 17, 179-190]. Some researchers recommend the reverse-worded items be removed from scoring [e.g., Rodebaugh, T. L., Woods, C. M., Thissen, D. M., Heimberg, R. G., Chambless, D. L., & Rapee, R. M. (2004). More information from fewer questions: the factor structure and item properties of the original and brief fear of negative evaluation scale. Psychological Assessment, 2, 169-181; Weeks, J. W., Heimberg, R. G., Fresco, D. M., Hart, T. A., Turk, C. L., Schneier, F. R., et al. (2005). Empirical validation and psychometric evaluation of the Brief Fear of Negative Evaluation Scale in patients with social anxiety disorder. Psychological Assessment, 17, 179-190]; however [Carleton, R. N., McCreary, D., Norton, P. J., & Asmundson, G. J. G. (in press-a). The Brief Fear of Negative Evaluation Scale, Revised. Depression & Anxiety; Collins, K. A., Westra, H. A., Dozois, D. J. A., & Stewart, S. H. (2005). The validity of the brief version of the fear of negative evaluation scale. Journal of Anxiety Disorders, 19, 345-359] recommend that these items be reworded to maintain scale sensitivity. The present study examined the reliability and validity of the BFNE-II, a version of the BFNE evaluating revisions of the reverse-worded items in a community sample. A unitary model of the BFNE-II resulted in excellent confirmatory factor analysis fit indices. Moderate convergent and discriminant validity were found when BFNE-II items were correlated with additional independent measures of social anxiety [i.e., Social Interaction Anxiety & Social Phobia Scales; Mattick, R. P., & Clarke, J. C. (1998). Development and validation of measures of social phobia scrutiny fear and social interaction anxiety. Behaviour Research and Therapy, 36, 455-470], and fear [i.e., Anxiety Sensitivity Index; Reiss, S., & McNally, R. J. (1985). The expectancy model of fear. In S. Reiss, R. R. Bootzin (Eds.), Theoretical issues in behaviour therapy (pp. 107--121). New York: Academic Press. and the Illness/Injury Sensitivity Index; Carleton, R. N., Park, I., & Asmundson, G. J. G. (in press-b). The Illness/Injury Sensitivity Index: an examination of construct validity. 
Depression & Anxiety). These findings support the utility of the revised items and the validity of the BFNE-II as a measure of the fear of negative evaluation. Implications and future research directions are discussed. PMID:16675196

Carleton, R Nicholas; Collimore, Kelsey C; Asmundson, Gordon J G

2006-05-03

93

Regional Validation of General Circulation Models.  

NASA Astrophysics Data System (ADS)

General Circulation Models (GCMs) of the atmosphere and ocean have been used for performing a variety of climate experiments. Confidence in the reliability of experimental results can only be obtained by detailed validation of model control run results. It is generally accepted that current GCMs show considerable disagreement in terms of important regional and seasonal details of their control run climatologies, but there are few objective intercomparison studies to substantiate this. This study examines the regional and seasonal details of the mean sea-level pressure (MSLP) fields simulated by three GCMs: the OSU two-layer AGCM, the OSU CGCM and the GISS nine-layer AGCM. Model validation is performed in a North American/Atlantic/European study area. Prior to statistical significance testing, the principal seasonal characteristics of the observed Azores High (AH) and Iceland Low (IL) are analysed with the aid of time-averaged MSLP maps and objective locational and intensity indices. These results are then used to test the performance of the three models in simulating center of action (COA) seasonal cycle characteristics. All three GCMs have large, systematic errors throughout the seasonal cycle in their simulation of AH/IL position and intensity, and all generate an unrealistic 'Greenland High'. The main focus of the investigation is on the statistical aspects of control run validation. Eighteen different statistics are used to test the significance of differences between observed and simulated means, variances and spatial patterns. Test statistic significance is determined using Preisendorfer and Barnett's permutation procedures. Statistics which measure the degree of spatial autocorrelation in latitudinal and longitudinal directions (and at different spatial lags) are also used to compare observed and simulated fields. Validation of the simulated seasonal cycles of MSLP indicates that all three models have statistically significant errors in the mean field, variances and spatial patterns. For the three models examined here, test statistic significance levels for observed/simulated differences in the mean field and spatial patterns are relatively insensitive to decadal time-scale variability in the observed MSLP data. Significance levels for the variance ratio results can depend critically on the choice of observed validation data.
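
A minimal sketch of a permutation test of this kind (random fields stand in for gridded MSLP data; the statistic and permutation count are arbitrary choices, not the study's):

    # Permutation test for the difference between observed and simulated mean fields.
    import numpy as np

    rng = np.random.default_rng(3)
    obs = rng.normal(0.0, 1.0, size=(30, 100))   # 30 seasons x 100 grid points (stand-in)
    sim = rng.normal(0.3, 1.0, size=(30, 100))   # simulated field with an imposed bias

    def stat(a, b):
        # RMS difference of the time-mean fields.
        return float(np.sqrt(np.mean((a.mean(axis=0) - b.mean(axis=0)) ** 2)))

    observed_stat = stat(obs, sim)
    pooled = np.concatenate([obs, sim])
    null = []
    for _ in range(999):
        perm = rng.permutation(60)
        null.append(stat(pooled[perm[:30]], pooled[perm[30:]]))
    p_value = (1 + sum(s >= observed_stat for s in null)) / 1000.0
    print(f"statistic = {observed_stat:.3f}, p = {p_value:.3f}")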

Santer, Benjamin David

94

Bayesian structural equation modeling method for hierarchical model validation  

Microsoft Academic Search

A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level

Xiaomo Jiang; Sankaran Mahadevan

2009-01-01

95

External validation of EPIWIN biodegradation models.  

PubMed

The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation and an expert survey model for primary and ultimate biodegradation estimation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, but are much less accurate for readily biodegradable ones. In view of the high environmental concern over persistent chemicals, and because not-readily biodegradable chemicals far outnumber readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for identifying not-readily biodegradable substances. However, the highest score for overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6. PMID:15844447
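
The external-validation bookkeeping can be sketched as a confusion-matrix computation (labels here are synthetic, not the study's 110 substances):

    # Count false positives/negatives for a binary biodegradability call.
    import numpy as np

    rng = np.random.default_rng(4)
    experimental = rng.integers(0, 2, 110)    # 1 = readily biodegradable, 0 = not
    predicted = np.where(rng.random(110) < 0.85, experimental, 1 - experimental)

    tp = np.sum((predicted == 1) & (experimental == 1))
    tn = np.sum((predicted == 0) & (experimental == 0))
    fp = np.sum((predicted == 1) & (experimental == 0))   # persistent chemical called degradable
    fn = np.sum((predicted == 0) & (experimental == 1))

    # For persistence screening the costly error is fp, so one prefers the model
    # combination minimizing fp without inflating fn.
    print("specificity (not-ready correctly flagged):", round(tn / (tn + fp), 2))
    print("sensitivity (ready correctly flagged):", round(tp / (tp + fn), 2))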

Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

96

Construct validation of the health belief model.  

PubMed

A multitrait-multimethod design was employed to assess the construct validity of the Health Belief Model. The data were obtained from a nonrepresentative sample of 85 graduate students at The University of Michigan's School of Public Health. The traits consisted of the respondents' perceptions of: health interest, locus of control, susceptibility to influenza, severity of influenza, benefits provided by a flu shot, and the barriers or costs associated with getting a flu shot. Each trait was measured by three methods: a seven-point Likert scale, a fixed-alternative multiple choice scale, and a vignette. The results indicate that the Health Belief Model variables can be measured with a substantial amount of convergent validity using Likert or multiple choice questionnaire items. With regard to discriminant validity, evidence suggests that subjects' perceptions of barriers and benefits are quite different from their perceptions of susceptibility and severity. Perceptions of susceptibility and severity are substantially but not entirely independent. Perceived benefits and barriers demonstrate a strong negative relationship which suggests the possibility that these two variables represent opposite ends of a single continuum and not separate health beliefs. These preliminary results provide the basis for developing brief health belief scales that may be administered to samples of consumers and providers to assess educational needs. Such needs assessment, in turn, could then be used to tailor messages and programs to meet the particular needs of a client group. PMID:299611

Cummings, K M; Jette, A M; Rosenstock, I M

1978-01-01

97

Feature selective validation (FSV) for validation of computational electromagnetics (CEM). Part II - Assessment of FSV performance  

Microsoft Academic Search

The feature selective validation (FSV) method has been proposed as a technique to allow the objective, quantified, comparison of data for, inter alia, validation of computational electromagnetics. In the companion paper

Antonio Orlandi; Alistair P. Duffy; Bruce Archambeault; Giulio Antonini; Dawn E. Coleby; Samuel Connor

2006-01-01

98

HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments  

SciTech Connect

HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
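
The kind of verification reported in Volume III can be sketched by checking a finite-difference conduction solution against a known analytical solution (a generic 1-D example, not HYDRA-II's equations):

    # Explicit finite-difference solution of dT/dt = alpha * d2T/dx2 on [0, 1]
    # with T = 0 at both ends, verified against the exact decaying sine mode
    # T(x, t) = sin(pi x) exp(-pi^2 alpha t).
    import numpy as np

    alpha, nx, dt, steps = 1.0, 51, 1e-5, 2000
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    T = np.sin(np.pi * x)                     # initial condition, zero at the boundaries

    for _ in range(steps):                    # stable since alpha*dt/dx**2 < 0.5
        T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

    exact = np.sin(np.pi * x) * np.exp(-np.pi ** 2 * alpha * dt * steps)
    print("max error vs analytical solution:", float(np.max(np.abs(T - exact))))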

McCann, R.A.; Lowery, P.S.

1987-10-01

99

MODEL VALIDATION FOR ROBUST CONTROL AND CONTROLLER VALIDATION IN A PREDICTION ERROR FRAMEWORK  

Microsoft Academic Search

This paper presents a coherent framework for model validation for control and for controller validation (for stability and for performance) in the context where the validated model uncertainty sets are obtained by prediction error identification methods. Thus, these uncertainty sets are parametrized transfer function sets, with parameters lying in ellipsoidal regions in parameter space. Our results cover two distinct aspects:

Michel Gevers; Xavier Bombois; Gerard Scorletti

100

Validation Techniques for the MAS Corona Model  

NASA Astrophysics Data System (ADS)

In the interest of making competent predictions about the structure of the solar corona, we have developed the tools necessary to quantitatively compare the Magnetohydrodynamics Around a Sphere (MAS) numerical model to the observed corona. The SAIC coronal modeling group has written an algorithm that creates a two dimensional polarization brightness image of the corona from the model density output by computing the line of sight integral for scattered white light. Using the tools we developed for the CISM Data Explorer, the white light intensity is extracted from the image around the full disk of the Sun at a given radial height. A series of these images, spanning a Carrington rotation, are processed through this method into a Carrington Map, which we use for direct comparison against LASCO C2 polarization brightness data. Our validation will begin with a chi-squared comparison of model to observations of the latitude of the streamer belt brightness maximum during the Whole Sun Month.

Schmit, D.; Gibson, S.; Detoma, G.; Wiltberger, M.

2006-05-01

101

Propeller aircraft interior noise model, part II: Scale-model and flight-test comparisons  

Microsoft Academic Search

Part I [1] of this paper contains the theory used to create a basic propeller aircraft interior noise model. The model predicts tonal levels of blade passage harmonics in the cabin of a propeller driven aircraft. Part II presents the results of validation studies based on scale-model and flight comparisons.

L. D. Pope; C. M. Willis; W. H. Mayes

1987-01-01

102

Validation of Computational Models in Biomechanics  

PubMed Central

The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models.

Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

2010-01-01

103

Bayes factor of model selection validates FLMP.  

PubMed

The fuzzy logical model of perception (FLMP; Massaro, 1998) has been extremely successful at describing performance across a wide range of ecological domains as well as for a broad spectrum of individuals. An important issue is whether this descriptive ability is theoretically informative or whether it simply reflects the model's ability to describe a wider range of possible outcomes. Previous tests and contrasts of this model with others have been adjudicated on the basis of both a root mean square deviation (RMSD) for goodness-of-fit and an observed RMSD relative to a benchmark RMSD if the model was indeed correct. We extend the model evaluation by another technique called Bayes factor (Kass & Raftery, 1995; Myung & Pitt, 1997). The FLMP maintains its significant descriptive advantage with this new criterion. In a series of simulations, the RMSD also accurately recovers the correct model under actual experimental conditions. When additional variability was added to the results, the models continued to be recoverable. In addition to its descriptive accuracy, RMSD should not be ignored in model testing because it can be justified theoretically and provides a direct and meaningful index of goodness-of-fit. We also make the case for the necessity of free parameters in model testing. Finally, using Newton's law of universal gravitation as an analogy, we argue that it might not be valid to expect a model's fit to be invariant across the whole range of possible parameter values for the model. We advocate that model selection should be analogous to perceptual judgment, which is characterized by the optimal use of multiple sources of information (e.g., the FLMP). Conclusions about models should be based on several selection criteria. PMID:11340853
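
A minimal sketch of the two criteria discussed, RMSD goodness-of-fit and a Bayes factor, the latter approximated from BIC (a standard approximation following Kass & Raftery; the FLMP itself is not implemented here):

    # Compare two nested linear models by RMSD and an approximate Bayes factor.
    import numpy as np

    rng = np.random.default_rng(5)
    x = np.linspace(0.0, 1.0, 20)
    y = 0.2 + 0.6 * x + rng.normal(0.0, 0.05, 20)   # synthetic response data

    def rmsd_and_bic(design):
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ beta
        n, k = design.shape
        rmsd = float(np.sqrt(np.mean(resid ** 2)))
        bic = n * np.log(np.mean(resid ** 2)) + k * np.log(n)   # Gaussian-error BIC
        return rmsd, bic

    linear = np.column_stack([np.ones_like(x), x])
    quadratic = np.column_stack([np.ones_like(x), x, x ** 2])
    (r1, b1), (r2, b2) = rmsd_and_bic(linear), rmsd_and_bic(quadratic)
    bf = np.exp(-0.5 * (b1 - b2))   # approximate Bayes factor of linear over quadratic
    print(f"RMSD: linear {r1:.4f}, quadratic {r2:.4f}; approx Bayes factor {bf:.1f}")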

Massaro, D W; Cohen, M M; Campbell, C S; Rodriguez, T

2001-03-01

104

Operational validation and intercomparison of different types of hydrological models  

Microsoft Academic Search

A theoretical framework for model validation, based on the methodology originally proposed by Klemes [1985, 1986], is presented. It includes a hierarchical validation testing scheme for model application to runoff prediction in gauged and ungauged catchments subject to stationary and nonstationary climate conditions. A case study on validation and intercomparison of three different models on three catchments in Zimbabwe is

Jens Christian Refsgaard; Jesper Knudsen

1996-01-01
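
The simplest tier of such a hierarchical scheme, the split-sample test, can be sketched with a one-parameter stand-in model (all data synthetic; real studies use full rainfall-runoff models):

    # Split-sample test: calibrate on one period, validate on the other,
    # scoring the validation period with Nash-Sutcliffe efficiency.
    import numpy as np

    rng = np.random.default_rng(6)
    rain = rng.gamma(2.0, 5.0, 200)                  # synthetic daily rainfall (mm)
    flow = 0.35 * rain + rng.normal(0.0, 1.0, 200)   # synthetic observed runoff (mm)

    cal, val = slice(0, 100), slice(100, 200)        # split-sample periods
    coeff = np.sum(rain[cal] * flow[cal]) / np.sum(rain[cal] ** 2)   # calibration

    def nse(obs, sim):
        # 1 is perfect; 0 is no better than the mean of the observations.
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print("calibrated runoff coefficient:", round(coeff, 3))
    print("validation NSE:", round(nse(flow[val], coeff * rain[val]), 3))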

105

Expert and construct validity of the Simbionix GI Mentor II endoscopy simulator for colonoscopy  

Microsoft Academic Search

Objectives: The main objectives of this study were to establish expert validity (a convincing realistic representation of colonoscopy according to experts) and construct validity (the ability to discriminate between different levels of expertise) of the Simbionix GI Mentor II virtual reality (VR) simulator for colonoscopy tasks, and to assess the didactic value of the simulator, as judged by experts.

A. D. Koch; S. N. Buzink; J. Heemskerk; S. M. B. I. Botden; R Veenendaal; J. J. Jakimowicz; E. J. Schoon

2007-01-01

106

Code validation with EBR-II test data  

SciTech Connect

An extensive system of computer codes is used at Argonne National Laboratory to analyze whole-plant transient behavior of the Experimental Breeder Reactor II (EBR-II). Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP, have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs in EBR-II, and are also valuable tools for the analysis of innovative reactor designs. 29 refs., 6 figs.

Herzog, J.P.; Chang, L.K.; Dean, E.M.; Feldman, E.E.; Hill, D.J.; Mohr, D.; Planchon, H.P.

1991-01-01

107

When Role Models Have Flaws: Static Validation of Enterprise Security Policies  

Microsoft Academic Search

Modern multiuser software systems have adopted Role-Based Access Control (RBAC) for authorization management. This paper presents a formal model for RBAC policy validation and a static-analysis model for RBAC systems that can be used to (i) identify the roles required by users to execute an enterprise application, (ii) detect potential inconsistencies caused by principal-delegation policies

Marco Pistoia; Stephen J. Fink; Robert J. Flynn; Eran Yahav

2007-01-01

108

Hydrological validation of multifractal rainfall simulation models  

NASA Astrophysics Data System (ADS)

The observed scaling invariance properties of rainfall time series have often been put forward to justify the choice of multifractal (scaling) models for rainfall stochastic modelling. These models are nevertheless seldom validated on real hydrological applications. Two types of multifractal models - the first one with a Log-Poisson generator and the second one with a uniform generator - were calibrated on an 8-year point rainfall series with a five-minute time step. The results obtained with the rainfall series simulated with these models on two hydrological applications (the computation of intensity-duration-frequency, IDF, curves and the design of an urban drainage storage volume) were compared with those obtained with the original measured rainfall series. The disagreements reveal some limitations of the multifractal models. On the one hand, using the vocabulary of the multifractalists, the models are calibrated on the basis of the statistical properties of the simulated undressed series, while the IDF curves are computed on the dressed series. The statistical properties of both types of series clearly differ if a canonical model is used: here, the model with the Log-Poisson generator. On the other hand, the optimal dimensions of the storage volume depend on the shape of the hyetographs. The discordances between the volumes obtained with the simulated or measured rainfall series indicate that the temporal structure of the simulated rainfall intensity series (i.e. the shapes of the simulated hyetographs) is not comparable with that of the measured series. In conclusion, multifractal models appear to accurately reproduce only some of the properties of the real measured series. Their appropriateness should not be asserted a priori but verified for each considered application.

Mouhous, N.; Gaume, E.; Andrieu, H.

2003-04-01

109

Modeling Earth Dynamics: Complexity, Uncertainty, and Validation  

NASA Astrophysics Data System (ADS)

28th IUGG Conference on Mathematical Geophysics; Pisa, Italy, 7-11 June 2010; The capabilities and limits of mathematical models applied to a variety of geophysical processes were discussed during the 28th International Conference on Mathematical Geophysics, held in Italy (see the conference Web site (http://cmg2010.pi.ingv.it), which includes abstracts). The conference was organized by the International Union of Geodesy and Geophysics (IUGG) Commission on Mathematical Geophysics (CMG) and the Istituto Nazionale di Geofisica e Vulcanologia and was cosponsored by the U.S. National Science Foundation. The meeting was attended by more than 160 researchers from 26 countries and was dedicated to the theme “Modelling Earth Dynamics: Complexity, Uncertainty, and Validation.” Many talks were dedicated to illustration of the complexities affecting geophysical processes. Novel applications of geophysical fluid dynamics were presented, with specific reference to volcanological and subsurface/surface flow processes. In most cases, investigations highlighted the need for multidimensional and multiphase flow models able to describe the nonlinear effects associated with the nonhomogeneous nature of the matter. Fluid dynamic models of atmospheric, oceanic, and environmental systems also illustrated the fundamental role of nonlinear couplings between the different subsystems. Similarly, solid Earth models have made it possible to obtain the first tomographies of the planet; to formulate nonlocal and dynamic damage models of rocks; to investigate statistically the triggering, clustering, and synchronization of faults; and to develop realistic simulators of the planetary dynamo, plate tectonics, and gravity and magnetic fields.

Neri, A.

2010-12-01

110

Modeling distributed hybrid systems in Ptolemy II  

Microsoft Academic Search

We present Ptolemy II as a modeling and simulation environment for distributed hybrid systems. In Ptolemy II, a distributed hybrid system is specified as a hierarchy of models: an event-based top level and distributed islands of hybrid systems. Each hybrid system is in turn a hierarchy of continuous-time models and finite state machines. A variety of models of computation was

Jie Liu; Xiaojun Liu; Edward A. Lee

2001-01-01

111

Document Degradation Models and a Methodology for Degradation Model Validation  

Microsoft Academic Search

Doctoral dissertation by Tapas Kanungo; Chairperson of Supervisory Committee: Professor Robert M. Haralick, Department of Electrical Engineering. Printing, photocopying and scanning processes degrade the image quality of a document. Although research in document understanding started in the sixties, only two document degradation models have been proposed thus far. Furthermore, no attempts have been made to rigorously validate them. In...

Tapas Kanungo

1996-01-01

112

Initialization and validation of a simulation of cirrus using FIRE-II data  

SciTech Connect

Observations from a wide variety of instruments and platforms are used to validate many different aspects of a three-dimensional mesoscale simulation of the dynamics, cloud microphysics, and radiative transfer of a cirrus cloud system observed on 26 November 1991 during the second cirrus field program of the First International Satellite Cloud Climatology Program (ISCCP) Regional Experiment (FIRE-II) located in southeastern Kansas. The simulation was made with a mesoscale dynamical model utilizing a simplified bulk water cloud scheme and a spectral model of radiative transfer. Expressions for cirrus optical properties for solar and infrared wavelength intervals as functions of ice water content and effective particle radius are modified for the midlatitude cirrus observed during FIRE-II and are shown to compare favorably with explicit size-resolving calculations of the optical properties. Rawinsonde, Raman lidar, and satellite data are evaluated and combined to produce a time-height cross section of humidity at the central FIRE-II site for model verification. Due to the wide spacing of rawinsondes and their infrequent release, important moisture features go undetected and are absent in the conventional analyses. The upper-tropospheric humidities used for the initial conditions were generally less than 50% of those inferred from satellite data, yet over the course of a 24-h simulation the model produced a distribution that closely resembles the large-scale features of the satellite analysis. The simulated distribution and concentration of ice compare favorably with data from radar, lidar, satellite, and aircraft. Direct comparison is made between the radiative transfer simulation and data from broadband and spectral sensors and inferred quantities such as cloud albedo, optical depth, top-of-the-atmosphere 11-µm brightness temperature, and the 6.7-µm brightness temperature. 49 refs., 26 figs., 1 tab.

Westphal, D.L. [Naval Research Lab., Monterey, CA (United States)]; Kinne, S. [NASA/Ames Research Center, Moffett Field, CA (United States)]; Alvarez, J.M.; Minnis, P. [NASA/Langley Research Center, Hampton, VA (United States)]; and others

1996-12-01

113

A Proposed Model for Simulation Validation Process Maturity  

Microsoft Academic Search

This paper proposes a model of process maturity for simulation validation. The development of this model begins by recognizing validation as a process that generates information as its sole product and therefore resembles a systematic quest for truth. These characteristics distinguish the simulation validation process from other processes such as those for manufacturing or software engineering. This development then substitutes

S. Y. Harmon; Simone M. Youngblood

2005-01-01

114

Validating a Dynamic Microsimulation Model of the Italian Households  

Microsoft Academic Search

The recent literature — including among others Redmond et al. [22], Gupta and Kapur [13], Mitton et al. [19] — has highlighted model alignment and validation as crucial issues to be tackled when microsimulating the consequences of public policies. This paper discusses some preliminary validation experiments performed on the model inputs, procedures and simulation results. The validation process that we

Carlo Bianchi; Marzia Romanelli; Pietro Vagliasindi

115

MINET validation study using EBR-II test data  

SciTech Connect

A natural circulation test transient performed at the EBR-II facility is simulated using the MINET computer code, and calculated results are compared against data from the plant. The MINET EBR-II representation includes much of the intermediate loop and the steam generator system, and corresponds to the portion of the plant usually represented by MINET when it is executed with SSC, the Super System Code. MINET calculations agreed well with the plant transient data, with discrepancies well within uncertainties in thermocouple time constants and boundary conditions.

Van Tuyle, G.J.

1983-11-01

116

Outward Bound Outcome Model Validation and Multilevel Modeling  

ERIC Educational Resources Information Center

This study was intended to measure construct validity for the Outward Bound Outcomes Instrument (OBOI) and to predict outcome achievement from individual characteristics and course attributes using multilevel modeling. A sample of 2,340 participants was collected by Outward Bound USA between May and September 2009 using the OBOI. Two phases of…

Luo, Yuan-Chun

2011-01-01

118

Developments in modelling and validation of land surface models  

NASA Astrophysics Data System (ADS)

One of the motivations for modelling the land surface at coarse scales is the need to provide adequate boundary conditions for weather forecasting and climate models. Historically, modelling of the land surface initially comprised bucket-type models whereby both evaporation and runoff were simple functions of the fullness of the bucket. Since then significant development has occurred, including modelling of the vertical transfer of water through the soil and the physical effects of soil and vegetation in limiting evaporative loss. Much of this work initially focussed on vertical detail. Spatial heterogeneity has more recently been incorporated into some of these models, including variations in surface types, soil properties and orography. The latest generation of schemes have also incorporated vegetation growth and competition. Wetland methane and biogenic volatile organic compound emissions, and the impact of ozone on vegetation have also been modelled. In all these developments the interactions between soil, plants and atmosphere are critical. With increasing model complexity, parameter estimation and model validation become more challenging. Here we summarise the latest modelling developments and validation techniques in land surface models. This includes the use of multi-flux optimisation techniques and the latest satellite data. We discuss the areas where there is largest uncertainty in the models and how these might be addressed.

Gedney, N.

2009-04-01

119

DEVELOPMENT OF THE MESOPUFF II DISPERSION MODEL  

EPA Science Inventory

The development of the MESOPUFF II regional-scale air quality model is described. MESOPUFF II is a Lagrangian variable-trajectory puff superposition model suitable for modeling the transport, diffusion and removal of air pollutants from multiple point and area sources at transpor...

120

Full-Scale Cookoff Model Validation Experiments  

SciTech Connect

This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release of the live data. Predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of the unreacted explosive was recovered in the end-heated experiment and less than 30 percent in the side-heated test.

McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

2003-11-25

121

Constructing and Validating a Decadal Prediction Model  

NASA Astrophysics Data System (ADS)

For the purpose of identifying potential sources of predictability of Scottish mean air temperature (SMAT), a redundancy analysis (RA) was accomplished to quantitatively assess the predictability of SMAT from North Atlantic SSTs as well as the temporal consistency of this predictability. The RA was performed between the main principal components of North Atlantic SST anomalies and SMAT anomalies for two time periods: 1890-1960 and 1960-2006. The RA models developed using data from the 1890-1960 period were validated using the 1960-2006 period; in a similar way the model developed based on the 1960-2006 period was validated using data from the 1890-1960 period. The results indicate the potential to forecast decadal trends in SMAT for all seasons in the 1960-2006 time period, and for all seasons with the exception of winter in the 1890-1960 period, with the best predictability achieved in summer. The statistical models show the best performance when SST anomalies in the European shelf seas (45°N-65°N, 20°W-20°E) rather than those over the entire North Atlantic (30°N-75°N, 80°W-30°E) were used as a predictor. The results of the RA demonstrated that similar SST modes were responsible for predictions in the first and second half of the 20th century, establishing temporal consistency, though with stronger influence in the more recent half. The SST pattern responsible for explaining the largest amount of variance in SMAT was stronger in the second half of the 20th century and showed increasing influence from the area of the North Sea, possibly due to faster sea-surface warming in that region in comparison with the open North Atlantic. The Wavelet Transform (WT), Cross Wavelet Transform (XWT) and Wavelet Coherence (WTC) techniques were used to analyse RA-model-based forecasts of SMAT in the time-frequency domain. Wavelet-based techniques applied to the predicted and observed time series revealed a good performance of the RA models in predicting the frequency variability of the SMAT time series. Better performance was obtained when predicting SMAT during 1960-2006 based on 1890-1960 than vice versa, with the exception of winter 1890-1960. In the same frequency bands and in the same time interval there was high coherence between observed and predicted time series. In particular, winter, spring and summer wavelets in the 8±1.5 year band were highly correlated in both time periods, with higher correlation in 1960-2006 and in summer.

Foss, I.; Woolf, D. K.; Gagnon, A. S.; Merchant, C. J.

2010-05-01
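
The calibrate-on-one-period, validate-on-the-other design used above is easy to mimic. The sketch below substitutes ordinary least squares on synthetic principal-component predictors for the paper's redundancy analysis; the data, the number of PCs, and the regression itself are all stand-in assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: leading principal components of SST anomalies
# (predictors) and air-temperature anomalies (predictand) for two periods.
def make_period(n_years, n_pcs=3):
    X = rng.standard_normal((n_years, n_pcs))
    beta = np.array([0.8, -0.3, 0.1])           # invented "true" loadings
    y = X @ beta + 0.5 * rng.standard_normal(n_years)
    return X, y

X_early, y_early = make_period(71)   # stands in for 1890-1960
X_late,  y_late  = make_period(47)   # stands in for 1960-2006

def fit(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def skill(coef, X, y):
    return np.corrcoef(X @ coef, y)[0, 1]       # anomaly correlation

# Calibrate on one period, validate on the other, and vice versa
print("early->late skill:", skill(fit(X_early, y_early), X_late, y_late))
print("late->early skill:", skill(fit(X_late, y_late), X_early, y_early))
```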

122

Numerical Validation of Quasigeostrophic Ellipsoidal Vortex Model  

NASA Astrophysics Data System (ADS)

In geophysical flows, coherent vortex structures persist for a long time and their interactions dominate the dynamics of geophysical turbulence. Meacham et al. obtained a series of exact unsteady solutions of the quasigeostrophic equation, which represent a uniform ellipsoidal vortex patch embedded in a uniform 3D shear field. Miyazaki et al. derived a Hamiltonian dynamical system describing the interactions of N ellipsoidal vortices, where each coherent vortex was modeled by an ellipsoid of uniform potential vorticity. In this paper, direct numerical simulations based on a Contour Advective Semi-Lagrangian algorithm (CASL) are performed in order to assess the validity of the Hamiltonian model. First, the instability of a tilted spheroid is investigated. A prolate spheroid becomes unstable against the third Legendre mode when the aspect ratio is less than 0.44 and the inclination angle is larger than 0.48. Weakly unstable flatter spheroidal vortices emit thin filaments from their top and bottom, whereas strongly unstable slender spheroidal vortices are broken up into two pieces. Secondly, the interaction of two co-rotating spheroidal vortices on slightly different vertical levels is studied in detail. It is shown that the Hamiltonian model can predict the critical merger distance fairly well. Considerable amounts of energy and enstrophy are dissipated in these events. The correlation between the energy dissipation and the enstrophy dissipation is good, suggesting the existence of a deterministic reset-rule.

Miyazaki, Takeshi; Asai, Akinori; Yamamoto, Masahiro; Fujishima, Shinsuke

2002-11-01

123

Validation of species-climate impact models under climate change  

Microsoft Academic Search

Increasing concern over the implications of climate change for biodiversity has led to the use of species-climate envelope models to project species extinction risk under climate- change scenarios. However, recent studies have demonstrated significant variability in model predictions and there remains a pressing need to validate models and to reduce uncertainties. Model validation is problematic as predictions are made for

Miguel B. Araujo; Richard G. Pearson; Wilfried Thuiller; Markus Erhard

2005-01-01

125

Validation of A Global Hydrological Model  

NASA Astrophysics Data System (ADS)

Freshwater availability has been recognized as a global issue, and its consistent quantification not only in individual river basins but also at the global scale is required to support the sustainable use of water. The Global Hydrology Model WGHM, which is a submodel of the global water use and availability model WaterGAP 2, computes surface runoff, groundwater recharge and river discharge at a spatial resolution of 0.5°. WGHM is based on the best global data sets currently available, including a newly developed drainage direction map and a data set of wetlands, lakes and reservoirs. It calculates both natural and actual discharge by simulating the reduction of river discharge by human water consumption (as computed by the water use submodel of WaterGAP 2). WGHM is calibrated against observed discharge at 724 gauging stations (representing about 50% of the global land area) by adjusting a parameter of the soil water balance. It not only computes the long-term average water resources but also water availability indicators that take into account the interannual and seasonal variability of runoff and discharge. The reliability of the model results is assessed by comparing observed and simulated discharges at the calibration stations and at selected other stations. We conclude that reliable results can be obtained for basins of more than 20,000 km². In particular, the 90% reliable monthly discharge is simulated well. However, there is the tendency that semi-arid and arid basins are modeled less satisfactorily than humid ones, which is partially due to neglecting river channel losses and evaporation of runoff from small ephemeral ponds in the model. Also, the hydrology of highly developed basins with large artificial storages, basin transfers and irrigation schemes cannot be simulated well. The seasonality of discharge in snow-dominated basins is overestimated by WGHM, and if the snow-dominated basin is uncalibrated, discharge is likely to be underestimated due to the precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.

Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.
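
WGHM's calibration adjusts a single soil water balance parameter until simulated discharge matches observations at a gauging station. The toy sketch below illustrates that single-parameter calibration idea with a one-bucket water balance; the bucket model, the parameter gamma, and all numbers are invented for illustration and are not the WaterGAP 2 scheme.

```python
import numpy as np

# Toy water balance: adjust a single runoff exponent gamma so that the
# simulated long-term mean discharge matches the "observed" mean.
def simulate_discharge(precip, pet, gamma, capacity=150.0):
    storage, q = 50.0, []
    for p, e in zip(precip, pet):
        runoff = p * (storage / capacity) ** gamma   # saturation-excess runoff
        storage = np.clip(storage + p - e - runoff, 0.0, capacity)
        q.append(runoff)
    return np.asarray(q)

rng = np.random.default_rng(1)
precip = rng.gamma(2.0, 2.0, size=3650)             # daily precipitation, mm
pet = np.full(3650, 2.5)                            # potential ET, mm/day
q_obs = simulate_discharge(precip, pet, gamma=2.0)  # pretend "observations"

# Grid search: pick the gamma minimizing the long-term mean discharge bias
gammas = np.linspace(0.5, 4.0, 36)
bias = [abs(simulate_discharge(precip, pet, g).mean() - q_obs.mean())
        for g in gammas]
print("calibrated gamma ~", gammas[int(np.argmin(bias))])
```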

126

Document Degradation Models: Parameter Estimation and Model Validation  

Microsoft Academic Search

Scanned documents are noisy. Recently (KHP93, KHP94, Bai90), document degradation models were proposed that model the local distortion introduced during the scanning process: (i) the pixel inversion (from foreground to background and vice versa) that occurs independently at each pixel due to light intensity fluctuations and thresholding level, and (ii) the blurring that occurs due to the point-spread function of the optical system

Tapas Kanungo; Robert M. Haralick; Henry S. Baird; Werner Stuetzle; David Madigan

1994-01-01
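
A toy version of the two degradation mechanisms just described (independent per-pixel inversion and point-spread-function blur) can be sketched in a few lines. The flip probability, the box-filter stand-in for the PSF, and the re-thresholding step are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(6)

def degrade(img, p_flip=0.02, blur=1):
    """Flip each binary pixel independently with probability p_flip,
    then blur with a (2*blur+1)^2 box filter and re-threshold."""
    noisy = np.where(rng.random(img.shape) < p_flip, 1 - img, img).astype(float)
    k = 2 * blur + 1
    pad = np.pad(noisy, blur, mode="edge")
    out = np.zeros_like(noisy)
    for dy in range(k):                      # sum shifted copies = box filter
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return (out / k**2 > 0.5).astype(int)

doc = np.zeros((32, 32), dtype=int)
doc[8:24, 8:24] = 1                          # a black square "glyph"
print("pixels changed:", int(np.sum(degrade(doc) != doc)))
```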

127

Geochemistry Model Validation Report: Material Degradation and Release Model  

SciTech Connect

The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

H. Stockman

2001-09-28

128

Validating and Calibrating Agent-based Models: a Case Study  

Microsoft Academic Search

In this paper we deal with the validation of an agent-based model and, in particular, with the technical validation process, that is to say the set of tests and methods used to analyze whether the results of a simulation agree with reality. Today, thanks to some important studies, validation techniques are more and more complete and reliable: many distributional

Pasquale Cirillo; Carlo Bianchi; Mauro Gallegati; Pietro Vagliasindi

2006-01-01

129

Cloud-chemistry interactions modeling and validation  

NASA Astrophysics Data System (ADS)

Clouds play a crucial role in several processes related to atmospheric chemistry, many of which include aerosols: On the one hand, aerosols are needed for the formation of clouds, and any change in the amount or composition of aerosols will influence cloud properties. For instance, the addition of cloud condensation nuclei (CCN) by human activity changes the cloud microphysical properties such that cloud albedo is enhanced and precipitation is suppressed (aerosol indirect effect). Furthermore, adding anthropogenic ice nuclei enhances the ability of supercooled water droplets to freeze. On the other hand, clouds serve as sites for wet-phase oxidation processes, by which gases (e.g., SO2) are transformed to aerosols (e.g., sulfate). Through cloud processing, both the aerosol size distribution and the chemical composition of the air are modified. In addition, clouds influence the atmospheric chemical composition through wet deposition, which is the removal of material from the atmosphere by cloud particles or precipitation. In this talk we will focus on the modeling of aerosol-cloud interactions in global climate models (GCMs). We will start by reviewing some recent literature on this topic. We will then describe in detail a modeling approach developed at the University of Oslo and implemented in the NCAR Community Atmosphere Model. A life-cycle model dealing with the chemistry transformations of 5 aerosol species (sulfate, black carbon, organic matter, sea salt and mineral dust) and their precursors is involved. The aerosol size distributions are then determined, assuming log-normal modes. Cloud droplet nucleation is computed, assuming different hygroscopicities based on chemical composition, and taking into account the competition effect, i.e., the lowering of the supersaturation by the competition between the CCN for the available vapor. Some of the aerosol particles (e.g., soot and dust particles) are assumed to have ice nucleating capabilities, enabling an explicit calculation of heterogeneous freezing, while homogeneous freezing is assumed to take place spontaneously at temperatures below -35°C. Finally, when the aerosols are allowed to influence the state of the climate system, interesting interactions take place between climate change and the chemical processes. Results from such simulations will be presented, as well as results from simulations investigating the sensitivity to parameterization assumptions. Where appropriate, validation of model results against observations, in particular satellite retrievals (e.g., MODIS) will be presented.

Kristjansson, J.; Storelvmo, T.; Iversen, T.

2006-12-01

130

Monthly Precipitation and Runoff Data for Validation of Climate Models.  

National Technical Information Service (NTIS)

A study addressing the best practical means of interfacing general circulation models with hydrological models is presented. Climate models require for their validation estimates of precipitation and runoff assessed for intervals of one month or less, on ...

A. Aureli A. Becker S. Kaden S. I. Solomon

1992-01-01

131

Airlift Capabilities Estimation Prototype: A Case Study in Model Validation.  

National Technical Information Service (NTIS)

This study investigates the application of a life cycle approach to the validation of operational models. The classic waterfall life cycle from software engineering is adapted for use on mathematical models by defining four stages of model development. Ea...

R. McCanne

1993-01-01

132

Design and Development Research: A Model Validation Case  

ERIC Educational Resources Information Center

This is a report of one case of a design and development research study that aimed to validate an overlay instructional design model incorporating the theory of multiple intelligences into instructional systems design. After design and expert review model validation, The Multiple Intelligence (MI) Design Model, used with an Instructional Systems…

Tracey, Monica W.

2009-01-01

133

Challenges of Validating Global Assimilative Models of the Ionosphere  

Microsoft Academic Search

This paper addresses the often surprisingly difficult challenges that arise in conceptually simple validations of global models of the ionosphere. AFRL has been tasked with validating the Utah State University GAIM (Global Assimilation of Ionospheric Measurements) model of the ionosphere, which is run in real time by the Air Force Weather Agency. The USU-GAIM model currently assimilates, in addition to

G. J. Bishop; L. F. McNamara; J. A. Welsh; D. T. Decker; C. R. Baker

2008-01-01

134

Game Theoretic Validation of Air Combat Simulation Models  

Microsoft Academic Search

The paper presents a new game theoretic approach towards the validation of discrete event air combat simulation models. In the approach, statistical techniques are applied for estimating game models based on simulation data. The estimation procedure is presented in cases involving games with both discrete and continuous decision variables. The validity of the simulation model is assessed by comparing the

Jirka Poropudas; Kai Matti Virtanen

2009-01-01

135

Numerical Validation of Quasigeostrophic Ellipsoidal Vortex Model  

NASA Astrophysics Data System (ADS)

In geophysical flows, coherent vortex structures persist for a long time and their interactions dominate the dynamics of geophysical turbulence. Meacham et al.1,2) obtained a series of exact unsteady solutions of the quasigeostrophic equation, which represent a uniform ellipsoidal vortex patch embedded in a uniform 3D shear field. Miyazaki et al.3,4) have derived a Hamiltonian dynamical system of 3N degrees of freedom, describing the interactions of N ellipsoidal vortices, where each coherent vortex was modeled by an ellipsoid of uniform potential vorticity. The center of vorticity and the angular momentum are conserved, besides the total energy and Casimirs of the system, such as the vortex height and the vortex volume. There are three Poisson-commutable invariants, which is fewer than the degrees of freedom for N>=2, and chaotic motions are observed even in a two-body system. In this paper, direct numerical simulations based on a Contour Advective Semi-Lagrangian algorithm (CASL) are performed in order to assess the validity of the Hamiltonian model. First, the instability of a tilted spheroid is investigated. A prolate spheroid becomes unstable against the third Legendre mode when the aspect ratio is less than 0.44 and the inclination angle is larger than 0.48.5) Weakly unstable flatter spheroidal vortices emit thin filaments from their top and bottom, whereas strongly unstable slender spheroidal vortices are broken up into two pieces. Secondly, the interaction of two co-rotating spheroidal vortices on slightly different vertical levels, which plays a key role in the turbulence dynamics, is studied in detail. The Hamiltonian model can predict the critical distance of symmetric mergers very well, except for mergers of vortices on the same horizontal plane. The model gives poorer predictions in asymmetric cases, where vorticity exchange occurs (instead of merger) along the threshold determined by the Hamiltonian model. The slenderer vortex loses half of its original volume, and the flatter vortex expands slightly, absorbing some of the filaments ejected from the slenderer vortex. This is a new dynamical process linked with the energy and enstrophy cascades. Considerable amounts of energy and enstrophy are dissipated in these events. The correlation between the energy dissipation and the enstrophy dissipation is good, suggesting the existence of a simple deterministic reset-rule. 1) S. P. Meacham, et al.: Dyn. Atmos. Oceans 21 (1994) 167. 2) S. P. Meacham, et al.: Phys. Fluids 9 (1997) 2310. 3) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 69 (2000) 3233. 4) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 70 (2001) 1942. 5) T. Miyazaki, et al.: J. Phys. Soc. Jpn. 68 (1999) 2592.

Miyazaki, T.; Fujishima, S.

2002-05-01

136

Micromachined accelerometer design, modeling and validation  

SciTech Connect

Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10^-18 farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 × 10^-9 gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.

Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

1998-04-01

137

The validity of computational models in organization science: From model realism to purpose of the model  

Microsoft Academic Search

Computational models are widely applied to address fundamental and practical issues in organization science. Yet, computational modeling in organization science continues to raise questions of validity. In this paper, we argue that computational validity is a balance of three elements: the question or purpose, the experimental design, and the computational model. Simple models which address the question are preferred. Non-simple,

Richard M. Burton; Børge Obel

1995-01-01

138

Validation status of the TARDEC visual model (TVM)  

NASA Astrophysics Data System (ADS)

An extensive effort is ongoing to validate the TARDEC visual model (TVM). This paper describes in detail some recent efforts to utilize the model for dual-need commercial and military target acquisition applications. The recently completed visual perception laboratory within TARDEC is a useful tool to calibrate and validate human performance models for specific visual tasks. Some validation examples will be given for low contrast targets along with a description of the TVM and perception laboratory capabilities.

Gerhart, Grant R.; Goetz, Richard; Meitzler, Thomas J.; Karlsen, Robert E.

1996-06-01

139

Validation of the Sexual Assault Symptom Scale II (SASS II) Using a Panel Research Design  

ERIC Educational Resources Information Center

To examine the utility of a self-report scale of sexual assault trauma, 223 female victims were interviewed with the 43-item Sexual Assault Symptom Scale II (SASS II) at 1, 3, 7, 11, and 15 months postassault. Factor analyses using principal-components extraction with an oblimin rotation yielded 7 common factors with 31 items. The internal…

Ruch, Libby O.; Wang, Chang-Hwai

2006-01-01

140

Concurrent and Predictive Validity of the Phelps Kindergarten Readiness Scale-II  

ERIC Educational Resources Information Center

The purpose of this research was to establish the concurrent and predictive validity of the Phelps Kindergarten Readiness Scale, Second Edition (PKRS-II; L. Phelps, 2003). Seventy-four kindergarten students of diverse ethnic backgrounds enrolled in a northeastern suburban school participated in the study. The concurrent administration of the…

Duncan, Jennifer; Rafter, Erin M.

2005-01-01

141

Concurrent Validity of the Online Version of the Keirsey Temperament Sorter II.  

ERIC Educational Resources Information Center

Data from the Keirsey Temperament Sorter II online instrument and Myers Briggs Type Indicator (MBTI) for 203 college freshmen were analyzed. Positive correlations appeared between the concurrent MBTI and Keirsey measures of psychological type, giving preliminary support to the validity of the online version of Keirsey. (Contains 28 references.)…

Kelly, Kevin R.; Jugovic, Heidi

2001-01-01

142

Statistical Validation of Spatial Patterns in Agent-Based Models  

Microsoft Academic Search

We present and evaluate an agent-based model (ABM) of land use change at the rural-urban fringe. This paper is part of a project that links the ABM to surveys of residential preferences and historical patterns of development. Validation is an important issue for such models and we discuss the use of distributional phenomena as a method of validation. We then

William Rand; Daniel G. Brown; Scott E. Page; Rick Riolo; Luis E. Fernandez; Moira Zellner

2003-01-01

143

Empirical Validation of Agent Based Models: A Critical Survey  

Microsoft Academic Search

This paper addresses the problem of finding the appropriate method for conducting empirical validation in agent-based (AB) models, which is often regarded as the Achilles’ heel of the AB approach to economic modelling. The paper has two objectives. First, to identify key issues facing AB economists engaged in empirical validation. Second, to critically appraise the extent to which alternative approaches

Giorgio Fagiolo; Paul Windrum; Alessio Moneta

2006-01-01

144

Model selection and validation methods for non-linear systems  

Microsoft Academic Search

The theory of hypothesis testing is used to select a model with the correct structure, and the relation of such a method to the AIC and FPE criteria is investigated. Parametric validation and correlation validation methods are developed for non-linear difference equation models. Several shortcomings of traditional methods, especially when applied to non-linear systems, are described.

I. J. LEONTARITIS; S. A. BILLINGS

1987-01-01

145

Petri Net Based Model Validation in Systems Biology  

Microsoft Academic Search

This paper describes the thriving application of Petri net theory for model validation of different types of molecular biological systems. After a short introduction into systems biology we demonstrate how to develop and validate qualitative models of biological pathways in a systematic manner using the well-established Petri net analysis technique of place and transition invariants. We discuss special

Monika Heiner; Ina Koch

2004-01-01

146

A Time-Domain Approach to Model Validation  

Microsoft Academic Search

In this paper we offer a novel approach to control-oriented model validation problems. This approach differs from other available techniques in that it directly uses time-domain input output data to validate uncertainty models. The algorithms we develop are computationally tractable and reduce to (generally non-differentiable) convex feasibility programming problems.

Kameshwar Poolla; Pramod Khargonekar; Ashok Tikku; James Krause; Krishan Nagpal

1992-01-01

147

Validation of Model Forecasts of the Ambient Solar Wind (Invited)  

Microsoft Academic Search

Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the

P. J. MacNeice; M. Hesse; M. M. Kuznetsova; L. Rastaetter; A. Taktakishvili

2009-01-01

148

Validity of Certain Simple Electron Models of Carbons.  

National Technical Information Service (NTIS)

A number of band models are suggested to explain various electronic properties of pregraphitic carbons. Unfortunately, the validity of each model does not extend beyond the property for which it was designed. The general validity of some of the simpler mo...

F. Boy A. Marchand

1976-01-01

149

Searching For Valid Psychiatric Phenotypes: Discrete Latent Variable Models  

PubMed Central

Introduction: A primary challenge in psychiatric genetics is the lack of a completely validated system of classification for mental disorders. Appropriate statistical methods are needed to empirically derive more homogenous disorder subtypes. Methods: Using the framework of Robins & Guze's (1970) five phases, latent variable models to derive and validate diagnostic groups are described. A process of iterative validation is proposed through which refined phenotypes would facilitate research on genetics, pathogenesis, and treatment, which would in turn aid further refinement of disorder definitions. Conclusions: Latent variable methods are useful tools for defining and validating psychiatric phenotypes. Further methodological research should address sample size issues and application to iterative validation.

Leoutsakos, Jeannie-Marie S.; Zandi, Peter P.; Bandeen-Roche, Karen; Lyketsos, Constantine G.

2010-01-01

150

Covariance and Regression Slope Models for Studying Validity Generalization  

Microsoft Academic Search

Two new models, the covariance and regression slope models, are proposed for assessing validity generalization. The new models are less restrictive in that they require only one hypothetical distribution (distribution of range restriction for the covariance model and distribution of predictor reliability for the regression slope model) for their implementation, in contrast to the correlation model which

Nambury S. Raju; Rodney Fralicx; Stephen D. Steinhaus

1986-01-01

151

Poisson validity for orbital debris: II. Combinatorics and simulation  

NASA Astrophysics Data System (ADS)

The International Space Station (ISS) will be at risk from orbital debris and micrometeorite impact (i.e., an impact that penetrates a critical component, possibly leading to loss of life). In support of ISS, last year the authors examined a fundamental assumption upon which the modeling of risk is based; namely, the assertion that the orbital collision problem can be modeled using a Poisson distribution. The assumption was found to be appropriate based upon the Poisson's general use as an approximation for the binomial distribution and the fact that it is proper to physically model exposure to the orbital debris flux environment using the binomial. This paper examines another fundamental issue in the expression of risk posed to space structures: the methodology by which individual incremental collision probabilities are combined to express an overall collision probability. The specific situation of ISS in this regard is that the determination of the level of safety for ISS is made via a single overall expression of critical component penetration risk. This paper details the combinatorial mathematical methods for calculating and expressing individual component (or incremental) penetration risks, utilizing component risk probabilities to produce an overall station penetration risk probability, and calculating an expected probability of loss from estimates for the loss of life given a penetration. Additionally, the paper will examine whether the statistical Poissonian answer to the orbital collision problem can be favorably compared to the results of a Monte Carlo simulation.

Fudge, Michael L.; Maclay, Timothy D.

1997-10-01
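
The combinatorial step described above, rolling individual component penetration risks into one station-level probability, reduces under independence to multiplying survival probabilities. A minimal sketch with invented numbers follows; the per-component risks and conditional loss probabilities are purely illustrative.

```python
import numpy as np

# Per-component penetration risks over the exposure period (hypothetical)
p_component = np.array([1e-4, 5e-4, 2.5e-3, 1e-3])
p_overall = 1.0 - np.prod(1.0 - p_component)        # at least one penetration
print(f"overall penetration probability: {p_overall:.6f}")

# Probability of loss, given assumed conditional loss-of-life probabilities
p_loss_given_pen = np.array([0.05, 0.20, 0.10, 0.02])
p_loss = 1.0 - np.prod(1.0 - p_component * p_loss_given_pen)
print(f"probability of loss: {p_loss:.6f}")

# Monte Carlo check of the combined penetration probability
rng = np.random.default_rng(2)
hits = rng.random((1_000_000, p_component.size)) < p_component
print("simulated:", np.mean(hits.any(axis=1)))
```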

152

Designing Experiments to Fit and Validate Regression Models.  

National Technical Information Service (NTIS)

A graphical procedure to select designs for estimation and validation of regression models is illustrated with polynomial regression involving one and two explanatory variables. The procedure is as follows. Assume a null model likely to fit the data. Sele...

J. A. Hoekstra

1984-01-01

153

Description and Validation of the CTH-Urban Runoff Model.  

National Technical Information Service (NTIS)

The report describes the structure, function, and validation of the CTH-Urban Runoff Model. It is a design/analysis model and includes the processes of infiltration, surface depression storage, overland flow, gutter flow, pipe flow, and retention storage....

V. Arnell

1980-01-01

154

Validations of Computational Weld Models: Comparison of Residual Stresses.  

National Technical Information Service (NTIS)

The objective of this project was to validate the capability of VrWeld to simulate the weld buildup process in two experimental setups. Setup I had a central depression with dimensions of 100 x 100 x 3 mm, while Setup II had a central depression with dime...

J. Goldak

2010-01-01

155

Using First-Order Logic for Product Line Model Validation  

Microsoft Academic Search

Product line models are used to drive the generation of requirements for single systems in the product line. They are difficult to validate because they are large and complex. By modelling variability and dependency between requirements using propositional connectives, a logical expression can be developed for the model. Validation of the selection of requirements from the model can be achieved

Mike Mannion

2002-01-01

156

Model validation: a connection between robust control and identification  

Microsoft Academic Search

The gap between the models used in control synthesis and those obtained from identification experiments is considered by investigating the connection between uncertain models and data. The model validation problem addressed is: given experimental data and a model with both additive noise and norm-bounded perturbations, is it possible that the model could produce the observed input-output data? This problem is

Roy S. Smith; John C. Doyle

1992-01-01

157

The Peak-Patch Picture of Cosmic Catalogs. II. Validation  

NASA Astrophysics Data System (ADS)

We compare hierarchical peak-patch catalogs with groups and clusters constructed using Couchman's adaptive P^3^M simulations of a "standard" CDM model with amplitude parameter σ_8 ~ 1. The N-body groups are found using an identification algorithm based on average cluster overdensity, and the peak-patch properties were determined using algorithms from Paper I. We show that the best agreement is obtained if we use (1) density peaks rather than shear eigenvalue peaks as candidate points, (2) ellipsoidal rather than spherical collapse dynamics, thereby including external tidal effects, and (3) a binary reduction method as opposed to a full exclusion method for solving the cloud-in-cloud problem of peak theory. These are also the best choices physically. The mass and internal energy distributions of the peaks and groups are quite similar, but the group kinetic energy distribution is offset by ~12% in velocity dispersion, reflecting our finding that the N-body clusters are invariably out of isolated virial equilibrium. Individual peak-to-group comparisons show good agreement for high-mass, tightly bound groups, with growing scatter for lower masses and looser binding. The final state (Eulerian) spatial distributions of peak patches and N-body clusters are shown to be satisfyingly close. There is an indication of the need for a small nonlinear correction to the Zeldovich peak velocities.

Bond, J. R.; Myers, S. T.

1996-03-01

158

Multi-terminal Subsystem Model Validation for Pacific DC Intertie  

SciTech Connect

This paper proposes to validate the dynamic model of the Pacific DC Intertie with the concept of hybrid simulation by combining simulation with PMU measurements. The Playback function available in GE PSLF is adopted for hybrid simulation. The feasibility of using the Playback function on a multi-terminal subsystem is demonstrated for the first time. Sensitivity studies are also presented to address common PMU measurement quality problems, i.e., offset noise and time synchronization. Results indicate generally good tolerance of the PDCI model. It is recommended that requirements be applied to phasor measurements in model validation work to ensure better analysis. Key parameters are identified based on the impact of value changes on model behavior. Two events are employed for preliminary model validation with PMU measurements. Suggestions are made for future PDCI model validation work.

Yang, Bo; Huang, Zhenyu; Kosterev, Dmitry

2008-07-20

159

Evaluation of Cloudiness in AMIP II Models  

NASA Astrophysics Data System (ADS)

Clouds are clearly an important and very uncertain component of climate and climate change. A major goal of the Atmospheric Model Intercomparison Project (AMIP) II is to better understand how well global climate models simulate clouds and radiation. AMIP II results for 1979-1993 are available for more than 20 of the world's global climate models. The available AMIP II cloud products include monthly means of vertically integrated total cloud amounts and cloud water and ice concentration, and layer values of cloud amount and cloud water/ice. In preliminary analyses the vertically integrated cloud quantities have been compared with quite analogous monthly global cloud products from the International Satellite Cloud Climatology Project (ISCCP) D2 analysis as well as surface based observations. Overall, long-term means of zonal average cloudiness from the AMIP II models are closer to observations than those of the AMIP I models. However, most models have root-mean-square (rms) differences with the D2 observations of greater than 0.12 cloud fraction even after global means have been removed. Global mean cloud optical depths range from approximately the D2 value to more than twice that value for many models; rms differences with the D2 observations are often as large as or larger than the global means. For the small number of AMIP II models with currently available cloud level information, there is generally poor agreement with the surface observations of low clouds. These models do, however, have better agreement with D2 high cloud amounts. Some of the disagreement with respect to low clouds is due to the large uncertainties associated with the choice of the proper "overlap parameter" to apply to the model output. In conclusion, AMIP II model clouds still agree rather poorly with the best available observations. Especially large uncertainties exist in determining even the qualitative nature of the agreement between models and observations with respect to the vertical structure of clouds.

Weare, B. C.

160

A time-domain approach to model validation  

Microsoft Academic Search

In this paper we offer a novel approach to control-oriented model validation problems. The problem is to decide whether a postulated nominal model with bounded uncertainty is consistent with measured input-output data. Our approach directly uses time-domain input-output data to validate uncertainty models. The algorithms we develop are computationally tractable and reduce to (generally nondifferentiable) convex feasibility problems or to

Kameshwar Poolla; Pramod Khargonekar; Ashok Tikku; James Krause; Krishan Nagpal

1994-01-01

161

Model Selection for Probabilistic Clustering using Cross-Validated Likelihood  

Microsoft Academic Search

Cross-validated likelihood is investigated as a tool for automatically determining the appropriate number of components (given the data) in finite mixture modelling, particularly in the context of model-based probabilistic clustering. The conceptual framework for the cross-validation approach to model selection is direct in the sense that models are judged directly on their out-of-sample predictive performance. The method is applied to a well-known clustering problem in

Padhraic Smyth

1998-01-01
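
A minimal sketch of the model-selection idea above: score candidate mixture models by their out-of-sample log-likelihood and keep the component count that predicts held-out data best. The synthetic two-component data and the five-fold split are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import KFold

# Synthetic data drawn from two well-separated Gaussian clusters
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])

for k in range(1, 6):
    scores = []
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        gmm = GaussianMixture(n_components=k, random_state=0).fit(X[train])
        scores.append(gmm.score(X[test]))   # mean held-out log-likelihood
    print(f"k={k}: CV log-likelihood = {np.mean(scores):.3f}")
```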

162

Validation of Numerical Shallow Water Models for Tidal Lagoons  

SciTech Connect

An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.

Eliason, D.; Bourgeois, A.

1999-11-01
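
The validation pattern above, checking that a numerical model's error against an analytical solution shrinks at the expected rate as resolution increases, is general. The sketch below applies it to a linear decay ODE with forward Euler standing in for the shallow water solver; the equation and solver are illustrative assumptions, not the paper's finite difference model.

```python
import numpy as np

def solve_euler(lam, t_end, n_steps):
    """Forward-Euler solution of du/dt = -lam*u with u(0) = 1."""
    dt, u = t_end / n_steps, 1.0
    for _ in range(n_steps):
        u += dt * (-lam * u)
    return u

lam, t_end = 2.0, 1.0
exact = np.exp(-lam * t_end)                 # analytical solution
for n in (10, 20, 40, 80, 160):
    err = abs(solve_euler(lam, t_end, n) - exact)
    print(f"n={n:4d}  error={err:.2e}")
# Halving the step size should roughly halve the error (first order),
# confirming convergence toward the analytical solution.
```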

163

Expert and construct validity of the Simbionix GI Mentor II endoscopy simulator for colonoscopy  

Microsoft Academic Search

Objectives  The main objectives of this study were to establish expert validity (a convincing realistic representation of colonoscopy\\u000a according to experts) and construct validity (the ability to discriminate between different levels of expertise) of the Simbionix\\u000a GI Mentor II virtual reality (VR) simulator for colonoscopy tasks, and to assess the didactic value of the simulator, as judged\\u000a by experts.\\u000a \\u000a \\u000a \\u000a Methods  Four groups

Arjun D. Koch; Sonja N. Buzink; Jeroen Heemskerk; Sanne M. B. I. Botden; Roeland Veenendaal; Jack J. Jakimowicz; Erik J. Schoon

2008-01-01

164

Models for Validating Content Coverage. Instructional Assessment Project.  

ERIC Educational Resources Information Center

Two papers are presented that resulted from the Instructional Assessment Project of the Center for Research on Evaluation, Standards, and Student Testing. Both relate to models for assuring assessment validity. The first paper--"Assessing the Content Validity of Teachers' Reports of Content Coverage and Its Relationship to Student Achievement" by…

Burstein, Leigh; And Others

165

Validation of Measurement Models in Global Marketing Research  

Microsoft Academic Search

Global market studies entail more sources of errors, making validation of the measurement of marketing constructs a crucial issue in such research. Yet, few international/global marketing studies establish the validity of their measurement models before using them for analysis and interpretation. Using data collected from Hong Kong, Korea, the Philippines, Kenya, and Malawi, we demonstrate how LISREL can be used

Li Zhang; Kofi Q. Dadzie

1994-01-01

166

Cross-Validation in Statistical Climate Forecast Models  

Microsoft Academic Search

Cross-validation is a statistical procedure that produces an estimate of forecast skill which is less biased than the usual hindcast skill estimates. The cross-validation method systematically deletes one or more cases in a dataset, derives a forecast model from the remaining cases, and tests it on the deleted case or cases. The procedure is nonparametric and can be applied to

Joel Michaelsen

1987-01-01
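
A minimal sketch of the deletion procedure described above: each case is held out in turn, the forecast model is refit on the remaining cases, and skill is scored on the held-out forecasts. The linear model and synthetic data are assumptions; the point is the contrast with the inflated hindcast skill.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 40
x = rng.standard_normal(n)                  # predictor (e.g., an SST index)
y = 0.6 * x + 0.8 * rng.standard_normal(n)  # predictand (e.g., seasonal rain)

# Leave-one-out cross-validation: delete case i, refit, forecast case i
forecasts = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    slope, intercept = np.polyfit(x[keep], y[keep], 1)
    forecasts[i] = slope * x[i] + intercept

hindcast = np.polyval(np.polyfit(x, y, 1), x)   # fit and test on same data
print("hindcast skill (inflated):", np.corrcoef(hindcast, y)[0, 1])
print("cross-validated skill    :", np.corrcoef(forecasts, y)[0, 1])
```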

167

VALIDATION METHODS FOR CHEMICAL EXPOSURE AND HAZARD ASSESSMENT MODELS  

EPA Science Inventory

Mathematical models and computer simulation codes designed to aid in hazard assessment for environmental protection must be verified and validated before they can be used with confidence in a decision-making or priority-setting context. Operational validation, or full-scale testi...

168

A probabilistic model for validation of behavioral hierarchies  

Microsoft Academic Search

A probabilistic model for the validation of behavioral hierarchies is presented. Estimation is by means of iterative convergence to maximum likelihood estimates, and two approaches to assessing the fit of the model to sample data are discussed. The relation of this general probabilistic model to other more restricted models which have been presented previously is explored and three cases of

C. Mitchell Dayton; George B. Macready

1976-01-01

169

Coffee, Segregation, Energy, and the Law: Validating Simulation Models  

Microsoft Academic Search

This paper provides a framework for discussing the empirical validation of computer simulation models of market phenomena. It considers first a simulation model of a market of competing coffee brands to ask some questions about the purposes and practices of simulation modeling. It defines functional complexity and derives measures of this for Schelling's Segregation model. It refers

Robert E. Marks

170

Parameter Estimation and Validation of Groundwater Flow Models  

Microsoft Academic Search

The reliability of model predictions is determined by the accuracy of the calibration, that is, it depends on the solution of the inverse problem of the groundwater flow equation. Provided that this problem is solved, the model can be validated. If the model is capable of reproducing the measured data for only one significantly different additional hydrologic system state, the model

Rainer Niedermeyer

1998-01-01

171

Comparison and Validation of Two Surface Ship Readiness Models.  

National Technical Information Service (NTIS)

Two models are used by the U.S. Navy to predict surface ship readiness: the Surface Ship Resources to Material Readiness Model (SRM) and the Surface Ship Inventory to Material Readiness Model (SIM). This thesis examines both models, in order to validate t...

B. S. Pennypacker

1994-01-01

172

ECOLOGICAL MODEL TESTING: VERIFICATION, VALIDATION OR NEITHER?  

EPA Science Inventory

Consider the need to make a management decision about a declining animal population. Two models are available to help. Before a decision is made based on model results, the astute manager or policy maker may ask, "Do the models work?" Or, "Have the models been verified or validat...

173

Status of level-2 products of ADEOS-II validation plan  

NASA Astrophysics Data System (ADS)

The mission objectives of ADEOS-II (Midori-II) are to improve the satellite-based global earth observation system, and to obtain earth observation data contributing to a better understanding and elucidation of the global change mechanisms relevant to earth environmental issues. To implement the objectives, five onboard earth observation sensors were selected based on science requirements primarily focused on the quantitative estimation of geophysical parameters describing important processes of the earth system, such as the water and energy cycle, the carbon cycle, and changes in polar stratospheric ozone. This paper describes the status of the level-2 products derived from AMSR and GLI observation data after launch, in the middle of the operational observation/calibration and validation phase, as of the beginning of August 2003, four months after the start of the calibration and validation phase on April 15, 2003.

Igarashi, Tamotsu; Shibata, Akira; Sasaki, Masayuki; Hashimoto, Toshiaki; Imaoka, Keiji; Nakajima, Takashi Y.; Murakami, Hiroshi; Hori, Masahiro; Yamamoto, Hirokazu; Nakayama, Masashige

2004-02-01

174

Using virtual reality to validate system models  

SciTech Connect

To date, most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

Winter, V.L.; Caudell, T.P.

1999-12-09

175

Designing Experiments to Fit and Validate Regression Models.  

National Technical Information Service (NTIS)

A graphical procedure is proposed for selecting designs suitable for estimation and validation of regression models. The procedure is exemplified with polynomial regression involving one and two explanatory variables.

J. A. Hoekstra

1984-01-01

176

Validation of Thermal Models for a Prototypical MEMS Thermal Actuator.  

National Technical Information Service (NTIS)

This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and ...

E. S. Piekos J. R. Serrano J. R. Torczynski L. M. Phinney M. A. Gallis

2008-01-01

177

Modeling and validating the grabbing forces of hydraulic log ...  

Treesearch

An operational model grapple was designed and tested to validate grabbing forces of ... The results can be used by equipment manufacturers and researchers ...

178

Statistical Validation of Engineering and Scientific Models: Background  

SciTech Connect

A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.

Hills, Richard G.; Trucano, Timothy G.

1999-05-01
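
Of the methods in the tutorial above, Monte Carlo propagation of uncertainty is the simplest to sketch. Below, uncertain mass, damping, and stiffness are sampled and pushed through the underdamped free response of a spring-mass system; the parameter distributions, initial conditions, and response time are illustrative assumptions, not the tutorial's actual numbers.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
m = rng.normal(1.0, 0.05, n)     # mass, kg
c = rng.normal(0.3, 0.03, n)     # damping coefficient, N*s/m
k = rng.normal(40.0, 2.0, n)     # spring stiffness, N/m

t = 1.0                                      # response time of interest, s
sigma = c / (2.0 * m)                        # decay rate
omega_d = np.sqrt(k / m - sigma**2)          # damped natural frequency
# Underdamped free response with x(0) = 1, x'(0) = 0
x = np.exp(-sigma * t) * (np.cos(omega_d * t)
                          + (sigma / omega_d) * np.sin(omega_d * t))

print(f"mean displacement at t = 1 s: {x.mean():+.4f}")
print(f"95% interval: [{np.percentile(x, 2.5):+.4f}, "
      f"{np.percentile(x, 97.5):+.4f}]")
```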

179

Validation of the Gallagher Protonospheric Model.  

National Technical Information Service (NTIS)

Ionospheric models are used in many systems throughout the Department of Defense: for example, they are useful in correcting range errors in radio signals. However, correction models don't incorporate the protonosphere, the torus-shaped plasma volume abov...

K. M. Law

1999-01-01

180

On the problem of model validation for predictive exposure assessments  

NASA Astrophysics Data System (ADS)

The development and use of models for predicting exposures are increasingly common and are essential for many risk assessments of the United States Environmental Protection Agency (EPA). Exposure assessments conducted by the EPA to assist regulatory or policy decisions are often challenged to demonstrate their “scientific validity”. Model validation has thus inevitably become a major concern of both EPA officials and the regulated community, sufficiently so that the EPA's Risk Assessment Forum is considering guidance for model validation. The present paper seeks to codify the issues and extensive foregoing discussion of validation with special reference to the development and use of models for predicting the impact of novel chemicals on the environment. Its preparation has been part of the process in formulating a White Paper for the EPA's Risk Assessment Forum. Its subject matter has been drawn from a variety of fields, including ecosystem analysis, surface water quality management, the contamination of groundwaters from high-level nuclear waste, and the control of air quality. The philosophical and conceptual bases of model validation are reviewed, from which it is apparent that validation should be understood as a task of product (or tool) design, for which some form of protocol for quality assurance will ultimately be needed. The commonly used procedures and methods of model validation are also reviewed, including the analysis of uncertainty. Following a survey of past attempts at resolving the issue of model validation, we close by introducing the notion of a model having maximum relevance to the performance of a specific task, such as, for example, a predictive exposure assessment.

Beck, M. B.; Ravetz, J. R.; Mulkey, L. A.; Barnwell, T. O.

1997-06-01

181

Monte Carlo-based validation of neutronic methodology for EBR-II analyses  

SciTech Connect

The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.

Liaw, J.R.; Finck, P.J. (Argonne National Lab., IL (United States))

1993-01-01

182

Towards integrated design evaluation: Validation of models  

Microsoft Academic Search

This paper professes the importance of the evaluation activity, particularly during the conceptual phase of the engineering design process. It provides a review of a range of complementary models and reports on research aimed at modelling the evaluation of conceptual designs, leading to the proposal of a general framework enabling the combination of separate models into a possible future integrated design evaluation

Graham Green

2000-01-01

183

Circumplex Structure and Personality Disorder Correlates of the Interpersonal Problems Model (IIP-C): Construct Validity and Clinical Implications  

ERIC Educational Resources Information Center

This study assessed the construct validity of the circumplex model of the Inventory of Interpersonal Problems (IIP-C) in Norwegian clinical and nonclinical samples. Structure was examined by evaluating the fit of the circumplex model to data obtained by the IIP-C. Observer-rated personality disorder criteria (DSM-IV, Axis II) were used as…

Monsen, Jon T.; Hagtvet, Knut A.; Havik, Odd E.; Eilertsen, Dag E.

2006-01-01

184

Calibration and validation of DRAINMOD to model bioretention hydrology  

NASA Astrophysics Data System (ADS)

Bioretention hydrology was modeled with DRAINMOD, a widely accepted drainage model. Four bioretention cells, each monitored for a two-year period, were used in calibration. DRAINMOD can represent an internal water storage (IWS) zone configuration and can be used to predict bioretention hydrology on a continuous, long-term basis. In the validation period, Nash-Sutcliffe coefficients commonly exceeded 0.7.
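
For reference, the Nash-Sutcliffe efficiency cited above can be computed as below; the observed and simulated series are hypothetical, not the study's data.

```python
# Nash-Sutcliffe efficiency (NSE): 1 is a perfect fit, 0 means the model
# predicts no better than the mean of the observations.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

# Illustrative outflow volumes (not data from the study):
obs = [12.0, 5.5, 0.0, 8.2, 3.1]
sim = [11.1, 6.0, 0.4, 7.5, 3.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```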

Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

2013-04-01

185

STRUCTURAL VALIDATION OF SYSTEM DYNAMICS AND AGENT BASED SIMULATION MODELS  

Microsoft Academic Search

Simulation models are becoming increasingly popular in the analysis of important policy issues including global warming, population dynamics, energy systems, and urban planning. The usefulness of these models is predicated on their ability to link observable patterns of behavior of a system to micro-level structures. This paper argues that structural validity of a simulation model - right behavior for the right

Hassan Qudrat-Ullah

186

A Cartilage Growth Mixture Model With Collagen Remodeling: Validation Protocols  

Microsoft Academic Search

A cartilage growth mixture (CGM) model is proposed to address limitations of a model used in a previous study. New stress constitutive equations for the solid matrix are derived and collagen (COL) remodeling is incorporated into the CGM model by allowing the intrinsic COL material constants to evolve during growth. An analytical validation protocol based on experimental data from a

Stephen M. Klisch; Anna Asanbaeva; Sevan R. Oungoulian; Koichi Masuda; Eugene J.-MA. Thonar; Andrew Davol; Robert L. Sah

2008-01-01

187

Validation of a Model of the Domino Effect?  

Microsoft Academic Search

A recent paper proposing a model of the limiting speed of the domino effect is discussed with reference to its need and the need of models in general for validation against experimental data. It is shown that the proposed model diverges significantly from experimentally derived speed estimates over a significant range of domino spacing using data from the existing literature

Ron Larham

2008-01-01

188

Validating simulation model cycle times at Seagate Technology  

Microsoft Academic Search

This paper describes the validation of cycle times in a factory simulation model of a new recording head wafer manufacturing facility at Seagate Technology, Minneapolis, MN. The project goals were to determine which factors were causing cycle time deltas between the model and the actual factory, and to add detail to the simulation model to bring cycle times closer to

Navdeep S. Grewal; Alvin C. Bruska; Timbur M. Wulf; Jennifer K. Robinson

1999-01-01

189

Validating simulation model cycle times at Seagate Technology  

Microsoft Academic Search

This paper describes the validation of cycle times in a factory simulation model of a new Recording Head Wafer manufacturing facility at Seagate Technology, Minneapolis, MN. The project goals were to determine which factors were causing cycle time deltas between the model and the actual factory, and to add detail to the simulation model to bring cycle times closer to

Navdeep S. Grewal; Alvin C. Bruska; Timbur M. Wulf; Jennifer K. Robinson

1999-01-01

190

Parameterisation, calibration and validation of distributed hydrological models  

Microsoft Academic Search

This paper emphasizes the different requirements for calibration and validation of lumped and distributed models. On the basis of a theoretically founded modelling protocol, the different steps in distributed hydrological modelling are illustrated through a case study based on the MIKE SHE code and the 440km2 Karup catchment in Denmark. The importance of a rigorous and purposeful parameterisation is emphasized

Jens Christian Refsgaard

1997-01-01

191

Reversible Oregonator model revisited: Thermodynamic validity  

Microsoft Academic Search

We have investigated the features of the thermodynamic equilibrium state of the reversible Oregonator (RO) model with a closed-system approximation for a plausible stoichiometry, in which there is no overall change in the concentrations of the intermediates. For Field-Försterling (ff) parameters, this model with the closed-system approximation attains the state of thermodynamic equilibrium at the equilibrium concentration of its final product in

Arun K. Dutt

2011-01-01

192

SWAT: Model use, calibration, and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

SWAT (Soil and Water Assessment Tool) is a comprehensive, semi-distributed river basin model that requires a large number of input parameters which complicates model parameterization and calibration. Several calibration techniques have been developed for SWAT including manual calibration procedures...

193

Combustion turbine dynamic model validation from tests  

Microsoft Academic Search

Studies have been conducted on the Alaskan Railbelt System to examine the hydrothermal power system response after the hydroelectric power units at Bradley Lake are installed. The models and data for the generating units for the initial studies were not complete. Typical models were used, but their response appeared to be faster than judged by operating experience. A testing program

L. N. Hannett; Afzal Khan

1993-01-01

194

Validating the Mexican American Intergenerational Caregiving Model  

ERIC Educational Resources Information Center

The purpose of this study was to substantiate and further develop a previously formulated conceptual model of Role Acceptance in Mexican American family caregivers by exploring the theoretical strengths of the model. The sample consisted of women older than 21 years of age who self-identified as Hispanic, were related through consanguinal or…

Escandon, Socorro

2011-01-01

195

Cross-Validation of Regression Models  

Microsoft Academic Search

A methodology for assessment of the predictive ability of regression models is presented. Attention is given to models obtained via subset selection procedures, which are extremely difficult to evaluate by standard techniques. Cross-validatory assessments of predictive ability are obtained and their use illustrated in examples.

Richard R. Picard; R. Dennis Cook

1984-01-01

196

Development and validation of a two-phase, three-dimensional model for PEM fuel cells.  

SciTech Connect

The objectives of this presentation are: (1) to develop and validate a two-phase, three-dimensional transport model for simulating PEM fuel cell performance under a wide range of operating conditions; (2) to apply the validated PEM fuel cell model to improve fundamental understanding of key phenomena involved and to identify rate-limiting steps and develop recommendations for improvements so as to accelerate the commercialization of fuel cell technology; (3) to employ the validated PEMFC model to improve and optimize PEM fuel cell operation. Consequently, the project helps: (i) address the technical barriers on performance, cost, and durability; and (ii) achieve DOE's near-term technical targets on performance, cost, and durability in automotive and stationary applications.

Chen, Ken Shuang

2010-04-01

197

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

NASA Astrophysics Data System (ADS)

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm, except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020 nm extinction ratio shows a consistent bias of ~30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than SAGE II/III comparisons. In addition, we find that SAGE II and POAM III data sets are not well correlated at and below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.
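
A sketch of the kind of bias statistic such intercomparisons report: the median relative difference between coincident extinction measurements from two instruments. The arrays below are hypothetical stand-ins, not SAGE or POAM data.

```python
# Median relative bias between coincident measurements from two
# instruments, as a percentage. Values are illustrative only.
import numpy as np

def median_relative_bias(ext_a, ext_b):
    """Median of (a - b) / b over coincident samples, in percent."""
    ext_a, ext_b = np.asarray(ext_a), np.asarray(ext_b)
    return 100.0 * np.median((ext_a - ext_b) / ext_b)

# Illustrative 1020 nm extinction coefficients (km^-1) at matched levels:
sage3 = np.array([1.1e-4, 2.3e-4, 3.0e-4, 4.2e-4])
sage2 = np.array([1.3e-4, 2.6e-4, 3.5e-4, 4.9e-4])
print(f"SAGE III vs SAGE II bias: {median_relative_bias(sage3, sage2):.1f}%")
```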

Thomason, L. W.; Poole, L. R.; Randall, C. E.

2007-03-01

198

VERIFICATION AND VALIDATION OF THE SPARC MODEL  

EPA Science Inventory

Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values--that is, the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships that allow estimation of some ...

199

Validating predictions from climate envelope models.  

PubMed

Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species. PMID:23717452
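
Sensitivity and specificity as used above are simple functions of the presence/absence confusion matrix; a minimal sketch with illustrative data:

```python
# Sensitivity and specificity from observed vs predicted presence/absence
# at evaluation sites. Data are illustrative only.
import numpy as np

def sensitivity_specificity(observed, predicted):
    observed = np.asarray(observed, dtype=bool)
    predicted = np.asarray(predicted, dtype=bool)
    tp = np.sum(observed & predicted)    # true presences
    tn = np.sum(~observed & ~predicted)  # true absences
    fn = np.sum(observed & ~predicted)   # missed presences
    fp = np.sum(~observed & predicted)   # false presences
    return tp / (tp + fn), tn / (tn + fp)

obs = [1, 1, 0, 0, 1, 0, 1, 0]
pred = [1, 0, 0, 1, 1, 0, 1, 0]
sens, spec = sensitivity_specificity(obs, pred)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```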

Watling, James I; Bucklin, David N; Speroterra, Carolina; Brandt, Laura A; Mazzotti, Frank J; Romañach, Stephanie S

2013-05-23

200

Validation of geometric models for fisheye lenses  

Microsoft Academic Search

The paper focuses on the photogrammetric investigation of geometric models for different types of optical fisheye constructions (equidistant, equisolid-angle, stereographic and orthographic projection). These models were implemented and thoroughly tested in a spatial resection and a self-calibrating bundle adjustment. For this purpose, fisheye images were taken with a Nikkor 8 mm fisheye lens on a Kodak DSC 14n Pro digital camera

D. Schneider; E. Schwalbe; H.-G. Maas

2009-01-01

201

Improving Communication with People with an Intellectual Disability: The content validation of the Biala-II profile  

Microsoft Academic Search

This study explores the content validity of a profile used to describe the communication behaviour of people with intellectual disabilities. The profile, named Biala-II - a Wiradjuri (an Australian Aboriginal language) word for …

Anthony J. Shaddock; Anthony T. Spinks; Anna Esbensen

2000-01-01

202

Solution Verification Linked to Model Validation, Reliability, and Confidence  

SciTech Connect

The concepts of Verification and Validation (V&V) can be oversimplified in a succinct manner by saying that 'verification is doing things right' and 'validation is doing the right thing'. In the world of the Finite Element Method (FEM) and computational analysis, it is sometimes said that 'verification means solving the equations right' and 'validation means solving the right equations'. In other words, if one intends to give an answer to the equation '2+2=', then one must run the resulting code to assure that the answer '4' results. However, if the nature of the physics or engineering problem being addressed with this code is multiplicative rather than additive, then even though Verification may succeed (2+2=4 etc), Validation may fail because the equations coded are not those needed to address the real world (multiplicative) problem. We have previously provided a 4-step 'ABCD' implementation of a quantitative V&V process: (A) Plan the analyses and validation testing that may be needed along the way. Assure that the code[s] chosen have sufficient documentation of software quality and Code Verification (i.e., does 2+2=4?). Perform some calibration analyses and calibration-based sensitivity studies (these are not validated sensitivities but are useful for planning purposes). Outline the data and validation analyses that will be needed to turn the calibrated model (and calibrated sensitivities) into validated quantities. (B) Solution Verification: For the system or component being modeled, quantify the uncertainty and error estimates due to spatial, temporal, and iterative discretization during solution. (C) Validation over the data domain: Perform a quantitative validation to provide confidence-bounded uncertainties on the quantity of interest over the domain of available data. (D) Predictive Adequacy: Extend the model validation process of 'C' out to the application domain of interest, which may be outside the domain of available data in one or more planes of multi-dimensional space. Part 'D' should provide the numerical information about the model and its predictive capability such that given a requirement, an adequacy assessment can be made to determine if more validation analyses or data are needed.
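
Step (C) above calls for confidence-bounded uncertainties over the data domain. One common way to bound the mean model-versus-test error is a Student-t interval; a minimal sketch with hypothetical residuals (the report's actual procedure may differ):

```python
# Confidence-bounded mean error over a set of validation points,
# using a 95% Student-t interval. Residuals are illustrative only.
import numpy as np
from scipy import stats

residuals = np.array([0.8, -1.2, 0.3, 1.9, -0.5, 0.7])  # model - test
mean = residuals.mean()
sem = stats.sem(residuals)  # standard error of the mean
lo, hi = stats.t.interval(0.95, len(residuals) - 1, loc=mean, scale=sem)
print(f"mean error = {mean:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```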

Logan, R W; Nitta, C K

2004-06-16

203

Discussion of model calibration and validation for transient dynamics simulation.  

SciTech Connect

Model calibration refers to a family of inverse problem-solving numerical techniques used to infer the value of parameters from test data sets. The purpose of model calibration is to optimize parametric or non-parametric models in such a way that their predictions match reality. In structural dynamics an example of calibration is the finite element model updating technology. Our purpose is essentially to discuss calibration in the broader context of model validation. Formal definitions are proposed and the notions of calibration and validation are illustrated using an example of transient structural dynamics that deals with the propagation of a shock wave through a hyper-foam pad. An important distinction that has not been made in finite element model updating and that is introduced here is that parameters of the numerical models or physical tests are categorized into input parameters, calibration variables, controllable and uncontrollable variables. Such classification helps to define model validation goals. Finally, a path forward for validating numerical models is discussed and the relationship with uncertainty assessment is stressed.

Hemez, F. M. (François M.); Doebling, S. W. (Scott W.); Wilson, A. C. (Amanda C.)

2001-01-01

204

Validation of nuclear models used in space radiation shielding applications  

NASA Astrophysics Data System (ADS)

A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
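
A minimal sketch of an interval-style metric in the spirit described above: a prediction inside the experimental uncertainty interval incurs zero discrepancy, and one outside is scored by its distance to the nearest endpoint. The function name and all data are hypothetical, not NASA's implementation.

```python
# Interval-based discrepancy between model predictions and measurements
# whose uncertainty is treated as an interval. Data are illustrative.
import numpy as np

def interval_discrepancy(model, measured, uncertainty):
    model, measured, unc = map(np.asarray, (model, measured, uncertainty))
    lower, upper = measured - unc, measured + unc
    # Zero inside the interval; distance to nearest endpoint outside.
    dist = np.where(model < lower, lower - model,
                    np.where(model > upper, model - upper, 0.0))
    return dist / measured  # relative discrepancy per data point

d = interval_discrepancy(model=[105.0, 90.0, 70.0],
                         measured=[100.0, 100.0, 70.0],
                         uncertainty=[3.0, 5.0, 7.0])
# Cumulative and median summaries, echoing the two metrics described.
print(f"cumulative = {d.sum():.3f}, median = {np.median(d):.3f}")
```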

Norman, Ryan B.; Blattnig, Steve R.

2013-01-01

205

The Reliability and Validity of the Brief Acculturation Rating Scale for Mexican Americans-II for Children and Adolescents  

Microsoft Academic Search

This study investigated the reliability and validity of the Brief Acculturation Rating Scale for Mexican Americans-II (ARSMA-II) using two samples of Mexican American children: 292 middle school students from a mid-sized culturally diverse southwestern city, and 116 third-through fifth graders in culturally homogeneous rural elementary schools. Results provided evidence of the reliability and validity of this measure of acculturation for

Sheri Bauman

2005-01-01

206

Validation of the Hot Strip Mill Model  

Microsoft Academic Search

The Hot Strip Mill Model (HSMM) is an off-line, PC-based software originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot

Richard Shulkosky; David Rosberg; Jerrud Chapman

2005-01-01

207

Validation and discovery from computational biology models  

Microsoft Academic Search

Simulation software is often a fundamental component in systems biology projects and provides a key aspect of the integration of experimental and analytical techniques in the search for greater understanding and prediction of biology at the systems level. It is important that the modelling and analysis software is reliable and that techniques exist for automating the analysis of the vast

Mariam Kiran; Simon Coakley; Neil Walkinshaw; Phil Mcminn; Mike Holcombe

2008-01-01

208

WEPP: Model use, calibration and validation  

Technology Transfer Automated Retrieval System (TEKTRAN)

The Water Erosion Prediction Project (WEPP) model is a process-based, continuous simulation, distributed parameter, hydrologic and soil erosion prediction system. It has been developed over the past 25 years to allow for easy application to a large number of land management scenarios. Most general o...

209

Understanding internet topology: principles, models, and validation  

Microsoft Academic Search

Building on a recent effort that combines a first-principles approach to modeling router-level connectivity with a more pragmatic use of statistics and graph theory, we show in this paper that for the Internet, an improved understanding of its physical infrastructure is possible by viewing the physical connectivity as an annotated graph that delivers raw connectivity and bandwidth to the upper

David Alderson; Lun Li; Walter Willinger; John C. Doyle

2005-01-01

210

Modeling and Validation of Biased Human Trust  

Microsoft Academic Search

When considering intelligent agents that interact with humans, having an idea of the trust levels of the human, for example in other agents or services, can be of great importance. Most models of human trust that exist, are based on some rationality assumption, and biased behavior is not represented, whereas a vast literature in Cognitive and Social Sciences indicates that

Mark Hoogendoorn; S. Waqar Jaffry; Peter-Paul van Maanen; Jan Treur; P. P. van Maanen

2011-01-01

211

Testing and Validation of a Low Cost Cystoscopy Teaching Model  

PubMed Central

Objective The objective of this study was to determine whether the use of a low cost cystoscopy model effectively trains residents in cystourethroscopy and to validate the model as a teaching tool. Study Design A randomized, controlled, and evaluator-blinded study was performed. Baseline skills in 29 OB/GYN residents were assessed, using the validated Objective Structured Assessment of Technical Skills (OSATS) checklists for cystourethroscopy, on fresh-frozen cadavers. Residents were randomized to one of two arms, a study arm using the cystoscopy model and a control arm. Repeat OSATS testing was performed. Results The study group demonstrated statistically significant decreases in cystoscope assembly time (p=0.004) and increases in task-specific checklist and global rating scale scores (p values <0.0001) compared to the controls. Conclusions Use of the bladder model exhibited validity in enhancing performance and knowledge of cystourethroscopy among OB/GYN residents.

BOWLING, C. Bryce; GREER, W. Jerod; BRYANT, Shannon A.; GLEASON, Jonathan L.; SZYCHOWSKI, Jeff M.; VARNER, R. Edward; HOLLEY, Robert L.; RICHTER, Holly E.

2011-01-01

212

SAGE III aerosol extinction validation in the Arctic winter: comparisons with SAGE II and POAM III  

NASA Astrophysics Data System (ADS)

The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm, except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020 nm extinction ratio shows a consistent bias of ~30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.

Thomason, L. W.; Poole, L. R.; Randall, C. E.

2006-11-01

213

Modeling of sensor nets in Ptolemy II  

Microsoft Academic Search

This paper describes a modeling and simulation framework called VisualSense for wireless sensor networks that builds on and leverages Ptolemy II. This framework supports actor-oriented definition of sensor nodes, wireless communication channels, physical media such as acoustic channels, and wired subsystems. The software architecture consists of a set of base classes for defining channels and sensor nodes, a library of

Philip Baldwin; Sanjeev Kohli; Edward A. Lee; Xiaojun Liu; Yang Zhao

2004-01-01

214

Verification and Validation of Agent-based Scientific Simulation Models  

Microsoft Academic Search

Most formalized model verification and validation techniques come from industrial and system engineering for discrete-event system simulations. These techniques are widely used in computational science. The agent-based modeling approach is different from discrete event modeling approaches largely used in industrial and system engineering in many aspects. Since the agent-based modeling approach has recently become an attractive and

Xiaorong Xiang; Ryan Kennedy; Gregory Madey; Steve Cabaniss

2005-01-01

215

Computational Model with Experimental Validation for DNA Flow in Microchannels.  

National Technical Information Service (NTIS)

The authors compare a computational model to experimental data for DNA-laden flow in microchannels. The purpose of this work in progress is to validate a new numerical algorithm for viscoelastic flow using the Oldroyd-B model. The numerical approach is a ...

A. Nonaka; S. Gulati; D. Trebotich; G. H. Miller; S. J. Muller; D. Liepmann

2005-01-01

216

Validating regional-scale surface energy balance models  

Technology Transfer Automated Retrieval System (TEKTRAN)

One of the major challenges in developing reliable regional surface flux models is the relative paucity of scale-appropriate validation data. Direct comparisons between coarse-resolution model flux estimates and flux tower data can often be dominated by sub-pixel heterogeneity effects, making it di...

217

Model Solution for C3I Message Translation and Validation.  

National Technical Information Service (NTIS)

The purpose of this document is to describe an artifact, the Message Translation and Validation (MTV) model solution. The MTV model solution is a general solution, written in Ada, tha can be used in a system when the system must convert between different ...

C. Plinta; K. Lee; M. Rissman

1989-01-01

218

Institutional Effectiveness: A Model for Planning, Assessment & Validation.  

ERIC Educational Resources Information Center

The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

Truckee Meadows Community Coll., Sparks, NV.

219

Validation of a metabolic cotton seedling emergence model  

Technology Transfer Automated Retrieval System (TEKTRAN)

A seedling emergence model based on thermal dependence of enzyme activity in germinating cotton was developed. The model was validated under both laboratory and field conditions with several cotton lines under diverse temperature regimes. Four commercial lines were planted on four dates in Lubbock T...

220

FINITE ELEMENT MODEL DEVELOPMENT AND VALIDATION FOR AIRCRAFT FUSELAGE STRUCTURES  

Microsoft Academic Search

The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated.

Ralph D. Buehrle; Gary A. Fleming; Richard S. Pappa; Ferdinand W. Grosveld

2000-01-01

221

Institutional Effectiveness: A Model for Planning, Assessment & Validation.  

ERIC Educational Resources Information Center

The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

Truckee Meadows Community Coll., Sparks, NV.

222

Validation of 1-D transport and sawtooth models for ITER  

SciTech Connect

In this paper the authors describe progress on validating a number of local transport models by comparing their predictions with relevant experimental data from a range of tokamaks in the ITER profile database. This database, the testing procedure and results are discussed. In addition a model for sawtooth oscillations is used to investigate their effect in an ITER plasma with alpha-particles.

Connor, J.W.; Turner, M.F. [UKAEA, Culham (United Kingdom)]; Attenberger, S.E.; Houlberg, W.A. [ORNL, Oak Ridge, TN (United States)]; and others

1996-12-31

223

Experiments for foam model development and validation.  

SciTech Connect

A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

2008-09-01

224

ARCJET plasma modeling with experimental validation  

NASA Astrophysics Data System (ADS)

We report for the first time thermal non-equilibrium (separate electron and gas temperatures) numerical results for a hydrazine arcjet. All viscous flow properties are considered, assuming laminar axisymmetric flow. The model includes anode temperature distribution, and the electrical conductivity is coupled to the flow properties, allowing for a self-consistent current distribution. The numerical solution algorithm employs the compressible form of the PISO algorithm to solve the continuity and momentum equations. Run time is a few hours on a Convex C240 mainframe with a 44 x 24 grid. Numerical results are presented for low-power hydrogen and hydrazine thrusters. Preliminary results of quadruple electrostatic probe measurements at the exit plane of a 1 kW hydrazine arcjet, including Ne and Te profiles, are presented. The quadruple probe model includes the effects of Te and Ne gradients across the probe volume to extract Te and Ne radial profiles from the asymmetric raw probe data. A time-of-flight electrostatic probe technique for measuring heavy particle velocities is described which, when coupled with the quadruple probe data, can yield radial profiles of Ne(r), Te(r), Ti(r) and Ui(r). Experimental investigations of the energy deposition processes in the nozzle and constrictor regions of a 1-2 kW hydrazine arcjet are being performed. Electron number density and electron temperature measurements, using an array of flush-mounted Langmuir probes, will be made in the boundary layer.

Krier, Herman; Burton, Rodney L.; Megli, Thomas W.; Bufton, Scott A.; Tiliakos, Nicholas T.

1994-09-01

225

Experimental determination of validated, critical interfacial modes I and II energy release rates in a composite sandwich panel  

Microsoft Academic Search

A validated experimental approach to obtaining critical mode I and mode II energy release rates for interfacial failure in a sandwich composite panel is outlined in this paper. By modifying the geometry of the sandwich structure to align the face sheet-core interface to coincide with the neutral axis, it is possible to obtain critical mode I and mode II energy

Paul Davidson; Anthony M. Waas; Chandra S. Yerramalli

226

The Validation of Climate Models: The Development of Essential Practice  

NASA Astrophysics Data System (ADS)

It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its investigation. This serves not only the scientific method, but the communication of the results of that scientific investigation to other scientists and to those with a stake in those scientific results. It sets a standard, which is essential practice for simulation science with societal ramifications.

Rood, R. B.

2011-12-01

227

Validity of empirical models of exposure in asphalt paving  

PubMed Central

Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger-than-expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers.
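
Bias and precision in this sense are summary statistics of the relative differences between predicted and observed exposures; a minimal sketch with illustrative numbers (not the study's data):

```python
# Bias (mean relative difference) and precision (spread of those
# differences) of an empirical exposure model. Values are illustrative.
import numpy as np

observed = np.array([0.9, 1.8, 0.4, 2.5, 1.1])    # measured, mg/m^3
predicted = np.array([0.4, 0.6, 0.2, 0.8, 0.5])   # modeled, mg/m^3

rel_diff = (predicted - observed) / observed
print(f"bias = {100 * rel_diff.mean():.0f}%")           # negative = underprediction
print(f"precision (sd) = {100 * rel_diff.std(ddof=1):.0f}%")
```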

Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

2002-01-01

228

Development and validation of a three-dimensional morphological model  

Microsoft Academic Search

Computer modeling of sediment transport patterns is generally recognized as a valuable tool for understanding and predicting morphological developments. In practice, state-of-the-art computer models are one- or two-dimensional (depth-averaged) and have a limited ability to model many of the important three-dimensional flow phenomena found in nature. This paper presents the implementation and validation of sediment transport formulations within the proven

G. R. Lesser; J. A. Roelvink; J. A. T. M. van Kester; G. S. Stelling

2004-01-01

229

Structure, use, and validation of the IEUBK model.  

PubMed

The potential impact of the effects of lead in children is a major concern. Although measurements of lead concentration can be made in a geographic area, it is difficult to predict the effects of this exposure that involve complicated biologic functions. Dynamic mathematical models that can be simulated on a digital computer provide one method of analysis to facilitate the prediction process. The integrated exposure uptake biokinetic (IEUBK) model is a dynamic mathematical model that has been discretized for execution on a digital computer. This paper is concerned with the general difficulties in validating a dynamic model of this type. A number of the general pitfalls of validating a model of this type are presented. The illustrations are of a general nature not requiring an understanding of the physiologic effects of lead on children. The concept of validating a model by comparing results to historical data is discussed. A comparison is made with traditional modeling efforts having this form of dynamic model. Also included are general mathematical concepts illustrating potential difficulties with intuitive analyses in calibrating a dynamic model. PMID:9860911

Mickle, M H

1998-12-01

230

Structure, use, and validation of the IEUBK model.  

PubMed Central

The potential impact of the effects of lead in children is a major concern. Although measurements of lead concentration can be made in a geographic area, it is difficult to predict the effects of this exposure that involve complicated biologic functions. Dynamic mathematical models that can be simulated on a digital computer provide one method of analysis to facilitate the prediction process. The integrated exposure uptake biokinetic (IEUBK) model is a dynamic mathematical model that has been discretized for execution on a digital computer. This paper is concerned with the general difficulties in validating a dynamic model of this type. A number of the general pitfalls of validating a model of this type are presented. The illustrations are of a general nature not requiring an understanding of the physiologic effects of lead on children. The concept of validating a model by comparing results to historical data is discussed. A comparison is made with traditional modeling efforts having this form of dynamic model. Also included are general mathematical concepts illustrating potential difficulties with intuitive analyses in calibrating a dynamic model.

Mickle, M H

1998-01-01

231

Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models  

SciTech Connect

One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from the University of British Columbia for purposes of benchmarking the UBC model against other codes. In the course of discussions on recovery boiler modeling during this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABB CE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.

1997-07-01

232

Validation of the Millon Clinical Multiaxial Inventory for Axis II disorders: does it meet the Daubert standard?  

PubMed

Relevant to forensic practice, the Supreme Court in Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993) established the boundaries for the admissibility of scientific evidence that take into account its trustworthiness as assessed via evidentiary reliability. In conducting forensic evaluations, psychologists and other mental health professionals must be able to offer valid diagnoses, including Axis II disorders. The most widely available measure of personality disorders is the Millon Clinical Multiaxial Inventory (MCMI) and its subsequent revisions (MCMI-II and MCMI-III). We address the critical question, "Do the MCMI-II and MCMI-III meet the requirements of Daubert?" Fundamental problems in the scientific validity and error rates for MCMI-III appear to preclude its admissibility under Daubert for the assessment of Axis II disorders. We address the construct validity for the MCMI and MCMI-II via a meta-analysis of 33 studies. The resulting multitrait-multimethod approach allowed us to address their convergent and discriminant validity through method effects (Marsh, 1990). With reference to Daubert, the results suggest a circumscribed use for the MCMI-II with good evidence of construct validity for Avoidant, Schizotypal, and Borderline personality disorders. PMID:10439726

Rogers, R; Salekin, R T; Sewell, K W

1999-08-01

233

Validation of Computer Models for Homeland Security Purposes  

SciTech Connect

At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources.

Schweppe, John E.; Ely, James H.; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

2005-10-23

234

Validity-weighted model vector-based retrieval of video  

NASA Astrophysics Data System (ADS)

Model vector-based retrieval is a novel approach for video indexing that uses a semantic model vector signature that describes the detection of a fixed set of concepts across a lexicon. The model vector basis is created using a set of independent binary classifiers that correspond to the semantic concepts. The model vectors are created by applying the binary detectors to video content and measuring the confidence of detection. Once the model vectors are extracted, simple techniques can be used for searching to find similar matches in a video database. However, since confidence scores alone do not capture information about the reliability of the underlying detectors, techniques are needed to ensure good performance in the presence of varying qualities of detectors. In this paper, we examine the model vector-based retrieval framework for video and propose methods using detector validity to improve matching performance. In particular, we develop a model vector distance metric that weighs the dimensions using detector validity scores. In this paper, we explore the new model vector-based retrieval method for video indexing and empirically evaluate the retrieval effectiveness on a large video test collection using different methods of measuring and incorporating detector validity indicators.
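
A minimal sketch of the core idea, under the assumption that validity enters as a per-dimension weight in the distance between model vectors; the weighting scheme and all values are illustrative, not the paper's exact metric.

```python
# Validity-weighted distance between two model vectors: each dimension
# (one semantic concept detector) is weighted by a validity score, so
# unreliable detectors contribute less to the match. Values illustrative.
import numpy as np

def weighted_distance(mv_query, mv_item, validity):
    """Validity-weighted Euclidean distance between model vectors."""
    mv_query, mv_item, w = map(np.asarray, (mv_query, mv_item, validity))
    return np.sqrt(np.sum(w * (mv_query - mv_item) ** 2))

query = [0.9, 0.1, 0.7]     # detector confidences for the query clip
item = [0.8, 0.5, 0.6]      # confidences for a database clip
validity = [1.0, 0.2, 0.8]  # per-detector validity (e.g. average precision)
print(f"distance = {weighted_distance(query, item, validity):.3f}")
```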

Smith, John R.; Lin, Ching-Yung; Naphade, Milind R.; Natsev, Apostol; Tseng, Belle L.

2003-12-01

235

Numerical model for the performance prediction of a PEM fuel cell. Model results and experimental validation  

Microsoft Academic Search

This work presents a Computational Fluid Dynamics (CFD) model developed for a 50 cm² fuel cell with parallel and serpentine flow field bipolar plates, and its validation against experimental measurements. The numerical CFD model was developed using the commercial ANSYS FLUENT software, and the results obtained were compared with the experimental results in order to perform a model validation. A single

Alfredo Iranzo; Miguel Muñoz; Felipe Rosa; Javier Pino

2010-01-01

236

Modeling of copper(II), cadmium(II), and lead(II) adsorption on red mud  

SciTech Connect

The adsorption of toxic heavy metal cations, i.e., Cu(II), Cd(II), and Pb(II), on red muds has been modeled with the aid of a modified Langmuir approach assuming single-site adsorption and of a double-site binding model incorporating the effect of pH. For equilibrium concentrations of metal solutions between 0.03 and 5.8 mmol·dm⁻³ and equilibrium pH between 4.4 and 5.6, adsorption equilibrium constants corresponding to single- and double-site binding were found by linear and nonlinear least-squares approximation, respectively, and the double-site model was shown to conform better to experimental data. The contributions of the monomeric and dimeric hydroxo-complexes of Cu(II) to total copper adsorption at a fixed pH were also investigated. The Langmuir parameters of adsorption were found with the aid of the linearized Langmuir isotherm. This work aims to clarify heavy metal adsorption behavior on composite sorbents consisting of hydrated oxides.
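
The linearized Langmuir isotherm mentioned above rearranges q_e = q_max·K·C_e/(1 + K·C_e) into the straight line C_e/q_e = C_e/q_max + 1/(K·q_max), so a linear fit of C_e/q_e against C_e recovers the parameters. The data below are illustrative, not the red mud measurements:

```python
# Linearized Langmuir fit: slope = 1/q_max, intercept = 1/(K*q_max).
# Equilibrium data are illustrative only.
import numpy as np

Ce = np.array([0.05, 0.2, 0.8, 2.0, 5.0])      # equilibrium conc., mmol/dm^3
qe = np.array([0.09, 0.28, 0.55, 0.72, 0.82])  # metal uptake, mmol/g

slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max = 1.0 / slope
K = slope / intercept   # since intercept = 1/(K*q_max) and q_max = 1/slope
print(f"q_max = {q_max:.2f} mmol/g, K = {K:.2f} dm^3/mmol")
```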

Apak, R.; Gueclue, K. [Istanbul Univ. (Turkey). Dept. of Chemistry; Turgut, M.H. [Cekmece Nuclear Research and Training Centre, Istanbul (Turkey). Dept. of Nuclear Engineering

1998-07-01

237

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU.  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

Ko, Y. C.; Hu, L. W.; Olson, A. P.; Dunn, F. E.; Nuclear Engineering Division; MIT

2007-01-01

238

Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU  

SciTech Connect

An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory. (author)

Ko, Y.-C. [Nuclear Science and Engineering Department, MIT, Cambridge, MA 02139 (United States); Hu, L.-W. [Nuclear Reactor Laboratory, MIT, Cambridge, MA 02139 (United States)], E-mail: lwhu@mit.edu; Olson, Arne P.; Dunn, Floyd E. [RERTR Program, Argonne National Laboratory, Argonne, IL 60439 (United States)

2008-07-15

239

Validating Work Discrimination and Coping Strategy Models for Sexual Minorities  

ERIC Educational Resources Information Center

The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

2009-01-01

240

ID Model Construction and Validation: A Multiple Intelligences Case  

ERIC Educational Resources Information Center

This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple Intelligences and ID were examined to guide the development…

Tracey, Monica W.; Richey, Rita C.

2007-01-01

241

VALIDATION OF VOID COALESCENCE MODEL FOR DUCTILE DAMAGE.  

SciTech Connect

A model for void coalescence for ductile damage in metals is presented. The basic mechanism is void linking through an instability in the intervoid ligament. The formation probability of void clusters is calculated as a function of cluster size, imposed stress, and strain. Numerical approximations are validated in a 1-D hydrocode.

Tonks, D. L. (Davis L.); Zurek, A. K. (Anna K.); Thissell, W. R. (W. Richards)

2001-01-01

242

On assuring valid measures for theoretical models using survey data  

Microsoft Academic Search

This research critically reviews the process and procedures used in marketing to assure valid and reliable measures for theoretical model tests involving unobserved variables and survey data, and it selectively suggests improvements. The review and suggestions are based on reviews of articles in the marketing literature, and the recent methods literature. This research also provides several perhaps needed explanations and

Robert Ping Jr.

2004-01-01

243

A Model for Investigating Predictive Validity at Highly Selective Institutions.  

ERIC Educational Resources Information Center

A statistical model for investigating predictive validity at highly selective institutions is described. When the selection ratio is small, one must typically deal with a data set containing relatively large amounts of missing data on both criterion and predictor variables. Standard statistical approaches are based on the strong assumption that…

Gross, Alan L.; And Others

244

Validating soil phosphorus routines in the SWAT model  

Technology Transfer Automated Retrieval System (TEKTRAN)

Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

245

MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES  

EPA Science Inventory

This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

246

Analytical thermal model validation for Cassini radioisotope thermoelectric generator  

Microsoft Academic Search

The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor of the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding

Edward L Lin

1997-01-01

247

Solar swimming pool heating: Description of a validated model  

SciTech Connect

In the framework of a European Demonstration Programme, co-financed by CEC and national bodies, a model was elaborated and validated for open-air swimming pools having a minimal surface of 100 m² and a minimal depth of 0.5 m. The model consists of two parts, the energy balance of the pool and the solar plant. The theoretical background of the energy balance of an open-air swimming pool was found to be poor. Special monitoring campaigns were used to validate the dynamic model using mathematical parameter identification methods. The final model was simplified in order to shorten calculation time and to improve user-friendliness by reducing the input values to the most important ones. The programme is commercially available. However, it requires the hourly meteorological data of a test reference year (TRY) as an input. The users are mainly designing engineers.

Haaf, W.; Luboschik, U.; Tesche, B. (IST Energietechnik GmbH, Hauptsitz Wollbach, Kandern (Germany))

1994-07-01

248

TAYLOR MODELS AND OTHER VALIDATED FUNCTIONAL INCLUSION METHODS  

Microsoft Academic Search

A detailed comparison between Taylor model methods and other tools for validated computations is provided. Basic elements of the Taylor model (TM) methods are reviewed, beginning with the arithmetic for elementary operations and intrinsic functions. We discuss some of the fundamental properties, including high approximation order and the ability to control the dependency problem, and pointers to many of the more advanced TM tools are provided. Aspects of the

K. Makino; M. Berz

2003-01-01

249

ID model construction and validation: a multiple intelligences case  

Microsoft Academic Search

This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple intelligences and ID were examined to guide the development of such a model. In phase two the

Monica W. Tracey; Rita C. Richey

2007-01-01

250

Validating and Calibrating Agent-Based Models: A Case Study  

Microsoft Academic Search

In this paper we deal with some validation and calibration experiments on a modified version of the Complex Adaptive Trivial System (CATS) model proposed in Gallegati et al. (2005, Journal of Economic Behavior and Organization, 56, 489–512). The CATS model has been extensively used to replicate a large number of scaling-type stylized facts with a remarkable degree of precision. For

Carlo Bianchi; Pasquale Cirillo; Mauro Gallegati; Pietro A. Vagliasindi

2007-01-01

251

Reverse electrodialysis: A validated process model for design and optimization  

Microsoft Academic Search

Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter- and co-current operation. It was decided to focus on co-current design because

J. Veerman; M. Saakes; S. J. Metz; G. J. Harmsen

2011-01-01

252

A standardized approach to PV system performance model validation.  

SciTech Connect

PV performance models are used to predict how much energy a PV system will produce at a given location and subject to prescribed weather conditions. These models are commonly used by project developers to choose between module technologies and array designs (e.g., fixed tilt vs. tracking) for a given site or to choose between different geographic locations, and are used by the financial community to establish project viability. Available models can differ significantly in their underlying mathematical formulations and assumptions and in the options available to the analyst for setting up a simulation. Some models lack complete documentation and transparency, which can result in confusion on how to properly set up, run, and document a simulation. Furthermore, the quality and associated uncertainty of the available data upon which these models rely (e.g., irradiance, module parameters, etc.) is often quite variable and frequently undefined. For these reasons, many project developers and other industry users of these simulation tools have expressed concerns related to the confidence they place in PV performance model results. To address this problem, we propose a standardized method for the validation of PV system-level performance models and a set of guidelines for setting up these models and reporting results. This paper describes the basic elements for a standardized model validation process adapted especially for PV performance models, suggests a framework to implement the process, and presents an example of its application to a number of available PV performance models.

Stein, Joshua S.; Jester, Terry (Hudson Clean Energy Partners); Posbic, Jean (BP Solar); Kimber, Adrianne (First Solar); Cameron, Christopher P.; Bourne, Benjamin (SunPower Corporation)

2010-10-01
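The record above proposes a standardized validation process without prescribing particular statistics; two metrics commonly reported when comparing modeled with measured PV output are normalized RMSE and mean bias error. A minimal Python sketch, using purely hypothetical hourly values:

    import numpy as np

    # Hypothetical hourly AC energy (kWh): measured vs. modeled for one day.
    measured = np.array([0.0, 1.2, 3.4, 4.1, 3.8, 1.5, 0.0])
    modeled = np.array([0.0, 1.0, 3.6, 4.4, 3.5, 1.7, 0.1])

    residuals = modeled - measured
    rmse = np.sqrt(np.mean(residuals ** 2))  # spread of the model error
    mbe = np.mean(residuals)                 # systematic over/under-prediction
    # Normalizing by mean measured output lets systems of different size be compared.
    print(f"nRMSE = {rmse / measured.mean():.1%}, nMBE = {mbe / measured.mean():.1%}")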

253

Validating the BHR RANS model for variable density turbulence  

SciTech Connect

The BHR RANS model is a turbulence model for multi-fluid flows in which density variation plays a strong role in the turbulence processes. In this paper they demonstrate the usefulness of BHR over a wide range of flows which include the effects of shear, buoyancy, and shocks. The results are in good agreement with experimental and DNS data across the entire set of validation cases, with no need to retune model coefficients between cases. The model has potential application to a number of aerospace related flow problems.

Israel, Daniel M [Los Alamos National Laboratory; Gore, Robert A [Los Alamos National Laboratory; Stalsberg - Zarling, Krista L [Los Alamos National Laboratory

2009-01-01

254

Experimental Validation of Modified Barton's Model for Rock Fractures  

NASA Astrophysics Data System (ADS)

Among the constitutive models for rock fractures developed over the years, Barton’s empirical model has been widely used. Although Barton’s failure criterion predicts peak shear strength of rock fractures with acceptable precision, it has some limitations in estimating the peak shear displacement, post-peak shear strength, dilation, and surface degradation. The first author modified Barton’s original model in order to address these limitations. In this study, the modified Barton’s model (the peak shear displacement, the shear stress-displacement curve, and the dilation displacement) is validated by conducting a series of direct shear tests.

Asadollahi, Pooyan; Invernizzi, Marco C. A.; Addotto, Simone; Tonon, Fulvio

2010-09-01
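The abstract does not reproduce the modified model's equations; for orientation, Barton's original peak shear strength criterion, which the modification extends, can be sketched in a few lines (all parameter values below are illustrative only):

    import math

    def barton_peak_shear(sigma_n, jrc, jcs, phi_r_deg):
        """Barton's empirical peak shear strength of a rock fracture (MPa).

        sigma_n   -- effective normal stress (MPa)
        jrc       -- joint roughness coefficient
        jcs       -- joint wall compressive strength (MPa)
        phi_r_deg -- residual friction angle (degrees)
        """
        angle = phi_r_deg + jrc * math.log10(jcs / sigma_n)
        return sigma_n * math.tan(math.radians(angle))

    # Illustrative fracture: 2 MPa normal stress, JRC 10, JCS 100 MPa, phi_r 30 deg.
    print(f"{barton_peak_shear(2.0, 10.0, 100.0, 30.0):.2f} MPa")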

255

Propeller aircraft interior noise model utilization study and validation  

NASA Astrophysics Data System (ADS)

Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

Pope, L. D.

1984-09-01

256

Pharmacological validation of a new animal model of alcoholism  

Microsoft Academic Search

Summary. A new animal model of alcoholism has been developed. Rats derived from this model show certain characteristics: (i) they have an incentive demand to consume alcohol, (ii) they exhibit relapse-like drinking even after a very long time of abstinence, (iii) they show tolerance to alcohol and have mild signs of physical withdrawal during the onset of abstinence, and (iv)

R. Spanagel; S. M. Hölter

2000-01-01

257

Attempted validation of ICRP 30 and ICRP 66 respiratory models.  

PubMed

The validation of human biological models for inhaled radionuclides is nearly impossible. Requirements for validation are: (1) the measurement of the relevant human tissue data and (2) valid exposure measurements over the interval known to apply to tissue uptake. Two lung models, ICRP 30 and ICRP 66, are widely used to estimate lung doses following acute occupational or environmental exposure. Both the ICRP 30 and 66 lung models are structured to estimate acute rather than chronic exposure. Two sets of human tissue measurements are available: ²¹⁰Po accumulated in tissue from inhaled cigarettes and ingested in diet, and airborne global fallout ²³⁹,²⁴⁰Pu accumulated in the lungs from inhalation. The human tissue measurements include pulmonary and bronchial tissue in smokers, ex-smokers and non-smokers analysed radiochemically for ²¹⁰Po, and pulmonary, bronchial and lymph node tissue analysed for ²³⁹,²⁴⁰Pu in lung tissue collected by the New York City Medical Examiner from 1972 to 1974. Both the ICRP 30 and 66 models were included in a programme to accommodate chronic uptake. Neither lung model accurately described the estimated tissue concentrations, but both were within a factor of 2 of the measurements. ICRP 66 was the exception and consistently overestimated the bronchial concentrations, probably because of its assumption of an overly long 23-d clearance half-time in the bronchi and bronchioles. PMID:22923255

Harley, N H; Fisenne, I M; Robbins, E S

2012-08-23

258

The TIGGE Model Validation Portal: An Improvement In Data Interoperability  

NASA Astrophysics Data System (ADS)

The THORPEX Interactive Grand Global Ensemble (TIGGE), a major component of the World Weather Research Programme, was created to help foster and accelerate the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. A key element of this effort is the ability of weather researchers to perform model forecast validation, a statistical procedure by which observational data is used to evaluate how well a numerical model forecast performs as a function of forecast time and model fields. The current methods available for obtaining model forecast verification data can be time-consuming. For example, a user may need to obtain observational, in-situ, and model forecast data from multiple providers and sources in order to carry out the verification process. In most cases, the user is required to download a set of data covering a larger domain and over a longer period of time than is necessary for the user's research. The data preparation challenge is exacerbated if the requested data sets are provided in inconsistent formats, requiring the user to convert the multiple datasets into a preferred common data format. The TIGGE model validation portal, a new product developed for the NCAR Research Data Archive (RDA), strives to solve this data interoperability problem by bringing together and providing observational, model forecast, and in-situ data into a single data package, and in a common data format. Developed to help augment TIGGE research and facilitate researchers' ability to validate TIGGE model forecasts, the portal allows users to submit a delayed-mode data request for the observational and model parameters of their choosing. Additionally, users have the option of requesting a temporal and spatial subset from the global dataset to fit their research needs. This convenience saves both time and storage resources, and allows users to focus their efforts on model verification and research.

Cram, T.; Schuster, D. C.; Wilcox, H.; Worley, S. J.

2011-12-01

259

Design of Training Systems, Phase II Report. Volume II. Detailed Model Descriptions.  

National Technical Information Service (NTIS)

The report consists of three volumes, Volume II presents a detailed description of the System Capabilities/Requirements and Resources model, the Educational Technology Evaluation model, and the Training Process Flow model. Model logic design, input/output...

H. J. Bellamy; K. V. Branch; L. R. Duffy; C. G. Edison; R. E. Hallman

1974-01-01

260

Model validation and selection based on inverse fuzzy arithmetic  

NASA Astrophysics Data System (ADS)

In this work, a method for the validation of models in general, and the selection of the most appropriate model in particular, is presented. As an industrially relevant example, a Finite Element (FE) model of a brake pad is investigated and identified with particular respect to uncertainties. The identification is based on inverse fuzzy arithmetic and consists of two stages. In the first stage, the eigenfrequencies of the brake pad are considered, and for three different material models, a set of fuzzy-valued parameters is identified on the basis of measurement values. Based on these identified parameters and a resimulation of the system with these parameters, a model validation is performed which takes into account both the model uncertainties and the output uncertainties. In the second stage, the most appropriate material model is used in the FE model for the computation of frequency response functions between excitation point and three measurement points. Again, the parameters of the model are identified on the basis of three corresponding measurement signals and a resimulation is conducted.

Haag, Thomas; Carvajal González, Sergio; Hanss, Michael

2012-10-01

261

Biosorption optimization of lead(II), cadmium(II) and copper(II) using response surface methodology and applicability in isotherms and thermodynamics modeling.  

PubMed

The present study was carried out to optimize the various environmental conditions for biosorption of Pb(II), Cd(II) and Cu(II) by investigating, as a function of the initial metal ion concentration, temperature, biosorbent loading and pH, using Trichoderma viride as the adsorbent. Biosorption of ions from aqueous solution was optimized in a batch system using response surface methodology. The R² values of 0.9716, 0.9699 and 0.9982 for Pb(II), Cd(II) and Cu(II) ions, respectively, indicated the validity of the model. The thermodynamic properties ΔG°, ΔH°, ΔE° and ΔS° for biosorption of the metal ions were analyzed using the equilibrium constant values obtained from experimental data at different temperatures. The results showed that biosorption of Pb(II) ions by the T. viride adsorbent is more endothermic and spontaneous. The study attempted to offer a better understanding of representative biosorption isotherms and thermodynamics, with special focus on the binding mechanism for biosorption using FTIR spectroscopy. PMID:19836883

Singh, Rajesh; Chadetrik, Rout; Kumar, Rajender; Bishnoi, Kiran; Bhatia, Divya; Kumar, Anil; Bishnoi, Narsi R; Singh, Namita

2009-09-23
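The abstract judges model validity through R² values; as a sketch of how such a fit statistic arises, the following fits a Langmuir isotherm to equilibrium data (data points and starting guesses are hypothetical, not from the study):

    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(ce, qmax, b):
        # Langmuir isotherm: uptake q (mg/g) vs. equilibrium concentration ce (mg/L).
        return qmax * b * ce / (1.0 + b * ce)

    ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0])  # hypothetical equilibrium conc.
    q = np.array([8.1, 13.9, 22.6, 28.4, 32.2])    # hypothetical metal uptake

    (qmax, b), _ = curve_fit(langmuir, ce, q, p0=(35.0, 0.05))
    q_pred = langmuir(ce, qmax, b)
    r2 = 1.0 - np.sum((q - q_pred) ** 2) / np.sum((q - q.mean()) ** 2)
    print(f"qmax = {qmax:.1f} mg/g, b = {b:.3f} L/mg, R2 = {r2:.4f}")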

262

Modeling and experimental validation of buckling dielectric elastomer actuators  

NASA Astrophysics Data System (ADS)

Buckling dielectric elastomer actuators are a special type of electromechanical transducers that exploit electro-elastic instability phenomena to generate large out-of-plane axial-symmetric deformations of circular membranes made of non-conductive rubbery material. In this paper a simplified explicit analytical model and a general monolithic finite element model are described for the coupled electromechanical analysis and simulation of buckling dielectric elastomer membranes which undergo large electrically induced displacements. Experimental data are also reported which validate the developed models.

Vertechy, Rocco; Frisoli, Antonio; Bergamasco, Massimo; Carpi, Federico; Frediani, Gabriele; De Rossi, Danilo

2012-09-01

263

Validation of the SUNY Satellite Model in a Meteosat Environment  

SciTech Connect

The paper presents a validation of the SUNY satellite-to-irradiance model against four ground-truth stations from the Indian solar radiation network located in and around the province of Rajasthan, India. The SUNY model had initially been developed and tested to process US weather satellite data from the GOES series and has been used as part of the production of the US National Solar Resource Data Base (NSRDB). Here the model is applied to process data from the European weather satellites Meteosat 5 and 7.

Perez, R.; Schlemmer, J.; Renne, D.; Cowlin, S.; George, R.; Bandyopadhyay, B.

2009-01-01

264

Validation results of wind diesel simulation model TKKMOD  

NASA Astrophysics Data System (ADS)

The document summarizes the results of the TKKMOD validation procedure. TKKMOD is a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout. The model has been included in the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project Engineering Design Tools for Wind-Diesel Systems (JOUR-0078). The simulation model is utilized for calculation of the long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, energy losses in the system components, diesel fuel consumption, and the number of diesel engine starts. The work has been funded through the Finnish Advanced Energy System R&D Programme (NEMO). The validation has been performed using data from EFI (Norwegian Electric Power Institute), since data from the Finnish reference system is not yet available. The EFI system has a slightly different configuration with similar overall operating principles and approximately the same battery capacity. The validation data set, 394 hours of measured data, is from the first prototype wind-diesel system on the island of Frøya off the Norwegian coast.

Manninen, L. M.

265

Line emission from H II blister models  

NASA Astrophysics Data System (ADS)

Numerical techniques to calculate the thermal and geometric properties of line emission from H II 'blister' regions are presented. It is assumed that the density distributions of the H II regions are a function of two dimensions, with rotational symmetry specifying the shape in three dimensions. The thermal and ionization equilibrium equations of the problem are solved by spherical modeling, and a spherical sector approximation is used to simplify the three-dimensional treatment of diffuse ionizing radiation. The global properties of H II 'blister' regions near the edges of a molecular cloud are simulated by means of the geometry/density distribution, and the results are compared with observational data. It is shown that there is a monotonic increase of peak surface brightness from the i = 0 deg (pole-on) observational position to the i = 90 deg (edge-on) position. The enhancement of the line peak intensity from the edge-on to the pole-on positions is found to depend on the density, stratification, ionization, and electron temperature weighting. It is found that as i increases, the position of peak line brightness of the lower excitation species is displaced to the high-density side of the high excitation species.

Rubin, R. H.

1984-12-01

266

USER'S MANUAL FOR THE PLUME VISIBILITY MODEL (PLUVUE II)  

EPA Science Inventory

This publication contains information about the computer programs for the Plume Visibility Model PLUVUE II. A technical overview of PLUVUE II and the results of model evaluation studies are presented. The source code of PLUVUE II, as well as two sets of input and output data, is ...

267

Microelectronics package design using experimentally-validated modeling and simulation.  

SciTech Connect

Packaging high power radio frequency integrated circuits (RFICs) in low temperature cofired ceramic (LTCC) presents many challenges. Within the constraints of LTCC fabrication, the design must provide the usual electrical isolation and interconnections required to package the IC, with additional consideration given to RF isolation and thermal management. While iterative design and prototyping is an option for developing RFIC packaging, it would be expensive and most likely unsuccessful due to the complexity of the problem. To facilitate and optimize package design, thermal and mechanical simulations were used to understand and control the critical parameters in LTCC package design. The models were validated through comparisons to experimental results. This paper summarizes an experimentally-validated modeling approach to RFIC package design, and presents some results and key findings.

Johnson, Jay Dean; Young, Nathan Paul; Ewsuk, Kevin Gregory

2010-11-01

268

The validation of ecosystem models of turbid estuaries  

NASA Astrophysics Data System (ADS)

The ecosystem model of the Bristol Channel and Severn Estuary (GEMBASE) was fitted to 3 years of survey data, and has subsequently been validated against a further 5 years of monitoring data. A control chart technique clearly demonstrates that the model is, on the whole, an adequate representation of the estuarine carbon cycle, although the precision of model estimates decreases with increasing trophic level. An ecosystem model of the Ems Estuary has been adapted to simulate the Severn Estuary, and the impact of introducing a notional tidal power scheme assessed. The results were compared to those obtained using GEMBASE in the Severn. The broad predictions from both models are in agreement, although some detail is at variance, which implies that the fundamental ecological assumptions of the models are compatible.

Radford, P. J.; Ruardij, P.

1987-11-01
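The control chart technique mentioned in the record above amounts to tracking model-minus-observation residuals against limits fixed during calibration; a minimal sketch with invented residuals:

    import numpy as np

    # Hypothetical residuals (observed minus modeled carbon flux): the first 5
    # from the calibration years, the rest from the later validation period.
    residuals = np.array([0.3, -0.1, 0.4, -0.2, 0.1, 0.2, 0.8, -0.3, 1.1, 0.0])

    center = residuals[:5].mean()
    sigma = residuals[:5].std(ddof=1)
    ucl, lcl = center + 2 * sigma, center - 2 * sigma  # 2-sigma control limits

    for t, r in enumerate(residuals[5:], start=6):
        status = "ok" if lcl <= r <= ucl else "OUT OF CONTROL"
        print(f"period {t}: residual {r:+.2f} ({status})")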

269

Seine estuary modelling and AirSWOT measurements validation  

NASA Astrophysics Data System (ADS)

In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future Surface Water and Ocean Topography (SWOT) satellite mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using a Seine estuary hydrodynamic model. In this context, field measurements will be collected by different teams such as GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater) ... . These datasets will be used first to validate AirSWOT measurements locally, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows ...) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by modelling and then combining two different regions: first the Seine River and its estuarine area, and secondly the English Channel. These two simulations are currently being improved by testing different roughness coefficients and adding tributary inflows. Groundwater contributions will also be introduced (TUGOm development is in progress). The model outputs will be validated using GPMR tide gauge data and measurements from the Topex/Poseidon and Jason-1/-2 altimeters for the year 2007.

Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

2013-04-01

270

Overview of OpenModel-based Validation with Partial Information  

Microsoft Academic Search

Multi-stakeholder distributed systems (MSDS), such as the Internet email and instant messaging systems, and e-business Web service networks, raise new challenges for users, developers, and systems analysts. Traditional requirements engineering, validation, and debugging approaches cannot handle two primary problems of MSDS: the lack of consistent high-level requirements and the ignorance problem caused by lack of communication among stakeholders. OpenModel

Robert J. Hall; Andrea Zisman

2003-01-01

271

Model-based validation of safety-critical embedded systems  

Microsoft Academic Search

Safety-critical systems have become increasingly software reliant and the current development process of "build, then integrate" has become unaffordable. This paper examines two major contributors to today's exponential growth in cost: system-level faults that are not discovered until late in the development process; and multiple truths of analysis results when predicting system properties through model-based analysis and validating them against

Peter H. Feiler

2010-01-01

272

ENERGETIC MATERIAL RESPONSE IN A COOKOFF MODEL VALIDATION EXPERIMENT  

Microsoft Academic Search

The cookoff experiments described in this paper belong to the small-scale experimental portion of a three-year phased study of the slow cookoff problem. This paper presents the response of three energetic materials in a small-scale cookoff experiment. The experimental effort is being used to validate the cookoff models currently under development by the Department of Energy (DOE) [1-2]. In this phase

A. I. Atwood; P. O. Curran; D. T. Bui; T. L. Boggs; K. B. Lee

273

A community diagnostic tool for Chemistry Climate Model Validation  

NASA Astrophysics Data System (ADS)

This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The tool can also compute quantitative performance metrics. The initial construction and application is to coupled Chemistry-Climate Models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool is supporting model development as well as quantifying model improvements, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth System. User modifications are encouraged and easy to perform with a minimum of coding.

Gettelman, A.; Eyring, V.; Fischer, C.; Shiona, H.; Cionni, I.; Neish, M.; Morgenstern, O.; Wood, S. W.; Li, Z.

2012-05-01

274

A community diagnostic tool for chemistry climate model validation  

NASA Astrophysics Data System (ADS)

This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

Gettelman, A.; Eyring, V.; Fischer, C.; Shiona, H.; Cionni, I.; Neish, M.; Morgenstern, O.; Wood, S. W.; Li, Z.

2012-09-01

275

Validation of detailed thermal hydraulic models used for LMR safety and for improvement of technical specifications  

SciTech Connect

Detailed steady-state and transient coolant temperatures and flow rates from an operating reactor have been used to validate the multiple-pin model in the SASSYS-1 liquid metal reactor systems analysis code. This multiple-pin capability can be used for explicit calculations of axial and lateral temperature distributions within individual subassemblies. Thermocouples at a number of axial locations and in a number of different coolant sub-channels in the XX09 instrumented subassembly in the EBR-II reactor provided temperature data from the Shutdown Heat Removal Test (SHRT) series. Flow meter data for XX09 and for the overall system are also available from these tests. Results of consistent SASSYS-1 multiple-pin analyses for both the SHRT-45 loss-of-flow-without-scram test and the SHRT-17 protected loss-of-flow test agree well with the experimental data, providing validation of the SASSYS-1 code over a wide range of conditions.

Dunn, F.E.

1995-12-31

276

Application of reliability models to studies of biomarker validation.  

PubMed Central

We present a model of biomarker validation developed in our laboratory, the results of the validation study, and the impact of the estimation of the variance components on the design of future molecular epidemiologic studies. Four different biomarkers of exposure are illustrated: DNA-protein cross-link (DNA-PC), DNA-amino acid cross-link (DNA-AA), metallothionein gene expression (MT), and autoantibodies to oxidized DNA bases (DNAox). The general scheme for the validation experiments involves n subjects measured on k occasions, with j replicate samples analyzed on each occasion. Multiple subjects, occasions, and replicates provide information on intersubject, intrasubject, and analytical measurement variability, respectively. The analysis of variance showed a significant effect of batch variability for DNA-PC and MT gene expression, whereas DNAox showed significant between-subject variability. Among the amino acids tested, cysteine and methionine showed a significant contribution of both batch and between-subject variability, threonine showed between-subject variability only, and tyrosine showed between-batch and between-subject variability. The total variance estimated through the experiment was used to calculate the minimum sample size required for a future epidemiologic study including the same biomarkers used for the reliability study. Such validation studies can detect the various components of variability of a biomarker and indicate needed improvements of the assay, along with possible use in field studies.

Taioli, E; Kinney, P; Zhitkovich, A; Fulton, H; Voitkun, V; Cosma, G; Frenkel, K; Toniolo, P; Garte, S; Costa, M

1994-01-01
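For a balanced design of n subjects, k occasions and j replicates like the one described above, the three variance components fall out of nested-ANOVA mean squares; a self-contained sketch on simulated data (all sizes and magnitudes hypothetical):

    import numpy as np

    rng = np.random.default_rng(0)
    n, k, j = 8, 3, 2  # subjects, occasions, replicates (hypothetical)

    # Simulate a biomarker with known inter-subject, intra-subject (occasion)
    # and analytical (replicate) standard deviations of 1.0, 0.5 and 0.3.
    y = (10.0
         + rng.normal(0, 1.0, (n, 1, 1))
         + rng.normal(0, 0.5, (n, k, 1))
         + rng.normal(0, 0.3, (n, k, j)))

    grand = y.mean()
    subj_means = y.mean(axis=(1, 2))  # per-subject means
    occ_means = y.mean(axis=2)        # per-occasion means

    ms_subj = j * k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_occ = j * np.sum((occ_means - subj_means[:, None]) ** 2) / (n * (k - 1))
    ms_err = np.sum((y - occ_means[:, :, None]) ** 2) / (n * k * (j - 1))

    var_analytical = ms_err                            # replicate-to-replicate
    var_intra = max((ms_occ - ms_err) / j, 0.0)        # occasion-to-occasion
    var_inter = max((ms_subj - ms_occ) / (j * k), 0.0) # subject-to-subject
    # These estimates drive the sample-size calculation for a follow-up study.
    print(var_inter, var_intra, var_analytical)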

277

Shoulder model validation and joint contact forces during wheelchair activities  

PubMed Central

Chronic shoulder impingement is a common problem for manual wheelchair users. The loading associated with performing manual wheelchair activities of daily living is substantial and often at a high frequency. Musculoskeletal modeling and optimization techniques can be used to estimate the joint contact forces occurring at the shoulder to assess the soft tissue loading during an activity and to possibly identify activities and strategies that place manual wheelchair users at risk for shoulder injuries. The purpose of this study was to validate an upper extremity musculoskeletal model and apply the model to wheelchair activities for analysis of the estimated joint contact forces. Upper extremity kinematics and handrim wheelchair kinetics were measured over three conditions: level propulsion, ramp propulsion, and a weight relief lift. The experimental data were used as input to a subject-specific musculoskeletal model utilizing optimization to predict joint contact forces of the shoulder during all conditions. The model was validated using a mean absolute error calculation. Model results confirmed that ramp propulsion and weight relief lifts place the shoulder under significantly higher joint contact loading than level propulsion. In addition, they exhibit large superior contact forces that could contribute to impingement. This study highlights the potential impingement risk associated with both the ramp and weight relief lift activities. Level propulsion was shown to have a low relative risk of causing injury, but with consideration of the frequency with which propulsion is performed, this observation is not conclusive.

Morrow, Melissa M.B.; Kaufman, Kenton R.; An, Kai-Nan

2010-01-01

278

Construct Validity of an Inanimate Training Model for Laparoscopic Appendectomy  

PubMed Central

Background and Objective: The use of training models in laparoscopic surgery allows the surgical team to practice procedures in a safe environment. The aim of this study was to determine the capability of an inanimate laparoscopic appendectomy model to discriminate between different levels of surgical experience (construct validity). Methods: The performance of 3 groups with different levels of expertise in laparoscopic surgery—experts (Group A), intermediates (Group B), and novices (Group C)—was evaluated. The groups were instructed in the task to be performed on the model using a video tutorial. Procedures were recorded in a digital format for later analysis using the Global Operative Assessment of Laparoscopic Skills (GOALS) score; procedure time was registered. The data were analyzed using the analysis of variance test. Results: Twelve subjects were evaluated, 4 in each group, using the GOALS score and the time required to finish the task. Higher scores were observed in the expert group, followed by the intermediate and novice groups, with a statistically significant difference. Regarding procedure time, a significant difference was also found between the groups, with the experts having the shortest time. The proposed model is able to discriminate among individuals with different levels of expertise, indicating that the abilities the model evaluates are relevant to the surgeon's performance. Conclusions: Construct validity of the inanimate full-task laparoscopic appendectomy training model was demonstrated. Therefore, it is a useful tool in the development and evaluation of the resident in training.

Sanchez-Ismayel, Alexis; Sanchez, Renata; Pena, Romina; Salamo, Oriana

2013-01-01
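The analysis-of-variance step reported above reduces to a one-way ANOVA across the three expertise groups; a toy sketch with invented GOALS scores (four subjects per group, as in the study design):

    from scipy.stats import f_oneway

    # Hypothetical GOALS scores per group; real scores would come from raters.
    experts = [22, 23, 21, 24]
    intermediates = [17, 18, 16, 19]
    novices = [11, 12, 10, 13]

    f_stat, p_value = f_oneway(experts, intermediates, novices)
    # A small p-value supports construct validity: the model separates skill levels.
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")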

279

Validation of a transparent decision model to rate drug interactions  

PubMed Central

Background: Multiple databases provide ratings of drug-drug interactions. The ratings are often based on different criteria and lack background information on the decision-making process. User acceptance of rating systems could be improved by providing a transparent decision path for each category. Methods: We rated 200 randomly selected potential drug-drug interactions using a transparent decision model developed by our team. The cases were generated from ward-round observations and physicians' queries from an outpatient setting. We compared our ratings to those assigned by a senior clinical pharmacologist and by a standard interaction database, and thus validated the model. Results: The decision model rated consistently with the standard database and the pharmacologist in 94 and 156 cases, respectively. In two cases the model decision required correction. Following removal of systematic model construction differences, the decision model was fully consistent with the other rating systems. Conclusion: The decision model reproducibly rates interactions and elucidates systematic differences. We propose to supply validated decision paths alongside the interaction rating to improve comprehensibility and to enable physicians to interpret the ratings in a clinical context.

2012-01-01
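The consistency counts above (94 and 156 of 200 cases) are raw agreement rates; a chance-corrected complement such as Cohen's kappa is easy to add. A sketch with hypothetical severity ratings, not the study's data:

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical ratings for eight drug pairs: 0 = minor, 1 = moderate, 2 = severe.
    model = [2, 1, 0, 2, 1, 1, 0, 2]
    pharmacologist = [2, 1, 0, 1, 1, 2, 0, 2]

    agreement = sum(m == p for m, p in zip(model, pharmacologist)) / len(model)
    kappa = cohen_kappa_score(model, pharmacologist)
    print(f"raw agreement = {agreement:.0%}, Cohen's kappa = {kappa:.2f}")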

280

Short-Term Mortality Prediction for Acute Lung Injury Patients: External Validation of the ARDSNet Prediction Model  

PubMed Central

Objective: An independent cohort of acute lung injury (ALI) patients was used to evaluate the external validity of a simple prediction model for short-term mortality previously developed using data from ARDS Network (ARDSNet) trials. Design, Setting, and Patients: Data for external validation were obtained from a prospective cohort study of ALI patients from 13 ICUs at four teaching hospitals in Baltimore, Maryland. Measurements and Main Results: Of the 508 non-trauma ALI patients eligible for this analysis, 234 (46%) died in-hospital. Discrimination of the ARDSNet prediction model for in-hospital mortality, evaluated by the area under the receiver operating characteristic curve (AUC), was 0.67 for our external validation dataset versus 0.70 and 0.68 using APACHE II and the ARDSNet validation dataset, respectively. In evaluating calibration of the model, predicted versus observed in-hospital mortality for the external validation dataset was similar for both low-risk (ARDSNet model score = 0) and high-risk (score = 3 or 4+) patient strata. However, for intermediate-risk (score = 1 or 2) patients, observed in-hospital mortality was substantially higher than predicted mortality (25.3% vs. 16.5% and 40.6% vs. 31.0% for score = 1 and 2, respectively). Sensitivity analyses limiting our external validation dataset to only those patients meeting the ARDSNet trial eligibility criteria, and to those who received mechanical ventilation in compliance with the ARDSNet ventilation protocol, did not substantially change the model's discrimination or improve its calibration. Conclusions: Evaluation of the ARDSNet prediction model using an external ALI cohort demonstrated similar discrimination of the model as was observed with the ARDSNet validation dataset. However, there were substantial differences in observed versus predicted mortality among intermediate-risk ALI patients. The ARDSNet model provided reasonable, but imprecise, estimates of predicted mortality when applied to our external validation cohort of ALI patients.

Damluji, Abdulla; Colantuoni, Elizabeth; Mendez-Tellez, Pedro A.; Sevransky, Jonathan E.; Fan, Eddy; Shanholtz, Carl; Wojnar, Margaret; Pronovost, Peter J.; Needham, Dale M.

2011-01-01
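Both properties evaluated above, discrimination (AUC) and calibration by risk stratum, can be computed in a few lines; the scores and outcomes below are invented for illustration:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    score = np.array([0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 4, 4])  # hypothetical model scores
    died = np.array([0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1])   # hypothetical outcomes

    print(f"AUC = {roc_auc_score(died, score):.2f}")  # discrimination

    for s in np.unique(score):                        # calibration by stratum
        print(f"score {s}: observed mortality = {died[score == s].mean():.0%}")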

281

Validation of the WATEQ4 geochemical model for uranium  

SciTech Connect

As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

1983-09-01

282

Experimental Validation and Applications of a Fluid Infiltration Model  

PubMed Central

Horizontal infiltration experiments were performed to validate a plug flow model that minimizes the number of parameters that must be measured. Water and silicone oil at three different viscosities were infiltrated into glass beads, desert alluvium, and silica powder. Experiments were also performed with negative inlet heads on air-dried silica powder, and with water and oil infiltrating into initially water-moist silica powder. Comparisons between the data and the model were favorable in most cases, with predictions usually within 40% of the measured data. The model is extended to a line source and a small areal source at the ground surface to analytically predict the shape of two-dimensional wetting fronts. Furthermore, a plug flow model for constant flux infiltration agrees well with field data and suggests that the proposed model for a constant-head boundary condition can be effectively used to predict wetting front movement at heterogeneous field sites if averaged parameter values are used.

Kao, Cindy S.; Hunt, James R.

2010-01-01
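The paper's exact parameterization is not given in the abstract; the generic sharp-front (plug flow) result for horizontal infiltration, in which the wetting front advances as the square root of time, can be sketched as follows (all parameter values hypothetical):

    import numpy as np

    def wetting_front(t, K, h0, hf, d_theta):
        """Sharp-front (plug flow) horizontal infiltration, Green-Ampt style.

        t       -- time (s)
        K       -- hydraulic conductivity of the wetted zone (m/s)
        h0      -- inlet head (m); negative for the tension-inlet experiments
        hf      -- suction head at the wetting front (m, positive)
        d_theta -- change in volumetric water content across the front
        """
        return np.sqrt(2.0 * K * (h0 + hf) * t / d_theta)

    t = np.linspace(0.0, 3600.0, 5)  # one hour of infiltration
    print(wetting_front(t, K=1e-6, h0=0.05, hf=0.2, d_theta=0.35))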

283

Command validation of secondary EM (electro-magnetic) pump system in the EBR-II (Experimental Breeder Reactor-II)  

Microsoft Academic Search

The objective of command validation is to determine the relevance and accuracy of the command strategy (or control signals) generated by the control system, and to validate the resulting output of the actuator system. Actuators include pumps, valves, control rod drive mechanisms, heaters, sprays, and other direct operations in the process. Overall command validation requires the verification of control signal

R. L. Bywater; R. C. Berkan; B. R. Upadhyaya; R. A. Kisner

1990-01-01

284

Low-order dynamic modeling of the Experimental Breeder Reactor II  

SciTech Connect

This report describes the development of a low-order, linear model of the Experimental Breeder Reactor II (EBR-II), including the primary system, intermediate heat exchanger, and steam generator subsystems. The linear model is developed to represent full-power steady state dynamics for low-level perturbations. Transient simulations are performed using model building and simulation capabilities of the computer software Matrix{sub x}. The inherently safe characteristics of the EBR-II are verified through the simulation studies. The results presented in this report also indicate an agreement between the linear model and the actual dynamics of the plant for several transients. Such models play a major role in the learning and in the improvement of nuclear reactor dynamics for control and signal validation studies. This research and development is sponsored by the Advanced Controls Program in the Instrumentation and Controls Division of the Oak Ridge National Laboratory. 17 refs., 67 figs., 15 tabs.

Berkan, R.C. (Tennessee Univ., Knoxville, TN (USA). Dept. of Nuclear Engineering); Upadhyaya, B.R.; Kisner, R.A. (Oak Ridge National Lab., TN (USA))

1990-07-01

285

Validation Analysis of the Shoal Groundwater Flow and Transport Model  

SciTech Connect

Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven-step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of groundwater flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of groundwater withdrawal activities in the area. The conceptual and numerical models were developed based upon regional hydrogeologic investigations conducted in the 1960s, site characterization investigations (including ten wells and various geophysical and geologic studies) at Shoal itself prior to and immediately after the test, and two site characterization campaigns in the 1990s for environmental restoration purposes (including eight wells and a year-long tracer test). The new wells are denoted MV-1, MV-2, and MV-3, and are located to the north-northeast of the nuclear test. The groundwater model was generally lacking data in the north-northeastern area; only HC-1 and the abandoned PM-2 wells existed in this area. The wells provide data on fracture orientation and frequency, water levels, hydraulic conductivity, and water chemistry for comparison with the groundwater model. A total of 12 real-number validation targets were available for the validation analysis, including five values of hydraulic head, three hydraulic conductivity measurements, three hydraulic gradient values, and one angle value for the lateral gradient in radians. In addition, the fracture dip and orientation data provide comparisons to the distributions used in the model, and radiochemistry is available for comparison to model output. Goodness-of-fit analysis indicates that some of the model realizations correspond well with the newly acquired conductivity, head, and gradient data, while others do not. Other tests indicated that additional model realizations may be needed to test whether the model input distributions need refinement to improve model performance. This approach (generating additional realizations) was not followed because it was realized that there was a temporal component to the data disconnect: the new head measurements are on the high side of the model distributions, but the heads at the original calibration locations themselves have also increased over time. This indicates that the steady-state assumption of the groundwater model is in error. To test the robustness of the model d

A. Hassan; J. Chapman

2008-11-01
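The goodness-of-fit screening of realizations against the 12 validation targets can be pictured as a standardized-misfit calculation; a purely schematic sketch (all numbers invented, not the Shoal data):

    import numpy as np

    rng = np.random.default_rng(2)
    targets = rng.normal(0.0, 1.0, 12)              # 12 field validation targets
    target_sd = np.full(12, 1.0)                    # measurement uncertainty
    realizations = rng.normal(0.0, 1.2, (100, 12))  # model realization outputs

    # Root-mean-square standardized error per realization.
    z = (realizations - targets) / target_sd
    gof = np.sqrt(np.mean(z ** 2, axis=1))

    # Realizations with small standardized misfit "correspond well" with the data.
    print(f"{np.sum(gof < 1.5)} of {len(gof)} realizations pass the screen")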

286

Concepts and Validation of the ESA MASTER Model  

NASA Astrophysics Data System (ADS)

MASTER-2005 is the new orbital debris reference model of the European Space Agency. It was developed by a team led by the Institute of Aerospace Systems. The model is based on the simulation of events and processes that lead to the generation of orbital debris. The majority of the debris generation mechanisms implemented in MASTER have been reviewed in the course of the project. The validation for debris objects larger than 1 mm was based on observation data gathered by the TIRA, Goldstone and Haystack radars and the ESA Space Debris Telescope (ESA-SDT). The PROOF-2005 validation tool has been used to simulate detections of orbital debris based on the analysis of geometrical and instrument parameters. The simulation results gathered using the observation scenario were compared with the actual observations. In this paper the results of this population generation mechanism will be presented. New ESA-SDT data was used to further refine the simulation of the GEO object population. In MASTER-2001, in addition to the known fragmentations of the Ekran-2 satellite and the Titan Transtage, 11 artificial breakups had been introduced in order to align PROOF simulations with measurement data. Using additional ESA-SDT observation data, the assumptions concerning the number, magnitude, time and position of the artificial breakups were reviewed and corrected. Small particle validation was performed based on returned space hardware impact data: the solar arrays of the Hubble Space Telescope returned by the Space Shuttle on missions STS-61 and

Oswald, M.; Stabroth, S.; Wiedemann, C.; Klinkrad, H.; Vörsmann, P.

287

Model calibration and validation of an impact test simulation  

SciTech Connect

This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate-dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

2001-01-01

288

Image decomposition as a tool for validating stress analysis models  

NASA Astrophysics Data System (ADS)

It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field strain distributions generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.

Patki, A.; Wang, W.; Mottershead, J.; Patterson, E.

2010-06-01
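As a toy illustration of the descriptor idea (the study used Zernike moments and Fourier transforms; this sketch keeps only a low-frequency Fourier block, and all fields are synthetic):

    import numpy as np

    def descriptors(field, n=8):
        # Keep the magnitudes of the lowest-frequency n x n Fourier block,
        # reducing thousands of pixels to a vector of a few dozen descriptors.
        block = np.abs(np.fft.fft2(field)[:n, :n]).ravel()
        return block / np.linalg.norm(block)

    rng = np.random.default_rng(1)
    base = rng.normal(size=(64, 64))
    experiment = base + 0.05 * rng.normal(size=(64, 64))  # "measured" strain map
    model = base + 0.05 * rng.normal(size=(64, 64))       # "simulated" strain map

    d_exp, d_mod = descriptors(experiment), descriptors(model)
    # Compare short descriptor vectors statistically instead of raw pixel fields.
    print("max descriptor discrepancy:", np.max(np.abs(d_exp - d_mod)))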

289

Modeling TCP Throughput: A Simple Model and Its Empirical Validation  

Microsoft Academic Search

In this paper we develop a simple analytic characterization of the steady state throughput, as a function of loss rate and round trip time for a bulk transfer TCP flow, i.e., a flow with an unlimited amount of data to send. Unlike the models in [6, 7, 10], our model captures not only the behavior of TCP's fast retransmit mechanism

Jitendra Padhye; Victor Firoiu; Donald F. Towsley; James F. Kurose

1998-01-01
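The closed-form throughput expression associated with this line of work is often quoted in the following form (loss rate p, round-trip time rtt, retransmission timeout t0, b packets acknowledged per ACK); the sketch states it as commonly reproduced, with illustrative defaults rather than values from the paper:

    import math

    def tcp_throughput(p, rtt, t0=1.0, b=2, mss=1460):
        # Steady-state bulk-transfer TCP throughput (bytes/s) as a function of
        # loss rate p and round-trip time rtt (s), including the timeout term.
        if p <= 0:
            return float("inf")
        fast_retx = rtt * math.sqrt(2.0 * b * p / 3.0)
        timeout = t0 * min(1.0, 3.0 * math.sqrt(3.0 * b * p / 8.0)) * p * (1 + 32 * p ** 2)
        return mss / (fast_retx + timeout)

    for p in (0.001, 0.01, 0.05):
        print(f"p = {p}: {tcp_throughput(p, rtt=0.1) / 1e3:.0f} kB/s")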

290

Validation of coupled atmosphere-fire behavior models  

SciTech Connect

Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States); Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States); Riggan, P.J. [Forest Service, Riverside, CA (United States)

1998-12-31

291

Leading compounds for the validation of animal models of psychopathology.  

PubMed

Modelling of complex psychiatric disorders, e.g., depression and schizophrenia, in animals is a major challenge, since they are characterized by certain disturbances in functions that are absolutely unique to humans. Furthermore, we still have not identified the genetic and neurobiological mechanisms, nor do we know precisely the circuits in the brain that function abnormally in mood and psychotic disorders. Consequently, the pharmacological treatments used are mostly variations on a theme that was started more than 50 years ago. Thus, progress in novel drug development with improved therapeutic efficacy would benefit greatly from improved animal models. Here, we review the available animal models of depression and schizophrenia and focus on the way that they respond to various types of potential candidate molecules, such as novel antidepressant or antipsychotic drugs, as an index of predictive validity. We conclude that the generation of convincing and useful animal models of mental illnesses could be a bridge to success in drug discovery. PMID:23942897

Micale, Vincenzo; Kucerova, Jana; Sulcova, Alexandra

2013-08-15

292

Validation of the Millon Clinical Multiaxial Inventory for Axis II Disorders: Does It Meet the Daubert Standard?  

Microsoft Academic Search

Relevant to forensic practice, the Supreme Court in Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993) established the boundaries for the admissibility of scientific evidence that take into account its trustworthiness as assessed via evidentiary reliability. In conducting forensic evaluations, psychologists and other mental health professionals must be able to offer valid diagnoses, including Axis II disorders. The most widely available

Richard Rogers; Randall T. Salekin; Kenneth W. Sewell

1999-01-01

293

Development and Validation of a 3-Dimensional CFB Furnace Model  

NASA Astrophysics Data System (ADS)

At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered in model validation studies and up-to-date parameter databases are utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and of the formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace, and are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower-furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents a CFB process analysis focused on combustion and NO profiles in pilot- and industrial-scale bituminous coal combustion.

Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

294

Validation of thermal models for a prototypical MEMS thermal actuator.  

SciTech Connect

This report documents technical work performed to complete the ASC Level 2 Milestone 2841: validation of thermal models for a prototypical MEMS thermal actuator. This effort requires completion of the following task: the comparison between calculated and measured temperature profiles of a heated stationary microbeam in air. Such heated microbeams are prototypical structures in virtually all electrically driven microscale thermal actuators. This task is divided into four major subtasks. (1) Perform validation experiments on prototypical heated stationary microbeams in which material properties such as thermal conductivity and electrical resistivity are measured if not known and temperature profiles along the beams are measured as a function of electrical power and gas pressure. (2) Develop a noncontinuum gas-phase heat-transfer model for typical MEMS situations including effects such as temperature discontinuities at gas-solid interfaces across which heat is flowing, and incorporate this model into the ASC FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (3) Develop a noncontinuum solid-phase heat transfer model for typical MEMS situations including an effective thermal conductivity that depends on device geometry and grain size, and incorporate this model into the FEM heat-conduction code Calore to enable it to simulate these effects with good accuracy. (4) Perform combined gas-solid heat-transfer simulations using Calore with these models for the experimentally investigated devices, and compare simulation and experimental temperature profiles to assess model accuracy. These subtasks have been completed successfully, thereby completing the milestone task. Model and experimental temperature profiles are found to be in reasonable agreement for all cases examined. Modest systematic differences appear to be related to uncertainties in the geometric dimensions of the test structures and in the thermal conductivity of the polycrystalline silicon test structures, as well as uncontrolled nonuniform changes in this quantity over time and during operation.

Gallis, Michail A.; Torczynski, John Robert; Piekos, Edward Stanley; Serrano, Justin Raymond; Gorby, Allen D.; Phinney, Leslie Mary

2008-09-01

295

Open-Source MFIX-DEM Software for Gas-Solids Flows: Part II - Validation Studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas–solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas–solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen

2012-04-01

296

Open-source MFIX-DEM software for gas-solids flows: Part II - Validation studies  

SciTech Connect

With rapid advancements in computer hardware and numerical algorithms, computational fluid dynamics (CFD) has been increasingly employed as a useful tool for investigating the complex hydrodynamics inherent in multiphase flows. An important step during the development of a CFD model and prior to its application is conducting careful and comprehensive verification and validation studies. Accordingly, efforts to verify and validate the open-source MFIX-DEM software, which can be used for simulating the gas-solids flow using an Eulerian reference frame for the continuum fluid and a Lagrangian discrete framework (Discrete Element Method) for the particles, have been made at the National Energy Technology Laboratory (NETL). In part I of this paper, extensive verification studies were presented and in this part, detailed validation studies of MFIX-DEM are presented. A series of test cases covering a range of gas-solids flow applications were conducted. In particular the numerical results for the random packing of a binary particle mixture, the repose angle of a sandpile formed during a side charge process, velocity, granular temperature, and voidage profiles from a bounded granular shear flow, lateral voidage and velocity profiles from a monodisperse bubbling fluidized bed, lateral velocity profiles from a spouted bed, and the dynamics of segregation of a binary mixture in a bubbling bed were compared with available experimental data, and in some instances with empirical correlations. In addition, sensitivity studies were conducted for various parameters to quantify the error in the numerical simulation.

Li, Tingwen [National Energy Technology Laboratory (NETL); Garg, Rahul [National Energy Technology Laboratory (NETL); Galvin, Janine [National Energy Technology Laboratory (NETL); Pannala, Sreekanth [ORNL]

2012-01-01

297

Validation of two air quality models for Indian mining conditions.  

PubMed

All major mining activity, particularly opencast mining, contributes directly or indirectly to the problem of suspended particulate matter (SPM). Therefore, assessment and prediction are required to prevent and minimize the air quality deterioration caused by SPM from various opencast mining operations. Determination of SPM emission rates for these activities and validation of air quality models are the first and foremost concerns. In view of the above, this study was taken up to determine SPM emission rates for various opencast mining activities and to validate two commonly used air quality models for Indian mining conditions. To achieve these objectives, eight coal and three iron ore mining sites were selected to generate site-specific emission data, considering the type of mining, method of working, geographical location, accessibility and, above all, resource availability. The study covers various mining activities and locations, including drilling, overburden loading and unloading, coal/mineral loading and unloading, coal handling or screening plants, exposed overburden dumps, stock yards, workshops, exposed pit surfaces, transport roads and haul roads. Validation was carried out with the Fugitive Dust Model (FDM) and the Point, Area and Line sources model (PAL2) by assigning the measured emission rate for each mining activity, meteorological data and other details of the respective mine as inputs to the models. Both models were run separately on the same set of input data for each mine to obtain the predicted SPM concentration at three receptor locations per mine. The receptor locations were selected in such a way that actual field measurements of SPM concentration had been carried out at the same places. Statistical analysis was carried out to assess the performance of the models based on the sets of measured and predicted SPM concentration data. The coefficient of correlation for PAL2 and FDM was calculated to be 0.990-0.994 and 0.966-0.997, respectively, which shows fairly good agreement between measured and predicted values of SPM concentration. The average index of agreement values for PAL2 and FDM were found to be 0.665 and 0.752, respectively, indicating that the predictions of the PAL2 and FDM models are accurate to 66.5 and 75.2%, respectively. These results indicate that the FDM model is better suited to Indian mining conditions. PMID:12602620
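As an illustration only (not taken from the paper), the two agreement statistics quoted above can be computed as follows; the index of agreement is the standard Willmott form, which the abstract does not spell out, and the SPM values are hypothetical.

```python
import numpy as np

def correlation(measured, predicted):
    """Pearson coefficient of correlation between measured and predicted values."""
    return np.corrcoef(measured, predicted)[0, 1]

def index_of_agreement(measured, predicted):
    """Willmott's index of agreement d in [0, 1]; 1 means perfect agreement."""
    mean_obs = measured.mean()
    num = np.sum((predicted - measured) ** 2)
    den = np.sum((np.abs(predicted - mean_obs) + np.abs(measured - mean_obs)) ** 2)
    return 1.0 - num / den

# Hypothetical SPM concentrations (ug/m3) at receptor locations
measured = np.array([310.0, 450.0, 275.0, 520.0, 390.0])
predicted = np.array([295.0, 470.0, 260.0, 505.0, 410.0])

print(f"r = {correlation(measured, predicted):.3f}")
print(f"d = {index_of_agreement(measured, predicted):.3f}")
```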

Chaulya, S K; Ahmad, M; Singh, R S; Bandopadhyay, L K; Bondyopadhay, C; Mondal, G C

2003-02-01

298

Comparison with CLPX II airborne data using DMRT model  

USGS Publications Warehouse

In this paper, we consider a physics-based model which uses numerical solutions of Maxwell's equations in three-dimensional simulations, applied within dense media radiative transfer (DMRT) theory. The model is validated against two specific datasets from the second Cold Land Processes Experiment (CLPX II) in Alaska and Colorado. The data were all obtained from Ku-band (13.95 GHz) observations using the airborne imaging polarimetric scatterometer (POLSCAT). Snow is a densely packed medium. To take into account collective scattering and incoherent scattering, the analytical Quasi-Crystalline Approximation (QCA) and the Numerical Maxwell Equation Method of 3-D simulation (NMM3D) are used to calculate the extinction coefficient and phase matrix. The DMRT equations were solved by an iterative solution up to second order for the case of small optical thickness, and a full multiple-scattering solution, obtained by decomposing the diffuse intensities into Fourier series, was used when the optical thickness exceeded unity. It is shown that the model predictions agree with the field experiment in both co-polarization and cross-polarization. For the Alaska region, the input snow structure data were obtained from in situ ground observations, while for the Colorado region we combined the VIC model to obtain the snow profile. ©2009 IEEE.

Xu, X.; Liang, D.; Andreadis, K. M.; Tsang, L.; Josberger, E. G.

2009-01-01

299

A plan for ILAS-II correlative measurements with emphasis on a validation balloon campaign at Kiruna-Esrange  

NASA Astrophysics Data System (ADS)

The present paper describes a plan for correlative measurements dedicated to the satellite sensor ILAS-II (Improved Limb Atmospheric Spectrometer II). ILAS-II, a solar occultation sensor with three infrared spectrometers and a visible spectrometer, is a successor to ILAS, and is scheduled to be launched in February 2002. ILAS-II will measure vertical profiles of ozone (O3), nitric acid (HNO3), nitrogen dioxide (NO2), nitrous oxide (N2O), methane (CH4), water vapor (H2O), and chlorine nitrate (ClONO2) mixing ratios, aerosol extinction coefficients, and atmospheric temperature and pressure in the altitude range of ~10-60 km (depending on species), with an altitude resolution of 1 km at high latitudes. As was done for ILAS, correlative measurements for ILAS-II are planned, and the present plan for ILAS-II is similar to that for ILAS. The balloon campaign at Kiruna-Esrange (68°N, 21°E) for ILAS-II will be considered the core campaign among the various other correlative experiments. The anticipated period of the first balloon campaign for ILAS-II is around August-September 2002, when ILAS-II will be in operation and will measure stratospheric profiles over Kiruna. Another candidate period may be around November-December 2001. Other plans for correlative measurements will be briefly presented. Some of the ILAS-II correlative measurement data might be useful for validation of other satellite sensors, and vice versa.

Kanzawa, Hiroshi; Camy-Peyret, Claude; Nakajima, Hideaki; Sasano, Yasuhiro

2001-08-01

300

Multiscale Modeling of Gastrointestinal Electrophysiology and Experimental Validation  

PubMed Central

Normal gastrointestinal (GI) motility results from the coordinated interplay of multiple cooperating mechanisms, both intrinsic and extrinsic to the GI tract. A fundamental component of this activity is an omnipresent electrical activity termed slow waves, which is generated and propagated by the interstitial cells of Cajal (ICCs). The role of ICC loss and network degradation in GI motility disorders is a significant area of ongoing research. This review examines recent progress in the multiscale modeling framework for effectively integrating a vast range of experimental data in GI electrophysiology, and outlines the prospect of how modeling can provide new insights into GI function in health and disease. The review begins with an overview of the GI tract and its electrophysiology, and then focuses on recent work on modeling GI electrical activity, spanning from cell to body biophysical scales. Mathematical cell models of the ICCs and smooth muscle cell are presented. The continuum framework of monodomain and bidomain models for tissue and organ models are then considered, and the forward techniques used to model the resultant body surface potential and magnetic field are discussed. The review then outlines recent progress in experimental support and validation of modeling, and concludes with a discussion on potential future research directions in this field.
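As a point of reference for the continuum framework mentioned above (textbook notation, not necessarily the review's), the monodomain model couples the cell model's ionic current to tissue-scale diffusion of the transmembrane potential $V_m$:

$$\beta\left(C_m\,\frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m,\mathbf{s})\right) = \nabla\cdot\left(\sigma\,\nabla V_m\right)$$

where $\beta$ is the membrane surface-to-volume ratio, $C_m$ the membrane capacitance, $\sigma$ an effective conductivity tensor, and $I_{\mathrm{ion}}$ the current supplied by an ICC or smooth muscle cell model with internal state $\mathbf{s}$. The bidomain model generalizes this by tracking the intracellular and extracellular potentials separately.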

Du, Peng; O'Grady, Greg; Davidson, John B.; Cheng, Leo K.; Pullan, Andrew J.

2011-01-01

301

Bolted connection modeling and validation through laser-aided testing  

NASA Astrophysics Data System (ADS)

Bolted connections are widely employed in facility structures, such as light masts, transmission poles, and wind turbine towers. The complex connection behavior plays a significant role in the overall dynamic characteristics of a structure. A finite element (FE) modeling study of a bolt-connected square tubular steel beam is presented in this paper. Modal testing was performed in a controlled laboratory condition to validate the FE model, developed for the bolted beam. Two laser Doppler vibrometers were used simultaneously to measure structural vibration. A simplified joint model was proposed to further save computation time for structures with bolted connections. This study is an on-going effort to marshal knowledge associated with detecting damage on facility structures with bolted connections.

Dai, Kaoshan; Gong, Changqing; Smith, Benjamin

2013-04-01

302

Validation of a crop field modeling to simulate agronomic images.  

PubMed

In precision agriculture, crop/weed discrimination is often based on image analysis, but though several algorithms using spatial information have been proposed, none has been tested on relevant databases. A simple model that simulates virtual fields is developed to evaluate these algorithms. Virtual fields are made of crops, arranged according to agricultural practices and represented by simple patterns, and of weeds that are spatially distributed using a statistical approach. This ensures a user-defined Weed Infestation Rate (WIR). Then, experimental devices using cameras are simulated with a pinhole model. The model's ability to characterize spatial reality is demonstrated through different pairs (real, virtual) of pictures. Two spatial descriptors (the nearest-neighbour method and Besag's function) have been set up and tested to validate the spatial realism of the crop field model, comparing a real image to the homologous virtual one. PMID:20588922
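A minimal sketch of the first spatial descriptor named above, a nearest-neighbour statistic in the Clark-Evans form; the exact formulation used by the authors is not given in the abstract, so the normalization here is an assumption, and the weed coordinates are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def clark_evans(points, area):
    """Clark-Evans ratio R: ~1 for a random pattern, <1 clustered, >1 regular."""
    tree = cKDTree(points)
    dist, _ = tree.query(points, k=2)  # k=2: nearest neighbour other than the point itself
    observed = dist[:, 1].mean()
    expected = 0.5 / np.sqrt(len(points) / area)  # expectation under complete spatial randomness
    return observed / expected

rng = np.random.default_rng(3)
weeds = rng.uniform(0.0, 10.0, size=(200, 2))  # hypothetical weed positions in a 10 m x 10 m plot
print(f"R = {clark_evans(weeds, area=100.0):.2f}")  # near 1.0 for a random pattern
```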

Jones, Gawain; Gée, Christelle; Villette, Sylvain; Truchetet, Frédéric

2010-05-10

303

Predictive validity of a multidisciplinary model of reemployment success.  

PubMed

The authors propose a multidisciplinary model of the predictors of reemployment and test its predictive validity for explaining reemployment success. Predictor variables from the fields of economics, sociology, and psychology are incorporated into the model. Reemployment success is conceptualized as a construct consisting of unemployment insurance exhaustion and reemployment speed, and for reemployed persons, job improvement, job-organization fit, and intention to leave the new job. Direct, mediated, and moderated relationships were hypothesized and tested, clarifying the role of the variables in the reemployment process and outcome. The authors' proposal and examination of a multidisciplinary model of reemployment success contributes to a literature that has not tended to adequately cross disciplinary boundaries. PMID:12558217

Wanberg, Connie R; Hough, Leaetta M; Song, Zhaoli

2002-12-01

304

Increasing the validity of experimental models for depression.  

PubMed

Major depressive disorder (MDD) is a central nervous system disorder characterized by the culmination of profound disturbances in mood and affective regulation. Animal models serve as a powerful tool for investigating the neurobiological mechanisms underlying this disorder; however, little standardization exists across the wide range of modeling approaches most often employed. This review will illustrate some of the most challenging obstacles faced by investigators attempting to associate depressive-like behaviors in rodents with symptoms expressed in MDD. Furthermore, a novel series of depressive-like criteria based on correlating behavioral endophenotypes, novel in vivo neurophysiological measurements, and molecular/cellular analyses within multiple brain regions is proposed as a potential solution to overcoming this barrier. Ultimately, linking the neurophysiological and cellular/biochemical actions that contribute to the expression of a defined MDD-like syndrome will dramatically extend the translational value of the most valid animal models of MDD. PMID:22823549

Dzirasa, Kafui; Covington, Herbert E

2012-07-23

305

Validation of a new Mesoscale Model for MARS.  

NASA Astrophysics Data System (ADS)

The study of the planet Mars is important because of its several similarities with the Earth. For the understanding of the dynamical processes which drive the Martian atmosphere, a new Martian Mesoscale Model (MARS-MM5) is presented. The new model is based on the Pennsylvania State University (PSU)/National Center for Atmospheric Research (NCAR) Mesoscale Model Version 5 \citep{duh,gre}. MARS-MM5 has been adapted to Mars using soil characteristics and topography obtained by the Mars Orbiter Laser Altimeter (MOLA). Different cases, depending on data availability and corresponding to the equatorial region of Mars, have been selected for multiple MARS-MM5 simulations. To validate the different developments, the Mars Climate Database (MCD) and TES observations have been employed: MCD version 4.0 has been created on the basis of multi-annual integrations of Mars GCM output, and the Thermal Emission Spectrometer (TES) observations acquired during the Mars Global Surveyor (MGS) mission are used in terms of temperature. The new, and most important, aspect of this work is the direct validation of the newly generated MARS-MM5 in terms of three-dimensional observations. The comparison between MARS-MM5 and GCM horizontal and vertical temperature profiles shows a good agreement; moreover, a good agreement is also found between the TES observations and MARS-MM5.

De Sanctis, K.; Ferretti, R.; Forget, F.; Fiorenza, C.; Visconti, G.

306

Development and validation of a preclinical food effect model.  

PubMed

A preclinical canine model capable of predicting a compound's potential for a human food effect was developed. The beagle dog was chosen as the in vivo model. A validation set of compounds with known propensities for human food effect was studied. Several diets were considered including high-fat dog food and various quantities of the human FDA meal. The effect of pentagastrin pretreatment was also investigated. The high-fat dog food did not predict human food effect and was discontinued from further evaluation. The amount of FDA meal in the dog was important in the overall prediction of the magnitude of human food effect. Fed/fasted Cmax and AUC ratios using a 50-g aliquot of the FDA meal in the dog were in the closest qualitative agreement to human data. Pentagastrin pretreatment did not affect the AUC in the fed state, but increased the fasted AUC for weakly basic compounds. Pentagastrin pretreatment and a 50-g aliquot of the FDA meal in the dog predicted the human food effect for a validation set of compounds. This model, which is intended for compound screening, will be helpful for determining food effect as a liability when compounds progress from discovery to clinical development. PMID:17075867

Lentz, Kimberley A; Quitko, Megan; Morgan, Daniel G; Grace, James E; Gleason, Carol; Marathe, Punit H

2007-02-01

307

Development, Verification, and Validation of Multiphase Models for Polydisperse Flows  

SciTech Connect

This report describes in detail the technical findings of the DOE Award entitled 'Development, Verification, and Validation of Multiphase Models for Polydisperse Flows.' The focus was on high-velocity, gas-solid flows with a range of particle sizes. A complete mathematical model was developed based on first principles and incorporated into MFIX. The solid-phase description took two forms: the Kinetic Theory of Granular Flows (KTGF) and Discrete Quadrature Method of Moments (DQMOM). The gas-solid drag law for polydisperse flows was developed over a range of flow conditions using Discrete Numerical Simulations (DNS). These models were verified via examination of a range of limiting cases and comparison with Discrete Element Method (DEM) data. Validation took the form of comparison with both DEM and experimental data. Experiments were conducted in three separate circulating fluidized beds (CFBs), with emphasis on the riser section. Measurements included bulk quantities like pressure drop and elutriation, as well as axial and radial measurements of bubble characteristics, cluster characteristics, solids flux, and differential pressure drops (axial only). Monodisperse systems were compared to their binary and continuous particle size distribution (PSD) counterparts. The continuous distributions examined included Gaussian, lognormal, and NETL-provided data for a coal gasifier.

Christine Hrenya; Ray Cocco; Rodney Fox; Shankar Subramaniam; Sankaran Sundaresan

2011-12-31

308

A validation study of a stochastic model of human interaction  

NASA Astrophysics Data System (ADS)

The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, $-N\int_{-\infty}^{\infty}\varphi(\chi,\tau)\,\Psi(\tau)\,d\tau$, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as Fermions behave in momentum space.
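As a sketch of the nonlinear estimation step described above, the snippet below fits a Fermi-Dirac occupation curve to a hypothetical binned attitude distribution with scipy's curve_fit; the parameter names mu and T are illustrative analogues, not the dissertation's notation, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_dirac(x, mu, T):
    """Fermi-Dirac form: occupation probability as a function of position x."""
    return 1.0 / (np.exp((x - mu) / T) + 1.0)

# Synthetic stand-in for binned semantic-differential observations
x = np.linspace(-3.0, 3.0, 25)
rng = np.random.default_rng(2)
p_obs = fermi_dirac(x, 0.4, 0.6) + 0.02 * rng.normal(size=x.size)

(mu_hat, T_hat), _ = curve_fit(fermi_dirac, x, p_obs, p0=(0.0, 1.0))
r = np.corrcoef(p_obs, fermi_dirac(x, mu_hat, T_hat))[0, 1]
print(f"mu = {mu_hat:.2f}, T = {T_hat:.2f}, r = {r:.3f}")  # the study reports r between 0.955 and 0.998
```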

Burchfield, Mitchel Talmadge

309

Distributed hydrological modelling of the Senegal River Basin — model construction and validation  

Microsoft Academic Search

A modified version of the physically-based distributed MIKE SHE model code was applied to the 375,000 km2 Senegal River Basin. On the basis of conventional data from meteorological stations and readily accessible databases on topography, soil types, vegetation type, etc., three models with different levels of calibration were constructed and rigorous validation tests conducted. Calibration against one station and internal validation

Jens Andersen; Jens C Refsgaard; Karsten H Jensen

2001-01-01

310

ON VALIDATION OF SOURCE AND SINK MODELS: PROBLEMS AND POSSIBLE SOLUTIONS  

EPA Science Inventory

The paper discusses solutions for problems relating to validating indoor air quality (IAQ) source and sink models. While model validation remains the weakest part of the entire process of IAQ model development, special problems have made the validation of indoor source and sink mo...

311

A Technique for Global Monitoring of Net Solar Irradiance at the Ocean Surface. Part II: Validation.  

NASA Astrophysics Data System (ADS)

The present study constitutes the generation and validation of the first satellite-based, long-term record of surface solar irradiance over the global oceans. The record is generated using Nimbus-7 earth radiation budget (ERB) wide-field-of-view (WFOV) planetary-albedo data as input to a numerical algorithm designed and implemented for this study based on radiative transfer theory. Net surface solar irradiance is obtained by subtracting the solar radiation reflected by the ocean-atmosphere system (measured by satellite) and the solar radiation absorbed by atmospheric constituents (modeled theoretically) from the solar irradiance at the top of the atmosphere (a known quantity). The resulting monthly mean values are computed on a 9° latitude-longitude spatial grid for November 1978-October 1985. Because direct measurements of surface solar irradiance are not available on the global spatial scales needed to validate the new approach, the ERB-based values cannot be verified directly against in situ pyranometer data. Although the ERB-based annual and monthly mean climatologies are compared with those obtained from ship observations and empirical formulas, a comparison with long-term mean climatologies does not provide an assessment of the month-to-month accuracies achieved using the new technique. Furthermore, the accuracy of the ship-based climatologies is questionable. Therefore, the new dataset is validated in comparisons with short-term, regional, high-resolution, satellite-based records (which were generated using methods that in turn have been validated using in situ measurements). The ERB-based values of net surface solar irradiance are compared with corresponding values based on radiance measurements taken by the VISSR (Visible-Infrared Spin Scan Radiometer) aboard GOES (Geostationary Operational Environmental Satellite) series satellites during the TOGA (Tropical Ocean Global Atmosphere), Tropic Heat, and MONEX (Monsoon Experiment) field experiments. The rms differences are 14.5 W m-2 (i.e., 6.2% of the average VISSR-based value on monthly time scales) for the TOGA data comparison, 6.4 W m-2 (i.e., 2.5% of the average VISSR-based value on monthly time scales) for the Tropic Heat data comparison, and 16.8 W m-2 (i.e., 7.5% of the average VISSR-based value on monthly time scales) for the MONEX data comparison. The ERB-based record is also compared with an additional satellite-based dataset, focused primarily over the Atlantic Ocean, that was generated using radiance measurements from the Meteosat radiometer. On the basis of these validation studies, errors in the new dataset are estimated to lie between 10 and 20 W m-2 on monthly time scales.
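The budget arithmetic described above reduces, per grid cell and month, to a subtraction; the values below are purely illustrative stand-ins for the ERB-derived quantities.

```python
# Net surface solar irradiance = TOA irradiance
#   - part reflected by the ocean-atmosphere system (from satellite planetary albedo)
#   - part absorbed by atmospheric constituents (modeled theoretically)
S_toa = 340.0      # W m-2, known from solar geometry (hypothetical monthly-mean cell)
albedo = 0.30      # planetary albedo from ERB WFOV measurements (illustrative)
S_absorbed = 70.0  # W m-2, modeled atmospheric absorption (illustrative)

S_net = S_toa - albedo * S_toa - S_absorbed
print(f"net surface solar irradiance: {S_net:.1f} W m-2")  # 168.0 W m-2
```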

Chertock, Beth; Frouin, Robert; Gautier, Catherine

1992-09-01

312

LANL* V2.0: global modeling and validation  

NASA Astrophysics Data System (ADS)

We describe in this paper the new version of LANL*. Just like the previous version, this new version V2.0 of LANL* is an artificial neural network (ANN) for calculating the magnetic drift invariant, L*, that is used for modeling radiation belt dynamics and for other space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for any type of orbit; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits, with an error of typically ΔL* < 0.2, which corresponds to an error of 3% at geosynchronous orbit. This new LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field-line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
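A hedged sketch of the general idea, training an ANN surrogate to reproduce expensive drift-invariant calculations; scikit-learn's MLPRegressor stands in for the actual LANL* network, whose architecture and inputs (satellite position plus TS05 driving parameters) are not reproduced here, and the training data below are synthetic.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training set: features are scaled position/driving parameters,
# targets are L* values precomputed by slow field-line integration.
X = rng.uniform(-1.0, 1.0, size=(5000, 8))
y = 4.0 + X[:, 0] + 0.5 * np.sin(3.0 * X[:, 1]) + 0.1 * rng.normal(size=5000)  # stand-in for L*

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)

err = np.abs(ann.predict(X_test) - y_test)
print(f"median |error|: {np.median(err):.3f}")  # the paper reports errors typically below 0.2
```

Once trained, the surrogate evaluates in microseconds per point, which is what makes the reported orders-of-magnitude speedup over field-line integration plausible.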

Koller, J.; Zaharia, S.

2011-03-01

313

LANL*V2.0: global modeling and validation  

NASA Astrophysics Data System (ADS)

We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for a much larger region; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits, with an error of typically ΔL* < 0.2, which corresponds to an error of 3% at geosynchronous orbit. This new LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field-line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.

Koller, J.; Zaharia, S.

2011-08-01

314

Experimental validation of a numerical model for subway induced vibrations  

NASA Astrophysics Data System (ADS)

This paper presents the experimental validation of a coupled periodic finite element-boundary element model for the prediction of subway induced vibrations. The model fully accounts for the dynamic interaction between the train, the track, the tunnel and the soil. The periodicity or invariance of the tunnel and the soil in the longitudinal direction is exploited using the Floquet transformation, which allows for an efficient formulation in the frequency-wavenumber domain. A general analytical formulation is used to compute the response of three-dimensional invariant or periodic media that are excited by moving loads. The numerical model is validated by means of several experiments that have been performed at a site in Regent's Park on the Bakerloo line of London Underground. Vibration measurements have been performed on the axle boxes of the train, on the rail, the tunnel invert and the tunnel wall, and in the free field, both at the surface and at a depth of 15 m. Prior to these vibration measurements, the dynamic soil characteristics and the track characteristics have been determined. The Bakerloo line tunnel of London Underground has been modelled using the coupled periodic finite element-boundary element approach and free field vibrations due to the passage of a train at different speeds have been predicted and compared to the measurements. The correspondence between the predicted and measured response in the tunnel is reasonably good, although some differences are observed in the free field. The discrepancies are explained on the basis of various uncertainties involved in the problem. The variation in the response with train speed is similar for the measurements as well as the predictions. This study demonstrates the applicability of the coupled periodic finite element-boundary element model to make realistic predictions of the vibrations from underground railways.

Gupta, S.; Degrande, G.; Lombaert, G.

2009-04-01

315

Ovarian Volume throughout Life: A Validated Normative Model  

PubMed Central

The measurement of ovarian volume has been shown to be a useful indirect indicator of the ovarian reserve in women of reproductive age, in the diagnosis and management of a number of disorders of puberty and adult reproductive function, and is under investigation as a screening tool for ovarian cancer. To date there is no normative model of ovarian volume throughout life. By searching the published literature for ovarian volume in healthy females, and using our own data from multiple sources (combined n = 59,994), we have generated and robustly validated the first model of ovarian volume from conception to 82 years of age. This model shows that 69% of the variation in ovarian volume is due to age alone. We have shown that in the average case ovarian volume rises from 0.7 mL (95% CI 0.4–1.1 mL) at 2 years of age to a peak of 7.7 mL (95% CI 6.5–9.2 mL) at 20 years of age, with a subsequent decline to about 2.8 mL (95% CI 2.7–2.9 mL) at the menopause and smaller volumes thereafter. Our model allows us to generate normal values and ranges for ovarian volume throughout life. This is the first validated normative model of ovarian volume from conception to old age; it will be of use in the diagnosis and management of a number of diverse gynaecological and reproductive conditions in females from birth to menopause and beyond.

Kelsey, Thomas W.; Dodwell, Sarah K.; Wilkinson, A. Graham; Greve, Tine; Andersen, Claus Y.; Anderson, Richard A.; Wallace, W. Hamish B.

2013-01-01

316

Approaches to Validation of CFD Models for Far Ship Wake  

NASA Astrophysics Data System (ADS)

The centerline wake of surface ships can extend to tens of kilometers on synthetic aperture radar (SAR) images. However, the hydrodynamics of far wakes of ships are not well understood. Our assumption is that far from the ship, the pattern of flows comprising the wake is represented by longitudinal coherent vortices, whose gross parameters change only slowly along the wake. In order to model this process, we have run a set of non-hydrostatic simulations. The simulations also included dynamics of freshwater plumes in the upper ocean, because such types of formations often produce sharp fronts, which can be confused with ship wakes. We have used 2D and 3D setups with slippery and free upper boundaries and with several turbulence closure schemes including k-ε, standard LES, and Hybrid LES. The models have been implemented in CFD Fluent and simulated such processes as wind-wake and wind-plume interactions and formation of sharp frontal lines on the sea surface. In order to justify the choice of grid and model parameters, we have performed a series of validation tests. These tests included grid and time convergence, sensitivity to geometric parameters, and comparison to available experimental data including photo and SAR images of ship wakes. The Hybrid LES turbulence model has demonstrated a more realistic performance than the other two tested turbulence closure models. The application of CFD to these problems has resulted in a qualitative level of information. Providing information on the level of absolute quantities requires validation with data from specialized field and laboratory experiments.

Fujimura, A.; Soloviev, A.

2008-12-01

317

Analysis and Validation of a Predictive Model for Growth and Death of Aeromonas hydrophila under Modified Atmospheres at Refrigeration Temperatures  

Microsoft Academic Search

Specific growth and death rates of Aeromonas hydrophila were measured in laboratory media under various combinations of temperature, pH, and percent CO2 and O2 in the atmosphere. Predictive models were developed from the data and validated by means of observations obtained from (i) seafood experiments set up for this purpose and (ii) the ComBase database (http://www.combase.cc; http://wyndmoor.arserrc.gov/combase/). Two main reasons

Carmen Pin; Raquel Velasco de Diego; Susan George; Gonzalo D. García de Fernando; Jozsef Baranyi

2004-01-01

318

Modeling organic transformations by microorganisms of soils in six contrasting ecosystems: Validation of the MOMOS model  

Microsoft Academic Search

The Modeling Organic Transformations by Microorganisms of Soils (MOMOS) model simulates the growth, respiration, and mortality of soil microorganisms as main drivers of the mineralization and humification processes of organic substrates. Originally built and calibrated using data from two high-altitude sites, the model is now validated with data from a 14C experiment carried out in six contrasting tropical ecosystems covering

M. Pansu; L. Sarmiento; M. A. Rujano; M. Ablan; D. Acevedo; P. Bottner

2010-01-01

319

Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting.  

National Technical Information Service (NTIS)

This research has two objectives: to verify and validate the U.S. Army's Forecast and Allocation of Army Recruiting Resources (FAARR) model, and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simu...

G. M. Piskator

1998-01-01

320

Low Altitude Validation of Geomagnetic Cutoff Models Using SAMPEX Data  

NASA Astrophysics Data System (ADS)

Single event upsets (SEUs) caused by MeV protons are a concern for satellite operators, so AFRL is working to create a tool that can specify and/or forecast SEU probabilities. An important component of the tool's SEU probability calculation will be the local energetic ion spectrum. The portion of that spectrum due to the trapped energetic ion population is relatively stable and predictable; however, it is more difficult to account for the transient solar energetic particles (SEPs). These particles, which can be ejected from the solar atmosphere during a solar flare or filament eruption or can be energized by coronal mass ejection (CME) driven shocks, can penetrate the Earth's magnetosphere into regions not normally populated by energetic protons. The magnetosphere will provide energy-dependent shielding that also depends on its magnetic configuration. During magnetic storms that configuration is modified, and the SEP cutoff latitude for a given particle energy can be suppressed by up to ~15 degrees equatorward, exposing normally shielded regions. As a first step toward creating the satellite SEU prediction tool, we are comparing the Smart et al. (Advances in Space Research, 2006) and CISM-Dartmouth (Kress et al., Space Weather, 2010) geomagnetic cutoff tools. While the authors have provided some of their own validations in the noted papers, our validation will be done consistently between models, allowing us to better compare them.

Young, S. L.; Kress, B. T.

2011-12-01

321

Validating global atmospheric models using Odin satellite data.  

NASA Astrophysics Data System (ADS)

The Odin satellite, launched in February 2001, provides global or near-global coverage of various atmospheric constituents with high vertical resolution. The two onboard instruments, the Sub-Millimetre Radiometer (SMR) and the Optical Spectrograph and Infra-Red Imager System (OSIRIS), are co-aligned and give reliable profiles of O3, NO2, N2O, HNO3, H2O, CO, etc. A new approach, where data from both OSIRIS and SMR are merged using a chemical box model to construct an Odin proxy NOy product, has recently been developed. Climatologies of these data sets, based on multi-year statistics, have proven to be very useful for validating Chemical Transport Models (CTMs) and Chemistry Climate Models (CCMs). Longer-lived species such as N2O and CO have been used to test whether atmospheric transport is realistically represented by/in the models. The presentation will describe the relevant Odin climatologies and provide examples of comparisons to state-of-the-art chemistry climate models such as the Canadian Middle Atmosphere Model.

Brohede, S.; Urban, J.

2009-04-01

322

Systematic validation of disease models for pharmacoeconomic evaluations. Swiss HIV Cohort Study.  

PubMed

Pharmacoeconomic evaluations are often based on computer models which simulate the course of disease with and without medical interventions. The purpose of this study is to propose and illustrate a rigorous approach for validating such disease models. For illustrative purposes, we applied this approach to a computer-based model we developed to mimic the history of HIV-infected subjects at the greatest risk for Mycobacterium avium complex (MAC) infection in Switzerland. The drugs included as a prophylactic intervention against MAC infection were azithromycin and clarithromycin. We used a homogeneous Markov chain to describe the progression of an HIV-infected patient through six MAC-free states, one MAC state, and death. Probability estimates were extracted from the Swiss HIV Cohort Study database (1993-95) and randomized controlled trials. The model was validated by testing for (1) technical validity, (2) predictive validity, (3) face validity, and (4) modelling process validity. Sensitivity analysis and independent model implementation in DATA (PPS) and self-written Fortran 90 code (BAC) assured technical validity. Agreement between modelled and observed MAC incidence confirmed predictive validity. Modelled MAC prophylaxis at different starting conditions affirmed face validity. Published articles by other authors supported modelling process validity. The proposed validation procedure is a useful approach to improve the validity of the model. PMID:10461580
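A minimal sketch of the homogeneous Markov chain structure described above; the transition probabilities below are invented placeholders (the published estimates came from the Swiss HIV Cohort Study database and trial data), and the eight states follow the abstract: six MAC-free states, one MAC state, and death.

```python
import numpy as np

n = 8  # states 0-5: MAC-free, 6: MAC, 7: death (absorbing)
P = np.zeros((n, n))
for i in range(6):
    P[i, i] = 0.90                # remain in the current MAC-free state
    P[i, min(i + 1, 5)] += 0.06   # progress to the next MAC-free state
    P[i, 6] = 0.03                # develop MAC infection
    P[i, 7] = 0.01                # die
P[6, 6], P[6, 7] = 0.80, 0.20     # MAC state: persist or die
P[7, 7] = 1.0                     # death is absorbing
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

state = np.zeros(n)
state[0] = 1.0                    # whole cohort starts in the first MAC-free state
for _ in range(24):               # e.g., 24 monthly cycles
    state = state @ P
print(f"P(MAC) after 24 cycles: {state[6]:.3f}, P(death): {state[7]:.3f}")
```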

Sendi, P P; Craig, B A; Pfluger, D; Gafni, A; Bucher, H C

1999-08-01

323

Defect distribution model validation and effective process control  

NASA Astrophysics Data System (ADS)

Assumption of the underlying probability distribution is an essential part of effective process control. In this article, we demonstrate how to improve the effectiveness of equipment monitoring and process-induced defect control through properly selecting, validating and using hypothetical distribution models. The testing method is based on probability plotting, which is made possible through order statistics. Since each ordered sample data point has a cumulative probability associated with it, which is calculated as a function of sample size, the validity of the assumption is readily judged by the linearity of the ordered sample data versus the deviate predicted from the cumulative probability by the assumed statistical model. A comparison is made between normal and lognormal distributions to illustrate how dramatically the distribution model can affect the control limit setting. Examples presented include defect data collected with the SP1, a dark-field inspection tool, on a variety of deposited and polished metallic and dielectric films. We find that the defect count distribution is in most cases approximately lognormal. We show that the normal distribution is an inadequate assumption, as clearly indicated by the non-linearity of the probability plots. Misuse of the normal distribution leads to an overly optimistic process control limit, typically 50% tighter than that suggested by the lognormal distribution. The inappropriate control limit setting consequently results in an excursion rate too high to be manageable. The lognormal distribution is a valid assumption because it is positively skewed, which adequately takes into account the fact that the defect count distribution typically has a long tail. In essence, use of the lognormal distribution is a suggestion that the long tail be treated as part of the process entitlement (capability) instead of as process excursion. The adjustment of the expected process entitlement is reflected and quantified by the skewness of the lognormal distribution, yielding a more realistic estimate (defect count control limit). It is of particular importance to use a validated probability distribution when the sample size is small. A statistical process control (SPC) chart is generally constructed on the assumption of normality of the underlying population. Although this assumption is not true, as discussed above, the sample average will follow a normal distribution regardless of the underlying distribution, according to the central limit theorem. However, this practice requires a large sample, which is sometimes impractical, especially in the stage of process development and yield ramp-up, when the process control limit is, and has to be, a moving target, enabling rapid and constant yield-learning with a minimal amount of production interruption and/or resource reallocation. In this work, we demonstrate that a validated statistical model such as the lognormal distribution allows us to monitor progress in a quantifiable and measurable way, and to tighten the control limits smoothly and systematically. To do so, we use the verified model to make a deduction about the expected defect count at a predetermined deviate, say 3σ. The estimate error, or the range, is a function of sample variation, sample size, and the confidence level at which the estimation is being made. If we choose a fixed sample size and confidence level, the defectivity performance is explicitly defined and gauged by the estimate and the estimate error.
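A sketch of the order-statistics test described above on synthetic defect counts; scipy's probplot orders the sample, assigns cumulative probabilities as a function of sample size, and reports the correlation of the data against the model's deviates, so the linearity criterion becomes a comparison of r values (the plot itself is omitted).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
defects = rng.lognormal(mean=3.0, sigma=0.8, size=200)  # synthetic long-tailed defect counts

# A lognormal sample is exactly a sample whose logarithm is normal,
# so the lognormal probability plot is a normal plot of log(counts).
_, (_, _, r_normal) = stats.probplot(defects, dist="norm")
_, (_, _, r_lognormal) = stats.probplot(np.log(defects), dist="norm")
print(f"normal fit r = {r_normal:.4f}, lognormal fit r = {r_lognormal:.4f}")

# Upper control limits at the 3-sigma deviate under each assumption
mu, sigma = np.log(defects).mean(), np.log(defects).std(ddof=1)
ucl_lognormal = np.exp(mu + 3 * sigma)
ucl_normal = defects.mean() + 3 * defects.std(ddof=1)
print(f"UCL lognormal = {ucl_lognormal:.1f}, UCL normal = {ucl_normal:.1f}")
```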

Zhong, Lei

2003-07-01

324

Validation of a neural network model using cross application approaches  

NASA Astrophysics Data System (ADS)

This paper discusses an important component of landslide susceptibility mapping using a back-propagation-based artificial neural network model and the cross-application of its weights across three study areas in Malaysia, using a Geographic Information System (GIS). Landslide locations were identified in the study areas from the interpretation of aerial photographs, field surveys and inventory reports. A landslide-related spatial database was constructed from topographic, soil, geology and landcover maps. The paper further examines the factors affecting landslides for landslide susceptibility mapping and reviews tools for quantifying the likelihood of occurrence of the scenarios. Different training sites were selected randomly to train the neural network, and nine sets of landslide susceptibility maps were prepared. The paper then illustrates the validation of those maps using the Area Under Curve (AUC) method.
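A sketch of the AUC validation step mentioned above, using scikit-learn's roc_auc_score on hypothetical susceptibility scores against known landslide/non-landslide cells; the actual study derives its curve from the prepared susceptibility maps rather than from synthetic scores.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
landslide = rng.integers(0, 2, size=500)         # 1 = cell with an observed landslide (synthetic)
score = 0.6 * landslide + 0.4 * rng.random(500)  # hypothetical susceptibility index per cell

print(f"AUC = {roc_auc_score(landslide, score):.3f}")  # 0.5 = chance, 1.0 = perfect ranking
```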

Pradhan, Biswajeet; Lee, Saro; Hyun-Joo, Oh; Buchroithner, Manfred F.

2010-05-01

325

Ptolemy II: Heterogeneous Concurrent Modeling And Design In Java  

Microsoft Academic Search

This document describes the design and implementation of Ptolemy II 2.0.1. Ptolemy II is a set of Java packages supporting heterogeneous, concurrent modeling and design. The focus is on the assembly of concurrent components. The key underlying principle in Ptolemy II is the use of well-defined models of computation that govern the interaction between components. A major problem area that

Christopher Hylands; Edward A. Lee; Jie Liu; Xiaojun Liu; Steve Neuendorffer; Yuhong Xiong

2001-01-01

326

Sea surface simulation in the infrared modeling and validation  

NASA Astrophysics Data System (ADS)

A physics-based 3D simulation of sea surfaces is presented. The simulation is suitable for the pre-calculation of detector images for an IR camera. Synthetic views of a maritime scenario are calculated in the MWIR and LWIR spectral bands, and the images are compared with data collected in a field trial. In our computer simulation the basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. Sea surface animation is introduced by time-dependent control of the basic statistics. Choppy waves are included in the model to improve the realism of the rough sea. To predict the view of a thermal camera, the sea surface radiance must be calculated. This is done with respect to the emitted sea surface radiance and the reflected sky radiance, using either MODTRAN or a semi-empirical model. Slope-shadowing of the sea surface waves is considered, which strongly influences the IR appearance of the sea surface near the horizon. MWIR and LWIR simulations are shown of sun glint as well as of whitecaps, which depend upon wind velocity. For validation purposes, appropriate data sets (images and meteorological data) were selected from field measurements. A simple maritime scenario including a floating foreground object has been prepared, and views of two different thermal imagers, similar to those used in the field trials, have been simulated. The validation is done by visual inspection of measured and simulated images and, in addition, by numerical comparison based on image statistics. The results of the comparison are presented. For an accurate reflectance calculation it is necessary to consider the maritime sky. The model is therefore improved by inclusion of a static two-dimensional cloud layer. The cloud distribution is adjusted to measured data with respect to, e.g., power spectral density and temperature distribution.

Schwenger, Frédéric; Repasi, Endre

2006-06-01

327

Predictive validity of behavioural animal models for chronic pain  

PubMed Central

Rodent models of chronic pain may elucidate pathophysiological mechanisms and identify potential drug targets, but whether they predict clinical efficacy of novel compounds is controversial. Several potential analgesics have failed in clinical trials, in spite of strong animal modelling support for efficacy, but there are also examples of successful modelling. Significant differences in how methods are implemented and results are reported mean that a literature-based comparison between preclinical data and clinical trials will not reveal whether a particular model is generally predictive. Limited reporting of negative outcomes prevents reliable estimation of the specificity of any model. Animal models tend to be validated with standard analgesics and may be biased towards tractable pain mechanisms. But preclinical publications rarely contain drug exposure data, and drugs are usually given in high doses and as a single administration, which may lead to drug distribution and exposure deviating significantly from clinical conditions. The greatest challenge for predictive modelling is, however, the heterogeneity of the target patient populations, in terms of both symptoms and pharmacology, probably reflecting differences in pathophysiology. In well-controlled clinical trials, a majority of patients shows less than 50% reduction in pain. A model that responds well to current analgesics should therefore predict efficacy only in a subset of patients within a diagnostic group. It follows that successful translation requires several models for each indication, reflecting critical pathophysiological processes, combined with data linking exposure levels with effect on target. LINKED ARTICLES: This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4

Berge, Odd-Geir

2011-01-01

328

Systematic approach to verification and validation: High explosive burn models  

SciTech Connect

Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their models in their own hydro codes and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time-consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equation-of-state models and material strength models.
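A sketch of the automation idea described above, assuming a hypothetical data-file format in which the experimental parameters ride along as meta-data in a header line; the format, field names, and input-deck template are all invented for illustration and are not LANL's actual HED conventions.

```python
import json

def read_gauge_file(path):
    """Hypothetical format: a '# META {json}' header line, then time/velocity pairs."""
    with open(path) as f:
        meta = json.loads(f.readline().removeprefix("# META "))
        data = [tuple(map(float, line.split())) for line in f if line.strip()]
    return meta, data

def hydro_input_deck(meta, burn_model):
    """Generate a hydro-code input deck from an experiment's header meta-data."""
    return (f"explosive  = {meta['explosive']}\n"
            f"density    = {meta['density']}\n"
            f"burn_model = {burn_model}\n")

# With parameters in the header, the same script can sweep every experiment
# in the database against every burn model and plot each comparison.
meta = {"explosive": "PBX-9502", "density": 1.89}
print(hydro_input_deck(meta, burn_model="ignition_and_growth"))
```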

Menikoff, Ralph [Los Alamos National Laboratory]; Scovel, Christina A. [Los Alamos National Laboratory]

2012-04-16

329

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical "models." These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. (Westinghouse Savannah River Co., Aiken, SC (United States)); Chen, F.F.K. (Bechtel National, Inc., San Francisco, CA (United States))

1993-01-01

330

Rangemeter for XM23 Laser Rangefinder Model II.  

National Technical Information Service (NTIS)

A modified rangemeter, designated model II, was constructed and determined to be capable of use with the XM23 Laser Rangefinder. The model II design is a high-speed, miniature, digital time-interval counter that displays range readings from 200 to 19,995 meter...

I. R. Marcus

1964-01-01

331

Type II endoleak in porcine model of abdominal aortic aneurysm  

Microsoft Academic Search

Clinical relevance: We set out to show that aortic aneurysm sac pressurization caused by lumbar arterial flow in the setting of type II endoleak can be reproduced in an in vivo porcine model of endovascular aortic aneurysm repair. Indeed, in this model the aneurysm sac pulse pressure was a sensitive indicator of type II endoleak, correlating well with findings at computed

Sergio Diaz; Matthew R Uzieblo; Ketan M Desai; Michael R Talcott; Kyongtae T Bae; Patrick J Geraghty; Juan C Parodi; Gregorio A Sicard; Luis A Sanchez; Eric T Choi

2004-01-01

332

Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.  

SciTech Connect

A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model to represent experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and the verification of that code are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized and the uncertainty of the model to represent each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.
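
Since the comparison between model and experiment is quantified with metrics, a minimal sketch of two common choices (RMS error and maximum relative error) may help; the data arrays below are synthetic placeholders, not the report's measurements.

    import numpy as np

    # synthetic stand-ins for measured vs. predicted residual mass fraction
    measured  = np.array([0.98, 0.90, 0.71, 0.42, 0.18])
    predicted = np.array([0.97, 0.88, 0.75, 0.47, 0.15])

    rmse = np.sqrt(np.mean((predicted - measured) ** 2))        # absolute scale
    max_rel = np.max(np.abs(predicted - measured) / measured)   # worst case
    print(f"RMSE = {rmse:.3f}, max relative error = {max_rel:.1%}")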

Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

2004-10-01

333

Image quality assessment in digital mammography: part II. NPWE as a validated alternative for contrast detail analysis  

NASA Astrophysics Data System (ADS)

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to threshold calculated detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
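
The detectability index is a closed-form computation, so a numerical sketch may clarify it. The standard NPWE form for a radially symmetric disc task is d'² = [∫ S²·MTF²·E²·2πf df]² / ∫ S²·MTF²·E⁴·NNPS·2πf df, where S is the task (disc) spectrum and E the eye filter; the MTF, NNPS, contrast, and eye-filter shapes below are illustrative stand-ins, not measured system data from this study.

    import numpy as np
    from scipy.special import j1   # first-order Bessel function

    def npwe_dprime(diameter_mm, contrast, f, mtf, nnps, eye):
        """NPWE d' for a uniform disc detail, radial frequency integration."""
        r = diameter_mm / 2.0
        x = np.maximum(2 * np.pi * f * r, 1e-12)
        s = contrast * np.pi * r**2 * 2 * j1(x) / x   # disc task spectrum
        w = 2 * np.pi * f                             # radial area element
        num = np.trapz(s**2 * mtf**2 * eye**2 * w, f) ** 2
        den = np.trapz(s**2 * mtf**2 * eye**4 * nnps * w, f)
        return np.sqrt(num / den)

    f    = np.linspace(0.01, 10.0, 500)        # cycles/mm
    mtf  = np.exp(-f / 4.0)                    # illustrative presampling MTF
    nnps = 1e-6 * (1.0 + 1.0 / (f + 0.1))      # illustrative NNPS, mm^2
    eye  = f * np.exp(-0.5 * f)                # simple eye-filter shape
    print(npwe_dprime(0.25, 0.05, f, mtf, nnps, eye))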

Monnin, P.; Marshall, N. W.; Bosmans, H.; Bochud, F. O.; Verdun, F. R.

2011-07-01

334

Validation of source and sink models: Problems and possible solutions. Rept. for Sep 91-Apr 92  

SciTech Connect

The paper discusses solutions for problems relating to validating indoor air quality (IAQ) source and sink models. While model validation remains the weakest part of the entire process of IAQ model development, special problems have made the validation of indoor source and sink models even more difficult. Many source and sink models have been developed, but few have been properly validated. Major problems include: elusive model parameters, confusion in parameter estimation methods, uncertainty in scale-up and misleading scaling factors, unspecified validity ranges, and weakness in quantitative comparisons between models and experimental observation. Possible solutions include: proper definition of validation scope, proper use of statistical comparison methods, development of mass transfer indices to bridge the gap between test chambers and real buildings, and development of a cooperative effort to build a source and sink database to facilitate validation.

Guo, Z.

1992-01-01

335

Combined Analysis and Validation of Earth Rotation Models and Observations  

NASA Astrophysics Data System (ADS)

Global dynamic processes cause changes in the Earth's rotation, gravity field and geometry. Thus, they can be traced in geodetic observations of these quantities. However, the sensitivity of the various geodetic observation techniques to specific processes in the Earth system differs. More meaningful conclusions with respect to contributions from individual Earth subsystems can be drawn from the combined analysis of highly precise and consistent parameter time series from heterogeneous observation types which carry partially redundant and partially complementary information. For the sake of a coordinated research in this field, the Research Unit FOR 584 "Earth Rotation and Global Dynamic Processes" is funded at present by the German Research Foundation (DFG). It is concerned with the refined and consistent modeling and data analysis. One of the projects (P9) within this Research Unit addresses the combined analysis and validation of Earth rotation models and observations. In P9 three main topics are addressed: (1) the determination and mutual validation of reliable consistent time series for Earth rotation parameters and gravity field coefficients due to the consideration of their physical connection by the Earth's tensor of inertia, (2) the separation of individual Earth rotation excitation mechanisms by merging all available relevant data from recent satellite missions (GRACE, Jason-1, …) and geodetic space techniques (GNSS, SLR, VLBI, …) in a highly consistent way, (3) the estimation of fundamental physical Earth parameters (Love numbers, …) by an inverse model using the improved geodetic observation time series as constraints. Hence, this project provides significant and unique contributions to the field of Earth system science in general; it corresponds with the goals of the Global Geodetic Observing System (GGOS). In this paper project P9 is introduced, the goals are summarized and a status report including a presentation and discussion of intermediate results is given.

Kutterer, Hansjoerg; Göttl, Franziska; Heiker, Andrea; Kirschner, Stephanie; Schmidt, Michael; Seitz, Florian

2010-05-01

336

Cross-Cultural Validation of the Beck Depression Inventory-II across U.S. and Turkish Samples  

ERIC Educational Resources Information Center

The purpose of this study was to test the Beck Depression Inventory-II (BDI-II) for factorial invariance across Turkish and U.S. college student samples. The results indicated that (a) a two-factor model has an adequate fit for both samples, thus providing evidence of configural invariance, and (b) there is a metric invariance but "no" sufficient…

Canel-Cinarbas, Deniz; Cui, Ying; Lauridsen, Erica

2011-01-01

338

Command validation of secondary EM pump system in the EBR-II  

Microsoft Academic Search

The objective of command validation is to determine the relevance and accuracy of the command strategy (or control signals) generated by the control system and to validate the resulting output of the actuator system. Actuators include pumps, valves, control rod drive mechanisms, heaters, sprays, and other direct operations in the process. Overall command validation requires the verification of control signal

R. L. Bywater; B. R. Upadhyaya; R. C. Berkan; R. A. Kisner

1990-01-01

339

Passive millimeter-wave imaging model application and validation  

NASA Astrophysics Data System (ADS)

The military use of millimeter wave radiometers has been studied since the 1960s. It is only recently that advances in the technology have made passive millimeter wave (PMMW) systems practical. It is well established that metal targets will have a large contrast ratio versus the background in the millimeter wave (MMW) regime and that atmospheric propagation through clouds, fog and light rain is possible. The limitations have been the noise figures of the detectors, the size of the systems, and the cost of the systems. Through the advent of millimeter wave monolithic integrated circuits technology, MMW devices are becoming smaller, more sensitive, and less expensive. In addition many efforts are currently under way to develop PMMW array imaging devices. This renewed interest has likewise brought forth the need for passive millimeter wave system modeling capabilities. To fill this need, Nichols Research Corporation has developed for Eglin AFB a physics-based image synthesis code, capable of modeling the dominant effects in the MMW regime. This code has been developed to support the development of the next generation of PMMW seeker systems. This paper will describe the phenomenology of PMMW signatures, the Irma software, validation of the Irma models and the application of the models to both Air Force and Navy problems.

Blume, Bradley T.; Chenault, David B.

1997-06-01

340

Bioaerosol optical sensor model development and initial validation  

NASA Astrophysics Data System (ADS)

This paper describes the development and initial validation of a bioaerosol optical sensor model. This model was used to help determine design parameters and estimate performance of a new low-cost optical sensor for detecting bioterrorism agents. In order to estimate sensor performance in detecting biowarfare simulants and rejecting environmental interferents, use was made of a previously reported catalog of EEM (excitation/emission matrix) fluorescence cross-section measurements and previously reported multiwavelength-excitation biosensor modeling work. In the present study, the biosensor modeled employs a single high-power 365 nm UV LED source plus an IR laser diode for particle size determination. The sensor has four output channels: IR size channel, UV elastic channel and two fluorescence channels. The sensor simulation was used to select the fluorescence channel wavelengths of 400-450 and 450-600 nm. Using these selected fluorescence channels, the performance of the sensor in detecting simulants and rejecting interferents was estimated. Preliminary measurements with the sensor are presented which compare favorably with the simulation results.

Campbell, Steven D.; Jeys, Thomas H.; Eapen, Xuan Le

2007-05-01

341

The validity of bioelectrical impedance models in clinical populations.  

PubMed

Bioelectrical impedance analysis (BIA) is the most commonly used body composition technique in published studies. Herein we review the theory and assumptions underlying the various BIA and bioelectrical impedance spectroscopy (BIS) models, because these assumptions may be invalidated in clinical populations. Single-frequency serial BIA and discrete multifrequency BIA may be of limited validity in populations other than healthy, young, euvolemic adults. Both models inaccurately predict total body water (TBW) and extracellular water (ECW) in populations with changes in trunk geometry or fluid compartmentalization, especially at the level of the individual. Single-frequency parallel BIA may predict body composition with greater accuracy than the serial model. Hand-to-hand and leg-to-leg BIA models do not accurately predict percent fat mass. BIS may predict ECW, but not TBW, more accurately than single-frequency BIA. Segmental BIS appears to be sensitive to fluid accumulation in the trunk. In general, bioelectrical impedance technology may be acceptable for determining body composition of groups and for monitoring changes in body composition within individuals over time. Use of the technology to make single measurements in individual patients, however, is not recommended. This has implications in clinical settings, in which measurement of individual patients is important. PMID:16215137
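
The single-frequency prediction the review critiques is easy to make concrete: TBW is regressed on the impedance index height²/R plus anthropometric terms. A sketch follows with hypothetical coefficients; published equations are population-specific, which is precisely the validity problem discussed above.

    def tbw_litres(height_cm, resistance_ohm, weight_kg, male):
        """Single-frequency BIA-style TBW prediction; the coefficients are
        hypothetical placeholders, not a published equation."""
        impedance_index = height_cm ** 2 / resistance_ohm   # cm^2/ohm
        return (0.40 * impedance_index + 0.14 * weight_kg
                + (2.9 if male else 0.0) + 1.0)

    print(tbw_litres(175, 520, 70, male=True))   # roughly 37 L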

Buchholz, Andrea C; Bartok, Cynthia; Schoeller, Dale A

2004-10-01

342

Development, validation and application of numerical space environment models  

NASA Astrophysics Data System (ADS)

Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the Vlasov description of plasma is carried out using the Vlasiator model. The test shows that the Vlasov equation for plasma in six-dimensional phase space is solved correctly by Vlasiator, that results are obtained beyond those of the magnetohydrodynamic (MHD) description of plasma and that global magnetospheric simulations using a hybrid-Vlasov model are feasible on current hardware. For the first time four global magnetospheric models using the MHD description of plasma (BATS-R-US, GUMICS, OpenGGCM, LFM) are run with identical solar wind input and the results compared to observations in the ionosphere and outer magnetosphere. Based on the results of the global magnetospheric MHD model GUMICS a hypothesis is formulated for a new mechanism of plasmoid formation in the Earth's magnetotail.

Honkonen, Ilja

2013-10-01

343

Dynamic models and model validation for PEM fuel cells using electrical circuits  

Microsoft Academic Search

This paper presents the development of dynamic models for proton exchange membrane (PEM) fuel cells using electrical circuits. The models have been implemented in MATLAB/SIMULINK and PSPICE environments. Both the double-layer charging effect and the thermodynamic characteristic inside the fuel cell are included in the models. The model responses obtained at steady-state and transient conditions are validated by experimental data
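
A minimal sketch of this kind of equivalent-circuit dynamic model: the double-layer capacitance in parallel with the activation resistance gives the cell voltage a first-order transient after a load step. The parameter values below are illustrative, not taken from the paper.

    import numpy as np

    E0, R_ohm, R_act, C_dl = 45.0, 0.28, 0.35, 4.0   # V, ohm, ohm, F (illustrative)
    dt, t_end, i_load = 1e-3, 5.0, 20.0              # s, s, A (load step at t = 0)

    v_act = 0.0                                      # voltage across the double layer
    for _ in range(int(t_end / dt)):
        dv = (i_load - v_act / R_act) / C_dl         # KCL at the double-layer node
        v_act += dv * dt
    v_stack = E0 - i_load * R_ohm - v_act
    print(f"steady-state stack voltage ~ {v_stack:.2f} V")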

Caisheng Wang; M. Hashem Nehrir; Steven R. Shaw

2005-01-01

344

Modeling and Validating Chronic Pharmacological Manipulation of Circadian Rhythms  

PubMed Central

Circadian rhythms can be entrained by a light-dark (LD) cycle and can also be reset pharmacologically, for example, by the CK1δ/ε inhibitor PF-670462. Here, we determine how these two independent signals affect circadian timekeeping from the molecular to the behavioral level. By developing a systems pharmacology model, we predict and experimentally validate that chronic CK1δ/ε inhibition during the earlier hours of an LD cycle can produce a constant stable delay of rhythm. However, chronic dosing later during the day, or in the presence of longer light intervals, is not predicted to yield an entrained rhythm. We also propose a simple method based on phase response curves (PRCs) that predicts the effects of an LD cycle and chronic dosing of a circadian drug. This work indicates that dosing timing and environmental signals must be carefully considered for accurate pharmacological manipulation of circadian phase.
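
The PRC-based method lends itself to a compact sketch: advance the phase day by day by the period mismatch plus the light and drug PRC shifts, and test whether the phase settles to a fixed point (entrainment). Both PRC shapes and all parameters below are illustrative sinusoids, not the measured curves from this study.

    import numpy as np

    tau = 24.5                                   # intrinsic period, h (illustrative)
    def prc_light(ph):                           # daily phase shift from light, h
        return 1.2 * np.sin(2 * np.pi * (ph - 16.0) / 24.0)
    def prc_drug(ph):                            # daily phase shift from the dose, h
        return 0.9 * np.sin(2 * np.pi * ph / 24.0)

    def run(dose_time, days=60):
        phase, history = 0.0, []
        for _ in range(days):
            phase += tau - 24.0                  # drift from period mismatch
            phase += prc_light(phase) + prc_drug(phase - dose_time)
            phase %= 24.0
            history.append(phase)
        return history

    h = run(dose_time=2.0)
    print("entrained" if np.ptp(h[-10:]) < 0.1 else "not entrained", round(h[-1], 2))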

Kim, J K; Forger, D B; Marconi, M; Wood, D; Doran, A; Wager, T; Chang, C; Walton, K M

2013-01-01

345

Use of Synchronized Phasor Measurements for Model Validation in ERCOT  

NASA Astrophysics Data System (ADS)

This paper discusses experiences in the use of synchronized phasor measurement technology in the Electric Reliability Council of Texas (ERCOT) interconnection, USA. Implementation of synchronized phasor measurement technology in the region is a collaborative effort involving ERCOT, ONCOR, AEP, SHARYLAND, EPG, CCET, and UT-Arlington. As several phasor measurement units (PMU) have been installed in ERCOT grid in recent years, phasor data with the resolution of 30 samples per second is being used to monitor power system status and record system events. Post-event analyses using recorded phasor data have successfully verified ERCOT dynamic stability simulation studies. Real time monitoring software "RTDMS"® enables ERCOT to analyze small signal stability conditions by monitoring the phase angles and oscillations. The recorded phasor data enables ERCOT to validate the existing dynamic models of conventional and/or wind generators.

Nuthalapati, Sarma; Chen, Jian; Shrestha, Prakash; Huang, Shun-Hsien; Adams, John; Obadina, Diran; Mortensen, Tim; Blevins, Bill

2013-05-01

346

A third-generation wave model for coastal regions 1. Model description and validation  

Microsoft Academic Search

A third-generation numerical wave model to compute random, short-crested waves in coastal regions with shallow water and ambient currents (Simulating Waves Nearshore (SWAN)) has been developed, implemented, and validated. The model is based on a Eulerian formulation of the discrete spectral balance of action density that accounts for refractive propagation over arbitrary bathymetry and current fields. It is driven by

N. Booij; R. C. Ris; L. H. Holthuijsen

1999-01-01

347

An approach to model validation and model-based prediction -- polyurethane foam case study.  

SciTech Connect

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing a weapon's response.
In the validation analysis it is indicated that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are created and addressed. Hypothetical problems are used because no guidance was provided concern
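
The "model supplement term" idea, a bias correction fitted from validation residuals and added to subsequent predictions, can be sketched in a few lines. The data and the linear form of the correction below are illustrative; the report's actual supplement model is not reproduced here.

    import numpy as np

    temp     = np.array([300., 350., 400., 450.])   # validation conditions, °C
    observed = np.array([0.82, 0.64, 0.41, 0.22])   # e.g. residual mass fraction
    modeled  = np.array([0.85, 0.63, 0.35, 0.12])   # model "exaggerates" the T effect

    # least-squares linear supplement: delta(T) ~ a + b*T
    A = np.vstack([np.ones_like(temp), temp]).T
    a, b = np.linalg.lstsq(A, observed - modeled, rcond=None)[0]

    def corrected_prediction(model_value, T):
        return model_value + a + b * T

    print(corrected_prediction(0.50, 375.0))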

Dowding, Kevin J.; Rutherford, Brian Milne

2003-07-01

348

Receptor modeling approach to VOC emission inventory validation  

SciTech Connect

The chemical-mass-balance (CMB) receptor model is a method for determining specific-source contributions of volatile organic compounds (VOCs) to concentrations of nonmethane organic compounds (NMOCs) measured in the ambient air. Because the method is based on air measurements, it offers an independent check on emission inventories developed by more traditional permit, survey, emission factor, and source-test techniques. This paper reports on the application of the CMB model to speciated NMOC air-measurement data sets collected during the summers of 1984--88 in five US cities: Detroit; Chicago; Beaumont, Tex.; Atlanta; and Washington, D.C. Sources modeled were vehicle tailpipe emissions, fugitive gasoline-vapor emissions, architectural coating solvents, emissions from graphic arts, petroleum refineries, coke ovens, and polyethylene production. Comparisons of the CMB allocation of NMOC to emission inventory allocation of VOC for each city are discussed. Agreement with Environmental Protection Agency inventories for the five cities was generally very good for vehicles. Refinery inventory estimates are lower than CMB estimates by more than a factor of 10 in Chicago and Detroit. Trajectory analysis was used to validate coefficients for coke ovens.
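
At its core, the CMB calculation solves ambient speciated concentrations as a nonnegative combination of source profiles. A minimal sketch follows, with two illustrative profiles standing in for real speciated NMOC data.

    import numpy as np
    from scipy.optimize import nnls

    species = ["ethene", "acetylene", "n-butane", "toluene"]
    profiles = np.array([                     # mass fraction of each species
        [0.08, 0.05, 0.06, 0.10],             # vehicle exhaust (illustrative)
        [0.00, 0.00, 0.35, 0.02],             # fugitive gasoline vapor
    ]).T                                      # shape: species x sources
    ambient = np.array([6.2, 3.9, 9.8, 8.4])  # ppbC, illustrative

    contrib, resid = nnls(profiles, ambient)  # nonnegative least squares
    for name, s in zip(["vehicle", "gasoline vapor"], contrib):
        print(f"{name:15s} {s:7.1f} ppbC")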

Kenski, D.M.; Wadden, R.A.; Scheff, P.A. [Univ. of Illinois, Chicago, IL (United States); Lonneman, W.A. [Environmental Protection Agency, Research Triangle Park, NC (United States)

1995-07-01

349

Validation of fixed speed wind turbine dynamic models with measured data  

Microsoft Academic Search

Power system dynamics studies involving fixed-speed wind turbines normally use a wind turbine model consisting of two lumped masses, an elastic coupling and an induction generator model which neglects stator transients. However, validations of this model with measured data are rarely reported in the literature. This paper validates the model using a recorded case obtained in a fixed speed, stall

M. Martins; A. Perdana; P. Ledesma; E. Agneholm; O. Carlson

2007-01-01

350

Assessing the validity of multinomial models using extraneous variables: An application to prospective memory  

Microsoft Academic Search

The class of multinomial processing tree (MPT) models has been used extensively in cognitive psychology to model latent cognitive processes. Critical for the usefulness of an MPT model is its psychological validity. Generally, the validity of an MPT model is demonstrated by showing that its parameters are selectively and predictably affected by theoretically meaningful experimental manipulations. Another approach is to

Jan Rummel; C. Dennis Boywitt; Thorsten Meiser

2011-01-01

351

An evaluation of diagnostic tests and their roles in validating forest biometric models  

Microsoft Academic Search

Model validation is an important part of model development. It is performed to increase the credibility and gain sufficient confidence about a model. This paper evaluated the usefulness of 10 statistical tests, five parametric and five nonparametric, in validating forest biometric models. The five parametric tests are the paired t test, the χ² test, the separate t test, the simultaneous
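
A sketch of the first two named tests applied to model validation may be useful: a paired t test on prediction errors and a simple χ² test of standardized errors. The data are synthetic and the α = 0.05 convention is the usual one, not anything taken from the paper.

    import numpy as np
    from scipy import stats

    observed  = np.array([12.1, 15.4, 9.8, 20.3, 17.6, 14.2])   # e.g. volume, m³/ha
    predicted = np.array([11.5, 16.0, 10.4, 19.1, 18.2, 13.6])

    t_stat, t_p = stats.ttest_rel(observed, predicted)          # paired t test
    sigma = 1.0                                                 # assumed error SD
    chi2 = np.sum(((observed - predicted) / sigma) ** 2)        # χ² statistic
    chi2_p = stats.chi2.sf(chi2, df=len(observed))

    print(f"paired t: p = {t_p:.2f};  chi-square: p = {chi2_p:.2f}")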

Yuqing Yang; Robert A. Monserud; Shongming Huang

2004-01-01

352

Canyon building ventilation system dynamic model -- Parameters and validation  

SciTech Connect

Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical "models." These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

Moncrief, B.R. [Westinghouse Savannah River Co., Aiken, SC (United States); Chen, F.F.K. [Bechtel National, Inc., San Francisco, CA (United States)

1993-02-01

353

An open source lower limb model: Hip joint validation.  

PubMed

Musculoskeletal lower limb models have been shown to be able to predict hip contact forces (HCFs) that are comparable to in vivo measurements obtained from instrumented prostheses. However, the muscle recruitment predicted by these models does not necessarily compare well to measured electromyographic (EMG) signals. In order to verify if it is possible to accurately estimate HCFs from muscle force patterns consistent with EMG measurements, a lower limb model based on a published anatomical dataset (Klein Horsman et al., 2007. Clinical Biomechanics. 22, 239-247) has been implemented in the open source software OpenSim. A cycle-to-cycle hip joint validation was conducted against HCFs recorded during gait and stair climbing trials of four arthroplasty patients (Bergmann et al., 2001. Journal of Biomechanics. 34, 859-871). Hip joint muscle tensions were estimated by minimizing a polynomial function of the muscle forces. The resulting muscle activation patterns obtained by assessing multiple powers of the objective function were compared against EMG profiles from the literature. Calculated HCFs tended to increase monotonically in magnitude as the power of the objective function was raised; the best estimation obtained from muscle forces consistent with experimental EMG profiles was found when a quadratic objective function was minimized (average overestimation at experimental peak frame: 10.1% for walking, 7.8% for stair climbing). The lower limb model can produce appropriate balanced sets of muscle forces and joint contact forces that can be used in a range of applications requiring accurate quantification of both. The developed model is available at the website https://simtk.org/home/low_limb_london. PMID:21742331

Modenese, L; Phillips, A T M; Bull, A M J

2011-07-13

354

Comparison and validation of combined GRACE/GOCE models of the Earth's gravity field  

NASA Astrophysics Data System (ADS)

Accurate global models of the Earth's gravity field are needed in various applications: in geodesy - to facilitate the production of a unified global height system; in oceanography - as a source of information about the reference equipotential surface (geoid); in geophysics - to draw conclusions about the structure and composition of the Earth's interior, etc. A global and (nearly) homogeneous set of gravimetric measurements is being provided by the dedicated satellite mission Gravity Field and Steady-State Ocean Circulation Explorer (GOCE). In particular, Satellite Gravity Gradiometry (SGG) data acquired by this mission are characterized by an unprecedented accuracy/resolution: according to the mission objectives, they must ensure global geoid modeling with an accuracy of 1 - 2 cm at the spatial scale of 100 km (spherical harmonic degree 200). A number of new models of the Earth's gravity field have been compiled on the basis of GOCE data in the course of the last 1 - 2 years. The best of them also take into account the data from the satellite gravimetry mission Gravity Recovery And Climate Experiment (GRACE), which offers an unbeatable accuracy in the range of relatively low degrees. Such combined models contain state-of-the-art information about the Earth's gravity field up to degree 200 - 250. In the present study, we compare and validate such models, including GOCO02, EIGEN-6S, and a model compiled in-house. In addition, the EGM2008 model produced in the pre-GOCE era is considered as a reference. The validation is based on the ability of the models to: (i) predict GRACE K-Band Ranging (KBR) and GOCE SGG data (not used in the production of the models under consideration), and (ii) synthesize a mean dynamic topography model, which is compared with the CNES-CLS09 model derived from in situ oceanographic data. The results of the analysis demonstrate that the GOCE SGG data lead not only to significant improvements over continental areas with a poor coverage with terrestrial gravimetry measurements (such as Africa, Himalayas, and South America), but also to some improvements over well-studied continental areas (such as North America and Australia). Furthermore, we demonstrate a somewhat higher performance of the model produced in-house compared to the other combined GRACE/GOCE models. At the same time, it is found that the combined models show a relatively high level of noise in the oceanic areas compared to EGM2008. This implies that further efforts are needed in order to suppress high-frequency noise in the combined models in the optimal way.

Hashemi Farahani, H.; Ditmar, P.

2012-04-01

355

Nonparametric model validations for hidden Markov models with applications in financial econometrics.  

PubMed

We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601
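
A rough sketch of the envelope idea: build a kernel estimate of the one-step transition density at a grid of points, wrap it in a bootstrap band (a pointwise band here stands in for the paper's simultaneous envelope), and check whether the parametric density stays inside. Everything below, including the AR(1)-Gaussian example, is a toy stand-in for the paper's formal procedure.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = np.empty(2000); x[0] = 0.0
    for t in range(1999):                          # simulate AR(1) observations
        x[t + 1] = 0.6 * x[t] + rng.normal(scale=0.8)
    prev, nxt = x[:-1], x[1:]

    x0, grid, h = 0.5, np.linspace(-2.0, 2.0, 41), 0.25
    def trans_dens(p, n):
        """Kernel estimate of the density of x_{t+1} given x_t = x0."""
        w = stats.norm.pdf((p - x0) / h)
        return np.array([np.sum(w * stats.norm.pdf((n - g) / h)) / (h * np.sum(w))
                         for g in grid])

    est = trans_dens(prev, nxt)
    idx = rng.integers(0, len(prev), size=(200, len(prev)))   # pair bootstrap
    boots = np.array([trans_dens(prev[i], nxt[i]) for i in idx])
    lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
    parametric = stats.norm.pdf(grid, loc=0.6 * x0, scale=0.8)
    print("parametric transition density inside envelope:",
          bool(np.all((parametric >= lo) & (parametric <= hi))))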

Zhao, Zhibiao

2011-06-01

356

Narrowband VLF observations as validation of Plasmaspheric model  

NASA Astrophysics Data System (ADS)

PLASMON is a European Union FP7 project which will use observations of whistlers and field line resonances to construct a data assimilative model of the plasmasphere. This model will be validated by comparison with electron precipitation data derived from narrowband VLF observations of subionospheric propagation from the AARDDVARK network. A VLF receiver on Marion Island, located at 46.9° S 37.1° E (L = 2.60), is able to observe the powerful NWC transmitter in Australia over a 1.4 < L < 3.0 path which passes exclusively over the ocean. The signal is thus very strong and exhibits an excellent signal-to-noise ratio. Data from the UltraMSK narrowband VLF receiver on Marion Island are used to examine evidence of particle precipitation along this path, thereby inferring the rate at which electrons are scattered into the bounce loss cone. This path covers a small range of L-values so that there is little ambiguity in the source of any perturbations. Perturbations detected on the path during geomagnetic storms should predominantly be responses to energetic electron precipitation processes occurring inside the plasmasphere. Comparisons will be made to preliminary plasmaspheric results from the PLASMON project.

Collier, Andrew; Clilverd, Mark; Rodger, C. J.; Delport, Brett; Lichtenberger, János

2012-07-01

357

Validating the topographic climatology logic of the MTCLIM model  

SciTech Connect

The topographic climatology logic of the MTCLIM model was validated using a comparison of modeled air temperatures vs. remotely sensed, thermal infrared (TIR) surface temperatures from three Daedalus Thematic Mapper Simulator scenes. The TIR data was taken in 1990 near Sisters, Oregon, as part of the NASA OTTER project. The original air temperature calculation method was modified for the spatial context of this study. After stratifying by canopy closure and relative solar loading, r² values of 0.74, 0.89, and 0.97 were obtained for the March, June, and August scenes, respectively, using a modified air temperature algorithm. Consistently lower coefficients of determination were obtained using the original air temperature algorithm on the same data: r² values of 0.70, 0.52, and 0.66 for the March, June, and August samples, respectively. The difficulties of comparing screen height air temperatures with remotely sensed surface temperatures are discussed, and several ideas for follow-on studies are suggested.
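
The validation statistic here is the coefficient of determination computed per stratum, which is straightforward to sketch; the temperatures and strata below are synthetic placeholders, not the OTTER scene data.

    import numpy as np

    def r_squared(y, yhat):
        ss_res = np.sum((y - yhat) ** 2)
        ss_tot = np.sum((y - np.mean(y)) ** 2)
        return 1.0 - ss_res / ss_tot

    tir   = np.array([18.2, 21.4, 24.9, 28.1, 31.0, 34.5])   # °C, synthetic
    model = np.array([17.5, 21.9, 24.1, 28.8, 30.2, 35.1])
    strata = np.array(["closed", "closed", "closed", "open", "open", "open"])

    for s in ("closed", "open"):                              # canopy-closure strata
        m = strata == s
        print(s, round(r_squared(tir[m], model[m]), 2))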

Glassy, J.M.; Running, S.W. [Univ. of Montana, Missoula, MT (United States)

1995-06-01

358

Bidirectional reflectance function in coastal waters: modeling and validation  

NASA Astrophysics Data System (ADS)

The current operational algorithm for the correction of bidirectional effects from the satellite ocean color data is optimized for typical oceanic waters. However, versions of bidirectional reflectance correction algorithms, specifically tuned for typical coastal waters and other case 2 conditions, are particularly needed to improve the overall quality of those data. In order to analyze the bidirectional reflectance distribution function (BRDF) of case 2 waters, a dataset of typical remote sensing reflectances was generated through radiative transfer simulations for a large range of viewing and illumination geometries. Based on this simulated dataset, a case 2 water focused remote sensing reflectance model is proposed to correct above-water and satellite water leaving radiance data for bidirectional effects. The proposed model is first validated with a one year time series of in situ above-water measurements acquired by collocated multi- and hyperspectral radiometers which have different viewing geometries installed at the Long Island Sound Coastal Observatory (LISCO). Match-ups and intercomparisons performed on these concurrent measurements show that the proposed algorithm outperforms the algorithm currently in use at all wavelengths.

Gilerson, Alex; Hlaing, Soe; Harmel, Tristan; Tonizzo, Alberto; Arnone, Robert; Weidemann, Alan; Ahmed, Samir

2011-10-01

359

Type-II TS fuzzy model-based predictive control  

Microsoft Academic Search

The Type-II fuzzy model is useful for handling the influence of uncertainties. This paper presents an algorithm for Type-II T-S fuzzy (T2TSF) modeling based on data clustering and two approaches to design T2TSF model-based predictive controllers. As the T2TSF model is an extension of the T1TSF (Type-I T-S fuzzy) model, the T2TSF modeling algorithm divides the input-output data set into several Type-I

Qianfang Liao; Ning Li; Shaoyuan Li

2009-01-01

360

Experimental validation of 2D profile photoresist shrinkage model  

NASA Astrophysics Data System (ADS)

For many years, lithographic resolution has been the main obstacle to sustaining the pace of transistor densification demanded by Moore's Law. For the 32 nm node and beyond, new lithography techniques will be used, including immersion ArF (iArF) lithography and extreme ultraviolet lithography (EUVL). As in the past, these techniques will use new types of photoresists with the capability to print smaller feature widths and pitches. These smaller feature sizes will also require the use of thinner layers of photoresists, often under 100 nm. In previous papers, we focused on ArF and iArF photoresist shrinkage. We evaluated the magnitude of shrinkage for both R&D and mature resists as a function of chemical formulation, lithographic sensitivity, scanning electron microscope (SEM) beam condition, and feature size. Shrinkage results were determined by the well-accepted methodology described in SEMATECH's CD-SEM Unified Specification. In other associated works, we first developed a 1-D model for resist shrinkage for the bottom linewidth and then a 2-D profile model that accounted for shrinkage of all aspects of a trapezoidal profile along a given linescan. A fundamental understanding of the phenomenology of the shrinkage trends was achieved, including how the shrinkage behaves differently for different sized and shaped features. In the 1-D case, calibration of the parameters to describe the photoresist material and the electron beam was all that was required to fit the models to real shrinkage data, as long as the photoresist was thick enough that the beam could not penetrate the entire layer of resist. The later 2-D model included improvements for solving the CD shrinkage in thin photoresists, which is now of great interest for upcoming realistic lithographic processing to explore the change in resist profile with electron dose and to predict the influence of initial resist profile on shrinkage characteristics. The 2-D model also included shrinkage due to both the primary electron beam directly impacting the profile and backscattered electrons from the electron beam impacting the surrounding substrate. This dose from backscattering was shown to be an important component in the resist shrinkage process, such that at lower beam energies, it dominates linewidth shrinkage. In this work, results from a previous paper will be further explored with numerically simulated results and compared to experimental results to validate the model. With these findings, we can demonstrate the state of readiness of these models for predicting the shrinkage characteristics of photoresist measurements and estimating the errors in calculating the original CD from the shrinkage trend.
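
A common first-order treatment of the shrinkage trend, fitting repeated CD-SEM measurements to an exponential decay and extrapolating back to dose zero, can be sketched as follows. The measurement series and the single-exponential form are illustrative, not the paper's calibrated 2-D model.

    import numpy as np
    from scipy.optimize import curve_fit

    n  = np.arange(1, 9)                              # measurement (dose) number
    cd = np.array([48.1, 46.9, 46.1, 45.6, 45.2, 45.0, 44.8, 44.7])  # nm

    def shrink(n, cd_inf, amp, k):
        """First-order shrinkage curve: CD(n) = CD_inf + amp * exp(-k*n)."""
        return cd_inf + amp * np.exp(-k * n)

    (cd_inf, amp, k), _ = curve_fit(shrink, n, cd, p0=(44.0, 5.0, 0.5))
    print(f"estimated original CD ~ {cd_inf + amp:.1f} nm")  # extrapolate to n = 0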

Bunday, Benjamin; Cordes, Aaron; Self, Andy; Ferry, Lorena; Danilevsky, Alex

2011-03-01

361

Idle Resource Supplement Model and Validity Time Designation Model with Reliability Measurement in Grid Computing  

NASA Astrophysics Data System (ADS)

Grid computing provides high performance like a supercomputer through sharing and using distributed heterogeneous computing resources. Grid computing processing time and cost vary widely, since a grid user or grid middleware can select from a variety of distributed heterogeneous resources. Grid computing therefore needs a resource management method and model. In this paper, we propose two types of resource management model based on resource reliability. The first is the idle resource supplement model, which adds idle resources when existing resources cannot process jobs. The second is the validity time designation model, which considers grid users: jobs are processed during a validity time decided by the grid user. This paper evaluates system performance in terms of utilization, job-loss rate, and average turn-around time, and compares the experimental results of our models with those of existing models such as a random model and a round-robin model. The experimental results demonstrate that the two models based on resource reliability measurement improve resource utilization and provide reliable job processing. We expect that our proposed models will improve grid computing QoS.
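
A toy sketch contrasting round-robin dispatch with the reliability-weighted selection that the proposed models build on may make the job-loss comparison concrete; the failure probabilities and job stream are invented for illustration.

    import random

    random.seed(1)
    reliability = [0.99, 0.95, 0.80, 0.60]            # per-resource success probability

    def run(jobs, pick):
        lost = 0
        for j in range(jobs):
            r = pick(j)
            if random.random() > reliability[r]:
                lost += 1                              # job fails on an unreliable node
        return lost / jobs

    rr   = run(10_000, lambda j: j % len(reliability))                 # round-robin
    best = run(10_000, lambda j: max(range(len(reliability)),
                                     key=lambda r: reliability[r]))    # reliability-based
    print(f"round-robin loss {rr:.1%}  vs reliability-based loss {best:.1%}")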

Park, Da Hye; Jang, Sung Ho; Noh, Chang Hyeon; Lee, Jong Sik

362

Validation study of the Ionosphere Forecast Model using the TOPEX total electron content measurements  

Microsoft Academic Search

As a part of the validation program in the Utah State University Global Assimilation of Ionospheric Measurement (GAIM) project, a newly improved Ionosphere Forecast Model (IFM) was systematically validated by using a large database of TOPEX total electron content (TEC) measurements. The TOPEX data used for the validation are for the period from August 1992 to March 2003, and the

L. Zhu; G. Jee; L. Scherliess; J. J. Sojka; D. C. Thompson

2006-01-01

363

Verification & validation of an agent-based forest fire simulation model  

Microsoft Academic Search

In this paper, we present the verification and validation of an agent-based model of forest fires. We use a combination of a Virtual Overlay Multi-Agent System (VOMAS) validation scheme with the Fire Weather Index (FWI) to validate the forest fire simulation. FWI is based on decades of real forest fire data and is now regarded as a standard index for fire

Muaz A. Niazi; Qasim Siddique; Amir Hussain; Mario Kolberg

2010-01-01

364

Alaska North Slope Tundra Travel Model and Validation Study  

SciTech Connect

The Alaska Department of Natural Resources (DNR), Division of Mining, Land, and Water manages cross-country travel, typically associated with hydrocarbon exploration and development, on Alaska's arctic North Slope. This project is intended to provide natural resource managers with objective, quantitative data to assist decision making regarding opening of the tundra to cross-country travel. DNR designed standardized, controlled field trials, with baseline data, to investigate the relationships present between winter exploration vehicle treatments and the independent variables of ground hardness, snow depth, and snow slab thickness, as they relate to the dependent variables of active layer depth, soil moisture, and photosynthetically active radiation (a proxy for plant disturbance). Changes in the dependent variables were used as indicators of tundra disturbance. Two main tundra community types were studied: Coastal Plain (wet graminoid/moist sedge shrub) and Foothills (tussock). DNR constructed four models to address physical soil properties: two models for each main community type, one predicting change in depth of active layer and a second predicting change in soil moisture. DNR also investigated the limited potential management utility in using soil temperature, the amount of photosynthetically active radiation (PAR) absorbed by plants, and changes in microphotography as tools for the identification of disturbance in the field. DNR operated under the assumption that changes in the abiotic factors of active layer depth and soil moisture drive alteration in tundra vegetation structure and composition. Statistically significant differences in depth of active layer, soil moisture at a 15 cm depth, soil temperature at a 15 cm depth, and the absorption of photosynthetically active radiation were found among treatment cells and among treatment types. The models were unable to thoroughly investigate the interacting role between snow depth and disturbance due to a lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. 
This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

Harry R. Bader; Jacynthe Guimond

2006-03-01

365

Validation of qualitative models of genetic regulatory networks by model checking: analysis of the nutritional stress response in Escherichia coli  

Microsoft Academic Search

Motivation: The modeling and simulation of genetic regulatory networks have created the need for tools for model validation. The main challenges of model validation are the achievement of a match between the precision of model predictions and experimental data, as well as the efficient and reliable comparison of the predictions and observations. Results: We present an approach towards

Grégory Batt; Delphine Ropers; Hidde De Jong; Johannes Geiselmann; Radu Mateescu; Michel Page; Dominique Schneider

2005-01-01

366

Comparison of HYDRA-II predictions to temperature data from consolidated and unconsolidated model spent fuel assemblies  

SciTech Connect

Using the HYDRA-II computer code, Pacific Northwest Laboratory (PNL) researchers analyzed the thermal performance of two model spent fuel assemblies. The numerical simulations were based on information from offsite laboratory tests conducted previously on an unconsolidated and a consolidated rod assembly. The objectives of the PNL effort were to examine the thermal characteristics of the consolidated rod assembly and to validate the predictive capability of the HYDRA-II code for application to such analyses. When compared to the physical test data, the predictions generated by HYDRA-II were in excellent agreement for all temperature comparisons. These analyses provided further validation of HYDRA-II's capability to accurately predict thermal performance of spent fuel storage system components. Results obtained for the consolidated rod assembly lend strong support to the value of further investigations of this option for dry storage of spent fuel. 14 refs., 10 figs., 7 tabs.

McCann, R.A.

1988-09-01

367

Criteria of validity in experimental psychopathology: application to models of anxiety and depression.  

PubMed

The modeling of abnormal behavior in 'normal' subjects (often animals) has a long history in pharmacological research for the screening of novel drug compounds. Systematic criteria have been outlined in that literature to estimate the external validity of a model, that is to estimate how closely the model is linked to the disorder of interest. Experimental psychopathology (EPP) also uses behavioral models to study the psychological processes that underlie abnormal behavior. Although EPP researchers may occasionally feel uneasy about the validity of the model that they use, the issue has not received direct attention in this literature. Here, we review the criteria of validity as set out in pharmacology research (face, predictive and construct validity) and discuss their relevance for EPP research. Furthermore, we propose diagnostic validity as an additional criterion of external validity that is relevant to EPP research. We evaluate two models for the study of anxiety and depression, and show that they have good face, diagnostic and construct validity. However, EPP research generally lacks direct tests of predictive validity. We conclude that combined evaluations of predictive, diagnostic and construct validity provide a sound basis to infer the external validity of behavioral models in EPP research. PMID:23146308

Vervliet, B; Raes, F

2012-11-12

368

A methodology for cost-risk analysis in the statistical validation of simulation models  

Microsoft Academic Search

A methodology is presented for constructing the relationships among model user's risk, model builder's risk, acceptable validity range, sample sizes, and cost of data collection when statistical hypothesis testing is used for validating a simulation model of a real, observable system. The use of the methodology is illustrated for the use of Hotelling's two-sample T² test in testing the
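
Hotelling's two-sample T² test compares the mean output vectors of the real system and the simulation on several responses at once. A minimal sketch with synthetic data and the standard F-approximation:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    sys_out = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.3], [0.3, 0.5]], 30)
    sim_out = rng.multivariate_normal([10.4, 5.1], [[1.0, 0.3], [0.3, 0.5]], 30)

    n1, n2, p = len(sys_out), len(sim_out), sys_out.shape[1]
    d = sys_out.mean(0) - sim_out.mean(0)
    S = (((n1 - 1) * np.cov(sys_out.T) + (n2 - 1) * np.cov(sim_out.T))
         / (n1 + n2 - 2))                               # pooled covariance
    T2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    F = T2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))    # F-approximation
    p_val = stats.f.sf(F, p, n1 + n2 - p - 1)
    print(f"T² = {T2:.2f}, p = {p_val:.3f}  (reject validity if p < alpha)")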

Osman Balci; Robert G. Sargent

1981-01-01

369

Experiments for calibration and validation of plasticity and failure material modeling: 304L stainless steel.  

SciTech Connect

Experimental data for material plasticity and failure model calibration and validation were obtained from 304L stainless steel. Model calibration data were taken from smooth tension, notched tension, and compression tests. Model validation data were provided from experiments using thin-walled tube specimens subjected to path dependent combinations of internal pressure, extension, and torsion.

Lee, Kenneth L.; Korellis, John S.; McFadden, Sam X.

2006-01-01

370

Validation of an economic model of paliperidone palmitate for chronic schizophrenia.  

PubMed

Objective: Model validation is important, but seldom applied in chronic schizophrenia. Validation consists of verifying the model itself for face validity (i.e., structure and inputs), cross-validation with other models assessing the same issue, and comparison with real-life outcomes. The primary purpose was to cross-validate a recent pharmacoeconomic model comparing long-acting injectable (LAI) antipsychotics for treating chronic schizophrenia in Sweden. The secondary purpose was to provide external validation. Methods: The model of interest was a decision tree analysis with a 1-year time horizon, with costs in 2011 Swedish kronor. Drugs analyzed included paliperidone palmitate (PP-LAI), olanzapine pamoate (OLZ-LAI), risperidone (RIS-LAI), haloperidol (HAL-LAI), and oral olanzapine (oral-OLZ). Embase and Medline were searched from 1990-2012 for models examining LAIs. Articles were retrieved, with data extracted for all drugs compared, including: expected costs, rates of hospitalization, proportion of time not in relapse, and associated QALYs. Outcomes from the model of interest were compared with those from other articles; costs were projected to 2012 using the consumer price index. Results: Twenty-six studies were used for validation; 14 of them provided evidence for cross-validation, 13 for external validation, and four for cost. In cross-validation, cost estimates varied by -1.8% (range: -12.4% to 20.1%), hospitalizations by 5.2% (-12.1% to 3.1%), stable disease by 2.5% (-5.6% to 1.5%), and QALYs by 9.0% (4.3% after removing outliers). All estimates of clinical outcomes were within 15%. In external validation, hospitalization rates varied by 6.3% (-0.7% to 11.3%). The research was limited by data availability and the validity of the original results. Conclusion: Other models validated the outputs of our model very well. PMID:24003857

Einarson, Thomas R; Hemels, Michiel E H

2013-09-13

371

Traveling with Cognitive Tests: Testing the Validity of a KABC-II Adaptation in India  

ERIC Educational Resources Information Center

The authors evaluated the adequacy of an extensive adaptation of the American Kaufman Assessment Battery for Children, second edition (KABC-II), for 6- to 10-year-old Kannada-speaking children of low socioeconomic status in Bangalore, South India. The adapted KABC-II was administered to 598 children. Subtests showed high reliabilities, the…

Malda, Maike; van de Vijver, Fons J. R.; Srinivasan, Krishnamachari; Transler, Catherine; Sukumar, Prathima

2010-01-01

372

Image quality assessment in digital mammography: part II. NPWE as a validated alternative for contrast detail analysis.  

PubMed

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to threshold calculated detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography. PMID:21701050
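For context, the detectability index of the NPWE model observer is conventionally computed as follows (a standard form from the model-observer literature, with symbols as usually defined; it is not quoted verbatim from this paper):

$$ d'_{\mathrm{NPWE}} \;=\; \frac{\iint |S(u,v)|^{2}\,\mathrm{MTF}^{2}(u,v)\,E^{2}(u,v)\,\mathrm{d}u\,\mathrm{d}v}{\left[\iint |S(u,v)|^{2}\,\mathrm{MTF}^{2}(u,v)\,E^{4}(u,v)\,\mathrm{NNPS}(u,v)\,\mathrm{d}u\,\mathrm{d}v\right]^{1/2}} $$

where S(u,v) is the contrast-scaled task function, MTF the pre-sampling modulation transfer function, E the eye filter, and NNPS the normalized noise power spectrum, matching the three measured ingredients listed in the abstract.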

Monnin, P; Marshall, N W; Bosmans, H; Bochud, F O; Verdun, F R

2011-06-23

373

Modeling the Arm II core in MicroCap IV  

SciTech Connect

This paper reports how an electrical model for the core of the Arm II machine was created and how to use this model. We wanted a model of the electrical characteristics of the Arm II core in order to simulate this machine and to assist in the design of a future machine. The model had to be able to simulate saturation, variable loss, and reset. Using the Hodgdon model and the circuit analysis program MicroCap IV, this was accomplished. This paper is written in such a way as to allow someone not familiar with the project to understand it.

Dalton, A.C.

1996-11-01

374

A Validation Model for Measurement of Acetabular Component Position  

Microsoft Academic Search

There is no agreement on a standard approach to evaluating acetabular cup orientation, ideal target orientation, or a standardized measurement method for cup orientation in total hip arthroplasty. The purpose of this study was to investigate a simple method for validating measurements of acetabular orientation obtained using computer navigation and computed tomography scans. This study validated the imageless navigation system

Aamer Malik; Zhinian Wan; Branislav Jaramaz; Gary Bowman; Lawrence D. Dorr

2010-01-01

375

A VALIDATION SUITE FOR FUEL-FIRED FURNACE MODELS  

Microsoft Academic Search

Validation is the key when attempting to instill confidence in a building simulation tool. The user expects that the underlying algorithms are correct, and will have more confidence in the simulation results generated using a program that has undergone validation testing. The IEA BESTEST (Judkoff and Neymark 1995) was developed by the International Energy Agency Solar Heating

Julia Purdy; Ian Beausoleil-Morrison

376

Modeling and validating chronic pharmacological manipulation of circadian rhythms.  

PubMed

Circadian rhythms can be entrained by a light-dark (LD) cycle and can also be reset pharmacologically, for example, by the CK1δ/ε inhibitor PF-670462. Here, we determine how these two independent signals affect circadian timekeeping from the molecular to the behavioral level. By developing a systems pharmacology model, we predict and experimentally validate that chronic CK1δ/ε inhibition during the earlier hours of a LD cycle can produce a constant stable delay of rhythm. However, chronic dosing later during the day, or in the presence of longer light intervals, is not predicted to yield an entrained rhythm. We also propose a simple method based on phase response curves (PRCs) that predicts the effects of a LD cycle and chronic dosing of a circadian drug. This work indicates that dosing timing and environmental signals must be carefully considered for accurate pharmacological manipulation of circadian phase. CPT: Pharmacometrics & Systems Pharmacology (2013) 2, e57; doi:10.1038/psp.2013.34; published online 17 July 2013. PMID:23863866
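The PRC-based prediction can be illustrated with a toy daily phase map; this is our construction under an assumed sinusoidal PRC shape and hypothetical parameter values, not the authors' published model.

```python
# Toy phase-map sketch: each day the clock phase shifts by the mismatch
# between the LD cycle (24 h) and the free-running period tau, plus the
# PRC evaluated at the (hypothetical) dosing phase. A fixed point of the
# map corresponds to a stably entrained, constantly delayed rhythm.
import numpy as np

def prc(phase_h, amp=1.5):
    """Hypothetical sinusoidal PRC: shift in hours vs. circadian phase."""
    return amp * np.sin(2.0 * np.pi * phase_h / 24.0)

def iterate_phase(phase0=0.0, tau=23.7, dose_time=2.0, days=60):
    phases = []
    phase = phase0
    for _ in range(days):
        phase = (phase + (24.0 - tau) + prc(phase + dose_time)) % 24.0
        phases.append(phase)
    return phases  # convergence indicates entrainment; wandering does not
```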

Kim, J K; Forger, D B; Marconi, M; Wood, D; Doran, A; Wager, T; Chang, C; Walton, K M

2013-07-17

377

Validation and Application of Concentrated Cesium Eluate Physical Property Models  

SciTech Connect

This work had two objectives: to verify the mathematical equations developed for the physical properties of concentrated cesium eluate solutions against experimental test results obtained with simulated feeds, and to estimate the physical properties of the radioactive AW-101 cesium eluate at saturation using the validated models. The River Protection Project (RPP) Hanford Waste Treatment and Immobilization Plant (WTP) is currently being built to extract radioisotopes from the vast inventory of Hanford tank wastes and immobilize them in a silicate glass matrix for eventual disposal at a geological repository. The baseline flowsheet for the pretreatment of supernatant liquid wastes includes removal of cesium using regenerative ion-exchange resins. The loaded cesium ion-exchange columns will be eluted with nitric acid nominally at 0.5 molar, and the resulting eluate solution will be concentrated in a forced-convection evaporator to reduce the storage volume and to recover the acid for reuse. The reboiler pot is initially charged with a concentrated nitric acid solution and kept under a controlled vacuum during feeding so that the pot contents boil at 50 degrees Celsius. The liquid level in the pot is maintained constant by controlling both the feed and boilup rates. The feeding will continue with no bottom removal until the solution in the pot reaches the target endpoint of 80 percent saturation with respect to any one of the major salt species present.

Choi, A.S.

2004-03-18

378

REACTIVE PLUME MODEL--RPM-II: USER'S GUIDE  

EPA Science Inventory

The Reactive Plume Model (RPM-II) is a computerized model used primarily for estimating short-term concentrations of primary and secondary pollutants resulting from point-source emissions. Two main features of the model are (1) its chemical kinetic mechanism, which explicitly sol...

379

Uncertainty forecast from 3-D super-ensemble multi-model combination: validation and calibration  

NASA Astrophysics Data System (ADS)

Measurements collected during the Recognized Environmental Picture 2010 experiment (REP10) in the Ligurian Sea are used to evaluate 3-D super-ensemble (3DSE) 72-hour temperature predictions and their associated uncertainty. The 3DSE reduces the total root-mean-square difference by 12% and 32%, respectively, relative to the ensemble mean and the most accurate of the models when compared against regularly distributed surface temperature data. When validated against irregularly distributed in situ observations, the 3DSE, the ensemble mean and the most accurate model lead to similar scores. The 3DSE temperature uncertainty estimate is obtained from the product of the a posteriori model weight error covariances and an operator containing the model forecast values. This uncertainty prediction is evaluated using a criterion based on the 2.5th and 97.5th percentiles of the error distribution. The 3DSE error is found to be on average underestimated during the forecast period, reflecting (i) the influence of ocean dynamics and (ii) inaccuracies in the a priori weight error correlations. A calibration of the theoretical 3DSE uncertainty is proposed for the REP10 scenario, based on a time-evolving amplification coefficient applied to the a posteriori weight error covariance matrix. This calibration allows the end-user to be confident that, on average, the true ocean state lies in the -2/+2 3DSE uncertainty range in 95% of the cases.
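The percentile-based evaluation and the amplification-coefficient calibration can be sketched as follows; the Gaussian-interval form and all names are our assumptions, not the authors' code.

```python
# Sketch: check how often the truth falls inside the predicted 95% interval,
# then inflate the predicted spread until nominal coverage is reached.
import numpy as np

def coverage(truth, forecast, sigma, k=1.96):
    """Fraction of cases with |truth - forecast| <= k*sigma (Gaussian assumption)."""
    return np.mean(np.abs(truth - forecast) <= k * sigma)

def calibrate_amplification(truth, forecast, sigma, target=0.95):
    """Smallest amplification a such that coverage(a * sigma) >= target."""
    for a in np.arange(1.0, 5.0, 0.01):
        if coverage(truth, forecast, a * sigma) >= target:
            return a
    return float('nan')  # spread too small even when amplified five-fold
```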

Mourre, Baptiste; Chiggiato, Jacopo; Lenartz, Fabian; Rixen, Michel

2012-02-01

380

Assessing the validity of multinomial models using extraneous variables: an application to prospective memory.  

PubMed

The class of multinomial processing tree (MPT) models has been used extensively in cognitive psychology to model latent cognitive processes. Critical for the usefulness of an MPT model is its psychological validity. Generally, the validity of an MPT model is demonstrated by showing that its parameters are selectively and predictably affected by theoretically meaningful experimental manipulations. Another approach is to test the convergent validity of the model parameters and other extraneous measures intended to measure the same cognitive processes. Here, we advance the concept of construct validity (Cronbach & Meehl, 1955) as a criterion for model validity in MPT modeling and show how this approach can be fruitfully utilized using the example of an MPT model of event-based prospective memory. For that purpose, we investigated the convergent validity of the model parameters and established extraneous measures of prospective memory processes over a range of experimental settings, and we found a lack of convergent validity between the two indices. On a conceptual level, these results illustrate the importance of testing convergent validity. Additionally, they have implications for prospective memory research, because they demonstrate that the MPT model of event-based prospective memory is not able to differentiate between different processes contributing to prospective memory performance. PMID:21736435

Rummel, Jan; Boywitt, C Dennis; Meiser, Thorsten

2011-07-07

381

Research program to develop and validate conceptual models for flow and transport through unsaturated, fractured rock  

SciTech Connect

As part of the Yucca Mountain Project, our research program to develop and validate conceptual models for flow and transport through unsaturated fractured rock integrates fundamental physical experimentation with conceptual model formulation and mathematical modeling. Our research is directed toward developing and validating macroscopic, continuum-based models and supporting effective property models because of their widespread utility within the context of this project. Success relative to the development and validation of effective property models is predicated on a firm understanding of the basic physics governing flow through fractured media, specifically in the areas of unsaturated flow and transport in a single fracture and fracture-matrix interaction. 43 refs.

Glass, R.J.; Tidwell, V.C.

1991-01-01

382

DISCRETE EVENT MODELING IN PTOLEMY II  

Microsoft Academic Search

This report describes the discrete-event semantics and its implementation in the Ptolemy II software architecture. The discrete-event system representation is appropriate for time-oriented systems such as queueing systems, communication networks, and hardware systems. A key strength of our discrete-event implementation is that simultaneous events are handled systematically and deterministically. A formal and rigorous treatment of this property is given. One
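The deterministic treatment of simultaneous events can be illustrated with a toy event queue keyed by superdense time, i.e., a (model time, microstep) pair; this Python sketch is our illustration of the idea, not Ptolemy II code (which is written in Java).

```python
# Toy discrete-event queue: events at the same model time are ordered by
# microstep, and remaining ties by insertion order, so execution is
# reproducible run after run.
import heapq
import itertools

class EventQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # stable tie-breaker

    def __len__(self):
        return len(self._heap)

    def schedule(self, time, microstep, payload):
        heapq.heappush(self._heap,
                       (time, microstep, next(self._counter), payload))

    def pop(self):
        time, microstep, _, payload = heapq.heappop(self._heap)
        return time, microstep, payload

q = EventQueue()
q.schedule(1.0, 0, "actor A fires")
q.schedule(1.0, 1, "actor B reacts to A's output")  # same time, later microstep
q.schedule(0.5, 0, "earlier event")
while len(q):
    print(q.pop())  # deterministic (time, microstep) order
```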

Lukito Muliadi

383

Land-cover change model validation by an ROC method for the Ipswich watershed, Massachusetts, USA  

Microsoft Academic Search

Scientists need a better and larger set of tools to validate land-use change models, because it is essential to know a model’s prediction accuracy. This paper describes how to use the relative operating characteristic (ROC) as a quantitative measurement to validate a land-cover change model. Typically, a crucial component of a spatially explicit simulation model of land-cover change is a

R. Gil Pontius Jr; Laura C. Schneider

2001-01-01

384

Internal and External Validation of Spatial Microsimulation Models: Small Area Estimates of Adult Obesity  

Microsoft Academic Search

Spatial microsimulation models can be used to estimate previously unknown data at the micro-level, although validation of these models can be challenging. This paper seeks to describe an approach to validation of these models. Obesity data in adults were estimated at the small area level using a static, deterministic, spatial microsimulation model called SimObesity. This model utilised both Census 2001

Kimberley L. Edwards; Graham P. Clarke; James Thomas; David Forman

385

The ability model of emotional intelligence: Searching for valid measures  

Microsoft Academic Search

Current measures of ability emotional intelligence (EI) – in particular the well-known Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) – suffer from several limitations, including low discriminant validity and questionable construct and incremental validity. We show that the MSCEIT is largely predicted by personality dimensions, general intelligence, and demographics having multiple R’s with the MSCEIT branches up to .66; for the general

Marina Fiori; John Antonakis

2011-01-01

386

Towards an artificial model for Photosystem II: a manganese(II,II) dimer covalently linked to ruthenium(II) tris-bipyridine via a tyrosine derivative.  

PubMed

In order to model the individual electron transfer steps from the manganese cluster to the photooxidized sensitizer P680+ in Photosystem II (PS II) in green plants, the supramolecular complex 4 has been synthesized. In this complex, a ruthenium(II) tris-bipyridine type photosensitizer has been linked to a manganese(II) dimer via a substituted L-tyrosine, which bridges the manganese ions. The trinuclear complex 4 was characterized by electron paramagnetic resonance (EPR) and electrospray ionization mass spectrometry (ESI-MS). The excited state lifetime of the ruthenium tris-bipyridine moiety in 4 was found to be about 110 ns in acetonitrile. Using flash photolysis in the presence of an electron acceptor (methylviologen), it was demonstrated that in the supramolecular complex 4 an electron was transferred from the excited state of the ruthenium tris-bipyridine moiety to methylviologen, forming a methylviologen radical and a ruthenium(III) tris-bipyridine moiety. Next, the Ru(III) species retrieved the electron from the manganese(II/II) dimer in an intramolecular electron transfer reaction with a rate constant k_ET > 1.0 × 10^7 s^-1, generating a manganese(II/III) oxidation state and regenerating the ruthenium(II) photosensitizer. This is the first example of intramolecular electron transfer in a supramolecular complex, in which a manganese dimer is covalently linked to a photosensitizer via a tyrosine unit, in a process which mimics the electron transfer on the donor side of PS II. PMID:10714701

Sun, L; Raymond, M K; Magnuson, A; LeGourriérec, D; Tamm, M; Abrahamsson, M; Kenéz, P H; Mårtensson, J; Stenhagen, G; Hammarström, L; Styring, S; Akermark, B

2000-01-15

387

Model validation for robust control of uncertain systems with an integral quadratic constraint  

Microsoft Academic Search

This paper presents a new approach to the model validation problem for a class of uncertain systems in which the uncertainty is described by an integral quadratic constraint. The proposed model validation algorithm is based on the solution to a game-type Riccati differential equation and a set of state equations closely related to a robust Kalman filtering problem.

Andrey V. Savkin; Ian R. Petersen

1996-01-01

388

Servant Leadership Behaviour Scale: A hierarchical model and test of construct validity  

Microsoft Academic Search

Servant leadership is widely believed to be a multidimensional construct. However, existing measures of servant leadership typically suffer from highly correlated dimensions, raising concerns over discriminant validity. We set out in this study to examine the dimensionality of the hypothesized six-factor Servant Leadership Behaviour Scale (SLBS) and validate a hierarchical model of servant leadership. Using structural equation modelling, convergent and

Sen Sendjayar; Brian Cooper

2011-01-01

389

Preliminary results from the EPRI Plume Model Validation Project: plains site. Interim report  

Microsoft Academic Search

The EPRI Plume Model Validation Project (PMV) is a continuing effort designed to provide the data bases and analyses for rigorous operational and scientific validation of plume models. In this project the behavior and fate of buoyant plumes emitted from tall stacks are the foci of attention. To date, a network of 200 tracer samplers, 30 air quality samplers, two

N. E. Bowne; R. J. Londergan; D. H. Minott; D. R. Murray

1981-01-01

390

Preliminary results from the EPRI Plume Model Validation Project: plains site  

Microsoft Academic Search

The EPRI Plume Model Validation project (PMV) is a continuing effort designed to provide the data bases and analyses for rigorous operational and scientific validation of plume models. In this project the behavior and fate of buoyant plumes emitted from tall stacks are the foci of attention. To date, a network of 200 tracer samplers, 30 air quality samplers, two

Bowne

1981-01-01

391

MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom  

NASA Astrophysics Data System (ADS)

The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation to perform Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard ISS and especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

392

Simplified Risk Model Version II (SRM-II) Structure and Application  

SciTech Connect

The Simplified Risk Model Version II (SRM-II) is a quantitative tool for efficiently evaluating the risk from Department of Energy waste management activities. Risks evaluated include human safety and health and environmental impact. Both accidents and normal, incident-free operation are considered. The risk models are simplifications of more detailed risk analyses, such as those found in environmental impact statements, safety analysis reports, and performance assessments. However, wherever possible, conservatisms in such models have been removed to obtain best estimate results. The SRM-II is used to support DOE complex-wide environmental management integration studies. Typically such studies involve risk predictions including such activities as initial storage, handling, treatment, interim storage, transportation, and final disposal.

Eide, Steven Arvid; Wierman, Thomas Edward

1999-08-01

393

Simplified Risk Model Version II (SRM-II) Structure and Application  

SciTech Connect

The Simplified Risk Model Version II (SRM-II) is a quantitative tool for efficiently evaluating the risk from Department of Energy waste management activities. Risks evaluated include human safety and health and environmental impact. Both accidents and normal, incident-free operation are considered. The risk models are simplifications of more detailed risk analyses, such as those found in environmental impact statements, safety analysis reports, and performance assessments. However, wherever possible, conservatisms in such models have been removed to obtain best estimate results. The SRM-II is used to support DOE complex-wide environmental management integration studies. Typically such studies involve risk predictions covering the entire waste management program, including such activities as initial storage, handling, treatment, interim storage, transportation, and final disposal.

S. A. Eide; T. E. Wierman

1999-08-01

394

Assessment of the Validity of the Double Superhelix Model for Reconstituted High Density Lipoproteins  

PubMed Central

For several decades, the standard model for high density lipoprotein (HDL) particles reconstituted from apolipoprotein A-I (apoA-I) and phospholipid (apoA-I/HDL) has been a discoidal particle ~100 Å in diameter and the thickness of a phospholipid bilayer. Recently, Wu et al. (Wu, Z., Gogonea, V., Lee, X., Wagner, M. A., Li, X. M., Huang, Y., Undurti, A., May, R. P., Haertlein, M., Moulin, M., Gutsche, I., Zaccai, G., Didonato, J. A., and Hazen, S. L. (2009) J. Biol. Chem. 284, 36605–36619) used small angle neutron scattering to develop a new model they termed double superhelix (DSH) apoA-I that is dramatically different from the standard model. Their model possesses an open helical shape that wraps around a prolate ellipsoidal type I hexagonal lyotropic liquid crystalline phase. Here, we used three independent approaches, molecular dynamics, EM tomography, and fluorescence resonance energy transfer spectroscopy (FRET), to assess the validity of the DSH model. (i) By using molecular dynamics, two different approaches, all-atom simulated annealing and coarse-grained simulation, show that initial ellipsoidal DSH particles rapidly collapse to discoidal bilayer structures. These results suggest that, compatible with current knowledge of lipid phase diagrams, apoA-I cannot stabilize hexagonal I phase particles of phospholipid. (ii) By using EM, two different approaches, negative stain and cryo-EM tomography, show that reconstituted apoA-I/HDL particles are discoidal in shape. (iii) By using FRET, reconstituted apoA-I/HDL particles show a 28–34-Å intermolecular separation between terminal domain residues 40 and 240, a distance that is incompatible with the dimensions of the DSH model. Therefore, we suggest that, although novel, the DSH model is energetically unfavorable and not likely to be correct. Rather, we conclude that all evidence supports the likelihood that reconstituted apoA-I/HDL particles, in general, are discoidal in shape.

Jones, Martin K.; Zhang, Lei; Catte, Andrea; Li, Ling; Oda, Michael N.; Ren, Gang; Segrest, Jere P.

2010-01-01

395

Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.  

PubMed

This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. PMID:24076304

López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

2013-08-30

396

Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing.  

National Technical Information Service (NTIS)

Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics,...

M. J. Brenner; R. J. Prazenica

2003-01-01

397

Validation of the Cormix Model Using Thermal Plume Data from Four Maryland Power Plants.  

National Technical Information Service (NTIS)

The purpose of this investigation was to test (validate in computer modeling terminology) the mixing zone model CORMIX (CORnell MIXing Zone Expert System) using measured thermal plume data from the four Maryland power plants (Calvert Cliffs, Chalk Point, ...

S. P. Schreiner; T. A. Krebs; D. E. Strebel; A. Brindley; C. G. McCall

1999-01-01

398

Neural Network Model Selection Using Asymptotic Jackknife Estimator and Cross-Validation Method.  

National Technical Information Service (NTIS)

Two theorems and a lemma are presented about the use of the jackknife estimator and the cross-validation method for model selection. Theorem 1 gives the asymptotic form for the jackknife estimator. Combined with the model selection criterion, this asymptotic ...
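For readers unfamiliar with the estimator, a generic leave-one-out jackknife for the bias and standard error of a statistic looks like this (our sketch; the record above gives no code):

```python
# Generic jackknife: recompute the statistic n times, leaving out one
# observation each time, to estimate bias and standard error.
import numpy as np

def jackknife(data, statistic):
    n = len(data)
    theta_hat = statistic(data)
    loo = np.array([statistic(np.delete(data, i)) for i in range(n)])
    bias = (n - 1) * (loo.mean() - theta_hat)
    se = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return theta_hat - bias, se  # bias-corrected estimate and its SE
```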

Y. Liu

1993-01-01

399

Type-II seesaw mass models and baryon asymmetry  

NASA Astrophysics Data System (ADS)

We compute and also compare the contributions of canonical and non-canonical mass terms towards baryon asymmetry by considering type-II seesaw mass models of neutrinos: degenerate (3 varieties), normal hierarchical and inverted hierarchical (2 varieties). We have shown that for particular choices of the parameter '?' (the so-called discriminator) for different neutrino mass models, the baryon asymmetry is largely dominated by the canonical term. Within such a type-II seesaw scenario, we find the normal hierarchical (NHT3) mass model to be the most favourable choice of nature.

Sarma, Amal Kr.; Zeen Devi, H.; Nimai Singh, N.

2007-03-01

400

Composing Different Models of Computation in Kepler and Ptolemy II  

Microsoft Academic Search

A model of computation (MoC) is a formal abstraction of execution in a computer. There is a need for composing MoCs in e-science. Kepler, which is based on Ptolemy II, is a scientific workflow environment that allows for MoC composition. This paper explains how MoCs are combined in Kepler and Ptolemy II and analyzes which combinations of MoCs are currently

Antoon Goderis; Christopher Brooks; Ilkay Altintas; Edward A. Lee; Carole A. Goble

2007-01-01

401

System modeling and simulation at EBR-II  

Microsoft Academic Search

The codes being developed and verified using EBR-II data are NATDEMO, DSNP, and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code, previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the

E. M. Dean; W. K. Lehto; H. A. Larson

1986-01-01

402

Statistical process control for validating a classification tree model for predicting mortality--a novel approach towards temporal validation.  

PubMed

Prediction models are postulated as useful tools to support tasks such as clinical decision making and benchmarking. In particular, classification tree models have enjoyed much interest in the Biomedical Informatics literature. However, their prospective predictive performance over the course of time has not been investigated. In this paper we suggest and apply statistical process control methods to monitor, over more than 5 years, the prospective predictive performance of TM80+, one of the few classification-tree models published in the clinical literature. TM80+ is a model for predicting mortality among very elderly patients in the intensive care based on a multi-center dataset. We also inspect the predictive performance at the tree's leaves. This study provides important insights into patterns of (in)stability of the tree's performance and its "shelf life". The study underlines the importance of continuous validation of prognostic models over time using statistical tools and the timely recalibration of tree models. PMID:21907826
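As one concrete way to monitor a model's performance over time with statistical process control, per-period error proportions can be tracked on a Shewhart p-chart; the sketch below is our illustration, not the monitoring scheme used in the paper.

```python
# Sketch: flag periods whose error proportion leaves the 3-sigma control
# limits derived from a baseline period.
import numpy as np

def p_chart_limits(baseline_errors, n_per_period):
    """Center line and 3-sigma limits for per-period error proportions."""
    p_bar = np.mean(baseline_errors)
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / n_per_period)
    return p_bar, max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

def out_of_control(p_series, lcl, ucl):
    """Indices of monitored periods falling outside the control limits."""
    return [i for i, p in enumerate(p_series) if p < lcl or p > ucl]
```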

Minne, Lilian; Eslami, Saeid; de Keizer, Nicolette; de Jonge, Evert; de Rooij, Sophia E; Abu-Hanna, Ameen

2011-08-31

403

Validation of a Person Specific 1-D Model of the Systemic Arterial Tree  

Microsoft Academic Search

The aim of this study is to validate a person-specific distributed model of the main systemic arterial tree, coupled to a model of the left ventricle of the heart. This model is built and validated with non-invasive measurements on the same person, leading therefore to a coherent set of physiological data. Although previous studies have been done on 1-D model

P. Reymond; Y. Bohraus; F. Perren; F. Lazeyras; N. Stergiopulos

404

Linkage of Bipolar Disorder to Chromosome 18q and the Validity of Bipolar II Disorder  

Microsoft Academic Search

Background: An analysis of the relationship between clinical features and allele sharing could clarify the issue of genetic linkage between bipolar affective disorder (BPAD) and chromosome 18q, contributing to the definition of genetically valid clinical subtypes. Methods: Relatives ascertained through a proband who had bipolar I disorder (BPI) were interviewed by a psychiatrist, assigned an all-sources diagnosis,

Francis J. McMahon; Sylvia G. Simpson; Melvin G. McInnis; Judith A. Badner; Dean F. MacKinnon; J. Raymond DePaulo

2001-01-01

405

The African American Acculturation Scale II: Cross-Validation and Short Form.  

ERIC Educational Resources Information Center

Studied African American culture, using a new, shortened, 33-item African American Acculturation Scale (AAAS-33) to assess the scale's validity and reliability. Comparisons between the original form and AAAS-33 reveal high correlations, however, the longer form may be sensitive to some beliefs, practices, and attitudes not assessed by the short…

Landrine, Hope; Klonoff, Elizabeth A.

1995-01-01

406

A concrete performance test for delayed ettringite formation: Part II validation  

Microsoft Academic Search

Delayed ettringite formation (DEF) is a rare problem of concrete, whose reaction mechanisms have been investigated by a large number of studies. In order to develop a performance test, the authors have conducted a feasibility study and an optimization study followed by this validation study. A performance test was previously developed to evaluate the risk of expansion as a result

Alexandre Pavoine; Loïc Divet; Stéphane Fenouillet

2006-01-01

407

An intelligent signal validation system for a cupola furnace. II. Testing and analysis  

Microsoft Academic Search

In part I, the motivation behind the signal validation system for the cupola furnace was presented, and a methodology for developing an ANN rule based filter (ANN-RBFTE) and inferential sensors for the molten-iron temperature was described. In this paper we present the testing results of the filter and inferential sensors using cupola experimental data. A methodology for building a signal

Senthil Subramanian; Mohamed Abdelrahman

1999-01-01

408

A New Algorithm for the Quantitationof Myocardial Perfusion SPECT.II: Validation and Diagnostic Yield  

Microsoft Academic Search

This study validates a new quantitative perfusion SPECT algorithm for the assessment of myocardial perfusion. The algorithm is not based on slices and provides fully 3-dimensional sampling and analysis independent of assumptions about the geometric shape of the left ventricle. Methods: Radiopharmaceutical- and sex-specific normal limits and thresholds for perfusion abnormality in 20 segments of the left ventricle were developed for separate, dual-isotope rest 201Tl…

Tali Sharir; Guido Germano; Parker B. Waechter; Paul B. Kavanagh; Joseph S. Areeda; Jim Gerlach; Xingping Kang; Howard C. Lewin; Daniel S. Berman

409

Transient PVT measurements and model predictions for vessel heat transfer. Part II.  

SciTech Connect

Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.
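The lumped "network flow" treatment of a vessel can be illustrated with a single control volume discharging through a choked orifice, with wall heat transfer supplied by a correlation; the constants, correlation form, and all names below are our assumptions, not values from NETFLOW or this report.

```python
# Sketch: Euler integration of vessel mass and energy during blowdown.
# Energy balance for a discharging vessel: m*cv*dT/dt = Q_wall - mdot*R*T.
import numpy as np

R, GAMMA, CV = 287.0, 1.4, 717.5  # ideal-gas air properties (J/kg/K)

def choked_mdot(p, T, area, cd=0.9):
    """Choked mass flow through an orifice for an ideal gas."""
    term = (2.0 / (GAMMA + 1.0)) ** ((GAMMA + 1.0) / (GAMMA - 1.0))
    return cd * area * p * np.sqrt(GAMMA / (R * T) * term)

def blowdown(p0, T0, vol, area, h, a_wall, t_wall, dt=1e-4, t_end=0.5):
    m, T = p0 * vol / (R * T0), T0
    for _ in range(int(t_end / dt)):
        p = m * R * T / vol
        mdot = choked_mdot(p, T, area)
        q_wall = h * a_wall * (t_wall - T)  # correlation-based heat transfer
        T += (q_wall - mdot * R * T) / (m * CV) * dt
        m -= mdot * dt
    return p, T, m  # final pressure, temperature, and remaining mass
```

A CFD treatment would resolve the temperature and velocity fields inside the vessel instead of relying on the h correlation, which is exactly the trade-off the abstract describes.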

Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.; Evans, Gregory Herbert; Rice, Steven F.

2010-07-01

410

A matrix model black hole: act II  

NASA Astrophysics Data System (ADS)

In this paper we discuss the connection between the deformed matrix model and two-dimensional black holes in the light of the new developments involving fermionic type 0A string theory. We argue that many of the old results can be carried over to this new setting and that the original claims about the deformed matrix model are essentially correct. We show the agreement between correlation functions calculated using continuum and matrix model techniques. We also explain how detailed properties of the spacetime metric of the extremal black hole of type 0A are reflected in the deformed matrix model.

Danielsson, Ulf H.

2004-02-01

411

Percolation and sieving segregation patterns: Quantification, mechanistic theory, model development and validation, and application  

NASA Astrophysics Data System (ADS)

The general goal of this research was to study percolation and sieving segregation patterns of particulate materials: quantification, mechanistic theory, and model development and validation. A second-generation primary segregation shear cell (PSSC-II) was designed and fabricated to model the sieving and percolation segregation mechanisms of particulate materials. The two test materials used in this research were spherical glass beads (denoted as G) and irregularly shaped mash poultry feed (denoted as F), which are considered representatives of ideal and real-world materials, respectively. The PSSC-II test results showed that there is a linear relationship between normalized segregation rate (NSR) and absolute size or size ratio for GG and FG combinations, whereas a linear relationship does not hold for FF and GF combinations, although the effects of absolute size and size ratio on NSR were significant (P < 0.001). The NSR is defined as the ratio of collected fines mass to feed fines mass divided by total time. Furthermore, comparisons among these four combinations showed that, compared with coarse particle properties, fine particle properties other than size, including density, surface texture, and electrostatic charge, play a dominant role in NSR. For instance, the higher density and smoother surface of fine glass beads lead to a much greater NSR for GG and FG combinations than for the FF and GF combinations, whose fine feed particles have lower density and rough surface texture. Additionally, an irregularly shaped coarse bed of particles (higher porosity) causes a higher segregation potential of fines compared with spherical coarse particles with lower porosity. A mechanistic theory-based segregation model (denoted as the MTB model) for GG and FG combinations was developed using mechanics, dimensional analysis, and linear regression methods. The MTB model, for the first time, successfully correlated the effects of particle size, density, and shape with the segregation potential of binary mixtures in one quantitative equation. Furthermore, the MTB model has the potential to accommodate additional effects such as surface texture and electrostatic charge to generalize the model. Finally, as a case study, the effect of feed particle segregation on bird performance was examined to test the usefulness of the research results. The results showed that, due to bird selection behavior and particle segregation, birds did not sufficiently consume those nutrients that are contained in smaller feed particles (<1,180 μm). The results of feed particle size and nutrient analysis verified the above observations. (Abstract shortened by UMI.)
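In symbols, the NSR definition quoted above reads (our notation):

$$ \mathrm{NSR} \;=\; \frac{m_{\text{fines, collected}} / m_{\text{fines, feed}}}{t_{\text{total}}} $$

i.e., the mass fraction of fines recovered per unit time.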

Tang, Pingjun

412

Science Motivation Questionnaire II: Validation with Science Majors and Nonscience Majors  

ERIC Educational Resources Information Center

From the perspective of social cognitive theory, the motivation of students to learn science in college courses was examined. The students--367 science majors and 313 nonscience majors--responded to the Science Motivation Questionnaire II, which assessed five motivation components: intrinsic motivation, self-determination, self-efficacy, career…

Glynn, Shawn M.; Brickman, Peggy; Armstrong, Norris; Taasoobshirazi, Gita

2011-01-01

413

A FUNDAMENTAL PARTICLE MODEL. Part II  

Microsoft Academic Search

A fundamental-particle model (E. van der Spuy, Nuclear Physics, 29: 400 (1962)) is discussed. The model for baryons and mesons starts from a fundamental non-linear equation of motion for the field operator in the Heisenberg representation, with a formally symmetric self-interaction. Equations of motion are generated for particles and particle interactions from the equation of motion

van der Spuy

1962-01-01

414

Validating animal models for preclinical research: a scientific and ethical discussion.  

PubMed

The use of animals to model humans in biomedical research relies on the notion that basic processes are sufficiently similar across species to allow extrapolation. Animal model validity is discussed in terms of the similarity between the model and the human condition it is intended to model, but no formal validation of models is applied. There is a stark contrast here with the use of non-animal alternatives in toxicology and safety studies, for which an extensive validation is required. We discuss both the potential and the limitations of validating preclinical animal models for proof-of-concept studies, by using an approach similar to that applied to alternative non-animal methods in toxicology and safety testing. A major challenge in devising a validation system for animal models is the lack of a clear gold standard with which to compare results. While a complete adoption of the validation approach for alternative methods is probably inappropriate for research animal models, key features, such as making data available for external validation and defining a strategy to run experiments in a way that permits meaningful retrospective analysis, remain highly relevant. PMID:20602541

Varga, Orsolya E; Hansen, Axel K; Sandøe, Peter; Olsson, I Anna S

2010-06-01

415

Toward a model of drug relapse: An assessment of the validity of the reinstatement procedure  

PubMed Central

Background and Rationale: The reinstatement model is a widely used animal model of relapse to drug addiction. However, the model's validity is open to question. Objective: We assess the reinstatement model in terms of criterion and construct validity. Research Highlights and Conclusions: We find that the reinstatement model has adequate criterion validity in the broad sense of the term, as evidenced by the fact that reinstatement in laboratory animals is induced by conditions reported to provoke relapse in humans. The model's criterion validity in the narrower sense, as a medication screen, seems promising for relapse to heroin, nicotine, and alcohol. For relapse to cocaine, criterion validity has not yet been established, primarily because clinical studies have examined medication effects on reductions in cocaine intake rather than relapse during abstinence. The model's construct validity faces more substantial challenges and is yet to be established, but we argue that some of the criticisms of the model in this regard may have been overstated.

Epstein, David H.; Preston, Kenzie L.; Stewart, Jane; Shaham, Yavin

2006-01-01

416

Design of embedded systems: formal models, validation, and synthesis  

Microsoft Academic Search

This paper addresses the design of reactive real-time embedded systems. Such systems are often heterogeneous in implementation technologies and design styles, for example by combining hardware application-specific integrated circuits (ASICs) with embedded software. The concurrent design process for such embedded systems involves solving the specification, validation, and synthesis problems. We review the variety of approaches to these problems that have

Stephen Edwards; Luciano Lavagno; Edward A. Lee; Alberto Sangiovanni-Vincentelli

1997-01-01

417

Model calibration and validation of an impact test simulation  

Microsoft Academic Search

This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave.

F. M. Hemez; A. C. Wilson; G. N. Havrilla

2001-01-01

418

Social Validity of a Positive Behavior Interventions and Support Model  

Microsoft Academic Search

As more schools turn to positive behavior interventions and support (PBIS) to address students' academic and behavioral problems, there is an increased need to adequately evaluate these programs for social relevance. The present study used social validation measures to evaluate a statewide PBIS initiative. Active consumers of the program were polled regarding their perceptions of the program's social relevance, including

Melissa Allen Heath; Nancy Y. Miramontes; Michelle Marchant

2011-01-01

419

Social Validity of a Positive Behavior Interventions and Support Model  

Microsoft Academic Search

As more schools turn to positive behavior interventions and support (PBIS) to address students' academic and behavioral problems, there is an increased need to adequately evaluate these programs for social relevance. The present study used social validation measures to evaluate a statewide PBIS initiative. Active consumers of the program were polled regarding their perceptions of the program's social relevance, including

Melissa Allen Heath; Nancy Y. Miramontes; Michelle Marchant

2011-01-01

420

Natural Interaction Metaphors for Functional Validations of Virtual Car Models  

Microsoft Academic Search

Natural Interaction in virtual environments is a key requirement for the virtual validation of functional aspects in automotive product development processes. Natural Interaction is the metaphor people encounter in reality: the direct manipulation of objects by their hands. To enable this kind of Natural Interaction, we propose a pseudophysical metaphor that is both plausible enough to provide realistic interaction and

Mathias Moehring; Bernd Froehlich

2011-01-01

421

BIOCHEMICAL AND MORPHOLOGICAL VALIDATION OF A RODENT MODEL OF OPIDN  

EPA Science Inventory

The paper describes six years of research designed to validate the use of the rat as a viable alternative to the hen for screening and mechanistic studies of neuropathic OP compounds. To date the results indicate that if morphological rather than behavioral endpoints are used, th...

422

Examining the Reliability and Validity of Clinician Ratings on the Five-Factor Model Score Sheet  

PubMed Central

Despite substantial research use, measures of the five-factor model (FFM) are infrequently used in clinical settings due, in part, to issues related to administration time and a reluctance to use self-report instruments. The current study examines the reliability and validity of the Five-Factor Model Score Sheet (FFMSS), a 30-item clinician rating form designed to assess the five domains and 30 facets of one conceptualization of the FFM. In a sample of 130 outpatients, clinical raters demonstrated reasonably good interrater reliability across personality profiles, and the domains manifested good internal consistency with the exception of Neuroticism. The FFMSS ratings also evinced expected relations with self-reported personality traits (e.g., FFMSS Extraversion and Schedule for Nonadaptive and Adaptive Personality Positive Temperament) and consensus-rated personality disorder symptoms (e.g., FFMSS Agreeableness and Narcissistic Personality Disorder). Finally, on average, the FFMSS domains were able to account for approximately 50% of the variance in domains of functioning (e.g., occupational, parental) and were even able to account for variance after controlling for Axis I and Axis II pathology. Given these findings, it is believed that the FFMSS holds promise for clinical use.

Few, Lauren R.; Miller, Joshua D.; Morse, Jennifer Q.; Yaggi, Kirsten E.; Reynolds, Sarah K.; Pilkonis, Paul A.

2013-01-01

423

Laser-silicon interaction for selective emitter formation in photovoltaics. I. Numerical model and validation  

NASA Astrophysics Data System (ADS)

Laser doping to form selective emitters offers an attractive method to increase the performance of silicon wafer based photovoltaics. However, the effect of processing conditions, such as laser power and travel speed, on molten zone geometry and the phosphorus dopant profile is not well understood. A mathematical model is developed to quantitatively investigate and understand how processing parameters impact the heat and mass transfer and fluid flow during laser doping using continuous wave lasers. Calculated molten zone dimensions and dopant concentration profiles are in good agreement with independent experimental data reported in the literature. The mechanisms for heat (conduction) and mass (convection) transport are examined, which lays the foundation for quantitatively understanding the effect of processing conditions on molten zone geometry and dopant concentration distribution. The validated model and insight into heat and mass transport mechanisms also provide the bases for developing process maps, which are presented in part II. These maps illustrate the effects of output power and travel speed on molten zone geometry, average dopant concentration, dopant profile shape, and sheet resistance.

Blecher, J. J.; Palmer, T. A.; DebRoy, T.

2012-12-01

424

Modeling the Photoionized Interface in Blister H II Regions  

NASA Astrophysics Data System (ADS)

We present a grid of photoionization models for the emission from photoevaporative interfaces between the ionized gas and molecular cloud in blister H II regions. For the density profiles of the emitting gas in the models, we use a general power-law form calculated for photoionized, photoevaporative flows by Bertoldi. We find that the spatial emission-line profiles are dependent on the incident flux, the shape of the ionizing continuum, and the elemental abundances. In particular, we find that the peak emissivities of the [S II] and [N II] lines are more sensitive to the elemental abundances than are the total line intensities. The diagnostics obtained from the grid of models can be used in conjunction with high spatial resolution data to infer the properties of ionized interfaces in blister H II regions. As an example, we consider a location at the tip of an "elephant trunk" structure in M16 (the Eagle Nebula) and show how narrowband Hubble Space Telescope Wide Field Planetary Camera 2 (HST WFPC2) images constrain the H II region properties. We present a photoionization model that explains the ionization structure and emission from the interface seen in these high spatial resolution data.

Sankrit, Ravi; Hester, J. Jeff

2000-06-01

425

The solar CA II H profile computed with theoretical models  

NASA Astrophysics Data System (ADS)

The calibration in absolute flux units (erg sec^-1 m^-2 Å^-1) of the Ca II resonance profiles in late-type stars is a difficult task which has been solved in different ways. In 1976, Ayres proposed the radiative equilibrium (RE) photospheric model, a method essentially consisting of fitting the observed far-wing profiles with computed fluxes. This method was applied to a set of Ca II H profiles observed at ESO and tested on the Sun in 1984. Results published by Castelli in 1988 found that, with an RE model, without any increase in temperature in the upper layers, the Ca II H profiles computed in non-local thermodynamic equilibrium (both with partial and complete redistribution) do not show remarkable differences from the profile computed in local thermodynamic equilibrium. The paper discusses the comparison of the observed flux with that computed employing LTE and RE theoretical models, and LTE profiles.

Castelli, Fiorella

426

Validation of a methane emission model using eddy covariance observations and footprint modeling.  

NASA Astrophysics Data System (ADS)

Several methane emission models have been developed recently to quantify methane emissions. However, calibration of these models is currently performed using chamber-flux methane measurements, which have a number of limitations, such as a small footprint area and low temporal resolution. Furthermore, chamber measurements are unsuitable for registering ebullition events, which can have a significant influence on observed fluxes. Eddy covariance measurements, on the other hand, provide high-frequency (5 to 20 Hz) data and cover larger areas, while being a non-intrusive way to measure fluxes and account for ebullition. In this study, we present a validation of a methane emission model using eddy covariance data collected in summer periods at the Indigirka lowland site in Eastern Siberia. A flux footprint model was used together with a high-resolution vegetation map of the area to retrieve the vegetation distribution inside the footprint. Subsequently, these data, together with the eddy covariance data, are used to calibrate the methane emission model.

Budishchev, A.; Mi, Y.; Gallagher, A.; van Huissteden, J.; Schaepman-Strub, G.; Dolman, A. J.; Maximov, T. C.

2012-04-01

427

Understanding and using the Implicit Association Test: II. Method variables and construct validity.  

PubMed

The Implicit Association Test (IAT) assesses relative strengths of four associations involving two pairs of contrasted concepts (e.g., male-female and family-career). In four studies, analyses of data from 11 Web IATs, averaging 12,000 respondents per data set, supported the following conclusions: (a) sorting IAT trials into subsets does not yield conceptually distinct measures; (b) valid IAT measures can be produced using as few as two items to represent each concept; (c) there are conditions for which the administration order of IAT and self-report measures does not alter psychometric properties of either measure; and (d) a known extraneous effect of IAT task block order was sharply reduced by using extra practice trials. Together, these analyses provide additional construct validation for the IAT and suggest practical guidelines to users of the IAT. PMID:15619590

Nosek, Brian A; Greenwald, Anthony G; Banaji, Mahzarin R

2005-02-01

428

A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection  

Microsoft Academic Search

We review accuracy estimation methods and compare the two most common methods: cross-validation and bootstrap. Recent experimental results on artificial data and theoretical results in restricted settings have shown that, for selecting a good classifier from a set of classifiers (model selection), ten-fold cross-validation may be better than the more expensive leave-one-out cross-validation. We report
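A compact sketch of the ten-fold cross-validation estimate discussed here (our illustration; scikit-learn supplies the fold splitting):

```python
# Sketch: mean held-out accuracy over k folds for a classifier with the
# usual fit/predict interface; X, y are NumPy arrays.
import numpy as np
from sklearn.model_selection import KFold

def cv_accuracy(model, X, y, k=10, seed=0):
    scores = []
    for train, test in KFold(n_splits=k, shuffle=True,
                             random_state=seed).split(X):
        model.fit(X[train], y[train])
        scores.append(np.mean(model.predict(X[test]) == y[test]))
    return np.mean(scores)  # cross-validated accuracy estimate
```

Leave-one-out is the special case k = n, which requires n model fits and is what makes it the more expensive option mentioned above.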

Ron Kohavi

1995-01-01

429

Gravitropic responses of the Avena coleoptile in space and on clinostats. II. Is reciprocity valid?  

PubMed

Experiments were undertaken to determine if the reciprocity rule is valid for gravitropic responses of oat coleoptiles in the acceleration region below 1 g. The rule predicts that the gravitropic response should be proportional to the product of the applied acceleration and the stimulation time. Seedlings were cultivated on 1 g centrifuges and transferred to test centrifuges to apply a transverse g-stimulation. Since responses occurred in microgravity, the uncertainties about the validity of clinostat simulation of weightlessness were avoided. Plants at two stages of coleoptile development were tested. Plant responses were obtained using time-lapse video recordings that were analyzed after the flight. Stimulus intensities and durations were varied and ranged from 0.1 to 1.0 g and from 2 to 130 min, respectively. For threshold g-doses the reciprocity rule was obeyed. The threshold dose was of the order of 55 g s and 120 g s, respectively, for the two groups of plants investigated. Reciprocity was also studied for bending responses ranging from just above the detectable level to about 10 degrees. The validity of the rule could not be confirmed for higher g-doses, chiefly because the data were more variable. It was investigated whether the uniformity of the overall response data increased when the gravitropic dose was defined as (g^m × t) with m-values different from unity. This was not the case, and the reciprocity concept is, therefore, valid also in the hypogravity region. The concept of gravitropic dose, the product of the transverse acceleration and the stimulation time, is also well-defined in the acceleration region studied. With the same hardware, tests were done on Earth where responses occurred on clinostats. The results did not contradict the reciprocity rule, but scatter in the data was large. PMID:11539923
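In equation form, the rule under test states that the gravitropic dose at threshold is the product of acceleration and stimulation time, while the final analysis considers a generalized dose with an exponent m (our notation):

$$ D = g \cdot t \quad (\text{reciprocity}), \qquad D_m = g^{m}\, t \quad (\text{generalized}), $$

with m = 1 recovering reciprocity; the quoted thresholds of about 55 g s and 120 g s are values of D at which a response first becomes detectable.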

Johnsson, A; Brown, A H; Chapman, D K; Heathcote, D; Karlsson, C

1995-09-01

430

Nyala and Bushbuck II: A Harvesting Model.  

ERIC Educational Resources Information Center

Adds a cropping or harvesting term to the animal overpopulation model developed in Part I of this article. Investigates various harvesting strategies that might suggest a solution to the overpopulation problem without actually culling any animals. (ASK)

Fay, Temple H.; Greeff, Johanna C.

1999-01-01

431

Cross-Validation of the Taiwan Version of the Moorehead–Ardelt Quality of Life Questionnaire II with WHOQOL and SF36  

Microsoft Academic Search

Background: Obesity has become a major worldwide public health issue. There is a need for tools to measure patient-reported outcomes. The Moorehead–Ardelt Quality of Life Questionnaire II (MA II) contains six items. The objective of this study was to translate the MA II into Chinese and validate it in patients with morbid obesity. Methods: The MA II was translated into Chinese and

Chi-Yang Chang; Chih-Kun Huang; Yu-Yin Chang; Chi-Ming Tai; Jaw-Town Lin; Jung-Der Wang

2010-01-01

432

Development and validation of instantaneous risk model in nuclear power plant's risk monitor  

SciTech Connect

The instantaneous risk model is the foundation of calculation and analysis in a risk monitor. This study focused on the development and validation of an instantaneous risk model. The principles for converting from the baseline risk model to the instantaneous risk model were studied, and a modeling method for separated trains' failure modes was developed. The development and validation process in an operating nuclear power plant's risk monitor is also introduced. The correctness of the instantaneous risk model and the rationality of the conversion method were demonstrated by comparison with the results of the baseline risk model. (authors)

Wang, J.; Li, Y.; Wang, F.; Wang, J.; Hu, L. [Inst. of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); School of Nuclear Science and Technology, Univ. of Science and Technology of China, Hefei, Anhui, 230031 (China)

2012-07-01