Sample records for ii model validation

  1. Fatigue crack growth under variable-amplitude loading: Part II: Code development and model validation

    E-print Network

    Ray, Asok

    Fatigue crack growth under variable-amplitude loading: Part II – Code development and model validation. Accepted 12 February 2001. Abstract: A state-space model of fatigue crack growth has been formulated in Part I; this part presents information for code development and validates the state-space model with fatigue test data for different types…
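    The snippet above is too truncated to reconstruct the state-space formulation itself, but the general mechanics of cycle-by-cycle crack growth under a variable-amplitude load history can be sketched. Below is a minimal Python illustration using the classical Paris law; the constants and load history are invented, and the actual Part I/II model additionally carries an internal state for load-interaction (retardation) effects:

```python
import numpy as np

# Paris-law constants (assumed, for illustration only)
C, m = 1e-11, 3.0        # da/dN = C * (dK)^m; a in m, dK in MPa*sqrt(m)

def crack_growth(a0, stress_ranges, width_factor=1.12):
    """Cycle-by-cycle crack extension under a variable-amplitude
    stress history. A real state-space model would also update an
    internal state capturing load-interaction effects."""
    a = a0
    history = [a]
    for ds in stress_ranges:                 # ds: stress range of one cycle [MPa]
        dK = width_factor * ds * np.sqrt(np.pi * a)
        a += C * dK ** m                     # Paris-law increment for this cycle
        history.append(a)
    return np.array(history)

# Hypothetical variable-amplitude load spectrum, 10,000 cycles
loads = np.abs(np.random.default_rng(1).normal(80.0, 20.0, 10_000))
print(f"final crack length: {crack_growth(0.002, loads)[-1] * 1e3:.3f} mm")
```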

  2. Large-scale Validation of AMIP II Land-surface Simulations: Preliminary Results for Ten Models

    SciTech Connect

    Phillips, T J; Henderson-Sellers, A; Irannejad, P; McGuffie, K; Zhang, H

    2005-12-01

    This report summarizes initial findings of a large-scale validation of the land-surface simulations of ten atmospheric general circulation models that are entries in phase II of the Atmospheric Model Intercomparison Project (AMIP II). This validation is conducted by AMIP Diagnostic Subproject 12 on Land-surface Processes and Parameterizations, which focuses on putative relationships between the continental climate simulations and the associated models' land-surface schemes. The selected models typify the diversity of representations of land-surface climate that are currently implemented by the global modeling community. The current dearth of global-scale terrestrial observations makes exacting validation of AMIP II continental simulations impractical. Thus, selected land-surface processes of the models are compared with several alternative validation data sets, which include merged in-situ/satellite products, climate reanalyses, and off-line simulations of land-surface schemes that are driven by observed forcings. The aggregated spatio-temporal differences between each simulated process and a chosen reference data set are then quantified by means of root-mean-square error statistics; the differences among alternative validation data sets are similarly quantified as an estimate of the current observational uncertainty in the selected land-surface process. Examples of these metrics are displayed for land-surface air temperature, precipitation, and the latent and sensible heat fluxes. It is found that the simulations of surface air temperature, when aggregated over all land and seasons, agree most closely with the chosen reference data, while the simulations of precipitation agree least. In the latter case, there is also considerable inter-model scatter in the error statistics, with the reanalysis estimates of precipitation resembling the AMIP II simulations more closely than the chosen reference data. In aggregate, the simulations of land-surface latent and sensible heat fluxes appear to occupy intermediate positions between these extremes, but the existing large observational uncertainties in these processes make this a provisional assessment. In all selected processes as well, the error statistics are found to be sensitive to season and latitude sector, confirming the need for the finer-scale analyses that are also in progress.
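    The RMS statistics described above are straightforward to reproduce. The following is a minimal NumPy sketch, not Subproject 12 code; the field shapes and values are invented stand-ins for the model, reference, and alternative validation data sets:

```python
import numpy as np

def rmse(field_a, field_b, weights=None):
    """Aggregated root-mean-square difference between two
    spatio-temporal fields of shape (time, lat, lon).
    `weights` (e.g. cos-latitude area weights) is optional."""
    diff2 = (field_a - field_b) ** 2
    if weights is not None:
        return np.sqrt(np.average(diff2, weights=np.broadcast_to(weights, diff2.shape)))
    return np.sqrt(diff2.mean())

# Hypothetical fields: 12 months on a coarse global grid.
rng = np.random.default_rng(0)
simulated = rng.normal(288.0, 5.0, size=(12, 72, 144))    # model surface air temperature [K]
reference = simulated + rng.normal(0.0, 1.5, size=simulated.shape)
alternative = reference + rng.normal(0.0, 1.0, size=reference.shape)

model_error = rmse(simulated, reference)        # model-vs-reference statistic
obs_uncertainty = rmse(reference, alternative)  # spread among validation data sets
print(f"model RMSE = {model_error:.2f} K, observational uncertainty = {obs_uncertainty:.2f} K")
```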

  3. Use of ISLSCP II data to intercompare and validate the terrestrial net primary production in a land surface model coupled to a general circulation model

    Microsoft Academic Search

    Li Dan; Jinjun Ji; Yong He

    2007-01-01

    Using the global terrestrial NPP and climate data from the International Satellite Land Surface Climatology Project Initiative II (ISLSCP II) and additional NPP data, we validated the NPP simulations and explored the relationship between NPP and climate variation in the global two-way coupled model AVIM-GOALS. The strength of this study is that the global simulations produced will enhance interactive climate and…

  4. Development of a livestock odor dispersion model: part II. Evaluation and validation.

    PubMed

    Yu, Zimu; Guo, Huiqing; Laguë, Claude

    2011-03-01

    A livestock odor dispersion model (LODM) was developed to predict odor concentration and odor frequency using routine hourly meteorological data input. The odor concentrations predicted by the LODM were compared with the results obtained from other commercial models (the Industrial Source Complex Short-Term model, version 3, and CALPUFF) to evaluate its appropriateness. Two sets of field odor plume measurement data were used to validate the model. The model-predicted mean odor concentrations and odor frequencies were compared with those measured. Results show that this model has good performance for predicting odor concentrations and odor frequencies. PMID:21416754

  5. A wheat grazing model for simulating grain and beef production: Part II - model validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Model evaluation is a prerequisite to its adoption and successful application. The objective of this paper is to evaluate the ability of a newly developed wheat grazing model to predict fall-winter forage and grain yields of winter wheat (Triticum aestivum L.) as well as daily weight gains per steer...

  6. Black liquor combustion validated recovery boiler modeling: Final year report. Volume 2 (Appendices I, section 5 and II, section 1)

    SciTech Connect

    Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

    1998-08-01

    This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 2 contains the last section of Appendix I, Radiative heat transfer in kraft recovery boilers, and the first section of Appendix II, The effect of temperature and residence time on the distribution of carbon, sulfur, and nitrogen between gaseous and condensed phase products from low temperature pyrolysis of kraft black liquor.

  7. SAGE II aerosol data validation based on retrieved aerosol model size distribution from SAGE II aerosol measurements.

    PubMed

    Wang, P H; McCormick, M P; McMaster, L R; Chu, W P; Swissler, T J; Osborn, M T; Russell, P B; Oberbeck, V R; Livingston, J; Rosen, J M; Hofmann, D J; Grams, G W; Fuller, W H; Yue, G K

    1989-06-20

    This paper describes an investigation of the comprehensive aerosol correlative measurement experiments conducted between November 1984 and July 1986 for the satellite measurement program of the Stratospheric Aerosol and Gas Experiment (SAGE II). The correlative sensors involved in the experiments consist of the NASA Ames Research Center impactor/laser probe, the University of Wyoming dustsonde, and the NASA Langley Research Center airborne 14-inch (36-cm) lidar system. The approach of the analysis is to compare the primary aerosol quantities measured by the ground-based instruments with the calculated ones based on the aerosol size distributions retrieved from the SAGE II aerosol extinction measurements. The analysis shows that the aerosol size distributions derived from the SAGE II observations agree qualitatively with the in situ measurements made by the impactor/laser probe. The SAGE II-derived vertical distributions of the ratio N0.15/N0.25 (where Nr is the cumulative aerosol concentration for particle radii greater than r, in micrometers) and the aerosol backscatter profiles at 0.532- and 0.6943-micrometer lidar wavelengths are shown to agree with the dustsonde and the 14-inch (36-cm) lidar observations, with the differences being within the respective uncertainties of the SAGE II and the other instruments. PMID:11539801

  8. A common model validation in the case of the Toyota Prius II

    Microsoft Academic Search

    Keyu Chen; Rochdi Trigui; Alain Bouscayrol; Emmanuel Vinot; Walter Lhomme; Alain Berthon

    2010-01-01

    Modeling of different HEVs has been studied using different approaches. Generally, considering the different combinations of components, each architecture has its own model. Nevertheless, the modeling and control design of different HEVs could be achieved in a general way, despite the fact that HEVs can be very different from each other in terms of structure. If the modeling and control design…

  9. The model SIRANE for atmospheric urban pollutant dispersion; PART II, validation of the model on a real case study

    NASA Astrophysics Data System (ADS)

    Soulhac, L.; Salizzoni, P.; Mejean, P.; Didier, D.; Rios, I.

    2012-03-01

    We analyse the performance of the model SIRANE by comparing its outputs to field data measured within an urban district. SIRANE is the first urban dispersion model based on the concept of the street network, and it contains specific parametric laws to explicitly simulate the main transfer mechanisms within the urban canopy. The model validation is performed by means of field data collected during a 15-day measurement campaign in an urban district of Lyon, France. The campaign provided information on traffic fluxes and car emissions, meteorological conditions, background pollution levels and pollutant concentrations at different locations within the district. This data set, together with complementary modelling tools needed to estimate the spatial distribution of traffic fluxes, allowed us to estimate the input data required by the model. The data set also provides the information essential to evaluate the accuracy of the model outputs. Comparison between model predictions and field measurements was performed in two ways: by evaluating the reliability of the model in simulating the spatial distribution of the pollutants, and by evaluating its ability to reproduce their temporal variability. The study includes a sensitivity analysis to identify the key input parameters influencing the performance of the model, namely the emission rates and the wind velocity. The analysis focuses only on the influence of varying input parameters in the modelling chain on the model predictions, and complements the analyses provided by wind tunnel studies focussing on the parameterisations implemented in the model. The study also elucidates the critical role of background concentrations, which represent a significant contribution to local pollution levels. The overall model performance, measured using the Chang and Hanna (2004) criteria, can be considered 'good' except for NO and some of the BTX species. The results suggest that improving the performance on NO requires testing new photochemical models, whereas improvement on BTX could be achieved by correcting their vehicular emission factors.
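    The Chang and Hanna (2004) criteria mentioned above combine a few standard air-quality performance measures. A minimal sketch of three of them (fractional bias, normalized mean square error, and the factor-of-two fraction) follows; the concentration values are invented, and the quoted acceptance thresholds are indicative only, since they vary between sources:

```python
import numpy as np

def chang_hanna_metrics(observed, predicted):
    """Common air-quality model performance measures: fractional
    bias (FB), normalized mean square error (NMSE), and the
    fraction of predictions within a factor of two (FAC2)."""
    co, cp = np.asarray(observed, float), np.asarray(predicted, float)
    fb = (co.mean() - cp.mean()) / (0.5 * (co.mean() + cp.mean()))
    nmse = ((co - cp) ** 2).mean() / (co.mean() * cp.mean())
    ratio = cp / co
    fac2 = ((ratio >= 0.5) & (ratio <= 2.0)).mean()
    return fb, nmse, fac2

obs = np.array([12.0, 30.0, 8.0, 55.0, 20.0])    # measured concentrations (assumed units)
pred = np.array([10.0, 41.0, 6.0, 60.0, 15.0])   # model predictions
fb, nmse, fac2 = chang_hanna_metrics(obs, pred)
# One commonly cited rule of thumb for "good" performance:
# |FB| <= 0.3, NMSE <= 1.5, FAC2 >= 0.5 (thresholds vary by source).
print(f"FB={fb:.2f}  NMSE={nmse:.2f}  FAC2={fac2:.2f}")
```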

  10. Assessing Wildlife Habitat Value of New England Salt Marshes: II. Model Testing and Validation

    EPA Science Inventory

    We test a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. Assessment scores ranged f...

  11. Model Validation Status Review

    SciTech Connect

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped into 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.

  12. TAMDAR Sensor Validation in 2003 AIRS II

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.; Murray, John J.; Anderson, Mark V.; Mulally, Daniel J.; Jensen, Kristopher R.; Grainger, Cedric A.; Delene, David J.

    2005-01-01

    This study entails an assessment of TAMDAR in situ temperature, relative humidity and winds sensor data from seven flights of the UND Citation II. These data are undergoing rigorous assessment to determine their viability to significantly augment the domestic Meteorological Data Communications Reporting System (MDCRS) and the international Aircraft Meteorological Data Reporting (AMDAR) system observational databases, to improve the performance of regional and global numerical weather prediction models. NASA Langley Research Center participated in the Second Alliance Icing Research Study (AIRS II) from November 17 to December 17, 2003. TAMDAR data taken during this period are compared with validation data from the UND Citation. The data indicate acceptable performance of the TAMDAR sensor when compared to measurements from the UND Citation research instruments.

  13. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the likelihood of success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, in which cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.

  14. Black liquor combustion validated recovery boiler modeling: Final year report. Volume 3 (Appendices II, sections 2--3 and III)

    SciTech Connect

    Grace, T.M.; Frederick, W.J.; Salcudean, M.; Wessel, R.A.

    1998-08-01

    This project was initiated in October 1990, with the objective of developing and validating a new computer model of a recovery boiler furnace using a computational fluid dynamics (CFD) code specifically tailored to the requirements for solving recovery boiler flows, and using improved submodels for black liquor combustion based on continued laboratory fundamental studies. The key tasks to be accomplished were as follows: (1) Complete the development of enhanced furnace models that have the capability to accurately predict carryover, emissions behavior, dust concentrations, gas temperatures, and wall heat fluxes. (2) Validate the enhanced furnace models, so that users can have confidence in the predicted results. (3) Obtain fundamental information on aerosol formation, deposition, and hardening so as to develop the knowledge base needed to relate furnace model outputs to plugging and fouling in the convective sections of the boiler. (4) Facilitate the transfer of codes, black liquor submodels, and fundamental knowledge to the US kraft pulp industry. Volume 3 contains the following appendix sections: Formation and destruction of nitrogen oxides in recovery boilers; Sintering and densification of recovery boiler deposits: laboratory data and a rate model; and Experimental data on rates of particulate formation during char bed burning.

  15. Development of a new version of the Liverpool Malaria Model. II. Calibration and validation for West Africa

    PubMed Central

    2011-01-01

    Background In the first part of this study, an extensive literature survey led to the construction of a new version of the Liverpool Malaria Model (LMM). A new set of parameter settings was provided and a new development of the mathematical formulation of important processes related to the vector population was performed within the LMM. In this part of the study, the so far undetermined model parameters are calibrated through the use of data from field studies. The latter are also used to validate the new LMM version, which is furthermore compared against the original LMM version. Methods For the calibration and validation of the LMM, numerous entomological and parasitological field observations were gathered for West Africa. Continuous and quality-controlled temperature and precipitation time series were constructed using intermittent raw data from 34 weather stations across West Africa. The meteorological time series served as the LMM data input. The skill of LMM simulations was tested for 830 different sets of parameter settings of the undetermined LMM parameters. The model version with the highest skill score in terms of entomological malaria variables was taken as the final setting of the new LMM version. Results Validation of the new LMM version in West Africa revealed that the simulations compare well with entomological field observations. The new version reproduces realistic transmission rates, and simulated malaria seasons are comparable to field observations. Overall, the new model version performs much better than the original model. The new model version enables the detection of the epidemic malaria potential at fringes of endemic areas and, more importantly, it is now applicable to the vast area of malaria endemicity in the humid African tropics. Conclusions A review of entomological and parasitological data from West Africa enabled the construction of a new LMM version. This model version represents a significant step forward in the modelling of a weather-driven malaria transmission cycle. The LMM is now more suitable for use in malaria early warning systems as well as for malaria projections based on climate change scenarios, both in epidemic and endemic malaria areas. PMID:21410939
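    The calibration strategy described above (scoring many candidate parameter sets against field observations and keeping the best) can be sketched generically. The toy model and skill score below are stand-ins, not the LMM or its entomological variables:

```python
import numpy as np

# Sketch of skill-based calibration: evaluate a skill score for many
# candidate parameter sets and keep the best-scoring one.
rng = np.random.default_rng(4)
observed = np.sin(np.linspace(0, 2 * np.pi, 12)) + 1.2   # observed seasonal cycle (invented)

def simulate(amplitude, offset):
    """Toy 'model' producing a 12-month seasonal cycle."""
    return amplitude * np.sin(np.linspace(0, 2 * np.pi, 12)) + offset

candidates = [(a, o) for a in np.linspace(0.5, 1.5, 30)
                     for o in np.linspace(0.5, 2.0, 30)]   # ~900 sets, cf. the 830 tested
# Skill here is simply negative RMSE against the observed cycle.
scores = [-np.sqrt(((simulate(a, o) - observed) ** 2).mean()) for a, o in candidates]
best = candidates[int(np.argmax(scores))]
print(f"best parameter set: amplitude={best[0]:.2f}, offset={best[1]:.2f}")
```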

  16. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part II - Validation and localization analysis

    NASA Astrophysics Data System (ADS)

    Das, Arghya; Tengattini, Alessandro; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    We study the mechanical failure of cemented granular materials (e.g., sandstones) using a constitutive model based on breakage mechanics for grain crushing and damage mechanics for cement fracture. The theoretical aspects of this model are presented in Part I: Tengattini et al. (2014), A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables, Part I - Theory (Journal of the Mechanics and Physics of Solids, 10.1016/j.jmps.2014.05.021). In this Part II we investigate the constitutive and structural responses of cemented granular materials through analyses of Boundary Value Problems (BVPs). The multiple failure mechanisms captured by the proposed model enable the behavior of cemented granular rocks to be well reproduced for a wide range of confining pressures. Furthermore, through comparison of the model predictions and experimental data, the micromechanical basis of the model provides improved understanding of failure mechanisms of cemented granular materials. In particular, we show that grain crushing is the predominant inelastic deformation mechanism under high pressures while cement failure is the relevant mechanism at low pressures. Over an intermediate pressure regime a mixed mode of failure mechanisms is observed. Furthermore, the micromechanical roots of the model allow the effects on localized deformation modes of various initial microstructures to be studied. The results obtained from both the constitutive responses and BVP solutions indicate that the proposed approach and model provide a promising basis for future theoretical studies on cemented granular materials.

  17. Validation of updated neutronic calculation models proposed for Atucha-II PHWR. Part II: Benchmark comparisons of PUMA core parameters with MCNP5 and improvements due to a simple cell heterogeneity correction

    SciTech Connect

    Grant, C. [Comision Nacional de Energia Atomica, Av del Libertador 8250, Buenos Aires 1429 (Argentina); Mollerach, R. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina); Leszczynski, F.; Serra, O.; Marconi, J. [Comision Nacional de Energia Atomica, Av del Libertador 8250, Buenos Aires 1429 (Argentina); Fink, J. [Nucleoelectrica Argentina S.A., Arribenos 3619, Buenos Aires 1429 (Argentina)

    2006-07-01

    In 2005 the Argentine Government took the decision to complete the construction of the Atucha-II nuclear power plant, which had been progressing slowly during the previous ten years. Atucha-II is a 745 MWe nuclear station moderated and cooled with heavy water, of German (Siemens) design, located in Argentina. It has a pressure vessel design with 451 vertical coolant channels, and the fuel assemblies (FA) are clusters of 37 natural UO2 rods with an active length of 530 cm. For the reactor physics area, a revision and update of reactor physics calculation methods and models was recently carried out covering cell, supercell (control rod) and core calculations. This paper presents benchmark comparisons of core parameters of a slightly idealized model of the Atucha-I core obtained with the PUMA reactor code with MCNP5. The Atucha-I core was selected because it is smaller, similar from a neutronic point of view, more symmetric than Atucha-II, and has some experimental data available. To validate the new models, benchmark comparisons of k-effective, channel power and axial power distributions obtained with PUMA and MCNP5 have been performed. In addition, a simple cell heterogeneity correction recently introduced in PUMA is presented, which significantly improves the agreement of calculated channel powers with MCNP5. To complete the validation, the calculation of some of the critical configurations of the Atucha-I reactor measured during the experiments performed at first criticality is also presented. (authors)

  18. Applied model validation

    NASA Astrophysics Data System (ADS)

    Davies, A. D.

    1985-07-01

    The NBS Center for Fire Research (CFR) conducts scientific research bearing on the fire safety of buildings, vehicles, tunnels and other inhabited structures. Data from controlled fire experiments are collected, analyzed and reduced to the analytical formulas that appear to underlie the observed phenomena. These results and more general physical principles are then combined into models to predict the development of environments that may be hostile to humans. This is a progress report of an applied model validation case study. The subject model is Transport of Fire, Smoke and Gases (FAST). Products from a fire in a burn room exit through a connected corridor to outdoors. Cooler counterflow air in a lower layer feeds the fire. The model predicts corridor layer temperatures and thicknesses vs. time, given enclosure, fire and ambient specifications. Data have been collected from 38 tests using several fire sizes, but have not been reduced. Corresponding model results, and model and test documentation, are yet to come. Considerable modeling and calculation are needed to convert instrument readings to test results comparable with model outputs so that residual differences may be determined.

  19. Resolving the mass-anisotropy degeneracy of the spherically symmetric Jeans equation - II. Optimum smoothing and model validation

    NASA Astrophysics Data System (ADS)

    Diakogiannis, Foivos I.; Lewis, Geraint F.; Ibata, Rodrigo A.

    2014-09-01

    The spherical Jeans equation is widely used to estimate the mass content of stellar systems with apparent spherical symmetry. However, this method suffers from a degeneracy between the assumed mass density and the kinematic anisotropy profile, β(r). In a previous work, we laid the theoretical foundations for an algorithm that combines smoothing B-splines with equations from dynamics to remove this degeneracy. Specifically, our method reconstructs a unique kinematic profile of σ_rr² and σ_tt² for an assumed free functional form of the potential and mass density (Φ, ρ) and a given set of observed line-of-sight velocity dispersion measurements, σ_los². In Paper I, we demonstrated the efficiency of our algorithm with a very simple example and we commented on the need for optimum smoothing of the B-spline representation; this is in order to avoid unphysical variational behaviour when we have large uncertainty in our data. In the current contribution, we present a process for finding the optimum smoothing for a given data set by using information on the behaviour of known ideal theoretical models. Markov Chain Monte Carlo methods are used to explore the degeneracy in the dynamical modelling process. We validate our model through applications to synthetic data for systems with constant or variable mass-to-light ratio Υ. In all cases, we recover excellent fits of theoretical functions to observables and unique solutions. Our algorithm is a robust method for the removal of the mass-anisotropy degeneracy of the spherically symmetric Jeans equation for an assumed functional form of the mass density.
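    The optimum-smoothing question discussed above can be illustrated with an off-the-shelf smoothing spline. The sketch below assumes SciPy and uses a synthetic dispersion profile rather than real kinematic data; it shows how the smoothing factor trades residual size against wiggliness:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic line-of-sight velocity-dispersion profile with noise
rng = np.random.default_rng(2)
r = np.linspace(0.1, 10.0, 40)                   # projected radius (assumed units)
sigma2_true = 120.0 / (1.0 + r)                  # made-up smooth profile [km^2/s^2]
sigma2_obs = sigma2_true + rng.normal(0.0, 5.0, r.size)

# The smoothing factor `s` controls the bias/variance trade-off:
# s too small -> unphysical wiggles; s too large -> oversmoothing.
for s in (0.0, r.size * 25.0):
    spline = UnivariateSpline(r, sigma2_obs, k=3, s=s)
    resid = sigma2_obs - spline(r)
    print(f"s={s:8.1f}  rms residual={np.sqrt((resid**2).mean()):6.2f}  "
          f"knots={len(spline.get_knots())}")
```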

  20. Data Assimilation of Photosynthetic Light-use Efficiency using Multi-angular Satellite Data: II Model Implementation and Validation

    NASA Technical Reports Server (NTRS)

    Hilker, Thomas; Hall, Forest G.; Tucker, J.; Coops, Nicholas C.; Black, T. Andrew; Nichol, Caroline J.; Sellers, Piers J.; Barr, Alan; Hollinger, David Y.; Munger, J. W.

    2012-01-01

    Spatially explicit and temporally continuous estimates of photosynthesis will be of great importance for increasing our understanding of, and ultimately closing, the terrestrial carbon cycle. Current capabilities to model photosynthesis, however, are limited by the difficulty of representing accurately enough the complexity of the underlying biochemical processes and the numerous environmental constraints imposed upon plant primary production. A potentially powerful alternative for modeling photosynthesis is the use of multi-angular satellite data to infer light-use efficiency (e) directly from spectral reflectance properties in connection with canopy shadow fractions. Hall et al. (this issue) introduced a new approach for predicting gross ecosystem production that would allow the use of such observations in a data assimilation mode to obtain spatially explicit variations in e from infrequent polar-orbiting satellite observations, while meteorological data are used to account for the more dynamic responses of e to variations in environmental conditions caused by changes in weather and illumination. In this second part of the study we implement and validate the approach of Hall et al. (this issue) across an ecologically diverse array of eight flux-tower sites in North America using data acquired from the Compact High Resolution Imaging Spectroradiometer (CHRIS) and eddy-flux observations. Our results show significantly enhanced estimates of e, and therefore of cumulative gross ecosystem production (GEP), over the course of one year at all examined sites. We also demonstrate that e is greatly heterogeneous even across small study areas. Data assimilation and direct inference of GEP from space using a new, proposed sensor could therefore be a significant step towards closing the terrestrial carbon cycle.

  1. Ecological reality and model validation

    SciTech Connect

    Cale, Jr, W. G.; Shugart, H. H.

    1980-01-01

    Definitions of model realism and model validation are developed. Ecological and mathematical arguments are then presented to show that model equations which explicitly treat ecosystem processes can be systematically improved such that greater realism is attained and the condition of validity is approached. Several examples are presented.

  2. Model-Based Sensor Fault Detection and Isolation System for Unmanned Ground Vehicles: Experimental Validation (part II)

    Microsoft Academic Search

    Andrea Monteriu; Prateek Asthan; Kimon P. Valavanis; Sauro Longhi

    2007-01-01

    This paper presents implementation details of a model-based sensor fault detection and isolation system (SFDIS) applied to unmanned ground vehicles (UGVs). Structural analysis, applied to the nonlinear model of the UGV, is used to build the residual generation module, followed by a residual evaluation module capable of detecting single and multiple sensor faults, as detailed in part I (Monteriu et…

  3. A Framework for Model Validation

    SciTech Connect

    Easterling, R.G.

    1999-02-02

    Computational models have the potential of being used to make credible predictions in place of physical testing in many contexts, but success and acceptance require a convincing model validation. In general, model validation is understood to be a comparison of model predictions to experimental results, but there appears to be no standard framework for conducting this comparison. This paper gives a statistical framework for the problem of model validation that is quite analogous to calibration, with the basic goal being to design and analyze a set of experiments to obtain information pertaining to the 'limits of error' that can be associated with model predictions. Implementation, though, in the context of complex, high-dimensional models, poses a considerable challenge for the development of appropriate statistical methods and for the interaction of statisticians with model developers and experimentalists. The proposed framework provides a vehicle for communication between modelers, experimentalists, and the analysts and decision-makers who must rely on model predictions.
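    As a toy illustration of the 'limits of error' idea (not the statistical framework of the paper, which also addresses experiment design and high-dimensional models), one can characterize model-versus-experiment residuals and report a prediction interval for future errors:

```python
import numpy as np
from scipy import stats

def limits_of_error(experiment, prediction, coverage=0.95):
    """Toy validation comparison: characterize model-vs-experiment
    residuals and report an interval expected to contain a single
    future prediction error with the requested coverage."""
    resid = np.asarray(experiment, float) - np.asarray(prediction, float)
    n = resid.size
    mean, sd = resid.mean(), resid.std(ddof=1)
    # t-based prediction interval for one future residual
    half = stats.t.ppf(0.5 + coverage / 2, n - 1) * sd * np.sqrt(1 + 1 / n)
    return mean - half, mean + half

exp = [10.2, 9.8, 11.1, 10.5, 9.9, 10.8]     # hypothetical measurements
pred = [10.0, 10.0, 10.7, 10.2, 10.1, 10.4]  # corresponding model predictions
print("approx. limits of error: (%.2f, %.2f)" % limits_of_error(exp, pred))
```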

  4. Predicting germination in semi-arid wildland seedbeds II. Field validation of wet thermal-time models

    Microsoft Academic Search

    Jennifer K. Rawlins; Bruce A. Roundy; Dennis Egget; Nathan Cline

    Accurate prediction of germination for species used in semi-arid land revegetation would support selection of plant materials for specific climatic conditions and sites. Wet thermal-time models predict germination time by summing progress toward germination for subpopulation percentages, as a function of temperature, across intermittent wet periods or within singular wet periods. Wet periods may be defined by any reasonable seedbed water…
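    A wet thermal-time accumulator of the kind described is simple to sketch. In the Python illustration below, thermal time accrues only on wet days above a base temperature until a required degree-day sum is reached; all parameter values are invented, not taken from the paper:

```python
def wet_thermal_time(temps, wet, t_base=0.0, theta_req=60.0):
    """Accumulate thermal time (degree-days above `t_base`) only
    during wet periods; germination of a subpopulation is predicted
    once the accumulated total reaches `theta_req` degree-days."""
    theta = 0.0
    for day, (t, is_wet) in enumerate(zip(temps, wet), start=1):
        if is_wet and t > t_base:
            theta += t - t_base            # progress only while wet and warm
        if theta >= theta_req:
            return day                     # predicted germination day
    return None                            # did not germinate in this record

daily_temp = [12, 15, 18, 20, 22, 19, 17, 16, 21, 23]   # deg C
seedbed_wet = [1, 1, 0, 0, 1, 1, 1, 0, 1, 1]            # 1 = wet enough for progress
print("germination on day:", wet_thermal_time(daily_temp, seedbed_wet))
```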

  5. Model Fe-Al Steel with Exceptional Resistance to High Temperature Coarsening. Part II: Experimental Validation and Applications

    NASA Astrophysics Data System (ADS)

    Zhou, Tihe; Zhang, Peng; O'Malley, Ronald J.; Zurob, Hatem S.; Subramanian, Mani

    2015-01-01

    In order to achieve a fine uniform grain-size distribution using the process of thin slab casting and directing rolling (TSCDR), it is necessary to control the grain-size prior to the onset of thermomechanical processing. In the companion paper, Model Fe-Al Steel with Exceptional Resistance to High Temperature Coarsening. Part I: Coarsening Mechanism and Particle Pinning Effects, a new steel composition which uses a small volume fraction of austenite particles to pin the growth of delta-ferrite grains at high temperature was proposed and grain growth was studied in reheated samples. This paper will focus on the development of a simple laboratory-scale setup to simulate thin-slab casting of the newly developed steel and demonstrate the potential for grain size control under industrial conditions. Steel bars with different diameters are briefly dipped into the molten steel to create a shell of solidified material. These are then cooled down to room temperature at different cooling rates. During cooling, the austenite particles nucleate along the delta-ferrite grain boundaries and greatly retard grain growth. With decreasing temperature, more austenite particles precipitate, and grain growth can be completely arrested in the holding furnace. Additional applications of the model alloy are discussed, including grain-size control in the heat affected zone in welds and grain-growth resistance at high temperature.

  6. TUTORIAL: Validating biorobotic models

    NASA Astrophysics Data System (ADS)

    Webb, Barbara

    2006-09-01

    Some issues in neuroscience can be addressed by building robot models of biological sensorimotor systems. What we can conclude from building models or simulations, however, is determined by a number of factors in addition to the central hypothesis we intend to test. These include the way in which the hypothesis is represented and implemented in simulation, how the simulation output is interpreted, how it is compared to the behaviour of the biological system, and the conditions under which it is tested. These issues will be illustrated by discussing a series of robot models of cricket phonotaxis behaviour.

  7. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state-of-the-art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation of the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends and funding opportunities and to discuss the future of structural dynamics.

  8. Uncertainty Modeling Via Frequency Domain Model Validation

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Andrisani, Dominick, II

    1999-01-01

    The majority of literature on robust control assumes that a design model is available and that the uncertainty model bounds the actual variations about the nominal model. However, methods for generating accurate design models have not received as much attention in the literature. The influence of the level of accuracy of the uncertainty model on closed loop performance has received even less attention. The research reported herein is an initial step in applying and extending the concept of model validation to the problem of obtaining practical uncertainty models for robust control analysis and design applications. An extension of model validation called 'sequential validation' is presented and applied to a simple spring-mass-damper system to establish the feasibility of the approach and demonstrate the benefits of the new developments.

  9. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification ensures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needing improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
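    The verification step described above often reduces to checking the observed order of accuracy under grid refinement, which the method of manufactured solutions makes possible because the exact solution is known. A minimal sketch follows; the error norms are hypothetical:

```python
import math

def observed_order(coarse_error, fine_error, refinement_ratio=2.0):
    """Observed order of accuracy from error norms on two grids,
    p = log(e_coarse / e_fine) / log(r).  With manufactured
    solutions the exact solution is known, so the error norms
    are computable directly."""
    return math.log(coarse_error / fine_error) / math.log(refinement_ratio)

# Hypothetical L2 error norms from three successively refined grids
errors = [4.0e-3, 1.02e-3, 2.6e-4]
for e_coarse, e_fine in zip(errors, errors[1:]):
    # Values near 2 would confirm a formally second-order scheme.
    print(f"observed order p = {observed_order(e_coarse, e_fine):.2f}")
```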

  10. A dynamic model of some malaria-transmitting anopheline mosquitoes of the Afrotropical region. II. Validation of species distribution and seasonal variations

    PubMed Central

    2013-01-01

    Background The first part of this study aimed to develop a model for Anopheles gambiae s.l. with separate parametrization schemes for Anopheles gambiae s.s. and Anopheles arabiensis. The characterizations were constructed based on literature from the past decades. This part of the study focuses on the model's ability to separate the mean state of the two species of the An. gambiae complex in Africa. The model is also evaluated with respect to capturing the temporal variability of An. arabiensis in Ethiopia. Before conclusions and guidance based on models can be made, models need to be validated. Methods The model used in this paper is described in part one (Malaria Journal 2013, 12:28). For the validation of the model, a database of 5,935 points on the presence of An. gambiae s.s. and An. arabiensis was constructed. An additional 992 points were collected on the presence of An. gambiae s.l. These data were used to assess if the model could recreate the spatial distribution of the two species. The dataset is made available in the public domain. This is followed by a case study from Madagascar where the model's ability to recreate the relative fraction of each species is investigated. In the last section the model's ability to reproduce the temporal variability of An. arabiensis in Ethiopia is tested. The model was compared with data from four papers, and one field survey covering two years. Results Overall, the model has a realistic representation of seasonal and year-to-year variability in mosquito densities in Ethiopia. The model is also able to describe the distribution of An. gambiae s.s. and An. arabiensis in sub-Saharan Africa. This implies the model can be used for seasonal and long term predictions of changes in the burden of malaria. Before models can be used to improve human health, or to guide which interventions are to be applied where, there is a need to understand the system of interest. Validation is an important part of this process. It is also found that one of the main mechanisms separating An. gambiae s.s. and An. arabiensis is the availability of hosts: humans and cattle. Climate plays a secondary, but still important, role. PMID:23442727

  11. Bistatic scattering from forest components. Part II: first validation of a bistatic polarimetric forest model in the VHF-UHF band [225-475 MHz] using indoor measurements

    NASA Astrophysics Data System (ADS)

    Colin-Koeniguer, Elise; Thirion-Lefevre, Laetitia

    2010-01-01

    This paper introduces the validation of the extension of a scattering model of forests to the bistatic configuration (COBISMO). The measurement in an anechoic chamber is first described. The various stages of the validation process are presented. One dielectric cylinder on a metallic plate is chosen as the canonical element to be tested. Indoor measurements are compared with the results predicted by the model, first in the horizontal/azimuthal plane, then in the vertical/elevation plane. Mutual coupling is also investigated using a group of three cylinders. The agreement between simulation and measurement is surprisingly good in light of the precision of such indoor measurements. Several other aspects are discussed: the influence of the frequency, of the shape of the cylinder's cross-section, and of polarimetric effects.

  12. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection line modeling and results of several test cases used to validate the capability.
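    The massless spring-damper connection lines described above can be sketched as a tension-only force element. The following Python illustration returns the force a line applies to one body; the constants and attachment states are invented, and this is not the POST II implementation:

```python
import numpy as np

def line_force(p1, v1, p2, v2, l0, k, c):
    """Force on body 1 from a massless spring-damper connection
    line between attachment points p1 and p2 (tension-only: a
    slack line carries no load)."""
    d = p2 - p1
    length = np.linalg.norm(d)
    unit = d / length                       # line direction from body 1 to body 2
    stretch = length - l0
    if stretch <= 0.0:
        return np.zeros(3)                  # line is slack: no force
    stretch_rate = np.dot(v2 - v1, unit)    # rate of change of line length
    return (k * stretch + c * stretch_rate) * unit

f = line_force(np.zeros(3), np.zeros(3),
               np.array([0.0, 0.0, 12.0]), np.array([0.0, 0.0, -1.0]),
               l0=10.0, k=5.0e3, c=50.0)
print("tension on body 1:", f)              # pulls body 1 toward body 2
```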

  13. SAGE II aerosol validation: selected altitude measurements, including particle micromeasurements.

    PubMed

    Oberbeck, V R; Livingston, J M; Russell, P B; Pueschel, R F; Rosen, J N; Osborn, M T; Kritz, M A; Snetsinger, K G; Ferry, G V

    1989-06-20

    Correlative aerosol measurements taken at a limited number of altitudes during coordinated field experiments are used to test the validity of particulate extinction coefficients derived from limb path solar radiance measurements taken by the Stratospheric Aerosol and Gas Experiment (SAGE) II Sun photometer. In particular, results are presented from correlative measurement missions that were conducted during January 1985, August 1985, and July 1986. Correlative sensors included impactors, laser spectrometers, and filter samplers aboard a U-2 airplane, an upward-pointing lidar aboard a P-3 airplane, and balloon-borne optical particle counters (dustsondes). The main body of this paper focuses on the July 29, 1986, validation experiment, which minimized the many difficulties (e.g., spatial and temporal inhomogeneities, imperfect coincidences) that can complicate the validation process. On this day, correlative aerosol measurements taken at an altitude of 20.5 km agreed with each other within their respective uncertainties, and particulate extinction values calculated at SAGE II wavelengths from these measurements validated corresponding SAGE II values. Additional validation efforts on days when measurement and logistical conditions were much less favorable for validation are discussed in an appendix. PMID:11539800

  14. The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models.

    PubMed

    Shi, Leming; Campbell, Gregory; Jones, Wendell D; Campagne, Fabien; Wen, Zhining; Walker, Stephen J; Su, Zhenqiang; Chu, Tzu-Ming; Goodsaid, Federico M; Pusztai, Lajos; Shaughnessy, John D; Oberthuer, André; Thomas, Russell S; Paules, Richard S; Fielden, Mark; Barlogie, Bart; Chen, Weijie; Du, Pan; Fischer, Matthias; Furlanello, Cesare; Gallas, Brandon D; Ge, Xijin; Megherbi, Dalila B; Symmans, W Fraser; Wang, May D; Zhang, John; Bitter, Hans; Brors, Benedikt; Bushel, Pierre R; Bylesjo, Max; Chen, Minjun; Cheng, Jie; Cheng, Jing; Chou, Jeff; Davison, Timothy S; Delorenzi, Mauro; Deng, Youping; Devanarayan, Viswanath; Dix, David J; Dopazo, Joaquin; Dorff, Kevin C; Elloumi, Fathi; Fan, Jianqing; Fan, Shicai; Fan, Xiaohui; Fang, Hong; Gonzaludo, Nina; Hess, Kenneth R; Hong, Huixiao; Huan, Jun; Irizarry, Rafael A; Judson, Richard; Juraeva, Dilafruz; Lababidi, Samir; Lambert, Christophe G; Li, Li; Li, Yanen; Li, Zhen; Lin, Simon M; Liu, Guozhen; Lobenhofer, Edward K; Luo, Jun; Luo, Wen; McCall, Matthew N; Nikolsky, Yuri; Pennello, Gene A; Perkins, Roger G; Philip, Reena; Popovici, Vlad; Price, Nathan D; Qian, Feng; Scherer, Andreas; Shi, Tieliu; Shi, Weiwei; Sung, Jaeyun; Thierry-Mieg, Danielle; Thierry-Mieg, Jean; Thodima, Venkata; Trygg, Johan; Vishnuvajjala, Lakshmi; Wang, Sue Jane; Wu, Jianping; Wu, Yichao; Xie, Qian; Yousef, Waleed A; Zhang, Liang; Zhang, Xuegong; Zhong, Sheng; Zhou, Yiming; Zhu, Sheng; Arasappan, Dhivya; Bao, Wenjun; Lucas, Anne Bergstrom; Berthold, Frank; Brennan, Richard J; Buness, Andreas; Catalano, Jennifer G; Chang, Chang; Chen, Rong; Cheng, Yiyu; Cui, Jian; Czika, Wendy; Demichelis, Francesca; Deng, Xutao; Dosymbekov, Damir; Eils, Roland; Feng, Yang; Fostel, Jennifer; Fulmer-Smentek, Stephanie; Fuscoe, James C; Gatto, Laurent; Ge, Weigong; Goldstein, Darlene R; Guo, Li; Halbert, Donald N; Han, Jing; Harris, Stephen C; Hatzis, Christos; Herman, Damir; Huang, Jianping; Jensen, Roderick V; Jiang, Rui; Johnson, Charles D; Jurman, Giuseppe; Kahlert, Yvonne; Khuder, Sadik A; Kohl, Matthias; Li, Jianying; Li, Li; Li, Menglong; Li, Quan-Zhen; Li, Shao; Li, Zhiguang; Liu, Jie; Liu, Ying; Liu, Zhichao; Meng, Lu; Madera, Manuel; Martinez-Murillo, Francisco; Medina, Ignacio; Meehan, Joseph; Miclaus, Kelci; Moffitt, Richard A; Montaner, David; Mukherjee, Piali; Mulligan, George J; Neville, Padraic; Nikolskaya, Tatiana; Ning, Baitang; Page, Grier P; Parker, Joel; Parry, R Mitchell; Peng, Xuejun; Peterson, Ron L; Phan, John H; Quanz, Brian; Ren, Yi; Riccadonna, Samantha; Roter, Alan H; Samuelson, Frank W; Schumacher, Martin M; Shambaugh, Joseph D; Shi, Qiang; Shippy, Richard; Si, Shengzhu; Smalter, Aaron; Sotiriou, Christos; Soukup, Mat; Staedtler, Frank; Steiner, Guido; Stokes, Todd H; Sun, Qinglan; Tan, Pei-Yi; Tang, Rong; Tezak, Zivana; Thorn, Brett; Tsyganova, Marina; Turpaz, Yaron; Vega, Silvia C; Visintainer, Roberto; von Frese, Juergen; Wang, Charles; Wang, Eric; Wang, Junwei; Wang, Wei; Westermann, Frank; Willey, James C; Woods, Matthew; Wu, Shujian; Xiao, Nianqing; Xu, Joshua; Xu, Lei; Yang, Lun; Zeng, Xiao; Zhang, Jialu; Zhang, Li; Zhang, Min; Zhao, Chen; Puri, Raj K; Scherf, Uwe; Tong, Weida; Wolfinger, Russell D

    2010-08-01

    Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis. PMID:20676074
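    The central safeguard MAQC-II tested (model selection on training data only, with performance reported on data never used for training) can be illustrated in a few lines. This sketch assumes scikit-learn is available and uses synthetic stand-ins for microarray profiles:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 500))              # 200 samples x 500 "genes" (synthetic)
y = (X[:, :5].sum(axis=1) > 0).astype(int)   # endpoint driven by 5 informative genes

# Hold out an external test set that plays no role in model tuning.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(penalty="l2", C=0.1, max_iter=2000)

cv_acc = cross_val_score(model, X_train, y_train, cv=5).mean()    # internal estimate
test_acc = model.fit(X_train, y_train).score(X_test, y_test)      # external check
print(f"cross-validated accuracy {cv_acc:.2f} vs held-out accuracy {test_acc:.2f}")
```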

  15. Probabilistic Methods for Model Validation

    E-print Network

    Halder, Abhishek

    2014-05-01

    by proposing elliptical orbits for planets in 1609. Kepler's laws were later explained through Newton's law of gravitation in 1687. It is instructive to note that no single observation was enough to validate a model or to gain its acceptance. Another example… (The rest of this excerpt is the thesis's notation list, only partially recoverable: intensity parameter of the Poisson counting process N(t); Re(z), real part of the complex number z; Im(z), imaginary part of the complex number z; #, cardinality; ∧, minimum function, i.e. a ∧ b = minimum of a and b; model variable; distributional law…)

  16. (Validity of environmental transfer models)

    SciTech Connect

    Blaylock, B.G.; Hoffman, F.O.; Gardner, R.H.

    1990-11-07

    BIOMOVS (BIOspheric MOdel Validation Study) is an international cooperative study initiated in 1985 by the Swedish National Institute of Radiation Protection to test models designed to calculate the environmental transfer and bioaccumulation of radionuclides and other trace substances. The objective of the symposium and workshop was to synthesize results obtained during Phase 1 of BIOMOVS (the first five years of the study) and to suggest new directions that might be pursued during Phase 2 of BIOMOVS. The travelers were an instrumental part of the development of BIOMOVS. This symposium allowed the travelers to present a review of past efforts at model validation and a synthesis of current activities and to refine ideas concerning future development of models and data for assessing the fate, effect, and human risks of environmental contaminants. R. H. Gardner also visited the Free University, Amsterdam, and the National Institute of Public Health and Environmental Protection (RIVM) in Bilthoven to confer with scientists about current research in theoretical ecology and the use of models for estimating the transport and effect of environmental contaminants and to learn about the European efforts to map critical loads of acid deposition.

  17. ADAPT model: Model use, calibration and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper presents an overview of the Agricultural Drainage and Pesticide Transport (ADAPT) model and a case study to illustrate the calibration and validation steps for predicting subsurface tile drainage and nitrate-N losses from an agricultural system. The ADAPT model is a daily time step field ...

  18. Validation for a recirculation model.

    PubMed

    LaPuma, P T

    2001-04-01

    Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation. PMID:11318387
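    The mass-balance idea behind the recirculation model can be sketched for a single well-mixed zone. The variable names, values, and treatment of control-device efficiency below are assumptions for illustration, not the published model:

```python
def steady_state_concentration(G, Q_fresh, Q_recirc, removal_eff):
    """Well-mixed steady-state mass balance for a booth with
    recirculation: pollutant generated at rate G [mg/min] is
    removed by fresh-air exchange (Q_fresh) and by the control
    device treating the recirculated stream (Q_recirc), which
    captures a fraction `removal_eff` of what passes through it.
    Balance: G = C * (Q_fresh + removal_eff * Q_recirc)."""
    return G / (Q_fresh + removal_eff * Q_recirc)

total_flow = 1000.0                          # m^3/min through the booth (assumed)
for recirc_fraction in (0.0, 0.5, 0.8):
    q_r = total_flow * recirc_fraction
    q_f = total_flow - q_r
    c = steady_state_concentration(G=500.0, Q_fresh=q_f, Q_recirc=q_r, removal_eff=0.95)
    print(f"recirculation {recirc_fraction:.0%}: C_ss = {c:.3f} mg/m^3")
```

    As the recirculated fraction grows, the indoor steady-state concentration rises unless the control device removes nearly everything, which is the safety trade-off the abstract describes.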

  19. Validation of Magnetospheric Magnetohydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we applied, for the first time, several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth as it has the weakest magnetic pressure near-Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar wind, the OpenGGCM has a large region of Earthward flow velocity (Ux) in the current sheet region that grows as time progresses in a compressed environment. BATS-R-US Bz, rho, and Ux stabilize to a near constant value approximately one hour into the run under high compression conditions. Under high compression, the SWMF parameters begin to oscillate approximately 100 minutes into the run. All three models have similar magnetopause positions under low pressure conditions. The OpenGGCM current sheet velocities along the Sun-Earth line are largest under low pressure conditions. The results of this analysis indicate the need to account for model uncertainties and differences when comparing model predictions with data, provide error bars on model predictions in various magnetospheric regions, and show that the magnetotail is sensitive to the preconditioning time.

  20. VALIDATING SIMULATION MODELS Klaus G. Troitzsch

    E-print Network

    Tesfatsion, Leigh

    Keywords: …, stochastic model, simulation model, validation. ABSTRACT This paper discusses aspects of validating … their generalisations around observation, developing new theoretical structures based on and validated by new evidence … of this trait of thinking is the role of simulation or computational modeling, which can be found in Gilbert …

  1. Atlas II and IIA analyses and environments validation

    NASA Astrophysics Data System (ADS)

    Martin, Richard E.

    1995-06-01

    General Dynamics has now flown all four versions of the Atlas commercial launch vehicle, which cover a payload weight capability to geosynchronous transfer orbit (GTO) in the range of 5000-8000 lb. The key analyses to set design and environmental test parameters for the vehicle modifications, and the ground and flight test data that validated them, were presented in paper IAF-91-170 for the first version, Atlas I. This paper presents similar data for the next two versions, Atlas II and IIA. The Atlas II has propellant tanks lengthened by 12 ft and is boosted by MA-5A rocket engines uprated to 474,000 lb liftoff thrust. GTO payload capability is 6225 lb with the 11-ft fairing. The Atlas IIA is an Atlas II with uprated RL10A-4 engines on the lengthened Centaur II upper stage. The two 20,800 lb thrust, 449 s specific impulse engines with an optional extendible nozzle increase payload capability to GTO to 6635 lb. The paper describes design parameters and validated test results for many other improvements that have generally provided greater capability at lower cost, weight, and complexity and with better reliability. Those described include: moving the MA-5A start system to the ground, replacing the vernier engines with a simple 50 lb thrust on-off hydrazine roll control system, addition of a POGO suppressor, replacement of Centaur jettisonable insulation panels with fixed foam, a new inertial navigation unit (INU) that combines in one package a ring-laser gyro based strapdown guidance system with two MIL-STD-1750A processors, redundant MIL-STD-1553 data bus interfaces, robust Ada-based software, and a new Al-Li payload adapter. Payload environment is shown to be essentially unchanged from previous Atlas vehicles. Validation of load, stability, control, and pressurization requirements for the larger vehicle is discussed. All flights to date (five Atlas II, one Atlas IIA) have been successful in launching satellites for EUTELSAT, the U.S. Air Force, and INTELSAT. Significant design parameters validated by these flights are presented. Particularly noteworthy has been the performance of the INU, which has provided average GTO insertion errors of only 10 miles apogee, 0.2 miles perigee, and 0.004 degrees inclination. It is concluded that Atlas II/IIA have successfully demonstrated probably the largest number of current state-of-the-art components of any expendable launch vehicle flying today.

  2. Factor Structure and Construct Validity of the Behavioral Dyscontrol Scale-II.

    PubMed

    Shura, Robert D; Rowland, Jared A; Yoash-Gantz, Ruth E

    2015-01-01

    The Behavioral Dyscontrol Scale-II (BDS-II) was developed as an improved scoring method for the original BDS, which was designed to evaluate the capacity for independent regulation of behavior and attention. The purpose of this study was to evaluate the factor structure and construct validity of the BDS-II, which had not been adequately re-examined since the development of the new scoring system. In a sample of 164 Veterans with a mean age of 35 years, exploratory factor analysis was used to evaluate the BDS-II latent factor structure. Correlations and regressions were used to explore validity against 22 psychometrically sound neurocognitive measures across seven neurocognitive domains of sensation, motor output, processing speed, attention, visual-spatial reasoning, memory, and executive functions. Factor analysis found a two-factor solution for this sample, which explained 41% of the variance in the model. Validity analyses found significant correlations among the BDS-II scores and all other cognitive domains except sensation and language (which was not evaluated). Hierarchical regressions revealed that PASAT performance was strongly associated with all three BDS-II scores; dominant hand Finger Tapping Test performance was also associated with the Total score and Factor 1, and CPT-II Commissions was also associated with Factor 2. These results suggest the BDS-II is both a general test of cerebral functioning and a more specific test of working memory, motor output, and impulsivity. The BDS-II may therefore show utility with younger populations for measuring frontal lobe abilities and might be very sensitive to neurological injury. PMID:25650736
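
    As a hedged sketch of the analysis pattern the abstract describes (exploratory factor analysis followed by validity correlations against external criteria), the following uses scikit-learn's FactorAnalysis on synthetic scores; the item counts, loadings, and criterion measure are invented stand-ins, not the study's data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic stand-in for item-level scores (164 subjects x 9 items); the
# real study used BDS-II items and 22 neurocognitive criterion measures.
rng = np.random.default_rng(1)
latent = rng.standard_normal((164, 2))               # two latent traits
loadings = rng.uniform(0.4, 0.9, size=(2, 9))
items = latent @ loadings + 0.5 * rng.standard_normal((164, 9))

fa = FactorAnalysis(n_components=2, random_state=0)
factor_scores = fa.fit_transform(items)

# Share of total item variance reproduced by the common factors
# (a rough analogue of the 41% figure quoted in the abstract).
common_var = (fa.components_ ** 2).sum()
total_var = items.var(axis=0).sum()
print(f"variance explained: {common_var / total_var:.0%}")

# Construct validity: correlate factor scores with an external criterion
# (here a synthetic working-memory measure standing in for the PASAT).
criterion = factor_scores[:, 0] + rng.standard_normal(164)
r = np.corrcoef(factor_scores[:, 0], criterion)[0, 1]
print(f"validity correlation with criterion: r = {r:.2f}")
```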

  3. Geochemistry Model Validation Report: External Accumulation Model

    SciTech Connect

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce the mass of accumulation, the density of accumulation, and the geometry of the accumulation zone. The density of accumulation and the geometry of the accumulation zone are calculated using a characterization of the fracture system based on field measurements made in the proposed repository (BSC 2001k). The model predicts that accumulation would spread out in a conical accumulation volume. The accumulation volume is represented with layers as shown in Figure 1. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance.

  4. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  5. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
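
    The POST 2 implementation itself is not shown in the paper's abstract; the following is a minimal illustration of the flexible-line idea it describes, a tension-only spring-damper coupling two point masses, under assumed constants.

```python
import numpy as np

# Minimal sketch of a tension-only "flexible line" force between a
# parachute and a suspended body, the kind of coupling the abstract
# describes. Not the POST 2 implementation; the spring-damper constants
# and geometry are illustrative assumptions.

def line_force(r_chute, r_body, v_chute, v_body,
               natural_length=10.0, stiffness=5e3, damping=50.0):
    """Force on the body from the line; equal and opposite on the chute.

    The line behaves as a spring-damper when stretched and goes slack
    (zero force) when shorter than its natural length.
    """
    sep = r_chute - r_body
    dist = np.linalg.norm(sep)
    if dist <= natural_length:
        return np.zeros(3)                      # slack line carries no load
    unit = sep / dist
    stretch_rate = np.dot(v_chute - v_body, unit)
    magnitude = stiffness * (dist - natural_length) + damping * stretch_rate
    return max(magnitude, 0.0) * unit           # a line cannot push

# Example: body 12 m below the canopy, line stretched 2 m past slack length.
f = line_force(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -12.0]),
               np.zeros(3), np.zeros(3))
print(f)  # tension pulls the body upward toward the canopy
```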

  6. Description and validation of realistic and structured endourology training model

    PubMed Central

    Soria, Federico; Morcillo, Esther; Sanz, Juan Luis; Budia, Alberto; Serrano, Alvaro; Sanchez-Margallo, Francisco M

    2014-01-01

    Purpose: The aim of the present study was to validate a model of training which combines the use of non-biological and ex vivo biological bench models, as well as the modelling of urological injuries for endourological treatment in a porcine animal model. Material and Methods: A total of 40 participants took part in this study. The duration of the activity was 16 hours. The model of training was divided into 3 levels: level I, concerning the acquisition of basic theoretical knowledge; level II, involving practice with the bench models; and level III, concerning practice in the porcine animal model. First, trainees practiced with animals without using an injury model (ureteroscopy, management of guide wires and catheters under fluoroscopic control) and later practiced in the lithiasic animal model. During the activity, an evaluation of the face and content validity was conducted, as well as constructive validation provided by the trainees versus experts. Evolution of the variables during the course within each group was analysed using the Student's t test for paired samples, while comparisons between groups were performed using the Student's t test for unpaired samples. Results: The assessments of face and content validity were satisfactory. The constructive validation "within one trainee" shows that there were statistically significant differences between the first time the trainees performed the tasks in the animal model and the last time, mainly in the knowledge of procedure and Holmium laser lithotripsy categories. At the beginning of level III, there were also statistically significant differences between the trainees' scores and the experts' scores. Conclusions: This realistic endourology training model allows the acquisition of knowledge and technical and non-technical skills, as evidenced by the face, content and constructive validity. Structured use of bench models (biological and non-biological) and animal model simulators increases basic endourological skills. PMID:25374928
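
    The statistical machinery named in the abstract is standard; a minimal sketch with SciPy, using synthetic trainee and expert scores (all numbers invented), shows the paired and unpaired Student's t tests side by side.

```python
import numpy as np
from scipy import stats

# Synthetic first/last session scores for trainees and a reference expert
# group, mirroring the comparisons described in the abstract.
rng = np.random.default_rng(2)
trainee_first = rng.normal(55, 10, 40)                 # first animal-model session
trainee_last = trainee_first + rng.normal(12, 6, 40)   # last session
experts = rng.normal(80, 8, 10)

# Within-trainee improvement: paired t test.
t_paired, p_paired = stats.ttest_rel(trainee_first, trainee_last)
# Trainees vs. experts at the start of level III: unpaired t test.
t_unpaired, p_unpaired = stats.ttest_ind(trainee_first, experts)

print(f"paired:   t = {t_paired:.2f}, p = {p_paired:.3g}")
print(f"unpaired: t = {t_unpaired:.2f}, p = {p_unpaired:.3g}")
```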

  7. Four-scale linear model for anisotropic reflectance (FLAIR) for plant canopies. II. Validation and inversion with CASI, POLDER, and PARABOLA data at BOREAS

    Microsoft Academic Search

    H. Peter White; John R. Miller; Jing M. Chen

    2002-01-01

    For pt. I see ibid., vol. 39, no. 5, p. 1072-83 (2001). To address the need for a flexible model of the bidirectional reflectance distribution function (BRDF) that is also suitable for inversion, the FLAIR model (Four-Scale Linear Model for AnIsotropic Reflectance) has been developed (H. P. White et al., 2001). Based on the more detailed Four-Scale Model (J. M. Chen et al., 1997), …

  8. The range of validity of the two-body approximation in models of terrestrial planet accumulation. II - Gravitational cross sections and runaway accretion

    NASA Technical Reports Server (NTRS)

    Wetherill, G. W.; Cox, L. P.

    1985-01-01

    The validity of the two-body approximation in calculating encounters between planetesimals has been evaluated as a function of the ratio of unperturbed planetesimal velocity (with respect to a circular orbit) to mutual escape velocity when their surfaces are in contact (V/V_e). Impact rates as a function of this ratio are calculated to within about 20 percent by numerical integration of the equations of motion. It is found that when the ratio is greater than 0.4 the two-body approximation is a good one. Consequences of reducing the ratio to less than 0.02 are examined. Factors leading to an optimal size for growth of planetesimals from a swarm of given eccentricity and placing a limit on the extent of runaway accretion are derived.
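
    In the two-body approximation the collision cross-section exceeds the geometric one by the gravitational focusing factor 1 + (V_e/V)^2, which is what makes the small-ratio regime prone to runaway accretion. A few lines suffice to tabulate this factor across the regimes the abstract discusses:

```python
# Gravitational focusing in the two-body approximation:
#     sigma / sigma_geom = 1 + (V_e / V)^2,
# where V is the unperturbed relative velocity and V_e the mutual escape
# velocity at contact. The abstract's finding is that the approximation
# holds well for V/V_e > 0.4; at V/V_e ~ 0.02 the enhancement is enormous.

def focusing_factor(v_over_ve):
    return 1.0 + 1.0 / v_over_ve**2

for ratio in (2.0, 1.0, 0.4, 0.02):
    print(f"V/V_e = {ratio:5.2f}: sigma/sigma_geom = {focusing_factor(ratio):10.1f}")
```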

  9. Systematic Independent Validation of Inner Heliospheric Models

    NASA Technical Reports Server (NTRS)

    MacNeice, P. J.; Taktakishvili, Aleksandre

    2008-01-01

    This presentation is the first in a series which will provide independent validation of community models of the outer corona and inner heliosphere. In this work we establish a set of measures to be used in validating this group of models. We use these procedures to generate a comprehensive set of results from the Wang-Sheeley-Arge (WSA) model, which will be used as a baseline, or reference, against which to compare all other models. We also run a test of the validation procedures by applying them to a small set of results produced by the ENLIL magnetohydrodynamic (MHD) model. In future presentations we will validate other models currently hosted by the Community Coordinated Modeling Center (CCMC), including a comprehensive validation of the ENLIL model. The WSA model is widely used to model the solar wind, and is used by a number of agencies to predict solar wind conditions at Earth as much as four days into the future. Because it is so important to both the research and space weather forecasting communities, it is essential that its performance be measured systematically and independently. In this paper we offer just such an independent and systematic validation. We report skill scores for the model's predictions of wind speed and IMF polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests line-of-sight magnetograms. For this study we generated model results for monthly magnetograms from the National Solar Observatory (SOLIS), Mount Wilson Observatory, and the GONG network, spanning the Carrington rotation range from 1650 to 2068. We compare the influence of the different magnetogram sources, performance at quiet and active times, and estimate the effect of different empirical wind speed tunings. We also consider the ability of the WSA model to identify sharp transitions in wind speed from slow to fast wind. These results will serve as a baseline against which to compare future versions of the model.
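
    The abstract does not define its skill scores; a common choice is a mean-squared-error skill score against a reference prediction, plus a simple hit rate for IMF polarity. The sketch below computes both on synthetic data and illustrates the general technique, not the study's exact metric.

```python
import numpy as np

# Illustrative skill metrics of the kind used to validate solar-wind models:
# an MSE-based skill score relative to a reference prediction, and a hit
# rate for IMF polarity. All data here are synthetic.
rng = np.random.default_rng(3)
observed = 400 + 150 * rng.random(27 * 4)            # km/s, ~1 Carrington rotation
model = observed + rng.normal(0, 60, observed.size)  # model with ~60 km/s error
reference = np.full_like(observed, observed.mean())  # climatological baseline

def skill_score(pred, obs, ref):
    """1 = perfect, 0 = no better than the reference, < 0 = worse."""
    return 1.0 - np.mean((pred - obs) ** 2) / np.mean((ref - obs) ** 2)

polarity_obs = np.sign(rng.standard_normal(observed.size))
polarity_model = np.where(rng.random(observed.size) < 0.8,
                          polarity_obs, -polarity_obs)   # 80% agreement

print(f"wind-speed skill score: {skill_score(model, observed, reference):.2f}")
print(f"IMF polarity hit rate:  {np.mean(polarity_model == polarity_obs):.0%}")
```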

  10. Ecological Validity of the Conners' Continuous Performance Test II in a School-Based Sample

    ERIC Educational Resources Information Center

    Weis, Robert; Totten, Sara J.

    2004-01-01

    The ecological validity of the Conners' Continuous Performance Test II (CPT-II) was examined using a sample of 206 first- and second-grade children. Children's CPT-II scores were correlated with observations of inattentive/hyperactive behavior during CPT-II administration, observations of children's behavior during analogue academic task,…

  11. An approach to validation of thermomechanical models

    SciTech Connect

    Costin, L.S. [Sandia National Labs., Albuquerque, NM (United States); Hardy, M.P.; Brechtel, C.E. [Agapito (J.F.T.) and Associates, Inc., Grand Junction, CO (United States)

    1993-08-01

    Thermomechanical models are being developed to support the design of an Exploratory Studies Facility (ESF) and a potential high-level nuclear waste repository at Yucca Mountain, Nevada. These models are used for preclosure design of underground openings, such as access drifts, emplacement drifts, and waste emplacement boreholes; and in support of postclosure issue resolution relating to waste canister performance, disturbance of the hydrological properties of the host rock, and overall system performance assessment. For both design and performance assessment, the purpose of using models in analyses is to better understand and quantify some phenomenon or process. Therefore, validation is an important process that must be pursued in conjunction with the development and application of models. The Site Characterization Plan (SCP) addressed some general aspects of model validation, but no specific approach has, as yet, been developed for either design or performance assessment models. This paper will discuss a proposed process for thermomechanical model validation and will focus on the use of laboratory and in situ experiments as part of the validation process. The process may be generic enough in nature that it could be applied to the validation of other types of models, for example, models of unsaturated hydrologic flow.

  12. Validation of a synoptic solar wind model

    Microsoft Academic Search

    O. Cohen; I. V. Sokolov; I. I. Roussev; T. I. Gombosi

    2008-01-01

    We present a validation of a three-dimensional magnetohydrodynamic model for the solar corona and the inner heliosphere. We compare the results of the model with long-term satellite data at 1 AU for a 1 year period during solar minimum and another year period of solar maximum. Overall, the model predicts rather well the magnitude of the magnetohydrodynamical variables for solar

  13. Hot gas defrost model development and validation

    Microsoft Academic Search

    N. Hoffenbecker; S. A. Klein; D. T. Reindl

    2005-01-01

    This paper describes the development, validation, and application of a transient model for predicting the heat and mass transfer effects associated with an industrial air-cooling evaporator during a hot gas defrost cycle. The inputs to the model include the space dry bulb temperature, space humidity, coil geometry, frost thickness, frost density, and hot gas inlet temperature. The model predicts the

  14. Inert doublet model and LEP II limits

    SciTech Connect

    Lundstroem, Erik; Gustafsson, Michael; Edsjoe, Joakim [Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); INFN, Sezione di Padova, Department of Physics 'Galileo Galilei', Via Marzolo 8, I-35131, Padua (Italy) and Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden); Department of Physics, Stockholm University, AlbaNova University Center, SE - 106 91 Stockholm (Sweden)

    2009-02-01

    The inert doublet model is a minimal extension of the standard model introducing an additional SU(2) doublet with new scalar particles that could be produced at accelerators. While there exists no LEP II analysis dedicated to these inert scalars, the absence of a signal within searches for supersymmetric neutralinos can be used to constrain the inert doublet model. This translation, however, requires some care because of the different properties of the inert scalars and the neutralinos. We investigate what restrictions an existing DELPHI Collaboration study of neutralino pair production can put on the inert scalars and discuss the result in connection with dark matter. We find that although an important part of the inert doublet model parameter space can be excluded by the LEP II data, the lightest inert particle still constitutes a valid dark matter candidate.

  15. Validation of the Hot Strip Mill Model

    SciTech Connect

    Richard Shulkosky; David Rosberg; Jerrud Chapman

    2005-03-30

    The Hot Strip Mill Model (HSMM) is an off-line, PC based software originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of 5 North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

  16. Modelling, Identification and Experimental Validation of a Hydraulic Manipulator

    E-print Network

    Glen Bilodeau; Evangelos Papadopoulos

    1997-01-01

    servovalve dynamics, fluid dynamics and the vane and load dynamics. Included in the model are line losses, leakage, and hysteresis. System parameters are identified using the elbow joint of the SARCOS slave experimental hydraulic manipulator. Specialized hardware was designed and constructed for this purpose

  17. Validation plan for the German CAMAELEON model

    Microsoft Academic Search

    James R. McManamey

    1997-01-01

    Engineers and scientists at the US Army's Night Vision and Electronic Sensors Directorate (NVESD) are in the process of evaluating the German CAMAELEON model, a signature evaluation model that was created for use in designing and evaluating camouflage in the visible spectrum and is based on computational vision methodologies. Verification and preliminary validation have been very positive. For this reason,

  18. Preparation of Power Distribution System for High Penetration of Renewable Energy Part I. Dynamic Voltage Restorer for Voltage Regulation Part II. Distribution Circuit Modeling and Validation

    NASA Astrophysics Data System (ADS)

    Khoshkbar Sadigh, Arash

    Part I: Dynamic Voltage Restorer. In present-day power grids, voltage sags are recognized as a serious threat and a frequently occurring power-quality problem, with costly consequences such as tripping of sensitive loads and production loss. Consequently, the demand for high power quality and voltage stability has become a pressing issue. The dynamic voltage restorer (DVR), as a custom power device, is one of the most effective and direct solutions for "restoring" the quality of voltage at its load-side terminals when the quality of voltage at its source-side terminals is disturbed. In the first part of this thesis, a DVR configuration with no need for a bulky dc-link capacitor or energy storage is proposed. This reduces the size of the DVR and increases the reliability of the circuit. In addition, the proposed DVR topology is based on a high-frequency isolation transformer, resulting in a smaller transformer. The proposed DVR circuit, which is suitable for both low- and medium-voltage applications, is based on dc-ac converters connected in series to split the main dc link between the inputs of the dc-ac converters. This feature makes it possible to use modular dc-ac converters and to utilize low-voltage components in these converters whenever the DVR is required in medium-voltage applications. The proposed configuration is tested under different conditions of load power factor and grid voltage harmonics. It has been shown that the proposed DVR can compensate voltage sags effectively and protect sensitive loads. Following the proposition of the DVR topology, a fundamental voltage amplitude detection method applicable in both single- and three-phase systems for DVR applications is proposed. The advantages of the proposed method include applicability in a distorted power grid with no need for any low-pass filter, precise and reliable detection, and simple computation and implementation without using a phase-locked loop or lookup table. The proposed method has been verified by simulation and experimental tests under various conditions, considering all possible cases such as different amounts of voltage sag depth (VSD), different amounts of point-on-wave (POW) at which the voltage sag occurs, harmonic distortion, line frequency variation, and phase jump (PJ). Furthermore, the ripple in the fundamental voltage amplitude calculated by the proposed method, and its error, are analyzed considering line frequency variation together with harmonic distortion. The best and worst detection times of the proposed method were measured as 1 ms and 8.8 ms, respectively. Finally, the proposed method has been compared with other voltage sag detection methods available in the literature. Part II: Power System Modeling for Renewable Energy Integration. As power distribution systems evolve into more complex networks, electrical engineers have to rely on software tools to perform circuit analysis. There are dozens of powerful software tools available in the market to perform power system studies. Although their main functions are similar, there are differences in features and formatting structures to suit specific applications. This creates challenges for transferring power system circuit model data (PSCMD) between different software packages and rebuilding the same circuit in the second software environment. The objective of this part of the thesis is to develop a Unified Platform (UP) to facilitate transferring PSCMD among different software packages and relieve the challenges of the circuit model conversion process. 
UP uses a commonly available spreadsheet file with a defined format that any source software can write data to and any destination software can read data from, via a script-based application called the PSCMD transfer application. The main considerations in developing the UP are to minimize manual intervention and to import a one-line diagram into the destination software, or export it from the source software, with all details needed to allow load flow, short circuit, and other analyses. In this study, ETAP, OpenDSS, and GridLab-D are considered, and PSCMD trans
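
    The detection algorithm of Part I is not spelled out in the abstract. One PLL-free, filter-free technique consistent with the constraints listed (distorted grid, simple computation, no lookup table) is a sliding one-cycle projection onto sine and cosine at the nominal line frequency; integer harmonics average out over a full cycle. The sketch below illustrates that generic technique, not necessarily the thesis's method, with invented numbers.

```python
import numpy as np

# Generic PLL-free fundamental-amplitude tracking: project a one-cycle
# sliding window of samples onto sin/cos at the nominal line frequency.
f0, fs = 60.0, 6000.0                     # line and sampling frequencies (Hz)
n = int(fs / f0)                          # samples per fundamental cycle
t = np.arange(3 * n) / fs

# Test signal: 1.0 pu fundamental with a 5th harmonic, sagging to 0.6 pu.
v = np.where(t < 0.025, 1.0, 0.6) * np.sin(2 * np.pi * f0 * t) \
    + 0.08 * np.sin(2 * np.pi * 5 * f0 * t)

s = np.sin(2 * np.pi * f0 * t)
c = np.cos(2 * np.pi * f0 * t)
kernel = np.ones(n) / (n / 2)             # one-cycle moving average, scaled
a = np.convolve(v * s, kernel, mode="valid")   # in-phase component
b = np.convolve(v * c, kernel, mode="valid")   # quadrature component
amplitude = np.hypot(a, b)                # harmonics cancel over a full cycle

print(f"pre-sag estimate:  {amplitude[0]:.3f} pu")
print(f"post-sag estimate: {amplitude[-1]:.3f} pu")
```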

  19. Systematic Independent Validation of Inner Heliospheric Models

    NASA Astrophysics Data System (ADS)

    MacNeice, P. J.; Taktakishvili, A.

    2008-12-01

    This presentation is the first in a series which will provide independent validation of community models of the outer corona and inner heliosphere. In this work we establish a set of measures to be used in validating this group of models. We use these procedures to generate a comprehensive set of results from the Wang-Sheeley-Arge (WSA) model, which will be used as a baseline, or reference, against which to compare all other models. We also run a test of the validation procedures by applying them to a small set of results produced by the ENLIL Magnetohydrodynamic (MHD) model. In future presentations we will validate other models currently hosted by the Community Coordinated Modeling Center (CCMC), including a comprehensive validation of the ENLIL model. The WSA model is widely used to model the solar wind, and is used by a number of agencies to predict solar wind conditions at Earth as much as four days into the future. Because it is so important to both the research and space weather forecasting communities, it is essential that its performance be measured systematically and independently. In this paper we offer just such an independent and systematic validation. We report skill scores for the model's predictions of wind speed and IMF polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests line-of-sight magnetograms. For this study we generated model results for monthly magnetograms from the National Solar Observatory (SOLIS), Mount Wilson Observatory, and the GONG network, spanning the Carrington rotation range from 1650 to 2068. We compare the influence of the different magnetogram sources, performance at quiet and active times, and estimate the effect of different empirical wind speed tunings. We also consider the ability of the WSA model to identify sharp transitions in wind speed from slow to fast wind. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of MHD models under development for use in forecasting.

  20. Feature extraction for structural dynamics model validation

    SciTech Connect

    Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

    2010-11-08

    This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered are sensitivity, dimensionality, type of response, and presence or absence of measurement noise in the response. Furthermore, we illustrate a comparison method of multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must not only consider the sensitivity of the features being used, but also the correlation of the parameters being compared.
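
    A compact illustration of the outlier-detection step the abstract highlights: compute the Mahalanobis distance of a candidate model's feature vector from the cloud of experimentally derived feature vectors. The features, values, and threshold below are synthetic assumptions.

```python
import numpy as np

# Outlier detection with the Mahalanobis distance, used here to judge
# whether a simulation's feature vector is consistent with the spread of
# experimentally derived feature vectors. A real case might use spectral
# peaks or AR-model coefficients extracted from measured responses.
rng = np.random.default_rng(4)
experimental = rng.multivariate_normal([0, 0, 0], np.eye(3), size=200)

mean = experimental.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(experimental, rowvar=False))

def mahalanobis(x):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

candidate_models = {"params A": np.array([0.2, -0.1, 0.3]),
                    "params B": np.array([2.5, 3.0, -2.0])}
for name, feats in candidate_models.items():
    d = mahalanobis(feats)
    flag = "consistent" if d < 3.0 else "outlier"
    print(f"{name}: d = {d:.2f} ({flag})")
```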

  1. Structural system identification: Structural dynamics model validation

    SciTech Connect

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  2. Oil spill impact modeling: development and validation.

    PubMed

    French-McCay, Deborah P

    2004-10-01

    A coupled oil fate and effects model has been developed for the estimation of impacts to habitats, wildlife, and aquatic organisms resulting from acute exposure to spilled oil. The physical fates model estimates the distribution of oil (as mass and concentrations) on the water surface, on shorelines, in the water column, and in the sediments, accounting for spreading, evaporation, transport, dispersion, emulsification, entrainment, dissolution, volatilization, partitioning, sedimentation, and degradation. The biological effects model estimates exposure of biota of various behavior types to floating oil and subsurface contamination, resulting percent mortality, and sublethal effects on production (somatic growth). Impacts are summarized as areas or volumes affected, percent of populations lost, and production foregone because of a spill's effects. This paper summarizes existing information and data used to develop the model, model algorithms and assumptions, validation studies, and research needs. Simulation of the Exxon Valdez oil spill is presented as a case study and validation of the model. PMID:15511105

  3. Validation of an Experimentally Derived Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Cox, D. E.; Balas, G. J.; Juang, J.-N.

    1996-01-01

    The results show that uncertainty models can be obtained directly from system identification data by using a minimum norm model validation approach. The error between the test data and an analytical nominal model is modeled as a combination of unstructured additive and structured input multiplicative uncertainty. Robust controllers which use the experimentally derived uncertainty model show significant stability and performance improvements over controllers designed with assumed ad hoc uncertainty levels. Use of the identified uncertainty model also allowed a strong correlation between design predictions and experimental results.

  4. Validation of Space Weather Models at Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Pulkkinen, A.; Rastaetter, L.; Hesse, M.; Chulaki, A.; Maddox, M.

    2011-01-01

    The Community Coordinated Modeling Center (CCMC) is a multiagency partnership which aims at the creation of next-generation space weather models. The CCMC's goal is to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate advanced model deployment in forecasting operations. The CCMC conducts unbiased model testing and validation and evaluates model readiness for the operational environment. The presentation will demonstrate the recent progress in CCMC metrics and validation activities.

  5. On the Validity of Climate Models

    NASA Astrophysics Data System (ADS)

    Phillips, Thomas; AchutaRao, Krishna; Bader, David; Covey, Curtis; Gleckler, Peter; Sperber, Kenneth; Taylor, Karl

    2007-03-01

    We object to contributor Kevin Corbett's assertions, in his article "On award to Crichton" (Eos, 87(43), 464, 2006), that "Too often now, models are taken as data and their results taken as fact, when the accuracy of the models in predicting even short-term effects is poor and the fundamental validity for most climate models is opaque...." Corbett cites (among other references) our Eos article "Coupled climate model appraisal: A benchmark for future studies", implying that our findings support his remarks. In fact, our evaluation of model simulations relative to observational data leads us to very different conclusions.

  6. Validity of the Sleep Subscale of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II)

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Malone, Carrie J.

    2006-01-01

    Currently there are no available sleep disorder measures for individuals with severe and profound intellectual disability. We, therefore, attempted to establish the external validity of the "Diagnostic Assessment for the Severely Handicapped-II" (DASH-II) sleep subscale by comparing daily observational sleep data with the responses of direct care…

  7. Regimes of validity for balanced models

    NASA Astrophysics Data System (ADS)

    Gent, Peter R.; McWilliams, James C.

    1983-07-01

    Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

  8. Fraction Model II

    NSDL National Science Digital Library

    NCTM Illuminations

    2000-01-01

    With this tool, students can explore different representations for fractions. They can create a fraction, selecting any numerator or denominator up to 20, and see a model of the fraction as well as its percent and decimal equivalents. For the model, they can choose either a circle, a rectangle, or a set model.
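
    The applet's numeric read-outs are easy to reproduce; for instance, with Python's fractions module (an illustration, not part of the tool itself):

```python
from fractions import Fraction

# Reproduce the applet's numeric read-outs: a fraction with numerator
# and denominator up to 20, shown with its decimal and percent forms.
for num, den in [(3, 4), (7, 20), (5, 8)]:
    frac = Fraction(num, den)
    value = num / den
    print(f"{frac} = {value:.4f} = {100 * value:.1f}%")
```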

  9. Solar Sail Model Validation from Echo Trajectories

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Brickerhoff, Adam T.

    2007-01-01

    The NASA In-Space Propulsion program has been engaged in a project to increase the technology readiness of solar sails. Recently, these efforts came to fruition in the form of several software tools to model solar sail guidance, navigation and control. Furthermore, solar sails are one of five technologies competing for the New Millennium Program Space Technology 9 flight demonstration mission. The historic Echo 1 and Echo 2 balloons were made of aluminized Mylar, which is the near-term material of choice for solar sails. Both spacecraft, but particularly Echo 2, were in low Earth orbits with characteristics similar to the proposed Space Technology 9 orbit. Therefore, the Echo balloons are excellent test cases for solar sail model validation. We present the results of studies of Echo trajectories that validate solar sail models of optics, solar radiation pressure, shape and low-thrust orbital dynamics.
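
    The solar radiation pressure component of such models is typically some variant of the ideal flat-plate formula. The sketch below implements that commonly used simplification with invented sail numbers; it ignores wrinkling, thermal re-emission, and Earth albedo, effects a validation against tracked trajectories would have to confront.

```python
import math

# Simplified flat-plate solar radiation pressure model of the kind such
# validation studies fit to tracking data. Values are illustrative only.
SOLAR_FLUX_1AU = 1361.0          # W/m^2 at 1 AU
C = 299_792_458.0                # speed of light, m/s

def srp_acceleration(area_m2, mass_kg, sun_angle_rad, reflectivity=0.88):
    """Acceleration (m/s^2) on a flat reflective surface at 1 AU.

    A common simplification: the normal-force component scales as
    cos^2(theta), with a (1 + rho) factor for specular reflectivity rho.
    """
    pressure = SOLAR_FLUX_1AU / C                   # ~4.5e-6 N/m^2
    return (pressure * area_m2 * (1 + reflectivity)
            * math.cos(sun_angle_rad) ** 2 / mass_kg)

# Example: a hypothetical 20 m x 20 m sail massing 50 kg, sun-normal.
print(f"{srp_acceleration(400.0, 50.0, 0.0):.2e} m/s^2")
```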

  10. Using airborne laser scanning profiles to validate marine geoid models

    NASA Astrophysics Data System (ADS)

    Julge, Kalev; Gruno, Anti; Ellmann, Artu; Liibusk, Aive; Oja, Tõnis

    2014-05-01

    Airborne laser scanning (ALS) is a remote sensing method which utilizes LiDAR (Light Detection And Ranging) technology. The datasets collected are important sources for a large range of scientific and engineering applications. ALS is mostly used to measure terrain surfaces for the compilation of Digital Elevation Models, but it can also be used in other applications. This contribution focuses on the use of an ALS system for measuring sea surface heights and validating gravimetric geoid models over marine areas. This is based on the ability of ALS to register echoes of the LiDAR pulse from the water surface. A case study was carried out to analyse the possibilities for validating marine geoid models by using ALS profiles. A test area at the southern shores of the Gulf of Finland was selected for regional geoid validation. ALS measurements were carried out by the Estonian Land Board in spring 2013 at different altitudes and using different scan rates. The single-wavelength Leica ALS50-II laser scanner on board a small aircraft was used to determine the sea level (with respect to the GRS80 reference ellipsoid), which follows roughly the equipotential surface of the Earth's gravity field. For the validation, a high-resolution (1'x2') regional gravimetric GRAV-GEOID2011 model was used. This geoid model covers the entire area of Estonia and the surrounding waters of the Baltic Sea. The fit between the geoid model and GNSS/levelling data within the Estonian dry land revealed an RMS of residuals of ±1… ±2 cm. Note that such a fitting validation cannot proceed over marine areas. Therefore, an ALS observation-based methodology was developed to evaluate the GRAV-GEOID2011 quality over marine areas. The accuracy of the acquired ALS dataset was analyzed, and an optimal width of the nadir corridor containing good quality ALS data was determined. The impact of ALS scan angle range and flight altitude on the obtainable vertical accuracy was investigated as well. The quality of the point cloud was analysed by cross-validation between overlapping flight lines and by comparison with tide gauge readings. The comparisons revealed that the ALS-based profiles of sea level heights agree reasonably with the regional geoid model (within the accuracy of the ALS data and after applying corrections due to sea level variations). Thus ALS measurements are suitable for measuring sea surface heights and validating marine geoid models.
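
    The core of the comparison reduces to differencing ALS sea-surface heights against geoid model heights after removing the instantaneous sea level, then taking an RMS. A toy version with synthetic values (not the study's data):

```python
import numpy as np

# Toy version of the validation comparison: ALS-derived sea-surface
# heights along a flight line vs. geoid model heights, after removing the
# instantaneous sea level read from a tide gauge. All values synthetic.
rng = np.random.default_rng(5)
along_track_km = np.linspace(0, 50, 500)
geoid_model = 18.0 + 0.02 * along_track_km          # ellipsoidal heights (m)
tide_gauge_level = 0.35                             # sea level above geoid (m)
als_heights = geoid_model + tide_gauge_level + rng.normal(0, 0.05, 500)

residuals = als_heights - tide_gauge_level - geoid_model
rms = np.sqrt(np.mean(residuals ** 2))
print(f"RMS of residuals: {100 * rms:.1f} cm")
```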

  11. Concepts of Model Verification and Validation

    SciTech Connect

    B.H.Thacker; S.W.Doebling; F.M.Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all safety-related nuclear facility design, analyses, and operations. In fact, DNFSB 2002-1 recommends to the DOE and National Nuclear Security Administration (NNSA) that a V&V process be performed for all safety related software and analysis. Model verification and validation are the primary processes for quantifying and building credibility in numerical models. Verification is the process of determining that a model implementation accurately represents the developer's conceptual description of the model and its solution. Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Both verification and validation are processes that accumulate evidence of a model's correctness or accuracy for a specific scenario; thus, V&V cannot prove that a model is correct and accurate for all possible scenarios, but, rather, it can provide evidence that the model is sufficiently accurate for its intended use. Model V&V is fundamentally different from software V&V. Code developers developing computer programs perform software V&V to ensure code correctness, reliability, and robustness. In model V&V, the end product is a predictive model based on fundamental physics of the problem being solved. In all applications of practical interest, the calculations involved in obtaining solutions with the model require a computer code, e.g., finite element or finite difference analysis. 
Therefore, engineers seeking to develop credible predictive models critically need model V&V guidelines and procedures. The expected outcome of the model V&V process is the quantified level of agreement between experimental data and model prediction, as well as the predictive accuracy of the model. This report attempts to describe the general philosophy, definitions, concepts, and processes for conducting a successful V&V program. This objective is motivated by the need for highly accurate numerical models for making predictions to s

  12. Code validation with EBR-II test data

    Microsoft Academic Search

    J. P. Herzog; L. K. Chang; E. M. Dean; E. E. Feldman; D. J. Hill; D. Mohr; H. P. Planchon

    1991-01-01

    An extensive system of computer codes is used at Argonne National Laboratory to analyze the whole-plant transient behavior of the Experimental Breeder Reactor II (EBR-II). Three of these codes, NATDEMO/HOTCHAN, SASSYS, and DSNP, have been validated with data from reactor transient tests. The validated codes are the foundation of safety analyses and pretest predictions for the continuing design improvements and experimental programs.

  13. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    USGS Publications Warehouse

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead of promoting passive or self-righteous decisions.

  14. Using Model Checking to Validate AI Planner Domain Models

    NASA Technical Reports Server (NTRS)

    Penix, John; Pecheur, Charles; Havelund, Klaus

    1999-01-01

    This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.

  15. Model Validation using Automatically Generated Requirements-Based Tests

    Microsoft Academic Search

    Ajitha Rajan; Michael W. Whalen; Mats P. E. Heimdahl

    2007-01-01

    In current model-based development practice, validation that we are building a correct model is achieved by manually deriving requirements-based test cases for model testing. Model validation performed this way is time consuming and expensive, particularly in the safety critical systems domain where high confidence in the model correctness is required. In an effort to reduce the validation effort, we propose

  16. MODEL VALIDATION VIA UNCERTAINTY PROPAGATION USING RESPONSE SURFACE MODELS

    Microsoft Academic Search

    Lusine Baghdasaryan; Wei Chen; Thaweepat Buranathiti; Jian Cao

    Model validation has become a primary means to evaluate accuracy and reliability of computational simulations in engineering design. Mathematical models enable engineers to establish what the most likely response of a system is. However, despite the enormous power of computational models, uncertainty is inevitable in all model-based engineering design problems, due to the variation in the physical system itself, or

  17. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks ahead with reasonable reliability; accuracy depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability. PMID:23755236
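
    Two of the forecast-quality metrics implied by the abstract, error in peak timing and relative error in peak intensity, can be computed in a few lines; the weekly counts below are synthetic stand-ins for observed and forecast epidemic curves.

```python
import numpy as np

# Simple forecast-quality metrics for an epidemic curve: error in peak
# timing and relative error in peak intensity. Synthetic weekly counts.
weeks = np.arange(30)
observed = 1000 * np.exp(-0.5 * ((weeks - 14) / 3.5) ** 2)
forecast = 900 * np.exp(-0.5 * ((weeks - 12) / 4.0) ** 2)

peak_week_error = int(np.argmax(forecast)) - int(np.argmax(observed))
intensity_error = (forecast.max() - observed.max()) / observed.max()

print(f"peak week error: {peak_week_error} weeks")
print(f"peak intensity error: {intensity_error:+.0%}")
```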

  18. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
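
    For concreteness, the classic pair of graphical checks for a linear normal model, residuals versus fitted values and a normal Q-Q plot, can be generated as follows (synthetic data; this is a generic illustration, not the paper's teaching tool):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Classic graphical checks for a linear normal model: residuals vs. fitted
# values (constant variance, no structure) and a normal Q-Q plot
# (normality of errors). Synthetic data for illustration.
rng = np.random.default_rng(6)
x = rng.uniform(0, 10, 100)
y = 2.0 + 1.5 * x + rng.normal(0, 1.0, 100)

slope, intercept = np.polyfit(x, y, 1)      # ordinary least-squares fit
fitted = intercept + slope * x
residuals = y - fitted

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))
ax1.scatter(fitted, residuals, s=12)
ax1.axhline(0.0, color="grey")
ax1.set(xlabel="fitted values", ylabel="residuals")
stats.probplot(residuals, dist="norm", plot=ax2)
fig.tight_layout()
plt.show()
```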

  19. Validation of Space Weather Models at Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; Zheng, Y.; MacNeice, P. J.; Shim, J.; Taktakishvili, A.; Chulaki, A.

    2011-01-01

    The Community Coordinated Modeling Center (CCMC) is a multi-agency partnership to support the research and developmental work necessary to substantially increase space weather modeling capabilities and to facilitate advanced models deployment in forecasting operations. Space weather models and coupled model chains hosted at the CCMC range from the solar corona to the Earth's upper atmosphere. CCMC has developed a number of real-time modeling systems, as well as a large number of modeling and data products tailored to address the space weather needs of NASA's robotic missions. The CCMC conducts unbiased model testing and validation and evaluates model readiness for operational environment. CCMC has been leading recent comprehensive modeling challenges under GEM, CEDAR and SHINE programs. The presentation will focus on experience in carrying out comprehensive and systematic validation of large sets of. space weather models

  20. Turbulence Modeling Validation, Testing, and Development

    NASA Technical Reports Server (NTRS)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model development. The results are presented in four parts: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  1. A regulatory perspective on model validation

    SciTech Connect

    Eisenberg, N.; Federline, M. [Nuclear Regulatory Commission, Washington, DC (United States); Sagar, B. [Center of Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)] [and others

    1995-12-01

    Licensing of a high-level nuclear waste repository will depend on the accumulated evidence regarding its long-term safety. Performance assessment models are a critical part of this evidence and provide a means to synthesize and collect information and to focus it on the long-term performance of the repository. To fulfill this essential role in making the case for the safety of the repository, performance assessment models must have a suitable degree of predictive capability and the case for this capability must be scrutable. Since the usual manner of demonstrating the predictive capability of the models, that is comparison to actual performance, is unavailable because of the long times involved, other means of building confidence must be adopted. To achieve this goal, an applied science perspective on model validation is adopted; i.e., a model need only make predictions accurate enough to serve the purpose for which the model is intended. The model need not predict all phenomena and behavior observed. The model, however, must predict the essential behavior of interest and must be firmly rooted in accepted scientific principles and their application. Although it may not be possible to prove that performance assessment models precisely represent the real world, it is possible to engage in activities that may develop confidence in models sufficient for demonstrating repository safety.

  2. Evaluating the Validity and Reliability of PDQ-II and Comparison with DDST-II for Two Step Developmental Screening

    PubMed Central

    Shahshahani, Soheila; Sajedi, Firoozeh; Azari, Nadia; Vameghi, Roshanak; Kazemnejad, Anooshirvan; Tonekaboni, Seyed-Hasan

    2011-01-01

    Objective This research was designed to identify the validity and reliability of the Prescreening Developmental Questionnaire 2 (PDQ-II) in Tehran in comparison with the Denver Developmental Screening Test-II (DDST-II). Methods After translation and back translation, the final Persian version of the test was verified by three pediatricians and also by reviewing relevant literature for content validity. The test was performed on 237 children ranging from 0 to 6 years old, recruited by convenience sampling from four health care clinics in Tehran city. They were also evaluated by DDST-II simultaneously. Interrater methods and Cronbach's α were used to determine the reliability of the test. The Kappa agreement coefficient between PDQ and DDST-II was determined. The data were analyzed with SPSS software. Findings All of the questions in PDQ had satisfactory content validity. The total Cronbach's α coefficients of the 0–9 months, 9–24 months, 2–4 years and 4–6 years questionnaires were 0.951, 0.926, 0.950 and 0.876, respectively. The Kappa measure of agreement for interrater tests was 0.89. The estimated agreement coefficient between PDQ and DDST-II was 0.383. Based on two different categorizing possibilities for questionable scores, that is, "Delayed" or "Normal", sensitivity and specificity of PDQ were determined to be 35.7–63% and 75.8–92.2%, respectively. Conclusion PDQ has good content validity and reliability and moderate sensitivity and specificity in comparison with the DDST-II, but considering their relatively weak agreement coefficient, using it along with DDST-II for a two-stage developmental screening process remains doubtful. PMID:23056811
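
    As an aside for readers reproducing such analyses, the two reliability statistics reported above are simple to compute directly. The following minimal Python sketch is our own illustration (numpy-only; function names and input shapes are hypothetical, not from the study): Cronbach's α from a respondents-by-items score matrix, and Cohen's κ for two raters' categorical labels.

        import numpy as np

        def cronbach_alpha(items):
            # items: respondents x questions matrix of item scores
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        def cohen_kappa(rater_a, rater_b):
            # chance-corrected agreement between two raters' labels
            a, b = np.asarray(rater_a), np.asarray(rater_b)
            cats = np.union1d(a, b)
            p_obs = np.mean(a == b)
            p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
            return (p_obs - p_exp) / (1.0 - p_exp)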

  3. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  4. Improving Model Quality by Validating Constraints with Model Unit Tests

    Microsoft Academic Search

    Lars Hamann; Martin Gogolla

    2010-01-01

    A central part of modern development methods is the use of tests. A well-defined test suite is usually the basis for code refactoring because changes to the system under test can be easily validated against the test suite. In model-based development, tests can be derived from the model, but possibilities to test the originally specified model and therefore to improve

  5. Validity

    NSDL National Science Digital Library

    Edwin P. Christmann

    2008-11-01

    In this chapter, the authors will describe the four types of validity: construct validity, content validity, concurrent validity, and predictive validity. Depending on the test and the rationale or purpose for its administration, an understanding of the

  6. Validation of uncertainty estimates in hydrologic modelling

    NASA Astrophysics Data System (ADS)

    Thyer, M.; Engeland, K.; Renard, B.; Kuczera, G.; Franks, S.

    2009-04-01

    Meaningful characterization of uncertainties affecting conceptual rainfall-runoff (CRR) models remains a challenging research area in the hydrological community. Numerous methods aimed at quantifying the uncertainty in hydrologic predictions have been proposed over the last decades. In most cases, the outcome of such methods takes the form of a predictive interval, computed from a predictive distribution. Regardless of the method used to derive it, it is important to note that the predictive distribution results from the assumptions made during the inference. Consequently, unsupported assumptions may lead to inadequate predictive distributions, i.e. under- or over-estimated uncertainties. It follows that the estimated predictive distribution must be thoroughly scrutinized ("validated"); as discussed by Hall et al. [2007], "Without validation, calibration is worthless, and so is uncertainty estimation". The aim of this communication is to study diagnostic tools aimed at assessing the reliability of uncertainty estimates. From a methodological point of view, this requires diagnostic approaches that compare a time-varying distribution (the predictive distribution at all times t) to a time series of observations. This is a much more stringent test than validation methods currently used in hydrology, which simply compare two time series (observations and "optimal" simulations). Indeed, standard goodness-of-fit assessments (e.g. using the Nash-Sutcliffe statistic) cannot check if the predictive distribution is consistent with the observed data. The usefulness of the proposed diagnostic tools will be illustrated with a case study comparing the performance of several uncertainty quantification frameworks. In particular, it will be shown that standard validation approaches (e.g. based on the Nash-Sutcliffe statistic, or verifying that about p% of the observations lie within the p% predictive interval) are not able to discriminate competing frameworks whose performance (in terms of uncertainty quantification) is evidently different.
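
    As a concrete illustration of the kind of diagnostic discussed here, the sketch below (our own, with hypothetical variable names; not the authors' code) checks two basic reliability properties of a predictive distribution: the empirical coverage of a p% predictive interval, and the probability integral transform (PIT) values, which should be approximately uniform when the predictive distribution is consistent with the observations.

        import numpy as np

        def interval_coverage(obs, lower, upper):
            # fraction of observations inside the predictive interval;
            # for a reliable p% interval this should be close to p
            obs, lower, upper = map(np.asarray, (obs, lower, upper))
            return np.mean((obs >= lower) & (obs <= upper))

        def pit_values(obs, ensemble):
            # ensemble: times x samples draws from the predictive distribution;
            # a reliable predictive distribution yields roughly uniform PITs
            obs = np.asarray(obs)
            ensemble = np.asarray(ensemble)
            return np.mean(ensemble <= obs[:, None], axis=1)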

  7. Plasma Reactor Modeling and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Meyyappan, M.; Bose, D.; Hash, D.; Hwang, H.; Cruden, B.; Sharma, S. P.; Rao, M. V. V. S.; Arnold, Jim (Technical Monitor)

    2001-01-01

    Plasma processing is a key processing step in integrated circuit manufacturing. Low-pressure, high-density plasma reactors are widely used for etching and deposition. The inductively coupled plasma (ICP) source has recently become popular in many processing applications. In order to accelerate equipment and process design, an understanding of the physics and chemistry, particularly plasma power coupling, plasma and processing uniformity, and reaction mechanisms, is important. This understanding is facilitated by comprehensive modeling and simulation, as well as by plasma diagnostics that provide the data needed for model validation; both are addressed in this presentation. We have developed a complete code for simulating an ICP reactor; the model consists of transport of electrons, ions, and neutrals, Poisson's equation, and Maxwell's equations, along with gas flow and energy equations. Results will be presented for chlorine and fluorocarbon plasmas and compared with data from Langmuir probes, mass spectrometry, and FTIR.

  8. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and the predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops an efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
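
    The ARR idea can be illustrated with a toy example. The sketch below is a minimal illustration under assumed sensor relationships, not the paper's algorithm: two redundancy relations over three sensors, with the suspect sensor inferred logically from the pattern of fired residuals.

        # Toy system: s1 and s2 both measure the same quantity; s3 measures
        # their average. Two analytical redundancy relations (ARRs) follow.
        def arr_residuals(s1, s2, s3):
            r1 = s1 - s2                 # ARR 1: duplicated sensors agree
            r2 = s3 - 0.5 * (s1 + s2)    # ARR 2: s3 equals the mean of s1, s2
            return r1, r2

        def suspect_sensors(s1, s2, s3, tol=0.5):
            # logical fault isolation from the pattern of fired residuals
            r1, r2 = arr_residuals(s1, s2, s3)
            fired = (abs(r1) >= tol, abs(r2) >= tol)
            return {(False, False): "none",
                    (False, True): "s3",
                    (True, True): "s1 or s2",
                    (True, False): "ambiguous (multiple faults?)"}[fired]

        print(suspect_sensors(10.0, 10.1, 12.0))   # -> "s3"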

  9. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and simulation tool, and completion of service requests from a broad end user consortium including Operations, Science and Technology Planning, and Exploration Planning. CONCLUSIONS: The VVC approach established by the IMM project of combining the IMM VV Plan with 7009 requirements is comprehensive and includes the involvement of end users at every stage in IMM evolution. Methods and techniques used to quantify the VVC status of the IMM have not only received approval from the local NASA community but have also garnered recognition by other federal agencies seeking to develop similar guidelines in the medical modeling community.

  10. Boron-10 Lined Proportional Counter Model Validation

    SciTech Connect

    Lintereur, Azaree T.; Siciliano, Edward R.; Kouzes, Richard T.

    2012-06-30

    The Department of Energy Office of Nuclear Safeguards (NA-241) is supporting the project “Coincidence Counting With Boron-Based Alternative Neutron Detection Technology” at Pacific Northwest National Laboratory (PNNL) for the development of an alternative neutron coincidence counter. The goal of this project is to design, build and demonstrate a boron-lined proportional tube-based alternative system in the configuration of a coincidence counter. This report discusses the validation studies performed to establish the degree of accuracy of the computer modeling methods currently used to simulate the response of boron-lined tubes. This is the precursor to developing models for the uranium neutron coincidence collar under Task 2 of this project.

  11. Hydrological validation of multifractal rainfall simulation models

    NASA Astrophysics Data System (ADS)

    Mouhous, N.; Gaume, E.; Andrieu, H.

    2003-04-01

    The observed scaling invariance properties of rainfall time series have often been put forward to justify the choice of multifractal (scaling) models for rainfall stochastic modelling. These models are nevertheless seldom validated on real hydrological applications. Two types of multifractal models - the first one with a Log-Poisson generator and the second one with a uniform generator - were calibrated on an 8-year point rainfall series with a five-minute time step. The results obtained with the rainfall series simulated with these models in two hydrological applications (the computation of intensity-duration-frequency (IDF) curves and the design of an urban drainage storage volume) were compared with those obtained with the original measured rainfall series. The disagreements reveal some limitations of the multifractal models. On the one hand, using the vocabulary of the multifractalists, the models are calibrated on the basis of the statistical properties of the simulated undressed series, while the IDF curves are computed on the dressed series. The statistical properties of both types of series clearly differ if a canonical model is used: here, the model with the Log-Poisson generator. On the other hand, the optimal dimensions of the storage volume depend on the shape of the hyetographs. The discrepancies between the volumes obtained with the simulated and the measured rainfall series indicate that the temporal structure of the simulated rainfall intensity series (i.e. the shapes of the simulated hyetographs) is not comparable with that of the measured series. In conclusion, multifractal models appear to reproduce accurately only some of the properties of the real measured series. Their appropriateness should not be asserted a priori but verified for each considered application.

  12. Validation status of the TARDEC visual model (TVM)

    Microsoft Academic Search

    Grant R. Gerhart; Richard Goetz; Thomas J. Meitzler; Robert E. Karlsen

    1996-01-01

    An extensive effort is ongoing to validate the TARDEC visual model (TVM). This paper describes in detail some recent efforts to utilize the model for dual-use commercial and military target acquisition applications. The recent completion of a visual perception laboratory within TARDEC provides a useful tool to calibrate and validate human performance models for specific visual tasks. Some validation

  13. Model Validation using Automatically Generated Requirements-Based Tests

    Microsoft Academic Search

    Ajitha Rajan; Michael W. Whalen; Mats Per Erik Heimdahl

    2007-01-01

    In current model-based development practice, validation that we are building a correct model is achieved by manually deriving requirements-based test cases for model testing. Model validation performed this way is time consuming and expensive, particularly in the safety-critical systems domain where high confidence in model correctness is required. In an effort to reduce the validation effort, we propose an approach

  14. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

  15. Boron-10 Lined Proportional Counter Model Validation

    SciTech Connect

    Lintereur, Azaree T.; Ely, James H.; Kouzes, Richard T.; Rogers, Jeremy L.; Siciliano, Edward R.

    2012-11-18

    The decreasing supply of 3He is stimulating a search for alternative neutron detectors; one potential 3He replacement is 10B-lined proportional counters. Simulations are being performed to predict the performance of systems designed with 10B-lined tubes. Boron-10-lined tubes are challenging to model accurately because the neutron capture material is not the same as the signal-generating material. Thus, to simulate the efficiency, the neutron capture reaction products that escape the lining and enter the signal-generating fill gas must be tracked. The tube lining thickness and composition are typically proprietary vendor information, and therefore add additional variables to the system simulation. The modeling methodologies used to predict the neutron detection efficiency of 10B-lined proportional counters were validated by comparing simulated to measured results. The measurements were made with a 252Cf source positioned at several distances from a moderated 2.54-cm diameter 10B-lined tube. Models were constructed of the experimental configurations using the Monte Carlo transport code MCNPX, which is capable of tracking the reaction products from the 10B(n,α) reaction. Several different lining thicknesses and compositions were simulated for comparison with the measured data. This paper presents the results of the evaluation of the experimental and simulated data, and a summary of how the different linings affect the performance of a coincidence counter configuration designed with 10B-lined proportional counters.

  16. Diurnal ocean surface layer model validation

    NASA Technical Reports Server (NTRS)

    Hawkins, Jeffrey D.; May, Douglas A.; Abell, Fred, Jr.

    1990-01-01

    The diurnal ocean surface layer (DOSL) model at the Fleet Numerical Oceanography Center forecasts the 24-hour change in global sea surface temperature (SST). Validating the DOSL model is a difficult task due to the huge areas involved and the lack of in situ measurements. Therefore, this report details the use of satellite infrared multichannel SST imagery to provide day and night SSTs that can be directly compared to DOSL products. This water-vapor-corrected imagery has the advantages of high thermal sensitivity (0.12°C), large synoptic coverage (nearly 3000 km across), and high spatial resolution that enables diurnal heating events to be readily located and mapped. Several case studies in the subtropical North Atlantic readily show that DOSL results during extreme heating periods agree very well with satellite-imagery-derived values in terms of the pattern of diurnal warming. The low wind and cloud-free conditions necessary for these events to occur lend themselves well to observation via infrared imagery. Thus, the normally cloud-limited aspects of satellite imagery do not come into play for these particular environmental conditions. The fact that the DOSL model does well in extreme events is beneficial from the standpoint that these cases can be associated with the destruction of the surface acoustic duct. This so-called afternoon effect happens as the afternoon warming of the mixed layer disrupts the sound channel and the propagation of acoustic energy.

  17. Validating agent based models through virtual worlds.

    SciTech Connect

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina [Sandia National Laboratories, Livermore, CA]; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E. [North Carolina State University, Raleigh, NC]; Bernstein, Jeremy Ray Rhythm [Gaikai, Inc., Aliso Viejo, CA]

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior. Results from our work indicate that virtual worlds have the potential to serve as a proxy in allocating and populating behaviors that would be used within further agent-based modeling studies.

  18. Model Based Test Generation for Microprocessor Architecture Validation

    E-print Network

    Minnesota, University of

    Model Based Test Generation for Microprocessor Architecture Validation. Sreekumar V. Kodakara. Abstract: Functional validation of microprocessors is growing in complexity in current and future microprocessors. Traditionally, the different components (or validation collaterals) used in simulation-based

  19. Lidar validation of SAGE II aerosol measurements after the 1991 Mount Pinatubo eruption

    E-print Network

    Robock, Alan

    Lidar validation of SAGE II aerosol measurements after the 1991 Mount Pinatubo eruption. … the possibility of filling the vertical gaps using lidar data. We compare every coincident backscattering measurement (at a wavelength of 0.694 μm) from two lidars, at Mauna Loa, Hawaii (19.5°N, 155.6°W

  20. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
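
    For level-1 information (the prognostic index only), discrimination is commonly summarized with Harrell's concordance index. A minimal Python sketch of that statistic follows (our own illustration, assuming a higher index means higher risk; this is not code from the paper).

        import numpy as np

        def harrell_c(prog_index, time, event):
            # Concordance over usable pairs: subject i had an observed event
            # and failed before subject j was last seen; the pair is
            # concordant if i also has the higher (riskier) prognostic index.
            prog_index, time, event = map(np.asarray, (prog_index, time, event))
            num = den = 0.0
            for i in range(len(time)):
                if not event[i]:
                    continue
                for j in range(len(time)):
                    if time[i] < time[j]:
                        den += 1
                        if prog_index[i] > prog_index[j]:
                            num += 1.0
                        elif prog_index[i] == prog_index[j]:
                            num += 0.5
            return num / den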

  1. The Risk Map: A New Tool for Validating Risk Models

    E-print Network

    Paris-Sud XI, Université de

    The Risk Map: A New Tool for Validating Risk Models. Gilbert Colletaz, Christophe Hurlin, Christophe Pérignon. October 2012. Abstract: This paper presents a new method to validate risk models: the Risk Map … information about the performance of a risk model. It relies on the concept of a super exception, which is defined

  2. Challenges of Validating Global Assimilative Models of the Ionosphere

    Microsoft Academic Search

    G. J. Bishop; L. F. McNamara; J. A. Welsh; D. T. Decker; C. R. Baker

    2008-01-01

    This paper addresses the often surprisingly difficult challenges that arise in conceptually simple validations of global models of the ionosphere. AFRL has been tasked with validating the Utah State University GAIM (Global Assimilation of Ionospheric Measurements) model of the ionosphere, which is run in real time by the Air Force Weather Agency. The USU-GAIM model currently assimilates, in addition to

  3. What do we mean by validating a prognostic model?

    Microsoft Academic Search

    Douglas G. Altman; Patrick Royston

    2000-01-01

    SUMMARY Prognostic models are used in medicine for investigating patient outcome in relation to patient and disease characteristics. Such models do not always work well in practice, so it is widely recommended that they need to be validated. The idea of validating a prognostic model is generally taken to mean establishing that it works satisfactorily for patients other than those

  4. Micromachined accelerometer design, modeling and validation

    SciTech Connect

    Davies, B.R.; Bateman, V.I.; Brown, F.A.; Montague, S.; Murray, J.R.; Rey, D.; Smith, J.H.

    1998-04-01

    Micromachining technologies enable the development of low-cost devices capable of sensing motion in a reliable and accurate manner. The development of various surface micromachined accelerometers and gyroscopes to sense motion is an ongoing activity at Sandia National Laboratories. In addition, Sandia has developed a fabrication process for integrating both the micromechanical structures and microelectronics circuitry of Micro-Electro-Mechanical Systems (MEMS) on the same chip. This integrated surface micromachining process provides substantial performance and reliability advantages in the development of MEMS accelerometers and gyros. A Sandia MEMS team developed a single-axis, micromachined silicon accelerometer capable of surviving and measuring very high accelerations, up to 50,000 times the acceleration due to gravity or 50 k-G (actually measured to 46,000 G). The Sandia integrated surface micromachining process was selected for fabrication of the sensor due to the extreme measurement sensitivity potential associated with integrated microelectronics. Measurement electronics capable of measuring attofarad (10⁻¹⁸ Farad) changes in capacitance were required due to the very small accelerometer proof mass (< 200 × 10⁻⁹ gram) used in this surface micromachining process. The small proof mass corresponded to small sensor deflections which in turn required very sensitive electronics to enable accurate acceleration measurement over a range of 1 to 50 k-G. A prototype sensor, based on a suspended plate mass configuration, was developed and the details of the design, modeling, and validation of the device will be presented in this paper. The device was analyzed using both conventional lumped parameter modeling techniques and finite element analysis tools. The device was tested and performed well over its design range.

  5. Geochemistry Model Validation Report: Material Degradation and Release Model

    SciTech Connect

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  6. A Toxicokinetic Model for Predicting the Tissue Distribution and Elimination of Organic and Inorganic Mercury Following Exposure to Methyl Mercury in Animals and Humans. II. Application and Validation of the Model in Humans

    Microsoft Academic Search

    Gaétan Carrier; Michèle Bouchard; Robert C. Brunet; Mylène Caza

    2001-01-01

    The objective of this study was to develop a biologically based dynamical model describing the disposition kinetics of methyl mercury and its inorganic mercury metabolites in humans following different methyl mercury exposure scenarios. The model conceptual and functional representation was similar to that used for rats but relevant data on humans served to determine the critical parameters of the kinetic

  7. Validation of Arabic and English versions of the ARSMA-II Acculturation Rating Scale.

    PubMed

    Jadalla, Ahlam; Lee, Jerry

    2015-02-01

    To translate and adapt the Acculturation Rating Scale of Mexican-Americans II (ARSMA-II) for Arab Americans. A multistage translation process was followed by a pilot and a large study. The translated and adapted versions, the Acculturation Rating Scale for Arab Americans-II in Arabic and English (ARSAA-IIA, ARSAA-IIE), were validated in a sample of 297 Arab Americans. Factor analyses with principal axis factoring extraction and direct oblimin rotation were used to identify the underlying structure of the ARSAA-II. Factor analysis confirmed the underlying structure and produced two interpretable factors, labeled 'Attraction to American Culture' (AAmC) and 'Attraction to Arabic Culture' (AArC). The Cronbach's alphas of AAmC and AArC were .89 and .85, respectively. Findings support the ARSAA-IIA and ARSAA-IIE as measures of acculturation among Arab Americans. The emergent factors of ARSAA-II support the theoretical structure of the original ARSMA-II tool and show high internal consistency. PMID:23934518

  8. Statistical Validation of Normal Tissue Complication Probability Models

    SciTech Connect

    Xu, Chengjian [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)], E-mail: c.j.xu@umcg.nl; Schaaf, Arjen van der; Veld, Aart A. van't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)]; Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)]

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
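
    For readers wanting to replicate this style of analysis, the sketch below shows the two ingredients on entirely synthetic data: a nested ("double") cross-validation of an L1-penalized (LASSO-type) logistic model, and a permutation test of its score. It uses scikit-learn; the feature and label construction is hypothetical, not the study's data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import (GridSearchCV, StratifiedKFold,
                                             cross_val_score, permutation_test_score)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(150, 12))                                   # hypothetical dose/clinical features
        y = (X[:, 0] - X[:, 1] + rng.normal(size=150) > 0).astype(int)   # hypothetical toxicity labels

        lasso = LogisticRegression(penalty="l1", solver="liblinear")

        # Double (nested) cross-validation: the inner loop tunes the penalty,
        # the outer loop assesses the tuned model on held-out folds.
        inner = GridSearchCV(lasso, {"C": [0.01, 0.1, 1.0, 10.0]},
                             cv=StratifiedKFold(5, shuffle=True, random_state=1),
                             scoring="roc_auc")
        outer_auc = cross_val_score(inner, X, y, scoring="roc_auc",
                                    cv=StratifiedKFold(5, shuffle=True, random_state=2))
        print(f"nested-CV AUC: {outer_auc.mean():.2f} +/- {outer_auc.std():.2f}")

        # Permutation test: compare the observed score against the null
        # distribution obtained by refitting on label-shuffled data.
        score, perm_scores, p_value = permutation_test_score(
            lasso, X, y, scoring="roc_auc", n_permutations=100,
            cv=StratifiedKFold(5, shuffle=True, random_state=3))
        print(f"AUC = {score:.2f}, permutation p-value = {p_value:.3f}")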

  9. Testing a better method of predicting postsurgery soft tissue response in Class II patients: A prospective study and validity assessment.

    PubMed

    Yoon, Kyoung-Sik; Lee, Ho-Jin; Lee, Shin-Jae; Donatelli, Richard E

    2014-10-01

    Objective: (1) To perform a prospective study using a new set of data to test the validity of a new soft tissue prediction method developed for Class II surgery patients and (2) to propose a better validation method that can be applied to a validation study. Materials and Methods: Subjects were composed of two subgroups: training subjects and validation subjects. Eighty Class II surgery patients provided the training data set that was used to build the prediction algorithm. The validation data set of 34 new patients was used for evaluating the prospective performance of the prediction algorithm. The validation was conducted using four validation methods: (1) simple validation and (2) fivefold, (3) 10-fold, and (4) leave-one-out cross-validation (LOO). Results: The characteristics between the training and validation subjects did not differ. The multivariate partial least squares regression returned more accurate prediction results than the conventional method did. During the prospective validation, all of the cross-validation methods (fivefold, 10-fold, and LOO) demonstrated fewer prediction errors and more stable results than the simple validation method did. No significant difference was noted among the three cross-validation methods themselves. Conclusion: After conducting a prospective study using a new data set, this new prediction method again performed well. In addition, a cross-validation technique may be considered a better option than simple validation when constructing a prediction algorithm. PMID:25275546
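
    A minimal sketch of the validation comparison on synthetic data follows (our own illustration, not the study's code or data): a partial least squares regression assessed by a crude 2-fold split (standing in for simple validation) and by fivefold, 10-fold, and leave-one-out cross-validation.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(80, 12))                              # hypothetical presurgical predictors
        y = X @ rng.normal(size=12) + 0.3 * rng.normal(size=80)    # hypothetical soft tissue change

        pls = PLSRegression(n_components=4)
        for name, cv in [("simple split", 2),   # crude stand-in for simple validation
                         ("5-fold", KFold(5, shuffle=True, random_state=0)),
                         ("10-fold", KFold(10, shuffle=True, random_state=0)),
                         ("LOO", LeaveOneOut())]:
            mse = -cross_val_score(pls, X, y, cv=cv,
                                   scoring="neg_mean_squared_error").mean()
            print(f"{name:12s} prediction MSE = {mse:.3f}")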

  10. Model for Use of Sociometry to Validate Attitude Measures.

    ERIC Educational Resources Information Center

    McGuiness, Thomas P.; Stank, Peggy L.

    A study concerning the development and validation of an instrument intended to measure Goal II of quality education is presented. This goal is that quality education should help every child acquire understanding and appreciation of persons belonging to social, cultural and ethnic groups different from his own. The rationale for measurement…

  11. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  12. The SEGUE Stellar Parameter Pipeline. II. Validation with Galactic Globular and Open Clusters

    E-print Network

    Y. S. Lee; T. C. Beers; T. Sivarani; J. A. Johnson; D. An; R. Wilhelm; C. Allende Prieto; L. Koesterke; P. Re Fiorentin; C. A. L. Bailer-Jones; J. E. Norris; B. Yanny; C. M. Rockosi; H. J. Newberg; K. M. Cudworth; K. Pan

    2007-10-31

    We validate the performance and accuracy of the current SEGUE (Sloan Extension for Galactic Understanding and Exploration) Stellar Parameter Pipeline (SSPP), which determines stellar atmospheric parameters (effective temperature, surface gravity, and metallicity) by comparing derived overall metallicities and radial velocities from selected likely members of three globular clusters (M 13, M 15, and M 2) and two open clusters (NGC 2420 and M 67) to the literature values. Spectroscopic and photometric data obtained during the course of the original Sloan Digital Sky Survey (SDSS-I) and its first extension (SDSS-II/SEGUE) are used to determine stellar radial velocities and atmospheric parameter estimates for stars in these clusters. Based on the scatter in the metallicities derived for the members of each cluster, we quantify the typical uncertainty of the SSPP values, sigma([Fe/H]) = 0.13 dex for stars in the range of 4500 K < Teff < 7500 K and 2.0 < log g < 5.0, at least over the metallicity interval spanned by the clusters studied (-2.3 < [Fe/H] < 0). The surface gravities and effective temperatures derived by the SSPP are also compared with those estimated from the comparison of the color-magnitude diagrams with stellar evolution models; we find satisfactory agreement. At present, the SSPP underestimates [Fe/H] for near-solar-metallicity stars, represented by members of M 67 in this study, by about 0.3 dex.

  13. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    SciTech Connect

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
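
    The headline metric is easy to state precisely. A small sketch of one plausible formulation of "annualized prediction error" (our own; in practice hours flagged as outages or snow cover would be excluded before summing):

        import numpy as np

        def annualized_error_pct(modeled_kwh, measured_kwh):
            # difference of annual energy totals as a percentage of the
            # measured annual generation
            modeled = float(np.sum(modeled_kwh))
            measured = float(np.sum(measured_kwh))
            return 100.0 * (modeled - measured) / measured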

  14. Beyond the Standard Model II

    NASA Astrophysics Data System (ADS)

    Milton, Kimball A.; Kantowski, Ronald; Samuel, Mark A.

    1991-07-01

    The Table of Contents for the full book PDF is as follows: * Preface * Electroweak Symmetry-Breaking Effects at Colliders * Precision Tests of the Electroweak Theory * Hadron Colliders: B Factories for Now and the Future * The MSW Effect as the Solution to the Solar Neutrino Problem * New Physics Effects from String Models * Strings and Large N QCD * Searching for Millicharged Particles * Recent Results from CLEO * Standard Model Investigations at ALEPH * Z0 Couplings to Hadrons and Charged Leptons * Is Chiral Symmetry Restored at High Temperatures? * Fermion Masses out of Radiative Corrections * Extra Z and Atomic Parity Violation * Lepton Number and Supersymmetry * The Mass Generation in the Standard Electroweak Theory * GRANDE: A Neutrino Telescope for Arkansas * Neutrino and Gravitational Radiation Observations from Supernovae * Supersymmetric Contributions to the Neutrino Magnetic Moment * Observables from p p̄ → W⁺X → e⁺νX Beyond Leading Order * Random Walks on p-adic Numbers * Solar Neutrino Puzzle and Physics Beyond the Standard Model * The SFT: A Super Fixed Target Beauty Facility at the SSC * Non-Standard Stellar Evolution * Analogous Behavior in the Quantum Hall Effect, Anyon Superconductivity, and the Standard Model * Gauge Boson Dynamics * Rare Decays and CP Asymmetries in Charged B Decays * Total Hadronic Cross-section in e+e- Annihilation at the Four-loop Level of Perturbative QCD * Neutrino Oscillations and Solar Neutrinos * Canonical Quantization of Axial Gauges: Perturbative and Non-perturbative Implications * Large Technicolor Effect at Z0 * Finite Size Scaling for Heavy Mesons in the Continuum * Are There Electroweak Skyrmions? * Testing the Flipped String * Virasoro Constructions from Twisted Kac-Moody Algebras * Electroweak Symmetry Breaking by Fourth Generation Quark and Lepton Condensates * Novel Extension of the Standard Model * Interpreting Precision Measurements * Rare K Decays: Present Status and Future Prospects * Quantum Mechanics at the Black Hole Horizon * Target-Space Duality and the Curse of the Wormhole * Mass Enhancement and Critical Behavior in Technicolor Theories * Proton-Proton and Proton-Antiproton Elastic Scattering at High Energies - Theory, Phenomenology, and Experiment * Gauge Masses in String Field Theory * An Introduction to Bosonic Technicolor * Anyonic Superconductivity * Hunting the Higgs Boson at LEP with OPAL * Beyond the Standard Model - The Sextet Quarks Way * Dynamical Breakdown of Z2 and Parity in QED3 with Fermion Self-Coupling * Scaling Properties of QED3 with Fermion Self-Couplings * Wheeler-DeWitt Quantum Gravity in (2+1) Dimensions * Kac-Moody Algebras from Covariantization of the Lax Operators * An Upper Bound on the Higgs Mass * Suppression of the Vacuum Energy Expectation Value * Lorentz Covariance of Quantum Fluctuations in Quantum Field Theory * The Gauge Invariance of the Critical Curve in Strong-coupling Gauge Theory * Heavy W Decays into Sfermions and a Photon * New Insights on Majoron Models * Program of Beyond the Standard Model II * List of Participants

  15. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  16. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
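
    The probabilistic step can be sketched compactly: sample the uncertain inputs, push each draw through the model (here a grossly simplified attenuation-factor relation, not the paper's three-dimensional model), and read percentiles off the output distribution. All distributions below are hypothetical placeholders, not site values.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # hypothetical input distributions (placeholders, not site data)
        c_source = rng.lognormal(np.log(100.0), 0.5, size=n)   # source soil-gas conc., ug/m3
        alpha = rng.lognormal(np.log(1e-4), 1.0, size=n)       # attenuation factor, unitless

        c_indoor = c_source * alpha   # simplified indoor air concentration model

        print(f"median indoor concentration: {np.median(c_indoor):.2e} ug/m3")
        print(f"95th percentile:             {np.percentile(c_indoor, 95):.2e} ug/m3")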

  17. Validity of the Autism/Pervasive Developmental Disorder Subscale of the Diagnostic Assessment for the Severely Handicapped-II

    Microsoft Academic Search

    Johnny L. Matson; Brandi B. Smiroldo; Theresa L. Hastings

    1998-01-01

    This study was designed to establish the empirical validity of the Diagnostic Assessment for the Severely Handicapped-II (DASH-II) to screen for the presence of autism in severely and profoundly mentally retarded adults. Participants included 51 individuals residing in a large developmental center in Central Louisiana. The Autism/Pervasive Developmental Disorder subscale of the DASH-II was internally consistent. Additionally, the DASH-II was

  18. Validation of Numerical Shallow Water Models for Tidal Lagoons

    SciTech Connect

    Eliason, D.; Bourgeois, A.

    1999-11-01

    An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.
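
    The comparison described here typically reduces to computing an error norm against the analytical solution on successively refined grids and checking the observed convergence order. A minimal sketch of such helpers (our own, not from the paper):

        import numpy as np

        def l2_error(numerical, analytical, dx):
            # discrete L2 norm of the model error on a uniform grid
            diff = np.asarray(numerical) - np.asarray(analytical)
            return np.sqrt(dx * np.sum(diff ** 2))

        def observed_order(err_coarse, err_fine, refinement=2.0):
            # observed order of accuracy from two successively refined grids;
            # should approach the scheme's formal order as dx -> 0
            return np.log(err_coarse / err_fine) / np.log(refinement)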

  19. A DESIGN-DRIVEN VALIDATION APPROACH USING BAYESIAN PREDICTION MODELS

    E-print Network

    Chen, Wei

    A DESIGN-DRIVEN VALIDATION APPROACH USING BAYESIAN PREDICTION MODELS. Abstract: In most … under uncertainty. In this work, a design-driven validation approach is presented. By combining data … framework for drawing inferences for predictions in the intended, but maybe untested, design domain

  20. Adolescent Personality: A Five-Factor Model Construct Validation

    ERIC Educational Resources Information Center

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.

    2004-01-01

    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  1. Validation of an Evaluation Model for Learning Management Systems

    ERIC Educational Resources Information Center

    Kim, S. W.; Lee, M. G.

    2008-01-01

    This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principle factors, was…

  2. Alternative Models of Collegiate Business Education: Their Validity and Implications.

    ERIC Educational Resources Information Center

    Van Auken, Stuart; And Others

    1996-01-01

    Two models of management education are examined: the academic model, which treats the field of business as a science; and the professional model, which is responsive to the perceived needs of the business community. A study investigated the models' validity within the context of existing programs by surveying 268 program deans about their beliefs…

  3. BEHAVIORAL MODEL SPECIFICATION TOWARDS SIMULATION VALIDATION USING RELATIONAL DATABASES

    E-print Network

    … and in particular in terms of model validation, it is important to use model repositories. The structure and behavior of dynamical systems can be represented as atomic models having inputs, outputs, states, and functions. The Scalable System Entity Structure Modeler with Complexity Measures (SESM/CM) offers a basis

  4. Predicting Vehicle Crashworthiness: Validation of Computer Models for

    E-print Network

    Berger, Jim

    Predicting Vehicle Crashworthiness: Validation of Computer Models for Functional and Hierarchical Data. The Computer Model for Vehicle Crashworthiness: The CRASH computer model simulates the effect of a collision … be made to meet mandated standards for crashworthiness, but the computer model plays an integral part

  5. Validating Computational Cognitive Process Models across Multiple Timescales

    NASA Astrophysics Data System (ADS)

    Myers, Christopher; Gluck, Kevin; Gunzelmann, Glenn; Krusmark, Michael

    2010-12-01

    Model comparison is vital to evaluating progress in the fields of artificial general intelligence (AGI) and cognitive architecture. As they mature, AGI and cognitive architectures will become increasingly capable of providing a single model that completes a multitude of tasks, some of which the model was not specifically engineered to perform. These models will be expected to operate for extended periods of time and serve functional roles in real-world contexts. Questions arise regarding how to evaluate such models appropriately, including issues pertaining to model comparison and validation. In this paper, we specifically address model validation across multiple levels of abstraction, using an existing computational process model of unmanned aerial vehicle basic maneuvering to illustrate the relationship between validity and timescales of analysis.

  6. Particulate dispersion apparatus for the validation of plume models

    E-print Network

    Bala, William D

    2001-01-01

    The purpose of this thesis is to document the design, development, and fabrication of a transportable source of dry aerosol to improve testing and validation of atmospheric plume models. The proposed dispersion apparatus is intended to complement...

  7. Utilization of a Validated Power System Model on Two

    E-print Network

    Interim Report: Preliminary Results of the Scenario Analysis. In this report, preliminary results of the Big Island power system model validation and scenario analysis

  8. Statistical Validation of Engineering and Scientific Models: Background

    SciTech Connect

    Hills, Richard G.; Trucano, Timothy G.

    1999-05-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burger's equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made.
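
    The propagation-of-uncertainty idea can be illustrated with the same damped spring-mass example: sample the uncertain parameters and propagate them through a derived quantity of the model. The numbers below are hypothetical, not the tutorial's values.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 50_000

        # damped spring-mass system: m x'' + c x' + k x = 0
        m = 1.0                                   # mass, kg (fixed)
        k = rng.normal(100.0, 5.0, size=n)        # uncertain stiffness, N/m
        c = rng.normal(2.0, 0.4, size=n)          # uncertain damping, N s/m

        zeta = c / (2.0 * np.sqrt(k * m))                     # damping ratio
        omega_d = np.sqrt(k / m) * np.sqrt(1.0 - zeta ** 2)   # damped frequency

        print(f"damped frequency: {omega_d.mean():.2f} +/- {omega_d.std():.2f} rad/s")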

  9. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  10. HEDR model validation plan. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

  11. SAGE III Aerosol Extinction Validation in the Arctic Winter: Comparisons with SAGE II and POAM III

    NASA Technical Reports Server (NTRS)

    Thomason, L. W.; Poole, L. R.; Randall, C. E.

    2007-01-01

    The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm, except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020-nm extinction ratio shows a consistent bias of approx. 30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than the SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.
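
    The coincidence-by-potential-vorticity device can be sketched simply: pair profiles whose PV values agree within a tolerance, rather than requiring latitude/longitude overlap. A minimal illustration (our own; pv_a, pv_b, and max_dpv are hypothetical inputs, not the authors' matching criteria):

        import numpy as np

        def pv_coincidences(pv_a, pv_b, max_dpv):
            # pair profiles from two instruments whenever their potential
            # vorticity values agree within max_dpv
            pv_a, pv_b = np.asarray(pv_a), np.asarray(pv_b)
            pairs = []
            for i, pa in enumerate(pv_a):
                j = int(np.argmin(np.abs(pv_b - pa)))   # nearest PV match
                if abs(pv_b[j] - pa) <= max_dpv:
                    pairs.append((i, j))
            return pairs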

  12. Towards integrated design evaluation: Validation of models

    Microsoft Academic Search

    Graham Green

    2000-01-01

    This paper professes the importance of the evaluation activity, particularly during the conceptual phase of the engineering design process. It provides a review of a range of complementary models and reports on research aimed at modelling the evaluation of conceptual designs, leading to the proposal of a general framework enabling the combination of separate models into a possible future integrated design evaluation

  13. Validation of the Archimedes Diabetes Model

    Microsoft Academic Search

    DAVID M. EDDY; LEONARD SCHLESSINGER

    The model was validated against controlled trials by repeating in the model the steps taken for the real trials and comparing the results calculated by the model with the results of the trials. Eighteen trials were chosen by an independent advisory committee. Half the trials had been used to help build the model.

  14. New methods for estimation, modeling and validation of dynamical systems using automatic differentiation

    E-print Network

    Griffith, Daniel Todd

    2005-02-17

    Excerpt from the table of contents: 5.2 Methods for Validating Solution Accuracy; 5.3 Automatic Generation of Exact Dynamical Models; 5.4 Multibody System Examples; 5.5 Accuracy of Solution and Space/Time Derivatives; 5.6 Summary. The work develops exact representations and special-case exact solutions for multibody distributed parameter systems with significant complexity. Concluding remarks, including a description of future work, are given in Chapter VI.

  15. Gear Windage Modeling Progress - Experimental Validation Status

    NASA Technical Reports Server (NTRS)

    Kunz, Rob; Handschuh, Robert F.

    2008-01-01

    In the Subsonics Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems, many gearing components operate at high rotational speed (pitch line velocity > 24000 ft/min). In our testing of high-speed helical gear trains at NASA Glenn, we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to understand the variables that affect windage and to develop a sound experimental database to validate the analytical project being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update on the status of these efforts.

  16. Circumplex Structure and Personality Disorder Correlates of the Interpersonal Problems Model (IIP-C): Construct Validity and Clinical Implications

    ERIC Educational Resources Information Center

    Monsen, Jon T.; Hagtvet, Knut A.; Havik, Odd E.; Eilertsen, Dag E.

    2006-01-01

    This study assessed the construct validity of the circumplex model of the Inventory of Interpersonal Problems (IIP-C) in Norwegian clinical and nonclinical samples. Structure was examined by evaluating the fit of the circumplex model to data obtained by the IIP-C. Observer-rated personality disorder criteria (DSM-IV, Axis II) were used as external…

  17. Empirical validation of computer models for passive-solar residences

    NASA Astrophysics Data System (ADS)

    Sebald, A. V.

    1983-06-01

    The theoretical underpinnings for experimental validation of thermodynamic models of passive solar buildings are given. Computer algorithms for such validation are discussed. Models for passive solar buildings are essentially incapable of validation in the classical sense. This is principally due to the fact that buildings are exposed to excitations which have insufficient frequency content to permit estimation of the coefficients in all but the most rudimentary models. One can, however, generate a set of possible models which explain the measured data. Unfortunately, while all models in the set may equally well track the measured data, the coefficients may vary significantly within the set. When used to estimate auxiliary energy consumption by simulation, models within the set may predict substantially different consumptions.

  18. SURVEY, ANALYSIS AND VALIDATION OF INFORMATION FOR BUSINESS PROCESS MODELING

    E-print Network

    The paper describes the survey, analysis, and validation of the information necessary for As-Is business process modeling, conjugating top-down and bottom-up approaches. The survey and analysis of existing processes largely follow a bottom-up approach, while a top-down approach is better adjusted to To-Be modeling; in this sense the As-Is work proceeds bottom-up and the To-Be modeling top-down. The characteristics of the survey, analysis and validation

  19. CMOS Transistor Mismatch Model valid from Weak to Strong Inversion

    E-print Network

    Barranco, Bernabe Linares

    A CMOS transistor mismatch model valid from weak to strong inversion is presented. Characterization of NMOS and PMOS transistors for 30 different geometries has been done with this continuous model. Accurate modeling of transistor mismatch is crucial for precision analog design, since using very reduced transistor geometries produces

  20. Validation of Model Forecasts of the Ambient Solar Wind

    NASA Technical Reports Server (NTRS)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  1. Development and validation of a two-phase, three-dimensional model for PEM fuel cells.

    SciTech Connect

    Chen, Ken Shuang

    2010-04-01

    The objectives of this presentation are: (1) to develop and validate a two-phase, three-dimensional transport model for simulating PEM fuel cell performance under a wide range of operating conditions; (2) to apply the validated PEM fuel cell model to improve fundamental understanding of the key phenomena involved, to identify rate-limiting steps, and to develop recommendations for improvements so as to accelerate the commercialization of fuel cell technology; and (3) to employ the validated PEMFC model to improve and optimize PEM fuel cell operation. Consequently, the project helps: (i) address the technical barriers on performance, cost, and durability; and (ii) achieve DOE's near-term technical targets on performance, cost, and durability in automotive and stationary applications.

  2. Spatial statistical modeling of shallow landslides—Validating predictions for different landslide inventories and rainfall events

    NASA Astrophysics Data System (ADS)

    von Ruette, Jonas; Papritz, Andreas; Lehmann, Peter; Rickli, Christian; Or, Dani

    2011-10-01

    Statistical models that exploit the correlation between landslide occurrence and geomorphic properties are often used to map the spatial occurrence of shallow landslides triggered by heavy rainfalls. In many landslide susceptibility studies, the true predictive power of the statistical model remains unknown because the predictions are not validated with independent data from other events or areas. This study validates statistical susceptibility predictions with independent test data. The spatial incidence of landslides, triggered by an extreme rainfall in a study area, was modeled by logistic regression. The fitted model was then used to generate susceptibility maps for another three study areas, for which event-based landslide inventories were also available. All the study areas lie in the northern foothills of the Swiss Alps. The landslides had been triggered by heavy rainfall either in 2002 or 2005. The validation was designed such that the first validation study area shared the geomorphology, and the second the triggering rainfall event, with the calibration study area. For the third validation study area, both geomorphology and rainfall were different. All explanatory variables were extracted for the logistic regression analysis from high-resolution digital elevation and surface models (2.5 m grid). The model fitted to the calibration data comprised four explanatory variables: (i) slope angle (effect of gravitational driving forces), (ii) vegetation type (grassland and forest; root reinforcement), (iii) planform curvature (convergent water flow paths), and (iv) contributing area (potential supply of water). The area under the Receiver Operating Characteristic (ROC) curve (AUC) was used to quantify the predictive performance of the logistic regression model. The AUC values were computed for the susceptibility maps of the three validation study areas (validation AUC), the fitted susceptibility map of the calibration study area (apparent AUC: 0.80), and another susceptibility map obtained for the calibration study area by 20-fold cross-validation (cross-validation AUC: 0.74). The AUC values of the first and second validation study areas (0.72 and 0.69, respectively) and the cross-validation AUC matched fairly well, and all AUC values were distinctly smaller than the apparent AUC. Based on the apparent AUC alone, one would have clearly overrated the predictive performance for the first two validation areas. Rather surprisingly, the AUC value of the third validation study area (0.82) was larger than the apparent AUC. A large part of the third validation study area consists of gentle slopes, and the regression model correctly predicted that no landslides occur in the flat parts; this increased the predictive performance of the model considerably. The predicted susceptibility maps were further validated by summing the predicted susceptibilities over the entire validation areas and comparing the sums with the observed numbers of landslides. The sums exceeded the observed counts for all the validation areas. Hence, the logistic regression model generally overestimated the risk of landslide occurrence. Obviously, a predictive model that is based on static geomorphic properties alone cannot take full account of the complex and time-dependent processes in the subsurface. However, such a model is still capable of distinguishing zones highly or less prone to shallow landslides.
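
    As an illustration of the workflow described above (logistic regression on terrain attributes, apparent versus validation AUC, and susceptibility sums versus observed counts), here is a minimal sketch with synthetic stand-ins for the four explanatory variables; none of the numbers reproduce the study.

    ```python
    # Sketch only: synthetic terrain attributes, not the Swiss study data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 2000
    slope = rng.uniform(0, 45, n)        # (i) slope angle [deg]
    forest = rng.integers(0, 2, n)       # (ii) vegetation type (0 grass, 1 forest)
    curvature = rng.normal(0, 1, n)      # (iii) planform curvature
    log_area = rng.normal(2.0, 0.5, n)   # (iv) log10 contributing area
    X = np.column_stack([slope, forest, curvature, log_area])

    # Synthetic "true" landslide occurrence, for illustration only
    p = 1 / (1 + np.exp(-(0.08 * slope - 1.2 * forest + 0.5 * curvature - 2.5)))
    y = rng.binomial(1, p)

    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Apparent AUC on the calibration data (optimistic, as the abstract notes);
    # a validation AUC would use X_val, y_val from an independent event/area.
    print("apparent AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))

    # Sum of predicted susceptibilities vs. observed landslide count
    print(model.predict_proba(X)[:, 1].sum(), y.sum())
    ```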

  3. SWAT: Model use, calibration, and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    SWAT (Soil and Water Assessment Tool) is a comprehensive, semi-distributed river basin model that requires a large number of input parameters which complicates model parameterization and calibration. Several calibration techniques have been developed for SWAT including manual calibration procedures...

  4. Validating the Mexican American Intergenerational Caregiving Model

    ERIC Educational Resources Information Center

    Escandon, Socorro

    2011-01-01

    The purpose of this study was to substantiate and further develop a previously formulated conceptual model of Role Acceptance in Mexican American family caregivers by exploring the theoretical strengths of the model. The sample consisted of women older than 21 years of age who self-identified as Hispanic, were related through consanguinal or…

  5. Combustion turbine dynamic model validation from tests

    Microsoft Academic Search

    L. N. Hannett; Afzal Khan

    1993-01-01

    Studies have been conducted on the Alaskan Railbelt System to examine the hydrothermal power system response after the hydroelectric power units at Bradley Lake are installed. The models and data for the generating units for the initial studies were not complete. Typical models were used, but their response appeared to be faster than judged by operating experience. A testing program

  6. Validation of the Poisson Stochastic Radiative Transfer Model

    NASA Technical Reports Server (NTRS)

    Zhuravleva, Tatiana; Marshak, Alexander

    2004-01-01

    A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure, the cloud aspect ratio, is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.
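
    The matching step, determining the cloud aspect ratio entirely from measured direct solar radiation, amounts to one-dimensional root finding. A minimal sketch, where model_direct is a hypothetical monotonic stand-in for the actual Poisson-model calculation and the measured value is made up:

    ```python
    # Find the aspect ratio at which the model reproduces the measurement.
    from scipy.optimize import brentq

    measured_direct = 310.0  # W m^-2, made-up measurement

    def model_direct(aspect_ratio):
        # Placeholder monotonic response, NOT the real radiative transfer model
        return 500.0 / (1.0 + aspect_ratio)

    matched = brentq(lambda a: model_direct(a) - measured_direct, 0.05, 5.0)
    print("matched aspect ratio:", round(matched, 3))
    ```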

  7. Validating Predictions from Climate Envelope Models

    PubMed Central

    Watling, James I.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species. PMID:23717452
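
    The evaluation rests on sensitivity (correctly classified presences) and specificity (correctly classified absences). A minimal sketch of how both are computed from paired forecast/survey vectors; the arrays are made-up placeholders, not data from the study.

    ```python
    import numpy as np

    predicted = np.array([1, 1, 0, 0, 1, 0, 1, 0])  # model forecast at t2 sites
    observed = np.array([1, 0, 0, 0, 1, 1, 1, 0])   # t2 survey presence/absence

    tp = np.sum((predicted == 1) & (observed == 1))  # true presences
    tn = np.sum((predicted == 0) & (observed == 0))  # true absences
    fp = np.sum((predicted == 1) & (observed == 0))
    fn = np.sum((predicted == 0) & (observed == 1))

    sensitivity = tp / (tp + fn)  # ability to classify presences
    specificity = tn / (tn + fp)  # ability to classify absences
    print(sensitivity, specificity)
    ```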

  8. Validating Requirements for Fault Tolerant Systems Using Model Checking

    NASA Technical Reports Server (NTRS)

    Schneider, Francis; Easterbrook, Steve M.; Callahan, John R.; Holzmann, Gerard J.

    1997-01-01

    Model checking is shown to be an effective tool in validating the behavior of a fault-tolerant embedded spacecraft controller. The case study presented here shows that by judiciously abstracting away extraneous complexity, the state space of the model could be exhaustively searched, allowing critical functional requirements to be validated down to the design level. Abstracting away detail not germane to the problem of interest leaves, by definition, a partial specification behind. The success of this procedure shows that it is feasible to effectively validate a partial specification with this technique. Three anomalies were found in the system, one of which is an error in the detailed requirements, and the other two are missing/ambiguous requirements. Because the method allows validation of partial specifications, it is also an effective methodology for maintaining fidelity between a co-evolving specification and an implementation.

  9. EXODUS II: A finite element data model

    SciTech Connect

    Schoof, L.A.; Yarberry, V.R.

    1994-09-01

    EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), and code-to-code data transfer. An EXODUS II data file is a random access, machine-independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).
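
    The abstract describes access through C, C++, or Fortran routines; EXODUS II files are layered on netCDF, so outside that API a generic netCDF reader can at least inspect a file's structure. A minimal sketch, assuming the netCDF4 Python package and a hypothetical file path "mesh.exo":

    ```python
    # Structural inspection of an EXODUS II file via its netCDF underpinnings.
    from netCDF4 import Dataset

    with Dataset("mesh.exo") as exo:  # hypothetical file path
        print(list(exo.dimensions))   # e.g. node/element/block counts
        print(list(exo.variables))    # coordinates, connectivity, results
    ```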

  10. On cluster validity for the fuzzy c-means model

    Microsoft Academic Search

    N. R. Pal; J. C. Bezdek

    1995-01-01

    Many functionals have been proposed for validation of partitions of object data produced by the fuzzy c-means (FCM) clustering algorithm. We examine the role that a subtle but important parameter, the weighting exponent m of the FCM model, plays in determining the validity of FCM partitions. The functionals considered are the partition coefficient and entropy indexes of Bezdek, the Xie-Beni (1991), and extended
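
    For reference, a sketch of the validity functionals named in the abstract, computed from a fuzzy membership matrix U (c clusters by n objects); the formulas follow the standard definitions and the example data are random placeholders.

    ```python
    import numpy as np

    def partition_coefficient(U):
        # Bezdek's partition coefficient: mean of squared memberships
        return np.sum(U ** 2) / U.shape[1]

    def partition_entropy(U):
        # Bezdek's partition entropy (lower means a crisper partition)
        return -np.sum(U * np.log(U + 1e-12)) / U.shape[1]

    def xie_beni(X, U, V, m=2.0):
        # Xie-Beni index: compactness over separation; m is the weighting
        # exponent whose role the paper examines
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=2)  # c x n
        compactness = np.sum((U ** m) * d2)
        sep = min(((V[j] - V[k]) ** 2).sum()
                  for j in range(len(V)) for k in range(len(V)) if j != k)
        return compactness / (X.shape[0] * sep)

    rng = np.random.default_rng(0)
    X = rng.random((50, 2))   # 50 objects, 2 features
    V = rng.random((3, 2))    # 3 cluster centers
    U = rng.random((3, 50))
    U /= U.sum(axis=0)        # memberships per object sum to 1
    print(partition_coefficient(U), partition_entropy(U), xie_beni(X, U, V))
    ```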

  11. Shuttle Spacesuit: Fabric/LCVG Model Validation

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2001-01-01

    A detailed spacesuit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the spacesuit are critical to the estimation of exposures and to assessing the risk to the astronaut on EVA. Past evaluations of spacesuit shielding properties assumed that the basic fabric lay-up (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and the Liquid Cooling and Ventilation Garment (LCVG) could be homogenized as a single layer, overestimating the protective properties over 60 percent of the fabric area. The present spacesuit model represents the inhomogeneous distributions of LCVG materials (mainly the water-filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the spacesuit's protection properties.

  12. SIMULATION MODEL, DRAINMOD-N II

    Microsoft Academic Search

    X. Wang; M. A. Youssef; R. W. Skaggs; J. D. Atwood; J. R. Frankenberger

    2005-01-01

    A two-step global sensitivity analysis was conducted for the nitrogen simulation model DRAINMOD-N II to assess the sensitivity of model predictions of NO3-N losses with drainage water to various model inputs. Factor screening using the LH-OAT (Latin hypercube sampling, one-at-a-time) sensitivity analysis method was performed as a first step, considering 48 model parameters; then a variance-based

  13. FACES IV & the Circumplex Model: Validation Study

    Microsoft Academic Search

    David H. Olson

    2010-01-01

    FACES IV was developed to tap the full continuum of the cohesion and flexibility dimensions from the Circumplex Model of Marital and Family Systems. Six scales were developed, with two balanced scales and four unbalanced scales designed to tap low and high cohesion (disengaged and enmeshed) and flexibility (rigid and chaotic). This study provides initial evidence that the six scales in

  14. WEPP: Model use, calibration and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Water Erosion Prediction Project (WEPP) model is a process-based, continuous simulation, distributed parameter, hydrologic and soil erosion prediction system. It has been developed over the past 25 years to allow for easy application to a large number of land management scenarios. Most general o...

  15. Validation of nuclear models used in space radiation shielding applications

    SciTech Connect

    Norman, Ryan B., E-mail: Ryan.B.Norman@nasa.gov [NASA Langley Research Center, Hampton, VA 23681 (United States)]; Blattnig, Steve R. [NASA Langley Research Center, Hampton, VA 23681 (United States)]

    2013-01-15

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
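
    One plausible reading of the interval treatment of measurement uncertainty is sketched below: a prediction is charged error only by the amount it falls outside the interval [y - dy, y + dy], and the per-point errors are aggregated into the cumulative and median forms the abstract mentions. All numbers are made up.

    ```python
    import numpy as np

    y = np.array([10.0, 12.5, 9.0, 15.0])      # measured cross sections
    dy = np.array([1.0, 2.0, 0.5, 1.5])        # experimental half-widths
    model = np.array([10.5, 16.0, 8.2, 14.0])  # model predictions

    # Distance from each prediction to its measurement interval (0 if inside)
    excess = np.maximum(np.abs(model - y) - dy, 0.0)

    cumulative_metric = excess.sum()   # overall-accuracy view
    median_metric = np.median(excess)  # robust, model-development view
    print(cumulative_metric, median_metric)
    ```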

  16. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
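
    The estimation idea, picking the parameter realization for which the smallest margin of requirement compliance is as large as possible, can be sketched as a max-min optimization. The model, requirements, and error limits below are hypothetical placeholders, not the paper's F-16 example.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0, 5, 100)
    data = np.exp(-0.8 * t) * np.cos(3.0 * t)         # "measured" response

    def predict(theta):
        zeta, omega = theta
        return np.exp(-zeta * t) * np.cos(omega * t)  # empirical dynamic model

    def margins(theta):
        err = predict(theta) - data
        # Each requirement: admissible error limit minus achieved error
        return np.array([
            0.05 - np.max(np.abs(err)),               # time-domain envelope
            0.02 - np.sqrt(np.mean(err ** 2)),        # RMS error limit
        ])

    # Maximize the smallest margin by minimizing its negative
    res = minimize(lambda th: -margins(th).min(), x0=[0.5, 2.5],
                   method="Nelder-Mead")
    print(res.x, margins(res.x))
    ```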

  17. Validating regional-scale surface energy balance models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    One of the major challenges in developing reliable regional surface flux models is the relative paucity of scale-appropriate validation data. Direct comparisons between coarse-resolution model flux estimates and flux tower data can often be dominated by sub-pixel heterogeneity effects, making it di...

  18. ORIGINAL ARTICLE Modelling the human pharyngeal airway: validation of numerical

    E-print Network

    Payan, Yohan

    Since the 1990s, biomechanical modelling of the human upper airway has received growing interest, since it allows a better understanding of the biomechanical properties of the upper airway (geometry, rheology). This makes such models of interest to improve

  19. ORIGINAL ARTICLE Modelling the human pharyngeal airway: validation of numerical

    E-print Network

    Lagrée, Pierre-Yves

    Keywords: obstructive sleep apnea syndrome. Since the 1990s, biomechanical modelling of the human upper airway has allowed a better understanding of the properties of the upper airway (geometry, rheology). This makes such models of interest to improve the quality

  20. Mechanical validation of whole bone composite femur models

    Microsoft Academic Search

    Luca Cristofolini; Marco Viceconti; Angelo Cappello; Aldo Toni

    1996-01-01

    Composite synthetic models of the human femur have recently become commercially available as substitutes for cadaveric specimens. Their rapid adoption was justified by the advantages they offer as a substitute for real femurs. The present investigation concentrated on an extensive experimental validation of the mechanical behaviour of the whole bone composite model, compared to human fresh-frozen and dried-rehydrated specimens for

  1. IC immunity modeling process validation using on-chip measurements

    E-print Network

    Paris-Sud XI, Université de

    Developing integrated circuit (IC) immunity models and simulation flows has become one of the major concerns of IC suppliers, in order to predict whether a chip will pass

  2. Modeling HIV Immune Response and Validation with Clinical Data

    E-print Network

    A system of differential equations is formulated to describe the pathogenesis of HIV infection, wherein certain important features are incorporated, including stimulation by antigens other than HIV. A stability analysis illustrates the capability of this model

  3. Nonisothermal Modeling of Polymer Electrolyte Fuel Cells I. Experimental Validation

    E-print Network

    A three-dimensional, nonisothermal model of polymer electrolyte fuel cells (PEFC) is applied… as the current density distribution is mainly controlled by hydration of the polymer electrolyte, which

  4. Reduced-complexity modeling of braided rivers: Assessing model performance by sensitivity analysis, calibration, and validation

    NASA Astrophysics Data System (ADS)

    Ziliani, L.; Surian, N.; Coulthard, T. J.; Tarantola, S.

    2013-12-01

    This paper addresses an important question of modeling stream dynamics: how may numerical models of braided stream morphodynamics be rigorously and objectively evaluated against a real case study? Using simulations from the Cellular Automaton Evolutionary Slope and River (CAESAR) reduced-complexity model (RCM) of a 33 km reach of a large gravel bed river (the Tagliamento River, Italy), this paper aims to (i) identify a sound strategy for calibration and validation of RCMs, (ii) investigate the effectiveness of multiperformance model assessments, and (iii) assess the potential of using CAESAR at mesospatial and mesotemporal scales. The approach used has three main steps: first sensitivity analysis (using a screening method and a variance-based method), then calibration, and finally validation. This approach allowed us to analyze 12 input factors initially and then to focus calibration only on the factors identified as most important. Sensitivity analysis and calibration were performed on a 7.5 km subreach using a hydrological time series of 20 months, while validation covered the whole 33 km study reach over a period of 8 years (2001-2009). CAESAR was able to reproduce the macromorphological changes of the study reach and gave good results for annual bed load sediment estimates, which turned out to be consistent with measurements in other large gravel bed rivers, but it showed a poorer performance in reproducing the characteristics of the braided channel (e.g., braiding intensity). The approach developed in this study can be effectively applied in other similar RCM contexts, allowing the use of RCMs not only in an explorative manner but also for obtaining quantitative results and scenarios.
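
    The three-step approach (screening, then calibration, then validation) can be illustrated with a generic one-at-a-time elementary-effects screening. The sketch below uses a cheap surrogate function in place of a CAESAR run and hypothetical factor bounds, so it shows only the ranking idea, not the actual CAESAR interface.

    ```python
    import numpy as np

    def model_run(x):
        # Placeholder surrogate returning a scalar performance measure;
        # NOT the CAESAR model
        return 3.0 * x[0] + 0.1 * x[1] ** 2 + 0.01 * x[2]

    bounds = np.array([[0.0, 1.0]] * 12)  # 12 input factors, as in the study
    rng = np.random.default_rng(1)
    reps = 20

    effects = np.zeros(len(bounds))
    for _ in range(reps):                 # random base points
        x = rng.uniform(bounds[:, 0], bounds[:, 1])
        y0 = model_run(x)
        for i in range(len(bounds)):      # perturb one factor at a time
            xi = x.copy()
            xi[i] += 0.1 * (bounds[i, 1] - bounds[i, 0])
            effects[i] += abs(model_run(xi) - y0) / reps

    print(np.argsort(effects)[::-1])      # rank factors; calibrate the top few
    ```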

  5. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight time testing and increase flight safety. The success of ASE models is determined by the ability to take into account varying flight conditions and the possibility to perform flight monitoring under the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control relevant robust identification and model validation of aeroservoelastic structures. The closed-loop model robust identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  6. On the development and validation of QSAR models.

    PubMed

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds. PMID:23086855

  7. Experiments for foam model development and validation.

    SciTech Connect

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  8. Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU.

    SciTech Connect

    Ko, Y. C.; Hu, L. W.; Olson, A. P.; Dunn, F. E.; Nuclear Engineering Division; MIT

    2007-01-01

    An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

  9. Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU

    SciTech Connect

    Ko, Y.-C. [Nuclear Science and Engineering Department, MIT, Cambridge, MA 02139 (United States); Hu, L.-W. [Nuclear Reactor Laboratory, MIT, Cambridge, MA 02139 (United States)], E-mail: lwhu@mit.edu; Olson, Arne P.; Dunn, Floyd E. [RERTR Program, Argonne National Laboratory, Argonne, IL 60439 (United States)

    2008-07-15

    An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced to natural convection during a loss of primary flow transient, and the ability to calculate safety limits and limiting safety system settings for licensing applications. This paper describes the validation of the code against PLTEMP/ANL 3.0 for steady-state analysis, and against RELAP5-3D for loss of primary coolant transient analysis. Coolant temperature measurements obtained from loss of primary flow transients as part of the MITR-II startup testing were also used for validating this code. The agreement between MULCH-II and the other computer codes is satisfactory.

  10. Measurements of Humidity in the Atmosphere and Validation Experiments (Mohave, Mohave II): Results Overview

    NASA Technical Reports Server (NTRS)

    Leblanc, Thierry; McDermid, Iain S.; McGee, Thomas G.; Twigg, Laurence W.; Sumnicht, Grant K.; Whiteman, David N.; Rush, Kurt D.; Cadirola, Martin P.; Venable, Demetrius D.; Connell, R.; Demoz, Belay B.; Vomel, Holger; Miloshevich, L.

    2008-01-01

    The Measurements of Humidity in the Atmosphere and Validation Experiments (MOHAVE, MOHAVE-II) inter-comparison campaigns took place at the Jet Propulsion Laboratory (JPL) Table Mountain Facility (TMF, 34.5(sup o)N) in October 2006 and 2007 respectively. Both campaigns aimed at evaluating the capability of three Raman lidars for the measurement of water vapor in the upper troposphere and lower stratosphere (UT/LS). During each campaign, more than 200 hours of lidar measurements were compared to balloon borne measurements obtained from 10 Cryogenic Frost-point Hygrometer (CFH) flights and over 50 Vaisala RS92 radiosonde flights. During MOHAVE, fluorescence in all three lidar receivers was identified, causing a significant wet bias above 10-12 km in the lidar profiles as compared to the CFH. All three lidars were reconfigured after MOHAVE, and no such bias was observed during the MOHAVE-II campaign. The lidar profiles agreed very well with the CFH up to 13-17 km altitude, where the lidar measurements become noise limited. The results from MOHAVE-II have shown that the water vapor Raman lidar will be an appropriate technique for the long-term monitoring of water vapor in the UT/LS given a slight increase in its power-aperture, as well as careful calibration.

  11. Sub-nanometer Level Model Validation of the SIM Interferometer

    NASA Technical Reports Server (NTRS)

    Korechoff, Robert P.; Hoppe, Daniel; Wang, Xu

    2004-01-01

    The Space Interferometer Mission (SIM) flight instrument will not undergo a full-performance, end-to-end system test on the ground due to a number of constraints. Thus, analysis and physics-based models will play a significant role in providing confidence that SIM will meet its science goals on orbit. The various models themselves are validated against the experimental results obtained from the MicroArcsecond Metrology (MAM) testbed and the Diffraction testbed (DTB). The metric for validation is provided by the SIM astrometric error budget.

  12. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  13. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  14. Stark II and physician compensation models: integrating business and legal issues.

    PubMed

    Pursell, David; Marsh, Jennifer; Thompson, David; Potter, Keith

    2005-01-01

    In March of 2004, the Centers for Medicare & Medicaid Services released new regulations that interpreted the Federal Physician Self Referral Act, otherwise known as Stark II. The new regulations, commonly referred to as the Phase II regulations, must be carefully considered when structuring physician compensation models. Stark II generally holds that physicians may not make a referral for designated health services to an entity with which they have a direct or indirect financial relationship. This Article outlines the provisions of Stark II that are applicable to physician compensation methodologies. In addition, the authors evaluate hypothetical transactions involving physician groups seeking viable compensation schemes and explore the validity and risks of each. PMID:15968939

  15. Wavelet spectrum analysis approach to model validation of dynamic systems

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaomo; Mahadevan, Sankaran

    2011-02-01

    Feature-based validation techniques for dynamic system models could be unreliable for nonlinear, stochastic, and transient dynamic behavior, where the time series is usually non-stationary. This paper presents a wavelet spectral analysis approach to validate a computational model for a dynamic system. Continuous wavelet transform is performed on the time series data for both model prediction and experimental observation using a Morlet wavelet function. The wavelet cross-spectrum is calculated for the two sets of data to construct a time-frequency phase difference map. The Box-plot, an exploratory data analysis technique, is applied to interpret the phase difference for validation purposes. In addition, wavelet time-frequency coherence is calculated using the locally and globally smoothed wavelet power spectra of the two data sets. Significance tests are performed to quantitatively verify whether the wavelet time-varying coherence is significant at a specific time and frequency point, considering uncertainties in both predicted and observed time series data. The proposed wavelet spectrum analysis approach is illustrated with a dynamics validation challenge problem developed at the Sandia National Laboratories. A comparison study is conducted to demonstrate the advantages of the proposed methodologies over classical frequency-independent cross-correlation analysis and time-independent cross-coherence analysis for the validation of dynamic systems.
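
    As a rough illustration of the pipeline (continuous wavelet transform of both series, wavelet cross-spectrum, time-frequency phase-difference map), here is a sketch using the PyWavelets package with a complex Morlet wavelet; the signals, scales, and wavelet parameterization ('cmor1.5-1.0') are illustrative choices, not those of the paper.

    ```python
    import numpy as np
    import pywt

    dt = 0.01
    t = np.arange(0, 10, dt)
    observed = np.sin(2 * np.pi * 2 * t) + 0.1 * np.random.randn(t.size)
    predicted = np.sin(2 * np.pi * 2 * t + 0.3)  # model with small phase error

    scales = np.arange(1, 128)
    Wo, freqs = pywt.cwt(observed, scales, "cmor1.5-1.0", sampling_period=dt)
    Wp, _ = pywt.cwt(predicted, scales, "cmor1.5-1.0", sampling_period=dt)

    cross = Wp * np.conj(Wo)      # wavelet cross-spectrum
    phase_diff = np.angle(cross)  # time-frequency phase-difference map

    # Box-plot style summary of phase differences near the dominant frequency
    i = int(np.argmin(np.abs(freqs - 2.0)))
    print(np.percentile(phase_diff[i], [25, 50, 75]))
    ```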

  16. Dynamic modelling and experimental validation of three wheeled tilting vehicles

    Microsoft Academic Search

    Nicola Amati; Andrea Festini; Luigi Pelizza; Andrea Tonoli

    2011-01-01

    The present paper describes the study of the stability in straight running of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of

  17. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  18. Validation and Calibration in ACE Models: An Investigation on the CATS model.

    E-print Network

    Tesfatsion, Leigh

    This paper deals with some validation (and a first calibration) experiments on the CATS model proposed in Gallegati et al. (2003a, 2004b). The CATS model has been intensively used (see, for example, Delli Gatti et

  19. Models of our Galaxy - II

    NASA Astrophysics Data System (ADS)

    Binney, James; McMillan, Paul

    2011-05-01

    Stars near the Sun oscillate both horizontally and vertically. In a previous paper by Binney it was assumed that the coupling between these motions can be modelled by determining the horizontal motion without reference to the vertical motion, and recovering the coupling between the motions by assuming that the vertical action is adiabatically conserved as the star oscillates horizontally. Here, we show that, although the assumption of adiabatic invariance works well, more accurate results can be obtained by taking the vertical action into account when calculating the horizontal motion. We use orbital tori to present a simple but fairly realistic model of the Galaxy's discs in which the motion of stars is handled rigorously, without decomposing it into horizontal and vertical components. We examine the ability of the adiabatic approximation to calculate the model's observables, and find that it performs perfectly in the plane, but errs slightly away from the plane. When the new correction to the adiabatic approximation is used, the density, mean-streaming velocity and velocity dispersions are in error by less than 10 per cent for distances up to 2.5 kpc from the Sun. The torus-based model reveals that at locations above the plane, the long axis of the velocity ellipsoid points almost to the Galactic centre, even though the model potential is significantly flattened. This result contradicts the widespread belief that the shape of the Galaxy's potential can be strongly constrained by the orientation of velocity ellipsoid near the Sun. An analysis of individual orbits reveals that in a general potential the orientation of the velocity ellipsoid depends on the structure of the model's distribution function as much as on its gravitational potential, contrary to what is the case for Stäckel potentials. We argue that the adiabatic approximation will provide a valuable complement to torus-based models in the interpretation of current surveys of the Galaxy.

  20. Data for model validation summary report. A summary of data for validation and benchmarking of recovery boiler models

    SciTech Connect

    Grace, T.; Lien, S.; Schmidl, W.; Salcudean, M.; Abdullah, Z.

    1997-07-01

    One of the tasks in the project was to obtain data from operating recovery boilers for the purpose of model validation. Another task was to obtain water model data and computer output from University of British Columbia for purposes of benchmarking the UBC model against other codes. In the course of discussions on recovery boiler modeling over the course of this project, it became evident that there would be value in having some common cases for carrying out benchmarking exercises with different recovery boiler models. In order to facilitate such a benchmarking exercise, the data that was obtained on this project for validation and benchmarking purposes has been brought together in a single, separate report. The intent is to make this data available to anyone who may want to use it for model validation. The report contains data from three different cases. Case 1 is an ABBCE recovery boiler which was used for model validation. The data are for a single set of operating conditions. Case 2 is a Babcock & Wilcox recovery boiler that was modified by Tampella. In this data set, several different operating conditions were employed. The third case is water flow data supplied by UBC, along with computational output using the UBC code, for benchmarking purposes.

  1. Using the split Hopkinson pressure bar to validate material models.

    PubMed

    Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

    2014-08-28

    This paper gives a discussion of the use of the split Hopkinson bar, with particular reference to the requirements of materials modelling at QinetiQ: to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock-wave loading. This means understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress versus strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself also a valuable validation test of the material model. PMID:25071238

  2. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481

  3. Modeling and validation of microwave ablations with internal vaporization.

    PubMed

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L

    2015-02-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this paper, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10, and 20 mm away from the heating zone of the microwave antenna in homogenized ex vivo bovine liver setup. Cross-sectional area of water vapor transport was validated through intraprocedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67, and 0.69 at 1, 2, 3, 4, and 5 min, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
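
    A minimal sketch of the Jaccard index comparison used in both versions of this study, assuming the simulated vapor-concentration contour and the CT iso-density contour have each been rasterized to boolean masks; the masks here are synthetic placeholders.

    ```python
    import numpy as np

    def jaccard(mask_a, mask_b):
        """Intersection over union of two boolean masks."""
        inter = np.logical_and(mask_a, mask_b).sum()
        union = np.logical_or(mask_a, mask_b).sum()
        return inter / union if union else 1.0

    # Hypothetical 2-D cross-sections at one intra-procedural time point
    sim = np.zeros((100, 100), dtype=bool)
    sim[30:70, 30:70] = True
    ct = np.zeros((100, 100), dtype=bool)
    ct[35:75, 32:72] = True
    print(round(jaccard(sim, ct), 2))
    ```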

  4. Modeling of a Foamed Emulsion Bioreactor: II. Model Parametric Sensitivity

    E-print Network

    The sensitivity of a conceptual model of a foamed emulsion bioreactor (FEBR) used for the control of toluene vapors in air was examined. Model parametric sensitivity studies showed which parameters affect the removal

  5. Validation of Computer Models for Homeland Security Purposes

    SciTech Connect

    Schweppe, John E.; Ely, James H.; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-10-23

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources.

  6. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  7. Climate Model Validation Using Spectrally Resolved Shortwave Radiation Measurements

    NASA Astrophysics Data System (ADS)

    Roberts, Y.; Taylor, P. C.; Lukashin, C.; Feldman, D.; Pilewskie, P.; Collins, W.

    2013-12-01

    The climate science community has made significant strides in the development and improvement of Global Climate Models (GCMs) to predict how the Earth's climate system will change over the next several decades. It is crucial to evaluate how well these models reproduce observed climate variability using strict validation techniques to assist the climate modeling community with improving GCM prediction. The ability of climate models to simulate Earth's present-day climate is an initial evaluation of their ability to predict future changes in climate. Models are evaluated in several ways, including model intercomparison projects and comparing model simulations of physical variables with observations of those variables. We are developing new methods for rigorous climate model validation and physical attribution of the cause of model errors using existing, direct measurements of hyperspectral shortwave reflectance. We have also developed a SCIAMACHY-based (Scanning Imaging Absorption Spectrometer for Atmospheric Cartography) hyperspectral shortwave climate validation product to demonstrate using the product to validate GCMs. We are also investigating the information content added by using multispectral and hyperspectral data to study climate variability and validate climate models. The goal is to determine if it is necessary to use data with continuous spectral sampling across the shortwave spectral range, or if it is sufficient to use a subset of carefully selected spectral bands (e.g. MODIS-like data) to study climate trends and evaluate climate model performance. We are carrying out this activity by comparing the information content within broadband, multispectral (discrete-band sampling), and hyperspectral (high spectral resolution with continuous spectral sampling) data sets. Changes in climate-relevant atmospheric and surface variables impact the spectral, spatial, and temporal variability of Earth-reflected solar radiation (0.3-2.5 µm) through spectrally dependent scattering and absorption processes. Previous studies have demonstrated that highly accurate, hyperspectral (spectrally contiguous and overlapping) measurements of shortwave reflectance are important for monitoring climate variability from space. We are continuing to work to demonstrate that highly accurate, high information content hyperspectral shortwave measurements can be used to detect changes in climate, identify climate variance drivers, and validate GCMs.

  8. Radiative transfer model validations during the First ISLSCP Field Experiment

    NASA Technical Reports Server (NTRS)

    Frouin, Robert; Breon, Francois-Marie; Gautier, Catherine

    1990-01-01

    Two simple radiative transfer models, the 5S model based on Tanre et al. (1985, 1986) and the wide-band model of Morcrette (1984) are validated by comparing their outputs with results obtained during the First ISLSCP Field Experiment on concomitant radiosonde, aerosol turbidity, and radiation measurements and sky photographs. Results showed that the 5S model overestimated the short-wave irradiance by 13.2 W/sq m, whereas the Morcrette model underestimated the long-wave irradiance by 7.4 W/sq m.

  9. Hydrologic and water quality models: Use, calibration, and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper introduces a special collection of 22 research articles that present and discuss calibration and validation concepts in detail for hydrologic and water quality models by their developers and presents a broad framework for developing the American Society of Agricultural and Biological Engineers…

  10. Validation of Model Assumptions in Quality Of Life Measurements

    E-print Network

    Mesbah, Mounir

    Validation of Model Assumptions in Quality Of Life Measurements. Hamon, A., Dupuy, J.F., and Mesbah, M. University of South Brittany, Vannes, France. Abstract: The measurement of Quality of Life (QoL) … programs designed to evaluate Quality of Life (QoL). Now that the data from these trials are being analyzed…

  11. A Model for Investigating Predictive Validity at Highly Selective Institutions.

    ERIC Educational Resources Information Center

    Gross, Alan L.; And Others

    A statistical model for investigating predictive validity at highly selective institutions is described. When the selection ratio is small, one must typically deal with a data set containing relatively large amounts of missing data on both criterion and predictor variables. Standard statistical approaches are based on the strong assumption that…

  12. Validating Work Discrimination and Coping Strategy Models for Sexual Minorities

    ERIC Educational Resources Information Center

    Chung, Y. Barry; Williams, Wendi; Dispenza, Franco

    2009-01-01

    The purpose of this study was to validate and expand on Y. B. Chung's (2001) models of work discrimination and coping strategies among lesbian, gay, and bisexual persons. In semistructured individual interviews, 17 lesbians and gay men reported 35 discrimination incidents and their related coping strategies. Responses were coded based on Chung's…

  13. Validating Real Time Specifications using Real Time Event Queue Modeling

    Microsoft Academic Search

    Robert J. Hall

    2008-01-01

    Interrupt-driven real time control software is difficult to design and validate. It does not line up well with traditional state-based, timed-transition specification formalisms, due to the complexity of timers and the pending interrupt queue. The present work takes a new approach to the problem of modeling and tool-supported reasoning about such systems based on infinite-state modeling of the temporal event

  14. Propeller aircraft interior noise model utilization study and validation

    NASA Astrophysics Data System (ADS)

    Pope, L. D.

    1984-09-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  15. Propeller aircraft interior noise model utilization study and validation

    NASA Technical Reports Server (NTRS)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  16. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    SciTech Connect

    Smith, N. A. S. (nadia.smith@npl.co.uk); Correia, T. M. (tatiana.correia@npl.co.uk) [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom)]; Rokosz, M. K. (maciej.rokosz@npl.co.uk) [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Department of Materials, Imperial College London, London SW7 2AZ (United Kingdom)]

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  17. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    NASA Astrophysics Data System (ADS)

    Smith, N. A. S.; Rokosz, M. K.; Correia, T. M.

    2014-07-01

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  18. Validating the BHR RANS model for variable density turbulence

    SciTech Connect

    Israel, Daniel M [Los Alamos National Laboratory]; Gore, Robert A [Los Alamos National Laboratory]; Stalsberg-Zarling, Krista L [Los Alamos National Laboratory]

    2009-01-01

    The BHR RANS model is a turbulence model for multi-fluid flows in which density variation plays a strong role in the turbulence processes. In this paper, the authors demonstrate the usefulness of BHR over a wide range of flows that include the effects of shear, buoyancy, and shocks. The results are in good agreement with experimental and DNS data across the entire set of validation cases, with no need to retune model coefficients between cases. The model has potential application to a number of aerospace related flow problems.

  19. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  20. Validation of impaired renal function chick model with uranyl nitrate

    SciTech Connect

    Harvey, R.B.; Kubena, L.F.; Phillips, T.D.; Heidelbaugh, N.D.

    1986-01-01

    Uranium is a highly toxic element when soluble salts are administered parenterally, whereas the index of toxicity is very low when ingested. In the salt form, uranium is one of the oldest substances used experimentally to induce mammalian renal failure. Renal damage occurs when uranium reacts chemically with the protein of columnar cells lining the tubular epithelium, leading to cellular injury and necrosis. Uranyl nitrate (UN) is the most common uranium salt utilized for nephrotoxic modeling. The development of an impaired renal function (IRF) chick model required a suitable nephrotoxic compound, such as UN, for validation, yet toxicity data for chickens were notably absent in the literature. The objective of the present study was to validate the IRF model with UN, based upon preliminary nephrotoxic dosages developed in this laboratory.

  1. Angular Distribution Models for Top-of-Atmosphere Radiative Flux Estimation from the Clouds and the Earth's Radiant Energy System Instrument on the Tropical Rainfall Measuring Mission Satellite. Part II; Validation

    NASA Technical Reports Server (NTRS)

    Loeb, N. G.; Loukachine, K.; Wielicki, B. A.; Young, D. F.

    2003-01-01

    Top-of-atmosphere (TOA) radiative fluxes from the Clouds and the Earth's Radiant Energy System (CERES) are estimated from empirical angular distribution models (ADMs) that convert instantaneous radiance measurements to TOA fluxes. This paper evaluates the accuracy of CERES TOA fluxes obtained from a new set of ADMs developed for the CERES instrument onboard the Tropical Rainfall Measuring Mission (TRMM). The uncertainty in regional monthly mean reflected shortwave (SW) and emitted longwave (LW) TOA fluxes is less than 0.5 W/sq m, based on comparisons with TOA fluxes evaluated by direct integration of the measured radiances. When stratified by viewing geometry, TOA fluxes from different angles are consistent to within 2% in the SW and 0.7% (or 2 W/sq m) in the LW. In contrast, TOA fluxes based on ADMs from the Earth Radiation Budget Experiment (ERBE) applied to the same CERES radiance measurements show a 10% relative increase with viewing zenith angle in the SW and a 3.5% (9 W/sq m) decrease with viewing zenith angle in the LW. Based on multiangle CERES radiance measurements, 1° regional instantaneous TOA flux errors from the new CERES ADMs are estimated to be 10 W/sq m in the SW and 3.5 W/sq m in the LW. The errors show little or no dependence on cloud phase, cloud optical depth, and cloud infrared emissivity. An analysis of cloud radiative forcing (CRF) sensitivity to differences between ERBE and CERES TRMM ADMs, scene identification, and directional models of albedo as a function of solar zenith angle shows that ADM and clear-sky scene identification differences can lead to an 8 W/sq m root-mean-square (rms) difference in 1° daily mean SW CRF and a 4 W/sq m rms difference in LW CRF. In contrast, monthly mean SW and LW CRF differences reach 3 W/sq m. CRF is found to be relatively insensitive to differences between the ERBE and CERES TRMM directional models.
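
    For readers unfamiliar with empirical ADMs, the core conversion is the standard relation F = πI/R, where I is the measured radiance and R is the anisotropic factor for the scene type and the viewing and solar geometry. A minimal sketch with invented numbers (the real CERES ADMs tabulate R by scene class and angular bin):

```python
import math

def adm_flux(radiance: float, anisotropic_factor: float) -> float:
    """TOA flux (W/sq m) from an instantaneous radiance (W/sq m/sr)
    via the standard empirical-ADM relation F = pi * I / R."""
    return math.pi * radiance / anisotropic_factor

# Hypothetical SW radiance of 80 W/sq m/sr in an angular bin whose
# anisotropic factor is 1.05 (numbers invented for illustration).
print(f"{adm_flux(80.0, 1.05):.1f} W/sq m")
```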

  2. The range of validity of sorption kinetic models.

    PubMed

    Douven, Sigrid; Paez, Carlos A; Gommes, Cedric J

    2015-06-15

    Several hundred papers are published yearly reporting liquid-phase adsorption kinetics data. In general, the data are analyzed using a variety of standard models such as the pseudo first- and second-order models and the intraparticle-diffusion model. The validity of these models is often assessed empirically via their ability to fit the data, independently of their physicochemical soundness. The aim of the present paper is to rationalize the analysis of liquid-phase adsorption kinetics data, and to investigate experimental factors that influence the adsorption kinetics, in addition to the characteristics of the adsorbent material itself. For that purpose we use a simple Langmuir adsorption-diffusion model, which enables us to identify three dimensionless numbers that characterize the working regime of any batch adsorption experiment: an adsorption Thiele modulus, a saturation modulus, and a loading modulus. The standard models are found to be particular cases of the general adsorption-diffusion model for specific values of the dimensionless numbers. This provides sound physicochemical criteria for the validity of the models. Based on our modeling, we also propose a general yet simple data analysis procedure to practically estimate the diffusion coefficient in adsorbent pellets starting from adsorption half-times. PMID:25765735
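
    The paper's own procedure ties the half-time analysis to its dimensionless moduli; as a rough point of reference only, the classical constant-surface-concentration result for a sphere, D·t½/R² ≈ 0.030 (Crank), already gives an order-of-magnitude estimate in the purely diffusion-limited regime. A sketch under that stated assumption:

```python
def diffusion_coefficient_from_half_time(radius_m: float, half_time_s: float) -> float:
    """Rough diffusion coefficient (m^2/s) in a spherical pellet from the
    adsorption half-time, assuming the classical constant-surface-
    concentration sphere solution D * t_half / R^2 ~= 0.030 (Crank).
    Indicative only: the paper's procedure refines this for the
    saturation and loading regimes."""
    return 0.030 * radius_m ** 2 / half_time_s

# Hypothetical pellet of radius 0.5 mm with an observed half-time of 10 min.
print(diffusion_coefficient_from_half_time(0.5e-3, 600.0))  # ~1.3e-11 m^2/s
```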

  3. Prediction of driving ability: Are we building valid models?

    PubMed

    Hoggarth, Petra A; Innes, Carrie R H; Dalrymple-Alford, John C; Jones, Richard D

    2015-04-01

    The prediction of on-road driving ability using off-road measures is a key aim in driving research. The primary goal in most classification models is to determine a small number of off-road variables that predict driving ability with high accuracy. Unfortunately, classification models are often over-fitted to the study sample, leading to inflation of predictive accuracy, poor generalization to the relevant population and, thus, poor validity. Many driving studies do not report sufficient details to determine the risk of model over-fitting and few report any validation technique, which is critical to test the generalizability of a model. After reviewing the literature, we generated a model using a moderately large sample size (n=279) employing best practice techniques in the context of regression modelling. By then randomly selecting progressively smaller sample sizes we show that a low ratio of participants to independent variables can result in over-fitted models and spurious conclusions regarding model accuracy. We conclude that more stable models can be constructed by following a few guidelines. PMID:25667204
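
    The over-fitting effect described here is easy to reproduce: regressing a pure-noise outcome on several predictors yields a flattering apparent R² in small samples, while cross-validated accuracy collapses. A minimal simulation (invented data, not the study's; scikit-learn assumed):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_predictors = 10
for n in (279, 60, 25):                       # progressively smaller samples
    X = rng.normal(size=(n, n_predictors))    # pure-noise "off-road" scores
    y = rng.normal(size=n)                    # unrelated "on-road" outcome
    model = LinearRegression().fit(X, y)
    apparent = model.score(X, y)              # R^2 on the training sample
    cv = cross_val_score(model, X, y, cv=5).mean()  # generalization estimate
    print(f"n={n:3d}  apparent R^2={apparent:.2f}  cross-validated R^2={cv:.2f}")
```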

  4. Robust cross-validation of linear regression QSAR models.

    PubMed

    Konovalov, Dmitry A; Llewellyn, Lyndon E; Vander Heyden, Yvan; Coomans, Danny

    2008-10-01

    A quantitative structure-activity relationship (QSAR) model is typically developed to predict the biochemical activity of untested compounds from the compounds' molecular structures. "The gold standard" of model validation is the blindfold prediction when the model's predictive power is assessed from how well the model predicts the activity values of compounds that were not considered in any way during the model development/calibration. However, during the development of a QSAR model, it is necessary to obtain some indication of the model's predictive power. This is often done by some form of cross-validation (CV). In this study, the concepts of the predictive power and fitting ability of a multiple linear regression (MLR) QSAR model were examined in the CV context allowing for the presence of outliers. Commonly used predictive power and fitting ability statistics were assessed via Monte Carlo cross-validation when applied to percent human intestinal absorption, blood-brain partition coefficient, and toxicity values of saxitoxin QSAR data sets, as well as three known benchmark data sets with known outlier contamination. It was found that (1) a robust version of MLR should always be preferred over the ordinary-least-squares MLR, regardless of the degree of outlier contamination and that (2) the model's predictive power should only be assessed via robust statistics. The Matlab and java source code used in this study is freely available from the QSAR-BENCH section of www.dmitrykonovalov.org for academic use. The Web site also contains the java-based QSAR-BENCH program, which could be run online via java's Web Start technology (supporting Windows, Mac OSX, Linux/Unix) to reproduce most of the reported results or apply the reported procedures to other data sets. PMID:18826208
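
    The study's procedures are implemented in its own Matlab/Java QSAR-BENCH code; the sketch below only illustrates the general idea — Monte Carlo cross-validation comparing ordinary least squares against a robust regressor on outlier-contaminated data — using scikit-learn's HuberRegressor as a stand-in robust MLR.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression
from sklearn.model_selection import ShuffleSplit, cross_val_score

rng = np.random.default_rng(42)
n, p = 100, 5
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.1 * rng.normal(size=n)
idx = rng.choice(n, size=10, replace=False)
y[idx] += rng.normal(scale=10.0, size=10)     # contaminate 10% with outliers

mccv = ShuffleSplit(n_splits=100, test_size=0.2, random_state=0)
for name, est in [("OLS", LinearRegression()), ("robust", HuberRegressor())]:
    scores = cross_val_score(est, X, y, cv=mccv, scoring="r2")
    print(f"{name:6s} median cross-validated R^2 = {np.median(scores):.3f}")
```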

  5. A verification and validation process for model-driven engineering

    NASA Astrophysics Data System (ADS)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  6. Concurrent Validity of the WISC-IV and DAS-II in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Kuriakose, Sarah

    2014-01-01

    Cognitive assessments are used for a variety of research and clinical purposes in children with autism spectrum disorder (ASD). This study establishes concurrent validity of the Wechsler Intelligence Scales for Children-fourth edition (WISC-IV) and Differential Ability Scales-second edition (DAS-II) in a sample of children with ASD with a broad…

  7. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  8. Verifying and Validating Proposed Models for FSW Process Optimization

    NASA Technical Reports Server (NTRS)

    Schneider, Judith

    2008-01-01

    This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms

  9. Validation of Advanced EM Models for UXO Discrimination

    NASA Astrophysics Data System (ADS)

    Weichman, Peter B.

    2013-07-01

    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data was collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aids in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  10. Validation of the Health-Promoting Lifestyle Profile II for Hispanic male truck drivers in the Southwest.

    PubMed

    Mullins, Iris L; O'Day, Trish; Kan, Tsz Yin

    2013-08-01

    The aims of the study were to validate the English and Spanish versions of the Health-Promoting Lifestyle Profile II (HPLP II) with Hispanic male truck drivers and to determine if there were any differences in drivers' responses based on driving responsibility. The methods included a descriptive correlation design, the HPLP II (English and Spanish versions), and a demographic questionnaire. Fifty-two Hispanic drivers participated in the study. There were no significant differences in long-haul and short-haul drivers' responses to the HPLP II. Cronbach's alpha for the Spanish version was .97, and the subscale alphas ranged from .74 to .94. The English version alpha was .92, and the subscales ranged from .68 to .84. Findings suggest the subscales of Health Responsibility, Physical Activity, Nutrition, and Spiritual Growth on the HPLP II Spanish and English versions may not adequately assess health-promoting behaviors and cultural influences for the Hispanic male population in the southwestern border region. PMID:23047981
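
    Cronbach's alpha, the reliability statistic reported above for each version and subscale, is computed from the item variances and the variance of the total score: α = k/(k−1)·(1 − Σσᵢ²/σ²_total). A minimal sketch with invented Likert responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 4-point Likert responses: 5 respondents, 3 items.
scores = np.array([[4, 3, 4], [2, 2, 3], [3, 3, 3], [4, 4, 4], [1, 2, 2]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```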

  11. Aggregating validity indicators embedded in Conners' CPT-II outperforms individual cutoffs at separating valid from invalid performance in adults with traumatic brain injury.

    PubMed

    Erdodi, Laszlo A; Roth, Robert M; Kirsch, Ned L; Lajiness-O'Neill, Renee; Medoff, Brent

    2014-08-01

    Continuous performance tests (CPT) provide a useful paradigm to assess vigilance and sustained attention. However, few established methods exist to assess the validity of a given response set. The present study examined embedded validity indicators (EVIs) previously found effective at dissociating valid from invalid performance in relation to well-established performance validity tests in 104 adults with TBI referred for neuropsychological testing. Findings suggest that aggregating EVIs increases their signal detection performance. While individual EVIs performed well at their optimal cutoffs, two specific combinations of these five indicators generally produced the best classification accuracy. A CVI-5A ≥3 had a specificity of .92-.95 and a sensitivity of .45-.54. At ≥4, the CVI-5B had a specificity of .94-.97 and a sensitivity of .40-.50. The CVI-5s provide a single numerical summary of the cumulative evidence of invalid performance within the CPT-II. Results support the use of a flexible, multivariate approach to performance validity assessment. PMID:24957927
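
    The aggregation logic is simple to express: dichotomize each embedded validity indicator at its individual cutoff, count the failures, and flag the profile when the count meets the composite cutoff; sensitivity and specificity then follow from comparison with a reference criterion. The sketch below is illustrative only — the data and names are invented, not the study's.

```python
import numpy as np

def flag_invalid(evi_failures: np.ndarray, cutoff: int) -> np.ndarray:
    """Flag a profile invalid when the number of failed EVIs meets the cutoff."""
    return evi_failures.sum(axis=1) >= cutoff

# Rows = examinees, columns = 5 EVIs (1 = failed its individual cutoff).
evi = np.array([[1, 1, 1, 0, 0],
                [0, 0, 1, 0, 0],
                [1, 1, 1, 1, 1],
                [0, 0, 0, 0, 0]])
truly_invalid = np.array([True, False, True, False])   # reference PVT result

flagged = flag_invalid(evi, cutoff=3)
sensitivity = (flagged & truly_invalid).sum() / truly_invalid.sum()
specificity = (~flagged & ~truly_invalid).sum() / (~truly_invalid).sum()
print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")
```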

  12. STRUCTURAL VALIDATION OF SYSTEM DYNAMICS AND AGENT-BASED SIMULATION MODELS

    E-print Network

    Tesfatsion, Leigh

    STRUCTURAL VALIDATION OF SYSTEM DYNAMICS AND AGENT-BASED SIMULATION MODELS. Hassan Qudrat (hassanq@yorku.ca). Keywords: Simulation; System Dynamics; Structural Validity. Abstract: Simulation models … population dynamics, energy systems, and urban planning. The usefulness of these models is predicated…

  13. Microelectronics package design using experimentally-validated modeling and simulation.

    SciTech Connect

    Johnson, Jay Dean; Young, Nathan Paul; Ewsuk, Kevin Gregory

    2010-11-01

    Packaging high power radio frequency integrated circuits (RFICs) in low temperature cofired ceramic (LTCC) presents many challenges. Within the constraints of LTCC fabrication, the design must provide the usual electrical isolation and interconnections required to package the IC, with additional consideration given to RF isolation and thermal management. While iterative design and prototyping is an option for developing RFIC packaging, it would be expensive and most likely unsuccessful due to the complexity of the problem. To facilitate and optimize package design, thermal and mechanical simulations were used to understand and control the critical parameters in LTCC package design. The models were validated through comparisons to experimental results. This paper summarizes an experimentally-validated modeling approach to RFIC package design, and presents some results and key findings.

  14. Rationality Validation of a Layered Decision Model for Network Defense

    SciTech Connect

    Wei, Huaqiang; Alves-Foss, James; Zhang, Du; Frincke, Deb

    2007-08-31

    We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

  15. Approaches to Validation of Models for Low Gravity Fluid Behavior

    NASA Technical Reports Server (NTRS)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlations rather than fundamental relations; detailed measurements of the flow field have not been made; free surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, but the zero-gravity time available has been only seconds.

  16. Seine estuary modelling and AirSWOT measurements validation

    NASA Astrophysics Data System (ADS)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR, the Ka-band SWOT Phenomenology Airborne Radar. Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream main river channel (to quantify river-aquifer exchange), and some wetlands. The objective of the present work is to validate the ability of AirSWOT and SWOT using Seine estuary hydrodynamic modelling. In this context, field measurements will be collected by different teams such as the GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), and ADES (Data Access Groundwater). These datasets will be used first to validate AirSWOT measurements locally, and then to improve the hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows, etc.) for 2D validation of the AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling two different regions separately and then combining them: the Seine River and its estuarine area on the one hand, and the English Channel on the other. These two simulations are currently being improved by testing different roughness coefficients and adding tributary inflows. Groundwater contributions will also be introduced (TUGOm development in progress). The model outputs will be validated using the GPMR tide gauge data and measurements from the Topex/Poseidon and Jason-1/-2 altimeters for the year 2007.

  17. Validating a model of patient satisfaction with emergency care

    Microsoft Academic Search

    Benjamin C. Sun; James G. Adams; Helen R. Burstin

    2001-01-01

    Study Objective: We sought to validate a previously developed model of emergency department patient satisfaction in a general population using a standard mailed format. The study aims to export the findings of a comprehensive ED quality-of-care study to an easily measured patient population. Methods: A double-sided, single-page survey was mailed to all patients discharged home from 4 teaching hospital EDs

  18. The validation of ecosystem models of turbid estuaries

    NASA Astrophysics Data System (ADS)

    Radford, P. J.; Ruardij, P.

    1987-11-01

    The ecosystem model of the Bristol Channel and Severn Estuary (GEMBASE) was fitted to 3 years of survey data, and has subsequently been validated against a further 5 years of monitoring data. A control chart technique clearly demonstrates that the model is, on the whole, an adequate representation of the estuarine carbon cycle, although the precision of model estimates reduces with increasing trophic level. An ecosystem model of the Ems Estuary has been adapted to simulate the Severn Estuary, and the impact of introducing a notional tidal power scheme assessed. The results were compared to those obtained using GEMBASE in the Severn. The broad predictions from both models are in agreement, although some detail is at variance, which implies that the fundamental ecological assumptions of the models are compatible.
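
    The control chart technique can be sketched in a few lines: residuals between model output and observations over the calibration years define a center line and control limits, and each monitoring-year residual is then checked against them. All numbers below are invented for illustration.

```python
import numpy as np

# Residuals (model minus observation) from the calibration period set the
# control limits; monitoring-period residuals are tested against them.
calibration = np.array([0.3, -0.5, 0.1, 0.4, -0.2, 0.0, -0.3, 0.2])
center = calibration.mean()
sigma = calibration.std(ddof=1)
lcl, ucl = center - 2 * sigma, center + 2 * sigma   # 2-sigma control limits

monitoring = np.array([0.2, -0.4, 0.9, 0.1, -0.1])
for i, r in enumerate(monitoring, start=1):
    status = "in control" if lcl <= r <= ucl else "OUT OF CONTROL"
    print(f"sample {i}: residual {r:+.2f} -> {status}")
```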

  19. Cryo-EM model validation using independent map reconstructions

    PubMed Central

    DiMaio, Frank; Zhang, Junjie; Chiu, Wah; Baker, David

    2013-01-01

    An increasing number of cryo-electron microscopy (cryo-EM) density maps are being generated with suitable resolution to trace the protein backbone and guide sidechain placement. Generating and evaluating atomic models based on such maps would be greatly facilitated by independent validation metrics for assessing the fit of the models to the data. We describe such a metric based on the fit of atomic models with independent test maps from single particle reconstructions not used in model refinement. The metric provides a means to determine the proper balance between the fit to the density and model energy and stereochemistry during refinement, and is likely to be useful in determining values of model building and refinement metaparameters quite generally. PMID:23592445

  20. In-Drift Microbial Communities Model Validation Calculations

    SciTech Connect

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3, and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  1. In-Drift Microbial Communities Model Validation Calculation

    SciTech Connect

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3, and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  2. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    SciTech Connect

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code with both scientific measurements of microbial populations at the site and laboratory and with natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System Department (EBS) process modeling effort that eventually will feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 using its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3, and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  3. Clutter model validation for millimeter wave (MMW) seekers

    NASA Astrophysics Data System (ADS)

    Salemme, R.; Bowyer, D.; Merritt, R.

    1980-10-01

    In the clutter modeling process, an empirical approach is used in which deterministic map data are used to establish homogeneous terrain subareas. These subareas are then represented, statistically, with a spatial distribution for the median clutter backscatter from each cell and a temporal distribution for the scintillation around this median. In addition, a spatial correlation is applied to the median backscatter for adjacent cells. To validate these clutter models, the primary method is to use the statistics which are derived from measurement data over a variety of different terrain types to verify the statistics in the clutter model. An alternate method is to compare the actual seeker output from flight tests over a specific test site to the output of a simulation of this same test flight. The requirement for validated clutter models led to the establishment of an extensive clutter measurements program. The requirements for this measurements program were established based on the clutter parameters which must be validated and projected flight configurations of tactical seekers which are currently under development.

  4. Full-scale validation of a model of algal productivity.

    PubMed

    Béchet, Quentin; Shilton, Andy; Guieysse, Benoit

    2014-12-01

    While modeling algal productivity outdoors is crucial to assess the economic and environmental performance of full-scale cultivation, most of the models hitherto developed for this purpose have not been validated under fully relevant conditions, especially with regard to temperature variations. The objective of this study was to independently validate a model of algal biomass productivity accounting for both light and temperature and constructed using parameters experimentally derived using short-term indoor experiments. To do this, the accuracy of a model developed for Chlorella vulgaris was assessed against data collected from photobioreactors operated outdoor (New Zealand) over different seasons, years, and operating conditions (temperature-control/no temperature-control, batch, and fed-batch regimes). The model accurately predicted experimental productivities under all conditions tested, yielding an overall accuracy of ±8.4% over 148 days of cultivation. For the purpose of assessing the feasibility of full-scale algal cultivation, the use of the productivity model was therefore shown to markedly reduce uncertainty in cost of biofuel production while also eliminating uncertainties in water demand, a critical element of environmental impact assessments. Simulations at five climatic locations demonstrated that temperature-control in outdoor photobioreactors would require tremendous amounts of energy without considerable increase of algal biomass. Prior assessments neglecting the impact of temperature variations on algal productivity in photobioreactors may therefore be erroneous. PMID:25369326

  5. Cannonsville Reservoir Watershed SWAT2000 model development, calibration and validation

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan A.; Shoemaker, Christine A.

    2007-04-01

    Summary: The Soil and Water Assessment Tool version 2000 (SWAT2000) watershed model was utilized to simulate the transport of flow, sediments, and phosphorus to the Cannonsville Reservoir in upstate New York. The available datasets for model development, particularly the phosphorus input and water quality calibration data, in this case study are unique because of the large amount of watershed-specific, spatially and temporally varying data available. Relative to the default SWAT inputs, alternative model input generation methodologies were tested and shown to produce more representative inputs that generate substantially different simulation results. The successful application of SWAT2000 in this case study required two critical model modifications regarding excess soil water movement in frozen soils and soil erosion predictions under snow cover. The Nash-Sutcliffe coefficient of efficiency (ENS) for daily flows at the main flow station in the watershed was at least 0.80 in both the seven-year calibration period and the one-year and four-year validation periods. Average monthly total phosphorus loads were predicted within 15% of the corresponding measured data, and the monthly ENS coefficients for total phosphorus were at least 0.63 in the calibration and validation periods. The results of this study are important for future SWAT modelling studies in gauged and ungauged watersheds, especially those in regions like the Northeast US that are subject to freezing temperatures in winter.
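
    The Nash-Sutcliffe coefficient of efficiency used above is one minus the ratio of the squared model error to the variance of the observations about their mean: ENS = 1 is a perfect fit, and ENS ≤ 0 means the model predicts no better than the observed mean. A minimal implementation with invented daily flows:

```python
import numpy as np

def nash_sutcliffe(observed, simulated) -> float:
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical observed and simulated daily flows (m^3/s).
obs = [12.0, 15.0, 30.0, 22.0, 18.0]
sim = [11.0, 16.0, 27.0, 24.0, 17.0]
print(f"ENS = {nash_sutcliffe(obs, sim):.2f}")
```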

  6. Validation of a transparent decision model to rate drug interactions

    PubMed Central

    2012-01-01

    Background: Multiple databases provide ratings of drug-drug interactions. The ratings are often based on different criteria and lack background information on the decision-making process. User acceptance of rating systems could be improved by providing a transparent decision path for each category. Methods: We rated 200 randomly selected potential drug-drug interactions by a transparent decision model developed by our team. The cases were generated from ward round observations and physicians' queries from an outpatient setting. We compared our ratings to those assigned by a senior clinical pharmacologist and by a standard interaction database, and thus validated the model. Results: The decision model rated consistently with the standard database and the pharmacologist in 94 and 156 cases, respectively. In two cases the model decision required correction. Following removal of systematic model construction differences, the decision model was fully consistent with the other rating systems. Conclusion: The decision model reproducibly rates interactions and elucidates systematic differences. We propose to supply validated decision paths alongside the interaction rating to improve comprehensibility and to enable physicians to interpret the ratings in a clinical context. PMID:22950884

  7. The Space Weather Modeling Framework (SWMF): Models and Validation

    Microsoft Academic Search

    Tamas Gombosi; Gabor Toth; Igor Sokolov; Darren de Zeeuw; Bart van der Holst; Aaron Ridley; Ward Manchester IV

    2010-01-01

    In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF) that efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere or the global magneto-sphere) are based on MHD and are

  8. Validation of the WATEQ4 geochemical model for uranium

    SciTech Connect

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

  9. Dynamic valid models for the conservative Hénon-Heiles system

    NASA Astrophysics Data System (ADS)

    Bastos, S. B.; Mendes, E. M. A. M.

    2011-03-01

    In this work, the discretization of the Hénon-Heiles system obtained by applying the Monaco and Normand-Cyrot method is investigated. In order to obtain dynamically valid models, several approaches are analyzed, ranging from the choice of terms in the difference equation originating from the discretization process to an increase of the discretization order. In conclusion, it is shown that discretized models that preserve both the symmetry and the stability of their continuous counterpart can be obtained, even for large discretization steps.
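
    The Monaco and Normand-Cyrot discretization itself is beyond a short sketch, but the kind of dynamical-validity check the paper is concerned with — does the discrete model preserve the invariants of its continuous counterpart? — can be illustrated with a stand-in symplectic Störmer-Verlet step for the Hénon-Heiles Hamiltonian H = ½(px² + py²) + ½(x² + y²) + x²y − y³/3, monitoring the energy drift over many steps. This is not the method used in the paper, only an analogous test.

```python
import numpy as np

def accel(q):
    """Henon-Heiles accelerations: dpx/dt = -x - 2xy, dpy/dt = -y - x^2 + y^2."""
    x, y = q
    return np.array([-x - 2.0 * x * y, -y - x * x + y * y])

def energy(q, p):
    x, y = q
    return 0.5 * (p @ p) + 0.5 * (x * x + y * y) + x * x * y - y ** 3 / 3.0

def verlet_step(q, p, h):
    """One symplectic Stormer-Verlet step of size h."""
    p_half = p + 0.5 * h * accel(q)
    q_new = q + h * p_half
    p_new = p_half + 0.5 * h * accel(q_new)
    return q_new, p_new

q, p = np.array([0.1, 0.0]), np.array([0.0, 0.35])
e0 = energy(q, p)
for _ in range(10_000):
    q, p = verlet_step(q, p, h=0.01)
print(f"relative energy drift: {abs(energy(q, p) - e0) / abs(e0):.2e}")
```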

  10. Robust design and model validation of nonlinear compliant micromechanisms.

    SciTech Connect

    Howell, Larry L. (Brigham Young University, Provo, UT); Baker, Michael Sean; Wittwer, Jonathan W. (Brigham Young University, Provo, UT)

    2005-02-01

    Although the use of compliance or elastic flexibility in microelectromechanical systems (MEMS) helps eliminate friction, wear, and backlash, compliant MEMS are known to be sensitive to variations in material properties and feature geometry, resulting in large uncertainties in performance. This paper proposes an approach for design stage uncertainty analysis, model validation, and robust optimization of nonlinear MEMS to account for critical process uncertainties including residual stress, layer thicknesses, edge bias, and material stiffness. A fully compliant bistable micromechanism (FCBM) is used as an example, demonstrating that the approach can be used to handle complex devices involving nonlinear finite element models. The general shape of the force-displacement curve is validated by comparing the uncertainty predictions to measurements obtained from in situ force gauges. A robust design is presented, where simulations show that the estimated force variation at the point of interest may be reduced from {+-}47 {micro}N to {+-}3 {micro}N. The reduced sensitivity to process variations is experimentally validated by measuring the second stable position at multiple locations on a wafer.

  11. Validation Analysis of the Shoal Groundwater Flow and Transport Model

    SciTech Connect

    A. Hassan; J. Chapman

    2008-11-01

    Environmental restoration at the Shoal underground nuclear test is following a process prescribed by a Federal Facility Agreement and Consent Order (FFACO) between the U.S. Department of Energy, the U.S. Department of Defense, and the State of Nevada. Characterization of the site included two stages of well drilling and testing in 1996 and 1999, and development and revision of numerical models of groundwater flow and radionuclide transport. Agreement on a contaminant boundary for the site and a corrective action plan was reached in 2006. Later that same year, three wells were installed for the purposes of model validation and site monitoring. The FFACO prescribes a five-year proof-of-concept period for demonstrating that the site groundwater model is capable of producing meaningful results with an acceptable level of uncertainty. The corrective action plan specifies a rigorous seven step validation process. The accepted groundwater model is evaluated using that process in light of the newly acquired data. The conceptual model of ground water flow for the Project Shoal Area considers groundwater flow through the fractured granite aquifer comprising the Sand Springs Range. Water enters the system by the infiltration of precipitation directly on the surface of the mountain range. Groundwater leaves the granite aquifer by flowing into alluvial deposits in the adjacent basins of Fourmile Flat and Fairview Valley. A groundwater divide is interpreted as coinciding with the western portion of the Sand Springs Range, west of the underground nuclear test, preventing flow from the test into Fourmile Flat. A very low conductivity shear zone east of the nuclear test roughly parallels the divide. The presence of these lateral boundaries, coupled with a regional discharge area to the northeast, is interpreted in the model as causing groundwater from the site to flow in a northeastward direction into Fairview Valley. Steady-state flow conditions are assumed given the absence of groundwater withdrawal activities in the area. The conceptual and numerical models were developed based upon regional hydrogeologic investigations conducted in the 1960s, site characterization investigations (including ten wells and various geophysical and geologic studies) at Shoal itself prior to and immediately after the test, and two site characterization campaigns in the 1990s for environmental restoration purposes (including eight wells and a year-long tracer test). The new wells are denoted MV-1, MV-2, and MV-3, and are located to the northnortheast of the nuclear test. The groundwater model was generally lacking data in the north-northeastern area; only HC-1 and the abandoned PM-2 wells existed in this area. The wells provide data on fracture orientation and frequency, water levels, hydraulic conductivity, and water chemistry for comparison with the groundwater model. A total of 12 real-number validation targets were available for the validation analysis, including five values of hydraulic head, three hydraulic conductivity measurements, three hydraulic gradient values, and one angle value for the lateral gradient in radians. In addition, the fracture dip and orientation data provide comparisons to the distributions used in the model and radiochemistry is available for comparison to model output. Goodness-of-fit analysis indicates that some of the model realizations correspond well with the newly acquired conductivity, head, and gradient data, while others do not. 
Other tests indicated that additional model realizations may be needed to test if the model input distributions need refinement to improve model performance. This approach (generating additional realizations) was not followed because it was realized that there was a temporal component to the data disconnect: the new head measurements are on the high side of the model distributions, but the heads at the original calibration locations themselves have also increased over time. This indicates that the steady-state assumption of the groundwater model is in error. To test the robustness of the model d
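
    The goodness-of-fit screening described above reduces to a compact calculation. Below is a minimal sketch, with entirely hypothetical target values and weights (the report's actual 12 validation targets are not reproduced), of scoring stochastic model realizations against field targets with a weighted root-mean-square error:

        # Score model realizations against validation targets (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical targets: 5 heads (m), 3 log10 conductivities, 3 gradients,
        # and 1 lateral-gradient angle (rad); these are not the Shoal values.
        targets = np.array([1220.0, 1218.5, 1215.2, 1217.9, 1219.3,
                            -2.1, -2.6, -1.9,
                            0.004, 0.006, 0.005,
                            0.35])
        scales = np.array([1.0] * 5 + [0.5] * 3 + [0.002] * 3 + [0.1])  # unit weights

        def score(predicted):
            """Weighted RMSE of one realization over all validation targets."""
            return np.sqrt(np.mean(((predicted - targets) / scales) ** 2))

        # 100 synthetic realizations perturbed around the targets
        realizations = targets + rng.normal(size=(100, targets.size)) * scales
        rmse = np.array([score(r) for r in realizations])
        accepted = rmse < np.percentile(rmse, 25)  # keep the best-fitting quartile
        print(f"accepted {accepted.sum()} of {rmse.size} realizations")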

  12. A turbulence model for iced airfoils and its validation

    NASA Technical Reports Server (NTRS)

    Shin, Jaiwon; Chen, Hsun H.; Cebeci, Tuncer

    1992-01-01

    A turbulence model based on the extension of the algebraic eddy viscosity formulation of Cebeci and Smith developed for two-dimensional flows over smooth and rough surfaces is described for iced airfoils and validated for computed ice shapes obtained for a range of total temperatures varying from 28 to -15 F. The validation is made with an interactive boundary layer method which uses a panel method to compute the inviscid flow and an inverse finite difference boundary layer method to compute the viscous flow. The interaction between inviscid and viscous flows is established by the use of the Hilbert integral. The calculated drag coefficients compare well with recent experimental data taken at the NASA-Lewis Icing Research Tunnel (IRT) and show that, in general, the drag increase due to ice accretion can be predicted well and efficiently.
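
    As context for the baseline being extended, here is a minimal sketch of the smooth-wall Cebeci-Smith algebraic eddy-viscosity formulation; the velocity profile, friction velocity, and displacement thickness below are illustrative stand-ins, and the paper's rough/iced-surface extension is not reproduced:

        # Cebeci-Smith two-layer algebraic eddy viscosity (smooth-wall sketch).
        import numpy as np

        kappa, alpha = 0.40, 0.0168   # von Karman constant, outer-layer constant
        nu = 1.5e-5                   # kinematic viscosity, m^2/s

        def cebeci_smith_nut(y, u, u_e, delta_star, u_tau):
            """Inner mixing-length law with van Driest damping, outer Clauser
            formula; the smaller of the two is taken at each wall distance y."""
            dudy = np.gradient(u, y)
            A = 26.0 * nu / u_tau                        # van Driest damping length
            l_mix = kappa * y * (1.0 - np.exp(-y / A))   # damped mixing length
            nut_inner = l_mix**2 * np.abs(dudy)
            nut_outer = alpha * u_e * delta_star * np.ones_like(y)
            return np.minimum(nut_inner, nut_outer)

        # Example: crude 1/7th-power-law boundary-layer profile (illustrative)
        y = np.linspace(1e-5, 0.01, 200)
        u_e, delta = 50.0, 0.01
        u = u_e * (y / delta) ** (1 / 7)
        print(cebeci_smith_nut(y, u, u_e, delta_star=delta / 8, u_tau=2.0)[:5])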

  14. Checking Transformation Model Properties with a UML and OCL Model Validator

    E-print Network

    Gogolla, Martin - Fachbereich 3

    Checking Transformation Model Properties with a UML and OCL Model Validator. Martin Gogolla, Lars Hamann, Frank Hilken ({gogolla|lhamann|fhilken}@informatik.uni-bremen.de). Abstract. This paper studies model transformations in the form of transformation models connecting source and target metamodels. We propose to analyze transformation models with a UML and OCL tool on the basis

  15. Model validation and assessment of uncertainty in a deterministic model for gonorrhea

    E-print Network

    Givens, Geof H.

    Model validation and assessment of uncertainty in a deterministic model for gonorrhea: gonorrhea incidence in upstate New York using data from Rothenberg (1983) and a model similar to one model for gonorrhea. In its most general form, a deterministic model may be thought of as a mapping from

  16. A proposal to use geoid slope validation lines to validate models of geoid change

    NASA Astrophysics Data System (ADS)

    Smith, D. A.

    2010-12-01

    The United States National Geodetic Survey (NGS) has embarked on a ten year project called GRAV-D (Gravity for the Redefinition of the American Vertical Datum). The purpose of this project is to replace the current official vertical datum, NAVD 88 (the North American Vertical Datum of 1988), with a geopotential reference system based on a new survey of the gravity field and a gravimetric geoid. As part of GRAV-D, the National Geodetic Survey will develop a set of “geoid slope validation lines” at various locations of the country. These lines will be surveys designed to independently measure the slope of the geoid to provide a check against both the data and theory used to create the final gravimetric geoid which will be used in the geopotential reference system. The first of these lines is proposed to be established in the Autumn of 2011 in the west central region of Texas. The survey will be approximately 300 kilometers long, consisting of GPS, geodetic leveling, deflections of the vertical, and surface absolute and relative gravity, including the use of relative meters for low-high surface gradient determination. This region was chosen for many factors, including the availability of GRAV-D airborne gravity over the area, its relatively low elevation (220 meter orthometric height maximum), its geoid slope (a few decimeters over 300 km according to the latest high-resolution models), lack of significant topographic relief, lack of large forestation, availability of good roads, clarity of weather and lack of large water crossings. Further lines are planned in the out-years, in more difficult areas, though their locations are not yet determined. Although the original intent of these lines was to serve as calibrations against geoid modeling data and theory, there may be additional uses relevant to geoid monitoring. A gap is anticipated between the GRACE and GRACE Follow-On missions. GRACE has shown a quantifiable change (millimeters per year) in the geoid over parts of North America. As such, the GRAV-D project contains plans to monitor geoid change. However, without GRACE, some method of modeling geoid change and then testing that model must be developed. It is proposed, therefore, that as NGS develops more “geoid slope validation lines”, some consideration be made to placing one or more of them in areas of known, ongoing geoid change. Re-surveying of these lines would yield a direct, independent look at actual geoid change along the line. The sparseness and linear nature of such lines would not allow them to be used to directly create a continental model of geoid change, but they could stand as in-situ validations of models of geoid change coming from, say, a model of mantle and glacial dynamics.
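
    The check such a line enables is a direct differencing of observed and modeled geoid slopes. A minimal sketch, with entirely invented benchmark heights, of geoid heights from GPS ellipsoidal minus leveled orthometric heights (N = h - H), differenced along the line and compared with a model's slope:

        # Compare observed vs. modeled geoid slope along a validation line.
        import numpy as np

        km = np.array([0.0, 50.0, 100.0, 150.0])                 # distance along line
        h_gps = np.array([812.341, 835.120, 850.977, 842.503])   # ellipsoidal heights, m
        H_level = np.array([838.520, 861.410, 877.350, 868.940]) # orthometric heights, m
        N_model = np.array([-26.15, -26.27, -26.35, -26.41])     # model geoid heights, m

        N_obs = h_gps - H_level                       # observed geoid heights
        slope_obs = np.diff(N_obs) / np.diff(km)      # m per km between benchmarks
        slope_mod = np.diff(N_model) / np.diff(km)
        print(np.round((slope_obs - slope_mod) * 1000, 2), "mm/km residuals")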

  17. Validity and Reliability Determination of Denver Developmental Screening Test-II in 0-6 Year–Olds in Tehran

    PubMed Central

    Shahshahani, Soheila; Vameghi, Roshanak; Azari, Nadia; Sajedi, Firoozeh; Kazemnejad, Anooshirvan

    2010-01-01

    Objective This research was designed to identify the validity and reliability of the Persian version of Denver Developmental Screening Test II (DDST-II) in Iranian children, in order to provide an appropriate developmental screening tool for Iranian child health workers. Methods At first a precise translation of the test was done by three specialists in English literature and then it was revised by three pediatricians familiar with developmental domains. Then, DDST-II was performed on 221 children ranging from 0 to 6 years, in four Child Health Clinics, in north, south, east and west regions of Tehran city. In order to determine the agreement coefficient, these children were also evaluated by the ASQ test. Because the ASQ is designed for use with 4–60-month-old children, children who were outside this range were evaluated by developmental pediatricians. Available sampling was used. The obtained data were analyzed with SPSS software. Findings Developmental disorders were observed in 34% of children who were examined by DDST-II, and in 12% of children who were examined by the ASQ test. The estimated consistency coefficient between DDST-II and ASQ was 0.21, which is weak, and between DDST-II and the physicians’ examination was 0.44. The content validity of DDST-II was verified by reviewing books and journals, and by specialists’ opinions. All of the questions in DDST-II had appropriate content validity, and there was no need to change them. Test-retest and inter-rater methods were used to determine the reliability of the test, via Cronbach's α and Kuder-Richardson coefficients. The Kuder-Richardson coefficient for different developmental domains was between 61% and 74%, which is good. Cronbach's α coefficient and the kappa measure of agreement were 92% and 87% for test-retest, and 90% and 76% for inter-rater, respectively. Conclusion This research showed that the Persian version of DDST-II has good validity and reliability, and can be used as a screening tool for developmental screening of children in Tehran city. PMID:23056723
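
    The agreement coefficient reported above can be computed mechanically. A minimal sketch of Cohen's kappa for a 2x2 screening table follows; the counts are hypothetical, not the study's data:

        # Cohen's kappa for agreement between two screening tests.
        import numpy as np

        def cohens_kappa(table):
            """table[i][j] = number of children rated i by test 1 and j by test 2."""
            t = np.asarray(table, dtype=float)
            n = t.sum()
            p_observed = np.trace(t) / n
            p_chance = (t.sum(axis=0) * t.sum(axis=1)).sum() / n**2
            return (p_observed - p_chance) / (1.0 - p_chance)

        # rows: DDST-II (normal, suspect); columns: ASQ (normal, suspect)
        print(round(cohens_kappa([[140, 55], [11, 15]]), 2))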

  18. A model for the separation of cloud and aerosol in SAGE II occultation data

    NASA Technical Reports Server (NTRS)

    Kent, G. S.; Winker, D. M.; Osborn, M. T.; Skeens, K. M.

    1993-01-01

    The Stratospheric Aerosol and Gas Experiment (SAGE) II satellite experiment measures the extinction due to aerosols and thin cloud, at wavelengths of 0.525 and 1.02 micrometers, down to an altitude of 6 km. The wavelength dependence of the extinction due to aerosols differs from that of the extinction due to cloud and is used as the basis of a model for separating these two components. The model is presented and its validation using airborne lidar data, obtained coincident with SAGE II observations, is described. This comparison shows that smaller SAGE II cloud extinction values correspond to the presence of subvisible cirrus cloud in the lidar record. Examples of aerosol and cloud data products obtained using this model to interpret SAGE II upper tropospheric and lower stratospheric data are also shown.
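
    The two-wavelength idea lends itself to a short illustration: aerosol extinction falls off strongly from 0.525 to 1.02 micrometers, while cloud extinction is nearly wavelength-neutral, so the extinction ratio can flag cloud. A minimal sketch, with an assumed illustrative threshold rather than the SAGE II model's actual criterion:

        # Classify occultation samples as cloud or aerosol from the extinction ratio.
        import numpy as np

        def classify(ext_525, ext_1020, ratio_threshold=2.0):
            """Return 'cloud' where the 0.525/1.02 um extinction ratio is near 1,
            'aerosol' where the ratio is large (strong wavelength dependence)."""
            ratio = np.asarray(ext_525) / np.asarray(ext_1020)
            return np.where(ratio < ratio_threshold, "cloud", "aerosol")

        print(classify([4.0e-4, 2.4e-3], [3.5e-4, 6.0e-4]))  # -> ['cloud' 'aerosol']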

  19. Rapid target gene validation in complex cancer mouse models using re-derived embryonic stem cells

    PubMed Central

    Huijbers, Ivo J; Bin Ali, Rahmen; Pritchard, Colin; Cozijnsen, Miranda; Kwon, Min-Chul; Proost, Natalie; Song, Ji-Ying; Vries, Hilda; Badhai, Jitendra; Sutherland, Kate; Krimpenfort, Paul; Michalak, Ewa M; Jonkers, Jos; Berns, Anton

    2014-01-01

    Human cancers modeled in Genetically Engineered Mouse Models (GEMMs) can provide important mechanistic insights into the molecular basis of tumor development and enable testing of new intervention strategies. The inherent complexity of these models, with often multiple modified tumor suppressor genes and oncogenes, has hampered their use as preclinical models for validating cancer genes and drug targets. In our newly developed approach for the fast generation of tumor cohorts we have overcome this obstacle, as exemplified for three GEMMs; two lung cancer models and one mesothelioma model. Three elements are central for this system; (i) The efficient derivation of authentic Embryonic Stem Cells (ESCs) from established GEMMs, (ii) the routine introduction of transgenes of choice in these GEMM-ESCs by Flp recombinase-mediated integration and (iii) the direct use of the chimeric animals in tumor cohorts. By applying stringent quality controls, the GEMM-ESC approach proofs to be a reliable and effective method to speed up cancer gene assessment and target validation. As proof-of-principle, we demonstrate that MycL1 is a key driver gene in Small Cell Lung Cancer. PMID:24401838

  20. Coupled Thermal-Chemical-Mechanical Modeling of Validation Cookoff Experiments

    SciTech Connect

    ERIKSON,WILLIAM W.; SCHMITT,ROBERT G.; ATWOOD,A.I.; CURRAN,P.D.

    2000-11-27

    The cookoff of energetic materials involves the combined effects of several physical and chemical processes. These processes include heat transfer, chemical decomposition, and mechanical response. The interaction and coupling between these processes influence both the time-to-event and the violence of reaction. The prediction of the behavior of explosives during cookoff, particularly with respect to reaction violence, is a challenging task. To this end, a joint DoD/DOE program has been initiated to develop models for cookoff, and to perform experiments to validate those models. In this paper, a series of cookoff analyses are presented and compared with data from a number of experiments for the aluminized, RDX-based, Navy explosive PBXN-109. The traditional thermal-chemical analysis is used to calculate time-to-event and characterize the heat transfer and boundary conditions. A reaction mechanism based on Tarver and McGuire's work on RDX [2] was adjusted to match the spherical one-dimensional time-to-explosion data. The predicted time-to-event using this reaction mechanism compares favorably with the validation tests. Coupled thermal-chemical-mechanical analysis is used to calculate the mechanical response of the confinement and the energetic material state prior to ignition. The predicted state of the material includes the temperature, stress-field, porosity, and extent of reaction. There is little experimental data for comparison to these calculations. The hoop strain in the confining steel tube gives an estimation of the radial stress in the explosive. The inferred pressure from the measured hoop strain and calculated radial stress agree qualitatively. However, validation of the mechanical response model and the chemical reaction mechanism requires more data. A post-ignition burn dynamics model was applied to calculate the confinement dynamics. The burn dynamics calculations suffer from a lack of characterization of the confinement for the flaw-dominated failure mode experienced in the tests. High-pressure burning rates are needed for more detailed post-ignition studies. Sub-models for chemistry, mechanical response and burn dynamics need to be validated against data from less complex experiments. The sub-models can then be used in integrated analysis for comparison with experimental data taken during integrated tests.
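
    The thermal-chemical time-to-event calculation rests on Arrhenius self-heating. A minimal sketch, assuming a single-step adiabatic heat balance with illustrative kinetic parameters (not the PBXN-109 mechanism), of an induction-time estimate:

        # Adiabatic thermal-runaway induction time from Arrhenius kinetics.
        import numpy as np

        R = 8.314                      # J/(mol K)
        E, Z = 1.9e5, 1.0e15           # activation energy (J/mol), prefactor (1/s)
        q, c = 2.0e6, 1.1e3            # heat of reaction (J/kg), specific heat (J/(kg K))

        def adiabatic_time_to_event(T0, dt=0.05, T_runaway=800.0, t_max=1.0e5):
            """Integrate dT/dt = (q/c) * Z * exp(-E/RT) until thermal runaway."""
            T, t = T0, 0.0
            while T < T_runaway and t < t_max:
                T += dt * (q / c) * Z * np.exp(-E / (R * T))
                t += dt
            return t if T >= T_runaway else None

        print(adiabatic_time_to_event(500.0))   # seconds to runaway from a 500 K soak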

  1. Validation of High Displacement Piezoelectric Actuator Finite Element Models

    NASA Technical Reports Server (NTRS)

    Taleghani, B. K.

    2000-01-01

    The paper presents the results obtained by using NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) finite element codes to predict doming of the THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN(Registered Trademark) and ANSYS(Registered Trademark) used different methods for modeling piezoelectric effects. In NASTRAN(Registered Trademark), a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS(Registered Trademark) processed the voltage directly using piezoelectric finite elements. The results of finite element models were validated by using the experimental results.
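
    The thermal analogy is simple to state: the free piezoelectric strain d31*V/t has the same form as thermal strain alpha*dT, so choosing alpha = d31/t lets an applied voltage be entered as an equivalent nodal temperature. A minimal sketch with illustrative values (not the THUNDER material properties):

        # Voltage-as-temperature analogy for piezoelectric strain.
        d31 = -1.9e-10        # piezoelectric strain coefficient, m/V (illustrative)
        t = 2.5e-4            # piezoceramic layer thickness, m (illustrative)

        alpha_equiv = d31 / t          # equivalent thermal expansion per "degree"
        voltage = 200.0                # applied volts, entered as dT = 200
        free_strain = alpha_equiv * voltage   # equals d31 * (V / t)
        print(f"alpha_equiv = {alpha_equiv:.3e} 1/V, free strain = {free_strain:.3e}")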

  2. Validation of coupled atmosphere-fire behavior models

    SciTech Connect

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.; Winterkamp, J.L. [Los Alamos National Lab., NM (United States); Schaub, R. [Dynamac Corp., Kennedy Space Center, FL (United States); Riggan, P.J. [Forest Service, Riverside, CA (United States)

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real-world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  3. USER'S MANUAL FOR THE PLUME VISIBILITY MODEL (PLUVUE II)

    EPA Science Inventory

    This publication contains information about the computer programs for the Plume Visibility Model PLUVUE II. A technical overview of PLUVUE II and the results of model evaluation studies are presented. The source code of PLUVUE II, as well as two sets of input and output data, is ...

  4. Deviatoric constitutive model: domain of strain rate validity

    SciTech Connect

    Zocher, Marvin A [Los Alamos National Laboratory

    2009-01-01

    A case is made for using an enhanced methodology in determining the parameters that appear in a deviatoric constitutive model. Predictability rests on our ability to solve a properly posed initial boundary value problem (IBVP), which incorporates an accurate reflection of material constitutive behavior. That reflection is provided through the constitutive model. Moreover, the constitutive model is required for mathematical closure of the IBVP. Common practice in the shock physics community is to divide the Cauchy tensor into spherical and deviatoric parts, and to develop separate models for spherical and deviatoric constitutive response. Our focus shall be on the Cauchy deviator and deviatoric constitutive behavior. Discussions related to the spherical part of the Cauchy tensor are reserved for another time. A number of deviatoric constitutive models have been developed for utilization in the solution of IBVPs that are of interest to those working in the field of shock physics. All of these models are phenomenological and contain a number of parameters that must be determined in light of experimental data. The methodology employed in determining these parameters dictates the loading regime over which the model can be expected to be accurate. The focus of this paper is the methodology employed in determining model parameters and the consequences of that methodology as it relates to the domain of strain rate validity. We shall begin by describing the methodology that is typically employed. We shall discuss limitations imposed upon predictive capability by the typically employed methodology. We shall propose a modification to the typically employed methodology that significantly extends the domain of strain rate validity.
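
    The spherical/deviatoric split referred to above is a one-line tensor operation. A minimal sketch:

        # Split a 3x3 Cauchy stress tensor into spherical and deviatoric parts:
        # sigma = mean_stress * I + s, with tr(s) = 0 by construction.
        import numpy as np

        def spherical_deviatoric(sigma):
            sigma = np.asarray(sigma, dtype=float)
            mean_stress = np.trace(sigma) / 3.0
            spherical = mean_stress * np.eye(3)
            deviator = sigma - spherical
            return spherical, deviator

        sigma = np.array([[100., 30., 0.],
                          [ 30., 80., 0.],
                          [  0.,  0., 60.]])
        sph, dev = spherical_deviatoric(sigma)
        print(np.trace(dev))   # ~0, confirming the deviator is trace-free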

  5. Volumetric Intraoperative Brain Deformation Compensation: Model Development and Phantom Validation

    PubMed Central

    DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.

    2012-01-01

    During neurosurgery, nonrigid brain deformation may affect the reliability of tissue localization based on preoperative images. To provide accurate surgical guidance in these cases, preoperative images must be updated to reflect the intraoperative brain. This can be accomplished by warping these preoperative images using a biomechanical model. Due to the possible complexity of this deformation, intraoperative information is often required to guide the model solution. In this paper, a linear elastic model of the brain is developed to infer volumetric brain deformation associated with measured intraoperative cortical surface displacement. The developed model relies on known material properties of brain tissue, and does not require further knowledge about intraoperative conditions. To provide an initial estimation of volumetric model accuracy, as well as determine the model’s sensitivity to the specified material parameters and surface displacements, a realistic brain phantom was developed. Phantom results indicate that the linear elastic model significantly reduced localization error due to brain shift, from >16 mm to under 5 mm, on average. In addition, though in vivo quantitative validation is necessary, preliminary application of this approach to images acquired during neocortical epilepsy cases confirms the feasibility of applying the developed model to in vivo data. PMID:22562728

  6. Modeling TCP Throughput: A Simple Model and Its Empirical Validation

    Microsoft Academic Search

    Jitendra Padhye; Victor Firoiu; Donald F. Towsley; James F. Kurose

    1998-01-01

    In this paper we develop a simple analytic characterization of the steady state throughput, as a function of loss rate and round trip time for a bulk transfer TCP flow, i.e., a flow with an unlimited amount of data to send. Unlike the models in [6, 7, 10], our model captures not only the behavior of TCP's fast retransmit mechanism
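
    Although the abstract is truncated here, the closed-form result of this paper (the "PFTK" model) is widely cited. A sketch of the steady-state throughput formula as commonly stated, giving packets per second from loss rate p, round-trip time, retransmission timeout, and the number b of packets acknowledged per ACK:

        # PFTK steady-state TCP throughput approximation.
        from math import sqrt

        def tcp_throughput(p, rtt, t0, b=2, w_max=None):
            """Approximate bulk-transfer TCP send rate in packets/second."""
            if p <= 0:
                raise ValueError("model requires a nonzero loss rate")
            denom = (rtt * sqrt(2 * b * p / 3)
                     + t0 * min(1, 3 * sqrt(3 * b * p / 8)) * p * (1 + 32 * p * p))
            rate = 1.0 / denom
            if w_max is not None:
                rate = min(rate, w_max / rtt)   # receiver-window limit
            return rate

        print(round(tcp_throughput(p=0.01, rtt=0.1, t0=0.3), 1), "pkts/s")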

  7. Parallel Measurement and Modeling of Transport in the Darht II Beamline on ETA II

    Microsoft Academic Search

    F. W. Chambers; B. A. Raymond; S. Falabella; B. S. Lee; R. A. Richardson; J. T. Weir; H. A. Davis; M. E. Schultze

    2005-01-01

    To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data

  8. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology

    PubMed Central

    Krishnamoorthi, Shankarjee; Perotti, Luigi E.; Borgstrom, Nils P.; Ajijola, Olujimi A.; Frid, Anna; Ponnaluri, Aditya V.; Weiss, James N.; Qu, Zhilin; Klug, William S.; Ennis, Daniel B.; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations. PMID:25493967
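
    For reference, the reaction-diffusion (monodomain) form commonly solved in such finite element frameworks, stated here in its standard textbook form rather than quoted from this paper, is

        \chi \left( C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m, \mathbf{w}) \right) = \nabla \cdot \left( \mathbf{D} \, \nabla V_m \right)

    where \chi is the membrane surface-to-volume ratio, C_m the membrane capacitance per unit area, I_ion the ionic current supplied by the myocyte model with state variables \mathbf{w}, and \mathbf{D} the conductivity tensor aligned with the imaged fiber directions.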

  9. Modeling and Validation of Damped Plexiglas Windows for Noise Control

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Gibbs, Gary P.; Klos, Jacob; Mazur, Marina

    2003-01-01

    Windows are a significant path for structure-borne and air-borne noise transmission in general aviation aircraft. In this paper, numerical and experimental results are used to evaluate damped plexiglas windows for the reduction of structure-borne and air-borne noise transmitted into the interior of an aircraft. In contrast to conventional homogeneous windows, the damped plexiglas windows were fabricated using two or three layers of plexiglas with transparent viscoelastic damping material sandwiched between the layers. Transmission loss and radiated sound power measurements were used to compare different layups of the damped plexiglas windows with uniform windows of the same nominal thickness. This vibro-acoustic test data was also used for the verification and validation of finite element and boundary element models of the damped plexiglas windows. Numerical models are presented for the prediction of radiated sound power for a point force excitation and transmission loss for diffuse acoustic excitation. Radiated sound power and transmission loss predictions are in good agreement with experimental data. Once validated, the numerical models were used to perform a parametric study to determine the optimum configuration of the damped plexiglas windows for reducing the radiated sound power for a point force excitation.

  10. Experimental Validation of a Pulse Tube Cfd Model

    NASA Astrophysics Data System (ADS)

    Taylor, R. P.; Nellis, G. F.; Klein, S. A.; Radebaugh, R.; Lewis, M.; Bradley, P.

    2010-04-01

    Computational fluid dynamic (CFD) analysis has been applied by various authors to study the processes occurring in the pulse tube cryocooler and carry out parametric design and optimization. However, a thorough and quantitative validation of the CFD model predictions against experimental data has not been accomplished. This is in part due to the difficulty associated with measuring the specific quantities of interest (e.g., internal enthalpy flows and acoustic power) rather than generic system performance (e.g., cooling power). This paper presents the experimental validation of a previously published two-dimensional, axisymmetric CFD model of the pulse tube and its associated flow transitions. The test facility designed for this purpose is unique in that it allows the precise measurement of the cold end acoustic power, regenerator loss, and cooling power. Therefore, it allows the separate and precise measurement of both the pulse tube loss and the regenerator loss. The experimental results are presented for various pulse tube and flow transition configurations operating at a cold end temperature of 80 K over a range of pressure ratios. The comparison of the model prediction to the experimental data is presented with discussion.

  11. On the Validity of the Independent Hot-Spot Model

    SciTech Connect

    Mounaix, Philippe

    2001-08-20

    The results of the independent hot-spot (IHS) model are compared to those of the underlying stochastic amplifier in the regime where the coupling of the amplifier is close to its critical value. The considered case is that of a 1D linear amplifier with at most one hot spot per interaction length. It is shown that the validity of the critical coupling given by the IHS model depends on the correlation function of the pump field and should be discussed in each particular case. The discrepancy between the IHS model and the underlying amplifier is shown to be due to the random fluctuations of the hot-spot field around its dominant, deterministic, component.

  12. Multiscale Modeling of Gastrointestinal Electrophysiology and Experimental Validation

    PubMed Central

    Du, Peng; O'Grady, Greg; Davidson, John B.; Cheng, Leo K.; Pullan, Andrew J.

    2011-01-01

    Normal gastrointestinal (GI) motility results from the coordinated interplay of multiple cooperating mechanisms, both intrinsic and extrinsic to the GI tract. A fundamental component of this activity is an omnipresent electrical activity termed slow waves, which is generated and propagated by the interstitial cells of Cajal (ICCs). The role of ICC loss and network degradation in GI motility disorders is a significant area of ongoing research. This review examines recent progress in the multiscale modeling framework for effectively integrating a vast range of experimental data in GI electrophysiology, and outlines the prospect of how modeling can provide new insights into GI function in health and disease. The review begins with an overview of the GI tract and its electrophysiology, and then focuses on recent work on modeling GI electrical activity, spanning from cell to body biophysical scales. Mathematical cell models of the ICCs and smooth muscle cell are presented. The continuum framework of monodomain and bidomain models for tissue and organ models are then considered, and the forward techniques used to model the resultant body surface potential and magnetic field are discussed. The review then outlines recent progress in experimental support and validation of modeling, and concludes with a discussion on potential future research directions in this field. PMID:21133835

  13. Functional Validation of AADL Models via Model Transformation to SystemC with ATL

    E-print Network

    Paris-Sud XI, Université de

    factors in the choice of the best architecture. Moreover, it is methodologically efficient to tie both. In this paper, we put into action an ATL model transformation in order to automatically generate SystemC models

  14. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  15. Exploring Gravity and Gravitational Wave Dynamics Part II: Gravity Models

    NASA Astrophysics Data System (ADS)

    Murad, P. A.

    2007-01-01

    A new gravity model may be needed to explain anomalous behavior exhibited by several recent experiments described in Part I. Although Newtonian gravity is adequate for predicting the motion of celestial bodies, these bodies move at slow speeds compared to relativistic conditions. Moreover, anomalous behavior as well as the existence of gravitational waves limits and invalidates the use of Newtonian gravity. During prior STAIF Conferences, the author proposed a theory based upon gravitational anomalies that would use a universal gravitation model with a radial force term coupled with angular momentum, extending the work of Jefimenko. This also extended the previous work of Murad and Baker, and of Dyatlov, who explains angular momentum effects as consequences of a `spin' field. Angular momentum may explain various spin asymmetries allowing the transfer of gravitational radiation directly into angular momentum observed in some anomalous gyroscope experiments, some possible work by the Germans during WW II, and recent experiments performed by the Russians to replicate the Searl Device, where they record a sizable weight reduction. It is feasible that Jefimenko's cogravity field may represent the elusive `spin' or `torsion' field. In these experiments, results heavily depend upon rotation rate and direction. A new model is proposed without the constraints used by Jefimenko, and the data from these experiments are used to partially validate this newer model as well as to define gravitational currents as the differences that exist between the Newtonian model and this newer theory. Finally, if true, these new effects can have a revolutionary impact upon theoretical physics and Astronautics.

  16. Embedded effort indicators on the California Verbal Learning Test - Second Edition (CVLT-II): an attempted cross-validation.

    PubMed

    Donders, Jacobus; Strong, Carrie-Ann H

    2011-01-01

    This study determined whether the logistic regression method that was recently developed by Wolfe and colleagues (2010) for the detection of invalid effort on the California Verbal Learning Test - Second Edition (CVLT-II) could be cross-validated in an independent sample of 100 consecutively referred patients with traumatic brain injury. Although the CVLT-II logistic regression formula demonstrated a statistically significant level of agreement with results from the Word Memory Test, it was associated with an unacceptably high proportion of false positives. The component variables of the logistic regression were sensitive to length of coma but did not covary with psychosocial complicating factors (e.g., unresolved prior psychiatric history) that were associated with a higher relative risk of failure of WMT validity criteria. It is concluded that the Wolfe et al. logistic regression should be used only with great caution in the context of clinical neuropsychological evaluations. PMID:21181604
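
    The mechanics of such an embedded-validity indicator are straightforward. A minimal sketch, with hypothetical coefficients and predictor scores (the Wolfe et al. weights are not reproduced here), of how a logistic regression converts CVLT-II component scores into a probability of invalid effort:

        # Logistic regression score for an embedded effort indicator (illustrative).
        from math import exp

        def invalid_effort_probability(scores, coefs, intercept):
            """p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
            z = intercept + sum(b * x for b, x in zip(coefs, scores))
            return 1.0 / (1.0 + exp(-z))

        # Hypothetical example: two embedded indicators (e.g., forced-choice
        # recognition and recall discriminability); weights are invented.
        p = invalid_effort_probability(scores=[14.0, 1.8],
                                       coefs=[-0.35, -0.9], intercept=8.0)
        print(f"P(invalid effort) = {p:.2f}")   # compare against a chosen cutoff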

  17. Organic acid modeling and model validation: Workshop summary

    SciTech Connect

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled ``Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources.'' The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  18. Organic acid modeling and model validation: Workshop summary. Final report

    SciTech Connect

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9--10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled ``Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources.'' The workshop was attended by a team of internationally-recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically-selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  19. Modified WHODAS-II provides valid measure of global disability but filter items increased skewness

    Microsoft Academic Search

    Michael Von Korff; Paul K. Crane; Jordi Alonso; Gemma Vilagut; Matthias C. Angermeyer; Ronny Bruffaerts; Giovanni de Girolamo; Oye Gureje; Ron de Graaf; Yueqin Huang; Noboru Iwata; Elie G. Karam; Viviane Kovess; Carmen Lara; Daphna Levinson; José Posada-Villa; Kate M. Scott; Johan Ormel

    2008-01-01

    Objective: The WHODAS-II was substantially modified for use in the World Mental Health Surveys. This article considers psychometric properties and implications of filter items used to reduce the respondent burden of the modified WHODAS-II.

  20. Evaluation and cross-validation of Environmental Models

    NASA Astrophysics Data System (ADS)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a Commission of professional experts appointed by an established International Union or Association (e.g. IAGA for Geomagnetism and Aeronomy, . . . ) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given, indicating that different values for the Earth radius have been employed in different data processing laboratories, institutes or agencies, to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to the other, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more of them to be uncovered by careful, independent examination and benchmarking. A meter prototype, the standard unit of length, was adopted on 20 May 1875 during the Diplomatic Conference of the Meter, and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent wild, uncontrolled dissemination of pseudo Environmental Models and Standards. Indeed, unless validation tests have been performed, there is no guaranty, a priori, that all models on the market place have been built consistently with the same units system, and that they are based on identical definitions for the coordinate systems, etc. Therefore, preliminary analyses should be carried out under the control and authority of an established international professional Organization or Association, before any final political decision is made by ISO to select a specific Environmental Model, like for example IGRF and DGRF. Of course, Commissions responsible for checking the consistency of definitions, methods and algorithms for data processing might consider delegating specific tasks (e.g. bench-marking the technical tools, the calibration procedures, the methods of data analysis, and the software algorithms employed in building the different types of models, as well as their usage) to private, intergovernmental or international organizations/agencies (e.g.: NASA, ESA, AGU, EGU, COSPAR, . . . ); eventually, the latter should report conclusions to the Commission members appointed by IAGA or any established authority like IUGG.

  1. Analytical Validation of the PRO-Trac II ELISA for the Determination of Tacrolimus (FK506) in Whole Blood

    Microsoft Academic Search

    Gordon D. MacFarlane; Daniel G. Scheller; Diana L. Ersfeld; Leslie M. Shaw; Raman Venkatarmanan; Laszlo Sarkozi; Richard Mullins; Bonnie R. Fox

    Background: The analytical validation of multiple lots of the PRO-Trac II ELISA (DiaSorin) for the determination of tacrolimus in whole blood is described. Methods: The analytical parameters assessed included analytical sensitivity, dilution linearity, functional sensitivity, values in samples containing no tacrolimus, intra- and interassay precision, supplementation and recovery, metabolite cross-reactivity, interference studies, and method comparisons with HPLC-tandem mass spectrometry

  2. Validating a spatially distributed hydrological model with soil morphology data

    NASA Astrophysics Data System (ADS)

    Doppler, T.; Honti, M.; Zihlmann, U.; Weisskopf, P.; Stamm, C.

    2014-09-01

    Spatially distributed models are popular tools in hydrology claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation is often carried out only on discharge time series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for inputs of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km2 catchment in Switzerland with moderate topography and artificial drainage. We translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons. This resulted in a data set with high spatial coverage on which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to observed groundwater levels and discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the groundwater level predictions were not accurate enough to be used for the prediction of saturated areas. Groundwater level dynamics were not adequately reproduced and the predicted spatial saturation patterns did not correspond to those estimated from the soil map. Our results indicate that an accurate prediction of the groundwater level dynamics of the shallow groundwater in our catchment that is subject to artificial drainage would require a model that better represents processes at the boundary between the unsaturated and the saturated zone. However, data needed for such a more detailed model are not generally available. This severely hampers the practical use of such models despite their usefulness for scientific purposes.

  3. Ultrasonic transducers for cure monitoring: design, modelling and validation

    NASA Astrophysics Data System (ADS)

    Lionetto, Francesca; Montagna, Francesco; Maffezzoli, Alfonso

    2011-12-01

    The finite element method (FEM) has been applied to simulate the ultrasonic wave propagation in a multilayered transducer, expressly designed for high-frequency dynamic mechanical analysis of polymers. The FEM model includes an electro-acoustic (active element) and some acoustic (passive elements) transmission lines. The simulation of the acoustic propagation accounts for the interaction between the piezoceramic and the materials in the buffer rod and backing, and the coupling between the electric and mechanical properties of the piezoelectric material. As a result of the simulations, the geometry and size of the modelled ultrasonic transducer have been optimized and used for the realization of a prototype transducer for cure monitoring. The transducer performance has been validated by measuring the velocity changes during the polymerization of a thermosetting matrix of composite materials.

  4. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    Summary: In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, the model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has widely been adopted in Chemometrics and Econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
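
    The contrast between the two splitting schemes is easy to demonstrate. A minimal sketch using scikit-learn splitters on synthetic regression data; ordinary least squares stands in for GLSR, and the data are invented stand-ins for regional flood records:

        # Leave-one-out vs. Monte Carlo cross validation on a toy regression.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 3))                  # e.g., catchment descriptors
        y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.5, size=60)

        model = LinearRegression()
        loo = cross_val_score(model, X, y, cv=LeaveOneOut(),
                              scoring="neg_mean_squared_error")
        # MCCV: many random 70/30 train/test partitions
        mccv = cross_val_score(model, X, y,
                               cv=ShuffleSplit(n_splits=200, test_size=0.3,
                                               random_state=1),
                               scoring="neg_mean_squared_error")
        print(f"LOO MSE  = {-loo.mean():.3f}")
        print(f"MCCV MSE = {-mccv.mean():.3f}")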

  5. A validated predictive model of coronary fractional flow reserve.

    PubMed

    Huo, Yunlong; Svendsen, Mark; Choy, Jenny Susana; Zhang, Z-D; Kassab, Ghassan S

    2012-06-01

    Myocardial fractional flow reserve (FFR), an important index of coronary stenosis, is measured by a pressure sensor guidewire. The determination of FFR, only based on the dimensions (lumen diameters and length) of stenosis and hyperaemic coronary flow with no other ad hoc parameters, is currently not possible. We propose an analytical model derived from conservation of energy, which considers various energy losses along the length of a stenosis, i.e. convective and diffusive energy losses as well as energy loss due to sudden constriction and expansion in lumen area. In vitro (constrictions were created in isolated arteries using symmetric and asymmetric tubes as well as an inflatable occluder cuff) and in vivo (constrictions were induced in coronary arteries of eight swine by an occluder cuff) experiments were used to validate the proposed analytical model. The proposed model agreed well with the experimental measurements. A least-squares fit showed a linear relation (Δp or FFR)_experiment = a (Δp or FFR)_theory + b, where a and b were 1.08 and -1.15 mmHg (r^2 = 0.99) for in vitro Δp, 0.96 and 1.79 mmHg (r^2 = 0.75) for in vivo Δp, and 0.85 and 0.1 (r^2 = 0.7) for FFR. Flow pulsatility and stenosis shape (e.g. eccentricity, exit angle divergence, etc.) had a negligible effect on myocardial FFR, while the entrance effect in a coronary stenosis was found to contribute significantly to the pressure drop. We present a physics-based experimentally validated analytical model of coronary stenosis, which allows prediction of FFR based on stenosis dimensions and hyperaemic coronary flow with no empirical parameters. PMID:22112650
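
    The energy-loss structure of such a model invites a compact illustration. A minimal sketch, assuming a generic quadratic pressure-loss relation (a Poiseuille viscous term plus a Borda-Carnot-style expansion term) rather than the authors' exact loss terms, from which FFR = (Pa - dp) / Pa:

        # Generic stenosis pressure drop dp = a*Q + b*Q^2 and the resulting FFR.
        from math import pi

        mu, rho = 3.5e-3, 1060.0      # blood viscosity (Pa s) and density (kg/m^3)

        def stenosis_ffr(q, d_normal, d_stenosis, length, p_aortic=13300.0):
            """q in m^3/s, diameters and length in m, pressures in Pa."""
            a_n = pi * d_normal**2 / 4
            a_s = pi * d_stenosis**2 / 4
            viscous = 128 * mu * length / (pi * d_stenosis**4)    # Poiseuille term
            expansion = (rho / 2) * (1 / a_s - 1 / a_n) ** 2      # Borda-Carnot term
            dp = viscous * q + expansion * q**2
            return (p_aortic - dp) / p_aortic

        # 50% diameter stenosis, 1 cm long, hyperaemic flow of 3 ml/s (illustrative)
        print(round(stenosis_ffr(q=3e-6, d_normal=3e-3,
                                 d_stenosis=1.5e-3, length=0.01), 2))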

  6. Predictive validity of behavioural animal models for chronic pain

    PubMed Central

    Berge, Odd-Geir

    2011-01-01

    Rodent models of chronic pain may elucidate pathophysiological mechanisms and identify potential drug targets, but whether they predict clinical efficacy of novel compounds is controversial. Several potential analgesics have failed in clinical trials, in spite of strong animal modelling support for efficacy, but there are also examples of successful modelling. Significant differences in how methods are implemented and results are reported mean that a literature-based comparison between preclinical data and clinical trials will not reveal whether a particular model is generally predictive. Limited reporting of negative outcomes prevents a reliable estimate of the specificity of any model. Animal models tend to be validated with standard analgesics and may be biased towards tractable pain mechanisms. But preclinical publications rarely contain drug exposure data, and drugs are usually given in high doses and as a single administration, which may lead to drug distribution and exposure deviating significantly from clinical conditions. The greatest challenge for predictive modelling is, however, the heterogeneity of the target patient populations, in terms of both symptoms and pharmacology, probably reflecting differences in pathophysiology. In well-controlled clinical trials, a majority of patients show less than 50% reduction in pain. A model that responds well to current analgesics should therefore predict efficacy only in a subset of patients within a diagnostic group. It follows that successful translation requires several models for each indication, reflecting critical pathophysiological processes, combined with data linking exposure levels with effect on target. LINKED ARTICLES: This article is part of a themed issue on Translational Neuropharmacology. To view the other articles in this issue visit http://dx.doi.org/10.1111/bph.2011.164.issue-4 PMID:21371010

  7. Literature-derived bioaccumulation models for earthworms: Development and validation

    SciTech Connect

    Sample, B.E.; Suter, G.W. II; Beauchamp, J.J.; Efroymson, R.A.

    1999-09-01

    Estimation of contaminant concentrations in earthworms is a critical component in many ecological risk assessments. Without site-specific data, literature-derived uptake factors or models are frequently used. Although considerable research has been conducted on contaminant transfer from soil to earthworms, most studies focus on only a single location. External validation of transfer models has not been performed. The authors developed a database of soil and tissue concentrations for nine inorganic and two organic chemicals. Only studies that presented total concentrations in depurated earthworms were included. Uptake factors and simple and multiple regression models of natural-log-transformed concentrations of each analyte in soil and earthworms were developed using data from 26 studies. These models were then applied to data from six additional studies. Estimated and observed earthworm concentrations were compared using nonparametric Wilcoxon signed-rank tests. Relative accuracy and quality of different estimation methods were evaluated by calculating the proportional deviation of the estimate from the measured value. With the exception of Cr, significant, single-variable (e.g., soil concentration) regression models were fit for each analyte. Inclusion of soil Ca improved model fits for Cd and Pb. Soil pH only marginally improved model fits. The best general estimates of chemical concentrations in earthworms were generated by simple ln-ln regression models for As, Cd, Cu, Hg, Mn, Pb, Zn, and polychlorinated biphenyls. No method accurately estimated Cr or Ni in earthworms. Although multiple regression models including pH generated better estimates for a few analytes, in general, the predictive utility gained by incorporating environmental variables was marginal.
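
    The simple ln-ln regression form named above, ln(C_worm) = b0 + b1 * ln(C_soil), is quick to fit. A minimal sketch on synthetic data (the published coefficients are not reproduced):

        # Fit and apply a ln-ln soil-to-earthworm uptake regression.
        import numpy as np

        rng = np.random.default_rng(7)
        c_soil = rng.uniform(1.0, 200.0, size=30)     # e.g., mg/kg dry soil
        c_worm = np.exp(0.5 + 0.8 * np.log(c_soil) + rng.normal(0, 0.3, 30))

        b1, b0 = np.polyfit(np.log(c_soil), np.log(c_worm), 1)  # slope, intercept
        predict = lambda soil: np.exp(b0 + b1 * np.log(soil))
        print(f"b0={b0:.2f}, b1={b1:.2f}; C_worm at 50 mg/kg = {predict(50):.1f}")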

  8. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  9. USING GRAPHICAL MODELS AND GENOMIC EXPRESSION DATA TO STATISTICALLY VALIDATE MODELS OF

    E-print Network

    Hartemink, Alexander

    USING GRAPHICAL MODELS AND GENOMIC EXPRESSION DATA TO STATISTICALLY VALIDATE MODELS OF GENETIC REGULATORY NETWORKS. We propose a model-driven approach for analyzing genomic expression and their extensions. As a demonstration of this approach, we utilize 52 genomes worth of Affymetrix Gene

  11. Assessing uncertainty in pollutant wash-off modelling via model validation.

    PubMed

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

    2014-11-01

    Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size which hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies. PMID:25169872

  12. Canyon building ventilation system dynamic model -- Parameters and validation

    SciTech Connect

    Moncrief, B.R. (Westinghouse Savannah River Co., Aiken, SC (United States)); Chen, F.F.K. (Bechtel National, Inc., San Francisco, CA (United States))

    1993-01-01

    Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of "mathematical models." These component models are functionally integrated to represent the plant. With today's low-cost, high-capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, realizing an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

  13. [Validation of the ALMANAC model with different spatial scale].

    PubMed

    Xie, Yun; James, Kiniry; Liu, Baoyuan

    2003-08-01

    The ALMANAC model was validated in a drought-stressed year and in a period from 1989 to 1998 at different sites in Texas to evaluate its ability in simulating maize and sorghum yields at different spatial scales and to extend its application range. There were 11 sites for maize and 8 sites for sorghum in plot-size simulations, and 9 counties for maize and sorghum in county level simulations. The model showed similar accuracy in simulating both plot-size and county level mean grain yields. It could also simulate single-year yields under water-limited climatic conditions for several sites and mean county yields of maize and sorghum, and had small CV values of mean yields for a long-term prediction. The mean error was 8.9% for sorghum and 9.4% for maize in field scale simulations, and was only 2.6% for maize and -0.6% for sorghum in county level mean yield simulations. Crop models often require extensive input data sets to realistically simulate crop growth. The development of such input data sets is difficult for some model users. The soil, weather, and crop parameter data sets developed in this study could be used as guidelines for model applications in similar climatic regions and on similar soils. PMID:14655361

  14. Validation of two-equation turbulence models for propulsion flowfields

    NASA Technical Reports Server (NTRS)

    Deshpande, Manish; Venkateswaran, S.; Merkle, Charles L.

    1994-01-01

    The objective of the study is to assess the capability of two-equation turbulence models for simulating propulsion-related flowfields. The standard kappa-epsilon model with Chien's low Reynolds number formulation for near-wall effects is used as the baseline turbulence model. Several experimental test cases, representative of rocket combustor internal flowfields, are used to catalog the performance of the baseline model. Specific flowfields considered here include recirculating flow behind a backstep, mixing between coaxial jets and planar shear layers. Since turbulence solutions are notoriously dependent on grid and numerical methodology, the effects of grid refinement and artificial dissipation on numerical accuracy are studied. In the latter instance, computational results obtained with several central-differenced and upwind-based formulations are compared. Based on these results, improved turbulence models such as enhanced kappa-epsilon models as well as other two-equation formulations (e.g., kappa-omega) are being studied. In addition, validation for swirling and reacting flowfields is also currently underway.
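
    As context for the baseline model above, the k-epsilon closure computes an eddy viscosity from the turbulent kinetic energy k and its dissipation rate epsilon. A minimal sketch follows, assuming the standard constant C_mu = 0.09 and a near-wall damping factor of the form used in Chien's low-Reynolds-number variant; this is a generic illustration, not the study's flow solver.

    ```python
    import numpy as np

    C_MU = 0.09  # standard k-epsilon model constant

    def eddy_viscosity(rho, k, eps, y_plus=None):
        """Turbulent (eddy) viscosity mu_t = C_mu * f_mu * rho * k^2 / eps.

        If y_plus is given, a near-wall damping factor of the form used in
        Chien's low-Reynolds-number model, f_mu = 1 - exp(-0.0115 y+), is
        applied; otherwise the high-Reynolds-number form (f_mu = 1) is used.
        """
        f_mu = 1.0 if y_plus is None else 1.0 - np.exp(-0.0115 * y_plus)
        return C_MU * f_mu * rho * k**2 / np.maximum(eps, 1e-30)

    # Example: air-like density, k = 1 m^2/s^2, eps = 10 m^2/s^3.
    print(eddy_viscosity(1.2, 1.0, 10.0))
    ```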

  15. Systematic approach to verification and validation: High explosive burn models

    SciTech Connect

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implemented their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.
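
    The automated workflow described above (read an experiment file whose header carries key parameters as meta-data, generate a hydro-code input deck, run, and plot simulated against measured velocity-gauge data) might look roughly like the sketch below. The file layout, header keys, and the `run_hydro` command are hypothetical stand-ins, not the actual HED or LANL tooling.

    ```python
    import json
    import subprocess

    import matplotlib.pyplot as plt
    import numpy as np

    def load_experiment(path):
        """Read a hypothetical HED-style file: one JSON header line with
        experimental parameters, then columns of time and gauge velocities."""
        with open(path) as f:
            header = json.loads(f.readline())
            data = np.loadtxt(f)
        return header, data

    def run_and_compare(path, burn_model="model-A"):
        header, exp = load_experiment(path)
        # Generate an input deck for the hydro code from the header meta-data.
        deck = (f"he {header['explosive']}\n"
                f"burn_model {burn_model}\n"
                f"impact_pressure_gpa {header['impact_pressure_gpa']}\n")
        with open("run.in", "w") as f:
            f.write(deck)
        subprocess.run(["run_hydro", "run.in"], check=True)  # hypothetical CLI
        sim = np.loadtxt("gauges.out")                       # hypothetical output
        plt.plot(exp[:, 0], exp[:, 1:], "k.", label="experiment")
        plt.plot(sim[:, 0], sim[:, 1:], "r-", label=burn_model)
        plt.xlabel("time (us)")
        plt.ylabel("gauge velocity (km/s)")
        plt.legend()
        plt.savefig("comparison.png")
    ```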

  16. Modeling interactions of Hg(II) and bauxitic soils

    Microsoft Academic Search

    Rohan Weerasooriya; Heinz J. Tobschall; Atula Bandara

    2007-01-01

    The adsorptive interactions of Hg(II) with gibbsite-rich soils (hereafter SOIL-g) were modeled by 1-pK surface complexation theory using charge distribution multi-site ion competition model (CD MUSIC) incorporating basic Stern layer model (BSM) to account for electrostatic effects. The model calibrations were performed for the experimental data of synthetic gibbsite–Hg(II) adsorption. When [NaNO3] ≥ 0.01 M, the Hg(II) adsorption density values of gibbsite, Γ(Hg(II)),

  17. Combined Analysis and Validation of Earth Rotation Models and Observations

    NASA Astrophysics Data System (ADS)

    Kutterer, Hansjoerg; Göttl, Franziska; Heiker, Andrea; Kirschner, Stephanie; Schmidt, Michael; Seitz, Florian

    2010-05-01

    Global dynamic processes cause changes in the Earth's rotation, gravity field and geometry. Thus, they can be traced in geodetic observations of these quantities. However, the sensitivity of the various geodetic observation techniques to specific processes in the Earth system differs. More meaningful conclusions with respect to contributions from individual Earth subsystems can be drawn from the combined analysis of highly precise and consistent parameter time series from heterogeneous observation types which carry partially redundant and partially complementary information. For the sake of coordinated research in this field, the Research Unit FOR 584 "Earth Rotation and Global Dynamic Processes" is funded at present by the German Research Foundation (DFG). It is concerned with refined and consistent modeling and data analysis. One of the projects (P9) within this Research Unit addresses the combined analysis and validation of Earth rotation models and observations. In P9 three main topics are addressed: (1) the determination and mutual validation of reliable consistent time series for Earth rotation parameters and gravity field coefficients due to the consideration of their physical connection by the Earth's tensor of inertia, (2) the separation of individual Earth rotation excitation mechanisms by merging all available relevant data from recent satellite missions (GRACE, Jason-1, …) and geodetic space techniques (GNSS, SLR, VLBI, …) in a highly consistent way, (3) the estimation of fundamental physical Earth parameters (Love numbers, …) by an inverse model using the improved geodetic observation time series as constraints. Hence, this project provides significant and unique contributions to the field of Earth system science in general; it corresponds with the goals of the Global Geodetic Observing System (GGOS). In this paper project P9 is introduced, the goals are summarized and a status report including a presentation and discussion of intermediate results is given.

  18. NAIRAS aircraft radiation model development, dose climatology, and initial validation

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.

  19. Validity of the one-dimensional limp model for porous media

    E-print Network

    Paris-Sud XI, Université de

    Validity of the one-dimensional limp model for porous media. Olivier Doutres. The validity of the limp model for porous materials is addressed here. The limp model is an "equivalent fluid" model which gives a better description of the porous behavior than the well-known "rigid frame" model

  20. Development and validation of a realistic head model for EEG

    NASA Astrophysics Data System (ADS)

    Bangera, Nitin Bhalchandra

    The utility of extracranial electrical or magnetic field recordings (EEG or MEG) is greatly enhanced if the generators of the bioelectromagnetic fields can be determined accurately from the measured fields. This procedure, known as the 'inverse method,' depends critically on calculations of the projection from generators in the brain to the EEG and MEG sensors. Improving and validating this calculation, known as the 'forward solution,' is the focus of this dissertation. The improvements involve more accurate modeling of the structures of the brain and thus understanding how current flows within the brain as a result of addition of structures in the forward model. Validation compares calculations using different forward models to the experimental results obtained by stimulating with implanted dipole electrodes. The human brain tissue displays inhomogeneity in electrical conductivity and also displays anisotropy, notably in the skull and brain white matter. In this dissertation, a realistic head model has been implemented using the finite element method to calculate the effects of inhomogeneity and anisotropy in the human brain. Accurate segmentation of the brain tissue type is implemented using a semi-automatic method to segment multimodal imaging data from multi-spectral MRI scans (different flip angles) in conjunction with the regular T1-weighted scans and computed x-ray tomography images. The electrical conductivity in the anisotropic white matter tissue is quantified from diffusion tensor MRI. The finite element model is constructed using AMIRA, a commercial segmentation and visualization tool and solved using ABAQUS, a commercial finite element solver. The model is validated using experimental data collected from intracranial stimulation in medically intractable epileptic patients. Depth electrodes are implanted in medically intractable epileptic patients in order to direct surgical therapy when the foci cannot be localized with the scalp EEG. These patients present the unique opportunity to generate sources at known positions in the human brain using the depth electrodes. Known dipolar sources were created inside the human brain at known locations by injecting a weak biphasic current (sub-threshold) between alternate contacts on the depth electrode. The corresponding bioelectric fields (intracranial and scalp EEG) were recorded in patients during the injection of biphasic pulses. The in vivo depth stimulation data provides a direct test of the performance of the forward model. The factors affecting the accuracy of the intracranial measurements are quantified in a precise manner by studying the effects of including different tissue types and anisotropy. The results show that white matter anisotropy is crucial for predicting the electric fields in a precise manner for intracranial locations, thereby affecting the source reconstructions. Accurate modeling of the skull is necessary for predicting accurately the scalp measurements. In sum, with the aid of high-resolution finite element realistic head models it is possible to accurately predict electric fields generated by current sources in the brain and thus in a precise way, understand the relationship between electromagnetic measure and neuronal activity at the voxel-scale.
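
    As a point of reference for the forward problem discussed above, the textbook idealization is a current dipole in an unbounded homogeneous conductor, V = p · (r − r0) / (4πσ|r − r0|³); realistic finite element head models refine exactly this calculation with geometry, inhomogeneity, and anisotropy. The sketch below implements only the idealized formula, not the dissertation's FEM model.

    ```python
    import numpy as np

    def dipole_potential(r, r0, p, sigma=0.33):
        """Potential (volts) at sensor positions r (N x 3, metres) of a current
        dipole with moment p (A*m) at r0, in an infinite homogeneous conductor
        of conductivity sigma (S/m; 0.33 is a commonly quoted brain value)."""
        d = np.atleast_2d(r) - np.asarray(r0, float)
        dist = np.linalg.norm(d, axis=1)
        return d @ np.asarray(p, float) / (4.0 * np.pi * sigma * dist**3)

    # Example: a 10 nA*m dipole at the origin, sensor 5 cm away along its axis.
    print(dipole_potential([[0.05, 0.0, 0.0]], [0, 0, 0], [1e-8, 0, 0]))
    ```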

  1. Geophysical Monitoring for Validation of Transient Permafrost Models (Invited)

    NASA Astrophysics Data System (ADS)

    Hauck, C.; Hilbich, C.; Marmy, A.; Scherler, M.

    2013-12-01

    Permafrost is a widespread phenomenon at high latitudes and high altitudes and describes the permanently frozen state of the subsurface in lithospheric material. In the context of climate change, both new monitoring and modelling techniques are required to observe and predict potential permafrost changes, e.g., the warming and degradation which may lead to the liberation of carbon (Arctic) and the destabilisation of permafrost slopes (mountains). Mountain permafrost occurrences in the European Alps are characterised by temperatures only a few degrees below zero and are therefore particularly sensitive to projected climate changes in the 21st century. Traditional permafrost observation techniques are mainly based on thermal monitoring in vertical and horizontal dimension, but they provide only weak indications of physical properties such as ice or liquid water content. Geophysical techniques can be used to characterise permafrost occurrences and to monitor their changes, as the physical properties of frozen and unfrozen ground measured by geophysical techniques are markedly different. In recent years, electromagnetic, seismic but especially electrical methods have been used to continuously monitor permafrost occurrences and to detect long-term changes within the active layer and regarding the ice content within the permafrost layer. On the other hand, coupled transient thermal/hydraulic models are used to predict the evolution of permafrost occurrences under different climate change scenarios. These models rely on suitable validation data for a certain observation period, which is usually restricted to data sets of ground temperature and active layer depth. Very important initialisation and validation data for permafrost models are, however, ground ice content and unfrozen water content in the active layer. In this contribution we will present a geophysical monitoring application to estimate ice and water content and their evolution in time at a permafrost station in the Swiss Alps. These data are then used to validate the coupled mass and energy balance soil model COUP, which is used for long-term projections of the permafrost evolution in the Swiss Alps. For this, we apply the recently developed 4-phase model, which is based on simple petrophysical relationships and which uses geoelectric and seismic tomographic data sets as input data. In addition, we use continuously measured electrical resistivity tomography data sets and soil moisture data in daily resolution to compare modelled ice content changes and geophysical observations in high temporal resolution. The results show still large uncertainties in both model approaches regarding the absolute ice content values, but much smaller uncertainties regarding the changes in ice and unfrozen water content. We conclude that this approach is well suited for the analysis of permafrost changes in both model and monitoring studies, even though more efforts are needed for obtaining in situ ground truth data of ice content and porosity.
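
    The "simple petrophysical relationships" underlying the 4-phase model include resistivity mixing rules of the Archie type; the sketch below shows that flavor only, inverting Archie's law for unfrozen water content from a bulk resistivity measurement. The exponents and pore-water resistivity are site-specific assumptions here, and this is not the published 4-phase model itself.

    ```python
    def archie_water_content(rho_bulk, porosity, rho_water=50.0, m=2.0, n=2.0):
        """Invert Archie's law, rho = rho_w * phi**(-m) * Sw**(-n), for the
        water saturation Sw and return the volumetric water content phi * Sw.
        rho_bulk and rho_water in ohm*m; m and n are empirical exponents."""
        s_w = (rho_water / (rho_bulk * porosity**m)) ** (1.0 / n)
        return porosity * min(s_w, 1.0)

    # Example: 5 kOhm*m bulk resistivity at 40% porosity -> ~0.10 water content.
    print(archie_water_content(5000.0, 0.4))
    ```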

  2. Modeling interactions of Hg(II) and bauxitic soils.

    PubMed

    Weerasooriya, Rohan; Tobschall, Heinz J; Bandara, Atula

    2007-11-01

    The adsorptive interactions of Hg(II) with gibbsite-rich soils (hereafter SOIL-g) were modeled by 1-pK surface complexation theory using charge distribution multi-site ion competition model (CD MUSIC) incorporating basic Stern layer model (BSM) to account for electrostatic effects. The model calibrations were performed for the experimental data of synthetic gibbsite-Hg(II) adsorption. When [NaNO(3)] > or = 0.01M, the Hg(II) adsorption density values of gibbsite, Gamma(Hg(II)), showed a negligible variation with ionic strength. However, Gamma(Hg(II)) values show a marked variation with the [Cl(-)]. When [Cl(-)] > or = 0.01M, the Gamma(Hg(II)) values showed a significant reduction with the pH. The Hg(II) adsorption behavior in NaNO(3) was modeled assuming a homogeneous solid surface. The introduction of high affinity sites, i.e., >Al(s)OH at a low concentration (typically about 0.045 sites nm(-2)) is required to model Hg(II) adsorption in NaCl. According to IR spectroscopic data, the bauxitic soil (SOIL-g) is characterized by gibbsite and bayerite. These mineral phases were not treated discretely in modeling of Hg(II) and soil interactions. The CD MUSIC/BSM model combination can be used to model Hg(II) adsorption on bauxitic soil. Organic matter seems to play a role in Hg(II) binding when pH > 8. The Hg(II) adsorption in the presence of excess Cl(-) ions required the selection of high affinity sites in modeling. PMID:17659321

  3. Bioaerosol optical sensor model development and initial validation

    NASA Astrophysics Data System (ADS)

    Campbell, Steven D.; Jeys, Thomas H.; Eapen, Xuan Le

    2007-04-01

    This paper describes the development and initial validation of a bioaerosol optical sensor model. This model was used to help determine design parameters and estimate performance of a new low-cost optical sensor for detecting bioterrorism agents. In order to estimate sensor performance in detecting biowarfare simulants and rejecting environmental interferents, use was made of a previously reported catalog of EEM (excitation/emission matrix) fluorescence cross-section measurements and previously reported multiwavelength-excitation biosensor modeling work. In the present study, the biosensor modeled employs a single high-power 365 nm UV LED source plus an IR laser diode for particle size determination. The sensor has four output channels: IR size channel, UV elastic channel and two fluorescence channels. The sensor simulation was used to select the fluorescence channel wavelengths of 400-450 and 450-600 nm. Using these selected fluorescence channels, the performance of the sensor in detecting simulants and rejecting interferents was estimated. Preliminary measurements with the sensor are presented which compare favorably with the simulation results.

  4. Empirical validation of SAR values predicted by FDTD modeling.

    PubMed

    Gajsek, P; Walters, T J; Hurt, W D; Ziriax, J M; Nelson, D A; Mason, P A

    2002-01-01

    Rapid increase in the use of numerical techniques to predict current density or specific absorption rate (SAR) in sophisticated three dimensional anatomical computer models of man and animals has resulted in the need to understand how numerical solutions of the complex electrodynamics equations match with empirical measurements. This aspect is particularly important because different numerical codes and computer models are used in research settings as a guide in designing clinical devices, telecommunication systems, and safety standards. To ensure compliance with safety guidelines during equipment design, manufacturing and maintenance, realistic and accurate models could be used as a bridge between empirical data and actual exposure conditions. Before these tools are transitioned into the hands of health safety officers and system designers, their accuracy and limitations must be verified under a variety of exposure conditions using available analytical and empirical dosimetry techniques. In this paper, empirical validation of SAR values predicted by finite difference time domain (FDTD) numerical code on sphere and rat is presented. The results of this study show a good agreement between empirical and theoretical methods and, thus, offer a relatively high confidence in SAR predictions obtained from digital anatomical models based on the FDTD numerical code. PMID:11793404
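
    For readers unfamiliar with the numerical method being validated, the sketch below is a minimal one-dimensional FDTD (Yee-scheme) update loop in free space; the dosimetry codes discussed above apply the same leapfrog idea on three-dimensional anatomical voxel models with tissue properties.

    ```python
    import numpy as np

    c, dx = 3.0e8, 1.0e-3          # speed of light (m/s), cell size (m)
    dt = dx / c                    # 1-D Courant limit
    eps0, mu0 = 8.854e-12, 4e-7 * np.pi
    nx, nt = 400, 800

    ez = np.zeros(nx)              # electric field at integer cells
    hy = np.zeros(nx - 1)          # magnetic field, staggered half cells

    for t in range(nt):
        hy += dt / (mu0 * dx) * (ez[1:] - ez[:-1])          # Faraday update
        ez[1:-1] += dt / (eps0 * dx) * (hy[1:] - hy[:-1])   # Ampere update
        ez[nx // 4] += np.exp(-((t - 60.0) / 20.0) ** 2)    # soft Gaussian source
    ```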

  5. First principles Candu fuel model and validation experimentation

    SciTech Connect

    Corcoran, E.C.; Kaye, M.H.; Lewis, B.J.; Thompson, W.T. [Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Akbari, F. [Atomic Energy of Canada Limited - Chalk River Ontario, Ontario KOJ IJ0 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Higgs, J.D. [Atomic Energy of Canada Limited - 430 Bayside Drive, Saint John, NB E2J 1A8 (Canada); Royal Military College of Canada, Department of Chemistry and Chemical Engineering, P.O. Box 17000 Station Forces, Kingston, Ontario K7K 7B4 (Canada); Verrall, R.A.; He, Z.; Mouris, J.F. [Atomic Energy of Canada Limited - Chalk River Laboratories, Chalk River Ontario, Ontario KOJ IJ0 (Canada)

    2007-07-01

    Many modeling projects on nuclear fuel rest on a quantitative understanding of the co-existing phases at various stages of burnup. Since the various fission products have considerably different abilities to chemically associate with oxygen, and the O/M ratio is slowly changing as well, the chemical potential (generally expressed as an equivalent oxygen partial pressure) is a function of burnup. Concurrently, well-recognized small fractions of new phases such as inert gas, noble metals, zirconates, etc. also develop. To further complicate matters, the dominant UO{sub 2} fuel phase may be non-stoichiometric and most of the minor phases have a variable composition dependent on temperature and possible contact with the coolant in the event of a sheathing defect. A Thermodynamic Fuel Model to predict the phases in partially burned Candu nuclear fuel containing many major fission products has been under development. This model is capable of handling non-stoichiometry in the UO{sub 2} fluorite phase, dilute solution behaviour of significant solute oxides, noble metal inclusions, a second metal solid solution U(Pd-Rh-Ru)3, zirconate and uranate solutions as well as other minor solid phases, and volatile gaseous species. The treatment is a melding of several thermodynamic modeling projects dealing with isolated aspects of this important multi-component system. To simplify the computations, the number of elements has been limited to twenty major representative fission products known to appear in spent fuel. The proportion of elements must first be generated using SCALES-5. Oxygen is inferred from the concentration of the other elements. Provision to study the disposition of very minor fission products is included within the general treatment but these are introduced only on an as-needed basis for a particular purpose. The building blocks of the model are the standard Gibbs energies of formation of the many possible compounds expressed as a function of temperature. To these data are added mixing terms associated with the appearance of the component species in particular phases. To validate the model, coulometric titration experiments relating the chemical potential of oxygen to the moles of oxygen introduced to SIMFUEL are underway. A description of the apparatus to oxidize and reduce samples in a controlled way in a H{sub 2}/H{sub 2}O mixture is presented. Existing measurements for irradiated fuel, both defected and non-defected, are also being incorporated into the validation process. (authors)

  6. Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.

    SciTech Connect

    Dowding, Kevin J.; Leslie, Ian H. (New Mexico State University, Las Cruces, NM); Hobbs, Michael L.; Rutherford, Brian Milne; Hills, Richard Guy (New Mexico State University, Las Cruces, NM); Pilch, Martin M.

    2004-10-01

    A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model to represent experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process. The process addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code to solve the models, and its (code) verification are presented. Experimental data from two activities are used to validate the mathematical models. The first experiment assesses the chemistry model alone and the second experiment assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experiment data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.

  7. Using remote sensing for validation of a large scale hydrologic and hydrodynamic model in the Amazon

    NASA Astrophysics Data System (ADS)

    Paiva, R. C.; Bonnet, M.; Buarque, D. C.; Collischonn, W.; Frappart, F.; Mendes, C. B.

    2011-12-01

    We present the validation of the large-scale, catchment-based hydrological MGB-IPH model in the Amazon River basin. In this model, physically-based equations are used to simulate the hydrological processes, such as the Penman Monteith method to estimate evapotranspiration, or the Moore and Clarke infiltration model. A new feature recently introduced in the model is a 1D hydrodynamic module for river routing. It uses the full Saint-Venant equations and a simple floodplain storage model. River and floodplain geometry parameters are extracted from the SRTM DEM using specially developed GIS algorithms that provide catchment discretization, estimation of river cross-section geometry and water storage volume variations in the floodplains. The model was forced using satellite-derived daily rainfall TRMM 3B42, calibrated against discharge data and first validated using daily discharges and water levels from 111 and 69 stream gauges, respectively. Then, we performed a validation against remote sensing derived hydrological products, including (i) monthly Terrestrial Water Storage (TWS) anomalies derived from GRACE, (ii) river water levels derived from ENVISAT satellite altimetry data (212 virtual stations from Santos da Silva et al., 2010) and (iii) a multi-satellite monthly global inundation extent dataset at ~25 x 25 km spatial resolution (Papa et al., 2010). Validation against river discharges shows good performance of the MGB-IPH model. For 70% of the stream gauges, the Nash and Sutcliffe efficiency index (ENS) is higher than 0.6 and at Óbidos, close to the Amazon River outlet, ENS equals 0.9 and the model bias equals -4.6%. The largest errors are located in drainage areas outside Brazil, and we speculate that this is due to the poor quality of rainfall datasets in these poorly monitored and/or mountainous areas. Validation against water levels shows that the model is performing well in the major tributaries. For 60% of virtual stations, ENS is higher than 0.6. Similarly, the largest errors are also located in drainage areas outside Brazil, mostly the Japurá River, and in the lower Amazon River. In the latter, correlation with observations is high but the model underestimates the amplitude of water levels. We also found a large bias between model and ENVISAT water levels, ranging from -3 to -15 m. The model provided TWS in good accordance with GRACE estimates. The ENS value for TWS over the whole Amazon equals 0.93. We also analyzed results in 21 sub-regions of 4 x 4°. ENS is smaller than 0.8 in only 5 areas, and these are found mostly in the northwest part of the Amazon, possibly due to the same errors reported in the discharge results. Flood extent validation is under development, but a previous analysis in the Brazilian part of the Solimões River basin suggests a good model performance. The authors are grateful for the financial and operational support from the Brazilian agencies FINEP, CNPq and ANA and from the French observatories HYBAM and SOERE RBV.
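
    The ENS skill score used throughout this validation is the Nash-Sutcliffe efficiency, ENS = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))², which is 1 for a perfect fit and 0 when the model is no better than the observed mean. A minimal sketch:

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe efficiency of a simulated vs. observed series."""
        obs = np.asarray(obs, float)
        sim = np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    print(nash_sutcliffe([10.0, 12.0, 9.0, 14.0], [9.5, 12.5, 9.0, 13.0]))
    ```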

  8. Development, validation and application of numerical space environment models

    NASA Astrophysics Data System (ADS)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the Vlasov description of plasma is carried out using the Vlasiator model. The test shows that the Vlasov equation for plasma in six-dimensional phase space is solved correctly by Vlasiator, that results are obtained beyond those of the magnetohydrodynamic (MHD) description of plasma and that global magnetospheric simulations using a hybrid-Vlasov model are feasible on current hardware. For the first time four global magnetospheric models using the MHD description of plasma (BATS-R-US, GUMICS, OpenGGCM, LFM) are run with identical solar wind input and the results compared to observations in the ionosphere and outer magnetosphere. Based on the results of the global magnetospheric MHD model GUMICS a hypothesis is formulated for a new mechanism of plasmoid formation in the Earth's magnetotail.

  9. Validation of atmospheric propagation models in littoral waters

    NASA Astrophysics Data System (ADS)

    de Jong, Arie N.; Schwering, Piet B. W.; van Eijk, Alexander M. J.; Gunter, Willem H.

    2013-04-01

    Various atmospheric propagation effects are limiting the long-range performance of electro-optical imaging systems. These effects include absorption and scattering by molecules and aerosols, refraction due to vertical temperature gradients, and scintillation and blurring due to turbulence. In maritime and coastal areas, ranges up to 25 km are relevant for detection and classification tasks on small targets (missiles, pirates). From November 2009 to October 2010 a measurement campaign was set up over a range of more than 15 km in the False Bay in South Africa, where all of the propagation effects could be investigated quantitatively. The results have been used to provide statistical information on basic parameters such as visibility, air-sea temperature difference, absolute humidity and wind speed. In addition, various propagation models on aerosol particle size distribution, temperature profile, blur and scintillation under strong turbulence conditions could be validated. Examples of collected data and associated results are presented in this paper.

  10. Utilizing Chamber Data for Developing and Validating Climate Change Models

    NASA Technical Reports Server (NTRS)

    Monje, Oscar

    2012-01-01

    Controlled environment chambers (e.g. growth chambers, SPAR chambers, or open-top chambers) are useful for measuring plant ecosystem responses to climatic variables and CO2 that affect plant water relations. However, chamber data have been found to overestimate the responses of C fluxes to CO2 enrichment. Chamber data may be confounded by numerous artifacts (e.g. sidelighting, edge effects, increased temperature and VPD), and this limits what can be measured accurately. Chambers can be used to measure canopy level energy balance under controlled conditions, and plant transpiration responses to CO2 concentration can be elucidated. However, these measurements cannot be used directly in model development or validation. The response of stomatal conductance to CO2 will be the same as in the field, but the measured response must be recalculated in such a manner to account for differences in aerodynamic conductance, temperature and VPD between the chamber and the field.

  11. Validation of a Deterministic Vibroacoustic Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Caimi, Raoul E.; Margasahayam, Ravi

    1997-01-01

    This report documents the recently completed effort involving validation of a deterministic theory for the random vibration problem of predicting the response of launch pad structures in the low-frequency range (0 to 50 hertz). Statistical Energy Analysis (SEA) methods are not suitable in this range. Measurements of launch-induced acoustic loads and subsequent structural response were made on a cantilever beam structure placed in close proximity (200 feet) to the launch pad. Innovative ways of characterizing random, nonstationary, non-Gaussian acoustics are used for the development of a structure's excitation model. Extremely good correlation was obtained between analytically computed responses and those measured on the cantilever beam. Additional tests are recommended to bound the problem to account for variations in launch trajectory and inclination.

  12. Validation of a modified clinical risk score to predict cancer-specific survival for stage II colon cancer

    PubMed Central

    Oliphant, Raymond; Horgan, Paul G; Morrison, David S; McMillan, Donald C

    2015-01-01

    Many patients with stage II colon cancer will die of their disease despite curative surgery. Therefore, identification of patients at high risk of poor outcome after surgery for stage II colon cancer is desirable. This study aims to validate a clinical risk score to predict cancer-specific survival in patients undergoing surgery for stage II colon cancer. Patients undergoing surgery for stage II colon cancer in 16 hospitals in the West of Scotland between 2001 and 2004 were identified from a prospectively maintained regional clinical audit database. Overall and cancer-specific survival rates up to 5 years were calculated. A total of 871 patients were included. At 5 years, cancer-specific survival was 81.9% and overall survival was 65.6%. On multivariate analysis, age ≥75 years (hazard ratio (HR) 2.11, 95% confidence intervals (CI) 1.57–2.85; P<0.001) and emergency presentation (HR 1.97, 95% CI 1.43–2.70; P<0.001) were independently associated with cancer-specific survival. Age and mode of presentation HRs were added to form a clinical risk score of 0–2. The cancer-specific survival at 5 years for patients with a cumulative score 0 was 88.7%, 1 was 78.2% and 2 was 65.9%. These results validate a modified simple clinical risk score for patients undergoing surgery for stage II colon cancer. The combination of these two universally documented clinical factors provides a solid foundation for the examination of the impact of additional clinicopathological and treatment factors on overall and cancer-specific survival. PMID:25487740
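
    The additive score described above is simple enough to restate in code; the sketch below hard-codes the two published risk factors and the reported 5-year cancer-specific survival for each score. It merely restates the abstract's figures and is not a validated clinical tool.

    ```python
    def stage2_colon_risk_score(age_years, emergency_presentation):
        """Clinical risk score (0-2): one point for age >= 75 years and
        one point for emergency presentation."""
        return int(age_years >= 75) + int(emergency_presentation)

    # Reported 5-year cancer-specific survival by score (Oliphant et al.).
    FIVE_YEAR_CSS = {0: 0.887, 1: 0.782, 2: 0.659}

    print(FIVE_YEAR_CSS[stage2_colon_risk_score(78, False)])  # score 1 -> 0.782
    ```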

  13. Traveling with Cognitive Tests: Testing the Validity of a KABC-II Adaptation in India

    ERIC Educational Resources Information Center

    Malda, Maike; van de Vijver, Fons J. R.; Srinivasan, Krishnamachari; Transler, Catherine; Sukumar, Prathima

    2010-01-01

    The authors evaluated the adequacy of an extensive adaptation of the American Kaufman Assessment Battery for Children, second edition (KABC-II), for 6- to 10-year-old Kannada-speaking children of low socioeconomic status in Bangalore, South India. The adapted KABC-II was administered to 598 children. Subtests showed high reliabilities, the…

  14. Vibroacoustic Model Validation for a Curved Honeycomb Composite Panel

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Robinson, Jay H.; Grosveld, Ferdinand W.

    2001-01-01

    Finite element and boundary element models are developed to investigate the vibroacoustic response of a curved honeycomb composite sidewall panel. Results from vibroacoustic tests conducted in the NASA Langley Structural Acoustic Loads and Transmission facility are used to validate the numerical predictions. The sidewall panel is constructed from a flexible honeycomb core sandwiched between carbon fiber reinforced composite laminate face sheets. This type of construction is being used in the development of an all-composite aircraft fuselage. In contrast to conventional rib-stiffened aircraft fuselage structures, the composite panel has nominally uniform thickness resulting in a uniform distribution of mass and stiffness. Due to differences in the mass and stiffness distribution, the noise transmission mechanisms for the composite panel are expected to be substantially different from those of a conventional rib-stiffened structure. The development of accurate vibroacoustic models will aid in the understanding of the dominant noise transmission mechanisms and enable optimization studies to be performed that will determine the most beneficial noise control treatments. Finite element and boundary element models of the sidewall panel are described. Vibroacoustic response predictions are presented for forced vibration input and the results are compared with experimental data.

  15. Validation of an Acoustic Impedance Prediction Model for Skewed Resonators

    NASA Technical Reports Server (NTRS)

    Howerton, Brian M.; Parrott, Tony L.

    2009-01-01

    An impedance prediction model was validated experimentally to determine the composite impedance of a series of high-aspect ratio slot resonators incorporating channel skew and sharp bends. Such structures are useful for packaging acoustic liners into constrained spaces for turbofan noise control applications. A formulation of the Zwikker-Kosten Transmission Line (ZKTL) model, incorporating the Richards correction for rectangular channels, is used to calculate the composite normalized impedance of a series of six multi-slot resonator arrays with constant channel length. Experimentally, acoustic data was acquired in the NASA Langley Normal Incidence Tube over the frequency range of 500 to 3500 Hz at 120 and 140 dB OASPL. Normalized impedance was reduced using the Two-Microphone Method for the various combinations of channel skew and sharp 90° and 180° bends. Results show that the presence of skew and/or sharp bends does not significantly alter the impedance of a slot resonator as compared to a straight resonator of the same total channel length. ZKTL predicts the impedance of such resonators very well over the frequency range of interest. The model can be used to design arrays of slot resonators that can be packaged into complex geometries heretofore unsuitable for effective acoustic treatment.
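
    For intuition on the transmission-line treatment, the sketch below gives the lossless limit only: a rigidly terminated channel of length L has normalized input impedance z = −j·cot(kL), vanishing at the quarter-wavelength resonance. The full ZKTL formulation replaces k and the characteristic impedance with complex, visco-thermal, channel-geometry-dependent values (with the Richards correction for rectangular channels), which this toy version omits.

    ```python
    import numpy as np

    def slot_input_impedance(freq_hz, length_m, c=343.0):
        """Normalized input impedance of a rigidly terminated channel in the
        lossless limit: z = -j * cot(k L), with k = 2*pi*f / c."""
        k = 2.0 * np.pi * np.asarray(freq_hz, float) / c
        return -1j / np.tan(k * length_m)

    # A 5 cm channel resonates (z ~ 0) near c / (4 L) ~ 1715 Hz.
    print(slot_input_impedance([800.0, 1715.0, 2500.0], 0.05))
    ```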

  16. Image quality assessment in digital mammography: part II. NPWE as a validated alternative for contrast detail analysis.

    PubMed

    Monnin, P; Marshall, N W; Bosmans, H; Bochud, F O; Verdun, F R

    2011-07-21

    Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. These calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d', as a function of detector air kerma; a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to threshold calculated detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control and optimization of digital x-ray systems for screening mammography. PMID:21701050
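
    One standard way to evaluate the NPWE detectability index for a disc of diameter d and contrast C is the radial-frequency integral d'² = [2π∫ S²(f) MTF²(f) E²(f) f df]² / [2π∫ S²(f) MTF²(f) E⁴(f) NPS(f) f df], where S(f) is the disc's Fourier amplitude and E(f) the eye filter. The sketch below implements that integral numerically; the MTF, NPS, and eye-filter curves are illustrative stand-ins, not the measured system data or the paper's exact parameterization.

    ```python
    import numpy as np
    from scipy.special import j1

    def dprime_npwe(diam_mm, contrast, mtf, nps, eye, fmax=10.0, n=4000):
        """NPWE detectability index for a disc signal. mtf, nps and eye are
        callables of radial spatial frequency f in cycles/mm."""
        f = np.linspace(1e-4, fmax, n)
        area = np.pi * (diam_mm / 2.0) ** 2
        s = contrast * area * 2.0 * j1(np.pi * diam_mm * f) / (np.pi * diam_mm * f)
        num = (2 * np.pi * np.trapz(s**2 * mtf(f)**2 * eye(f)**2 * f, f)) ** 2
        den = 2 * np.pi * np.trapz(s**2 * mtf(f)**2 * eye(f)**4 * nps(f) * f, f)
        return np.sqrt(num / den)

    # Illustrative stand-in curves (not measured system data):
    mtf = lambda f: np.abs(np.sinc(0.1 * f))          # ~0.1 mm aperture
    nps = lambda f: np.full_like(f, 1.0e-5)           # white noise (mm^2)
    eye = lambda f: f**1.3 * np.exp(-0.35 * f**2)     # one common eye-filter form
    print(dprime_npwe(0.25, 0.1, mtf, nps, eye))
    ```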

  17. Modeling Topaz-II system performance

    SciTech Connect

    Lee, H.H.; Klein, A.C. (Oregon State Univ., Corvallis (United States))

    1993-01-01

    The US acquisition of the Topaz-II in-core thermionic space reactor test system from Russia provides a good opportunity to perform a comparison of the Russian reported data and the results from computer codes such as MCNP (Ref. 3) and TFEHX (Ref. 4). The comparison study includes both neutronic and thermionic performance analyses. The Topaz-II thermionic reactor is modeled with MCNP using actual Russian dimensions and parameters. The computation of the neutronic performance considers several important aspects such as the fuel enrichment and location of the thermionic fuel elements (TFEs) in the reactor core. The neutronic analysis included the calculation of both radial and axial power distributions, which are then used in the TFEHX code for electrical performance. The reactor modeled consists of 37 single-cell TFEs distributed in a 13-cm-radius zirconium hydride block surrounded by 8 cm of beryllium metal reflector. The TFEs use 90% enriched {sup 235}U and molybdenum coated with a thin layer of {sup 184}W for the emitter surface. Electrons emitted are captured by a collector surface, with a gap filled with cesium vapor between the collector and emitter surfaces. The collector surface is electrically insulated with alumina. Liquid NaK provides the cooling system for the TFEs. The axial thermal power distribution is obtained by dividing the TFE into 40 axial nodes. Comparison of the true axial power distribution with that produced by electrical heaters was also performed.

  18. Dynamic models and model validation for PEM fuel cells using electrical circuits

    Microsoft Academic Search

    Caisheng Wang; M. Hashem Nehrir; Steven R. Shaw

    2005-01-01

    This paper presents the development of dynamic models for proton exchange membrane (PEM) fuel cells using electrical circuits. The models have been implemented in MATLAB\\/SIMULINK and PSPICE environments. Both the double-layer charging effect and the thermodynamic characteristic inside the fuel cell are included in the models. The model responses obtained at steady-state and transient conditions are validated by experimental data
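
    As an illustration of the equivalent-circuit idea, including the double-layer charging effect mentioned above, the sketch below steps a generic first-order circuit (open-circuit voltage, ohmic resistance, and an activation resistance in parallel with a double-layer capacitance) through a load-current step. Parameter values are made up for illustration; this is not the authors' MATLAB/SIMULINK or PSPICE model.

    ```python
    import numpy as np

    # Generic PEM fuel-cell equivalent circuit (illustrative values only).
    E, R_OHM, R_ACT, C_DL = 0.95, 0.01, 0.02, 2.0   # V, ohm, ohm, F

    dt = 1.0e-3
    v_act, terminal = 0.0, []
    for t in np.arange(0.0, 1.0, dt):
        i = 20.0 if t >= 0.2 else 5.0               # load current step (A)
        v_act += dt / C_DL * (i - v_act / R_ACT)    # double-layer charging
        terminal.append(E - i * R_OHM - v_act)      # cell terminal voltage

    # Voltage just before the step vs. the new steady state (~0.80 V -> ~0.35 V).
    print(terminal[150], terminal[-1])
    ```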

  19. VALIDATION OF A SUB-MODEL OF FORAGE GROWTH OF THE INTEGRATED FARM SYSTEM MODEL - IFSM

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sub-model of forage production developed for temperate climate is being adapted to tropical conditions in Brazil. Sub-model predictive performance has been evaluated using data of Cynodon spp. Results from sensitivity and validation tests were consistent, but values of DM production for the wet se...

  20. Canyon building ventilation system dynamic model -- Parameters and validation

    SciTech Connect

    Moncrief, B.R. [Westinghouse Savannah River Co., Aiken, SC (United States); Chen, F.F.K. [Bechtel National, Inc., San Francisco, CA (United States)

    1993-02-01

    Plant system simulation crosses many disciplines. At the core is the mimic of key components in the form of mathematical "models." These component models are functionally integrated to represent the plant. With today's low cost high capacity computers, the whole plant can be truly and effectively reproduced in a computer model. Dynamic simulation has its roots in "single loop" design, which is still a common objective in the employment of simulation. The other common objectives are the ability to preview plant operation, to anticipate problem areas, and to test the impact of design options. As plant system complexity increases and our ability to simulate the entire plant grows, the objective to optimize plant system design becomes practical. This shift in objectives from problem avoidance to total optimization by far offers the most rewarding potential. Even a small reduction in bulk materials and space can sufficiently justify the application of this technology. Furthermore, to realize an optimal plant starts from a tight and disciplined design. We believe the assurance required to execute such a design strategy can partly be derived from a plant model. This paper reports on the application of a dynamic model to evaluate the capacity of an existing production plant ventilation system. This study met the practical objectives of capacity evaluation under present and future conditions, and under normal and accidental situations. More importantly, the description of this application, in its methods and its utility, aims to validate the technology of dynamic simulation in the environment of plant system design and safe operation.

  1. An approach to model validation and model-based prediction -- polyurethane foam case study.

    SciTech Connect

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 C, modeling the decomposition is critical to assessing a weapon's response.
In the validation analysis it is indicated that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are created and addressed. Hypothetical problems are used because no guidance was provided concern

  2. Validation model for Raman based skin carotenoid detection.

    PubMed

    Ermakov, Igor V; Gellermann, Werner

    2010-12-01

    Raman spectroscopy holds promise as a rapid objective non-invasive optical method for the detection of carotenoid compounds in human tissue in vivo. Carotenoids are of interest due to their functions as antioxidants and/or optical absorbers of phototoxic light at deep blue and near UV wavelengths. In the macular region of the human retina, carotenoids may prevent or delay the onset of age-related tissue degeneration. In human skin, they may help prevent premature skin aging, and are possibly involved in the prevention of certain skin cancers. Furthermore, since carotenoids exist in high concentrations in a wide variety of fruits and vegetables, and are routinely taken up by the human body through the diet, skin carotenoid levels may serve as an objective biomarker for fruit and vegetable intake. Before the Raman method can be accepted as a widespread optical alternative for carotenoid measurements, direct validation studies are needed to compare it with the gold standard of high performance liquid chromatography. This is because the tissue Raman response is in general accompanied by a host of other optical processes which have to be taken into account. In skin, the most prominent is strongly diffusive, non-Raman scattering, leading to relatively shallow light penetration of the blue/green excitation light required for resonant Raman detection of carotenoids. Also, sizable light attenuation exists due to the combined absorption from collagen, porphyrin, hemoglobin, and melanin chromophores, and additional fluorescence is generated by collagen and porphyrins. In this study, we investigate for the first time the direct correlation of in vivo skin tissue carotenoid Raman measurements with subsequent chromatography derived carotenoid concentrations. As tissue site we use heel skin, in which the stratum corneum layer thickness exceeds the light penetration depth, which is free of optically confounding chromophores, which can be easily optically accessed for in vivo RRS measurement, and which can be easily removed for subsequent biochemical measurements. Excellent correlation (coefficient R=0.95) is obtained for this tissue site which could serve as a model site for scaled up future validation studies of large populations. The obtained results provide proof that resonance Raman spectroscopy is a valid non-invasive objective methodology for the quantitative assessment of carotenoid antioxidants in human skin in vivo. PMID:20678465

  3. Comparison and validation of combined GRACE/GOCE models of the Earth's gravity field

    NASA Astrophysics Data System (ADS)

    Hashemi Farahani, H.; Ditmar, P.

    2012-04-01

    Accurate global models of the Earth's gravity field are needed in various applications: in geodesy - to facilitate the production of a unified global height system; in oceanography - as a source of information about the reference equipotential surface (geoid); in geophysics - to draw conclusions about the structure and composition of the Earth's interiors, etc. A global and (nearly) homogeneous set of gravimetric measurements is being provided by the dedicated satellite mission Gravity Field and Steady-State Ocean Circulation Explorer (GOCE). In particular, Satellite Gravity Gradiometry (SGG) data acquired by this mission are characterized by an unprecedented accuracy/resolution: according to the mission objectives, they must ensure global geoid modeling with an accuracy of 1 - 2 cm at the spatial scale of 100 km (spherical harmonic degree 200). A number of new models of the Earth's gravity field have been compiled on the basis of GOCE data in the course of the last 1 - 2 years. The best of them take into account also the data from the satellite gravimetry mission Gravity Recovery And Climate Experiment (GRACE), which offers an unbeatable accuracy in the range of relatively low degrees. Such combined models contain state-of-the-art information about the Earth's gravity field up to degree 200 - 250. In the present study, we compare and validate such models, including GOCO02, EIGEN-6S, and a model compiled in-house. In addition, the EGM2008 model produced in the pre-GOCE era is considered as a reference. The validation is based on the ability of the models to: (i) predict GRACE K-Band Ranging (KBR) and GOCE SGG data (not used in the production of the models under consideration), and (ii) synthesize a mean dynamic topography model, which is compared with the CNES-CLS09 model derived from in situ oceanographic data. The results of the analysis demonstrate that the GOCE SGG data lead not only to significant improvements over continental areas with a poor coverage with terrestrial gravimetry measurements (such as Africa, Himalayas, and South America), but also to some improvements over well-studied continental areas (such as North America and Australia). Furthermore, we demonstrate a somewhat higher performance of the model produced in-house compared to the other combined GRACE/GOCE models. At the same time, it is found that the combined models show a relatively high level of noise in the oceanic areas compared to EGM2008. This implies that further efforts are needed in order to suppress high-frequency noise in the combined models in the optimal way.

  4. A Test of Model Validation from Observed Temperature Trends

    NASA Astrophysics Data System (ADS)

    Singer, S. F.

    2006-12-01

    How much of current warming is due to natural causes and how much is manmade? Answering this requires a comparison of the patterns of observed warming with the best available models that incorporate both anthropogenic (greenhouse gases and aerosols) as well as natural climate forcings (solar and volcanic). Fortunately, we have the just-published U.S. Climate Change Science Program (CCSP) report (www.climatescience.gov/Library/sap/sap1-1/finalreport/default.htm), based on the best current information. As seen in Fig. 1.3F of the report, modeled surface temperature trends change little with latitude, except for a stronger warming in the Arctic. The observations, however, show a strong surface warming in the northern hemisphere but not in the southern hemisphere (see Fig. 3.5C and 3.6D). The Antarctic is found to be cooling, and Arctic temperatures, while currently rising, were higher in the 1930s than today. Although the Executive Summary of the CCSP report claims "clear evidence" for anthropogenic warming, based on comparing tropospheric and surface temperature trends, the report itself does not confirm this. Greenhouse models indicate that the tropics should provide the most sensitive location for their validation; trends there should increase by 200-300 percent with altitude, peaking at around 10 kilometers. The observations, however, show the opposite: flat or even decreasing tropospheric trend values (see Fig. 3.7 and also Fig. 5.7E). This disparity is demonstrated most strikingly in Fig. 5.4G, which shows the difference between surface and troposphere trends for a collection of models (displayed as a histogram) and for balloon and satellite data. [The disparities are less apparent in the Summary, which displays model results in terms of "range" rather than as histograms.] There may be several possible reasons for the disparity: instrumental and other effects that exaggerate or otherwise distort observed temperature trends, or, more likely, shortcomings in models that result in much reduced values of climate sensitivity, for example the neglect of important negative feedbacks. Allowing for uncertainties in the data and for imperfect models, there is only one valid conclusion from the failure of greenhouse models to explain the observations: the human contribution to global warming is still quite small, so that natural climate factors are dominant. This may also explain why the climate was cooling from 1940 to 1975 -- even as greenhouse-gas levels increased rapidly. An overall test for climate prediction may soon be possible by measuring the ongoing rise in sea level. According to my estimates, sea level should rise by 1.5 to 2.0 cm per decade (about the same rate as in past millennia); the U.N.-IPCC (4th Assessment Report) predicts 1.4 to 4.3 cm per decade. In the New York Review of Books (July 13, 2006), however, James Hansen suggests 20 feet or more per century -- equivalent to about 60 cm or more per decade.

  5. Validating the Thinking Styles Inventory-Revised II among Chinese university students with hearing impairment through test accommodations.

    PubMed

    Cheng, Sanyin; Zhang, Li-Fang

    2014-01-01

    The present study pioneered the use of test accommodations to validate the Thinking Styles Inventory-Revised II (TSI-R2; Sternberg, Wagner, & Zhang, 2007) among Chinese university students with hearing impairment. A series of three studies was conducted, drawing samples from the same two universities, in which accommodated test directions (N = 213), test directions combined with language accommodations from the students' perspective (N = 366), and test directions integrated with language accommodations from the teachers' perspective (N = 129) were used. The accommodated TSI-R2 generally indicated acceptable internal scale reliabilities and factorial validity for Chinese university students with hearing loss. Limitations in relation to the study participants are discussed, as well as test accommodations and the significance and implications of the study. PMID:25051880

  6. Calibration and Validation of Airborne InSAR Geometric Model

    NASA Astrophysics Data System (ADS)

    Chunming, Han; Huadong, Guo; Xijuan, Yue; Changyong, Dou; Mingming, Song; Yanbing, Zhang

    2014-03-01

    The image registration or geo-coding is a very important step for many applications of airborne interferometric Synthetic Aperture Radar (InSAR), especially for those involving Digital Surface Model (DSM) generation, which requires an accurate knowledge of the geometry of the InSAR system. The trajectory and attitude instabilities of the aircraft introduce severe distortions into the three-dimensional (3-D) geometric model. The 3-D geometric model of an airborne SAR image depends on the SAR processor itself. When working in squint mode, i.e., with an offset angle (squint angle) of the radar beam from the broadside direction, aircraft motion instabilities may produce distortions in the airborne InSAR geometric relationship which, if not properly compensated for during SAR imaging, may degrade the image registration. The determination of locations in the SAR image depends on the irradiated topography and on exact knowledge of all signal delays: the range delay and chirp delay (adjusted by the radar operator) and internal delays which are unknown a priori. Hence, in order to obtain reliable results, these parameters must be properly calibrated. An airborne InSAR mapping system has been developed by the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS) to acquire three-dimensional geo-spatial data with high resolution and accuracy. To test the performance of the InSAR system, a Validation/Calibration (Val/Cal) campaign has been carried out in Sichuan province, south-west China, whose results will be reported in this paper.
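
    As a hedged illustration of the delay calibration discussed above, the sketch below converts a measured two-way echo delay into a calibrated slant range by removing the operator-set range/chirp delays and the internal system delay; the function and parameter names are assumptions, not the RADI system's interface.

        # Two-way echo delay -> one-way slant range, with all delay terms removed.
        C = 299_792_458.0  # speed of light, m/s

        def calibrated_slant_range(t_echo, t_range_delay, t_chirp_delay, t_internal):
            """All delays in seconds; the internal delay is the term that must be
            estimated during the Val/Cal campaign (illustrative decomposition)."""
            return 0.5 * C * (t_echo - t_range_delay - t_chirp_delay - t_internal)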

  7. Validating Model-Driven Performance Predictions on Random Software Systems

    Microsoft Academic Search

    Vlastimil Babka; Petr Tuma; Lubomír Bulej

    2010-01-01

    Software performance prediction methods are typically validated by taking an appropriate software system, performing both performance predictions and performance measurements for that system, and comparing the results. The validation includes manual actions, which makes it feasible only for a small number of systems. To significantly increase the number of systems on which software performance prediction methods can be validated, and

  8. Using Laboratory Magnetospheres to Develop and Validate Space Weather Models

    NASA Astrophysics Data System (ADS)

    Mauel, M. E.; Garnier, D.; Kesner, J.

    2012-12-01

    Reliable space weather predictions can be used to plan satellite operations, predict radio outages, and protect the electrical transmission grid. While direct observation of the solar corona and satellite measurements of the solar wind give warnings of possible subsequent geomagnetic activity, more accurate and reliable models of how solar fluxes affect the earth's space environment are needed. Recent developments in laboratory magnetic dipoles have yielded well-confined high-beta plasmas with intense energetic electron belts similar to magnetospheres. With plasma diagnostics spanning global to small spatial scales and user-controlled experiments, these devices can be used to study current issues in space weather such as fast particle excitation and rapid dipolarization events. In levitated dipole experiments, which remove the collisional loss along field lines that normally dominates laboratory dipole plasmas, slow radial convection processes can be observed. We describe ongoing experiments and investigations that (i) control interchange mixing through application of vorticity injection, (ii) make whole-plasma, high-speed images of turbulent plasma dynamics, (iii) simulate nonlinear gyrokinetic dynamics of bounded driven dipole plasma, and (iv) compare laboratory plasma measurements and global convection models. [Figure caption: Photographs of the LDX and CTX laboratory magnetospheres. Trapped plasma and energetic particles are created and studied with a variety of imaging diagnostics. Shown are multiple probes for simultaneous measurements of plasma structures and turbulent mixing.]

  9. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    PubMed

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. PMID:25111293

  10. Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line-of-sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.
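
    A minimal sketch of a skill-score computation of the kind reported here is given below; the mean-square-error score against a persistence baseline is a common convention and an assumption on our part, not necessarily the paper's exact metric.

        # Skill score SS = 1 - MSE(model) / MSE(baseline):
        # 1 is a perfect forecast, 0 is no better than the baseline.
        import numpy as np

        def skill_score(predicted, observed, baseline):
            mse_model = np.mean((predicted - observed) ** 2)
            mse_base = np.mean((baseline - observed) ** 2)
            return 1.0 - mse_model / mse_base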

  11. Alaska North Slope Tundra Travel Model and Validation Study

    SciTech Connect

    Harry R. Bader; Jacynthe Guimond

    2006-03-01

    The Alaska Department of Natural Resources (DNR), Division of Mining, Land, and Water manages cross-country travel, typically associated with hydrocarbon exploration and development, on Alaska's arctic North Slope. This project is intended to provide natural resource managers with objective, quantitative data to assist decision making regarding opening of the tundra to cross-country travel. DNR designed standardized, controlled field trials, with baseline data, to investigate the relationships present between winter exploration vehicle treatments and the independent variables of ground hardness, snow depth, and snow slab thickness, as they relate to the dependent variables of active layer depth, soil moisture, and photosynthetically active radiation (a proxy for plant disturbance). Changes in the dependent variables were used as indicators of tundra disturbance. Two main tundra community types were studied: Coastal Plain (wet graminoid/moist sedge shrub) and Foothills (tussock). DNR constructed four models to address physical soil properties: two models for each main community type, one predicting change in depth of active layer and a second predicting change in soil moisture. DNR also investigated the limited potential management utility in using soil temperature, the amount of photosynthetically active radiation (PAR) absorbed by plants, and changes in microphotography as tools for the identification of disturbance in the field. DNR operated under the assumption that changes in the abiotic factors of active layer depth and soil moisture drive alteration in tundra vegetation structure and composition. Statistically significant differences in depth of active layer, soil moisture at a 15 cm depth, soil temperature at a 15 cm depth, and the absorption of photosynthetically active radiation were found among treatment cells and among treatment types. The models were unable to thoroughly investigate the interacting role between snow depth and disturbance due to a lack of variability in snow depth cover throughout the period of field experimentation. The amount of change in disturbance indicators was greater in the tundra communities of the Foothills than in those of the Coastal Plain. However the overall level of change in both community types was less than expected. In Coastal Plain communities, ground hardness and snow slab thickness were found to play an important role in change in active layer depth and soil moisture as a result of treatment. In the Foothills communities, snow cover had the most influence on active layer depth and soil moisture as a result of treatment. Once certain minimum thresholds for ground hardness, snow slab thickness, and snow depth were attained, it appeared that little or no additive effect was realized regarding increased resistance to disturbance in the tundra communities studied. DNR used the results of this modeling project to set a standard for maximum permissible disturbance of cross-country tundra travel, with the threshold set below the widely accepted standard of Low Disturbance levels (as determined by the U.S. Fish and Wildlife Service). DNR followed the modeling project with a validation study, which seemed to support the field trial conclusions and indicated that the standard set for maximum permissible disturbance exhibits a conservative bias in favor of environmental protection. Finally DNR established a quick and efficient tool for visual estimations of disturbance to determine when investment in field measurements is warranted. 
    This Visual Assessment System (VAS) seemed to support the plot disturbance measurements taken during the modeling and validation phases of this project.

  12. Validation of transport models using additive flux minimization technique

    SciTech Connect

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States)]; Groebner, R. J. [General Atomics, San Diego, California 92121 (United States)]; Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States)]; Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)]

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.
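
    The additive flux minimization idea can be sketched as follows, with a toy function standing in for the FACETS::Core transport solve and scipy standing in for the DAKOTA optimizer; every quantity in the sketch is an illustrative assumption.

        # Vary an extra effective diffusivity D_add(rho) so the predicted density
        # profile matches the experimental one; large optimal D_add flags regions
        # where the tested transport model is insufficient.
        import numpy as np
        from scipy.optimize import minimize

        rho = np.linspace(0.0, 1.0, 20)                  # normalized radius (assumed grid)
        base = 1.0 - 0.7 * rho**2                        # toy model-only prediction
        n_exp = base + 0.05 * np.exp(-((rho - 0.85) / 0.1) ** 2)  # hypothetical data

        def predicted_density(d_add):
            """Placeholder for a predictive transport simulation that advances the
            density with model diffusivity + d_add (one value per grid node)."""
            return base + 0.05 * d_add                   # illustrative linear response

        res = minimize(lambda d: np.sum((predicted_density(d) - n_exp) ** 2),
                       x0=np.zeros_like(rho), method="L-BFGS-B")
        d_add_opt = res.x                                # peaks near rho ~ 0.85 here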

  13. An analysis of structural validity in entity-relationship modeling

    E-print Network

    Song, Il-Yeol

    The structural validity of an ER diagram (ERD) concerns whether or not a given ERD contains any constructs that are contradictory to each other. Errors in an ERD are expensive to correct, so early discovery of an error is highly desirable. Decision rules are presented for checking the structural validity of any ERD containing recursive, binary, and ternary relationships.

  14. A methodology for cost-risk analysis in the statistical validation of simulation models

    Microsoft Academic Search

    Osman Balci; Robert G. Sargent

    1981-01-01

    A methodology is presented for constructing the relationships among model user's risk, model builder's risk, acceptable validity range, sample sizes, and cost of data collection when statistical hypothesis testing is used for validating a simulation model of a real, observable system. The use of the methodology is illustrated with Hotelling's two-sample T^2 test in testing the

  15. The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness

    Microsoft Academic Search

    Jeremy Allen Walraven; Jill Blecke; Michael Sean Baker; Rebecca C. Clemens; John Anthony Mitchell; Matthew Robert Brake; David S. Epp; Jonathan W. Wittwer

    2008-01-01

    This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal

  16. A Validity-Based Model for the Evaluation of a Criterion-Referenced Test.

    ERIC Educational Resources Information Center

    Schattgen, Sharon; And Others

    This paper describes a model for the evaluation and approval of a test battery for compliance with a midwestern state law mandating criterion-referenced testing of specific objectives. Standards specifying that the test scores must demonstrate content validity and criterion-related validity form the foundation of the model. The model also…

  17. Validation of population-based disease simulation models: a review of concepts and methods

    Microsoft Academic Search

    Jacek A Kopec; Philippe Finès; Douglas G Manuel; David L Buckeridge; William M Flanagan; Jillian Oderkirk; Michal Abrahamowicz; Samuel Harper; Behnam Sharif; Anya Okhmatovskaia; Eric C Sayre; M Mushfiqur Rahman; Michael C Wolfson

    2010-01-01

    BACKGROUND: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. METHODS: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of

  18. A Streaming Validation Model for SOAP Digital Signature

    E-print Network

    Lu, Wei; Chiu, Kenneth; Aleksander

    XML Digital Signature provides a rich and flexible message signature model for XML documents, and it has been adopted by SOAP. This paper presents a model for the streaming validation of SOAP Digital Signature. Our model consists of a streaming canonicalization

  19. Experiments for calibration and validation of plasticity and failure material modeling: 304L stainless steel.

    SciTech Connect

    Lee, Kenneth L.; Korellis, John S.; McFadden, Sam X.

    2006-01-01

    Experimental data for material plasticity and failure model calibration and validation were obtained from 304L stainless steel. Model calibration data were taken from smooth tension, notched tension, and compression tests. Model validation data were provided from experiments using thin-walled tube specimens subjected to path dependent combinations of internal pressure, extension, and torsion.

  20. Automatic validation of computational models using pseudo-3D spatio-temporal model checking.

    PubMed

    Pârvu, Ovidiu; Gilbert, David

    2014-12-01

    Background: Computational models play an increasingly important role in systems biology for generating predictions and in synthetic biology as executable prototypes/designs. For real-life (clinical) applications there is a need to scale up and build more complex spatio-temporal multiscale models; these could enable investigating how changes at small scales reflect at large scales and vice versa. Results generated by computational models can be applied to real-life applications only if the models have been validated first. Traditional in silico model checking techniques only capture how non-dimensional properties (e.g. concentrations) evolve over time and are suitable for small-scale systems (e.g. metabolic pathways). The validation of larger-scale systems (e.g. multicellular populations) additionally requires capturing how spatial patterns and their properties change over time, which is not considered by traditional non-spatial approaches. Results: We developed and implemented a methodology for the automatic validation of computational models with respect to both their spatial and temporal properties. Stochastic biological systems are represented by abstract models which assume a linear structure of time and a pseudo-3D representation of space (2D space plus a density measure). Time series data generated by such models is provided as input to parameterised image processing modules which automatically detect and analyse spatial patterns (e.g. cells) and clusters of such patterns (e.g. cellular populations). For capturing how spatial and numeric properties change over time, the Probabilistic Bounded Linear Spatial Temporal Logic is introduced. Given a collection of time series data and a formal spatio-temporal specification, the model checker Mudi (http://mudi.modelchecking.org) determines probabilistically whether the formal specification holds for the computational model or not. Mudi is an approximate probabilistic model checking platform which enables users to choose between frequentist and Bayesian, estimate-based and statistical hypothesis testing-based validation approaches. We illustrate the expressivity and efficiency of our approach based on two biological case studies, namely phase variation patterning in bacterial colony growth and the chemotactic aggregation of cells. Conclusions: The formal methodology implemented in Mudi enables the validation of computational models against spatio-temporal logic properties and is a precursor to the development and validation of more complex multidimensional and multiscale models. PMID:25440773
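
    The statistical model-checking step can be illustrated with a minimal Monte Carlo sketch: estimate, from simulated traces, the probability that a property holds and attach a frequentist confidence interval. The property stub below is a placeholder; Mudi's spatio-temporal logic (PBLSTL) is far richer.

        # Frequentist estimate of P(property) from N simulated time series.
        import math, random

        def property_holds(trace):
            """Stub for a spatio-temporal property check, e.g. 'the mean measure
            stays above a threshold'; real checks involve detected patterns."""
            return sum(trace) / len(trace) > 0.52

        traces = [[random.random() for _ in range(100)] for _ in range(1000)]
        p_hat = sum(property_holds(t) for t in traces) / len(traces)
        half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / len(traces))
        print(f"P(property) ~ {p_hat:.3f} +/- {half_width:.3f} (95% CI)")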

  1. Vibration and shock reliability of MEMS: modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Sundaram, Subramanian; Tormen, Maurizio; Timotijevic, Branislav; Lockhart, Robert; Overstolz, Thomas; Stanley, Ross P.; Shea, Herbert R.

    2011-04-01

    A methodology to predict shock and vibration levels that could lead to the failure of MEMS devices is reported as a function of vibration frequency and shock pulse duration. A combined experimental-analytical approach is developed, maintaining the simplicity and insightfulness of analytical methods without compromising on the accuracy characteristic of experimental methods. The minimum frequency-dependent acceleration that will lead to surfaces coming into contact, for vibration or shock inputs, is determined based on measured mode shapes, damping, resonant frequencies, and an analysis of failure modes, thus defining a safe operating region, without requiring shock or vibration testing. This critical acceleration for failure is a strong function of the drive voltage, and the safe operating region is predicted for transport (unbiased) and operation (biased condition). The model was experimentally validated for over-damped and under-damped modes of a comb-drive-driven silicon-on-insulator-based tunable grating. In-plane and out-of-plane vibration (up to 65 g) and shock (up to 6000 g) tests were performed for biased and unbiased conditions, and very good agreement was found between predicted and observed critical accelerations.
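
    As a hedged illustration of the contact-failure criterion, the sketch below computes the frequency-dependent critical base acceleration for a single-degree-of-freedom resonator whose relative displacement must not exceed the gap; the gap, resonant frequency, and quality factor are illustrative assumptions, not the tunable grating's measured values.

        # Safe-operating boundary for a harmonically driven spring-mass-damper:
        # relative displacement X = a / sqrt((w0^2 - w^2)^2 + (w0*w/Q)^2) reaches
        # the gap when the base acceleration amplitude a equals a_crit.
        import numpy as np

        def a_crit(f_hz, f0_hz=1200.0, Q=5.0, gap_m=3e-6):
            w, w0 = 2 * np.pi * f_hz, 2 * np.pi * f0_hz
            return gap_m * np.sqrt((w0**2 - w**2) ** 2 + (w0 * w / Q) ** 2)

        f = np.logspace(2, 4, 200)          # 100 Hz .. 10 kHz
        g_levels = a_crit(f) / 9.81         # critical acceleration in g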

  2. Validation of Community Models: Identifying Events in Space Weather Model Timelines

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.
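
    The two timeline-processing algorithms lend themselves to a compact sketch: flag slow-to-fast wind transitions and median-filter the IMF polarity to expose sector boundaries. The jump threshold, window length, and filter kernel below are assumptions, not the documented procedures themselves.

        import numpy as np
        from scipy.signal import medfilt

        def fast_wind_onsets(speed_kms, jump=100.0, window=24):
            """Indices where speed rises by more than `jump` km/s within `window` samples."""
            rise = speed_kms[window:] - speed_kms[:-window]
            return np.where(rise > jump)[0] + window

        def smoothed_polarity(br, kernel=49):
            """Sign of the radial IMF, median-filtered to remove high-frequency
            variations while keeping the longer-scale sector structure."""
            return medfilt(np.sign(br), kernel_size=kernel)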

  3. Validation of qualitative models of genetic regulatory networks by model checking: analysis of the nutritional stress response in Escherichia coli

    Microsoft Academic Search

    Grégory Batt; Delphine Ropers; Hidde De Jong; Johannes Geiselmann; Radu Mateescu; Michel Page; Dominique Schneider

    2005-01-01

    Motivation: The modeling and simulation of genetic regulatory networks have created the need for tools for model validation. The main challenges of model validation are the achievement of a match between the precision of model predictions and experimental data, as well as the efficient and reliable comparison of the predictions and observations. Results: We present an approach towards

  4. An independent verification and validation of the Future Theater Level Model conceptual model

    SciTech Connect

    Hartley, D.S. III; Kruse, K.L.; Martellaro, A.J.; Packard, S.L.; Thomas, B. Jr.; Turley, V.K.

    1994-08-01

    This report describes the methodology and results of independent verification and validation performed on a combat model in its design stage. The combat model is the Future Theater Level Model (FTLM), under development by The Joint Staff/J-8. J-8 has undertaken its development to provide an analysis tool that addresses the uncertainties of combat more directly than previous models and yields more rapid study results. The methodology adopted for this verification and validation consisted of document analyses. Included were detailed examinations of the FTLM design documents (at all stages of development), the FTLM Mission Needs Statement, and selected documentation for other theater-level combat models. These documents were compared to assess the FTLM as to its design stage, its purpose as an analytical combat model, and its capabilities as specified in the Mission Needs Statement. The conceptual design passed those tests. The recommendations included specific modifications as well as a recommendation for continued development. The methodology is significant because independent verification and validation have not previously been reported as being performed on a combat model in its design stage. The results are significant because The Joint Staff/J-8 will be using the recommendations from this study in determining whether to proceed with development of the model.

  5. Validity of Social, Moral and Emotional Facets of Self-Description Questionnaire II

    ERIC Educational Resources Information Center

    Leung, Kim Chau; Marsh, Herbert W.; Yeung, Alexander Seeshing; Abduljabbar, Adel S.

    2015-01-01

    Studies adopting a construct validity approach can be categorized into within- and between-network studies. Few studies have applied the between-network approach and tested the correlations of the social (same-sex relations, opposite-sex relations, parent relations), moral (honesty-trustworthiness), and emotional (emotional stability) facets of the…

  6. Numerical predictions of two-dimensional conduction, convection, and radiation heat transfer. II. Validation

    Microsoft Academic Search

    Daniel R. Rousse; Guillaume Gautier; Jean-Francois Sacadura

    2000-01-01

    This paper presents several test problems that were used to validate the formulation and implementation of a CVFEM for combined-mode heat transfer in participating media. The objective here is to demonstrate that the proposed CVFEM can be used to solve combined modes of heat transfer in media that emit, absorb, and scatter radiant energy in regularly- and irregularly-shaped geometries. The

  7. Effects of Risperidone on Aberrant Behavior in Persons with Developmental Disabilities: II. Social Validity Measures.

    ERIC Educational Resources Information Center

    McAdam, David B.; Zarcone, Jennifer R.; Hellings, Jessica; Napolitano, Deborah A.; Schroeder, Stephen R.

    2002-01-01

    Consumer satisfaction and social validity were measured during a double-blind, placebo-controlled evaluation of risperidone in treating aberrant behaviors of persons with developmental disabilities. A survey showed all 17 caregivers felt participation was positive. Community members (n=52) also indicated that when on medication, the 5 participants…

  8. The African American Acculturation Scale II: Cross-Validation and Short Form.

    ERIC Educational Resources Information Center

    Landrine, Hope; Klonoff, Elizabeth A.

    1995-01-01

    Studied African American culture, using a new, shortened, 33-item African American Acculturation Scale (AAAS-33) to assess the scale's validity and reliability. Comparisons between the original form and the AAAS-33 reveal high correlations; however, the longer form may be sensitive to some beliefs, practices, and attitudes not assessed by the short…

  9. An approach to model validation and model-based prediction -- polyurethane foam case study

    Microsoft Academic Search

    Kevin J. Dowding; Brian Milne Rutherford

    2003-01-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of

  10. Stress concentration near stiff inclusions: validation of rigid inclusion model and boundary layers by means of photoelasticity

    E-print Network

    Diego Misseroni; Francesco Dal Corso; Summer Shahzad; Davide Bigoni

    2014-04-03

    Photoelasticity is employed to investigate the stress state near stiff rectangular and rhombohedral inclusions embedded in a 'soft' elastic plate. Results show that the singular stress field predicted by the linear elastic solution for the rigid inclusion model can be generated in reality, with great accuracy, within a material. In particular, experiments: (i.) agree with the fact that the singularity is lower for obtuse than for acute inclusion angles; (ii.) show that the singularity is stronger in Mode II than in Mode I (differently from a notch); (iii.) validate the model of rigid quadrilateral inclusion; (iv.) for thin inclusions, show the presence of boundary layers deeply influencing the stress field, so that the limit case of rigid line inclusion is obtained in strong dependence on the inclusion's shape. The introduced experimental methodology opens the possibility of enhancing the design of thin reinforcements and of analyzing complex situations involving interaction between inclusions and defects.

  11. The Internal Validation of Level II and Level III Respiratory Therapy Examinations. Final Report.

    ERIC Educational Resources Information Center

    Jouett, Michael L.

    This project began with the delineation of the roles and functions of respiratory therapy personnel by the American Association for Respiratory Therapy. In Phase II, The Psychological Corporation used this delineation to develop six proficiency examinations, three at each of two levels. One exam at each level was designated for the purpose of the…

  12. Results of site validation experiments. Volume II. Supporting documents 5 through 14

    SciTech Connect

    Not Available

    1983-01-01

    Volume II contains the following supporting documents: Summary of Geologic Mapping of Underground Investigations; Logging of Vertical Coreholes - ''Double Box'' Area and Exploratory Drift; WIPP High Precision Gravity Survey; Basic Data Reports for Drillholes; Brine Content of Facility Interval Strata; Mineralogical Content of Facility Interval Strata; Location and Characterization of Interbedded Materials; Characterization of Aquifers at Shaft Locations; and Permeability of Facility Interval Strata.

  13. Development of a Land Surface Model. Part II: Data Assimilation

    Microsoft Academic Search

    Jonathan E. Pleim; Aijun Xiu

    2003-01-01

    Part I described a land surface model, its implementation in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5), and some model evaluation results. Part II describes the indirect soil moisture data assimilation scheme. As described in Part I, the land surface model includes explicit soil moisture, which is based on the Interactions between Soil, Biosphere,

  14. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  15. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  16. Land-cover change model validation by an ROC method for the Ipswich watershed, Massachusetts, USA

    Microsoft Academic Search

    R. Gil Pontius Jr; Laura C. Schneider

    2001-01-01

    Scientists need a better and larger set of tools to validate land-use change models, because it is essential to know a model’s prediction accuracy. This paper describes how to use the relative operating characteristic (ROC) as a quantitative measurement to validate a land-cover change model. Typically, a crucial component of a spatially explicit simulation model of land-cover change is a
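
    A minimal sketch of ROC-based validation follows: the model assigns each pixel a probability of change, the reference map records whether it actually changed, and the area under the ROC curve summarizes their agreement. The arrays are randomly generated stand-ins for real maps.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(0)
        predicted_prob = rng.random(10_000)      # per-pixel change probability
        observed_change = (rng.random(10_000) < predicted_prob).astype(int)

        fpr, tpr, _ = roc_curve(observed_change, predicted_prob)
        auc = roc_auc_score(observed_change, predicted_prob)  # 0.5 random, 1.0 perfect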

  17. Higgs potential in the type II seesaw model

    SciTech Connect

    Arhrib, A. [Departement de Mathematiques, Faculte des Sciences et Techniques, Tanger (Morocco); Laboratoire de Physique des Hautes Energies et Astrophysique, Departement de Physique, Faculte des Sciences Semlalia, Marrakech (Morocco); Benbrik, R. [Laboratoire de Physique des Hautes Energies et Astrophysique, Departement de Physique, Faculte des Sciences Semlalia, Marrakech (Morocco); Faculte Polydisciplinaire, Universite Cadi Ayyad, Sidi Bouzid, Safi-Morocco (Morocco); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Chabab, M.; Rahili, L.; Ramadan, J. [Laboratoire de Physique des Hautes Energies et Astrophysique, Departement de Physique, Faculte des Sciences Semlalia, Marrakech (Morocco); Moultaka, G. [Universite Montpellier 2, Laboratoire Charles Coulomb UMR 5221, F-34095 Montpellier (France); CNRS, Laboratoire Charles Coulomb UMR 5221, F-34095 Montpellier (France); Peyranere, M. C. [Universite Montpellier 2, Laboratoire Univers and Particules de Montpellier UMR 5299, F-34095 Montpellier (France); CNRS/IN2P3, Laboratoire Univers and Particules de Montpellier UMR 5299, F-34095 Montpellier (France)

    2011-11-01

    The standard model Higgs sector, extended by one weak gauge triplet of scalar fields with a very small vacuum expectation value, is a very promising setting to account for neutrino masses through the so-called type II seesaw mechanism. In this paper we consider the general renormalizable doublet/triplet Higgs potential of this model. We perform a detailed study of its main dynamical features that depend on five dimensionless couplings and two mass parameters after spontaneous symmetry breaking, and highlight the implications for the Higgs phenomenology. In particular, we determine (i) the complete set of tree-level unitarity constraints on the couplings of the potential and (ii) the exact tree-level boundedness from below constraints on these couplings, valid for all directions. When combined, these constraints delineate precisely the theoretically allowed parameter space domain within our perturbative approximation. Among the seven physical Higgs states of this model, the mass of the lighter (heavier) CP-even state h^0 (H^0) will always satisfy a theoretical upper (lower) bound that is reached for a critical value μ_c of μ (the mass parameter controlling triple couplings among the doublet/triplet Higgses). Saturating the unitarity bounds, we find an upper bound on m_h^0 in the two regimes μ ≳ μ_c and μ ≲ μ_c. In the first regime the Higgs sector is typically very heavy, and only h^0, which becomes SM-like, could be accessible to the LHC. In contrast, in the second regime, somewhat overlooked in the literature, most of the Higgs sector is light. In particular, the heaviest state H^0 becomes SM-like, the lighter states being the CP-odd Higgs, the (doubly) charged Higgses, and a decoupled h^0, possibly leading to a distinctive phenomenology at the colliders.

  18. A Technique for Global Monitoring of Net Solar Irradiance at the Ocean Surface. Part II: Validation

    Microsoft Academic Search

    Beth Chertock; Robert Frouin; Catherine Gautier

    1992-01-01

    The present study constitutes the generation and validation of the first satellite-based, long-term record of surface solar irradiance over the global oceans. The record is generated using Nimbus-7 earth radiation budget (ERB) wide-field-of-view (WFOV) planetary-albedo data as input to a numerical algorithm designed and implemented for this study based on radiative transfer theory. Net surface solar irradiance is obtained by

  19. Gravitropic responses of the Avena coleoptile in space and on clinostats. II. Is reciprocity valid?

    NASA Technical Reports Server (NTRS)

    Johnsson, A.; Brown, A. H.; Chapman, D. K.; Heathcote, D.; Karlsson, C.

    1995-01-01

    Experiments were undertaken to determine if the reciprocity rule is valid for gravitropic responses of oat coleoptiles in the acceleration region below 1 g. The rule predicts that the gravitropic response should be proportional to the product of the applied acceleration and the stimulation time. Seedlings were cultivated on 1 g centrifuges and transferred to test centrifuges to apply a transverse g-stimulation. Since responses occurred in microgravity, the uncertainties about the validity of clinostat simulation of weightlessness were avoided. Plants at two stages of coleoptile development were tested. Plant responses were obtained using time-lapse video recordings that were analyzed after the flight. Stimulus intensities and durations were varied and ranged from 0.1 to 1.0 g and from 2 to 130 min, respectively. For threshold g-doses the reciprocity rule was obeyed. The threshold dose was of the order of 55 g s and 120 g s, respectively, for the two groups of plants investigated. Reciprocity was also studied for bending responses ranging from just above the detectable level to about 10 degrees. The validity of the rule could not be confirmed for higher g-doses, chiefly because the data were more variable. It was investigated whether the uniformity of the overall response data increased when the gravitropic dose was defined as (g^m x t) with m-values different from unity. This was not the case, and the reciprocity concept is therefore valid also in the hypogravity region. The concept of gravitropic dose, the product of the transverse acceleration and the stimulation time, is also well-defined in the acceleration region studied. With the same hardware, tests were done on earth where responses occurred on clinostats. The results did not contradict the reciprocity rule, but scatter in the data was large.

  20. PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II

    SciTech Connect

    Chambers, F W; Raymond, B A; Falabella, S; Lee, B S; Richardson, R A; Weir, J T; Davis, H A; Schultze, M E

    2005-05-31

    To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components, this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and to the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system, where it can read magnet current settings for real-time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for efficient convergence on a usable tune. We show real-time comparisons of simulation and experiment and explore the successes and limitations of this close-coupled approach.

  1. Reconceptualizing the Learning Transfer Conceptual Framework: Empirical Validation of a New Systemic Model

    ERIC Educational Resources Information Center

    Kontoghiorghes, Constantine

    2004-01-01

    The main purpose of this study was to examine the validity of a new systemic model of learning transfer and thus determine if a more holistic approach to training transfer could better explain the phenomenon. In all, this study confirmed the validity of the new systemic model and suggested that a high performance work system could indeed serve as…

  2. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while suppressing existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces
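
    The abstract does not detail the two techniques themselves; as one explicitly labeled assumption, integer-factor block averaging down-samples a height map without interpolation (hence no resampling error) while averaging down uncorrelated measurement noise.

        import numpy as np

        def block_average(height_map, factor):
            """Bin an (N*factor, M*factor) surface map down to (N, M) block means."""
            n, m = height_map.shape
            assert n % factor == 0 and m % factor == 0, "map must tile evenly"
            return height_map.reshape(n // factor, factor,
                                      m // factor, factor).mean(axis=(1, 3))

        coarse = block_average(np.random.rand(1024, 1024), 8)  # 128 x 128 result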

  3. A Process Modelling Framework for Formal Validation of Panama Canal System Operations

    E-print Network

    Austin, Mark

    We develop a process modeling framework for the evaluation and formal validation of Panama Canal system operations. The Panama Canal is one of the world's most important waterways. Initially opened for operation in 1914

  4. Why test animals to treat humans? On the validity of animal models

    Microsoft Academic Search

    Cameron Shelley

    2010-01-01

    Critics of animal modeling have advanced a variety of arguments against the validity of the practice. The point of one such form of argument is to establish that animal modeling is pointless and therefore immoral. In this article, critical arguments of this form are divided into three types, the pseudoscience argument, the disanalogy argument, and the predictive validity argument. I

  5. Validation of a biomechanical heart model using animal data with acute myocardial infarction

    E-print Network

    Paris-Sud XI, Université de

    In this paper, we validate a biomechanical heart model using animal data with acute myocardial infarction. The experimental data consisted of animal data obtained with a farm pig of 25 kg.

  6. Experimental identification and validation of an electrochemical model of a Lithium-Ion Battery

    E-print Network

    Stefanopoulou, Anna

    This paper presents an experimental parameter identification and validation for an electrochemical lithium-ion battery model. The identification procedure is based on experimental data collected from a 6.8 Ah lithium-ion battery during charge
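
    The identification step can be sketched as a nonlinear least-squares fit of model parameters to measured charge data; the three-parameter relaxation voltage model below is a toy stand-in for the paper's electrochemical model, and all numbers are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0, 3600, 200)    # time during charge, s (assumed)
        v_meas = 3.3 + 0.7 * (1 - np.exp(-t / 900)) + 0.005 * np.random.randn(t.size)

        def v_model(theta, t):
            v0, dv, tau = theta          # illustrative lumped parameters
            return v0 + dv * (1 - np.exp(-t / tau))

        fit = least_squares(lambda th: v_model(th, t) - v_meas, x0=[3.0, 1.0, 500.0])
        v0_hat, dv_hat, tau_hat = fit.x  # identified parameter estimates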

  7. MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom

    NASA Astrophysics Data System (ADS)

    Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

    The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation to perform Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard the ISS and especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

  8. [C II] emission and star formation in late-type galaxies. II A model

    E-print Network

    D. Pierini; K. J. Leech; H. J. Voelk

    2002-10-30

    We study the relationship between gas cooling via the [C II] (158 micron) line emission and dust cooling via the far-IR continuum emission on the global scale of a galaxy in normal (i.e. non-AGN dominated and non-starburst) late-type systems. It is known that the luminosity ratio of total gas and dust cooling, L(C II)/L(FIR), shows a non-linear behaviour with the equivalent width of the Halpha line emission, the ratio decreasing in galaxies of lower massive star-formation activity. This result holds despite the fact that known individual Galactic and extragalactic sources of the [C II] line emission show different [C II] line-to-far-IR continuum emission ratios. This non-linear behaviour is reproduced by a simple quantitative model of gas and dust heating from different stellar populations, assuming that the photoelectric effect on dust, induced by far-UV photons, is the dominant mechanism of gas heating in the general diffuse interstellar medium of the galaxies under investigation. According to the model, the global L(C II)/L(FIR) provides a direct measure of the fractional amount of non-ionizing UV light in the interstellar radiation field and not of the efficiency of the photoelectric heating. The model also defines a method to constrain the stellar initial mass function from measurements of L(C II) and L(FIR). A sample of 20 Virgo cluster galaxies observed in the [C II] line with the LWS on board ISO is used to illustrate the model. The limited statistics and the necessary assumptions behind the determination of the global [C II] luminosities from the spatially limited data do not allow us to establish definitive conclusions but data-sets available in the future will allow tests of both the reliability of the assumptions of our model and the statistical significance of our results.

  9. Importance of Sea Ice for Validating Global Climate Models

    NASA Technical Reports Server (NTRS)

    Geiger, Cathleen A.

    1997-01-01

    Reproduction of current day large-scale physical features and processes is a critical test of global climate model performance. Without this benchmark, prognoses of future climate conditions are at best speculation. A fundamental question relevant to this issue is, which processes and observations are both robust and sensitive enough to be used for model validation and furthermore are they also indicators of the problem at hand? In the case of global climate, one of the problems at hand is to distinguish between anthropogenic and naturally occuring climate responses. The polar regions provide an excellent testing ground to examine this problem because few humans make their livelihood there, such that anthropogenic influences in the polar regions usually spawn from global redistribution of a source originating elsewhere. Concomitantly, polar regions are one of the few places where responses to climate are non-anthropogenic. Thus, if an anthropogenic effect has reached the polar regions (e.g. the case of upper atmospheric ozone sensitivity to CFCs), it has most likely had an impact globally but is more difficult to sort out from local effects in areas where anthropogenic activity is high. Within this context, sea ice has served as both a monitoring platform and sensitivity parameter of polar climate response since the time of Fridtjof Nansen. Sea ice resides in the polar regions at the air-sea interface such that changes in either the global atmospheric or oceanic circulation set up complex non-linear responses in sea ice which are uniquely determined. Sea ice currently covers a maximum of about 7% of the earth's surface but was completely absent during the Jurassic Period and far more extensive during the various ice ages. It is also geophysically very thin (typically <10 m in Arctic, <3 m in Antarctic) compared to the troposphere (roughly 10 km) and deep ocean (roughly 3 to 4 km). Because of these unique conditions, polar researchers regard sea ice as one of the more important features to monitor in terms of heat, mass, and momentum transfer between the air and sea and furthermore, the impact of such responses to global climate.

  10. Contributions to the validation of the CJS model for granular materials

    NASA Astrophysics Data System (ADS)

    Elamrani, Khadija

    1992-07-01

    Behavior model validation in the field of geotechnics is addressed, with the objective of showing the advantages and limits of the CJS (Cambou Jafari Sidoroff) behavior model for granular materials. Several levels are addressed: theoretical analysis of the CJS model to reveal its consistency and basic capabilities; development (followed by validation against other programs) of a finite element computation code (FINITEL) to integrate this model and prepare it for complex applications; and validation of the code/model structure thus constituted by comparing its results to those of experiments on nonhomogeneous problems (shallow foundations).

  11. Assessment of the Validity of the Double Superhelix Model for Reconstituted High Density Lipoproteins

    PubMed Central

    Jones, Martin K.; Zhang, Lei; Catte, Andrea; Li, Ling; Oda, Michael N.; Ren, Gang; Segrest, Jere P.

    2010-01-01

    For several decades, the standard model for high density lipoprotein (HDL) particles reconstituted from apolipoprotein A-I (apoA-I) and phospholipid (apoA-I/HDL) has been a discoidal particle ~100 Å in diameter and the thickness of a phospholipid bilayer. Recently, Wu et al. (Wu, Z., Gogonea, V., Lee, X., Wagner, M. A., Li, X. M., Huang, Y., Undurti, A., May, R. P., Haertlein, M., Moulin, M., Gutsche, I., Zaccai, G., Didonato, J. A., and Hazen, S. L. (2009) J. Biol. Chem. 284, 36605–36619) used small angle neutron scattering to develop a new model they termed double superhelix (DSH) apoA-I that is dramatically different from the standard model. Their model possesses an open helical shape that wraps around a prolate ellipsoidal type I hexagonal lyotropic liquid crystalline phase. Here, we used three independent approaches, molecular dynamics, EM tomography, and fluorescence resonance energy transfer spectroscopy (FRET) to assess the validity of the DSH model. (i) By using molecular dynamics, two different approaches, all-atom simulated annealing and coarse-grained simulation, show that initial ellipsoidal DSH particles rapidly collapse to discoidal bilayer structures. These results suggest that, compatible with current knowledge of lipid phase diagrams, apoA-I cannot stabilize hexagonal I phase particles of phospholipid. (ii) By using EM, two different approaches, negative stain and cryo-EM tomography, show that reconstituted apoA-I/HDL particles are discoidal in shape. (iii) By using FRET, reconstituted apoA-I/HDL particles show a 28–34 Å intermolecular separation between terminal domain residues 40 and 240, a distance that is incompatible with the dimensions of the DSH model. Therefore, we suggest that, although novel, the DSH model is energetically unfavorable and not likely to be correct. Rather, we conclude that all evidence supports the likelihood that reconstituted apoA-I/HDL particles, in general, are discoidal in shape. PMID:20974855

  12. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    PubMed

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. PMID:24076304

  13. Modeling the Arm II core in MicroCap IV

    SciTech Connect

    Dalton, A.C.

    1996-11-01

    This paper reports on how an electrical model for the core of the ARM II machine was created and how to use this model. We wanted a model of the electrical characteristics of the ARM II core in order to simulate the machine and to assist in the design of a future machine. The model had to be able to simulate saturation, variable loss, and reset. Using the Hodgdon hysteresis model and the circuit analysis program MicroCap IV, this was accomplished. The paper is written so that someone not familiar with the project can understand it.
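
    For orientation, a commonly cited form of the Hodgdon hysteresis law referred to above relates flux density B to applied field H through a rate equation; the coefficient functions fitted for the ARM II core are not given in this record, so f, g, and the constant \alpha below are generic placeholders:

      \frac{dB}{dt} = \alpha \left| \frac{dH}{dt} \right| \left[ f(H) - B \right] + \frac{dH}{dt}\, g(H)

    Saturation is carried by the bounding function f(H), the reversible part of the response by g(H), and hysteresis loss appears as the area of the simulated B-H loop.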

  14. Development of a Hydraulic Manipulator Servoactuator Model: Simulation and Experimental Validation

    E-print Network

    Papadopoulos, Evangelos

    Development of a Hydraulic Manipulator Servoactuator Model: Simulation and Experimental Validation Abstract In this paper, modelling and identification of a hydraulic servoactuator system is presented, leakage, and load dynamics. System parameters are identified based on a high-performance hydraulic

  15. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    NASA Astrophysics Data System (ADS)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, needed to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) calibration of the seismic intensity attenuation model against macroseismic observations and maps from past earthquakes in Algeria; and (2) calculation of country-specific vulnerability modifiers from past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated using "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli events. The calculated return periods of the losses for the client market portfolio align with the repeatability of such catastrophe losses in the country. The validation process also included collaboration between Aon Benfield and its client in order to account for the insurance market penetration in Algeria, estimated at approximately 5%. We believe that the applied approach led to an earthquake model for Algeria that is scientifically sound and reliable on the one hand, and market- and client-oriented on the other.
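
    To make the quoted cost factors concrete, a mean damage ratio for a building class follows directly from a damage-grade distribution; a minimal sketch in Python, where the grade probabilities are invented for illustration and only the cost factors come from the abstract above:

      # EMS-98 damage grades mapped to the rebuilding cost factors quoted above.
      damage_cost_factor = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

      def mean_damage_ratio(grade_probs):
          """Expected repair cost as a fraction of the replacement value."""
          return sum(p * damage_cost_factor[g] for g, p in grade_probs.items())

      # Hypothetical probabilities of reaching each damage grade for one
      # building class at a given macroseismic intensity (illustrative only).
      probs = {1: 0.30, 2: 0.25, 3: 0.20, 4: 0.15, 5: 0.10}
      print(f"mean damage ratio: {mean_damage_ratio(probs):.3f}")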

  16. ShipIR model validation using NATO SIMVEX experiment results

    NASA Astrophysics Data System (ADS)

    Fraedrich, Doug S.; Stark, Espen; Heen, Lars T.; Miller, Craig

    2003-09-01

    An infrared field trial has been conducted by TG-16, a NATO science panel on IR ship signatures. The trial was planned, designed, and executed for the express purpose of validating predictive IR ship signature simulations. The details of the trial were dictated by a careful validation methodology that exploits the concept of "experimental precision." Two governmental defense laboratories, the Norwegian Defence Research Establishment and the US Naval Research Laboratory, have used the trial data to perform a validation analysis of the ShipIR signature code. This analysis quantifies the prediction accuracy of current versions of the code and identifies specific portions of the code that need to be upgraded to improve prediction accuracy.

  17. Modeling Heart Rate Regulation--Part II: Parameter Identification and Analysis

    E-print Network

    Olufsen, Mette Sofie

    In Part I of this study we introduced a 17-parameter model that can predict heart rate regulation during postural change. The model uses blood pressure data as input to predict the heart rate while…

  18. Cross-validation pitfalls when selecting and assessing regression and classification models

    PubMed Central

    2014-01-01

    Background We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. Methods We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. Results We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. Conclusions We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error. PMID:24678909
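
    As a minimal sketch of the two algorithms named above (repeated grid-search cross-validation for tuning, repeated nested cross-validation for assessment), assuming scikit-learn and a synthetic dataset in place of a real QSAR set:

      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
      from sklearn.svm import SVR

      X, y = make_regression(n_samples=200, n_features=20, noise=0.5, random_state=0)
      param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}

      outer_rmse = []
      for rep in range(5):  # repetitions with different fold splits
          inner = KFold(n_splits=5, shuffle=True, random_state=rep)
          outer = KFold(n_splits=5, shuffle=True, random_state=100 + rep)
          tuner = GridSearchCV(SVR(), param_grid, cv=inner,
                               scoring="neg_root_mean_squared_error")
          # Nested CV: the grid search itself is cross-validated on outer folds.
          scores = cross_val_score(tuner, X, y, cv=outer,
                                   scoring="neg_root_mean_squared_error")
          outer_rmse.extend(-scores)

      print(f"RMSE: {np.mean(outer_rmse):.3f} +/- {np.std(outer_rmse):.3f}")

    The spread of outer_rmse across repetitions is exactly the split-induced variation the authors warn about.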

  19. Design of embedded systems: formal models, validation, and synthesis

    Microsoft Academic Search

    Stephen Edwards; Luciano Lavagno; Edward A. Lee; Alberto Sangiovanni-Vincentelli

    1997-01-01

    This paper addresses the design of reactive real-time embedded systems. Such systems are often heterogeneous in implementation technologies and design styles, for example by combining hardware application-specific integrated circuits (ASICs) with embedded software. The concurrent design process for such embedded systems involves solving the specification, validation, and synthesis problems. We review the variety of approaches to these problems that have

  20. Examining the Reliability and Validity of Clinician Ratings on the Five-Factor Model Score Sheet

    PubMed Central

    Few, Lauren R.; Miller, Joshua D.; Morse, Jennifer Q.; Yaggi, Kirsten E.; Reynolds, Sarah K.; Pilkonis, Paul A.

    2013-01-01

    Despite substantial research use, measures of the five-factor model (FFM) are infrequently used in clinical settings due, in part, to issues related to administration time and a reluctance to use self-report instruments. The current study examines the reliability and validity of the Five-Factor Model Score Sheet (FFMSS), which is a 30-item clinician rating form designed to assess the five domains and 30 facets of one conceptualization of the FFM. In a study of 130 outpatients, clinical raters demonstrated reasonably good interrater reliability across personality profiles, and the domains manifested good internal consistency with the exception of Neuroticism. The FFMSS ratings also evinced expected relations with self-reported personality traits (e.g., FFMSS Extraversion and Schedule for Nonadaptive and Adaptive Personality Positive Temperament) and consensus-rated personality disorder symptoms (e.g., FFMSS Agreeableness and Narcissistic Personality Disorder). Finally, on average, the FFMSS domains were able to account for approximately 50% of the variance in domains of functioning (e.g., occupational, parental) and were even able to account for variance after controlling for Axis I and Axis II pathology. Given these findings, it is believed that the FFMSS holds promise for clinical use. PMID:20519735

  1. Computational modeling and validation of intraventricular flow in a simple model of the left ventricle

    NASA Astrophysics Data System (ADS)

    Vedula, Vijay; Fortini, Stefania; Seo, Jung-Hee; Querzoli, Giorgio; Mittal, Rajat

    2014-12-01

    Simulations of flow inside a laboratory model of the left ventricle are validated against experiments. The simulations employ an immersed boundary-based method for flow modeling, and the computational model of the expanding-contracting ventricle is constructed via image segmentation. A quantitative comparison of the phase-averaged velocity and vorticity fields between the simulation and the experiment shows a reasonable agreement, given the inherent uncertainties in the modeling procedure. Simulations also exhibit a good agreement in terms of time-varying net circulation, as well as clinically important metrics such as flow-wave propagation velocity and its ratio with peak early-wave flow velocity. The detailed and critical assessment of this comparison is used to identify and discuss the key challenges that are faced in such a validation study.

  2. Quantum theory of the Bianchi II model

    NASA Astrophysics Data System (ADS)

    Bergeron, Hervé; Hrycyna, Orest; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2014-08-01

    We describe the quantum evolution of the vacuum Bianchi II universe in terms of the transition amplitude between two asymptotic quantum Kasner-like states. For large values of the momentum variable, the classical and quantum calculations give similar results. The difference occurs for small values of this variable due to the Heisenberg uncertainty principle. Our results can be used, to some extent, as a building block of the quantum evolution of the vacuum Bianchi IX universe.

  3. A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection

    Microsoft Academic Search

    Ron Kohavi

    1995-01-01

    We review accuracy estimation methods and compare the two most common methods: cross-validation and bootstrap. Recent experimental results on artificial data and theoretical results in restricted settings have shown that for selecting a good classifier from a set of classifiers (model selection), ten-fold cross-validation may be better than the more expensive leave-one-out cross-validation. We report…

  4. 78 FR 5765 - Wireline Competition Bureau Releases Connect America Phase II Cost Model Virtual Workshop...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ...Wireline Competition Bureau Releases Connect America Phase II Cost Model Virtual Workshop Discussion...forward-looking cost model for Connect America Phase II. DATES: Comments are due on...currently provided, known as Connect America Phase II. The Commission delegated...

  5. Test cell modeling and optimization for FPD-II

    SciTech Connect

    Haney, S.W.; Fenstermacher, M.E.

    1985-04-10

    The Fusion Power Demonstration, Configuration II (FPD-II), will be a DT-burning tandem mirror facility with thermal barriers, designed as the next-step engineering test reactor (ETR) to follow the tandem mirror ignition test machines. Current plans call for FPD-II to be a multi-purpose device. For approximately the first half of its lifetime, it will operate as a high-Q ignition machine designed to reach or exceed engineering break-even and to demonstrate the technological feasibility of tandem mirror fusion. The second half of its operation will focus on the evaluation of candidate reactor blanket designs using a neutral-beam-driven test cell inserted at the midplane of the 90 m long cell. This machine, called FPD-II+T, uses an insert configuration similar to that used in the MFTF-α+T study. The modeling and optimization of FPD-II+T are the topic of the present paper.

  6. Schizosaccharomyces pombe and its Ni(II)-insensitive mutant GA1 in Ni(II) uptake from aqueous solutions: a biodynamic model.

    PubMed

    Sayar, Nihat Alpagu; Durmaz-Sam, Selcen; Kazan, Dilek; Sayar, Ahmet Alp

    2014-08-01

    In the present study, Ni(II) uptake from aqueous solution by living cells of the Schizosaccharomyces pombe haploid 972 with h(-) mating type and a Ni(II)-insensitive mutant GA1 derived from 972 was investigated at various initial glucose and Ni(II) concentrations. A biodynamic model was developed to predict the unsteady and steady-state phases of the uptake process. Gompertz growth and uptake process parameters were optimized to predict the maximum growth rate μm and the process metric Cr, the remaining Ni(II) content of the aqueous solution. The simulated overall metal uptake values were found to be in acceptable agreement with experimental results. The model validation was done through regression statistics and uncertainty and sensitivity analyses. To gain insight into the phenomenon of Ni(II) uptake by wild-type and mutant S. pombe, probable active and passive metal transport mechanisms in yeast cells were discussed in view of the simulation results. The present work revealed the potential of mutant GA1 to remove Ni(II) cations from aqueous media. The results obtained provided new insights for understanding the combined effect of biosorption and bioaccumulation processes for metal removal and offered a possibility for the use of growing mutant S. pombe cells in bioremediation. PMID:24752843
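
    As a rough illustration of the growth component, a standard (Zwietering-type) Gompertz parameterization can be fitted to time-course data; the paper's exact parameterization and data are not reproduced in this record, so the functional form and all numbers below are assumptions:

      import numpy as np
      from scipy.optimize import curve_fit

      def gompertz(t, A, mu_m, lam):
          """Zwietering-form Gompertz curve: A = asymptote, mu_m = maximum
          specific growth rate, lam = lag time (all values illustrative)."""
          return A * np.exp(-np.exp(mu_m * np.e / A * (lam - t) + 1.0))

      t = np.linspace(0, 48, 13)                     # sampling times, h
      y = gompertz(t, 2.0, 0.15, 6.0)                # synthetic biomass data
      y = y + np.random.default_rng(1).normal(0, 0.02, t.size)  # noise

      (A, mu_m, lam), _ = curve_fit(gompertz, t, y, p0=[1.5, 0.1, 5.0])
      print(f"A = {A:.2f}, mu_m = {mu_m:.3f} 1/h, lag = {lam:.1f} h")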

  7. Systematic clinical methodology for validating bipolar-II disorder: data in mid-stream from a French national multi-site study (EPIDEP)

    Microsoft Academic Search

    Elie G Hantouche; Hagop S Akiskal; Sylvie Lancrenon; Jean-François Allilaire; Daniel Sechter; Jean-Michel Azorin; Marc Bourgeois; Jean-Philippe Fraud; Liliane Châtenet-Duchêne

    1998-01-01

    Background: This paper presents the methodology and clinical data in mid-stream from a French multi-center study (EPIDEP) in progress on a national sample of patients with DSM-IV major depressive episode (MDE). The aim of EPIDEP is to show the feasibility of validating the spectrum of soft bipolar disorders by practising clinicians. In this report, we focus on bipolar II (BP-II).

  8. A Validated Model of Calf Compression and Deep Vessel Collapse During External Cuff Inflation

    Microsoft Academic Search

    A. J. Narracott; G. W. John; R. J. Morris; J. P. Woodcock; D. R. Hose; P. V. Lawford

    2009-01-01

    This paper presents a validated model of calf compression with an external pressure cuff as used for deep vein thrombosis. Magnetic resonance (MR) images of calf geometry were used to generate subject-specific finite-element (FE) models of the calf cross section. Ultrasound images of deep vessel collapse obtained through a water-filled cuff were used to validate model behavior. Calf\\/cuff pressure interface

  9. The Sandia MEMS Passive Shock Sensor : FY08 testing for functionality, model validation, and technology readiness.

    SciTech Connect

    Walraven, Jeremy Allen; Blecke, Jill; Baker, Michael Sean; Clemens, Rebecca C.; Mitchell, John Anthony; Brake, Matthew Robert; Epp, David S.; Wittwer, Jonathan W.

    2008-10-01

    This report summarizes the functional, model validation, and technology readiness testing of the Sandia MEMS Passive Shock Sensor in FY08. Functional testing of a large number of revision 4 parts showed robust and consistent performance. Model validation testing helped tune the models to match data well and identified several areas for future investigation related to high frequency sensitivity and thermal effects. Finally, technology readiness testing demonstrated the integrated elements of the sensor under realistic environments.

  10. Development and validation of instantaneous risk model in nuclear power plant's risk monitor

    SciTech Connect

    Wang, J.; Li, Y.; Wang, F.; Wang, J.; Hu, L. [Inst. of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); School of Nuclear Science and Technology, Univ. of Science and Technology of China, Hefei, Anhui, 230031 (China)

    2012-07-01

    The instantaneous risk model is the foundation of calculation and analysis in a risk monitor. This study focused on the development and validation of an instantaneous risk model. The principles for converting the baseline risk model into the instantaneous risk model were studied, and a method for modeling separated trains' failure modes was developed. The development and validation process in an operating nuclear power plant's risk monitor is also introduced. The correctness of the instantaneous risk model and the soundness of the conversion method were demonstrated by comparison with the results of the baseline risk model. (authors)

  11. Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--7.

    PubMed

    Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

    2012-01-01

    Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well the model reproduces reality). This report describes recommendations for achieving transparency and validation, developed by a task force appointed by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. Recommendations were developed iteratively by the authors. A nontechnical description, including model type, intended applications, funding sources, structure, inputs, outputs, other components that determine function and their relationships, data sources, validation methods, results, and limitations, should be made available to anyone. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (checking the accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results with real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest forms of validation. Each section of this article contains a number of recommendations that were iterated among the authors, as well as among the wider modeling task force, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. PMID:22999134

  12. Understanding Student Teachers' Behavioural Intention to Use Technology: Technology Acceptance Model (TAM) Validation and Testing

    ERIC Educational Resources Information Center

    Wong, Kung-Teck; Osman, Rosma bt; Goh, Pauline Swee Choo; Rahmat, Mohd Khairezan

    2013-01-01

    This study sets out to validate and test the Technology Acceptance Model (TAM) in the context of Malaysian student teachers' integration of their technology in teaching and learning. To establish factorial validity, data collected from 302 respondents were tested against the TAM using confirmatory factor analysis (CFA), and structural equation…

  13. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  14. Validation of a New Conceptual Model of School Connectedness and Its Assessment Measure

    ERIC Educational Resources Information Center

    Hirao, Katsura

    2011-01-01

    A self-report assessment scale of school connectedness was validated in this study based on the data from middle-school children in a northeastern state of the United States (n = 145). The scale was based on the School Bonding Model (Morita, 1991), which was derived reductively from the social control (bond) theory (Hirschi, 1969). This validation

  15. The Validity of Classical Nucleation Theory for Ising Models Seunghwa Ryu1

    E-print Network

    Cai, Wei

    The Validity of Classical Nucleation Theory for Ising Models. Seunghwa Ryu and Wei Cai. While classical nucleation theory (CNT) is widely used to predict the rate of first-order phase transitions, its validity has been… The steady-state solution of the Markov chain predicts the nucleation rate to be I = f_c^+ exp(-F_c / k_B T) (1).

  16. Validated risk stratification model accurately predicts low risk in patients with unstable angina

    Microsoft Academic Search

    James E Calvin; Lloyd W Klein; Elizabeth J VandenBerg; Peter Meyer; Joseph E Parrillo

    2000-01-01

    BACKGROUND: In the mid-1990s, two unstable angina risk prediction models were proposed, but neither has been validated on a separate population or compared with the other. OBJECTIVES: The purpose of this study was to compare patient outcome among high-, medium- and low-risk unstable angina patients as defined by the Agency for Health Care Policy and Research (AHCPR) guideline to similar risk groups defined by a validated…

  17. Validating the Model of Lifestyle Balance on a Working Swedish Population

    Microsoft Academic Search

    Petra Wagman; Carita Håkansson; Kathleen M. Matuska; Anita Björklund; Torbjörn Falkmer

    2012-01-01

    An analysis of data from a previously conducted grounded theory study exploring perceptions of life balance among 19 working adults without recent long term sick leave was carried out. The aim of this secondary analysis was to use these perceptions of life balance to validate the Model of Lifestyle Balance proposed by Matuska and Christiansen. For the validation, a matrix

  19. Steam generator steady-state model for on-line data validation. [LMFBR

    SciTech Connect

    Tzanos, C.P.

    1984-01-01

    To develop an efficient algorithm for on-line plant-wide data validation and fault identification, fast-running computer models that adequately describe the different plant processes are required. For example, if the data validation interval is of the order of one second, these models must run in less than one second. This paper presents a fast-running model for steady-state analysis of a once-through LMFBR steam generator. In computer codes like DSNP and SASSYS, the computation time for steady-state analysis of a typical once-through LMFBR steam generator is approximately 5 to 7 seconds. This time imposes excessively long validation intervals.

  20. Criterion for evaluating the predictive ability of nonlinear regression models without cross-validation.

    PubMed

    Kaneko, Hiromasa; Funatsu, Kimito

    2013-09-23

    We propose predictive performance criteria for nonlinear regression models that do not require cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error for the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, a situation in which cross-validation cannot be performed. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and quantitative structure-activity relationships, we confirm that the proposed criteria enable the predictive ability of nonlinear regression models to be appropriately quantified. PMID:23971910
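
    A sketch of the midpoint idea in Python; since this record does not spell out the reference value used at each midpoint, the sketch assumes it is the mean of the two neighbours' responses, which should be checked against the original paper:

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def midpoint_rmse(model, X, y, k=5):
          """RMSE of a fitted model's predictions at midpoints between each
          sample and its k nearest neighbours, scored against the mean of
          the endpoint responses (an assumed reference). X, y are arrays."""
          _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
          mids, refs = [], []
          for i, neighbours in enumerate(idx):
              for j in neighbours[1:]:              # skip the point itself
                  mids.append((X[i] + X[j]) / 2.0)
                  refs.append((y[i] + y[j]) / 2.0)
          pred = model.predict(np.asarray(mids))
          return float(np.sqrt(np.mean((pred - np.asarray(refs)) ** 2)))

    Calling midpoint_rmse(fitted_model, X_train, y_train) needs no held-out data, which is the point of the criterion.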

  1. Composing Different Models of Computation in Kepler and Ptolemy II

    Microsoft Academic Search

    Antoon Goderis; Christopher Brooks; Ilkay Altintas; Edward A. Lee; Carole A. Goble

    2007-01-01

    A model of computation (MoC) is a formal abstraction of execution in a computer. There is a need for composing MoCs in e-science. Kepler, which is based on Ptolemy II, is a scientific workflow environment that allows for MoC composition. This paper explains how MoCs are combined in Kepler and Ptolemy II and analyzes which combinations of MoCs are currently

  2. A free wake vortex lattice model for vertical axis wind turbines: Modeling, verification and validation

    NASA Astrophysics Data System (ADS)

    Meng, Fanzhong; Schwarze, Holger; Vorpahl, Fabian; Strobel, Michael

    2014-12-01

    Since the 1970s, several research activities have been carried out on developing aerodynamic models for Vertical Axis Wind Turbines (VAWTs). In order to design large VAWTs of MW scale, more accurate aerodynamic calculation is required to predict their aero-elastic behaviour. In this paper, a 3D free wake vortex lattice model for VAWTs is developed, verified and validated. Comparisons with experimental results show that the 3D free wake vortex lattice model is capable of accurately predicting the general performance and the instantaneous aerodynamic forces on the blades. The comparison between the momentum method and the vortex lattice model shows that free wake vortex models are needed for detailed load calculations and for highly loaded rotors.
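
    The kernel of any vortex lattice code is the induced velocity of a straight vortex filament segment given by the Biot-Savart law; below is a minimal implementation of that standard formula, not code from the model described above:

      import numpy as np

      def segment_induced_velocity(p, a, b, gamma, eps=1e-9):
          """Velocity induced at point p by a straight vortex segment from
          a to b carrying circulation gamma (Biot-Savart law)."""
          r1, r2 = p - a, p - b
          cross = np.cross(r1, r2)
          denom = np.dot(cross, cross)
          if denom < eps:               # point lies (almost) on the filament
              return np.zeros(3)
          r0 = b - a
          return (gamma / (4.0 * np.pi * denom) * cross *
                  np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2)))

    Summing this kernel over all bound lattice segments and free wake elements gives the induced flow field used in the load prediction.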

  3. Validation of mechanical models for reinforced concrete structures: Presentation of the French project ``Benchmark des Poutres de la Rance''

    NASA Astrophysics Data System (ADS)

    L'Hostis, V.; Brunet, C.; Poupard, O.; Petre-Lazar, I.

    2006-11-01

    Several ageing models are available for the prediction of the mechanical consequences of rebar corrosion. They are used for service-life prediction of reinforced concrete structures. Concerning corrosion diagnosis of reinforced concrete, some Non-Destructive Testing (NDT) tools have been developed and have been in use for some years. However, these developments require validation on existing concrete structures. The French project “Benchmark des Poutres de la Rance” contributes to this aspect. It has two main objectives: (i) validation of mechanical models to estimate the influence of rebar corrosion on the load-bearing capacity of a structure, and (ii) qualification of the use of NDT results to collect information on steel corrosion within reinforced concrete structures. Ten French and European institutions from both academic research laboratories and industrial companies contributed during the years 2004 and 2005. This paper presents the project, which was divided into several work packages: (i) the reinforced concrete beams were characterized with non-destructive testing tools, (ii) the mechanical behaviour of the beams was experimentally tested, (iii) complementary laboratory analyses were performed, and (iv) numerical simulation results were compared to the experimental results obtained in the mechanical tests.

  4. Model transparency and validation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-7.

    PubMed

    Eddy, David M; Hollingworth, William; Caro, J Jaime; Tsevat, Joel; McDonald, Kathryn M; Wong, John B

    2012-01-01

    Trust and confidence are critical to the success of health care models. There are two main methods for achieving this: transparency (people can see how the model is built) and validation (how well it reproduces reality). This report describes recommendations for achieving transparency and validation, developed by a task force appointed by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM). Recommendations were developed iteratively by the authors. A nontechnical description should be made available to anyone, including model type and intended applications; funding sources; structure; inputs, outputs, other components that determine function, and their relationships; data sources; validation methods and results; and limitations. Technical documentation, written in sufficient detail to enable a reader with necessary expertise to evaluate the model and potentially reproduce it, should be made available openly or under agreements that protect intellectual property, at the discretion of the modelers. Validation involves face validity (wherein experts evaluate model structure, data sources, assumptions, and results), verification or internal validity (checking the accuracy of coding), cross validity (comparison of results with other models analyzing the same problem), external validity (comparing model results to real-world results), and predictive validity (comparing model results with prospectively observed events). The last two are the strongest forms of validation. Each section of this paper contains a number of recommendations that were iterated among the authors, as well as the wider modeling task force jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. PMID:22990088

  5. A multi-source satellite data approach for modelling Lake Turkana water level: calibration and validation using satellite altimetry data

    NASA Astrophysics Data System (ADS)

    Velpuri, N. M.; Senay, G. B.; Asante, K. O.

    2012-01-01

    Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated over a range of up to 4 m between the years 1998 and 2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins.
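
    For reference, the two skill scores quoted above have standard definitions; a minimal implementation (function names are ours):

      import numpy as np

      def nsce(obs, sim):
          """Nash-Sutcliffe Coefficient of Efficiency: 1 is a perfect fit,
          0 means the model is no better than the observed mean."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pearson_r(obs, sim):
          return float(np.corrcoef(obs, sim)[0, 1])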

  6. A multi-source satellite data approach for modelling Lake Turkana water level: Calibration and validation using satellite altimetry data

    USGS Publications Warehouse

    Velpuri, N.M.; Senay, G.B.; Asante, K.O.

    2012-01-01

    Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated over a range of up to 4 m between the years 1998 and 2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance model for (i) quantitative assessment of the impact of basin developmental activities on lake levels and for (ii) forecasting lake level changes and their impact on fisheries. From this study, we suggest that globally available satellite altimetry data provide a unique opportunity for calibration and validation of hydrologic models in ungauged basins. © Author(s) 2012.

  7. Verification and Validation of Simulation Models and Applications: A Methodological Approach

    NASA Astrophysics Data System (ADS)

    Wang, Zhongshi; Lehmann, Axel

    Under time and cost pressure, the importance of modelling and simulation (M&S) techniques for the analysis of dynamic systems behaviour is permanently increasing. With the growing embedding and networking of computing and telecommunication systems, the complexity of real systems applications keeps rising. As a consequence, the complexity of models and simulation applications is also increasing, and the demand for appropriate verification and validation methods, techniques, and tools to guarantee a model's credibility becomes urgent. The basic requirements for analysing a model's credibility are to verify the model's correctness and to validate the model with respect to predefined application purposes and validity criteria. This requires that the different design, development and application phases of M&S be well specified and the results of each phase well documented.

  8. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
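
    A heavily simplified sketch of the three-stage structure, assuming a lumped droplet energy balance with a constant heat-transfer coefficient; every parameter value below is illustrative and not taken from the cocoa butter validation:

      # Stage 1: liquid cooling; stage 2: solidification at the freezing
      # temperature tracked by a progress variable f; stage 3: solid cooling.
      h, A = 150.0, 3.1e-6               # heat-transfer coeff. (W/m^2 K), area (m^2)
      m, cp, L = 5.2e-7, 2000.0, 1.5e5   # mass (kg), c_p (J/kg K), latent heat (J/kg)
      T_gas, T_freeze = 20.0, 26.0       # gas and freezing temperatures (deg C)
      dt, T, f = 1e-3, 40.0, 0.0         # time step (s), droplet temp, progress

      for _ in range(200_000):           # 200 s of simulated time
          q = h * A * (T - T_gas)        # convective heat loss (W)
          if T > T_freeze:               # stage 1: liquid cooling
              T -= q * dt / (m * cp)
          elif f < 1.0:                  # stage 2: latent heat balances the loss
              f += q * dt / (m * L)
          else:                          # stage 3: solid cooling
              T -= q * dt / (m * cp)

      print(f"final T = {T:.1f} C, frozen fraction = {f:.2f}")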

  9. Transient PVT measurements and model predictions for vessel heat transfer. Part II.

    SciTech Connect

    Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.; Evans, Gregory Herbert; Rice, Steven F.

    2010-07-01

    Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.

  10. Cephalopod coloration model. II. Multiple layer skin effects

    E-print Network

    Hanlon, Roger T.

    Cephalopod coloration model. II. Multiple layer skin effects. Richard L. Sutherland, Lydia M.… A mathematical model of multiple-layer skin coloration in cephalopods, a class of aquatic animals, is presented. … underlying cephalopod coloration is expected to yield insights into their possible functions. © 2008 Optical…

  11. Multiperiod Multiproduct Advertising Budgeting. Part II: Stochastic Optimization Modeling

    E-print Network

    Beltran-Royo, Cesar

    Multiperiod Multiproduct Advertising Budgeting. Part II: Stochastic Optimization Modeling. … for the Multiperiod Multiproduct Advertising Budgeting problem, so that the expected profit of the advertising … of standard optimization software. The model has been tested for planning a realistic advertising campaign.

  12. TILTING SATURN. II. NUMERICAL MODEL Douglas P. Hamilton

    E-print Network

    Hamilton, Douglas P.

    TILTING SATURN. II. NUMERICAL MODEL. Douglas P. Hamilton, Department of Astronomy, University of Maryland. Received 2003 December 30; accepted 2004 July 15. ABSTRACT: We argue that the gas giants Jupiter and Saturn … of Saturn's rotation to that of Neptune's orbit. Strong support for this model comes from (1) a near-match…

  13. Tilting Saturn II. Numerical Model Douglas P. Hamilton

    E-print Network

    Hamilton, Douglas P.

    Tilting Saturn II. Numerical Model. Douglas P. Hamilton, Astronomy Department, University of Maryland. We argue that the gas giants Jupiter and Saturn were both formed with their rotation axes nearly perpendicular to their orbital planes … of Saturn's rotation to that of Neptune's orbit. Strong support for this model comes from (i) a near match…

  14. Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU

    Microsoft Academic Search

    Y. C. Ko; L. W. Hu; A. P. Olson; F. E. Dunn

    2007-01-01

    An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced

  15. Validation of the MULCH-II code for thermal-hydraulic safety analysis of the MIT research reactor conversion to LEU

    Microsoft Academic Search

    Y.-C. Ko; L.-W. Hu; Arne P. Olson; Floyd E. Dunn

    2008-01-01

    An in-house thermal hydraulics code was developed for the steady-state and loss of primary flow analysis of the MIT Research Reactor (MITR). This code is designated as MULti-CHannel-II or MULCH-II. The MULCH-II code is being used for the MITR LEU conversion design study. Features of the MULCH-II code include a multi-channel analysis, the capability to model the transition from forced

  16. High-speed AMB machining spindle model updating and model validation

    NASA Astrophysics Data System (ADS)

    Wroblewski, Adam C.; Sawicki, Jerzy T.; Pesch, Alexander H.

    2011-04-01

    High-Speed Machining (HSM) spindles equipped with Active Magnetic Bearings (AMBs) have been envisioned to be capable of automated self-identification and self-optimization in efforts to accurately calculate parameters for stable high-speed machining operation. With this in mind, this work presents rotor model development accompanied by automated model-updating methodology followed by updated model validation. The model updating methodology is developed to address the dynamic inaccuracies of the nominal open-loop plant model when compared with experimental open-loop transfer function data obtained by the built in AMB sensors. The nominal open-loop model is altered by utilizing an unconstrained optimization algorithm to adjust only parameters that are a result of engineering assumptions and simplifications, in this case Young's modulus of selected finite elements. Minimizing the error of both resonance and anti-resonance frequencies simultaneously (between model and experimental data) takes into account rotor natural frequencies and mode shape information. To verify the predictive ability of the updated rotor model, its performance is assessed at the tool location which is independent of the experimental transfer function data used in model updating procedures. Verification of the updated model is carried out with complementary temporal and spatial response comparisons substantiating that the updating methodology is effective for derivation of open-loop models for predictive use.
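
    A toy version of the updating step, assuming a two-degree-of-freedom stand-in for the rotor model in which two stiffness parameters (playing the role of the element-wise Young's moduli) are tuned so that the model's natural frequencies match "measured" ones:

      import numpy as np
      from scipy.optimize import minimize

      def natural_freqs(k, m=(1.0, 1.0)):
          """Natural frequencies (rad/s) of a 2-DOF spring-mass chain."""
          k1, k2 = k
          M = np.diag(m)
          K = np.array([[k1 + k2, -k2], [-k2, k2]])
          evals = np.real(np.linalg.eigvals(np.linalg.solve(M, K)))
          return np.sort(np.sqrt(np.abs(evals)))  # abs guards optimizer excursions

      measured = np.array([8.0, 21.0])            # illustrative "test" data (rad/s)

      res = minimize(lambda k: np.sum((natural_freqs(k) - measured) ** 2),
                     x0=[100.0, 100.0], method="Nelder-Mead")
      print(res.x, natural_freqs(res.x))

    The real procedure differs mainly in scale (finite-element stiffnesses, resonance and anti-resonance targets from the built-in AMB sensors), not in structure.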

  17. WIZER: What-If Analyzer for Automated Social Model Space Exploration and Validation

    E-print Network

    Sadeh, Norman M.

    WIZER: What-If Analyzer for Automated Social Model Space Exploration and Validation. Alex Yahja. … model space exploration [Prietula, Carley & Gasser 1998]. One of the computational models is the multi…

  18. Resampling methods for meta-model validation with recommendations for evolutionary computation.

    PubMed

    Bischl, B; Mersmann, O; Trautmann, H; Weihs, C

    2012-01-01

    Meta-modeling has become a crucial tool in solving expensive optimization problems. Much of the work in the past has focused on finding a good regression method to model the fitness function. Examples include classical linear regression, splines, neural networks, Kriging and support vector regression. This paper specifically draws attention to the fact that assessing model accuracy is a crucial aspect in the meta-modeling framework. Resampling strategies such as cross-validation, subsampling, bootstrapping, and nested resampling are prominent methods for model validation and are systematically discussed with respect to possible pitfalls, shortcomings, and specific features. A survey of meta-modeling techniques within evolutionary optimization is provided. In addition, practical examples illustrating some of the pitfalls associated with model selection and performance assessment are presented. Finally, recommendations are given for choosing a model validation technique for a particular setting. PMID:22339368
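
    As one concrete instance of the surveyed strategies, an out-of-bag bootstrap estimate of a surrogate model's error, here with a Gaussian-process (Kriging-type) surrogate on a synthetic fitness function; all choices are illustrative:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform(-2, 2, size=(60, 2))
      y = np.sum(X ** 2, axis=1)                 # toy "fitness" function

      errs = []
      for _ in range(50):                        # bootstrap replicates
          idx = rng.integers(0, len(X), len(X))
          oob = np.setdiff1d(np.arange(len(X)), idx)   # out-of-bag points
          if oob.size == 0:
              continue
          gp = GaussianProcessRegressor(alpha=1e-6)    # jitter for duplicate rows
          gp.fit(X[idx], y[idx])
          errs.append(np.sqrt(np.mean((gp.predict(X[oob]) - y[oob]) ** 2)))

      print(f"OOB RMSE: {np.mean(errs):.3f} +/- {np.std(errs):.3f}")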

  19. On the validity of 3D polymer gel dosimetry: II. Physico-chemical effects

    NASA Astrophysics Data System (ADS)

    Vandecasteele, Jan; De Deene, Yves

    2013-01-01

    This study quantifies some major physico-chemical factors that influence the validity of MRI (PAGAT) polymer gel dosimetry: temperature history (pre-, during and post-irradiation), oxygen exposure (post-irradiation) and volumetric effects (experiment with phantom in which a small test tube is inserted). Present results confirm the effects of thermal history prior to irradiation. By exposing a polymer gel sample to a linear temperature gradient of ~2.8 °C cm⁻¹ and following the dose deviation as a function of post-irradiation time new insights into temporal variations were added. A clear influence of the temperature treatment on the measured dose distribution is seen during the first hours post-irradiation (resulting in dose deviations up to 12%). This effect diminishes to 5% after 54 h post-irradiation. Imposing a temperature offset (maximum 6 °C for 3 h) during and following irradiation on a series of calibration phantoms results in only a small dose deviation of maximum 4%. Surprisingly, oxygen diffusing in a gel dosimeter up to 48 h post-irradiation was shown to have no effect. Volumetric effects were studied by comparing the dose distribution in a homogeneous phantom compared to the dose distribution in a phantom in which a small test tube was inserted. This study showed that the dose measured inside the test tube was closer to the ion chamber measurement in comparison to the reference phantom without test tube by almost 7%. It is demonstrated that physico-chemical effects are not the major causes for the dose discrepancies encountered in the reproducibility study discussed in the concurrent paper (Vandecasteele and De Deene 2013a Phys. Med. Biol. 58 19-42). However, it is concluded that these physico-chemical effects are important factors that should be addressed to further improve the dosimetric accuracy of 3D MRI polymer gel dosimetry. Both authors contributed equally to this study.

  20. On the validity of 3D polymer gel dosimetry: II. physico-chemical effects.

    PubMed

    Vandecasteele, Jan; De Deene, Yves

    2013-01-01

    This study quantifies some major physico-chemical factors that influence the validity of MRI (PAGAT) polymer gel dosimetry: temperature history (pre-, during and post-irradiation), oxygen exposure (post-irradiation) and volumetric effects (experiment with phantom in which a small test tube is inserted). Present results confirm the effects of thermal history prior to irradiation. By exposing a polymer gel sample to a linear temperature gradient of ~2.8 °C cm⁻¹ and following the dose deviation as a function of post-irradiation time new insights into temporal variations were added. A clear influence of the temperature treatment on the measured dose distribution is seen during the first hours post-irradiation (resulting in dose deviations up to 12%). This effect diminishes to 5% after 54 h post-irradiation. Imposing a temperature offset (maximum 6 °C for 3 h) during and following irradiation on a series of calibration phantoms results in only a small dose deviation of maximum 4%. Surprisingly, oxygen diffusing in a gel dosimeter up to 48 h post-irradiation was shown to have no effect. Volumetric effects were studied by comparing the dose distribution in a homogeneous phantom compared to the dose distribution in a phantom in which a small test tube was inserted. This study showed that the dose measured inside the test tube was closer to the ion chamber measurement in comparison to the reference phantom without test tube by almost 7%. It is demonstrated that physico-chemical effects are not the major causes for the dose discrepancies encountered in the reproducibility study discussed in the concurrent paper (Vandecasteele and De Deene 2013a Phys. Med. Biol. 58 19-42). However, it is concluded that these physico-chemical effects are important factors that should be addressed to further improve the dosimetric accuracy of 3D MRI polymer gel dosimetry. PMID:23221322

  1. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  2. Modeling, validation and system identification of a natural gas engine

    Microsoft Academic Search

    Anupam Gangopadhyay; Peter Meckl

    1997-01-01

    In this paper, a model of a central fuel injected natural gas engine with transmission is developed and linear system identification is carried out to identify key model parameters that could lead to automated identification of transmission dynamics. The paper has two major components. First, the natural gas engine is modeled with an extension of the mean value engine model

  3. Development and validation of a tokamak skin effect transformer model

    Microsoft Academic Search

    J. A. Romero; J.-M. Moret; S. Coda; F. Felici; I. Garrido

    2012-01-01

    A lumped parameter, state space model for a tokamak transformer including the slow flux penetration in the plasma (skin effect transformer model) is presented. The model does not require detailed or explicit information about plasma profiles or geometry. Instead, this information is lumped in system variables, parameters and inputs. The model has an exact mathematical structure built from energy and
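
    For reference, the generic lumped-parameter state-space structure the abstract refers to is, in standard notation (the tokamak-specific matrices and the physical meaning of the states are not given in this record):

      \dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t) + D\,u(t)

    Here x(t) would lump the plasma and conductor flux variables, u(t) the applied inputs (e.g. voltages), and y(t) the measured outputs.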

  4. Enhancement and Validation of the IDES Orbital Debris Environment Model

    Microsoft Academic Search

    R. Walker; P. H. Stokes; J. E. Wilkinson; G. G. Swinerd

    1999-01-01

    Orbital debris environment models are essential in predicting the characteristics of the entire debris environment, especially for altitude and size regimes where measurement data is sparse. Most models are also used to assess mission collision risk. The IDES (Integrated Debris Evolution Suite) simulation model has recently been upgraded by including a new sodium–potassium liquid coolant droplet source model and a

  5. Dosimetric validation of first helical tomotherapy Hi-Art II machine in India

    PubMed Central

    Kinhikar, Rajesh A.; Jamema, Swamidas V.; Reenadevi; Pai, Rajeshri; Zubin, Master; Gupta, Tejpal; Dhote, Deepak S.; Deshpande, Deepak D.; Shrivastava, Shyam K.; Sarin, Rajiv

    2009-01-01

    A Helical Tomotherapy (HT) Hi-Art II machine (TomoTherapy, Inc., Madison, WI, USA) was installed at our center in July 2007, the first such machine in India. Image-guided HT is a new modality for delivering intensity-modulated radiotherapy (IMRT). Dosimetric tests included (a) primary beam alignment, (b) secondary beam alignment, (c) water tank measurements (profiles and depth doses), (d) dose rate measurements, (e) IMRT verification, and (f) Megavoltage Computed Tomography (MVCT) dose. Primary and secondary beam alignment revealed acceptable linear accelerator (linac) alignment in both X and Y axes. In addition, it was observed that the beam was aligned in the same plane as the gantry and that the jaws were not twisted with respect to the gantry. The rotational beam stability was acceptable. Multi-leaf collimators (MLC) were found to be stable and properly aligned with the radiation plane. The jaw alignment during gantry rotation was satisfactory. Transverse and longitudinal profiles were in good agreement with the "Gold" standard. During IMRT verification, the variation between the measured and calculated dose for a particular plan was found to be within 2% at the central axis and within 1 mm in position off-axis. The dose delivered during the TomoImage scan was found to be 2.57 cGy. The Helical Tomotherapy system is mechanically stable and found to be acceptable for clinical treatment. It is recommended that the output of the machine be measured on a daily basis to monitor fluctuations in output. PMID:20126562

  6. BRE large compartment fire tests – characterising post-flashover fires for model validation 

    E-print Network

    Welch, Stephen; Jowsey, Allan; Deeny, Susan; Morgan, Richard; Torero, Jose L

    2007-01-01

    Reliable and comprehensive measurement data from large-scale fire tests is needed for validation of computer fire models, but is subject to various uncertainties, including radiation errors in temperature measurement. Here, ...

  7. The motivations-attributes-skills-knowledge competency cluster validation model an empirical study

    E-print Network

    Stevens, Jeffery Allen

    2004-09-30

    development and performance measurement tool as well as a communication tool and a blueprint for success for employees. The MIFV is a sequentially upward funneling competency cluster validation model. The MIFV will provide an opportunity for the study...

  8. Climatically Diverse Data Set for Flat-Plate PV Module Model Validations (Presentation)

    SciTech Connect

    Marion, B.

    2013-05-01

    Photovoltaic (PV) module I-V curves were measured at Florida, Colorado, and Oregon locations to provide data for the validation and development of models used for predicting the performance of PV modules.

  9. Biomarker Discovery and Validation for Proteomics and Genomics: Modeling And Systematic Analysis

    E-print Network

    Atashpazgargari, Esmaeil

    2014-08-27

    Discovery and validation of protein biomarkers with high specificity is the main challenge of current proteomics studies. Different mass spectrometry models are used as shotgun tools for discovery of biomarkers which is usually done on a small...

  10. Validation of a vibration and electric model of honeycomb panels equipped with

    E-print Network

    Paris-Sud XI, Université de

    The viscoelastic effects present in the Nomex are also highlighted. From the electrical point of view, it is shown ... Validations of the proposed models are discussed in section 4. The Nomex honeycomb considered in the test

  11. Is regression through origin useful in external validation of QSAR models?

    PubMed

    Shayanfar, Ali; Shayanfar, Shadi

    2014-08-01

    The external validation of QSAR models is crucial to ensure their reliability for assessing new chemicals. The most widely used criteria for external validation, applied in hundreds of recent QSAR studies, are the Golbraikh-Tropsha and Roy methods, both of which are based on regression through the origin (RTO). In this study, deviation parameters such as absolute errors were calculated to quantify the difference between training and test sets and thereby evaluate the prediction capability of the models. However, these results were not in good agreement with the proposed criteria for external validation, and there is an inconsistency in the definition and calculation of the r(2) of RTO; the criteria constructed on RTO are therefore not optimal. Instead, calculating the model errors for the training and test sets and comparing them provides a possibly more reliable method for the external validation of QSAR models. PMID:24721181
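
    A minimal sketch (synthetic data; all names hypothetical) contrasting the r(2) of regression through the origin (RTO), on which the Golbraikh-Tropsha criteria rest, with the direct comparison of training- and test-set errors suggested above:

      import numpy as np

      def r2_through_origin(y_obs, y_pred):
          """r^2 of a no-intercept (RTO) fit of y_obs on y_pred."""
          k = np.sum(y_obs * y_pred) / np.sum(y_pred ** 2)  # slope through origin
          ss_res = np.sum((y_obs - k * y_pred) ** 2)
          ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      rng = np.random.default_rng(0)
      y_train, y_test = rng.normal(5, 1, 50), rng.normal(5, 1, 20)
      pred_train = y_train + rng.normal(0, 0.3, 50)  # hypothetical model output
      pred_test = y_test + rng.normal(0, 0.4, 20)

      # Error-based check: are test-set errors comparable to training-set errors?
      mae_train = np.mean(np.abs(y_train - pred_train))
      mae_test = np.mean(np.abs(y_test - pred_test))
      print(f"RTO r2 (test): {r2_through_origin(y_test, pred_test):.3f}")
      print(f"MAE train/test: {mae_train:.3f} / {mae_test:.3f}")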

  12. Validation of Models for Prediction of BRCA1 and BRCA2 Mutations

    Cancer.gov

    Validation of Models for Prediction of BRCA1 and BRCA2 Mutations Giovanni Parmigiani, The Sidney Kimmel Comprehensive Cancer Center and the Department of Biostatistics, Johns Hopkins University Tara Friebel, Center for Clinical Epidemiology and Biostatistics,

  13. Distinguishing 'new' from 'old' organic carbon in reclaimed coal mine sites using thermogravimetry: II. Field validation

    SciTech Connect

    Maharaj, S.; Barton, C.D.; Karathanasis, T.A.D.; Rowe, H.D.; Rimmer, S.M. [University of Kentucky, Lexington, KY (United States). Dept. of Forestry

    2007-04-15

    Thermogravimetry was used under laboratory conditions to differentiate 'new' and 'old' organic carbon (C) by using grass litter, coal, and limestone to represent the different C fractions. Thermogravimetric and derivative thermogravimetry curves showed pyrolysis peaks at distinctively different temperatures, with the peak for litter occurring at 270 to 395°C, for coal at 415 to 520°C, and for limestone at 700 to 785°C. To validate this method in a field setting, we studied four reforested coal mine sites in Kentucky representing a chronosequence since reclamation: 0 and 2 years located at Bent Mountain, and 3 and 8 years located at the Starfire mine. A nonmined mature (approximately 80 years old) stand at Robinson Forest, Kentucky, was selected as a reference location. Results indicated a general peak increase in the 270 to 395°C region with increased time, signifying an increase in the 'new' organic matter (OM) fraction. For the Bent Mountain site, the OM fraction increased from 0.03 to 0.095% between years 0 and 2, whereas the Starfire site showed an increase from 0.095 to 1.47% between years 3 and 8. This equates to a C sequestration rate of 2.92 Mg ha⁻¹ yr⁻¹ for 'new' OM in the upper 10-cm layer during the 8 years of reclamation on eastern Kentucky reclaimed coal mine sites. Results suggest that stable isotopes and elemental data can be used as proxy tools for qualifying soil organic C (SOC) changes over time on the reclaimed coal mine sites but cannot be used to determine the exact SOC accumulation rate. However, results suggested that the thermogravimetric and derivative thermogravimetry methods can be used to quantify SOC accumulation and have the potential to be a more reliable, cost-effective, and rapid means to determine the new organic C fraction in mixed geological material, especially in areas dominated by coal and carbonate materials.

  14. Multiscale analysis and validation of the MODIS LAI product II. Sampling strategy

    E-print Network

    Myneni, Ranga B.

    ) product, with emphasis on the sampling strategy for field data collection, using a hierarchical scene model ... associated with satellite data-based products. In this paper, the second of a two-part series, we divided 30-m resolution LAI and NDVI images from Maun (Botswana), Harvard Forest (USA

  15. A General Strategy for Physics-Based Model Validation Illustrated with Earthquake Phenomenology, Atmospheric Radiative Transfer, and Computational Fluid Dynamics

    Microsoft Academic Search

    Didier Sornette; Anthony B. Davis; James R. Kamm; Kayo Ide

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. In this article, we survey the model validation literature and propose to

  16. The 183-WSL Fast Rain Rate Retrieval Algorithm. Part II: Validation Using Ground Radar Measurements

    NASA Technical Reports Server (NTRS)

    Laviola, Sante; Levizzani, Vincenzo

    2014-01-01

    The Water vapour Strong Lines at 183 GHz (183-WSL) algorithm is a method for the retrieval of rain rates and precipitation type classification (convective/stratiform) that makes use of the water vapor absorption lines centered at 183.31 GHz of the Advanced Microwave Sounding Unit module B (AMSU-B) and of the Microwave Humidity Sounder (MHS) flying on the NOAA-15 to NOAA-18 and NOAA-19/MetOp-A satellite series, respectively. The characteristics of this algorithm were described in Part I of this paper, together with comparisons against analogous precipitation products. The focus of Part II is the analysis of the performance of the 183-WSL technique based on surface radar measurements. The ground truth dataset consists of 2.5 years of rainfall intensity fields from the NIMROD European radar network, which covers North-Western Europe. The investigation of the 183-WSL retrieval performance is based on a twofold approach: 1) dichotomous statistics are used to evaluate the capability of the method to identify rain and no-rain clouds; 2) accuracy statistics are applied to quantify the errors in the estimation of rain rates. The results reveal that the 183-WSL technique shows good skill in the detection of rain/no-rain areas and in the quantification of rain rate intensities. The categorical analysis shows annual values of the POD, FAR and HK indices varying in the ranges 0.80-0.82, 0.33-0.36 and 0.39-0.46, respectively. The RMSE value is 2.8 millimeters per hour for the whole period, despite an overestimation in the retrieved rain rates. Of note is the distribution of the 183-WSL monthly mean rain rate with respect to radar: the seasonal fluctuations of the average rainfalls measured by radar are reproduced by the 183-WSL. However, the retrieval method appears to suffer under winter seasonal conditions, especially when the soil is partially frozen and the surface emissivity drastically changes. This is verified in the discrepancy distribution diagrams, where the 183-WSL performs better during the warm months, while during wintertime the discrepancies with radar measurements tend toward maximum values. A stable behavior of the 183-WSL algorithm is demonstrated over the whole study period, with an overall overestimation for rain rate intensities lower than 1 millimeter per hour. This threshold is crucial, especially in wintertime, when the low precipitation regime is difficult to classify.
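
    For reference, the dichotomous scores quoted above can be computed directly from a 2x2 contingency table; the counts below are hypothetical, not NIMROD values:

      # POD, FAR and the Hanssen-Kuipers (HK) discriminant from a 2x2
      # contingency table of satellite rain/no-rain vs. radar rain/no-rain.
      def dichotomous_scores(hits, false_alarms, misses, correct_negatives):
          pod = hits / (hits + misses)                # probability of detection
          far = false_alarms / (hits + false_alarms)  # false alarm ratio
          pofd = false_alarms / (false_alarms + correct_negatives)
          return pod, far, pod - pofd                 # HK = POD - POFD

      pod, far, hk = dichotomous_scores(hits=820, false_alarms=430,
                                        misses=190, correct_negatives=8560)
      print(f"POD={pod:.2f}  FAR={far:.2f}  HK={hk:.2f}")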

  17. Model validation protocol for determining the performance of the terrain-responsive atmospheric code against the Rocky Flats Plant Winter Validation Study

    Microsoft Academic Search

    C. R. Hodgin; M. L. Smith

    1992-01-01

    The objective of this Model Validation Protocol is to establish a plan for quantifying the performance (accuracy and precision) of the Terrain-Responsive Atmospheric Code (TRAC) model. The performance will be determined by comparing model predictions against tracer characteristics observed in the free atmosphere. The Protocol will also be applied to other "reference" dispersion models. The performance of the TRAC model

  18. Application of CFD techniques toward the validation of nonlinear aerodynamic models

    NASA Technical Reports Server (NTRS)

    Schiff, L. B.; Katz, J.

    1985-01-01

    Applications of computational fluid dynamics (CFD) methods to determine the regimes of applicability of nonlinear models describing the unsteady aerodynamic responses to aircraft flight motions are described. The potential advantages of computational methods over experimental methods are discussed and the concepts underlying mathematical modeling are reviewed. The economic and conceptual advantages of the modeling procedure over coupled, simultaneous solutions of the gas dynamic equations and the vehicle's kinematic equations of motion are discussed. The modeling approach, when valid, eliminates the need for costly repetitive computation of flow field solutions. For the test cases considered, the aerodynamic modeling approach is shown to be valid.

  19. [C II] emission and star formation in late-type galaxies. II. A model

    NASA Astrophysics Data System (ADS)

    Pierini, D.; Leech, K. J.; Völk, H. J.

    2003-01-01

    We study the relationship between gas cooling via the [C II] (λ = 158 μm) line emission and dust cooling via the far-IR continuum emission on the global scale of a galaxy in normal (i.e. non-AGN dominated and non-starburst) late-type systems. It is known that the luminosity ratio of total gas and dust cooling, L(C II)/L(FIR), shows a non-linear behaviour with the equivalent width of the H alpha (λ = 6563 Å) line emission, the ratio decreasing in galaxies of lower massive star-formation activity. This result holds despite the fact that known individual Galactic and extragalactic sources of the [C II] line emission show different [C II] line-to-far-IR continuum emission ratios. This non-linear behaviour is reproduced by a simple quantitative theoretical model of gas and dust heating from different stellar populations, assuming that the photoelectric effect on dust, induced by far-UV photons, is the dominant mechanism of gas heating in the general diffuse interstellar medium of the galaxies under investigation. According to the model, the global L(C II)/L(FIR) provides a direct measure of the fractional amount of non-ionizing UV light in the interstellar radiation field and not of the efficiency of the photoelectric heating. The theory also defines a method to constrain the stellar initial mass function from measurements of L(C II) and L(FIR). A sample of 20 Virgo cluster galaxies observed in the [C II] line with the Long Wavelength Spectrometer on board the Infrared Space Observatory is used to illustrate the model. The limited statistics and the necessary assumptions behind the determination of the global [C II] luminosities from the spatially limited data do not allow us to establish definitive conclusions, but data-sets available in the future will allow tests of both the reliability of the assumptions behind our model and the statistical significance of our results. Based on observations with the Infrared Space Observatory (ISO), an ESA project with instruments funded by ESA member states (especially the PI countries: France, Germany, The Netherlands and the UK) and with the participation of ISAS and NASA.

  1. Model-Based Approaches for Validating Business Critical Systems

    Microsoft Academic Search

    Juan Carlos Augusto; Y. Howard; Andrew M. Gravell; Carla Ferreira; Stefan Gruner; Michael Leuschel

    2003-01-01

    Developing a business critical system can involve considerable difficulties. This paper describes part of a new methodology that tackles this problem using co-evolution of models and prototypes to strengthen the relationship between modelling and testing. We illustrate how different modelling frameworks, Promela/SPIN and B/ProB/AtelierB, can be used to implement this idea. As a way to reinforce integration between modelling and

  2. GCR environmental models III: GCR model validation and propagated uncertainties in effective dose

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-04-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic ray (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z > 2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding, since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 models and the Matthia GCR model are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 models also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and the 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.
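
    A toy sketch of the idea of a validation metric that accounts for measurement uncertainty (the paper's actual metric is more elaborate and is coupled to the effective-dose weights); all numbers are hypothetical:

      import numpy as np

      # Residuals normalized by the reported measurement error, so data points
      # with large error bars contribute less to the validation metric.
      flux_model = np.array([1.02, 0.87, 1.35])  # model fluxes, arbitrary units
      flux_meas = np.array([0.95, 0.90, 1.10])   # measured fluxes
      sigma_meas = np.array([0.08, 0.05, 0.15])  # 1-sigma measurement errors

      z_scores = (flux_model - flux_meas) / sigma_meas
      rel_err = (flux_model - flux_meas) / flux_meas
      print("uncertainty-normalized residuals:", np.round(z_scores, 2))
      print("relative errors:", np.round(rel_err, 3))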

  3. On-Board Prediction of Power Consumption in Automobile Active Suspension SYSTEMS—II: Validation and Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Ben Mrad, R.; Fassois, S. D.; Levitt, J. A.; Bachrach, B. I.

    1996-03-01

    The focus of this part of the paper is on validation and performance evaluation. The indirect (standard) and novel direct predictors of Part I, which use time-recursive realisations and no leading indicators, are critically compared by using the non-linear active suspension system model. The results, constituting the first known comparison between indirect and direct schemes, show similar performance with a slight superiority of the former. Experimental validation is based on a specially developed active suspension vehicle. The power consumption non-stationarity is, in this case, shown to be of the homogeneous type, completely "masking" the signal's second-order characteristics, which are revealed only after the non-stationarity's effective removal. The analysis leads to two distinct types of indirect predictors: an explicit type, based on non-stationary integrated autoregressive moving average models, and an implicit type, based on stationary autoregressive moving average models. The explicit predictor is shown to be uniformly better than the implicit, although the difference is small for short prediction horizons. The experimental results indicate that accurate power consumption prediction is possible, with errors ranging from 2.22% for a prediction horizon of 0.156 s, to still less than 10% for horizons of up to 0.470 s, and about 25% for 1.563 s horizons.
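
    A minimal sketch (synthetic data) of the explicit/implicit distinction drawn above, using statsmodels: the explicit predictor fits an integrated ARIMA model to the raw non-stationary record, while the implicit one fits a stationary ARMA model to the differenced record and re-integrates its forecasts:

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Hypothetical power-consumption record with homogeneous non-stationarity
      rng = np.random.default_rng(1)
      y = np.cumsum(rng.normal(0.0, 1.0, 500)) + 50.0

      # "Explicit" predictor: integrated ARIMA fitted to the raw series (d=1)
      explicit = ARIMA(y, order=(2, 1, 1)).fit()

      # "Implicit" predictor: stationary ARMA fitted to the differenced series
      implicit = ARIMA(np.diff(y), order=(2, 0, 1)).fit()

      k = 3  # prediction horizon in samples
      print("explicit k-step forecast:", explicit.forecast(steps=k))
      print("implicit k-step forecast:",
            y[-1] + np.cumsum(implicit.forecast(steps=k)))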

  4. Optical Observations of Meteors Generating Infrasound - II: Weak Shock Theory and Validation

    E-print Network

    Silber, Elizabeth A; Krzeminski, Zbigniew

    2014-01-01

    We have recorded a dataset of 24 centimeter-sized meteoroids detected simultaneously by video and infrasound to critically examine the ReVelle [1974] weak shock meteor infrasound model. We find that the effect of gravity wave perturbations to the wind field and updated absorption coefficients in the linear regime on the initial value of the blast radius (R0), which is the strongly non-linear zone of shock propagation near the body and corresponds to energy deposition per path length, is relatively small. Using optical photometry as ground-truth for energy deposition, we find that the ReVelle model accurately predicts blast radii from infrasound periods (τ), but systematically under-predicts R0 using pressure amplitude. If the weak shock to linear propagation distortion distance is adjusted as part of the modelling process, we are able to self-consistently fit a single blast radius value for amplitude and period. In this case, the distortion distance is always much less (usually just a few percent) than t...

  5. Coupled Two-Dimensional Main-Chain Torsional Potential for Protein Dynamics II: Performance and Validation.

    PubMed

    Gao, Ya; Li, Yongxiu; Mou, Lirong; Hu, Wenxin; Zheng, Jun; Zhang, John Z H; Mei, Ye

    2015-03-19

    The accuracy of force fields is of utmost importance in molecular modeling of proteins. Despite successful applications of force fields for about the past 30 years, some inherent flaws lying in force fields, such as biased secondary propensities and fixed atomic charges, have been observed in different aspects of biomolecular research; hence, a correction to current force fields is desirable. Because of the simplified functional form and the limited number of parameters for main chain torsion (MCT) in traditional force fields, it is not easy to propose an exquisite force field that is well-balanced among various conformations. Recently, AMBER-compatible force fields with coupled MCT term have been proposed, which show some improvement over AMBER03 and AMBER99SB force fields. In this work, further calibration of the torsional parameters has been conducted by changing the solvation model in quantum mechanical calculation and minimizing the deviation from the nuclear magnetic resonance experiments for some benchmark model systems and a folded protein. The results show that the revised force fields give excellent agreement with experiments in J coupling, chemical shifts, and secondary structure populations. In addition, the polarization effect is found to be crucial for the systems with ordered secondary structures. PMID:25719206

  6. KINEROS2/AGWA: Model use, calibration, and validation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. Development and improvement of KINEROS continued from the 1960s on a vari...

  7. Estimation and Q-Matrix Validation for Diagnostic Classification Models

    ERIC Educational Resources Information Center

    Feng, Yuling

    2013-01-01

    Diagnostic classification models (DCMs) are structured latent class models widely discussed in the field of psychometrics. They model subjects' underlying attribute patterns and classify subjects into unobservable groups based on their mastery of attributes required to answer the items correctly. The effective implementation of DCMs depends…

  8. PEP-II vacuum system pressure profile modeling using EXCEL

    SciTech Connect

    Nordby, M.; Perkins, C.

    1994-06-01

    A generic, adaptable Microsoft EXCEL program to simulate molecular flow in beam line vacuum systems is introduced. Modeling using finite-element approximation of the governing differential equation is discussed, as well as error estimation and program capabilities. The ease of use and flexibility of the spreadsheet-based program is demonstrated. PEP-II vacuum system models are reviewed and compared with analytical models.
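
    For illustration, the finite-element idea behind such a spreadsheet model reduces, for steady molecular flow in a duct with distributed outgassing and pumps at both ends, to a small linear system; all parameter values below are illustrative, not PEP-II numbers:

      import numpy as np

      # Finite-difference form of c * d2P/dz2 = -q with fixed (pump) pressures
      # at both ends; c is the specific conductance, q the outgassing rate.
      n, L = 101, 10.0              # grid points, beamline length [m]
      dz = L / (n - 1)
      c, q = 20.0, 1e-8             # [l*m/s] and [Torr*l/s/m], assumed values
      P_pump = 1e-9                 # pressure at the pump ports [Torr]

      A = np.zeros((n, n))
      b = np.full(n, -q * dz ** 2 / c)
      for i in range(1, n - 1):
          A[i, i - 1:i + 2] = 1.0, -2.0, 1.0
      A[0, 0] = A[-1, -1] = 1.0
      b[0] = b[-1] = P_pump         # Dirichlet boundaries: pumps at the ends

      P = np.linalg.solve(A, b)
      print(f"peak pressure at mid-duct: {P.max():.3e} Torr")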

  9. CheS-Mapper 2.0 for visual validation of (Q)SAR models

    PubMed Central

    2014-01-01

    Background Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Although a number of visualization tools for analyzing (Q)SAR information in small-molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: it is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals if the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract Comparing actual and predicted activity values with CheS-Mapper.

  10. Nyala and Bushbuck II: A Harvesting Model.

    ERIC Educational Resources Information Center

    Fay, Temple H.; Greeff, Johanna C.

    1999-01-01

    Adds a cropping or harvesting term to the animal overpopulation model developed in Part I of this article. Investigates various harvesting strategies that might suggest a solution to the overpopulation problem without actually culling any animals. (ASK)

  11. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA's "virtual embryo" project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  12. Validation of an effervescent spray model with secondary atomization and its application to modeling of a large-scale furnace

    Microsoft Academic Search

    Jakub Broukal; Ji?í Hájek

    2011-01-01

    The present work validates an effervescent spray model with secondary atomization. The objective is the simulation of a 1 MW industrial-type liquid fuel burner equipped with an effervescent spray nozzle. The adopted approach is based on a double experimental validation. Firstly, the evolution of radial drop size distributions of an isothermal spray is investigated. Secondly, the spray

  13. NARCCAP Model Validation for the Southeast United States

    NASA Astrophysics Data System (ADS)

    Kabela, E. D.; Carbone, G. J.

    2012-12-01

    Global climate models (GCMs) provide most projections of future climate change, but their coarse resolution limits their use in assessing regional climate change impacts on water resources, environmental quality, forest management, power plant operations, and many other fields. Such assessment requires translating global model output to more local scales. This research investigates dynamically downscaled regional climate model (RCM) output from the North American Regional Climate Change Assessment Program (NARCCAP) in the Southeast United States. The analysis includes assessments of GCM and RCM performance and skill in the region during a historical reference period (1970-1999), with explanations of the sources and magnitudes of individual model biases. Three fundamental questions structure the research: 1) How skillful are dynamically downscaled models in simulating minimum and maximum temperature and mean precipitation in a historical reference period (1970-1999) for the Southeast United States? 2) What are the magnitudes of the biases for each NARCCAP member (and variable), and what are the potential sources of the bias? 3) Does downscaling improve projections at local scales? In other words, is "value added" in downscaling? The analysis was performed on the states encompassing Alabama, Mississippi, and Tennessee (west sub-region), and Georgia, North Carolina, and South Carolina (east sub-region). Skill was determined using three methods: 1) computing the overlap in probability density functions (PDFs) between observations and models, 2) computing an index of agreement between models and observations, and 3) computing the root mean squared error (RMSE) between observations and models. Most models showed high skill for temperature. The outliers included two RCMs run with the GFDL GCM as their lateral boundary conditions; these models suffered from a cold maximum temperature bias, attributed to erroneously high soil moisture. Precipitation skill using the PDF and index-of-agreement methodologies was high for all models; however, the RMSE-based skill showed that some models suffered from overestimating the frequency of extreme precipitation events.
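
    A minimal sketch of the PDF-overlap skill score (method 1 above): the score is the binwise sum of the minimum of the two normalized frequency distributions, equal to 1 for perfect agreement; the data below are synthetic:

      import numpy as np

      def pdf_overlap_skill(obs, model, bins=50):
          lo = min(obs.min(), model.min())
          hi = max(obs.max(), model.max())
          f_obs, edges = np.histogram(obs, bins=bins, range=(lo, hi))
          f_mod, _ = np.histogram(model, bins=edges)
          return np.minimum(f_obs / len(obs), f_mod / len(model)).sum()

      rng = np.random.default_rng(2)
      t_obs = rng.normal(30.0, 3.0, 10_000)  # observed Tmax [deg C], synthetic
      t_rcm = rng.normal(28.5, 3.2, 10_000)  # cold-biased "RCM" Tmax, synthetic
      print(f"PDF overlap skill: {pdf_overlap_skill(t_obs, t_rcm):.2f}")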

  14. A validated mathematical model of tumor growth including tumor-host interaction, cell-mediated immune

    E-print Network

    Rey Juan Carlos, Universidad

    A validated mathematical model of tumor growth including tumor-host interaction, cell-mediated immune response, and resistance of tumor cells to such cytotoxic substances (Lavi et al., 2012). We consider a dynamical model of cancer growth including three

  15. Social Validity of the Critical Incident Stress Management Model for School-Based Crisis Intervention

    ERIC Educational Resources Information Center

    Morrison, Julie Q.

    2007-01-01

    The Critical Incident Stress Management (CISM) model for crisis intervention was developed for use with emergency service personnel. Research regarding the use of the CISM model has been conducted among civilians and high-risk occupation groups with mixed results. The purpose of this study is to examine the social validity of the CISM model for…

  16. Tsunami Generation by Submarine Mass Failure. I: Modeling, Experimental Validation, and Sensitivity Analyses

    E-print Network

    Grilli, Stéphan T.

    Tsunami generation by submarine mass failure is studied with a two-dimensional (2D) fully nonlinear potential flow (FNPF) model for two idealized failures ... described by a simple wavemaker formalism and prescribed as a boundary condition in the FNPF model. Tsunami amplitudes

  17. IMPLEMENTATION AND VALIDATION OF A BREAKER MODEL IN A FULLY NONLINEAR WAVE

    E-print Network

    Grilli, Stéphan T.

    to the normal particle velocity on the free surface. The instantaneous power dissipated in each breaking wave ... in a two-dimensional fully nonlinear coastal wave propagation model. A maximum surface slope breaking

  18. Sensitivity Analysis, Calibration, and Validations for a Multisite and Multivariable SWAT Model

    Microsoft Academic Search

    Kati L. White; Indrajeet Chaubey

    2005-01-01

    The ability of a watershed model to mimic specified watershed processes is assessed through the calibration and validation process. The Soil and Water Assessment Tool (SWAT) watershed model was implemented in the Beaver Reservoir Watershed of Northwest Arkansas. The objectives were to: (1) provide detailed information on calibrating and applying a multisite and multivariable SWAT model; (2) conduct sensitivity analysis;

  19. Spatial calibration and temporal validation of flow for regional scale hydrologic modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such regional scale model is necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...

  20. Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences

    Microsoft Academic Search

    Naomi Oreskes; Kristin Shrader-Frechette; Kenneth Belitz

    1994-01-01

    Verification and validation of numerical models of natural systems is impossible. This is because natural systems are never closed and because model results are always non-unique. Models can be confirmed by the demonstration of agreement between observation and prediction, but confirmation is inherently partial. Complete confirmation is logically precluded by the fallacy of affirming the consequent and by incomplete access

  1. Validation of a morphogenesis model of Drosophila early development by a multi-objective

    E-print Network

    Paris-Sud XI, Université de

    We apply evolutionary computation to calibrate the parameters of a morphogenesis model of Drosophila early development. This model incorporates the regulatory repression mechanism of the Bicoid protein over

  2. Empirical validation of models to compute solar irradiance on inclined surfaces for building energy simulation

    Microsoft Academic Search

    P. G. Loutzenhiser; H. Manz; C. Felsmann; P. A. Strachan; T. Frank; G. M. Maxwell

    2007-01-01

    Accurately computing solar irradiance on external facades is a prerequisite for reliably predicting thermal behavior and cooling loads of buildings. Validation of radiation models and algorithms implemented in building energy simulation codes is an essential endeavor for evaluating solar gain models. Seven solar radiation models implemented in four building energy simulation codes were investigated: (1) isotropic sky, (2) Klucher, (3)

  3. Atmospheric Dispersion Model Validation in Low Wind Conditions

    SciTech Connect

    Sawyer, Patrick

    2007-11-01

    Atmospheric plume dispersion models are used for a variety of purposes including emergency planning and response to hazardous material releases, determining force protection actions in the event of a Weapons of Mass Destruction (WMD) attack and for locating sources of pollution. This study provides a review of previous studies that examine the accuracy of atmospheric plume dispersion models for chemical releases. It considers the principles used to derive air dispersion plume models and looks at three specific models currently in use: Aerial Location of Hazardous Atmospheres (ALOHA), Emergency Prediction Information Code (EPIcode) and Second Order Closure Integrated Puff (SCIPUFF). Results from this study indicate over-prediction bias by the EPIcode and SCIPUFF models and under-prediction bias by the ALOHA model. The experiment parameters were for near field dispersion (less than 100 meters) in low wind speed conditions (less than 2 meters per second).
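
    For context, a minimal sketch of the Gaussian plume equation that underlies ALOHA- and EPIcode-style models (SCIPUFF is puff-based); the dispersion-coefficient formulas and all numbers here are illustrative assumptions, not the curves any of these codes actually use:

      import numpy as np

      # Ground-reflected Gaussian plume concentration [kg/m^3].
      def plume_conc(q, u, x, y, z, h):
          sig_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)  # illustrative dispersion
          sig_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)  # coefficients only
          lateral = np.exp(-y ** 2 / (2 * sig_y ** 2))
          vertical = (np.exp(-(z - h) ** 2 / (2 * sig_z ** 2))
                      + np.exp(-(z + h) ** 2 / (2 * sig_z ** 2)))
          return q / (2 * np.pi * u * sig_y * sig_z) * lateral * vertical

      # 1 kg/s release, 1.5 m/s wind (low-wind regime), receptor 80 m downwind
      print(f"C = {plume_conc(1.0, 1.5, 80.0, 0.0, 1.5, 2.0):.2e} kg/m^3")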

  4. Shape memory polymer filled honeycomb model and experimental validation

    NASA Astrophysics Data System (ADS)

    Beblo, R. V.; Puttmann, J. P.; Joo, J. J.; Reich, G. W.

    2015-02-01

    An analytical model predicting the in-plane Young’s and shear moduli of a shape memory polymer filled honeycomb composite is presented. By modeling the composite as a series of rigidly attached beams, the mechanical advantage of the load distributed on each beam by the infill is accounted for. The model is compared to currently available analytical models as well as experimental data. The model correlates extremely well with experimental data for empty honeycomb and when the polymer is above its glass transition temperature. Below the glass transition temperature, rule of mixtures is shown to be more accurate as bending is no longer the dominant mode of deformation. The model is also derived for directions other than the typical x and y allowing interpolation of the stiffness of the composite in any direction.
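
    A minimal sketch of the rule-of-mixtures estimate mentioned above for the glassy (below-Tg) regime, together with its iso-stress counterpart; all moduli and the volume fraction are hypothetical placeholders, not values from the paper:

      # Voigt (iso-strain) and Reuss (iso-stress) bounds from constituent moduli.
      def voigt(e_cell, e_infill, v_infill):
          return (1 - v_infill) * e_cell + v_infill * e_infill

      def reuss(e_cell, e_infill, v_infill):
          return 1.0 / ((1 - v_infill) / e_cell + v_infill / e_infill)

      e_cell = 0.9e9       # effective cell-wall contribution [Pa], assumed
      e_smp_cold = 1.2e9   # SMP modulus below its glass transition [Pa], assumed
      v_smp = 0.95         # infill volume fraction, assumed

      print(f"Voigt: {voigt(e_cell, e_smp_cold, v_smp):.2e} Pa")
      print(f"Reuss: {reuss(e_cell, e_smp_cold, v_smp):.2e} Pa")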

  5. A stochastic MIMO radio channel model with experimental validation

    Microsoft Academic Search

    Jean Philippe Kermoal; Laurent Schumacher; Klaus Ingemann Pedersen; Preben Elgaard Mogensen; Frank Frederiksen

    2002-01-01

    Theoretical and experimental studies of multiple-input/multiple-output (MIMO) radio channels are presented. A simple stochastic MIMO channel model has been developed. This model uses the correlation matrices at the mobile station (MS) and base station (BS) so that results of the numerous single-input/multiple-output studies that have been published in the literature can be used as input parameters. The model is simplified

  6. Ion Thruster Modeling: Particle Simulations and Experimental Validations

    NASA Astrophysics Data System (ADS)

    Wang, Joseph; Polk, James; Brinza, David

    2003-05-01

    This paper presents results from ion thruster modeling studies performed in support of NASA's Deep Space 1 mission and NSTAR project. Fully 3-dimensional computer particle simulation models are presented for ion optics plasma flow and ion thruster plume. Ion optics simulation results are compared with measurements obtained from ground tests of the NSTAR ion thruster. Plume simulation results are compared with in-flight measurements from the Deep Space 1 spacecraft. Both models show excellent agreement with experimental data.

  7. Validation of a new Mesoscale Model for MARS

    Microsoft Academic Search

    K. De Sanctis; R. Ferretti; F. Forget; C. Fiorenza; G. Visconti

    2007-01-01

    The study of the planet Mars is important because of its several similarities with the Earth. For the understanding of the dynamical processes that drive the Martian atmosphere, a new Martian Mesoscale Model (MARS-MM5) is presented. The new model is based on the Pennsylvania State University (PSU)/National Center for Atmospheric Research (NCAR) Mesoscale Model Version 5. MARS-MM5 has been

  8. Modal Testing and FE-model Validation of Azimuthing Thruster

    Microsoft Academic Search

    Vesa Nieminen; Matti Tervonen

    Vibratory behavior of an azimuthing thruster was studied with FE-models, and the results were verified by full-scale experiments. The studied thruster systems are used both for main propulsion and for steering of vessels. Modeling techniques were developed to take into account the most significant factors and phenomena affecting the vibration behavior of the structure in real operating conditions. Modeling of

  9. Web-page on UrQMD Model Validation

    E-print Network

    A. Galoyan; J. Ritman; V. Uzhinsky

    2006-05-18

    A Web page containing materials comparing experimental data and UrQMD model calculations has been designed. The page provides its users with a variety of tasks solved with the help of the model, the accuracy and/or quality of the experimental data description, and so on. The page can be useful for analysis of new experimental data or for planning new experimental research. The UrQMD model is cited in more than 272 publications, only 44 of which present original calculations; their main results are presented on the page.

  10. Radiation model predictions and validation using LDEF satellite data

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    Predictions and comparisons with the radiation dose measurements on Long Duration Exposure Facility (LDEF) by thermoluminescent dosimeters were made to evaluate the accuracy of models currently used in defining the ionizing radiation environment for low Earth orbit missions. The calculations include a detailed simulation of the radiation exposure (altitude and solar cycle variations, directional dependence) and shielding effects (three-dimensional LDEF geometry model) so that differences in the predicted and observed doses can be attributed to environment model uncertainties. The LDEF dose data are utilized to assess the accuracy of models describing the trapped proton flux, the trapped proton directionality, and the trapped electron flux.

  11. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process, ranging from read-in procedures for datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of that particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, and new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information on the datasets used as well as paths and metadata. In the framework of the MyOcean-2 project, the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre; hereby the performance of any new product version is compared with that of the previous version. Although the toolbox has so far been tested mainly for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of the geographic region. In this presentation the usability of the toolbox is demonstrated along with several results of the validation process.
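
    The registry-plus-namelist design described above can be sketched as follows (in Python for illustration; the actual toolbox is MATLAB, and every function and dataset name here is hypothetical):

      # Each validation stage is a module of interchangeable functions,
      # selected by an externally stored, namelist-like settings dictionary.
      READERS, METRICS = {}, {}

      def register(table):
          def deco(fn):
              table[fn.__name__] = fn
              return fn
          return deco

      @register(READERS)
      def station_timeseries(path):
          # placeholder read-in; real modules parse model/observation files
          return [1.0, 2.0, 2.5], [1.1, 1.9, 2.7]

      @register(METRICS)
      def rmse(model, obs):
          return (sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)) ** 0.5

      namelist = {"reader": "station_timeseries", "metric": "rmse",
                  "path": "sst_station_BY5.nc"}  # hypothetical dataset

      model, obs = READERS[namelist["reader"]](namelist["path"])
      print(f"{namelist['metric']} = {METRICS[namelist['metric']](model, obs):.3f}")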

  12. Nonlinear Model Validation using Multiple Experiments Wayne J. Dunstan

    E-print Network

    Bitmead, Bob

    Popper argued that modelling knowledge advances instead by deductive falsification: experiment and falsification testing attempt to invalidate a given model, and an example is used to illustrate the method. Popper [Pop59] described the set of methodological rules called Falsificationism. Instead

  13. Development and Validation of Credit-Scoring Models

    Microsoft Academic Search

    Dennis Glennon; Nicholas M. Kiefer; C. Erik Larson; Hwan-sik Choi

    2007-01-01

    Accurate credit-granting decisions are crucial to the efficiency of the decentralized capital allocation mechanisms in modern market economies. Credit bureaus and many financial institutions have developed and used credit-scoring models to standardize and automate, to the extent possible, credit decisions. We build credit scoring models for bankcard markets using the Office of the Comptroller of the Currency, Risk Analysis Division

  14. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  15. Validation of a Nonlinear Automotive Seat Cushion Vibration Model

    Microsoft Academic Search

    WILLIAM N. PATTEN; JIAN PANG

    1998-01-01

    A low-order, lumped parameter model is proposed to describe the vertical vibration compliance of an automotive seat. The model includes nonlinear stiffness and damping effects that mimic the properties exhibited by open cell foams that are commonly used in the construction of an automotive seat cushion. A shaped sandbag was positioned on a seat cushion and vibrated to obtain test

  16. Modeling and validation of off-road vehicle ride dynamics

    NASA Astrophysics Data System (ADS)

    Pazooki, Alireza; Rakheja, Subhash; Cao, Dongpu

    2012-04-01

    Increasing concerns over driver comfort and health, and emerging demands on suspension systems for off-road vehicles, call for an effective and efficient off-road vehicle ride dynamics model. This study devotes both analytical and experimental effort to developing a comprehensive off-road vehicle ride dynamics model. A three-dimensional tire model is formulated to characterize tire-terrain interactions along all three translational axes. The random roughness properties of the two parallel tracks of terrain profiles are further synthesized considering equivalent undeformable terrain and a coherence function between the two tracks. The terrain roughness model, derived from the field-measured responses of a conventional forestry skidder, was considered for the synthesis. The simulation results of the suspended and unsuspended vehicle models are derived in terms of acceleration PSD, and weighted and unweighted rms accelerations along the different axes at the driver seat location. Comparisons of the model responses with the measured data revealed that the proposed model can yield reasonably good predictions of the ride responses along the translational as well as rotational axes for both the conventional and suspended vehicles. The developed off-road vehicle ride dynamics model could serve as an effective and efficient tool for predicting vehicle ride vibrations, seeking designs of primary and secondary suspensions, and evaluating the roles of various operating conditions.

  17. Kohlberg's Moral Development Model: Cohort Influences on Validity.

    ERIC Educational Resources Information Center

    Bechtel, Ashleah

    An overview of Kohlberg's theory of moral development is presented; three interviews regarding the theory are reported, and the author's own moral development is compared to the model; finally, a critique of the theory is addressed along with recommendations for future enhancement. Lawrence Kohlberg's model of moral development, also referred to…

  18. Validation Studies of Air Quality Models at Dulles Airport

    Microsoft Academic Search

    D. G. Smith; E. A. Taylor; S. M. Doucette; B. A. Egan

    1979-01-01

    Examination of measurement results for the Concorde SST and several other aircraft types has led to further review of basic assumptions made in many airport air quality models. Even though several fundamental details of the dispersion process are sometimes ignored, the modeling assumptions may be adequate for a particular scale of analysis that implies a basic limit of resolution for

  19. THE FERNALD DOSIMETRY RECONSTRUCTION PROJECT Environmental Pathways -Models and Validation

    E-print Network

    Contents excerpt: ... and Deposition Models; Building Wake Effects and Plume Rise; Resuspension of Particulates; ... and Dose Calculations; Agricultural Parameters; Dose Conversion Factors; FMPC ... Models for the Transport of Airborne Releases; F. The Straight-Line Gaussian Plume and Related Air Transport ...

  20. Summary of EASM Turbulence Models in CFL3D With Validation Test Cases

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Gatski, Thomas B.

    2003-01-01

    This paper summarizes the Explicit Algebraic Stress Model in k-omega form (EASM-ko) and in k-epsilon form (EASM-ke) in the Reynolds-averaged Navier-Stokes code CFL3D. These models have been actively used over the last several years in CFL3D, and have undergone some minor modifications during that time. Details of the equations and method for coding the latest versions of the models are given, and numerous validation cases are presented. This paper serves as a validation archive for these models.

  1. Chemical kinetics parameters and model validation for the gasification of PCEA nuclear graphite

    SciTech Connect

    El-Genk, Mohamed S [University of New Mexico, Albuquerque] [University of New Mexico, Albuquerque; Tournier, Jean-Michel [University of New Mexico, Albuquerque] [University of New Mexico, Albuquerque; Contescu, Cristian I [ORNL] [ORNL

    2014-01-01

    A series of gasification experiments, using two right-cylinder specimens (~12.7 x 25.4 mm and 25.4 x 25.4 mm) of PCEA nuclear graphite in ambient airflow, measured the total gasification flux at weight losses up to 41.5% and temperatures (893-1015 K) characteristic of those for in-pores gasification Mode (a) and in-pores diffusion-limited Mode (b). The chemical kinetics parameters for the gasification of PCEA graphite are determined using a multi-parameter optimization algorithm from the measurements of the total gasification rate and transient weight loss in the experiments. These parameters are: (i) the pre-exponential rate coefficients and the Gaussian distributions and values of specific activation energies for adsorption of oxygen and desorption of CO gas; (ii) the specific activation energy and pre-exponential rate coefficient for the breakup of stable un-dissociated C(O2) oxygen radicals to form stable (CO) complexes; (iii) the specific activation energy and pre-exponential coefficient for desorption of CO2 gas; and (iv) the initial surface area of reactive free sites per unit mass. This area is consistently 13.5% higher than that for the nuclear graphite grades NBG-25 and IG-110 and decreases inversely proportional to the square root of the initial mass of the graphite specimens in the experiments. Experimental measurements successfully validate the chemical-reaction kinetics model, which calculates continuous Arrhenius curves of the total gasification flux and the production rates of CO and CO2 gases. The model results at different total weight losses agree well with measurements and extend beyond the temperatures in the experiments to the diffusion-limited mode of gasification. Also calculated are the production rates of CO and CO2 gases and their relative contributions to the total gasification rate in the experiments as functions of temperature, for total weight losses of 5% and 10%.
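
    A sketch of the distributed-activation-energy Arrhenius form underlying parameters (i)-(iii) above: the rate coefficient is a Gaussian-weighted average of Arrhenius terms. All numerical values are illustrative, not the fitted PCEA parameters:

      import numpy as np

      R = 8.314  # gas constant [J/mol/K]

      def distributed_arrhenius(T, A, E0, sigma_E, n=201):
          E = np.linspace(E0 - 4 * sigma_E, E0 + 4 * sigma_E, n)
          w = np.exp(-((E - E0) ** 2) / (2 * sigma_E ** 2))
          w /= w.sum()                  # discrete Gaussian weights over E
          return A * np.sum(w * np.exp(-E / (R * T)))

      for T in (900.0, 950.0, 1000.0):  # K, spanning the experimental range
          k = distributed_arrhenius(T, A=1.0e7, E0=2.0e5, sigma_E=1.5e4)
          print(f"T = {T:6.1f} K   rate coefficient = {k:.3e} (arb. units)")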

  2. Modeling of Alpine Atmospheric Dynamics II

    E-print Network

    Gohm, Alexander

    A typical MARS job for retrieving ECMWF analysis fields in GRIB format is submitted with the command llsubmit ... conditions from a larger-scale model (e.g. ECMWF or NCEP): large-scale analysis fields are used for mesoscale ... (http://atmet.com/html/docs/data/ralph2.pdf). The ISAN (ISentropic ANalysis) package, a part of the RAMS code, uses the RALPH2 data

  3. An analytical approach to computing biomolecular electrostatic potential. II. Validation and applications

    PubMed Central

    Gordon, John C.; Fenley, Andrew T.; Onufriev, Alexey

    2008-01-01

    An ability to efficiently compute the electrostatic potential produced by molecular charge distributions under realistic solvation conditions is essential for a variety of applications. Here, the simple closed-form analytical approximation to the Poisson equation rigorously derived in Part I for idealized spherical geometry is tested on realistic shapes. The effects of mobile ions are included at the Debye–Hückel level. The accuracy of the resulting closed-form expressions for electrostatic potential is assessed through comparisons with numerical Poisson–Boltzmann (NPB) reference solutions on a test set of 580 representative biomolecular structures under typical conditions of aqueous solvation. For each structure, the deviation from the reference is computed for a large number of test points placed near the dielectric boundary (molecular surface). The accuracy of the approximation, averaged over all test points in each structure, is within 0.6 kcal/(mol·|e|) (~kT per unit charge) for all structures in the test set. For 91.5% of the individual test points, the deviation from the NPB potential is within 0.6 kcal/(mol·|e|). The deviations from the reference decrease with increasing distance from the dielectric boundary: the approximation is asymptotically exact far away from the source charges. Deviation of the overall shape of a structure from ideal spherical does not, by itself, appear to necessitate decreased accuracy of the approximation. The largest deviations from the NPB reference are found inside very deep and narrow indentations that occur on the dielectric boundaries of some structures. The dimensions of these pockets of locally highly negative curvature are comparable to the size of a water molecule; the applicability of continuum dielectric models in these regions is discussed. The maximum deviations from the NPB are reduced substantially when the boundary is smoothed by using a larger probe radius (3 Å) to generate the molecular surface. A detailed accuracy analysis is presented for several proteins of various shapes, including lysozyme, whose surface features a functionally relevant region of negative curvature. The proposed analytical model is computationally inexpensive; this strength of the approach is demonstrated by computing and analyzing the electrostatic potential generated by a full capsid of the tobacco ring spot virus at atomic resolution (500 000 atoms). An analysis of the electrostatic potential of the inner surface of the capsid reveals what might be a RNA binding pocket. These results are generated with the modest computational power of a desktop personal computer. PMID:19044803
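
    For orientation, the idealized spherical limit referenced above is the textbook Debye-Hückel result for a charged sphere in an electrolyte; a sketch with hypothetical parameter values (this is not the paper's full closed-form approximation):

      import numpy as np

      EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]
      E_CHG = 1.602176634e-19  # elementary charge [C]

      def dh_sphere_potential(r, q=E_CHG, a=1.0e-9, kappa=1.0e9, eps_w=80.0):
          """Potential [V] at r >= a; kappa ~ 1/nm for ~100 mM 1:1 salt."""
          return q * np.exp(-kappa * (r - a)) / (
              4 * np.pi * EPS0 * eps_w * r * (1 + kappa * a))

      for r_nm in (1.0, 1.5, 2.0, 3.0):
          phi_mv = dh_sphere_potential(r_nm * 1e-9) * 1e3
          print(f"r = {r_nm:.1f} nm   phi = {phi_mv:.2f} mV")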

  4. Validation of Atmospheric Refraction Modeling Improvements in Satellite Laser Ranging.

    NASA Astrophysics Data System (ADS)

    Hulley, G.; Pavlis, E. C.; Mendes, V. B.; Pavlis, D. E.

    2004-12-01

    Atmospheric refraction is an important accuracy-limiting factor in the use of satellite laser ranging (SLR) for high-accuracy science applications. In most of these applications, and particularly for the establishment and monitoring of the TRF, the stability of its scale and its implied height system is of great interest. The modeling of atmospheric refraction in the analysis of SLR data comprises the determination of the delay in the zenith direction and its subsequent projection to a given elevation angle using a mapping function. Standard data analysis practice uses the 1973 Marini-Murray model for both zenith delay determination and mapping. This model was tailored for a particular wavelength and is not suitable for all the wavelengths used in modern SLR systems. Mendes et al. [2002] pointed out some limitations in that model, namely as regards the modeling of the elevation dependency of the zenith atmospheric delay (the mapping function component of the model). The mapping functions developed by Mendes et al. [2002] represent a significant improvement over the built-in mapping function of the Marini-Murray model and other known mapping functions. Of particular note is the ability of the new mapping functions to be used in combination with any model used to predict the atmospheric zenith delay. Mendes and Pavlis [2002] also concluded that current zenith delay models have errors at the millimeter level, which increase significantly at 0.355 micrometers, reflecting an inadequacy in the dispersion formulae incorporated in these models. In a next step, therefore, a more accurate zenith delay model was developed, applicable to the range of wavelengths used in modern SLR instrumentation (0.355 to 1.064 micrometers) [Mendes and Pavlis, 2004]. Using ray tracing through a large database of radiosonde and globally distributed satellite data, as well as the analysis of several years of SLR tracking data, we assess the new zenith delay models and mapping functions currently available; we discuss the effect of using different types of input data to drive those models and the sensitivity of models and functions to changes in the wavelength, and we give some recommendations towards a unification of practices and procedures in SLR data analysis.
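
    Mapping functions of the kind discussed above are commonly written as normalized continued fractions; a sketch with placeholder coefficients (not the Mendes et al. values):

      import numpy as np

      # Three-term continued-fraction mapping function (Marini/Niell form);
      # the normalization forces m(90 deg) = 1.
      def mapping_function(elev_rad, a=1.2e-3, b=3.1e-3, c=7.0e-2):
          s = np.sin(elev_rad)
          norm = 1 + a / (1 + b / (1 + c))
          return norm / (s + a / (s + b / (s + c)))

      zenith_delay_m = 2.4  # hypothetical zenith delay [m]
      for elev_deg in (90, 45, 20, 10):
          m = mapping_function(np.radians(elev_deg))
          print(f"e = {elev_deg:2d} deg   slant delay = {zenith_delay_m * m:.3f} m")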

  5. Spatial Statistical Procedures to Validate Input Data in Energy Models

    SciTech Connect

    Lawrence Livermore National Laboratory

    2006-01-27

    Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy-related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the above-mentioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.
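
    As an example of task (2), predicting missing data, a minimal inverse-distance-weighting sketch (station coordinates and values hypothetical):

      import numpy as np

      def idw(xy_obs, v_obs, xy_target, power=2.0):
          d = np.linalg.norm(xy_obs - xy_target, axis=1)
          if np.any(d == 0):            # target coincides with a station
              return v_obs[np.argmin(d)]
          w = 1.0 / d ** power
          return np.sum(w * v_obs) / np.sum(w)

      stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
      wind_speed = np.array([5.1, 6.3, 4.8, 5.9])  # m/s at the four stations
      target = np.array([4.0, 3.0])
      print(f"estimate at (4, 3): {idw(stations, wind_speed, target):.2f} m/s")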

  7. Prevalence of depression and validation of the Beck Depression Inventory-II and the Children's Depression Inventory-Short amongst HIV-positive adolescents in Malawi

    PubMed Central

    Kim, Maria H; Mazenga, Alick C; Devandra, Akash; Ahmed, Saeed; Kazembe, Peter N; Yu, Xiaoying; Nguyen, Chi; Sharp, Carla

    2014-01-01

    Introduction There is a remarkable dearth of evidence on mental illness in adolescents living with HIV/AIDS, particularly in the African setting. Furthermore, there are few studies in sub-Saharan Africa validating the psychometric properties of diagnostic and screening tools for depression amongst adolescents. The primary aim of this cross-sectional study was to estimate the prevalence of depression amongst a sample of HIV-positive adolescents in Malawi. The secondary aim was to develop culturally adapted Chichewa versions of the Beck Depression Inventory-II (BDI-II) and Children's Depression Inventory-II-Short (CDI-II-S) and conduct a psychometric evaluation of these measures by evaluating their performance against a structured depression assessment using the Children's Depression Rating Scale, Revised (CDRS-R). Study design Cross-sectional study. Methods We enrolled 562 adolescents, 12–18 years of age, from two large public HIV clinics in central and southern Malawi. Participants completed two self-reports, the BDI-II and CDI-II-S, followed by administration of the CDRS-R by trained clinicians. Sensitivity, specificity and positive and negative predictive values for various BDI-II and CDI-II-S cut-off scores were calculated with receiver operating characteristic analysis. The area under the curve (AUC) was also calculated. Internal consistency was measured by the standardized Cronbach's alpha coefficient, and correlation between self-reports and the CDRS-R by Spearman's correlation. Results Prevalence of depression as measured by the CDRS-R was 18.9%. Suicidal ideation was expressed by 7.1% (n = 40) using the BDI-II. The AUC for the BDI-II was 0.82 (95% CI 0.78–0.89) and for the CDI-II-S was 0.75 (95% CI 0.70–0.80). A score of ≥13 on the BDI-II achieved a sensitivity of >80%, and a score of ≥17 had a specificity of >80%. The Cronbach's alpha was 0.80 (BDI-II) and 0.66 (CDI-II-S). The correlation between the BDI-II and CDRS-R was 0.42 (p<0.001) and between the CDI-II-S and CDRS-R was 0.37 (p<0.001). Conclusions This study demonstrates that the BDI-II has sound psychometric properties in an outpatient setting among HIV-positive adolescents in Malawi. The high prevalence of depression amongst HIV-positive Malawian adolescents noted in this study underscores the need for the development of comprehensive services for HIV-positive adolescents. PMID:25085002
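
    As a hedged illustration of the receiver operating characteristic procedure described above (synthetic data, not the study's; the score distributions and sample sizes are invented), the sketch below shows how cut-off scores with sensitivity above 80% are read off the ROC curve.

    ```python
    # Synthetic illustration of the ROC/cut-off analysis described in the
    # abstract; the data below are invented, not the Malawi sample.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    # 0 = not depressed, 1 = depressed per the clinician-administered reference.
    y_true = np.concatenate([np.zeros(400), np.ones(100)])
    # Hypothetical self-report (BDI-II-like) total scores.
    scores = np.concatenate([rng.normal(9, 4, 400), rng.normal(19, 5, 100)])

    fpr, tpr, thresholds = roc_curve(y_true, scores)
    print("AUC:", round(roc_auc_score(y_true, scores), 2))

    # Highest cut-off whose sensitivity exceeds 80% (thresholds are descending).
    for thr, sens, spec in zip(thresholds, tpr, 1 - fpr):
        if sens > 0.80:
            print(f"cut-off {thr:.1f}: sensitivity {sens:.2f}, specificity {spec:.2f}")
            break
    ```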

  8. How much certainty is enough? Validation of a nutrient retention model for prioritizing watershed conservation in North Carolina

    NASA Astrophysics Data System (ADS)

    Hamel, P.; Chaplin-Kramer, R.; Benner, R.

    2013-12-01

    Context Quantifying ecosystem services, nature's benefits to people, is an area of active research in water resource management. Increasingly, water utilities and basin management authorities are interested in optimizing watershed-scale conservation strategies to mitigate the economic and environmental impacts of land-use and hydrological changes. While many models are available to represent hydrological processes in a spatially explicit way, large uncertainties remain associated with (i) the biophysical outputs of these models (e.g., nutrient concentration at a given location), and (ii) the service valuation method used to support specific decisions (e.g., targeting conservation areas based on their contribution to retaining nutrients). Better understanding these uncertainties and their impact on the decision process is critical for establishing the credibility of such models in a planning context. Methods To address this issue in an emerging payments-for-watershed-services program in the Cape Fear watershed, North Carolina, USA, we tested and validated the use of a nutrient retention model (InVEST) for targeting conservation activities. Specifically, we modeled water yield and nutrient transport throughout the watershed and valued the retention service provided by forested areas. Observed flow and water quality data at multiple locations allowed calibration of the model at the watershed level as well as the subwatershed level. By comparing the results from each model parameterization, we were able to assess the uncertainties related to both the model structure and parameter estimation. Finally, we assessed the use of the model for climate scenario simulation by characterizing its ability to represent inter-annual variability. Results and discussion The spatial analyses showed that the two calibration approaches could yield distinct parameter sets, both for the water yield and the nutrient model. These results imply a difference in the absolute nutrient concentration predicted by the models in the validation period. However, they did not significantly impact the identification of priority areas for conservation activities, which is the level of confidence necessary to support a decision in this particular context. In addition, the temporal analyses suggested that the model could adequately capture inter-annual changes, which increases confidence in the use of the model in a context of climate change. Our approach shows the importance of assessing uncertainties in the context of decision-making, with errors in the biophysical component being less of a concern when comparing among different regions in a watershed or in scenario simulations. These results have major implications in the field of ecosystem services, where the importance of communicating uncertainties is often underappreciated. While further work is needed to generalize the results of the Cape Fear study, the approach also has the potential to validate the use of the model in ungauged basins.

  9. Validated biomechanical model for efficiency and speed of rowing.

    PubMed

    Pelz, Peter F; Vergé, Angela

    2014-10-17

    The speed of a competitive rowing crew depends on the number of crew members, their body mass, sex, and the type of rowing (sweep rowing or sculling). The time-averaged speed is proportional to the rower's body mass to the 1/36th power, to the number of crew members to the 1/9th power and to the physiological efficiency (accounted for by the rower's sex) to the 1/3rd power. The quality of the rowing shell and propulsion system is captured by one dimensionless parameter that takes the mechanical efficiency, the shape and drag coefficient of the shell, and the Froude propulsion efficiency into account. We derive the biomechanical equation for the speed of rowing by two independent methods and further validate it by successfully predicting race times. We derive the theoretical upper limit of the Froude propulsion efficiency for flows of low viscosity. This upper limit is shown to be a function solely of the velocity ratio of blade to boat speed (i.e., it is completely independent of the blade shape), a result that may also be of interest for other repetitive propulsion systems. PMID:25189093
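
    Collecting the stated proportionalities into a single expression (notation ours), the time-averaged boat speed scales as

    \[
    \bar{v} \;\propto\; \eta^{1/3}\, n^{1/9}\, m^{1/36},
    \]

    where eta is the physiological efficiency, n the number of crew members, and m the rower's body mass; note how weak the dependence on body mass is compared with the dependence on efficiency.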

  10. J-Integral modeling and validation for GTS reservoirs.

    SciTech Connect

    Martinez-Canales, Monica L.; Nibur, Kevin A.; Lindblad, Alex J.; Brown, Arthur A.; Ohashi, Yuki; Zimmerman, Jonathan A.; Huestis, Edwin; Hong, Soonsung; Connelly, Kevin; Margolis, Stephen B.; Somerday, Brian P.; Antoun, Bonnie R.

    2009-01-01

    Non-destructive detection methods can reliably certify that gas transfer system (GTS) reservoirs do not have cracks larger than 5%-10% of the wall thickness. To determine the acceptability of a reservoir design, analysis must show that short cracks will not adversely affect the reservoir behavior. This is commonly done via calculation of the J-Integral, which represents the energetic driving force acting to propagate an existing crack in a continuous medium. J is then compared against a material's fracture toughness (J_c) to determine whether crack propagation will occur. While the quantification of the J-Integral is well established for long cracks, its validity for short cracks is uncertain. This report presents the results from a Sandia National Laboratories project to evaluate a methodology for performing J-Integral evaluations in conjunction with its finite element analysis capabilities. Simulations were performed to verify the operation of a post-processing code (J3D) and to assess the accuracy of this code and our analysis tools against companion fracture experiments for 2- and 3-dimensional geometry specimens. Evaluation is done for specimens composed of 21-6-9 stainless steel, some of which were exposed to a hydrogen environment, for both long and short cracks.
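
    For reference, the standard two-dimensional contour form of the J-Integral (Rice's definition; not a formula quoted from the report) is

    \[
    J \;=\; \int_{\Gamma} \left( W \,\mathrm{d}y \;-\; \mathbf{T} \cdot \frac{\partial \mathbf{u}}{\partial x}\,\mathrm{d}s \right),
    \]

    where Gamma is a contour enclosing the crack tip, W the strain energy density, T the traction vector, and u the displacement field; crack propagation is predicted when J >= J_c.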

  11. Validation of the CORMIX model using thermal plume data from four Maryland power plants. Final report

    SciTech Connect

    Schreiner, S.P.; Krebs, T.A.; Strebel, D.E.; Brindley, A.; McCall, C.G.

    1999-04-19

    The purpose of this investigation was to test (validate, in computer modeling terminology) the mixing zone model CORMIX (CORnell MIXing Zone Expert System) using measured thermal plume data from four Maryland power plants (Calvert Cliffs, Chalk Point, Dickerson, and Wagner). These facilities were chosen to represent a range of discharge environments used by power plants in the state, including a large freshwater river (the Potomac) and a narrow tidal estuary (Baltimore Harbor). The availability of extensive historical thermal plume data provided an excellent source for validating the model and demonstrating its utility and limitations in a variety of circumstances. CORMIX idealizes the physical configuration of the receiving water. This study concludes that users of the CORMIX expert system need to be aware of these limitations when applying the model in complex situations, especially where validation data are not available to check model results.

  12. Medium term hurricane catastrophe models: a validation experiment

    NASA Astrophysics Data System (ADS)

    Bonazzi, Alessandro; Turner, Jessica; Dobbin, Alison; Wilson, Paul; Mitas, Christos; Bellone, Enrica

    2013-04-01

    Climate variability is a major source of uncertainty for the insurance industry underwriting hurricane risk. Catastrophe models provide their users with a stochastic set of events that expands the scope of the historical catalogue by including synthetic events that are likely to happen in a defined time-frame. The use of these catastrophe models is widespread in the insurance industry, but it is only in recent years that climate variability has been explicitly accounted for. In insurance parlance, "medium term catastrophe model" refers to products that provide an adjusted view of risk that is meant to represent hurricane activity on a 1 to 5 year horizon, as opposed to long term models that integrate across the climate variability of the longest available time series of observations. In this presentation we discuss how a simple reinsurance program can be used to assess the value of medium term catastrophe models. We elaborate on concepts similar to those discussed in "Potential Economic Value of Seasonal Hurricane Forecasts" by Emanuel et al. (2012, WCAS) and provide an example based on 24 years of historical data of the Chicago Mercantile Hurricane Index (CHI), an insured-loss proxy. The profit and loss volatility of a hypothetical primary insurer is used to score medium term models against their long term counterpart. Results show that medium term catastrophe models could help a hypothetical primary insurer improve their financial resiliency to varying climate conditions.

  13. Nearshore Tsunami Inundation Model Validation: Toward Sediment Transport Applications

    USGS Publications Warehouse

    Apotsos, Alex; Buckley, Mark; Gelfenbaum, Guy; Jaffe, Bruce; Vatvani, Deepak

    2011-01-01

    Model predictions from a numerical model, Delft3D, based on the nonlinear shallow water equations are compared with analytical results and laboratory observations from seven tsunami-like benchmark experiments, and with field observations from the 26 December 2004 Indian Ocean tsunami. The model accurately predicts the magnitude and timing of the measured water levels and flow velocities, as well as the magnitude of the maximum inundation distance and run-up, for both breaking and non-breaking waves. The shock-capturing numerical scheme employed describes well the total decrease in wave height due to breaking, but does not reproduce the observed shoaling near the break point. The maximum water levels observed onshore near Kuala Meurisi, Sumatra, following the 26 December 2004 tsunami are well predicted given the uncertainty in the model setup. The good agreement between the model predictions and the analytical results and observations demonstrates that the numerical solution and wetting and drying methods employed are appropriate for modeling tsunami inundation for breaking and non-breaking long waves. Extension of the model to include sediment transport may be appropriate for long, non-breaking tsunami waves. Using available sediment transport formulations, the sediment deposit thickness at Kuala Meurisi is predicted generally within a factor of 2.

  14. Small scale model for CFD validation in DAF application.

    PubMed

    Hague, J; Ta, C T; Biggs, M J; Sattary, J A

    2001-01-01

    A laboratory model is used to measure the generic flow patterns in dissolved air flotation (DAF). The Perspex model used in this study allows the use of laser Doppler velocimetry (LDV), a non-invasive, high-resolution (±2 mm/s) laser technique for flow velocity measurement. Measurement of flow velocity in the single-phase situation was first carried out. Air-saturated water was then supplied to the tank and measurements of bubble velocity in the two-phase system were made. Vertical flow re-circulation was observed in the flotation zone. In the bottom of the flotation zone (near the riser) secondary flow re-circulation was observed, but only in the two-phase system. Another phenomenon was the apparent movement of flow across the tank width, which may be due to lateral dispersion of the bubble cloud. Data from preliminary computational fluid dynamics (CFD) models were compared against the measured data in the case of the single-phase system. The CFD model incorporating a k-epsilon model of turbulence was found to give closer agreement with the measured data than the corresponding laminar flow model. The measured velocity data will be used to verify two-phase CFD models of DAF. PMID:11394270

  15. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census/demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-to-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
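
    For readers unfamiliar with the compartmental scheme the abstract refers to, the sketch below integrates the aggregate SIR equations (illustrative only; the paper's ABM tracks individual agents rather than these population-level fractions, and all parameter values here are invented).

    ```python
    # Minimal aggregate SIR model; forward-Euler integration of
    # dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I,
    # with S, I, R expressed as population fractions.
    import numpy as np

    def sir(beta, gamma, s0, i0, days, dt=0.1):
        s, i, r = s0, i0, 1.0 - s0 - i0
        out = []
        for step in range(int(days / dt)):
            new_inf = beta * s * i * dt   # new infections this step
            new_rec = gamma * i * dt      # new recoveries this step
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            out.append((step * dt, s, i, r))
        return np.array(out)

    # Example run: R0 = beta/gamma = 2.5, 0.1% of the population initially infected.
    trajectory = sir(beta=0.5, gamma=0.2, s0=0.999, i0=0.001, days=120)
    print("peak infected fraction:", trajectory[:, 2].max().round(3))
    ```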

  16. An Optical Afterglow Model for Bright Linear Type II Supernovae

    NASA Astrophysics Data System (ADS)

    Smith, D. D.; Young, T. R.; Johnson, T. A.

    2005-05-01

    Bright Linear Type II Supernovae exhibit a light curve that has yet to be fully explained. We have applied techniques currently being used to analyze Gamma-Ray Burst light curves to present a two-component model for the Bright Linear SN 1979C. One component is broken power-law emission like that seen in the optical afterglows of GRBs. In the currently accepted model of GRBs, this emission is explained by a jet of material along the rotation axis of the collapsed core. We believe the same general mechanism occurred in SN 1979C. The second component of the light curve is that of a more common Type II Plateau supernova, which makes itself known by a small bump in the light curve about 100 days after the explosion. We produced two fits using two separate models for the underlying supernova. The first used observational data from SN 1969L, a typical Type II Plateau. The second fit used a numerical simulation of a supernova resulting from a star of 300 solar radii ejecting 12 solar masses of material during the explosion. Both techniques fit the data well. This model implies that Bright Linear Type II supernovae have a relativistic jet that produces the afterglow, but is not powerful enough to produce a long-duration gamma-ray burst. This is consistent with theoretical jet simulations and observational radio data.
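
    The "broken power-law emission" component has the generic afterglow form (the usual GRB-afterglow convention, not an equation quoted from this abstract):

    \[
    F(t) \;\propto\;
    \begin{cases}
    t^{-\alpha_1}, & t < t_b,\\[2pt]
    t^{-\alpha_2}, & t \ge t_b,
    \end{cases}
    \]

    with a break time t_b separating the shallow decay (index alpha_1) from the steeper post-break decay (index alpha_2).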

  17. COAL PREPARATION PLANT COMPUTER MODEL: VOLUME II. PROGRAM DOCUMENTATION

    EPA Science Inventory

    The two-volume report describes a steady state modeling system that simulates the performance of coal preparation plants. Program documentation begins in Volume II with a discussion of basic documentation principles, followed by presentation of each routine and common block in te...

  18. Animal models of post-traumatic stress disorder: face validity

    PubMed Central

    Goswami, Sonal; Rodríguez-Sierra, Olga; Cascardi, Michele; Paré, Denis

    2013-01-01

    Post-traumatic stress disorder (PTSD) is a debilitating condition that develops in a proportion of individuals following a traumatic event. Despite recent advances, ethical limitations associated with human research impede progress in understanding PTSD. Fortunately, much effort has focused on developing animal models to help study the pathophysiology of PTSD. Here, we provide an overview of animal PTSD models where a variety of stressors (physical, psychosocial, or psychogenic) are used to examine the long-term effects of severe trauma. We emphasize models involving predator threat because they reproduce human individual differences in susceptibility to, and in the long-term consequences of, psychological trauma. PMID:23754973

  19. Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study

    PubMed Central

    2012-01-01

    Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
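
    For reference, the two-rater form underlying such agreement statistics is Cohen's

    \[
    \kappa \;=\; \frac{p_o - p_e}{1 - p_e},
    \]

    where p_o is the observed proportion of agreement and p_e the agreement expected by chance; the multi-rater situation here (eight verdicts per domain) calls for a generalization such as Fleiss' kappa, which has the same structure.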

  20. Exploration of Pavement Oxidation Model Applications and Field Validation

    E-print Network

    Cui, Yuanchen

    2014-08-11

    oxidation have been investigated, such as oxidation kinetics, asphalt hardening in response to oxidation, pavement design, and environmental conditions. Based on an understanding of those elements, pavement oxidation models have been developed to predict...

  1. Validation of energy harvest modeling for X14 system

    NASA Astrophysics Data System (ADS)

    Finot, Marc; MacDonald, Bob; Lance, Tamir

    2012-10-01

    Skyline Solar has developed a second-generation medium-concentration photovoltaic system with an optical concentration of around 14. The energy harvest model based on the first-generation system has been updated and improved using field data. The model combines a bottom-up modeling approach based on the performance of subcomponents such as mirrors and cells with a top-down approach based on measuring the system output under different environmental conditions. Improvements to the model include the effect of non-uniformity of the light on the panel. The predicted energy ratio (the ratio between the observed energy and the expected energy) has been measured over a 10-month period and shows monthly variability below 2%, resulting in a high confidence level for the mean of the expected energy harvest.

  2. Validation of a Parametric Approach for 3D Fortification Modelling: Application to Scale Models

    NASA Astrophysics Data System (ADS)

    Jacquot, K.; Chevrier, C.; Halin, G.

    2013-02-01

    The parametric modelling approach applied to the virtual representation of cultural heritage is a field of research explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for fortification virtual reconstructions, like plans-reliefs, have several shortcomings when they are scanned. To overcome those problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of the creation of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the available data (i.e. 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components. The results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can then be planned. The virtual model of the fortification is part of a larger project aimed at valorising and diffusing a very unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.

  3. Experimental validation of different modeling approaches for solid particle receivers.

    SciTech Connect

    Khalsa, Siri Sahib S.; Amsbeck, Lars (German Aerospace Center (DLR), Spain and Stuttgart, Germany); Roger, Marc (German Aerospace Center (DLR), Spain and Stuttgart, Germany); Siegel, Nathan Phillip; Kolb, Gregory J.; Buck, Reiner (German Aerospace Center (DLR), Spain and Stuttgart, Germany); Ho, Clifford Kuofei

    2009-07-01

    Solid particle receivers have the potential to provide high-temperature heat for advanced power cycles, thermochemical processes, and thermal storage via direct particle absorption of concentrated solar energy. This paper presents two different models to evaluate the performance of these systems. One model is a detailed computational fluid dynamics model using FLUENT that includes irradiation from the concentrated solar flux, two-band re-radiation and emission within the cavity, discrete-phase particle transport and heat transfer, gas-phase convection, wall conduction, and radiative and convective heat losses. The second model is an easy-to-use and fast simulation code using Matlab that includes solar and thermal radiation exchange between the particle curtain, cavity walls, and aperture, but neglects convection. Both models were compared to unheated particle flow tests and to on-sun heating tests. Comparisons between measured and simulated particle velocities, opacity, particle volume fractions, particle temperatures, and thermal efficiencies were found to be in good agreement. Sensitivity studies were also performed with the models to identify parameters and modifications to improve the performance of the solid particle receiver.

  4. Validation of a two-phase multidimensional polymer electrolyte membrane fuel cell computational model using current distribution measurements

    E-print Network

    Keywords: Validation; Polymer electrolyte membrane fuel cell; Computational model; Current distribution measurements; Uncertainty quantification. Abstract: Validation of computational models for polymer electrolyte membrane...

  5. Validation of mathematical models of complex endocrine-metabolic systems. A case study on a model of glucose regulation

    Microsoft Academic Search

    C. Cobelli; A. Mari

    1983-01-01

    The validation process is an essential component of the modelling of in vivo endocrine and metabolic systems. In the paper a validation study of a comprehensive model of the glucose regulation system, previously developed for intravenous testing, is performed on a new data set based on oral glucose tolerance studies. A novel approach based on a 'partition and input/output inversion' technique...

  6. Model light curves of linear Type II supernovae

    Microsoft Academic Search

    D. A. Swartz; J. C. Wheeler; R. P. Harkness

    1991-01-01

    Light curves computed from hydrodynamic models of supernova explosions are compared graphically to the average observed B and V band light curve of linear Type II supernovae. Models are based on the following explosion scenarios: carbon deflagration with a C+O core near the Chandrasekhar mass, electron capture induced core collapse of an O-Ne-Mg core of the Chandrasekhar mass, and collapse

  8. Development and validation of a tokamak skin effect transformer model

    NASA Astrophysics Data System (ADS)

    Romero, J. A.; Moret, J.-M.; Coda, S.; Felici, F.; Garrido, I.

    2012-02-01

    A lumped parameter, state space model for a tokamak transformer including the slow flux penetration in the plasma (skin effect transformer model) is presented. The model does not require detailed or explicit information about plasma profiles or geometry. Instead, this information is lumped in system variables, parameters and inputs. The model has an exact mathematical structure built from energy and flux conservation theorems, predicting the evolution and non-linear interaction of plasma current and internal inductance as functions of the primary coil currents, plasma resistance, non-inductive current drive and the loop voltage at a specific location inside the plasma (equilibrium loop voltage). The loop voltage profile in the plasma is replaced by a three-point discretization, and ordinary differential equations are used to predict the equilibrium loop voltage as a function of the boundary and resistive loop voltages. This provides a model for equilibrium loop voltage evolution, which is reminiscent of the skin effect. The order and parameters of this differential equation are determined empirically using system identification techniques. Fast plasma current modulation experiments with random binary signals have been conducted in the TCV tokamak to generate the required data for the analysis. Plasma current was modulated under ohmic conditions between 200 and 300 kA with 30 ms rise time, several times faster than its time constant L/R ≈ 200 ms. A second-order linear differential equation for the equilibrium loop voltage is sufficient to describe the plasma current and internal inductance modulation with 70% and 38% fit parameters, respectively. The model explains the most salient features of the plasma current transients, such as the inverse correlation between plasma current ramp rates and internal inductance changes, without requiring detailed or explicit information about resistivity profiles. This proves that a lumped parameter modelling approach can be used to predict the time evolution of bulk plasma properties such as plasma inductance or current with reasonable accuracy, at least under ohmic conditions without external heating and current drive sources.
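
    Schematically (our notation and one plausible realization, not the paper's published equations), a second-order linear relation driving the equilibrium loop voltage V_eq from the boundary and resistive loop voltages V_b and V_r has the form

    \[
    \ddot{V}_{\mathrm{eq}} + a_1 \dot{V}_{\mathrm{eq}} + a_0 V_{\mathrm{eq}} \;=\; b_0 V_{\mathrm{b}} + c_0 V_{\mathrm{r}},
    \]

    with the coefficients a_0, a_1, b_0, c_0 determined empirically from the modulation experiments by system identification.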

  9. The Validity of Dynamical Models of the Solar Atmosphere

    NASA Astrophysics Data System (ADS)

    Kalkofen, Wolfgang

    2012-02-01

    Important results on the structure and dynamics of the nonmagnetic solar chromosphere are based on hydrodynamic models that oversimplify either the geometry of the atmosphere or the interaction of radiation and matter. Although the observed granulation pattern is well reproduced by the three-dimensional (3D) models, oversimplification of radiative relaxation leads to the prediction of temperature fluctuations that are too high (by a factor of 10 to 100) and result in a monotonic decrease with height in the chromosphere of the horizontally and temporally averaged temperature, and hence in the prediction of absorption lines at wavelengths where only emission lines are observed on the Sun. New values of solar abundances of oxygen and other metals are based on 3D hydrodynamic models with temporal and spatial fluctuations that are far greater than those observed. These new abundances destroy the previous agreement of observed modes with acoustic eigenmodes that had been predicted for the old abundances from a solar model for which the sound speed throughout most of the Sun was determined to an accuracy of a few parts in 10^4. One expects that, when radiative relaxation is properly accounted for, 3D models will reproduce the essential characteristics of the solar atmosphere, among them a positive temperature gradient in the outward direction and hence exclusively emission lines in the extreme ultraviolet at all times and positions in the nonmagnetic chromosphere. A minimum characteristic length of 0.1 arcsec is identified for the solar atmosphere, below which there is no significant structure in the actual Sun, only in wave models of the Sun. This criticism does not detract from the notable success of hydrodynamic modeling to explain the mechanism by which chromospheric H2V and K2V bright points are formed.

  10. Validity of the Julliere model of spin-dependent tunneling

    SciTech Connect

    MacLaren, J.M. (Department of Physics, Tulane University, New Orleans, Louisiana 70118, United States); Zhang, X. (Computational Physics and Engineering Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6114, United States); Butler, W.H. (Metals and Ceramics Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6114, United States)

    1997-11-01

    We consider spin-dependent tunneling between two ferromagnets separated by a simple step barrier, and examine four models for the magnetoconductance ratio ΔG/G: a model due to Julliere which characterizes the magnetoconductance solely in terms of the tunneling spin polarization, a model due to Slonczewski which provides an approximate expression for the magnetoconductance of free electrons tunneling through a barrier, the exact expression for the magnetoconductance of free electrons tunneling through a barrier, and the numerical calculation of the magnetoconductance of band electrons in iron tunneling through a barrier. We find that the Julliere model does not accurately describe the magnetoconductance of free electrons tunneling through a barrier. Although Slonczewski's model provides a good approximation to the exact expression for free electrons in the limit of thick barriers, we find that the tunneling of band electrons shows features that are not described well by any free-electron picture and which reflect the details of the band structure of iron at the Fermi energy. © 1997 The American Physical Society
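
    Julliere's expression, standard in the tunneling literature rather than quoted from this abstract, writes the magnetoconductance ratio in terms of the electrode spin polarizations P_1 and P_2 alone:

    \[
    \frac{\Delta G}{G} \;=\; \frac{G_{\mathrm{P}} - G_{\mathrm{AP}}}{G_{\mathrm{P}}} \;=\; \frac{2 P_1 P_2}{1 + P_1 P_2},
    \]

    where G_P and G_AP are the conductances for parallel and antiparallel magnetizations of the two ferromagnets (normalization conventions for ΔG/G vary between papers).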

  11. Validation of road vehicle and traffic emission models - A review and meta-analysis

    NASA Astrophysics Data System (ADS)

    Smit, Robin; Ntziachristos, Leonidas; Boulter, Paul

    2010-08-01

    Road transport is often the main source of air pollution in urban areas, and there is an increasing need to estimate its contribution precisely so that pollution-reduction measures (e.g. emission standards, scrappage programs, traffic management, ITS) are designed and implemented appropriately. This paper presents a meta-analysis of 50 studies dealing with the validation of various types of traffic emission model, including 'average speed', 'traffic situation', 'traffic variable', 'cycle variable', and 'modal' models. The validation studies employ measurements in tunnels, ambient concentration measurements, remote sensing, laboratory tests, and mass-balance techniques. One major finding of the analysis is that several models are only partially validated or not validated at all. The mean prediction errors are generally within a factor of 1.3 of the observed values for CO2, within a factor of 2 for HC and NOx, and within a factor of 3 for CO and PM, although differences as high as a factor of 5 have been reported. A positive mean prediction error for NOx (i.e. overestimation) was established for all model types and practically all validation techniques. In the case of HC, model predictions have been moving from underestimation to overestimation since the 1980s. The large prediction error for PM may be associated with different PM definitions between models and observations (e.g. size, measurement principle, exhaust/non-exhaust contribution). Statistical analyses show that the mean prediction error is generally not significantly different (p < 0.05) when the data are categorised according to model type or validation technique. Thus, there is no conclusive evidence that demonstrates that more complex models systematically perform better in terms of prediction error than less complex models. In fact, less complex models appear to perform better for PM. Moreover, the choice of validation technique does not systematically affect the result, with the exception of a CO underprediction when the validation is based on ambient concentration measurements and inverse modelling. The analysis identified two vital elements currently lacking in traffic emissions modelling: (1) guidance on the allowable error margins for different applications/scales, and (2) estimates of prediction errors. It is recommended that current and future emission models incorporate the capability to quantify prediction errors, and that clear guidelines are developed internationally with respect to expected accuracy.

  12. Some Hamiltonian models of friction II

    SciTech Connect

    Egli, Daniel; Gang Zhou [Institute for Theoretical Physics, ETH Zurich, CH-8093 Zuerich (Switzerland)

    2012-10-15

    In the present paper we consider the motion of a very heavy tracer particle in a medium of a very dense, non-interacting Bose gas. We prove that, in a certain mean-field limit, the tracer particle will be decelerated and come to rest somewhere in the medium. Friction is caused by emission of Cerenkov radiation of gapless modes into the gas. Mathematically, a system of semilinear integro-differential equations, introduced in Froehlich et al. ['Some hamiltonian models of friction,' J. Math. Phys. 52(8), 083508 (2011)], describing a tracer particle in a dispersive medium is investigated, and decay properties of the solution are proven. This work is an extension of Froehlich et al. ['Friction in a model of hamiltonian dynamics,' Commun. Math. Phys. 315(2), 401-444 (2012)]; it is an extension because no weak coupling limit for the interaction between tracer particle and medium is assumed. The technical methods used are dispersive estimates and a contraction principle.

  13. Validation of simulation strategies for the flow in a model propeller turbine during a runaway event

    NASA Astrophysics Data System (ADS)

    Fortin, M.; Houde, S.; Deschênes, C.

    2014-12-01

    Recent research indicates that the useful life of a turbine can be affected by transient events. This study aims to define and validate strategies for the simulation of the flow within a propeller turbine model in runaway conditions. Using unsteady pressure measurements on two runner blades for validation, different strategies are compared and their results analysed in order to quantify their precision. This paper will focus on justifying the choice of the simulation strategies and on the analysis of preliminary results.

  14. Spectral modeling of two inline cylinders with validation in the time domain

    E-print Network

    Oswalt, Aaron Jacob

    1999-01-01

    Spectral Modeling of Two Inline Cylinders with Validation in the Time Domain. A thesis by Aaron Jacob Oswalt, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, May 1999. Major subject: Ocean Engineering.

  15. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  16. Updated Delft Mass Transport model DMT-2: computation and validation

    NASA Astrophysics Data System (ADS)

    Hashemi Farahani, Hassan; Ditmar, Pavel; Inacio, Pedro; Klees, Roland; Guo, Jing; Guo, Xiang; Liu, Xianglin; Zhao, Qile; Didova, Olga; Ran, Jiangjun; Sun, Yu; Tangdamrongsub, Natthachet; Gunter, Brian; Riva, Ricardo; Steele-Dunne, Susan

    2014-05-01

    A number of research centers compute models of mass transport in the Earth's system using primarily K-Band Ranging (KBR) data from the Gravity Recovery And Climate Experiment (GRACE) satellite mission. These models typically consist of a time series of monthly solutions, each of which is defined in terms of a set of spherical harmonic coefficients up to degree 60-120. One such model, the Delft Mass Transport model, release 2 (DMT-2), is computed at the Delft University of Technology (The Netherlands) in collaboration with Wuhan University. An updated variant of this model has been produced recently. A unique feature of the computational scheme designed to compute DMT-2 is the preparation of an accurate stochastic description of data noise in the frequency domain using an Auto-Regressive Moving-Average (ARMA) model, which is derived for each particular month. The benefits of such an approach are a proper frequency-dependent data weighting in the data inversion and an accurate variance-covariance matrix of noise in the estimated spherical harmonic coefficients. Furthermore, the data prior to the inversion are subject to an advanced high-pass filtering, which makes use of a spatially dependent weighting scheme, so that noise is primarily estimated on the basis of data collected over areas with minor mass transport signals (e.g., oceans). On the one hand, this procedure efficiently suppresses noise caused by inaccuracies in satellite orbits; on the other hand, it preserves mass transport signals in the data. Finally, the unconstrained monthly solutions are filtered using a Wiener filter, which is based on estimates of the signal and noise variance-covariance matrices. In combination with a proper data weighting, this noticeably improves the spatial resolution of the monthly gravity models and the associated mass transport models. For instance, the computed solutions allow long-term negative trends to be clearly seen in sufficiently small regions notorious for rapid mass losses, such as the Kangerdlugssuaq and Jakobshavn glaciers in the Greenland ice sheet, as well as the Aral Sea in Central Asia. The updated variant of DMT-2 has been extensively tested and compared with alternative models. A number of regions/processes have been considered for that purpose. In particular, this model has been applied to estimate mass variations in Greenland and Antarctica (both total and for individual ice drainage systems), as well as to improve a hydrological model of the Rhine River basin. Furthermore, a time series of degree-1 coefficients has been derived from the DMT-2 model using the method of Swenson et al. (2008). The obtained results are in good agreement both with alternative GRACE-based models and with independent data, which confirms the high quality of the updated variant of DMT-2.
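
    As a hedged illustration of the ARMA noise description mentioned above (not the DMT-2 processing code; the residual series below is synthetic), the sketch fits a low-order ARMA model to a colored residual series, from which frequency-dependent weights for the inversion could be derived.

    ```python
    # Illustrative only: fit an ARMA(1,1) model to a synthetic, AR(1)-coloured
    # residual series, standing in for monthly KBR post-fit residuals.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)
    residuals = rng.normal(0.0, 1.0, 2000)
    for k in range(1, len(residuals)):
        residuals[k] += 0.7 * residuals[k - 1]   # introduce AR(1) colouring

    # ARMA(p, q) is ARIMA with no differencing (d = 0).
    fit = ARIMA(residuals, order=(1, 0, 1)).fit()
    print(fit.params)
    # The fitted model's spectral density then defines frequency-dependent
    # weights for the least-squares inversion of the monthly solution.
    ```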

  17. Extension and validation of an unsteady wake model for rotors

    NASA Technical Reports Server (NTRS)

    Su, AY; Yoo, Kyung M.; Peters, David A.

    1992-01-01

    A new three-dimensional, finite-state induced-flow model is extended to treat nonlinearities associated with the mass flow induced through the rotor plane. This new theory is then applied to the correlation of a recent set of unsteady, hover laser Doppler velocimetry inflow measurements conducted in the Aeroelastic Rotor Test Chamber at Georgia Institute of Technology. Although the model is intended primarily as a representation of unsteady aerodynamics for aeroelasticity applications, the results show that it has an excellent capability in predicting the inflow distribution in hover except near the root and tip. In addition, the computed unsteady spanwise lift distribution of a rotor is compared with that from an unsteady vortex lattice method for pitch oscillations at various frequencies. The new model is shown to be capable of prediction of unsteady loads typical of aeroelastic response.

  18. Modelling and validation of magnetorheological brake responses using parametric approach

    NASA Astrophysics Data System (ADS)

    Zainordin, A. Z.; Abdullah, M. A.; Hudha, K.

    2013-12-01

    The magnetorheological brake (MR brake) is an x-by-wire system that performs better than conventional brake systems. An MR brake consists of a rotating disc immersed in magnetorheological fluid (MR fluid) within an enclosure of an electromagnetic coil. The applied magnetic field increases the yield strength of the MR fluid, which is used to decrease the speed of the rotating shaft. The purpose of this paper is to develop a mathematical model to represent the MR brake together with a test rig. The MR brake model is developed based on the actual torque characteristic, coupled with the motion of the test rig. Experiments are then performed using the MR brake test rig, yielding three output responses: the angular velocity response, the torque response and the load displacement response. Furthermore, the MR brake was subjected to various currents. Finally, the simulation results of the MR brake model are verified against the experimental results.

  19. Shuttle Space Suit: Fabric/LCVG Model Validation. Chapter 8

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2003-01-01

    A detailed space suit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the space suit are critical to the estimation of exposures and to assessing the risk to the astronaut on EVA. Past evaluations of space suit shielding properties assumed the basic fabric layup (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and LCVG could be homogenized as a single layer, overestimating the protective properties over 60 percent of the fabric area. The present space suit model represents the inhomogeneous distributions of LCVG materials (mainly the water-filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the space suit's protection properties.

  20. On the validity of travel-time based nonlinear bioreactive transport models in steady-state flow.

    PubMed

    Sanz-Prat, Alicia; Lu, Chuanhe; Finkel, Michael; Cirpka, Olaf A

    2015-01-01

    Travel-time based models simplify the description of reactive transport by replacing the spatial coordinates with the groundwater travel time, posing a quasi one-dimensional (1-D) problem and potentially rendering the determination of multidimensional parameter fields unnecessary. While the approach is exact for strictly advective transport in steady-state flow if the reactive properties of the porous medium are uniform, its validity is unclear when local-scale mixing affects the reactive behavior. We compare a two-dimensional (2-D), spatially explicit, bioreactive, advective-dispersive transport model, considered as the "virtual truth", with three 1-D travel-time based models which differ in the conceptualization of longitudinal dispersion: (i) neglecting dispersive mixing altogether, (ii) introducing a local-scale longitudinal dispersivity constant in time and space, and (iii) using an effective longitudinal dispersivity that increases linearly with distance. The reactive system considers biodegradation of dissolved organic carbon, which is introduced into a hydraulically heterogeneous domain together with oxygen and nitrate. Aerobic and denitrifying bacteria use the energy of the microbial transformations for growth. We analyze six scenarios differing in the variance of log-hydraulic conductivity and in the inflow boundary conditions (constant versus time-varying concentration). The concentrations of the 1-D models are mapped to the 2-D domain by means of the kinematic travel time (case i) and the mean groundwater age (cases ii and iii), respectively. The comparison between the concentrations of the "virtual truth" and those of the 1-D approaches indicates extremely good agreement when using an effective, linearly increasing longitudinal dispersivity in the majority of the scenarios, while the other two 1-D approaches at least reproduce the concentration tendencies well. At late times, all 1-D models give valid approximations of two-dimensional transport. We conclude that the conceptualization of nonlinear bioreactive transport in complex multidimensional domains by quasi 1-D travel-time models is valid for steady-state flow fields if the reactants are introduced over a wide cross-section, flow is at quasi steady state, and dispersive mixing is adequately parametrized. PMID:25723340
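
    Schematically (a generic travel-time formulation in our notation, not an equation quoted from the paper), replacing the spatial coordinates by the travel time tau turns the advective-dispersive-reactive balance into a quasi 1-D problem:

    \[
    \frac{\partial c}{\partial t} + \frac{\partial c}{\partial \tau}
    \;=\; \frac{\partial}{\partial \tau}\!\left( D_{\tau} \,\frac{\partial c}{\partial \tau} \right) + r(c),
    \]

    where r(c) collects the bioreactive source and sink terms. Case (i) above corresponds to D_tau = 0, case (ii) to a constant local-scale D_tau, and case (iii) to an effective D_tau that grows linearly along the flow path.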