Sample records for "validation des modeles" (model validation)

  1. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity), and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. DES models add value in modeling complex treatment strategies such as those in glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
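
    As a concrete illustration of the internal-validity step described above (tracing individual patients through the simulation), the following minimal Python sketch shows a heap-based discrete event loop for one hypothetical patient. The event types, rates, and the mean-deviation state variable are illustrative placeholders, not the published model's inputs.

```python
import heapq
import random

# Minimal discrete event simulation sketch: trace a single simulated
# glaucoma patient through progression and treatment-review events.
# All rates and event types are hypothetical placeholders.

def trace_patient(patient_id, horizon_years=10.0, seed=42):
    rng = random.Random(seed)
    mean_deviation = -2.0          # visual-field status (dB), illustrative
    events = []                    # priority queue of (time, event_name)
    heapq.heappush(events, (rng.expovariate(0.5), "progression"))
    heapq.heappush(events, (1.0, "treatment_review"))

    while events:
        clock, event = heapq.heappop(events)
        if clock > horizon_years:
            break
        if event == "progression":
            mean_deviation -= rng.uniform(0.5, 1.5)
            heapq.heappush(events, (clock + rng.expovariate(0.5), "progression"))
        elif event == "treatment_review":
            # Print the patient state at each annual review so the event
            # logic can be checked against prior expectations (debugging).
            print(f"patient {patient_id} t={clock:5.2f}y MD={mean_deviation:6.2f} dB")
            heapq.heappush(events, (clock + 1.0, "treatment_review"))

trace_patient(1)
```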

  2. DES Y1 Results: Validating Cosmological Parameter Estimation Using Simulated Dark Energy Surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacCrann, N.; et al.

    We use mock galaxy survey simulations designed to resemble the Dark Energy Survey Year 1 (DES Y1) data to validate and inform cosmological parameter estimation. When similar analysis tools are applied to both simulations and real survey data, they provide powerful validation tests of the DES Y1 cosmological analyses presented in companion papers. We use two suites of galaxy simulations produced using different methods, which therefore provide independent tests of our cosmological parameter inference. The cosmological analysis we aim to validate is presented in DES Collaboration et al. (2017) and uses angular two-point correlation functions of galaxy number counts and weak lensing shear, as well as their cross-correlation, in multiple redshift bins. While our constraints depend on the specific set of simulated realizations available, for both suites of simulations we find that the input cosmology is consistent with the combined constraints from multiple simulated DES Y1 realizations in the $\Omega_m$-$\sigma_8$ plane. For one of the suites, we are able to show with high confidence that any biases in the inferred $S_8 = \sigma_8 (\Omega_m/0.3)^{0.5}$ and $\Omega_m$ are smaller than the DES Y1 $1\sigma$ uncertainties. For the other suite, for which we have fewer realizations, we are unable to be this conclusive; we infer a roughly 70% probability that systematic biases in the recovered $\Omega_m$ and $S_8$ are sub-dominant to the DES Y1 uncertainty. As cosmological analyses of this kind become increasingly precise, validation of parameter inference using survey simulations will be essential to demonstrate robustness.
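
    The bias criterion described above can be illustrated with a short sketch: given posterior samples of $\Omega_m$ and $\sigma_8$ recovered from a simulated survey, compute $S_8$ and compare its offset from the known input cosmology against the 1-sigma width. The sample arrays and input value below are synthetic placeholders, not DES Y1 chains.

```python
import numpy as np

# Illustrative S_8 bias check on mock posterior samples (synthetic data).
rng = np.random.default_rng(0)
omega_m = rng.normal(0.30, 0.02, 10_000)   # mock posterior samples
sigma_8 = rng.normal(0.80, 0.03, 10_000)

s8 = sigma_8 * (omega_m / 0.3) ** 0.5      # S_8 = sigma_8 (Omega_m/0.3)^0.5
s8_input = 0.80                            # true value used in the mocks

bias = np.mean(s8) - s8_input
print(f"S_8 bias = {bias:+.4f}, 1-sigma = {np.std(s8):.4f}")
print("bias sub-dominant to 1-sigma:", abs(bias) < np.std(s8))
```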

  3. Verification, Validation, and Accreditation (VV&A) of Federations (Verification, validation et accreditation (VV&A) des federations)

    DTIC Science & Technology

    2008-04-01

    The lack of a universal model for the Verification, Validation and Accreditation of federations, owing to differing national perspectives and needs... federation application will depend on a number of factors, including the quality of the requirements information and the resources allocated to the VV&A... required) allocating the required functionality to federates, and developing a detailed plan for federation development and implementation. Step 4

  4. Numerical and experimental study of the vibro-acoustic response of stiffened structures to airborne and structure-borne excitations

    NASA Astrophysics Data System (ADS)

    Mejdi, Abderrazak

    Aircraft fuselages are generally made of aluminium or of composite reinforced by longitudinal stiffeners (stringers) and transverse stiffeners (frames). The stiffeners may be metallic or composite. During the various phases of flight, aircraft structures are subjected to airborne excitations (turbulent boundary layer: TBL; diffuse acoustic field: DAF) on the outer skin, whose acoustic energy is transmitted into the cabin. The engines, mounted on the structure, produce a significant structure-borne excitation. The objectives of this project are to develop and implement strategies for modeling aircraft fuselages subjected to airborne and structure-borne excitations. First, the second chapter reviews existing TBL models in order to classify them better. The vibro-acoustic response properties of finite and infinite flat structures are analyzed. In the third chapter, the assumptions underlying existing models of orthogonally stiffened metallic structures under mechanical, DAF and TBL excitations are first re-examined. A detailed and reliable model of these structures is then developed. The model is validated numerically using the finite element method (FEM) and the boundary element method (BEM). Experimental validation tests are carried out on aircraft panels supplied by aerospace companies. In the fourth chapter, an extension to composite structures reinforced by stiffeners, themselves made of composite and of complex shapes, is established; a simple analytical model is also implemented and validated numerically. In the fifth chapter, the modeling of periodic stiffened composite structures is further refined by accounting for the coupling between in-plane and transverse displacements. The size effect of finite periodic structures is also taken into account.

  5. Validating Human Performance Models of the Future Orion Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.; Walters, Brett; Fairey, Lisa

    2010-01-01

    NASA's Orion Crew Exploration Vehicle (CEV) will provide transportation for crew and cargo to and from destinations in support of the Constellation Architecture Design Reference Missions. Discrete Event Simulation (DES) is one of the design methods NASA employs to assess crew performance for the CEV. During the early development of the CEV, NASA and its prime Orion contractor Lockheed Martin (LM) strove to find an effective, low-cost method for developing and validating human performance DES models. This paper focuses on the method developed while creating a DES model for the CEV Rendezvous, Proximity Operations, and Docking (RPOD) task to the International Space Station. Our approach to validation was to attack the problem from several fronts. First, we began the development of the model early in the CEV design stage. Second, we adhered strictly to M&S development standards. Third, we involved the stakeholders, NASA astronauts, subject matter experts, and NASA's modeling and simulation development community throughout. Fourth, we applied standard and easy-to-conduct methods to ensure the model's accuracy. Lastly, we reviewed the data from an earlier human-in-the-loop RPOD simulation that had different objectives, which provided an additional means of estimating the model's confidence level. The results revealed that a majority of the DES model was a reasonable representation of the current CEV design.

  6. Screening for depression in adolescent paediatric patients: validity of the new Depression Screener for Teenagers (DesTeen).

    PubMed

    Pietsch, Kathrin; Allgaier, Antje-Kathrin; Frühe, Barbara; Rohde, Sabine; Hosie, Stuart; Heinrich, Martina; Schulte-Körne, Gerd

    2011-09-01

    Depression in adolescents is often hard to detect. In many cases paediatricians are the first point of contact. In order to increase recognition rates, screening instruments may be a helpful support for health care professionals. However, there is a lack of valid and economical screening instruments for primary care patients. Thus, the aim of the study was the development of the new Depression Screener for Teenagers (DesTeen) and its validation in a paediatric sample. 326 patients between 13 and 16 years old completed the DesTeen and a diagnostic interview serving as the gold standard. The prevalence rate for any depressive disorder (minor depression, major depression and dysthymia) was 12.6%. Psychometric properties were calculated. As validity measures, the areas under the receiver operating characteristic curves (AUC) for any depressive disorder and for the diagnostic subgroups were computed. The DesTeen showed high reliability (Cronbach's α=.87) and high validity (AUC=.91). For the diagnostic subgroups, AUC values did not differ significantly from the overall accuracy for any depressive disorder (major depression: AUC=.95, p=.179; dysthymia: AUC=.88, p=.605; minor depression: AUC=.87, p=.327). The optimal cut-off point for any depressive disorder according to the Youden index yielded a sensitivity of .90 and a specificity of .80. An abbreviated 5-item version of the DesTeen showed no loss in validity (AUC=.90, p=.695). Overall, the DesTeen can be regarded as a valid screening instrument for adolescent paediatric patients. For practical use, the 5-item version is even more promising. A replication of these results is essential. Copyright © 2011 Elsevier B.V. All rights reserved.
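
    The Youden-index cut-off selection reported above can be sketched in a few lines: scan candidate cut-offs, compute sensitivity and specificity against the interview diagnoses, and keep the cut-off maximizing J = sensitivity + specificity - 1. The scores and diagnoses below are synthetic stand-ins, not DesTeen data.

```python
import numpy as np

# Youden-index optimal cut-off on synthetic screener scores.
rng = np.random.default_rng(1)
scores = np.concatenate([rng.normal(10, 4, 285), rng.normal(20, 4, 41)])
depressed = np.concatenate([np.zeros(285, bool), np.ones(41, bool)])

best_j, best_cut = -1.0, None
for cut in np.unique(scores):
    pred = scores >= cut
    sens = np.mean(pred[depressed])        # true positive rate
    spec = np.mean(~pred[~depressed])      # true negative rate
    j = sens + spec - 1.0                  # Youden's J statistic
    if j > best_j:
        best_j, best_cut = j, cut

print(f"optimal cut-off = {best_cut:.1f} (Youden J = {best_j:.2f})")
```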

  7. Validation of mechanical models for reinforced concrete structures: Presentation of the French project "Benchmark des Poutres de la Rance"

    NASA Astrophysics Data System (ADS)

    L'Hostis, V.; Brunet, C.; Poupard, O.; Petre-Lazar, I.

    2006-11-01

    Several ageing models are available for the prediction of the mechanical consequences of rebar corrosion. They are used for service life prediction of reinforced concrete structures. Concerning corrosion diagnosis of reinforced concrete, some Non-Destructive Testing (NDT) tools have been developed and have been in use for some years. However, these developments require validation on existing concrete structures. The French project "Benchmark des Poutres de la Rance" contributes to this aspect. It has two main objectives: (i) validation of mechanical models to estimate the influence of rebar corrosion on the load-bearing capacity of a structure, and (ii) qualification of the use of NDT results to collect information on steel corrosion within reinforced-concrete structures. Ten French and European institutions from both academic research laboratories and industrial companies contributed during the years 2004 and 2005. This paper presents the project, which was divided into several work packages: (i) the reinforced concrete beams were characterized with non-destructive testing tools, (ii) the mechanical behaviour of the beams was tested experimentally, (iii) complementary laboratory analyses were performed, and (iv) finally, numerical simulation results were compared with the experimental results obtained from the mechanical tests.

  8. When to use discrete event simulation (DES) for the economic evaluation of health technologies? A review and critique of the costs and benefits of DES.

    PubMed

    Karnon, Jonathan; Haji Ali Afzali, Hossein

    2014-06-01

    Modelling in economic evaluation is an unavoidable fact of life. Cohort-based state transition models are most common, though discrete event simulation (DES) is increasingly being used to implement more complex model structures. The benefits of DES relate to the greater flexibility around the implementation and population of complex models, which may provide more accurate or valid estimates of the incremental costs and benefits of alternative health technologies. The costs of DES relate to the time and expertise required to implement and review complex models, when perhaps a simpler model would suffice. The costs are not borne solely by the analyst, but also by reviewers. In particular, modelled economic evaluations are often submitted to support reimbursement decisions for new technologies, for which detailed model reviews are generally undertaken on behalf of the funding body. This paper reports the results from a review of published DES-based economic evaluations. Factors underlying the use of DES were defined, and the characteristics of applied models were considered, to inform options for assessing the potential benefits of DES in relation to each factor. Four broad factors underlying the use of DES were identified: baseline heterogeneity, continuous disease markers, time-varying event rates, and the influence of prior events on subsequent event rates. If relevant individual-level data are available, representation of the four factors is likely to improve model validity, and it is possible to assess the importance of their representation in individual cases. A thorough model performance evaluation is required to overcome the costs of DES from the users' perspective, but few of the reviewed DES models reported such a process. More generally, further direct, empirical comparisons of complex models with simpler models would better inform the benefits of using DES to implement more complex models, and the circumstances in which such benefits are most likely.

  9. Modeling Clinical Outcomes in Prostate Cancer: Application and Validation of the Discrete Event Simulation Approach.

    PubMed

    Pan, Feng; Reifsnider, Odette; Zheng, Ying; Proskorovsky, Irina; Li, Tracy; He, Jianming; Sorensen, Sonja V

    2018-04-01

    The treatment landscape in prostate cancer has changed dramatically with the emergence of new medicines in the past few years. The traditional survival partition model (SPM) cannot accurately predict long-term clinical outcomes because it is limited in its ability to capture the key consequences associated with this changing treatment paradigm. The objective of this study was to introduce and validate a discrete-event simulation (DES) model for prostate cancer. A DES model was developed to simulate overall survival (OS) and other clinical outcomes based on patient characteristics, treatment received, and disease progression history. We tested and validated this model with clinical trial data from the abiraterone acetate phase III trial (COU-AA-302). The model was constructed with interim data (55% deaths) and validated with the final data (96% deaths). Predicted OS values were also compared with those from the SPM. The DES model's predicted time to chemotherapy and OS are highly consistent with the final observed data. The model accurately predicts the OS hazard ratio from the final data cut (predicted: 0.74; 95% confidence interval [CI] 0.64-0.85; final actual: 0.74; 95% CI 0.6-0.88). A log-rank test comparing the observed and predicted OS curves indicated no statistically significant difference between them. However, the predictions from the SPM based on interim data deviated significantly from the final data. Our study showed that a DES model with properly developed risk equations offers considerable improvements over the more traditional SPM in flexibility and in the predictive accuracy of long-term outcomes. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
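
    A hedged sketch of the validation step described above: compare observed and DES-predicted overall-survival times with a log-rank test, here via the lifelines library. The survival arrays are synthetic placeholders, not COU-AA-302 data.

```python
import numpy as np
from lifelines.statistics import logrank_test

# Log-rank comparison of observed vs. model-predicted survival (toy data).
rng = np.random.default_rng(2)
t_observed = rng.exponential(30.0, 500)     # months, observed arm
e_observed = rng.random(500) < 0.96         # ~96% deaths at final data cut
t_predicted = rng.exponential(31.0, 500)    # DES-simulated survival times
e_predicted = rng.random(500) < 0.96

result = logrank_test(t_observed, t_predicted,
                      event_observed_A=e_observed,
                      event_observed_B=e_predicted)
print(f"log-rank p = {result.p_value:.3f}")  # large p: curves consistent
```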

  10. Development of a Regional Climate Model: FIZR Simulation of January Conditions on the North American West Coast

    NASA Astrophysics Data System (ADS)

    Goyette, Stephane

    1995-11-01

    This thesis concerns regional climate numerical modeling. The main objective is to develop a regional climate model capable of simulating mesoscale phenomena. Our study area is the North American West Coast, chosen because of the complexity of its relief and the control it exerts on the climate. The motivations for this study are multiple: on the one hand, we cannot, in practice, increase the coarse spatial resolution of atmospheric general circulation models (GCMs) without inflating computational costs excessively; on the other hand, environmental management increasingly demands regional climate data determined at finer spatial resolution. Until now, GCMs have been the models most valued for their ability to simulate the climate and global climate change. However, fine-scale climate phenomena still escape GCMs because of their coarse spatial resolution. Moreover, the socio-economic repercussions of possible climate change are closely tied to phenomena imperceptible to current GCMs. To circumvent certain problems inherent in resolution, a practical approach is to take a limited spatial domain of a GCM and nest within it another numerical model with a high-resolution grid. This nesting process then involves a new numerical simulation. This "retro-simulation" is guided within the restricted domain by information supplied by the GCM and driven by mechanisms handled solely by the nested model. Thus, in order to refine the spatial precision of large-scale climate predictions, we develop here a numerical model called FIZR, which provides regional climate information valid at the fine spatial scale

  11. Identification of the Cessna Citation X engine parameters in the cruise phase from flight tests using neural networks

    NASA Astrophysics Data System (ADS)

    Zaag, Mahdi

    The availability of accurate aircraft models is among the key elements for improving aircraft. These models serve to improve flight controls and to design new aerodynamic systems for morphing-wing aircraft. This project consists of designing a system to identify certain engine model parameters of the American business jet Cessna Citation X in the cruise phase from flight tests. These tests were performed on the flight simulator designed and manufactured by CAE Inc., which holds a Level D flight dynamics qualification, the highest fidelity level granted by the FAA, the US federal civil aviation authority. A methodology based on neural networks optimized with an algorithm called the "extended great deluge" was used to design this identification system. Several flight tests at different altitudes and Mach numbers were performed to serve as databases for training the neural networks. The model was validated using simulator data. Despite the nonlinearity and complexity of the system, the engine parameters were predicted very well within a given flight envelope. This estimated model could be used for engine performance analyses and could support aircraft control during the cruise phase. Engine parameter identification could also be carried out for the climb and descent phases in order to obtain a complete model over the whole flight envelope of the Cessna Citation X (climb, cruise, descent). The method employed in this work could also be effective for building a model identifying the aerodynamic coefficients of the same aircraft, again from flight tests.
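
    A minimal sketch of the kind of identification scheme described above: a feed-forward neural network mapping cruise conditions (altitude, Mach number) to an engine parameter, trained and validated on held-out data. The thesis optimizes its networks with the "extended great deluge" algorithm; the sketch below instead uses scikit-learn's standard training, and all data and the target name are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
altitude = rng.uniform(30_000, 45_000, 400)   # ft, cruise envelope
mach = rng.uniform(0.6, 0.9, 400)
# Toy engine parameter (e.g. a normalized fan speed) -- hypothetical target.
fan_speed = 0.5 * mach + 1e-5 * altitude + rng.normal(0.0, 0.01, 400)

X = np.column_stack([altitude, mach])
X_train, X_test, y_train, y_test = train_test_split(
    X, fan_speed, test_size=0.25, random_state=0)

# Scale inputs, then fit a small feed-forward network.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,),
                                   max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print(f"held-out R^2 = {model.score(X_test, y_test):.3f}")
```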

  12. Using XML to encode TMA DES metadata.

    PubMed

    Lyttleton, Oliver; Wright, Alexander; Treanor, Darren; Lewis, Paul

    2011-01-01

    The Tissue Microarray Data Exchange Specification (TMA DES) is an XML specification for encoding TMA experiment data. While TMA DES data is encoded in XML, the files that describe its syntax, structure, and semantics are not. The DTD format is used to describe the syntax and structure of TMA DES, and the ISO 11179 format is used to define the semantics of TMA DES. However, XML Schema can be used in place of DTDs, and another XML encoded format, RDF, can be used in place of ISO 11179. Encoding all TMA DES data and metadata in XML would simplify the development and usage of programs which validate and parse TMA DES data. XML Schema has advantages over DTDs such as support for data types, and a more powerful means of specifying constraints on data values. An advantage of RDF encoded in XML over ISO 11179 is that XML defines rules for encoding data, whereas ISO 11179 does not. We created an XML Schema version of the TMA DES DTD. We wrote a program that converted ISO 11179 definitions to RDF encoded in XML, and used it to convert the TMA DES ISO 11179 definitions to RDF. Using our XML Schema, we validated a sample TMA DES XML file that was supplied with the publication that originally specified TMA DES. We successfully validated the RDF produced by our ISO 11179 converter with the W3C RDF validation service. All TMA DES data could be encoded using XML, which simplifies its processing. XML Schema allows datatypes and valid value ranges to be specified for CDEs, which enables a wider range of error checking to be performed using XML Schemas than could be performed using DTDs.
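
    A minimal sketch of the XML Schema validation workflow described above, using the lxml library; the file names are hypothetical placeholders.

```python
from lxml import etree

# Validate a TMA DES document against an XML Schema instead of a DTD.
# "tma_des.xsd" and "sample_tma_des.xml" are hypothetical file names.
schema = etree.XMLSchema(etree.parse("tma_des.xsd"))
doc = etree.parse("sample_tma_des.xml")

if schema.validate(doc):
    print("document conforms to the TMA DES XML Schema")
else:
    # error_log lists each violation with a line number, e.g. a CDE value
    # outside its declared range -- a check a DTD cannot express.
    for error in schema.error_log:
        print(error.line, error.message)
```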

  13. Using XML to encode TMA DES metadata

    PubMed Central

    Lyttleton, Oliver; Wright, Alexander; Treanor, Darren; Lewis, Paul

    2011-01-01

    Background: The Tissue Microarray Data Exchange Specification (TMA DES) is an XML specification for encoding TMA experiment data. While TMA DES data is encoded in XML, the files that describe its syntax, structure, and semantics are not. The DTD format is used to describe the syntax and structure of TMA DES, and the ISO 11179 format is used to define the semantics of TMA DES. However, XML Schema can be used in place of DTDs, and another XML encoded format, RDF, can be used in place of ISO 11179. Encoding all TMA DES data and metadata in XML would simplify the development and usage of programs which validate and parse TMA DES data. XML Schema has advantages over DTDs such as support for data types, and a more powerful means of specifying constraints on data values. An advantage of RDF encoded in XML over ISO 11179 is that XML defines rules for encoding data, whereas ISO 11179 does not. Materials and Methods: We created an XML Schema version of the TMA DES DTD. We wrote a program that converted ISO 11179 definitions to RDF encoded in XML, and used it to convert the TMA DES ISO 11179 definitions to RDF. Results: Using our XML Schema, we validated a sample TMA DES XML file that was supplied with the publication that originally specified TMA DES. We successfully validated the RDF produced by our ISO 11179 converter with the W3C RDF validation service. Conclusions: All TMA DES data could be encoded using XML, which simplifies its processing. XML Schema allows datatypes and valid value ranges to be specified for CDEs, which enables a wider range of error checking to be performed using XML Schemas than could be performed using DTDs. PMID:21969921

  14. Design, fabrication and validation of a low-pressure injection press for the metal powder injection molding process

    NASA Astrophysics Data System (ADS)

    Lamarre, Simon G.

    Low-pressure powder injection molding is a technique for forming parts of complex shape. The metal powder is mixed with low-viscosity polymers (e.g. wax) to form a homogeneous feedstock at a temperature above the melting point of the polymers. To ease injection into the mold cavity, the feedstock composition is adjusted to lower its viscosity. On the one hand, low-viscosity feedstocks have good moldability. On the other hand, segregation appears quickly in low-viscosity feedstocks. Commercial machines are fitted with an injection channel and a valve connecting the feedstock reservoir to the mold cavity. The feedstock sits stationary in these components between two injection sequences, which makes it prone to segregation. Several patents attempt to solve this problem using pumps and recirculation channels, but these components are difficult to clean because of their complexity. A low-pressure injection machine was therefore designed and built to study the moldability of very low viscosity feedstocks (e.g. 0.1 Pa·s), taking segregation and cleaning constraints into account. An injection piston draws the desired volume from a reservoir. A lateral motion then shears the feedstock at the intersection between the reservoir and the cylinder and blocks the reservoir outlet. The cylinder is cleared and can receive the mold. After injection, the piston returns to the reservoir position and enters its outlet; the residual feedstock is returned to the reservoir, mixed, and de-aerated again. The machine was validated by injectability tests with a mixture of stainless steel powder and low-viscosity binders. Injection tests showed that the feedstock containing stearic acid traveled the greatest distance in the mold of

  15. Validation of indicators for the management of cognitive impairment in geriatric assessment units

    PubMed Central

    Payot, Isabelle; Latour, Judith; Massoud, Fadi; Kergoat, Marie-Jeanne

    2007-01-01

    ABSTRACT OBJECTIVE To analyze and adapt quality indicators for the assessment and management of persons with cognitive impairment, whose prevalence is very high in geriatric assessment units in Quebec. DESIGN A modified Delphi method. SETTING Province of Quebec. PARTICIPANTS Seven practitioners from hospitals affiliated with 3 Quebec universities, chosen for their recognized expertise in dementia and geriatric care. METHOD Among the indicators developed in 2001 by the RAND method, 22 items selected for their relevance to the process of assessing and managing cognitive impairment were adapted to the practice conditions of Quebec hospitals. The indicators, accompanied by evidence from the literature, were mailed to a panel of experts. Each expert rated, on a scale of 1 to 9, their degree of agreement with statements concerning validity, quality, and the necessity of being recorded in the medical chart. To be retained, an indicator had to reach consensus on the median values, fall in the upper tertile, and receive the experts' approval. Uncertain indicators were modified according to the experts' comments and then submitted to the same panel for a second round. RESULTS Of the 22 indicators submitted in the first round, 21 were validated. They covered screening, investigation, assessment, treatment, and follow-up. The indicator judged uncertain was modified and then accepted in the second round. CONCLUSION This study identified 22 relevant indicators for the assessment and management of cognitive impairment in a geriatric assessment unit. They will serve as a basis for appraising the problem of dementia in an ongoing study

  16. DES Prediction of Cavitation Erosion and Its Validation for a Ship Scale Propeller

    NASA Astrophysics Data System (ADS)

    Ponkratov, Dmitriy

    2015-12-01

    Lloyd's Register Technical Investigation Department (LR TID) has developed numerical functions for the prediction of cavitation erosion aggressiveness within Computational Fluid Dynamics (CFD) simulations. These functions were previously validated for a model-scale hydrofoil and a ship-scale rudder [1]. For the current study the functions were applied to a cargo ship's full-scale propeller, on which severe cavitation erosion had been reported. The Detached Eddy Simulation (DES) performed required a fine computational mesh (approximately 22 million cells) together with a very small time step (2.0E-4 s). As the cavitation for this type of vessel is primarily caused by a highly non-uniform wake, the hull was also included in the simulation. The applied method underpredicted the cavitation extent and did not fully resolve the tip vortex; however, the areas of cavitation collapse were captured successfully. Consequently, the developed functions showed a very good prediction of erosion areas, as confirmed by comparison with underwater propeller inspection results.

  17. Modeling of micrometric and nanometric particle emissions in machining

    NASA Astrophysics Data System (ADS)

    Khettabi, Riad

    Shaping parts by machining emits particles of microscopic and nanometric size that can be hazardous to health. The goal of this work is to study these particle emissions for the purposes of prevention and reduction at the source. The approach is experimental and theoretical, at both the microscopic and macroscopic scales. The work begins with tests to determine the influence of the workpiece material, the tool, and the machining parameters on particle emissions. Next, a new parameter characterizing the emissions, named the "dust unit", is developed and a predictive model is proposed. This model is based on a new hybrid theory that integrates energy, tribology, and plastic-deformation approaches, and includes tool geometry, material properties, cutting conditions, and chip segmentation. It was validated in turning on four materials: Al6061-T6, AISI 1018, AISI 4140, and grey cast iron.

  18. Integrating science and language: creation and testing of a pedagogical model to improve science learning in francophone minority settings

    NASA Astrophysics Data System (ADS)

    Cormier, Marianne

    The weak science results of students from francophone minority settings on national and international assessments prompted a search for solutions. The purpose of this thesis was to create and test a pedagogical model for teaching science in a linguistic minority setting. Because students in this setting show varying degrees of French-language proficiency, several language elements (writing, discussion, and reading) were integrated into science learning. We recommended beginning the learning process with fairly informal language elements (journal writing, discussions in pairs...) and progressing toward more formal language activities (writing scientific reports or explanations). For science learning, the model advocated a socio-constructivist conceptual-change approach while relying strongly on experiential learning. In testing the model, we wanted to know whether it produced conceptual change in the students and whether, at the same time, their scientific vocabulary was enriched. We also sought to understand how the students experienced their learning within this pedagogical model. A fifth-grade class at the Grande-Digue school, in southeastern New Brunswick, took part in the trial of the model by studying the local salt marshes. In initial interviews, we noticed that the students' knowledge of salt marshes was limited. Although they were aware that the marshes were natural places, they could not necessarily describe them precisely. We also found that the students mostly used common words (plants, birds, insects) to describe the marsh. The results obtained indicate that the students

  19. Normal-state properties of the two-dimensional Hubbard model

    NASA Astrophysics Data System (ADS)

    Lemay, Francois

    Since their discovery, experimental studies have shown that high-temperature superconductors have a very strange normal phase. The properties of these materials are not well described by Fermi liquid theory. The two-dimensional Hubbard model, although not yet solved, is still considered a candidate for explaining the physics of these compounds. In this work, we highlight several electronic properties of the model that are incompatible with the existence of quasiparticles. We show in particular that the susceptibility of free electrons on a lattice contains logarithmic singularities that decisively influence the low-frequency properties of the self-energy. These singularities are responsible for the destruction of quasiparticles. In the absence of antiferromagnetic fluctuations, they are also responsible for the existence of a small pseudogap in the spectral weight at the Fermi level. The properties of the model are also studied for a Fermi surface similar to that of the high-temperature superconductors. A parallel is drawn between certain characteristics of the model and those of these materials.

  20. Measuring Diversity and Inclusion in Academic Medicine: The Diversity Engagement Survey (DES)

    PubMed Central

    Person, Sharina D.; Jordan, C. Greer; Allison, Jeroan J.; Fink Ogawa, Lisa M.; Castillo-Page, Laura; Conrad, Sarah; Nivet, Marc A.; Plummer, Deborah L.

    2018-01-01

    Purpose To produce a physician and scientific workforce capable of delivering high-quality, culturally competent health care and research, academic medical centers must assess their capacity for diversity and inclusion and respond to identified opportunities. Thus, the Diversity Engagement Survey (DES) is presented as a diagnostic and benchmarking tool. Method The 22-item DES connects workforce engagement theory with inclusion and diversity constructs. Face and content validity were established based on decades of previous work to promote institutional diversity. The survey was pilot tested at a single academic medical center and subsequently administered at 13 additional academic medical centers. Cronbach alphas assessed internal consistency, and Confirmatory Factor Analysis (CFA) established construct validity. Criterion validity was assessed by the observed separation in scores for groups traditionally recognized to have less workforce engagement. Results The sample consisted of 13,694 individuals at 14 medical schools from across the U.S. who responded to the survey administered between 2011 and 2012. The Cronbach alphas for inclusion and engagement factors (range: 0.68 to 0.85), CFA fit indices, and item correlations with latent constructs indicated an acceptable model fit and that questions measured the intended concepts. DES scores clearly distinguished higher- and lower-performing institutions. The DES detected important disparities for black respondents, women, and respondents who did not report a heterosexual orientation. Conclusions This study demonstrated that the DES is a reliable and valid instrument for internal assessment and evaluation or external benchmarking of institutional progress in building inclusion and engagement. PMID:26466376
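
    The internal-consistency figures quoted above (Cronbach's alpha) can be reproduced on any item matrix in a few lines of NumPy; the response matrix below is a synthetic placeholder, not DES data.

```python
import numpy as np

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
rng = np.random.default_rng(4)
latent = rng.normal(0, 1, (200, 1))                     # shared trait
items = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (200, 6))), 1, 5)

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```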

  1. Development and validation of clinical prediction models for mortality, functional outcome and cognitive impairment after stroke: a study protocol

    PubMed Central

    Fahey, Marion; Rudd, Anthony; Béjot, Yannick; Wolfe, Charles; Douiri, Abdel

    2017-01-01

    Introduction Stroke is a leading cause of adult disability and death worldwide. The neurological impairments associated with stroke prevent patients from performing basic daily activities and have an enormous impact on families and caregivers. Practical and accurate tools to assist in predicting outcome after stroke at patient level can provide significant aid for patient management. Furthermore, prediction models of this kind can be useful for clinical research, health economics, policymaking and clinical decision support. Methods 2869 patients with first-ever stroke from the South London Stroke Register (SLSR) (1995–2004) will be included in the development cohort. We will use information captured after baseline to construct multilevel models and a Cox proportional hazards model to predict cognitive impairment, functional outcome and mortality up to 5 years after stroke. Repeated random subsampling validation (Monte Carlo cross-validation) will be evaluated in model development. Data from participants recruited to the stroke register (2005–2014) will be used for temporal validation of the models. Data from participants recruited to the Dijon Stroke Register (1985–2015) will be used for external validation. Discrimination, calibration and clinical utility of the models will be presented. Ethics Patients, or their relatives for patients who cannot consent, gave written informed consent to participate in stroke-related studies within the SLSR. The SLSR design was approved by the ethics committees of Guy’s and St Thomas’ NHS Foundation Trust, Kings College Hospital, Queens Square and Westminster Hospitals (London). The Dijon Stroke Registry was approved by the Comité National des Registres and the InVS and has authorisation of the Commission Nationale de l’Informatique et des Libertés. PMID:28821511
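
    A hedged sketch of the modelling plan described above: fit a Cox proportional hazards model and evaluate it by repeated random subsampling (Monte Carlo cross-validation), here with the lifelines and scikit-learn libraries. The data frame and column names are hypothetical stand-ins for register data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.model_selection import ShuffleSplit

# Synthetic stand-in for register data; column names are hypothetical.
rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "age": rng.normal(70, 10, n),
    "severity": rng.integers(0, 4, n),
    "time_to_death": rng.exponential(5.0, n),   # years of follow-up
    "died": rng.random(n) < 0.6,
})

# Monte Carlo cross-validation: repeated random train/test splits.
scores = []
for train_idx, test_idx in ShuffleSplit(n_splits=20, test_size=0.3,
                                        random_state=0).split(df):
    cph = CoxPHFitter()
    cph.fit(df.iloc[train_idx], duration_col="time_to_death", event_col="died")
    # Concordance index on held-out data measures discrimination.
    scores.append(cph.score(df.iloc[test_idx],
                            scoring_method="concordance_index"))

print(f"mean held-out C-index = {np.mean(scores):.3f}")
```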

  2. Development and validation of clinical prediction models for mortality, functional outcome and cognitive impairment after stroke: a study protocol.

    PubMed

    Fahey, Marion; Rudd, Anthony; Béjot, Yannick; Wolfe, Charles; Douiri, Abdel

    2017-08-18

    Stroke is a leading cause of adult disability and death worldwide. The neurological impairments associated with stroke prevent patients from performing basic daily activities and have an enormous impact on families and caregivers. Practical and accurate tools to assist in predicting outcome after stroke at patient level can provide significant aid for patient management. Furthermore, prediction models of this kind can be useful for clinical research, health economics, policymaking and clinical decision support. 2869 patients with first-ever stroke from the South London Stroke Register (SLSR) (1995-2004) will be included in the development cohort. We will use information captured after baseline to construct multilevel models and a Cox proportional hazards model to predict cognitive impairment, functional outcome and mortality up to 5 years after stroke. Repeated random subsampling validation (Monte Carlo cross-validation) will be evaluated in model development. Data from participants recruited to the stroke register (2005-2014) will be used for temporal validation of the models. Data from participants recruited to the Dijon Stroke Register (1985-2015) will be used for external validation. Discrimination, calibration and clinical utility of the models will be presented. Patients, or their relatives for patients who cannot consent, gave written informed consent to participate in stroke-related studies within the SLSR. The SLSR design was approved by the ethics committees of Guy's and St Thomas' NHS Foundation Trust, Kings College Hospital, Queens Square and Westminster Hospitals (London). The Dijon Stroke Registry was approved by the Comité National des Registres and the InVS and has authorisation of the Commission Nationale de l'Informatique et des Libertés. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  3. A hybrid heat-exchanger storage unit for the simultaneous management of solar and electric energy

    NASA Astrophysics Data System (ADS)

    Ait Hammou, Zouhair

    This study concerns the design of a hybrid heat-exchanger storage unit (AECH) for the simultaneous management of solar and electric energy. A mathematical model based on energy conservation equations is presented. It is developed to test different storage materials, among them solid/liquid phase change materials and sensible-heat storage materials. A computer code is implemented and then validated against analytical and numerical results from the literature. In parallel, a reduced-scale experimental prototype was built in the laboratory to validate the code. Simulations were run to study the effects of design parameters and storage materials on the thermal behaviour of the AECH and on electricity consumption. Simulation results over four winter months show that n-octadecane paraffin and capric acid are two desirable candidates for energy storage for residential heating. Using these two materials in the AECH reduces electricity consumption by 32% and flattens the peak-demand problem, since 90% of the electricity is consumed during off-peak hours. Moreover, under a preferential tariff, the calculation of electricity costs shows that a consumer adopting this system benefits from a 50% reduction in the electricity bill.

  4. Groundwater Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures, with the results indicating they are appropriate measures for evaluating model realizations. The use of
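
    One way to picture the acceptance step for stochastic realizations discussed above is sketched below: score each realization against field validation data and count how many conform within a tolerance. The RMSE metric, thresholds, and arrays are illustrative assumptions, not the published measures.

```python
import numpy as np

# Score 200 hypothetical model realizations against field validation data.
rng = np.random.default_rng(6)
field_heads = rng.normal(100.0, 2.0, 12)             # validation wells (m)
realizations = rng.normal(100.0, 2.5, (200, 12))     # simulated heads

# Accept a realization if its RMSE against the field data is within
# a tolerance (3 m here, an illustrative choice).
rmse = np.sqrt(np.mean((realizations - field_heads) ** 2, axis=1))
accepted = rmse < 3.0

frac = accepted.mean()
print(f"{accepted.sum()} of {len(rmse)} realizations accepted ({frac:.0%})")
print("sufficient confidence:", frac >= 0.7)         # example decision rule
```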

  5. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information, which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural

  6. Development of new micromechanical approaches for optimizing the mechanical performance of heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Aboutajeddine, Ahmed

    This work considers scale-transition micromechanical models that determine the effective properties of heterogeneous materials from their microstructure. The objective is to account for the presence of an interphase between the matrix and the reinforcement in classical micromechanical models, and to reconsider the basic approximations of these models in order to treat multiphase materials. A new micromechanical model is then proposed to account for a thin elastic interphase when determining effective properties. This model is built from the integral equation, Hill's interfacial operators, and the Mori-Tanaka method. The expressions obtained for the overall moduli and for the fields in the coating are analytical in nature. The model's basic approximation is subsequently improved in a new model addressing coated inclusions with thin or thick coatings. The resolution relies on a double homogenization performed at the level of the coated inclusion and of the material. This new approach makes it possible to fully apprehend the implications of the modeling approximations. The results obtained are then exploited in the solution of the Hashin assembly. Several classical micromechanical models of different origins are thus unified and connected, in this work, to Hashin's geometric representation. Besides making it possible to fully appreciate the pertinence of each model's approximation within this single view, the correct extension of these models to multiphase materials becomes possible. Several analytical and explicit models are then proposed, following solutions of different orders of the Hashin assembly. One of the explicit models appears as a direct correction of the Mori-Tanaka model in cases where the latter fails to

  7. Application of Fiber Optic Instrumentation (Validation des systemes d’instrumentation a fibres optiques)

    DTIC Science & Technology

    2012-07-01

    SCI-228) Executive Summary: This AGARDograph presents an introduction to fiber optic systems and is intended to provide a basic understanding of... 22 / SCI-228) Synthesis: This AGARDograph presents an introduction to fiber optic systems and aims to explain the use of these... flight test instrumentation across the NATO member nations, bringing about the progressive disappearance of strain gauges and of

  8. Littoral Infrared Ship Self Defence Technology Studies (Autodefense cotiere infrarouge des navires etudes technologiques)

    DTIC Science & Technology

    2014-05-01

    ...simulation capability of the Naval Threat Countermeasures Simulator so as to include decoys and missile seekers; 4) A... in the littoral; 2) Detection of small surface targets in the littoral; 3) Improvement and validation of the modelling and of the code... further improvement and validation of the modelling and of the

  9. Validation and optimization of SST k-ω turbulence model for pollutant dispersion within a building array

    NASA Astrophysics Data System (ADS)

    Yu, Hesheng; Thé, Jesse

    2016-11-01

    The prediction of the dispersion of air pollutants in urban areas is of great importance to public health, homeland security, and environmental protection. Computational Fluid Dynamics (CFD) has emerged as an effective tool for pollutant dispersion modelling. This paper quantitatively validates, for the first time, the shear stress transport (SST) k-ω turbulence closure model and its transitional variant for pollutant dispersion in a complex urban environment. Sensitivity analysis is performed to establish recommendations for the proper use of turbulence models in urban settings. The current SST k-ω simulation is validated rigorously against extensive experimental data using the hit rate for velocity components, and the "factor of two" of observations (FAC2) and fractional bias (FB) for the concentration field. The simulation results show that the current SST k-ω model predicts the flow field well, with an overall hit rate of 0.870, and the concentration dispersion with FAC2 = 0.721 and FB = 0.045. The flow simulation of the current SST k-ω model is slightly inferior to that of a detached eddy simulation (DES) but better than that of the standard k-ε model. However, the current study performs best among these three model approaches when validated against measurements of pollutant dispersion in the atmosphere. This work aims to provide recommendations for the proper use of CFD to predict pollutant dispersion in urban environments.
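
    The two concentration metrics used above are easy to state precisely: FAC2 is the fraction of predictions within a factor of two of the observations, and FB is the fractional bias of the means. A short sketch with synthetic paired concentrations:

```python
import numpy as np

def fac2(obs, pred):
    # Fraction of predictions within a factor of two of observations.
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

def fractional_bias(obs, pred):
    # FB = (mean_obs - mean_pred) / (0.5 * (mean_obs + mean_pred)).
    return (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))

# Synthetic paired observed/predicted concentrations (placeholders).
rng = np.random.default_rng(7)
c_obs = rng.lognormal(0.0, 0.8, 300)
c_pred = c_obs * rng.lognormal(0.0, 0.4, 300)

print(f"FAC2 = {fac2(c_obs, c_pred):.3f}")               # 1.0 is perfect
print(f"FB   = {fractional_bias(c_obs, c_pred):+.3f}")    # 0.0 is unbiased
```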

  10. Dispersion Relations and Scattering of Glueballs and Mesons in Compact U(1)_{2+1} Gauge Theory

    NASA Astrophysics Data System (ADS)

    Ahmed, Chaara El Mouez

    We studied the dispersion relations and the scattering of glueballs and mesons in the compact U(1)_{2+1} model. This model has often been used as a simple model of quantum chromodynamics (QCD), because it exhibits confinement as well as glueball states, while its mathematical structure is much simpler than that of QCD. Our method consists of diagonalizing the Hamiltonian of this model in an appropriate basis of graphs on a momentum lattice, in order to generate the dispersion relations of the glueballs and mesons. For the scattering, we used the time-dependent method to compute the S-matrix and the scattering cross-section of the glueballs and mesons. The various results obtained appear to agree with the earlier work of Hakim, Alessandrini et al., and Irving et al., who instead use strong-coupling perturbation theory and work on a space-time lattice.

  11. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess whether the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES relative to MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events, an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
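
    The contrast drawn above can be made concrete with a toy cohort Markov model: the entire state of an MM is a cohort vector advanced by a transition matrix, with no individual patient history. The states and probabilities below are illustrative placeholders.

```python
import numpy as np

# Annual-cycle cohort Markov model with three states: well, sick, dead.
P = np.array([[0.90, 0.08, 0.02],    # transitions from well
              [0.00, 0.85, 0.15],    # transitions from sick
              [0.00, 0.00, 1.00]])   # dead is absorbing

cohort = np.array([1.0, 0.0, 0.0])   # everyone starts well
for year in range(1, 11):
    cohort = cohort @ P              # advance the whole cohort one cycle
    print(f"year {year:2d}: well={cohort[0]:.3f} "
          f"sick={cohort[1]:.3f} dead={cohort[2]:.3f}")

# A DES would instead draw event times per simulated patient, letting
# event rates depend on that patient's accumulated history -- the key
# flexibility (and cost) described in the review.
```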

  12. Impact of green anode preparation and baking conditions on cracking in dense anodes

    NASA Astrophysics Data System (ADS)

    Amrani, Salah

    ...manufactured industrially. This technique consisted in determining the profile of the various physical properties. The method based on measuring the distribution of electrical resistivity over the whole sample was the technique used to locate cracking and macro-pores. Optical microscopy and image analysis, for their part, served to characterize the cracked zones while determining the structure of the analyzed samples at the microscopic scale. Other tests were conducted on cylindrical anode samples 50 mm in diameter and 130 mm long. These were baked in a furnace at UQAC at different heating rates in order to determine the influence of the baking parameters on crack formation in this kind of core. The baked anode samples were characterized using scanning electron microscopy and ultrasound. The last part of the work carried out at UQAC contains a study on the characterization of anodes fabricated in the laboratory under different operating conditions. The evolution of the quality of these anodes was followed using several techniques. The cooling temperature evolution of the green laboratory anodes was measured, and a mathematical model was developed and validated against the experimental data, with the objective of estimating the cooling rate as well as the thermal stress. All the fabricated anodes were characterized before baking by determining certain physical properties (electrical resistivity, apparent density, optical density, and defect percentage). Tomography and the distribution of electrical resistivity, which are non-destructive techniques, were employed to evaluate the internal defects of the anodes. During the baking of the laboratory anodes, the evolution of the resistivity

  13. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool automates the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.
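
    As a minimal sketch of the core check such play-in validation performs (simulate the plant's response to a recorded disturbance, then score it against the PMU measurement), the Python below compares two toy traces with a normalized RMSE; the traces, metric, and 5% tolerance are illustrative assumptions, whereas the actual tool works through GE PSLF and EPCL scripts.

      import numpy as np

      def validation_error(p_measured, p_simulated):
          """Normalized RMSE between measured and simulated active power."""
          rmse = np.sqrt(np.mean((p_measured - p_simulated) ** 2))
          return rmse / (p_measured.max() - p_measured.min() + 1e-12)

      # Toy disturbance response: PMU-measured trace vs. model playback output.
      t = np.linspace(0.0, 10.0, 301)
      p_meas = 100 + 5.0 * np.exp(-t / 2.0) * np.sin(3 * t)
      p_sim = 100 + 4.5 * np.exp(-t / 2.2) * np.sin(3 * t)

      err = validation_error(p_meas, p_sim)
      print("model", "validated" if err < 0.05 else "suspect", f"(nRMSE = {err:.3f})")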

  14. DES Science Portal: Computing Photometric Redshifts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gschwend, Julia

    An important challenge facing photometric surveys for cosmological purposes, such as the Dark Energy Survey (DES), is the need to produce reliable photometric redshifts (photo-z). The choice of adequate algorithms and configurations and the maintenance of an up-to-date spectroscopic database to build training sets, for example, are challenging tasks when dealing with large amounts of data that are regularly updated and constantly growing. In this paper, we present the first of a series of tools developed by DES, provided as part of the DES Science Portal, an integrated web-based data portal developed to facilitate the scientific analysis of the data while ensuring the reproducibility of the analysis. We present the DES Science Portal photometric redshift tools, from the creation of a spectroscopic sample and the training of neural network photo-z codes to the final estimation of photo-zs for a large photometric catalog. We illustrate this operation by calculating well-calibrated photo-zs for a galaxy sample extracted from the DES first year (Y1A1) data. The series of processes mentioned above is run entirely within the Portal environment, which automatically produces validation metrics and maintains the provenance between the different steps. This system allows us to fine-tune the many steps involved in the process of calculating photo-zs, ensuring that we do not lose information on the configurations and inputs of the previous processes. By matching the DES Y1A1 photometry to a spectroscopic sample, we define different training sets that we use to feed the photo-z algorithms already installed at the Portal. Finally, we validate the results under several conditions, including the case of a sample limited to i<22.5 with color properties close to the full DES Y1A1 photometric data. In this way we compare the performance of multiple methods and training configurations. The infrastructure presented here is an efficient way to test several methods...
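
    As a rough illustration of the training-and-validation step the Portal automates, the Python sketch below fits a neural-network photo-z on synthetic colours and scores the holdout with a sigma_68 dispersion metric; the data, network size, and metric definition are assumptions made for the example, not the Portal's actual configuration.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      z_spec = rng.uniform(0.1, 1.4, 5000)  # stand-in "spectroscopic" redshifts
      colors = np.column_stack([z_spec + rng.normal(0, 0.1, 5000) for _ in range(4)])

      X_tr, X_te, z_tr, z_te = train_test_split(colors, z_spec, random_state=0)
      photoz = MLPRegressor(hidden_layer_sizes=(30, 30), max_iter=2000, random_state=0)
      photoz.fit(X_tr, z_tr)

      # sigma_68: half-width of the central 68% of (z_phot - z_spec) / (1 + z_spec).
      dz = (photoz.predict(X_te) - z_te) / (1 + z_te)
      sigma68 = 0.5 * (np.percentile(dz, 84.1) - np.percentile(dz, 15.9))
      print(f"sigma_68 = {sigma68:.3f}")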

  15. Determination of the Atmospheric Parameters of DB White Dwarf Stars

    NASA Astrophysics Data System (ADS)

    Beauchamp, Alain

    1995-01-01

    White dwarf stars whose visible spectra are dominated by strong neutral helium lines are subdivided into three classes: DB (neutral helium lines only), DBA (neutral helium and hydrogen lines), and DBZ (neutral helium and heavy-element lines). We analyze three samples of observed spectra of these types of white dwarfs. The samples consist, respectively, of 48 spectra in the visible range (3700-5100 Å), 24 in the ultraviolet (1200-3100 Å), and four in the red part of the visible (5100-6900 Å). Among the objects of the visible sample, we identify four new DBA and two new DBZ previously classified as DB. The analysis allows us to determine spectroscopically the atmospheric parameters, namely the effective temperature, the surface gravity, and, in the case of the DBA stars, the relative hydrogen abundance N(H)/N(He). For objects hotter than ~15,000 K the derived surface gravity is reliable, and we obtain stellar masses through a theoretical mass-radius relation. The requirements of the analysis of these objects demanded major improvements in the modeling of their atmospheres and of the radiative flux distributions they emit. For the first time to our knowledge, we included in the model atmospheres the effects of the He₂⁺ molecular ion, as well as the equation of state of Hummer and Mihalas (1988), which accounts for interparticle perturbations in the calculation of the populations of the various atomic levels. Convection is treated within the mixing-length theory. Three grids of LTE (local thermodynamic equilibrium) model atmospheres were produced for a set of effective temperatures, surface gravities, and hydrogen abundances covering the properties of the stars in our samples; they are characterized by different parameterizations called, respectively...

  16. Models of the strongly lensed quasar DES J0408−5354

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnello, A.; et al.

    We present gravitational lens models of the multiply imaged quasar DES J0408-5354, recently discovered in the Dark Energy Survey (DES) footprint, with the aim of interpreting its remarkable quad-like configuration. We first model the DES single-epoch grizY images as a superposition of a lens galaxy and four point-like objects, obtaining spectral energy distributions (SEDs) and relative positions for the objects. Three of the point sources (A, B, D) have SEDs compatible with the discovery quasar spectra, while the faintest point-like image (G2/C) shows significant reddening and a 'grey' dimming of ≈0.8 mag. In order to understand the lens configuration, we fit different models to the relative positions of A, B, D. Models with just a single deflector predict a fourth image at the location of G2/C, but considerably brighter and bluer. The addition of a small satellite galaxy (R_E ≈ 0.2") in the lens plane near the position of G2/C suppresses the flux of the fourth image and can explain both the reddening and the grey dimming. All models predict a main deflector with Einstein radius between 1.7" and 2.0", velocity dispersion 267-280 km/s, and enclosed mass ≈ 6×10^11 M_⊙, even though higher-resolution imaging data are needed to break residual degeneracies in the model parameters. The longest time delay (B-A) is estimated as ≈85 (resp. ≈125) days by models with (resp. without) a perturber near G2/C. The configuration and predicted time delays of J0408-5354 make it an excellent target for follow-up aimed at understanding the source quasar host galaxy and substructure in the lens, and at measuring cosmological parameters. We also discuss some lessons learnt from J0408-5354 on lensed quasar finding strategies, given its chromaticity and morphology.
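
    The quoted enclosed mass follows from the Einstein radius through M_E = (c^2 / 4G) * theta_E^2 * D_l D_s / D_ls. The Python sketch below evaluates this with astropy under assumed placeholder redshifts (z_l = 0.6, z_s = 2.4; these are not values taken from the paper) and a mid-range Einstein radius.

      import astropy.units as u
      from astropy.constants import c, G
      from astropy.cosmology import Planck15 as cosmo

      z_l, z_s = 0.6, 2.4                    # assumed lens and source redshifts
      theta_E = (1.85 * u.arcsec).to(u.rad)  # mid-range Einstein radius

      D_l = cosmo.angular_diameter_distance(z_l)
      D_s = cosmo.angular_diameter_distance(z_s)
      D_ls = cosmo.angular_diameter_distance_z1z2(z_l, z_s)

      M_E = (c**2 / (4 * G)) * theta_E.value**2 * D_l * D_s / D_ls
      print(M_E.to(u.Msun))  # of order 10^11 to 10^12 solar masses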

  17. Hardware validation of a generic avionics network architecture based on modular redundancy management

    NASA Astrophysics Data System (ADS)

    Tremblay, Jose-Philippe

    Avionics systems have evolved continuously since digital technologies appeared around the 1960s. After passing through several development paradigms, these systems have followed the Integrated Modular Avionics (IMA) approach since the early 2000s. Unlike earlier methods, this approach is based on modular design, the sharing of generic resources among several systems, and greater use of multiplexed buses. Most of the concepts used by the IMA architecture, although already well known in distributed computing, represent a marked change from earlier models in the avionics world. They come on top of the strong constraints of classical avionics, such as determinism, real time, certification, and high reliability targets. The adoption of the IMA approach has triggered a revision of several aspects of the design, certification, and implementation of an IMA system in order to take full advantage of it. This revision, slowed by avionics constraints, is still under way and still offers opportunities to develop new tools, methods, and models at every level of the implementation process of an IMA system. In the context of proposing and validating a new IMA architecture for a generic network of sensors aboard an aircraft, we identified some aspects of the traditional approaches to building this type of architecture that could be improved. To remedy some of the identified shortcomings, we proposed a validation approach based on a reconfigurable hardware platform, together with a new redundancy management approach for meeting reliability targets. Unlike the more limited static tools that satisfy the needs of federated architecture design, our approach to...

  18. Evaluation and cross-validation of Environmental Models

    NASA Astrophysics Data System (ADS)

    Lemaire, Joseph

    Before scientific models (statistical or empirical models based on experimental measurements; physical or mathematical models) can be proposed and selected as ISO Environmental Standards, a commission of professional experts appointed by an established international union or association (e.g. IAGA for Geomagnetism and Aeronomy) should have been able to study, document, evaluate and validate the best alternative models available at a given epoch. Examples will be given indicating that different values of the Earth radius have been employed in different data-processing laboratories, institutes or agencies to process, analyse or retrieve series of experimental observations. Furthermore, invariant magnetic coordinates like B and L, commonly used in the study of Earth's radiation belt fluxes and for their mapping, differ from one space mission data center to another, from team to team, and from country to country. Worse, users of empirical models generally fail to use the original magnetic model which had been employed to compile B and L, and thus to build these environmental models. These are just some flagrant examples of inconsistencies and misuses identified so far; there are probably more to be uncovered by careful, independent examination and benchmarking. Consider the meter prototype, the standard unit of length, determined on 20 May 1875 during the Diplomatic Conference of the Meter and deposited at the BIPM (Bureau International des Poids et Mesures). By the same token, to coordinate and safeguard progress in the field of Space Weather, similar initiatives need to be undertaken to prevent the wild, uncontrolled dissemination of pseudo environmental models and standards. Indeed, unless validation tests have been performed, there is no guarantee, a priori, that all models on the marketplace have been built consistently with the same system of units, or that they are based on identical definitions of the coordinate systems, etc. Therefore...

  19. Generic Methodology for Verification and Validation (GM-VV) to Support Acceptance of Models, Simulations and Data (Methodologie generale de verification et de validation (GM-VV) visant a soutenir l'acceptation des modeles, simulations et donnees)

    DTIC Science & Technology

    2015-01-01

  20. Toward Supersonic Retropropulsion CFD Validation

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Schauerhamer, D. Guy; Trumble, Kerry; Sozer, Emre; Barnhardt, Michael; Carlson, Jan-Renee; Edquist, Karl

    2011-01-01

    This paper begins the process of verifying and validating computational fluid dynamics (CFD) codes for supersonic retropropulsive flows. Four CFD codes (DPLR, FUN3D, OVERFLOW, and US3D) are used to perform various numerical and physical modeling studies toward the goal of comparing predictions with a wind tunnel experiment specifically designed to support CFD validation. Numerical studies run the gamut in rigor from code-to-code comparisons to observed order-of-accuracy tests. Results indicate that for this complex flowfield, involving time-dependent shocks and vortex shedding, design order of accuracy is not clearly evident. Also explored is the extent of physical modeling necessary to predict the salient flowfield features found in high-speed Schlieren images and surface pressure measurements taken during the validation experiment. Physical modeling studies include geometric items such as wind tunnel wall and sting mount interference, as well as turbulence modeling that ranges from a RANS (Reynolds-Averaged Navier-Stokes) 2-equation model to DES (Detached Eddy Simulation) models. These studies indicate that tunnel wall interference is minimal for the cases investigated; model mounting hardware effects are confined to the aft end of the model; and sparse grid resolution and turbulence modeling can damp or entirely dissipate the unsteadiness of this self-excited flow.
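
    For reference, an observed order-of-accuracy test of the kind mentioned above compares a solution quantity on three systematically refined grids via Richardson extrapolation; the Python sketch below uses toy drag values and an assumed refinement ratio r = 2.

      import math

      def observed_order(f_coarse, f_medium, f_fine, r=2.0):
          """p = ln[(f_coarse - f_medium) / (f_medium - f_fine)] / ln(r)."""
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      # Toy integrated quantity from coarse, medium, and fine grids.
      print(observed_order(1.120, 1.030, 1.008))  # ~2 for a formally 2nd-order scheme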

  1. A Validation Study of the Adolescent Dissociative Experiences Scale

    ERIC Educational Resources Information Center

    Keck Seeley, Susan M.; Perosa, Sandra L.; Perosa, Linda M.

    2004-01-01

    Objective: The purpose of this study was to further the validation process of the Adolescent Dissociative Experiences Scale (A-DES). In this study, a 6-point Likert response format with descriptors was used when responding to the A-DES, rather than the 11-point response format used in the original A-DES. Method: The internal reliability and construct…

  2. Detection of the end of anode compaction by sound

    NASA Astrophysics Data System (ADS)

    Sanogo, Bazoumana

    The objective of this project was to develop a real-time control tool for compaction time based on the sound generated by the vibrocompactor during the forming of green anodes. An application was therefore developed to analyze the recorded sounds. Trials were carried out with different microphones to improve measurement quality, and one was selected for the rest of the project. Likewise, various tests were performed on laboratory anodes as well as industrial-scale anodes in order to establish a method for detecting the optimal time required to form the anodes. The work at the carbon laboratory of the Universite du Quebec a Chicoutimi (UQAC) consisted of recording the sound of anodes produced on site under different configurations, and of characterizing certain plant anodes. The anodes produced in the laboratory fall into two groups. The first comprises the anodes used to validate our method; these are anodes produced with different compaction times. The UQAC carbon laboratory is unique in that it can produce anodes with the same properties as industrial anodes; consequently, the validation initially planned at the plant was carried out with the laboratory anodes. The second group was used to study the effects of raw materials on compaction time; the type of coke and the type of pitch constituted the variations within this second group. The tests and measurements at the plant were conducted in three measurement campaigns. The first campaign, in June 2014, served to standardize the procedure, find the best positioning of the instruments, configure the software, and take the first measurements. A second campaign, in May 2015, recorded sound while classifying the anodes according to different compaction times. The third and final campaign, in December 2015,...

  3. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase the chance of success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, in which cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.

  4. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  5. Unsteady Three-Dimensional Simulation of a Shear Coaxial GO2/GH2 Rocket Injector with RANS and Hybrid-RANS-LES/DES Using Flamelet Models

    NASA Technical Reports Server (NTRS)

    Westra, Doug G.; West, Jeffrey S.; Richardson, Brian R.

    2015-01-01

    Historically, the analysis and design of liquid rocket engines (LREs) has relied on full-scale testing and one-dimensional empirical tools. The testing is extremely expensive, and the one-dimensional tools are not designed to capture the highly complex, multi-dimensional features that are inherent to LREs. Recent advances in computational fluid dynamics (CFD) tools have made it possible to predict liquid rocket engine performance and stability, to assess the effect of complex flow features, and to evaluate injector-driven thermal environments, mitigating the cost of testing. Extensive efforts to verify and validate these CFD tools have been conducted to provide confidence for using them during the design cycle. Previous validation efforts have documented comparisons of predicted heat flux thermal environments with test data for a single-element gaseous oxygen (GO2) and gaseous hydrogen (GH2) injector. The most notable was a comprehensive validation effort conducted by Tucker et al. [1], in which a number of different groups modeled a GO2/GH2 single-element configuration by Pal et al. [2]. The tools used for this validation comparison employed a range of algorithms, from steady and unsteady Reynolds-Averaged Navier-Stokes (U/RANS) calculations to large-eddy simulations (LES), detached eddy simulations (DES), and various combinations. A more recent effort by Thakur et al. [3] focused on using a state-of-the-art CFD simulation tool, Loci/STREAM, on a two-dimensional grid. Loci/STREAM was chosen because it has a unique, very efficient flamelet parameterization of combustion reactions that are too computationally expensive to simulate with conventional finite-rate chemistry calculations. The current effort focuses on further advancing these validation efforts, again using the Loci/STREAM tool with the flamelet parameterization, but this time with a three-dimensional grid. Comparisons to the Pal et al. heat flux data will be made for both RANS and...

  6. Development of an analytical model for the linear-elastic analysis of strain and stress fields within a polycrystal: comparison with the finite element method

    NASA Astrophysics Data System (ADS)

    Bretin, Remy

    Fatigue damage of materials is a common problem in many fields, including aeronautics. To prevent fatigue failure of materials, their fatigue life must be determined. Unfortunately, because of the many heterogeneities present, fatigue life can vary greatly between two identical parts made of the same material that have undergone the same treatments. It is therefore necessary to account for these heterogeneities in our models in order to obtain a better estimate of material fatigue life. As a first step toward better consideration of heterogeneities in our models, a linear-elasticity study of the influence of crystallographic orientations on the strain and stress fields in a polycrystal was carried out using the finite element method. Correlations could be established from the results obtained, and an analytical linear-elasticity model taking into account the distributions of crystallographic orientations and neighborhood effects was developed. This model rests on the foundations of classical homogenization models, such as the self-consistent scheme, and also borrows the neighborhood principles of cellular automata. Taking the finite element results as a reference, the analytical model developed here proved to be twice as accurate as the self-consistent model, whatever the material studied.

  7. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series

  8. Study of ultrafast dynamic phenomena and of the pulsed terahertz emission characteristics of the YBCO superconductor

    NASA Astrophysics Data System (ADS)

    Savard, Stephane

    The first studies of antennas based on high-critical-temperature superconductors emitting an electromagnetic pulse whose frequency content lies in the terahertz range date back to 1996. A superconducting antenna is formed of a micro-bridge of a superconducting thin film through which a DC current is applied. A visible laser beam is focused on the micro-bridge and drives the superconductor into a non-equilibrium state in which pairs are broken. Through the relaxation of the excess quasiparticles and, eventually, the re-formation of superconducting pairs, we can study the nature of superconductivity. Analysis of the temporal kinetics of the electromagnetic field emitted by such a superconducting terahertz antenna has proven useful for qualitatively describing its characteristics as a function of operating parameters such as the applied current, the temperature, and the excitation power. Understanding the non-equilibrium state is the key to understanding the operation of high-critical-temperature superconducting terahertz antennas. With the ultimate aim of understanding this non-equilibrium state, we needed a method and a model to extract more systematically, from the antenna's emission characteristics, the intrinsic properties of the material composing it. We developed a procedure to calibrate the time-domain spectrometer using proton-bombarded (H+) GaAs terahertz antennas as emitter and detector. Once the setup was calibrated, we inserted a YBa2Cu3O7-δ dipole emitting antenna. A model with exponential rise and decay functions of the signal is used to fit the spectrum of the electromagnetic field of the YBa2Cu3O7-δ antenna, which allows us to extract the intrinsic properties of the latter. To confirm the validity of the model...

  9. Deprescribing benzodiazepine receptor agonists

    PubMed Central

    Pottie, Kevin; Thompson, Wade; Davies, Simon; Grenier, Jean; Sadowski, Cheryl A.; Welch, Vivian; Holbrook, Anne; Boyd, Cynthia; Swenson, Robert; Ma, Andy; Farrell, Barbara

    2018-01-01

    Objective: To formulate evidence-based guidelines to help clinicians decide when and how to safely taper the dose of benzodiazepine receptor agonists (BZRAs) to discontinue treatment; to focus on the highest level of available evidence and to obtain input from front-line professionals during the drafting, review, and adoption of the guidelines. Methods: The team comprised 8 clinicians (1 family physician, 2 psychiatrists, 1 clinical psychologist, 1 clinical pharmacologist, 2 clinical pharmacists, and 1 geriatrician) and a methodology specialist; members disclosed any conflicts of interest. We used a systematic process, including the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach, to formulate the guidelines. Evidence was generated from a systematic review of studies on deprescribing BZRAs for insomnia, as well as from a review of reviews on the harms of continued BZRA treatment and narrative syntheses on patient preferences and resource implications. This evidence and the GRADE ratings of its quality were used to formulate the recommendations. The team refined the guideline content and recommendations by consensus and synthesized clinical considerations to answer questions from front-line clinicians. A draft of the guidelines was reviewed by clinicians and stakeholders. Recommendations: We recommend offering deprescribing (slow dose tapering) of BZRAs to all older patients (≥ 65 years) taking a BZRA, regardless of duration of use, and suggest offering deprescribing (slow dose tapering)...

  10. Validation of the French version of the Bournemouth Questionnaire

    PubMed Central

    Martel, Johanne; Dugas, Claude; Lafond, D.; Descarreaux, M.

    2009-01-01

    Self-report questionnaires are an integral part of the assessment of patients with neck pain. The Bournemouth Questionnaire incorporates the biopsychosocial reality into the assessment of neck pain, and its English version (QBc-a) has been validated, showing moderate to excellent psychometric properties. The objective of this study was to translate and validate a French version of this questionnaire. Translation and adaptation were completed using the translation/back-translation method, which yielded a consensus between the two versions. The validation study involved 68 subjects (mean age 41 years) participating in a randomized clinical trial on the efficacy of manual therapies for neck pain. The experimental protocol provided data to assess construct validity, longitudinal construct validity, test-retest reliability, and responsiveness. The construct validity data (r = 0.67, 0.61, and 0.42 for pre-treatment, post-treatment, and longitudinal construct validity, respectively), test-retest reliability (r = 0.97), and responsiveness (effect size = 0.56; standardized response mean = 0.61) are adequate to support the use of this self-report questionnaire in the management of patients with neck pain.
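
    For readers unfamiliar with the two responsiveness statistics reported, the Python sketch below computes an effect size (mean change divided by the baseline standard deviation) and a standardized response mean (mean change divided by the standard deviation of change) on invented scores, not the study's data.

      import numpy as np

      baseline = np.array([52, 47, 60, 55, 43, 58, 50, 49], dtype=float)
      followup = np.array([40, 41, 48, 47, 39, 45, 42, 44], dtype=float)

      change = baseline - followup
      effect_size = change.mean() / baseline.std(ddof=1)
      srm = change.mean() / change.std(ddof=1)
      print(f"effect size = {effect_size:.2f}, standardized response mean = {srm:.2f}")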

  11. Dynamics of nanobubbles and nanoplasmas generated around plasmonic nanoparticles irradiated by ultrashort pulses

    NASA Astrophysics Data System (ADS)

    Dagallier, Adrien

    The emergence of ultrashort-pulse lasers and of nanotechnologies has revolutionized our perception of, and our way of interacting with, the infinitely small. The enormous intensities generated by pulses shorter than the relaxation or diffusion times of the irradiated medium induce numerous nonlinear phenomena, from frequency doubling to ablation, in volumes whose characteristic dimension is of the order of the laser wavelength. In biology and medicine, these phenomena are used for multiphoton imaging or to destroy living tissue. The introduction of plasmonic nanoparticles, which concentrate the incident electromagnetic field into nanometer-scale regions down to a fraction of the wavelength, amplifies the nonlinear phenomena while offering much finer control of the energy deposition, opening the way to the detection of single molecules in solution and to nanosurgery. Nanosurgery relies mainly on the formation of a vapor bubble near a cell membrane. This vapor bubble either pierces the membrane irreversibly, leading to cell death, or perturbs it temporarily, which makes it possible to deliver drugs or DNA strands into the cell for gene therapy. It is mainly the size of the bubble that decides the outcome of the laser irradiation. The laser parameters and the nanoparticle geometry must therefore be finely controlled to reach the set objective. At present, the most direct way to validate a set of experimental conditions is to perform the experiment in the laboratory, which is long and costly. Existing bubble dynamics models do not take the irradiation parameters into account and often adjust their initial conditions from their experimental measurements, which limits the scope of the model to the case for...

  12. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  13. Empirical agreement in model validation.

    PubMed

    Jebeile, Julie; Barberousse, Anouk

    2016-04-01

    Empirical agreement is often used as an important criterion when assessing the validity of scientific models. However, it is by no means a sufficient criterion, as a model can be so adjusted as to fit available data even though it is based on hypotheses whose plausibility is known to be questionable. Our aim in this paper is to investigate the uses of empirical agreement within the process of model validation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Accounting for "London currents" in the modeling of superconductors

    NASA Astrophysics Data System (ADS)

    Bossavit, Alain

    1997-10-01

    A model is given, in variational form, in which bulk "Bean currents", ruled by Bean's law, and surface "London currents" coexist. This macroscopic model generalizes Bean's by adding to the critical current density j_c a second parameter with the dimension of a length, similar to the London depth λ. The one-dimensional version of the model is investigated in order to link this parameter with the standard observable H-M characteristics.

  15. Validation of the Italian version of the dissociative experience scale for adolescents and young adults.

    PubMed

    De Pasquale, Concetta; Sciacca, Federica; Hichy, Zira

    2016-01-01

    The Dissociative Experience Scale for adolescents (A-DES), a 30-item, multidimensional, self-administered questionnaire, was originally validated using a large sample of American young people. We report the linguistic validation process and the metric validity of the Italian version of the A-DES in Italy. A set of questionnaires was administered to a total of 633 participants from March 2015 to April 2016. The participants consisted of 282 boys and 351 girls, aged between 18 and 24 years. The translation process consisted of two consecutive steps: forward-backward translation and acceptability testing. The psychometric testing was applied to Italian students recruited from Italian public schools and universities in Sicily. Informed consent was obtained from all participants in the research. All individuals completed the A-DES. Reliability and validity were tested. The translated version was validated on a total of 633 Italian students. The reliability of the total A-DES is .926. It is composed of 4 subscales: dissociative amnesia, absorption and imaginative involvement, depersonalization and derealization, and passive influence. The reliability of each subscale is: .756 for dissociative amnesia, .659 for absorption and imaginative involvement, .850 for depersonalization and derealization, and .743 for passive influence. The Italian version of the A-DES constitutes a useful instrument to measure dissociative experience in adolescents and young adults in Italy.
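
    The reliability figures quoted above are Cronbach's alpha values; a minimal sketch of that computation, on random toy item scores rather than the A-DES data, is:

      import numpy as np

      def cronbach_alpha(scores):
          """scores: array of shape (n_respondents, k_items)."""
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(0)
      trait = rng.normal(size=(300, 1))                      # shared latent trait
      items = trait + rng.normal(scale=0.8, size=(300, 10))  # 10 correlated items
      print(f"alpha = {cronbach_alpha(items):.3f}")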

  16. Hybrid Architectural Framework for C4ISR and Discrete-Event Simulation (DES) to Support Sensor-Driven Model Synthesis in Real-World Scenarios

    DTIC Science & Technology

    2013-09-01

    ...which utilizes FTA and then loads it into a DES engine to generate simulation results. While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer... C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM...

  17. Singular Vectors of Minimal Conformal Field Theories

    NASA Astrophysics Data System (ADS)

    Benoit, Louis

    In 1984, Belavin, Polyakov and Zamolodchikov revolutionized field theory by exhibiting a new class of theories: two-dimensional quantum field theories invariant under conformal transformations. The algebra of conformal transformations of space-time has a remarkable feature: in two dimensions it possesses an infinite number of generators. This property imposes such strong conditions on the correlation functions that they can be evaluated without any approximation. The fields of conformal theories belong to highest-weight representations of the Virasoro algebra, a central extension of the conformal algebra of the plane. These representations are labeled by h, the conformal weight of their highest-weight vector, and by the central charge c, the factor of the central extension, common to all representations of a given theory. Minimal conformal theories contain a finite number of representations. Among them are unitary theories whose representations form the discrete series of the Virasoro algebra; their weight h has the form h_{p,q}(m) = [(p(m+1) - qm)^2 - 1] / (4m(m+1)), where p, q and m are positive integers with p+q <= m+1. The integer m parameterizes the central charge: c(m) = 1 - 6/(m(m+1)), with m >= 2. These representations possess an invariant subspace generated by two subrepresentations with h_1 = h_{p,q} + pq and h_2 = h_{p,q} + (m-p)(m+1-q), whose highest-weight vectors are called singular vectors and are denoted |Psi_{p,q}> and |Psi_{m-p,m+1-q}>, respectively. Superconformal theories are a supersymmetric version of conformal theories. Their fields belong to highest-weight representations of the Neveu-Schwarz algebra, one of the two supersymmetric extensions of the Virasoro algebra. Minimal superconformal theories have the same structure as minimal conformal theories. The representations...

  18. Study of the change process experienced by families that voluntarily decided to adopt climate change mitigation behaviours

    NASA Astrophysics Data System (ADS)

    Leger, Michel T.

    ...review of the literature on behaviour change in the environmental field. We also explore the family as a functional system so as to better understand this context of environmental action which, to our knowledge, has been little studied. In the second article, we present our research results concerning the observed influencing factors and the competencies manifested during the process of adopting new environmental behaviours in three families. Finally, the third article presents the results of the case of a fourth family whose members have long lived ecological lifestyles. Within a grounded-theory analysis approach, the study of this model case allowed us to deepen the conceptual categories identified in the second article, so as to produce a model of the integration of environmental behaviours in the family context. The conclusions drawn from the literature review allowed us to identify the elements that could influence the adoption of environmental behaviours in families. The review also provided a better understanding of the various factors that can affect the adoption of environmental behaviours and, finally, a better grasp of the phenomenon of behaviour change in the context of the family considered as a system. Applying an inductive analysis process to our qualitative data, the results of our multi-case study indicated that two conceptual constructs seem to influence the adoption of environmental behaviours in families: 1) biospheric values shared within the family, and 2) the competencies collectively brought to bear while trying out new environmental behaviours. Our model of the change process in families also indicates that a collaborative family dynamic and the presence of a group of...

  19. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    USGS Publications Warehouse

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  20. Interface effects on the magnetic and transport properties of nickel/iron and cobalt/silver multilayers

    NASA Astrophysics Data System (ADS)

    Veres, Teodor

    This thesis is devoted to the study of the structural evolution of the magnetic and transport properties of Ni/Fe multilayers and of Co- and Ag-based nanostructures. In a first, essentially bibliographic part, we introduce some basic concepts related to the magnetic and transport properties of metallic multilayers, followed by a brief description of the methods used to analyze the results. The second part is devoted to the study of the magnetic and transport properties of ferromagnetic/ferromagnetic Ni/Fe multilayers. We show that a coherent interpretation of these properties requires taking interface effects into consideration. We set out to identify, evaluate, and study the effects of these interfaces and their evolution following thermal treatments such as deposition at elevated temperature and ion irradiation. Correlated analyses of the structure and of the magnetoresistance allow us to draw conclusions on the influence of buffer layers, between the interface and the substrate as well as between the layers themselves, on the magnetic behaviour of F/F layers. The third part is devoted to giant magnetoresistance (GMR) systems based on Co and Ag. We study the evolution of the microstructure following irradiation with 1 MeV Si+ ions, as well as the effects of these changes on the magnetic behaviour. This part begins with the analysis of the properties of a hybrid multilayer, intermediate between multilayers and granular materials. Using diffraction, superparamagnetic relaxation, and magnetoresistance measurements, we analyze the structural evolutions produced by ion irradiation. We establish models that help us interpret the results for a series of multilayers covering a wide range of different magnetic behaviours...

  1. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
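
    A minimal sketch of the kind of plots such teaching relies on: residuals versus fitted values and a normal Q-Q plot for a simulated linear normal model (data and figure layout are illustrative).

      import numpy as np
      import matplotlib.pyplot as plt
      import scipy.stats as stats

      rng = np.random.default_rng(0)
      x = rng.uniform(0, 10, 100)
      y = 2 + 0.5 * x + rng.normal(0, 1, 100)

      slope, intercept = np.polyfit(x, y, 1)  # ordinary least squares fit
      fitted = intercept + slope * x
      resid = y - fitted

      fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
      ax1.scatter(fitted, resid)
      ax1.axhline(0, linestyle="--")
      ax1.set(xlabel="fitted", ylabel="residual", title="residuals vs fitted")
      stats.probplot(resid, plot=ax2)         # normal Q-Q plot
      plt.tight_layout()
      plt.show()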

  2. Tunisian women's perceptions according to the health belief model and their practices regarding osteoporosis

    PubMed Central

    Belgacem, Amina; Nouira, Amel; Soussi, Sonia

    2016-01-01

    Introduction: The study aims to describe women's health beliefs and practices related to osteoporosis, in order to develop effective, targeted interventions for the prevention of this disease in the Tunisian context. Methods: A cross-sectional descriptive study was conducted among 100 Tunisian women, aged 45 and over, attending the primary health care center of a peri-urban area of the Sousse region (Tunisia). Data were collected using the "Osteoporosis Health Belief Scale" developed by Kim and colleagues, translated into Arabic and validated in Tunisia, and the "daily calcium intake calculation" questionnaire developed by Fardellone Patrice. The results were interpreted on the basis of the Health Belief Model. Results: Participants' perceptions could be considered above average for susceptibility to osteoporosis (58%), seriousness of the disease, benefits of physical activity, benefits of calcium intake, and health motivation; by contrast, perceptions of barriers to prevention could be considered moderate. Nevertheless, practices exposing women to the risk of the disease are relatively frequent, essentially in connection with socio-economic and cultural factors. Conclusion: Promotion programmes must aim to create a physical and social environment favourable to the adoption of lower-risk behaviours, and must provide targeted education of the population. PMID:27217868

  3. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
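
    A hedged sketch of the recommended procedure, double (nested) cross-validation plus a permutation test, using scikit-learn on synthetic data rather than the paper's xerostomia cohort:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                           permutation_test_score)

      # Synthetic NTCP-like data: 120 patients, 50 candidate predictors.
      X, y = make_classification(n_samples=120, n_features=50, n_informative=5,
                                 random_state=0)

      # Inner loop: LASSO (L1-penalized) logistic regression with a tuned penalty.
      lasso_logit = LogisticRegression(penalty="l1", solver="liblinear")
      inner = GridSearchCV(lasso_logit, {"C": np.logspace(-2, 2, 9)},
                           scoring="roc_auc", cv=5)

      # Outer loop: the unbiased performance estimate (double cross-validation).
      outer_auc = cross_val_score(inner, X, y, scoring="roc_auc", cv=5)
      print("nested-CV AUC: %.2f +/- %.2f" % (outer_auc.mean(), outer_auc.std()))

      # Permutation test: is the observed performance better than chance?
      score, perm, pval = permutation_test_score(inner, X, y, scoring="roc_auc",
                                                 cv=5, n_permutations=100,
                                                 random_state=0)
      print("permutation p-value:", pval)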

  4. Characterization of the acoustic properties of open-cell porous materials with rigid or limp frames

    NASA Astrophysics Data System (ADS)

    Salissou, Yacoubou

    The overall objective of this thesis is to improve the characterization of the macroscopic properties of porous materials with rigid or limp frames through inverse and indirect approaches based on acoustic measurements made in an impedance tube. The accuracy of the inverse and indirect approaches used today is mainly limited by the quality of the acoustic measurements obtained in the impedance tube. Consequently, this thesis addresses four problems that will help reach the aforementioned overall objective. The first problem concerns the precise characterization of the open porosity of porous materials. This property is a gateway linking the measurement of the dynamic acoustic properties of a porous material to the effective properties of its fluid phase as described by semi-phenomenological models. The second problem deals with the assumption that porous materials are symmetric through their thickness; an index and a criterion are proposed to quantify the asymmetry of a material. This assumption is often a source of inaccuracy for inverse and indirect impedance-tube characterization methods, and the proposed asymmetry criterion makes it possible to verify the applicability and accuracy of these methods for a given material. The third problem aims at a better understanding of the sound transmission problem in an impedance tube, presenting for the first time an exact wave-decomposition treatment of the problem. This treatment clearly establishes the limits of the many existing methods based on 2-, 3- or 4-microphone transmission tubes. A better understanding of this transmission problem is important, since it is through this type of measurement that methods can successively extract the transfer matrix of a porous material and its intrinsic dynamic properties, such as its characteristic impedance and complex wavenumber. Finally, the...

  5. Turbine Engine Mathematical Model Validation

    DTIC Science & Technology

    1976-12-01

    AEDC-TR-76-90, Engine Test Facility, Arnold Engineering Development Center, Air Force... Keywords: YJ101-GE-100 engine; turbine engines; mathematical models; computations. This report presents and discusses the results of an investigation to develop a rationale and technique for the validation of turbine engine steady-state...

  6. Minimization of the self-inductances of metallized-film capacitors

    NASA Astrophysics Data System (ADS)

    Joubert, Ch.; Rojat, G.; Béroual, A.

    1995-07-01

    In this article, we examine the different factors responsible for the equivalent series inductance in metallized capacitors, and we propose capacitor structures that reduce this inductance. After recalling the structure of metallized capacitors, we compare, by experimental measurements, the inductance due to the winding and that added by the connections; the latter can become preponderant. In order to explain the experimental evolution of the winding impedance with frequency, we describe an analytical model which gives the current density in the winding and its impedance. This model enables us to determine the self-resonant frequency for different types of capacitors, from which we can infer the influence of the height of the capacitors and of their internal and external radii on performance. It appears that, to reduce the equivalent series inductance, it is better to use flat windings and annular windings.
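
    One reason the equivalent series inductance matters is that it sets the self-resonant frequency f0 = 1 / (2*pi*sqrt(L*C)), above which the capacitor behaves inductively; the Python sketch below uses illustrative component values, not measurements from the article.

      import math

      def self_resonant_frequency(L_henry, C_farad):
          return 1 / (2 * math.pi * math.sqrt(L_henry * C_farad))

      C = 10e-6  # a 10 uF metallized-film capacitor
      for L_nH in (50, 20, 5):  # winding plus connection inductance scenarios
          f0 = self_resonant_frequency(L_nH * 1e-9, C)
          print(f"ESL = {L_nH:3d} nH -> f0 = {f0 / 1e3:7.1f} kHz")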

  7. The role of small (<20 μm) phytoplankton in variations of the optical properties of St. Lawrence waters

    NASA Astrophysics Data System (ADS)

    Mas, Sebastien

    ...bio-optical properties, particularly absorption, were attributable to the contribution of phytoplankton <20 μm. This confirms the importance of the size structure of phytoplankton communities in bio-optical models applied to the St. Lawrence. Taken together, the results highlight the importance of photoacclimation mechanisms and of the synchronization of the phytoplankton cell cycle for the daily variations of the IOPs (inherent optical properties), and of the physiological state related to growth stage for the long-term temporal variations of the IOPs. Moreover, phytoplankton <20 μm contribute substantially to the IOPs and to their variability in the St. Lawrence Estuary and Gulf, particularly for absorption. This doctoral study therefore underlines the importance of phytoplankton <20 μm for the variability of the IOPs of the oceans.

  8. Global precipitation measurements for validating climate models

    NASA Astrophysics Data System (ADS)

    Tapiador, F. J.; Navarro, A.; Levizzani, V.; García-Ortega, E.; Huffman, G. J.; Kidd, C.; Kucera, P. A.; Kummerow, C. D.; Masunaga, H.; Petersen, W. A.; Roca, R.; Sánchez, J.-L.; Tao, W.-K.; Turk, F. J.

    2017-11-01

    The advent of global precipitation data sets with increasing temporal span has made it possible to use them for validating climate models. To fulfill the requirement of global coverage, existing products integrate satellite-derived retrievals from many sensors with direct ground observations (gauges, disdrometers, radars), which are used as the reference for the satellites. While the resulting product can be deemed the best available source of quality validation data, awareness of the limitations of such data sets is important to avoid drawing wrong or unsubstantiated conclusions when assessing climate model abilities. This paper provides guidance on the use of precipitation data sets for climate research, including model validation and verification for improving physical parameterizations. The strengths and limitations of the data sets for climate modeling applications are presented, and a protocol for quality assurance of both observational databases and models is discussed. The paper helps elaborate on the recent IPCC AR5 acknowledgment of large observational uncertainties in precipitation observations for climate model validation.
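    As a rough illustration of the kind of quality checks such a protocol involves, the sketch below computes basic skill metrics between a model precipitation field and an observational product; the arrays are synthetic placeholders, not any of the datasets discussed.

```python
import numpy as np

def validation_metrics(model: np.ndarray, obs: np.ndarray) -> dict:
    """Basic gridpoint skill metrics for a model field against observations."""
    diff = model - obs
    return {
        "mean_bias": float(diff.mean()),
        "rmse": float(np.sqrt((diff ** 2).mean())),
        "correlation": float(np.corrcoef(model.ravel(), obs.ravel())[0, 1]),
    }

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=1.5, size=(90, 180))       # synthetic "observed" precip
model = obs + rng.normal(0.0, 0.8, size=obs.shape)          # synthetic model field
print(validation_metrics(model, obs))
```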

  9. Beware of external validation! - A Comparative Study of Several Validation Techniques used in QSAR Modelling.

    PubMed

    Majumdar, Subhabrata; Basak, Subhash C

    2018-04-26

    Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but a small n (i.e., n < p). Motivated by evidence in the recent literature of the inadequacy of external validation for estimating the true predictive capability of a statistical model, this paper performs an extensive comparative study of this method against several other validation techniques. We compared four validation methods: leave-one-out (LOO), K-fold, external and multi-split validation, using statistical models built with LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics vary widely among different random splits of the data and hence are not recommended for predictive QSAR models. LOO had the overall best performance among all validation methods applied in our scenario, while results from external validation were too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
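    A minimal sketch of the recommended combination (LOO validation of a LASSO model on n < p data), assuming scikit-learn and synthetic data rather than the paper's datasets:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Synthetic high-dimensional, small-sample data (n < p), as in typical QSAR sets
rng = np.random.default_rng(42)
n, p = 60, 500
X = rng.normal(size=(n, p))
y = X[:, :5] @ np.array([1.5, -2.0, 1.0, 0.5, -1.0]) + rng.normal(0.0, 0.5, n)

model = Lasso(alpha=0.1)  # LASSO: simultaneous variable selection and fitting
scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                         scoring="neg_mean_squared_error")
print(f"LOO mean squared error: {-scores.mean():.3f}")
```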

  10. Systematic evaluation of instruments for measuring pain in older persons with limited ability to communicate*

    PubMed Central

    Aubin, Michèle; Giguère, Anik; Hadjistavropoulos, Thomas; Verreault, René

    2007-01-01

    Chronic pain is often under-detected and insufficiently treated in long-term care settings. Self-report pain tools, such as the visual analogue scale, have been only partially validated in elderly populations because of the high prevalence of visual, auditory, motor and cognitive deficits found there. Observational tools have been developed to overcome these difficulties in using self-report pain scales. The present project aims to identify these scales and to evaluate them on the basis of content validity (12 questions), construct validity (12 questions), reliability (13 questions) and clinical utility (10 questions). Among the 24 instruments identified, several appear promising for assessing pain in older persons with severe dementia. Additional validation efforts are nevertheless required before they are integrated into regular long-term care practice. PMID:17717611

  11. Modeling rheumatoid arthritis using different techniques - a review of model construction and results.

    PubMed

    Scholz, Stefan; Mittendorf, Thomas

    2014-12-01

    to report more outcome parameters. Given a sufficient data supply, DES is the modeling technique of choice when modeling cost-effectiveness in RA. Otherwise, transparency about the data inputs is crucial for valid results and for informing decision makers about possible biases. With regard to ICERs, Markov models might provide estimates similar to those from more advanced modeling techniques.
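    For reference, the ICER mentioned here has a one-line definition; the sketch below uses hypothetical numbers, not values from the review.

```python
def icer(cost_new: float, cost_old: float,
         effect_new: float, effect_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (e.g. per QALY gained) of the new strategy over the comparator."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical inputs (not from the review): costs in $, effects in QALYs
print(f"ICER = {icer(52000, 41000, 6.1, 5.6):,.0f} $/QALY")
```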

  12. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, operate at a single scale, and depend on human experience. A multiple-scale validation method based on the SDG (Signed Directed Graph) and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
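    A minimal sketch of the positive-inference step, assuming a toy signed directed graph (node and edge choices are illustrative, not from the paper):

```python
# Starting from a variable with a known qualitative trend (+1 or -1),
# propagate signs along SDG edges to predict trends of downstream variables.
from collections import deque

edges = {  # edge sign: +1 (reinforcing) or -1 (inverting)
    ("feed_rate", "level"): +1,
    ("level", "outflow"): +1,
    ("coolant", "temperature"): -1,
    ("temperature", "pressure"): +1,
}

def propagate(root: str, trend: int) -> dict:
    """Breadth-first propagation of a qualitative trend through the SDG."""
    trends, queue = {root: trend}, deque([root])
    while queue:
        node = queue.popleft()
        for (src, dst), sign in edges.items():
            if src == node and dst not in trends:
                trends[dst] = trends[node] * sign
                queue.append(dst)
    return trends

print(propagate("coolant", +1))  # {'coolant': 1, 'temperature': -1, 'pressure': -1}
```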

  13. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation and flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit relating to insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics based on insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
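    The conventional validation metrics referred to are typically contingency-table scores on binary flood maps; a minimal sketch with synthetic masks (not the BASEMENT outputs or the Swiss datasets):

```python
import numpy as np

def contingency_metrics(pred: np.ndarray, obs: np.ndarray) -> dict:
    """Binary map comparison: hit rate, false alarm ratio and critical
    success index between predicted and observed flooded cells."""
    hits = np.sum(pred & obs)
    false_alarms = np.sum(pred & ~obs)
    misses = np.sum(~pred & obs)
    return {
        "hit_rate": hits / (hits + misses),
        "false_alarm_ratio": false_alarms / (hits + false_alarms),
        "csi": hits / (hits + misses + false_alarms),
    }

rng = np.random.default_rng(1)
observed = rng.random((200, 200)) < 0.15                 # synthetic observed flood mask
predicted = observed ^ (rng.random((200, 200)) < 0.05)   # model map with some errors
print(contingency_metrics(predicted, observed))
```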

  14. Validation of the French version of the multidimensional scale of neglectful parenting behaviours

    PubMed Central

    Bérubé, Annie; Chamberland, Claire

    2017-01-01

    Objective: Measuring parental neglect poses numerous challenges, and few tools exist that can document it among parents. The study aims to document the psychometric properties of the French translation of the short form of the multidimensional scale of neglectful parenting behaviours in the general population. Method: This study uses data from a telephone survey of a representative sample of 3584 mothers and 1202 fathers of children aged 6 months-4 years, 5-9 years and 10-15 years. Associations are established between neglect and several other factors known to be related to the problem, including violent parenting behaviours, stress related to the child's temperament being perceived as difficult and to work-family balance, alcohol and drug use, symptoms of depression, poverty, and social support. Results: Exploratory factor analyses show the presence of various dimensions of neglect concerning children's emotional/cognitive needs, physical needs (basic care) and supervision. Although the dimensions are closely associated with the psychosocial vulnerability factors of children and families, the internal consistency coefficients are low, varying between 0.20 and 0.64. Conclusion: Recommendations are made to improve the measure in the general population, notably for the parental supervision dimension, which poses particular challenges. PMID:28359164
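    The internal consistency coefficients reported above are typically Cronbach's alpha; a minimal sketch of that computation on synthetic item responses (an assumption for illustration, since the paper's raw data are not published):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency coefficient for an (n_respondents x k_items) matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 1))                        # shared construct
responses = latent + rng.normal(0.0, 1.5, size=(300, 6))  # 6 noisy items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```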

  15. Validating EHR clinical models using ontology patterns.

    PubMed

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.
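    A minimal sketch of this validation pattern, assuming the rdflib and pySHACL libraries; the clinical-model instance and shape below are illustrative stand-ins, not CIMI artefacts:

```python
from rdflib import Graph
from pyshacl import validate

# Hypothetical clinical-model instance and shape (illustrative, not CIMI):
data_ttl = """
@prefix ex: <http://example.org/> .
ex:obs1 a ex:BloodPressureObservation ;
    ex:systolic "high" .            # wrong datatype on purpose
"""
shapes_ttl = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
ex:BPShape a sh:NodeShape ;
    sh:targetClass ex:BloodPressureObservation ;
    sh:property [ sh:path ex:systolic ;
                  sh:datatype xsd:decimal ;
                  sh:minCount 1 ; sh:maxCount 1 ] .
"""
conforms, _, report = validate(Graph().parse(data=data_ttl, format="turtle"),
                               shacl_graph=Graph().parse(data=shapes_ttl, format="turtle"))
print(conforms)   # False: the shape catches the modelling error
print(report)
```

    The closed-world behaviour the abstract highlights is what makes the wrong datatype a reported violation rather than a silently tolerated fact.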

  16. Validation of urban freeway models.

    DOT National Transportation Integrated Search

    2015-01-01

    This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...

  17. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
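    For context, the closure problem the abstract refers to can be stated compactly: averaging the Navier-Stokes equations introduces the unclosed Reynolds-stress term, which eddy-viscosity models approximate via the Boussinesq hypothesis (standard textbook background, not specific to this report):

```latex
% Reynolds-averaged momentum equation (incompressible), with the unclosed
% Reynolds-stress term, and the Boussinesq eddy-viscosity closure:
\begin{align}
\frac{\partial \bar{u}_i}{\partial t}
  + \bar{u}_j \frac{\partial \bar{u}_i}{\partial x_j}
  &= -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
  + \frac{\partial}{\partial x_j}\left(\nu \frac{\partial \bar{u}_i}{\partial x_j}
  - \overline{u_i' u_j'}\right), \\
-\overline{u_i' u_j'}
  &\approx \nu_t \left(\frac{\partial \bar{u}_i}{\partial x_j}
  + \frac{\partial \bar{u}_j}{\partial x_i}\right)
  - \tfrac{2}{3} k \,\delta_{ij}.
\end{align}
```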

  18. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and ideally validation of the model's predictive properties. The uncertainty inherent in any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than would be possible without this additional information.
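    A minimal sketch of a cost-effectiveness acceptability curve, one of the key concepts named above, computed from synthetic probabilistic sensitivity draws (all numbers hypothetical):

```python
import numpy as np

# CEAC: for each willingness-to-pay threshold, the share of simulated
# (delta_cost, delta_effect) pairs with positive net monetary benefit.
rng = np.random.default_rng(7)
d_cost = rng.normal(12000, 4000, 10000)     # incremental cost draws
d_effect = rng.normal(0.4, 0.2, 10000)      # incremental QALY draws

for wtp in (20000, 50000, 100000):          # thresholds in $/QALY
    p_ce = np.mean(wtp * d_effect - d_cost > 0)
    print(f"WTP {wtp:>7,} $/QALY -> P(cost-effective) = {p_ce:.2f}")
```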

  19. A physiology-based model describing heterogeneity in glucose metabolism: the core of the Eindhoven Diabetes Education Simulator (E-DES).

    PubMed

    Maas, Anne H; Rozendaal, Yvonne J W; van Pul, Carola; Hilbers, Peter A J; Cottaar, Ward J; Haak, Harm R; van Riel, Natal A W

    2015-03-01

    Current diabetes education methods are costly, time-consuming, and do not actively engage the patient. Here, we describe the development and verification of the physiological model for healthy subjects that forms the basis of the Eindhoven Diabetes Education Simulator (E-DES). E-DES shall provide diabetes patients with an individualized virtual practice environment incorporating the main factors that influence glycemic control: food, exercise, and medication. The physiological model consists of 4 compartments for which the inflow and outflow of glucose and insulin are calculated using 6 nonlinear coupled differential equations and 14 parameters. These parameters are estimated on 12 sets of oral glucose tolerance test (OGTT) data (226 healthy subjects) obtained from the literature. The resulting parameter set is verified on 8 separate literature OGTT data sets (229 subjects). The model is considered verified if 95% of the glucose data points lie within an acceptance range of ±20% of the corresponding model value. All glucose data points of the verification data sets lie within the predefined acceptance range. Physiological processes represented in the model include insulin resistance and β-cell function. Adjusting the corresponding parameters makes it possible to describe heterogeneity in the data and shows the capability of this model for individualization. We have verified the physiological model of the E-DES for healthy subjects. Heterogeneity of the data has successfully been modeled by adjusting the 4 parameters describing insulin resistance and β-cell function. Our model will form the basis of a simulator providing individualized education on glucose control. © 2014 Diabetes Technology Society.
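    The paper's verification criterion is easy to operationalize; a minimal sketch with hypothetical glucose values (not the E-DES data):

```python
import numpy as np

def within_acceptance(model: np.ndarray, data: np.ndarray, tol: float = 0.20) -> float:
    """Fraction of data points lying within +/- tol of the model value,
    mirroring the paper's +/-20% acceptance range."""
    return float(np.mean(np.abs(data - model) <= tol * np.abs(model)))

# Hypothetical OGTT glucose curves (mmol/L) at matching time points:
model = np.array([5.0, 7.8, 8.9, 7.4, 6.1, 5.3])
data = np.array([5.2, 7.1, 9.6, 7.9, 5.8, 5.1])
frac = within_acceptance(model, data)
print(f"{frac:.0%} of points in range -> verified: {frac >= 0.95}")
```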

  20. Modeling nitrate-nitrogen load reduction strategies for the des moines river, iowa using SWAT

    USGS Publications Warehouse

    Schilling, K.E.; Wolter, C.F.

    2009-01-01

    The Des Moines River, which drains a watershed of 16,175 km2 in portions of Iowa and Minnesota, is impaired for nitrate-nitrogen (nitrate) due to concentrations that exceed regulatory limits for public water supplies. The Soil Water Assessment Tool (SWAT) model was used to model streamflow and nitrate loads and to evaluate a suite of basin-wide changes and targeting configurations to potentially reduce nitrate loads in the river. The SWAT model comprised 173 subbasins and 2,516 hydrologic response units and included point and nonpoint nitrogen sources. The model was calibrated for an 11-year period, and three basin-wide and four targeting strategies were evaluated. Results indicated that nonpoint sources accounted for 95% of the total nitrate export. A reduction in fertilizer applications from 170 to 50 kg/ha achieved a 38% reduction in nitrate loads, exceeding the 34% reduction required. In terms of targeting, the most efficient load reductions occurred when fertilizer applications were reduced in the subbasins nearest the watershed outlet. The greatest load reduction per area of land treated was associated with reducing loads from the 55 subbasins with the highest nitrate loads, where a 14% reduction in nitrate loads was achieved by reducing applications on 30% of the land area. SWAT model results provide much-needed guidance on how to begin implementing load reduction strategies most efficiently in the Des Moines River watershed. © 2009 Springer Science+Business Media, LLC.

  1. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open-source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. 4. CGM Overview: The CGM is a government-owned, open-source, data-driven multi-agent social... HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  2. Reliability of adaptive mechanical structures: effect of actuator or sensor failure on stability

    NASA Astrophysics Data System (ADS)

    Fall, H.; Charon, W.; Kouta, R.

    2002-12-01

    In recent decades, significant activity worldwide has been directed towards active control. The goal of this research was essentially to improve the performance, reliability and safety of systems, notably in the case of structures subjected to random vibrations. Substantial work has been devoted to the use of "smart materials" as sensors and actuators. This article proposes a reliability analysis of mechanical systems by studying failures of the actuators or sensors. The effect of these failures on the stability and performance of the system is demonstrated, and the design methodologies are reviewed. Numerical examples based on the control of a panel under dynamic loading are provided to illustrate the proposed method.
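    A minimal sketch of the kind of check described, assuming a linear state-space model with state feedback, where an actuator failure is represented by zeroing the corresponding column of the input matrix (all matrices illustrative):

```python
import numpy as np

# x' = Ax + Bu with state feedback u = -Kx; actuator j failing means
# column j of B drops out. Stability = all closed-loop eigenvalues in
# the left half-plane.
A = np.array([[0.5, 1.0], [0.0, 0.3]])   # open-loop unstable plant
B = np.eye(2)                            # two independent actuators
K = np.array([[1.5, 0.0], [0.0, 1.3]])   # stabilizing feedback gains

def stable(A_cl: np.ndarray) -> bool:
    return bool(np.all(np.linalg.eigvals(A_cl).real < 0))

print("nominal :", stable(A - B @ K))            # True
for j in range(B.shape[1]):
    B_fail = B.copy()
    B_fail[:, j] = 0.0                           # actuator j fails
    print(f"actuator {j} out:", stable(A - B_fail @ K))  # False: stability lost
```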

  3. Understanding elder abuse in family practice

    PubMed Central

    Yaffe, Mark J.; Tazkarji, Bachir

    2012-01-01

    Summary Objective To present what constitutes elder abuse, what family physicians should know about it, the signs and symptoms suggesting mistreatment of older adults, how the Elder Abuse Suspicion Index tool can help detect mistreatment, and the options for responding when mistreatment is suspected. Sources of information MEDLINE, PsycINFO and Social Work Abstracts were searched for publications in French or English from 1970 to 2011 using the terms elder abuse, elder neglect, elder mistreatment, seniors, older adults, violence, identification, detection tools and signs and symptoms. Relevant publications were reviewed. Main message Elder abuse is an important cause of morbidity and mortality in older adults. Although family physicians are well placed to detect elder mistreatment, their actual rates of reporting cases of mistreatment are lower than those of other professions. This situation could improve if they better understood the kinds of behaviour that constitute elder abuse and the signs and symptoms observed in the office that might point to mistreatment. Detection of such cases could be facilitated by the use of a short validated tool such as the Elder Abuse Suspicion Index. Conclusion Family physicians can play a greater role in detecting possible elder abuse. Once mistreatment is suspected, social services or law enforcement agencies are available in most communities to conduct more in-depth assessments and to intervene.

  4. Modeling and Simulation at NASA

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2009-01-01

    This slide presentation covers two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly to launch and recovery, is about 45 days, and that approximately 4 launches per year are practicable. The second topic reviews a NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Ideas inherent in the new standard include the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement of the uncertainty in the results, and the credibility of the results. There is also discussion of verification and validation (V&V) of models and of the different types of models and simulations.
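    A minimal DES sketch in the spirit described, using the SimPy library; the process stages and durations are illustrative assumptions, not NASA figures:

```python
import simpy

# Vehicles flow through manufacturing/assembly and then compete for a
# single shared launch pad; durations are in days and purely hypothetical.
def vehicle_flow(env, name, pad):
    yield env.timeout(30)                      # manufacturing & assembly
    with pad.request() as req:                 # wait for the shared pad
        yield req
        yield env.timeout(10)                  # pad operations & launch
    yield env.timeout(5)                       # recovery & refurbishment
    print(f"{name} completed cycle on day {env.now:.0f}")

env = simpy.Environment()
pad = simpy.Resource(env, capacity=1)
for i in range(3):
    env.process(vehicle_flow(env, f"vehicle-{i}", pad))
env.run(until=365)
```

    The resource-constrained pad is what turns individual cycle times into a fleet-level launch rate, which is exactly the kind of insight the presentation attributes to DES.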

  5. Study of teaching practices related to modelling in science and technology with secondary school teachers

    NASA Astrophysics Data System (ADS)

    Aurousseau, Emmanuelle

    Models are tools widely used in science and technology (S&T) to represent and explain phenomena that are difficult to access, or even abstract. The modelling process is presented explicitly in the Quebec education program (PFEQ), notably in the second cycle of secondary school (Quebec. Ministere de l'Education du Loisir et du Sport, 2007a). It is thus one of the seven processes that students and teachers are expected to use. However, numerous studies highlight the difficulty teachers have in structuring their teaching practices around models and the modelling process, even though these are recognized as indispensable. Indeed, models help reconcile the concrete and abstract domains between which the scientist, even a budding one, moves back and forth in order to connect the experimental reference field that is manipulated and observed with the related theoretical field that is constructed. The objective of this research is therefore to understand how models and the modelling process help facilitate the articulation of the concrete and the abstract in the teaching of science and technology (S&T) in the second cycle of secondary school. To answer this question, we worked with teachers in a collaborative perspective through focus groups and classroom observation. These arrangements made it possible to examine the teaching practices that four teachers implement when using models and modelling processes. The analysis of the teaching practices, and of the adjustments the teachers envisage in their practice, allows us to draw out knowledge both for research and for teachers' practice with regard to the use of models and the modelling process in secondary-school S&T.

  6. Modelling human skull growth: a validated computational model

    PubMed Central

    Marghoub, Arsalan; Johnson, David; Khonsari, Roman H.; Fagan, Michael J.; Moazen, Mehran

    2017-01-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions (n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. PMID:28566514

  7. Modelling human skull growth: a validated computational model.

    PubMed

    Libby, Joseph; Marghoub, Arsalan; Johnson, David; Khonsari, Roman H; Fagan, Michael J; Moazen, Mehran

    2017-05-01

    During the first year of life, the brain grows rapidly and the neurocranium increases to about 65% of its adult size. Our understanding of the relationship between the biomechanical forces, especially from the growing brain, the craniofacial soft tissue structures and the individual bone plates of the skull vault is still limited. This basic knowledge could help in the future planning of craniofacial surgical operations. The aim of this study was to develop a validated computational model of skull growth, based on the finite-element (FE) method, to help understand the biomechanics of skull growth. To do this, a two-step validation study was carried out. First, an in vitro physical three-dimensional printed model and an in silico FE model were created from the same micro-CT scan of an infant skull and loaded with forces from the growing brain from zero to two months of age. The results from the in vitro model validated the FE model before it was further developed to expand from 0 to 12 months of age. This second FE model was compared directly with in vivo clinical CT scans of infants without craniofacial conditions ( n = 56). The various models were compared in terms of predicted skull width, length and circumference, while the overall shape was quantified using three-dimensional distance plots. Statistical analysis yielded no significant differences between the male skull models. All size measurements from the FE model versus the in vitro physical model were within 5%, with one exception showing a 7.6% difference. The FE model and in vivo data also correlated well, with the largest percentage difference in size being 8.3%. Overall, the FE model results matched well with both the in vitro and in vivo data. With further development and model refinement, this modelling method could be used to assist in preoperative planning of craniofacial surgery procedures and could help to reduce reoperation rates. © 2017 The Author(s).

  8. Estimation and validation of the stability and control derivatives of the non-linear dynamic model of a fixed-wing UAV

    NASA Astrophysics Data System (ADS)

    Courchesne, Samuel

    Knowledge of the dynamic characteristics of a fixed-wing UAV is necessary to design flight control laws and to build a high-quality flight simulator. The basic features of a flight-mechanics model include the mass and inertia properties and the major aerodynamic terms; obtaining them is a complex process involving various numerical analysis techniques and experimental procedures. This thesis focuses on the analysis of estimation techniques applied to the problem of estimating stability and control derivatives from flight test data provided by an experimental UAV. To achieve this objective, a modern identification methodology (Quad-M) is used to coordinate processing tasks from multidisciplinary fields, such as parameter estimation, modeling, instrumentation, the definition of flight maneuvers, and validation. The system under study is a non-linear six-degree-of-freedom model with a linear aerodynamic model. Time-domain techniques are used for identification of the drone. The first technique, the equation-error method, is used to determine the structure of the aerodynamic model. Thereafter, the output-error method and the filter-error method are used to estimate the values of the aerodynamic coefficients. Matlab scripts for parameter estimation obtained from the American Institute of Aeronautics and Astronautics (AIAA) are used and modified as necessary to achieve the desired results. A substantial part of this research is devoted to the design of experiments, including the onboard data acquisition system and the definition of flight maneuvers. The flight tests were conducted under stable flight conditions and with low atmospheric disturbance. Nevertheless, the identification results showed that the filter-error method is the most effective for estimating the parameters of the drone, owing to the presence of process and measurement noise. The aerodynamic coefficients are validated using a numerical analysis based on the vortex method. In addition, a
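    A minimal sketch of the equation-error method named above: the aerodynamic coefficient is regressed linearly on measured states, so the stability and control derivatives follow from ordinary least squares (synthetic data; the derivative names are generic, not the thesis's results):

```python
import numpy as np

# Hypothetical pitching-moment coefficient Cm modelled as
# Cm = Cm0 + Cm_alpha*alpha + Cm_q*q + Cm_de*delta_e
rng = np.random.default_rng(5)
n = 400
alpha = rng.uniform(-0.1, 0.2, n)          # angle of attack (rad)
q = rng.uniform(-0.3, 0.3, n)              # normalized pitch rate
delta_e = rng.uniform(-0.2, 0.2, n)        # elevator deflection (rad)

true_theta = np.array([0.04, -0.6, -3.5, -1.2])   # Cm0, Cm_alpha, Cm_q, Cm_de
X = np.column_stack([np.ones(n), alpha, q, delta_e])
cm = X @ true_theta + rng.normal(0.0, 0.01, n)    # "measured" Cm with noise

theta_hat, *_ = np.linalg.lstsq(X, cm, rcond=None)
print("estimated derivatives:", np.round(theta_hat, 3))
```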

  9. External validation of preexisting first trimester preeclampsia prediction models.

    PubMed

    Allen, Rebecca E; Zamora, Javier; Arroyo-Manzano, David; Velauthar, Luxmilar; Allotey, John; Thangaratinam, Shakila; Aquilina, Joseph

    2017-10-01

    To validate the increasing number of prognostic models being developed for preeclampsia using our own prospective study. A systematic review of the literature assessing biomarkers, uterine artery Doppler and maternal characteristics in the first trimester for the prediction of preeclampsia was performed, and models were selected based on predefined criteria. Validation was performed by applying the regression coefficients published in the different derivation studies to our cohort. We assessed the models' discrimination ability and calibration. Twenty models were identified for validation. The discrimination ability observed in the derivation studies (area under the curve, AUC) ranged from 0.70 to 0.96; when these models were validated against our cohort, the AUCs varied widely, ranging from 0.504 to 0.833. Comparing the AUCs obtained in the derivation studies with those in the validation cohort, we found statistically significant differences for several studies. There is currently no definitive prediction model with adequate discrimination for preeclampsia that performs as well when applied to a different population and can differentiate well between the highest- and lowest-risk groups within the tested population. The pre-existing large number of models limits the value of further model development, and future research should be focussed on further attempts to validate existing models and on assessing whether their implementation improves patient care. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
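    A minimal sketch of this validation procedure, applying published logistic-regression coefficients (hypothetical values here) to a new cohort and measuring discrimination and rough calibration, assuming scikit-learn:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Published coefficients from a hypothetical derivation study:
beta = np.array([-4.0, 1.2, 0.8])            # intercept + two predictors
rng = np.random.default_rng(11)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
p = 1.0 / (1.0 + np.exp(-X @ beta))          # predicted risk in validation cohort
y = rng.binomial(1, p)                        # observed outcomes

print(f"AUC = {roc_auc_score(y, p):.3f}")                       # discrimination
print(f"mean predicted = {p.mean():.3f}, observed = {y.mean():.3f}")  # crude calibration
```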

  10. Development of semi-self-compacting concretes with adapted rheology for infrastructure

    NASA Astrophysics Data System (ADS)

    Sotomayor Cruz, Cristian Daniel

    In recent decades, Canadian and Quebec infrastructure has included many reinforced concrete structures with durability problems due to severe climatic conditions, poor structural design, material quality, the types of concrete chosen, the construction systems, or uncontrollable events. Regarding the choice of concrete for infrastructure construction, a wide range of concretes divided into two main types can be used: conventional vibrated concrete (BCV) and self-compacting concrete (BAP). In the case of BCV, inadequate consolidation by vibration has been a recurring problem, causing structural damage; this has led to reduced durability and increased maintenance and repair costs for infrastructure. Although using BAP has advantages such as the elimination of vibration, reduced labour costs and improved quality of the structures, the initial cost of BAP compared with BCV does not yet permit its widespread use in the construction industry. This thesis presents the design of a new range of semi-self-compacting concrete for infrastructure construction (BSAP-I) requiring minimal vibration. The goal is to find an optimal balance between the rheology and the initial cost of the new concrete so as to give structures good structural and economic performance. The experimental program established first made it possible to assess the feasibility of using BSAP-I to place the piers of a bridge infrastructure in Sherbrooke. In addition, the use of a design of experiments allowed the evaluation of three mix-design parameters on the properties of the BSAP-I mixtures in the fresh and hardened states. Finally, the evaluation of the performance of

  11. Calibration and validation of rockfall models

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most of the rockfall models presented in the literature completely lack calibration and validation of their results. In this contribution, we explore different strategies for rockfall model calibration and validation, starting from both an historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. This event was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on optimizing the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the

  12. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
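    A minimal sketch of level (1) validation, computing the prognostic index from published coefficients (hypothetical values) on synthetic data and checking discrimination with Harrell's c-index, assuming the lifelines library:

```python
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(2)
n = 300
X = rng.normal(size=(n, 3))
beta_published = np.array([0.7, -0.4, 0.2])     # from the derivation study
pi = X @ beta_published                          # prognostic index (higher = worse)

t = rng.exponential(scale=np.exp(-pi))           # synthetic survival times
observed = rng.random(n) < 0.7                   # ~30% censoring

# concordance_index expects scores where higher = longer survival, hence -pi
print(f"c-index = {concordance_index(t, -pi, observed):.3f}")
```

    Assessing calibration, as the paper notes, additionally requires level (3) information (the baseline survival curve), which this sketch deliberately omits.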

  13. Authorship and competing interests: an analysis of the instructions to authors of pharmacy practice journals

    PubMed Central

    Courbon, Ève; Tanguay, Cynthia; Lebel, Denis; Bussières, Jean-François

    2014-01-01

    ABSTRACT Background: Honorary and ghost authorship, along with competing interests, are well-documented problems related to the publication of scientific articles. Guidelines exist for the writing and publication of scientific manuscripts, notably those of the International Committee of Medical Journal Editors (ICMJE). Objectives: The primary objective of this descriptive, cross-sectional study was to catalogue the instructions concerning authorship and competing interests found in the instructions to authors of pharmacy practice journals. The secondary objective was to identify corrective measures for more transparent authorship. Method: The research began by identifying journals dealing with pharmacy practice. The instructions to authors of these journals were then consulted to catalogue the recommendations intended to avoid problems of authorship and competing interests. Finally, the members of the research team conferred to define possible corrective measures for researchers. Results: Of the 232 pharmacy journals, 33 were identified as dealing with pharmacy practice. A total of 24 (73%) journals stated that they followed the ICMJE policy, 14 (42%) asked authors to complete a competing-interest declaration form at the time of article submission, 17 (52%) provided a definition of authorship, and 5 (15%) asked for details of each author's contributions. A grid of 40 criteria was developed to define the attribution of authorship status. Conclusion: Fewer than half of the journals asked authors to submit a competing-interest declaration form at the time of

  14. Guidelines on the referral of suspected lung cancer by family physicians and other primary care providers

    PubMed Central

    Del Giudice, M. Elisabeth; Young, Sheila-Mae; Vella, Emily T.; Ash, Marla; Bansal, Praveen; Robinson, Andrew; Skrastins, Roland; Ung, Yee; Zeldin, Robert; Levitt, Cheryl

    2014-01-01

    Summary Objective These guidelines aim to help family physicians and other generalists recognize the clinical manifestations that should raise suspicion of lung cancer in their patients. Committee composition Committee members were chosen from among the regional primary care leaders of the Provincial Primary Care and Cancer Network of Action Cancer Ontario and from among the members of the Lung Cancer Disease Site Group of Action Cancer Ontario. Methods These guidelines are the product of a systematic review of the evidence, a synthesis of the evidence, and a formal external review by Canadian stakeholders who validated the relevance of the recommendations. Report These evidence-based guidelines were developed to improve the management, in the Canadian context, of patients presenting with clinical manifestations of lung cancer. Conclusion Early detection and referral of patients with lung cancer could ultimately help reduce cancer-related morbidity and mortality. These guidelines could also prove useful in setting up lung cancer diagnostic programs and in helping decision-makers ensure that appropriate resources are in place.

  15. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence, to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  16. Applicability Analysis of Validation Evidence for Biomedical Computational Models

    DOE PAGES

    Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...

    2017-09-07

    Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model, and its validation evidence, to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.

  17. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
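    The predictive-performance idea generalizes beyond phylogenetics; a minimal sketch of k-fold selection between two candidate probability models by held-out log-likelihood (toy Gaussian/Laplace stand-ins, assuming numpy/scipy, not the authors' software):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
data = rng.normal(1.0, 2.0, 200)   # synthetic observations

def cv_score(data, fit, logpdf, k=5):
    """Mean held-out log-likelihood over k folds: higher = better prediction."""
    folds = np.array_split(rng.permutation(data), k)
    scores = []
    for i in range(k):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        params = fit(train)
        scores.append(logpdf(folds[i], *params).mean())
    return float(np.mean(scores))

model_a = (lambda x: (x.mean(), x.std(ddof=1)), stats.norm.logpdf)
model_b = (lambda x: (np.median(x), np.abs(x - np.median(x)).mean()),
           stats.laplace.logpdf)
for name, (fit, logpdf) in {"normal": model_a, "laplace": model_b}.items():
    print(name, round(cv_score(data, fit, logpdf), 3))
```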

  18. Characterization of feedstocks developed for low-pressure powder injection moulding (LPIM)

    NASA Astrophysics Data System (ADS)

    Fareh, Fouad

    Low-pressure metal powder injection moulding is a manufacturing technique that produces parts with the complexity of cast parts but the mechanical properties of wrought parts. However, the optimization of the debinding and sintering steps has so far been carried out with feedstocks whose optimal mouldability has not been demonstrated. As a result, the understanding of the rheological properties and segregation of the feedstocks is very limited, and this is the weak point of the LPIM process. The objective of this research project was to characterize the influence of binders on the rheological behaviour of feedstocks by measuring the viscosity and segregation of the low-viscosity feedstocks used in the LPIM process. To achieve this objective, rheological and thermogravimetric tests were conducted on 12 feedstocks. These feedstocks were prepared from spherical Inconel 718 powder (constant solids loading of 60%) and waxes, surfactants or thickening agents. The rheological tests were used, among other things, to calculate the injectability index αSTV of the feedstocks, while the thermogravimetric tests made it possible to precisely evaluate the segregation of the powders in the feedstocks. It was shown that the three (3) feedstocks containing paraffin wax and stearic acid exhibit higher αSTV indices, which is advantageous for metal powder injection moulding (MIM), but segregate far too much for the manufactured part to have good mechanical characteristics. Conversely, the feedstock containing paraffin wax and ethylene-vinyl acetate, as well as the feedstock containing only carnauba wax, segregate little or not at all but have very low αSTV indices: they are therefore difficult to inject. The best compromise therefore seems to be the feedstocks containing

  19. Participation of children and adolescents in boxing

    PubMed Central

    Purcell, Laura K; LeBlanc, Claire MA

    2012-01-01

    SUMMARY Thousands of boys and girls under 19 years of age participate in boxing in North America. Although boxing has benefits for participants, including exercise, self-discipline and self-confidence, the sport itself promotes and rewards deliberate blows to the head and face. Those who box risk injuries to the head, face and neck, including chronic and even fatal neurological trauma. Concussions are one of the main injuries caused by boxing. Because of the risk of head and facial injuries, the Canadian Paediatric Society and the American Academy of Pediatrics vigorously oppose boxing as a sport for children and adolescents. These organizations recommend that physicians speak out against boxing among young people and encourage them to participate in other activities in which intentional blows to the head are not an essential component of the sport.

  20. Membrane fluidization by alcohols inhibits DesK-DesR signalling in Bacillus subtilis.

    PubMed

    Vaňousová, Kateřina; Beranová, Jana; Fišer, Radovan; Jemioła-Rzemińska, Malgorzata; Matyska Lišková, Petra; Cybulski, Larisa; Strzałka, Kazimierz; Konopásek, Ivo

    2018-03-01

    After cold shock, the Bacillus subtilis desaturase Des introduces double bonds into the fatty acids of existing membrane phospholipids. The synthesis of Des is regulated exclusively by the two-component system DesK/DesR; DesK serves as a sensor of the state of the membrane and triggers Des synthesis after a decrease in membrane fluidity. The aim of our work is to investigate the biophysical changes in the membrane that are able to affect the DesK signalling state. Using linear alcohols (ethanol, propanol, butanol, hexanol, octanol) and benzyl alcohol, we were able to suppress Des synthesis after a temperature downshift. The changes in the biophysical properties of the membrane caused by alcohol addition were followed using membrane fluorescent probes and differential scanning calorimetry. We found that the membrane fluidization induced by alcohols was reflected in an increased hydration at the lipid-water interface. This is associated with a decrease in DesK activity. The addition of alcohol mimics a temperature increase, which can be measured isothermically by fluorescence anisotropy. The effect of alcohols on the membrane periphery is in line with the concept of the mechanism by which two hydrophilic motifs located at opposite ends of the transmembrane region of DesK, which work as a molecular caliper, sense temperature-dependent variations in membrane properties. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Model Validation Against The Modelers’ Data Archive

    DTIC Science & Technology

    2014-08-01

    completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as... readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are... available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release

  2. Study of the performance of CODAR and WERA high-frequency radars for measuring ocean currents in the partial presence of sea ice

    NASA Astrophysics Data System (ADS)

    Kamli, Emna

    High-frequency radars (HFR) measure ocean surface currents with a range of up to 200 kilometres and a resolution on the order of one kilometre. The purpose of this study is to characterize the performance of HFRs, in terms of spatial coverage, for measuring surface currents in the partial presence of sea ice. To this end, current measurements taken during the winter of 2013 by two CODAR-type radars on the south shore of the lower St. Lawrence estuary and one WERA-type radar on the north shore were used. First, the mean daily area of the zone where currents are measured by each radar was compared with the energy of the Bragg waves computed from the raw acceleration data provided by a buoy moored in the zone covered by the radars. CODAR coverage depends on the Bragg energy density, whereas WERA coverage is practically insensitive to it. A fetch model called GENER was forced with the wind speed predicted by Environment Canada's GEM model to estimate the significant wave height and the modal period of the waves. From these parameters, the energy density of the Bragg waves was evaluated during the winter using the theoretical Bretschneider spectrum. These results establish the normal coverage of each radar in the absence of sea ice. The sea-ice concentration predicted by the Canadian operational ice-ocean forecasting system was averaged over the various wind fetches according to the mean daily direction of the waves predicted by GENER. Second, the relationship was established between, on the one hand, the ratio of the daily coverages obtained during the winter of 2013 to the normal coverage of each radar and, on the other hand, the mean daily sea-ice concentration. The coverage ratio decreases with increasing sea-ice concentration for the two types
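    For context, the Bragg waves invoked above obey the standard first-order relation between the radar operating frequency and the resonant ocean wave (textbook background, not specific to this thesis):

```latex
% First-order Bragg scattering: the radar sees ocean waves of half its
% electromagnetic wavelength; with the deep-water dispersion relation this
% gives the Bragg peak frequency for a radar operating frequency f_r:
\lambda_B = \frac{\lambda_r}{2} = \frac{c}{2 f_r},
\qquad
f_B = \frac{1}{2\pi}\sqrt{g\,\frac{2\pi}{\lambda_B}} = \sqrt{\frac{g f_r}{\pi c}} .
```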

  3. Modelling of sound wave propagation in a complex natural environment

    NASA Astrophysics Data System (ADS)

    L'Esperance, Andre

    This work is devoted to outdoor sound propagation in a complex natural environment, i.e., in the presence of real conditions of wind, temperature gradient, and atmospheric turbulence. More specifically, this work has two objectives. On the one hand, it aims to develop a heuristic sound propagation model (MHP) capable of taking into account all the meteorological and acoustic phenomena that influence outdoor sound propagation. On the other hand, it aims to identify in which circumstances, and to what extent, meteorological conditions affect sound propagation. This work is divided into five parts. After a brief introduction identifying the motivations for this study (Chapter 1), Chapter 2 reviews previous work in the field of outdoor sound propagation. This chapter also presents the foundations of geometrical acoustics, from which the acoustic part of the heuristic propagation model was developed. In addition, it describes how the phenomena of refraction and atmospheric turbulence can be treated within ray theory. Chapter 3 presents the heuristic propagation model (MHP) developed in the course of this work. The first section of this chapter describes the acoustic propagation model, which assumes a linear sound-speed gradient and is based on a hybrid solution combining geometrical acoustics and residue theory. The second section of Chapter 3 deals more specifically with the modelling of the meteorological aspects and the determination of the sound-speed profiles and fluctuation indices associated with meteorological conditions. Section 3 of this chapter describes how the resulting sound-speed profiles are linearized for the computations in the acoustic model, and finally Section 4 gives the general trends obtained with the model. Chapter 4 describes …
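    As a pocket illustration of the linear sound-speed-gradient assumption in geometrical acoustics, the sketch below computes the radius of curvature of a circular-arc ray; the profile values are hypothetical and the formula is the standard ray-acoustics result, not this thesis's hybrid model.

    ```python
    import math

    def ray_radius(c0: float, gradient: float, elevation_deg: float = 0.0) -> float:
        """Radius of curvature (m) of a sound ray in a medium with a linear
        effective sound-speed profile c(z) = c0 + gradient*z. Rays are circular
        arcs; the sound refracts downward when the gradient is positive aloft
        and upward when it is negative."""
        return c0 / (abs(gradient) * math.cos(math.radians(elevation_deg)))

    # Hypothetical upward-refracting daytime condition: c0 = 340 m/s, dc/dz = -0.1 1/s.
    print(f"{ray_radius(340.0, -0.1):.0f} m")  # arc radius of about 3.4 km
    ```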

  4. On the modelling of superconductors: Bean's "critical state" model in three dimensions

    NASA Astrophysics Data System (ADS)

    Bossavit, A.

    1993-03-01

    Macroscopic modelling of superconductors demands the substitution of some nonlinear behavior law for Ohm's law. For this, a version of Bean's "critical state" model, derived from the setting of a convex functional of the current density field, valid in dimension 3 without any prior assumption about the direction of currents, is proposed. It is shown how two standard three-dimensional finite element methods ("h-formulation" and "e-formulation"), once fitted with this model, can deal with situations where superconductors are present.
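    A minimal numerical sketch of the critical-state constraint, assuming the convex functional acts like the indicator of the ball |J| <= Jc and is enforced by a pointwise projection; this is not the paper's finite element formulation, only the proximal step such a functional would induce.

    ```python
    import numpy as np

    def bean_project(j: np.ndarray, jc: float) -> np.ndarray:
        """Project a current-density field (N x 3 array, A/m^2) onto the Bean
        constraint |J| <= Jc, with no assumption on current direction, as in
        the 3-D critical-state setting described above."""
        norms = np.linalg.norm(j, axis=1, keepdims=True)
        scale = np.minimum(1.0, jc / np.maximum(norms, 1e-300))  # avoid /0
        return j * scale

    # Hypothetical field values (A/m^2):
    j = np.array([[1.0e8, 0.0, 0.0], [2.0e8, 2.0e8, 0.0]])
    print(bean_project(j, jc=1.5e8))
    ```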

  5. Balloons for tomorrow

    NASA Astrophysics Data System (ADS)

    Régipa, R.

    Starting from a theory for determining the shapes and global stresses of a balloon of revolution (or close to one), a new family of balloons has been defined. Current balloons, of so-called "natural shape", are generally designed for zero circumferential tension. Thus, for a given mission, the longitudinal tension and the shape of the envelope are strictly imposed. The new generation of balloons is globally cylindrical, and their poles are connected by an axial cable that transmits part of the loads from the hook (lower pole) directly to the upper pole. In addition, the cylindrical side zone is subjected to a weak field of circumferential tensions. Two parameters therefore allow the tension distribution and the shape of the envelope to be varied: the tension of the cable connecting the poles (or the length of this cable), and the desired mean circumferential tension (or the radius of the balloon). One can therefore design and build either balloons of adapted shape, such as flat-bottomed balloons for the proper operation of infrared hot-air balloons (MIR project), or balloons optimized for a good stress distribution and a better use of envelope materials, for all stratospheric programmes. The result is an appreciable saving in manufacturing costs, increased operating reliability of these balloons, and much higher operational performance, making it possible, among other things, to consider very-high-altitude flights with very light materials.

  6. Diethylstilbestrol (DES) and Cancer

    MedlinePlus

    Overview page on diethylstilbestrol (DES) and cancer. Recoverable fragments: "What is DES?" … "Fertility Problems in DES Daughters": a table listing each fertility complication with its hazard ratio and cumulative percentage …

  7. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    PubMed

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
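    A minimal sketch of the two core statistical checks in an external validation study, discrimination (c-statistic) and calibration (logistic recalibration intercept and slope); the cohort and outcomes below are simulated for illustration, not clinical data.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical external cohort: predicted risks from a published model
    # and observed binary outcomes consistent with those predictions.
    p_hat = rng.uniform(0.01, 0.6, 500)
    y = rng.binomial(1, p_hat)

    # Discrimination: c-statistic = area under the ROC curve.
    print("c-statistic:", roc_auc_score(y, p_hat))

    # Calibration: regress outcomes on the linear predictor (logit of p_hat);
    # an intercept near 0 and a slope near 1 indicate good calibration.
    lp = np.log(p_hat / (1 - p_hat))
    fit = sm.GLM(y, sm.add_constant(lp), family=sm.families.Binomial()).fit()
    print("calibration intercept, slope:", fit.params)
    ```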

  8. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982, Development of a Conservative Model Validation Approach for Reliable … obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the … 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account …

  9. Vibration disease

    PubMed Central

    Shen, Shixin (Cindy); House, Ronald A.

    2017-01-01

    Objective: To enable family physicians to understand the epidemiology, pathogenesis, symptoms, diagnosis, and management of vibration disease, an important and common occupational disease in Canada. Sources of information: A MEDLINE search was conducted to identify research and review articles on vibration disease. A Google search was conducted to obtain grey literature relevant to the Canadian context. Additional references were drawn from the articles identified. Main message: Vibration disease is a widespread occupational disease affecting workers in various industries who use vibrating tools. The disease is nevertheless underdiagnosed in Canada. It has 3 components: vascular, in the form of secondary Raynaud phenomenon; neurosensory; and musculoskeletal. In its most advanced stages, vibration disease leads to substantial disability and poor quality of life. Its diagnosis requires a careful history, in particular an occupational history, a physical examination, laboratory tests to rule out other diagnoses, and referral to occupational medicine for further investigation. Management consists of reducing vibration exposure, avoiding cold temperatures, quitting smoking, and administering medication. Conclusion: To ensure prompt diagnosis of vibration disease and to improve prognosis and quality of life, family physicians should be aware of this common occupational disease and be able to obtain the relevant details during history taking, refer patients to occupational medicine clinics, and initiate compensation claims appropriately. PMID:28292812

  10. Current state of first-line care for mentally ill patients in Antananarivo: a retrospective study

    PubMed Central

    Bakohariliva, Hasina Andrianarivony; Rafehivola, Imisanavalona Hanitrinihaja; Raobelle, Evah Norotiana; Raharivelo, Adeline; Rajaonarison, Bertille Hortense

    2018-01-01

    Religion and traditional healers still occupy a prominent place in the management of mental illness in Madagascar. We therefore set out to establish a picture of first-line care for mentally ill patients. We conducted a retrospective descriptive study covering a 16-month period from January 2014 to April 2015 in the psychiatry department of the Befelatanana University Hospital in Antananarivo. The prevalence of psychoses was 25%. Female gender (53%), Merina ethnicity (77%), students (45%), secondary education level (40%), single marital status (72%), Protestant religion (45%), and average socioeconomic level (57.5%) predominated. Among the clinical parameters, sudden onset (52%), religion as the first recourse (40%), and a history of similar cases (90%) were in the majority. Schizophrenia was the most frequently encountered pathology, in half of the cases. With religious or traditional treatment, the time to improvement was, in half of the cases, more than 10 days of hospitalization. Patients who received psychiatric care as first-line treatment improved in 75% of cases in fewer than 10 days. Delay in seeking psychiatric care is a reality in Madagascar, and it worsens the prognosis of psychoses. PMID:29632623

  11. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    Summary: In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, the model selection is usually based on some measure of goodness of fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in chemometrics and econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
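    A minimal sketch contrasting LOO validation with MCCV (many random 90/10 splits), using ordinary least squares as a stand-in for GLSR, which scikit-learn does not provide; the data are simulated.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 3))  # hypothetical catchment descriptors
    y = X @ np.array([0.8, -0.5, 0.3]) + rng.normal(scale=0.5, size=60)

    model = LinearRegression()  # stand-in for GLSR

    # Leave-one-out (LOO) validation, the common RFFA practice.
    loo = cross_val_score(model, X, y, cv=LeaveOneOut(),
                          scoring="neg_mean_squared_error")

    # Monte Carlo cross validation: 200 random splits leaving out 10% each time.
    mccv = cross_val_score(model, X, y,
                           cv=ShuffleSplit(n_splits=200, test_size=0.1,
                                           random_state=1),
                           scoring="neg_mean_squared_error")

    print("LOO MSE: ", -loo.mean())
    print("MCCV MSE:", -mccv.mean())
    ```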

  12. Model Validation | Center for Cancer Research

    Cancer.gov

    Research Investigation and Animal Model Validation: this activity is also under development and thus far has included increasing pathology resources, delivering pathology services, and using imaging and surgical methods to develop and refine animal models in collaboration with other CCR investigators.

  13. Quantitative model validation of manipulative robot systems

    NASA Astrophysics Data System (ADS)

    Kartowisastro, Iman Herwidiana

    This thesis is concerned with applying the distortion quantitative validation technique to a robot manipulative system with revolute joints. Using the distortion technique to validate a model quantitatively, the model parameter uncertainties are taken into account in assessing the faithfulness of the model, and this approach is more objective than the commonly used visual-comparison method. The industrial robot is represented by the TQ MA2000 robot arm. Details of the mathematical derivation of the distortion technique are given, explaining the required distortion of the constant parameters within the model and the assessment of model adequacy. Due to the complexity of a robot model, only the first three degrees of freedom are considered, and all links are assumed rigid. The modelling involves the Newton-Euler approach to obtain the dynamics model, and the Denavit-Hartenberg convention is used throughout the work. A conventional feedback control system is used in developing the model. The system's behavior under parameter changes is investigated, as some parameters are redundant. This work is important so that the most important parameters to be distorted can be selected; this leads to a new term, the fundamental parameters. The transfer function approach has been chosen to validate the industrial robot quantitatively against measured data, owing to its practicality. Initially, the assessment of the model fidelity criterion indicated that the model was not capable of explaining the transient record in terms of the model parameter uncertainties. Further investigation led to significant improvements of the model and a better understanding of its properties. After several improvements to the model, the fidelity criterion was almost satisfied. Although the fidelity criterion is slightly less than unity, it has been shown that the distortion technique can be applied to a robot manipulative system. Using the validated model, the importance of …

  14. A parametric identification method for in situ monitoring of lap joints by vibration wave propagation

    NASA Astrophysics Data System (ADS)

    Francoeur, Dany

    This doctoral thesis is part of CRIAQ (Consortium de recherche et d'innovation en aérospatiale du Québec) projects aimed at developing on-board approaches for detecting defects in aeronautical structures. The originality of this thesis lies in the development and validation of a new method for detecting, quantifying, and localizing a notch in a lap-joint structure by the propagation of vibration waves. The first part reviews the state of knowledge on defect identification in the context of Structural Health Monitoring (SHM), as well as the modelling of lap joints. Chapter 3 develops the wave propagation model of a lap joint damaged by a notch, for a flexural wave in the mid-frequency range (10-50 kHz). To this end, a transmission line model (TLM) is built to represent a one-dimensional (1D) joint. This 1D model is then adapted to a two-dimensional (2D) joint under the assumption of a plane wavefront incident perpendicular to the joint. A parametric identification method is then developed that allows both the calibration of the model of the undamaged lap joint and the detection and characterization of the notch located on the joint. This method is coupled with an algorithm that performs an exhaustive search of the entire parameter space. This technique makes it possible to extract an uncertainty zone associated with the parameters of the optimal model. A sensitivity study of the identification is also carried out. Several measurement campaigns on 1D and 2D lap joints were conducted, allowing the study of the repeatability of the results and the variability of different damage cases. The results of this study first show that the proposed detection method is very effective and can track damage progression. Very good results …

  15. Economic analysis of model validation for a challenge problem

    DOE PAGES

    Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.

    2016-02-19

    It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only or no testing and no modeling may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit-cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit-cost than a testing only or no modeling and no testing option.
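    In the spirit of that framework, a toy benefit-cost comparison is sketched below; every number is invented for illustration and has no connection to the paper's challenge problem.

    ```python
    # Compare expected total cost of three options: do nothing, test only,
    # or model + validate. All figures below are hypothetical placeholders.
    options = {
        #                     (upfront cost, probability of field failure, failure cost)
        "no model, no test":  (0.0,   0.10, 5.0e6),
        "testing only":       (4.0e5, 0.04, 5.0e6),
        "model + validation": (2.5e5, 0.03, 5.0e6),
    }

    for name, (cost, p_fail, c_fail) in options.items():
        expected_total = cost + p_fail * c_fail  # expected cost = outlay + risk
        print(f"{name:20s} expected total cost = ${expected_total:,.0f}")
    ```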

  16. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system-level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has long been the norm for thermal and structural V&V. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than is possible on flight hardware. SysML simulations provide access to, and control of, internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated that such an approach is possible.

  17. Litigation involving DES.

    PubMed

    Rheingold, P D

    1976-01-01

    Focus is on the diethylstilbestrol (DES) litigation which has resulted from the 1971 discovery that this synthetic estrogen can cause cancer in the daughters of women who used the drug during pregnancy in an effort to prevent threatened abortion. Possibly 100 suits are pending at this time in which DES daughters claim injuries. In most of these, vaginal or cervical cancer has appeared, with or without a hysterectomy having been performed. Several women have died from cancer. The fact that the use of DES occurred many years ago is the legal hurdle most troublesome to lawyers. The average woman coming to a lawyer's office today has a mother who used some form of DES, perhaps in 1955. Few drugstores have records today of the prescriptions they filled 20 years ago. It has been estimated that over the 1950-1970 period more than 200 different companies manufactured or "tabletized" under their own name DES plus a variety of similar synthetic estrogens promoted for the prevention of threatened abortion. A further hurdle caused by the passage of time is that even the records of the physicians are frequently lost. A final problem created by the age of the cases is the statute of limitations. If the actual manufacturer of the DES cannot be identified, this is generally the end of the lawyer's interest in the case. The chance of the plaintiff winning may be increased if the action against all the manufacturers is a class action. Most of the pending DES suits are against the manufacturer and not against the doctor. Thus far no DES case has been tried to completion. Several have been settled by the manufacturers on the eve of trial, generally for less than the full sum that a cancer victim would expect to receive.

  18. Travel medicine

    PubMed Central

    Aw, Brian; Boraston, Suni; Botten, David; Cherniwchan, Darin; Fazal, Hyder; Kelton, Timothy; Libman, Michael; Saldanha, Colin; Scappatura, Philip; Stowe, Brian

    2014-01-01

    Objective: To define the practice of travel medicine, to present the fundamentals of a complete pre-travel consultation for international travellers, and to help identify patients who would be better referred to travel medicine professionals. Sources of information: Guidelines and recommendations on travel medicine and travel-related illnesses published by national and international health authorities were reviewed. A search of the related literature in MEDLINE and EMBASE was also carried out. Main message: Travel medicine is a highly dynamic specialty that focuses on pre-travel preventive care. A comprehensive risk assessment for each traveller is essential to accurately gauge the risks specific to the traveller, the itinerary, and the destination, and to offer advice on the most appropriate risk-management interventions to promote health and prevent adverse medical events during travel. Vaccines may also be required and must be tailored to the traveller's immunization history, itinerary, and the time remaining before departure. Conclusion: A traveller's health and safety depend on the degree of expertise of the physician providing pre-travel counselling and vaccines, as needed. Those who counsel travellers are advised to be aware of the scope of this responsibility and, wherever possible, to seek consultation with travel medicine professionals for all high-risk travellers.

  19. Development and psychometric evaluation of the Decisional Engagement Scale (DES-10): A patient-reported psychosocial survey for quality cancer care.

    PubMed

    Hoerger, Michael; Chapman, Benjamin P; Mohile, Supriya G; Duberstein, Paul R

    2016-09-01

    In light of recent health care reforms, we have provided an illustrative example of new opportunities available for psychologists to develop patient-reported measures related to health care quality. Patient engagement in health care decision making has been increasingly acknowledged as a vital component of quality cancer care. We developed the 10-item Decisional Engagement Scale (DES-10), a patient-reported measure of engagement in decision making in cancer care that assesses patients' awareness of their diagnosis, sense of empowerment and involvement, and level of information seeking and planning. The National Institutes of Health's ResearchMatch recruitment tool was used to facilitate Internet-mediated data collection from 376 patients with cancer. DES-10 scores demonstrated good internal consistency reliability (α = .80), and the hypothesized unidimensional factor structure fit the data well. The reliability and factor structure were supported across subgroups based on demographic, socioeconomic, and health characteristics. Higher DES-10 scores were associated with better health-related quality of life (r = .31). In concurrent validity analyses controlling for age, socioeconomic status, and health-related quality of life, higher DES-10 scores were associated with higher scores on quality-of-care indices, including greater awareness of one's treatments, greater preferences for shared decision making, and clearer preferences about end-of-life care. A mini-measure, the DES-3, also performed well psychometrically. In conclusion, DES-10 and DES-3 scores showed evidence of reliability and validity, and these brief patient-reported measures can be used by researchers, clinicians, nonprofits, hospitals, insurers, and policymakers interested in evaluating and improving the quality of cancer care. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
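    For reference, a minimal sketch of the internal-consistency statistic reported above (Cronbach's alpha), computed on hypothetical item scores rather than the DES-10 data.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x k_items) score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Hypothetical 10-item responses from 8 respondents (0-4 scoring assumed).
    rng = np.random.default_rng(2)
    base = rng.integers(0, 5, size=(8, 1))           # respondent-level tendency
    scores = np.clip(base + rng.integers(-1, 2, size=(8, 10)), 0, 4)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```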

  20. The Ultra-Mobile Laser Station: from achieving centimetre-level measurement accuracy to applications in space oceanography and geodesy

    NASA Astrophysics Data System (ADS)

    Nicolas, Joëlle

    2000-12-01

    The Ultra-Mobile Laser Station is the smallest satellite laser ranging station in the world, weighing only 300 kg, dedicated to tracking satellites equipped with laser retroreflectors. It uses a small telescope, 13 cm in diameter, mounted on a motorized mount derived from a precision theodolite, a very compact laser, and an avalanche photodiode allowing detection at the single-photoelectron level. The first experiments (Corsica, late 1996) revealed numerous instabilities in measurement quality. This work concerns the study and implementation of numerous technical modifications in order to reach centimetre-level measurement accuracy and to take part in the orbit validation and altimeter calibration campaign of the JASON-1 oceanographic satellite (2001). The desired instrumental precision was successfully verified in the laboratory. Beyond this instrumental and metrological aspect, an analysis was developed to estimate the accuracy and stability of the mobile station's observations after integration of the modifications. Based on a co-location experiment between the two fixed laser stations on the Calern plateau, the analysis rests on adjusting, per station, coordinates and a mean instrumental bias against a reference orbit of the LAGEOS satellites. Seasonal variations were revealed in the time series of the various components. The local comparison of crustal deformations, expressed as height variations derived from the laser data, showed remarkable consistency with the measurements of the FG5 transportable absolute gravimeter. Signals of the same amplitude were also observed by GPS. These variations are also observed on a global scale, and their geophysical interpretation is given (a combination of solid Earth and polar tide effects …

  1. Real-time remote scientific model validation

    NASA Technical Reports Server (NTRS)

    Frainier, Richard; Groleau, Nicolas

    1994-01-01

    This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment with an investigator's preflight model. The comparison, performed in real time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team is able to react to their experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of twelve preflight hypotheses associated with the investigator's model, seven were confirmed and five were rejected or compromised.

  2. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  3. Methods for characterizing the thermomechanical properties of a martensitic steel

    NASA Astrophysics Data System (ADS)

    Ausseil, Lucas

    The aim of this study is to develop methods for measuring the thermomechanical properties of a martensitic steel during rapid heating. These data feed existing finite element models with experimental measurements. For this purpose, 4340 steel is used. This steel is notably used in gear wheels; it has very attractive mechanical properties, which can be modified through heat treatments. The Gleeble 3800 thermomechanical simulator is used. It makes it possible, in principle, to reproduce all the conditions present in manufacturing processes. With the dilatometry tests carried out in this project, the exact temperatures of the austenitic and martensitic phase transformations are obtained. Tensile tests also made it possible to determine the yield strength of the material in the austenitic domain from 850 °C to 1100 °C. The effect of deformation on the transformation start temperature is shown qualitatively. A numerical simulation is also carried out to understand the phenomena occurring during the tests.

  4. Validation of Model Forecasts of the Ambient Solar Wind

    NASA Technical Reports Server (NTRS)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program on the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation for all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years the MHD-based models will supplant semi-empirical, potential-based models such as the WSA model as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  5. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    PubMed

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between the training and validation cohort for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohort) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
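    A minimal sketch of the cohort differences idea, assuming it amounts to classifying cohort membership from case-mix variables and reading off the AUC; the patient data below are simulated, with a deliberate shift built into the validation cohort.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(3)

    # Hypothetical case-mix features for training (label 0) vs validation (label 1).
    x_train = rng.normal(0.0, 1.0, size=(300, 5))
    x_valid = rng.normal(0.4, 1.2, size=(154, 5))
    X = np.vstack([x_train, x_valid])
    cohort = np.r_[np.zeros(300), np.ones(154)]

    # The "cohort differences model": predict cohort membership from case mix.
    # AUC near 0.5 -> similar cohorts (validation of reproducibility);
    # AUC near 1.0 -> dissimilar cohorts (validation of transferability).
    pred = cross_val_predict(LogisticRegression(max_iter=1000), X, cohort,
                             cv=5, method="predict_proba")[:, 1]
    print("cohort-differences AUC:", roc_auc_score(cohort, pred))
    ```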

  7. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated them in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogenous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  8. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
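    A minimal sketch of repeated nested cross-validation with an inner grid search, on simulated data, with Ridge regression standing in as a placeholder model; the repeat counts and grid are illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

    X, y = make_regression(n_samples=120, n_features=20, noise=10.0, random_state=0)
    param_grid = {"alpha": [0.01, 0.1, 1.0, 10.0]}

    scores = []
    for repeat in range(10):  # repeated nested CV: re-split the data each repeat
        outer = KFold(n_splits=5, shuffle=True, random_state=repeat)
        inner = KFold(n_splits=5, shuffle=True, random_state=100 + repeat)
        search = GridSearchCV(Ridge(), param_grid, cv=inner,
                              scoring="neg_mean_squared_error")
        scores.append(cross_val_score(search, X, y, cv=outer,
                                      scoring="neg_mean_squared_error").mean())

    # The spread across repeats shows how much the assessed error depends on
    # the particular choice of splits, the variance the paper warns about.
    print("mean MSE:", -np.mean(scores), "spread:", np.std(scores))
    ```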

  9. Validation of Magnetospheric Magnetohydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis, applying for the first time several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth, as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause with all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased after the IMF Bz reversal to near zero; (3) For extreme conditions in the solar …

  10. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  11. Validity of empirical models of exposure in asphalt paving

    PubMed Central

    Burstyn, I; Boffetta, P; Burr, G; Cenni, A; Knecht, U; Sciarra, G; Kromhout, H

    2002-01-01

    Aims: To investigate the validity of empirical models of exposure to bitumen fume and benzo(a)pyrene, developed for a historical cohort study of asphalt paving in Western Europe. Methods: Validity was evaluated using data from the USA, Italy, and Germany not used to develop the original models. Correlation between observed and predicted exposures was examined. Bias and precision were estimated. Results: Models were imprecise. Furthermore, predicted bitumen fume exposures tended to be lower (-70%) than concentrations found during paving in the USA. This apparent bias might be attributed to differences between Western European and USA paving practices. Evaluation of the validity of the benzo(a)pyrene exposure model revealed an effect of re-paving similar to that expected and a larger than expected effect of tar use. Overall, benzo(a)pyrene models underestimated exposures by 51%. Conclusions: Possible bias as a result of underestimation of the impact of coal tar on benzo(a)pyrene exposure levels must be explored in sensitivity analysis of the exposure–response relation. Validation of the models, albeit limited, increased our confidence in their applicability to exposure assessment in the historical cohort study of cancer risk among asphalt workers. PMID:12205236
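    A minimal sketch of the bias and correlation checks used in such validations; the paired exposures below are invented numbers chosen to mimic an under-prediction, not the study's measurements.

    ```python
    import numpy as np

    def relative_bias(pred: np.ndarray, obs: np.ndarray) -> float:
        """Relative bias of model predictions vs measurements; a negative value
        means the model under-predicts on average."""
        return (pred.mean() - obs.mean()) / obs.mean()

    # Hypothetical paired exposure concentrations (mg/m^3).
    obs = np.array([1.2, 0.8, 2.5, 1.9, 3.1])
    pred = np.array([0.4, 0.3, 0.8, 0.5, 0.9])
    print(f"relative bias: {relative_bias(pred, obs):+.0%}")
    print(f"correlation:   {np.corrcoef(pred, obs)[0, 1]:.2f}")
    ```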

  12. Study of improving anode quality by modifying the properties of the pitch

    NASA Astrophysics Data System (ADS)

    Bureau, Julie

    The quality of the anodes produced must be good in order to obtain primary aluminium while reducing the production cost of the metal, energy consumption, and environmental emissions. Obtaining the final properties of the anode requires a satisfactory bond between the coke and the pitch. However, current raw materials do not necessarily ensure compatibility between coke and pitch. One of the most promising solutions for improving the cohesion between these two materials is to modify the properties of the pitch. The objective of this work is to modify the properties of the pitch by adding chemical additives in order to improve the wettability of the coke by the modified pitch and thereby produce anodes of better quality. The chemical composition of the pitch is modified using surfactants or surface-modification agents chosen to enrich the functional groups likely to improve wettability. Economic aspects, environmental footprint, and impact on production are considered in the selection of the chemical additives. To carry out this work, the methodology consists first of characterizing the unmodified pitches, the chemical additives, and the cokes by Fourier-transform infrared spectroscopy (FTIR) in order to identify the chemical groups present. The pitches are then modified by adding a chemical additive in order to alter their properties. Different amounts of additive are added in order to examine the effect of varying the concentration on the properties of the modified pitch. The FTIR method is used to evaluate the chemical composition of the modified pitches in order to determine whether increasing the additive concentration enriches the functional groups promoting coke/pitch adhesion. Then, the wettability of the coke by the pitch is observed by the sessile-drop method. An improvement in wettability through modification using …

  13. An Auxiliary Gas Supply to Improve Safety During Aborted Dives with the Canadian Underwater Mine Countermeasures Apparatus (CUMA) (Un Systeme Auxiliaire D’approvisionnement en gaz Augmente la Securite des Plongeurs Utilisant L’appareil Canadien de Deminage Sous-marin (ACDSM) lors des Remontees D’urgence)

    DTIC Science & Technology

    2010-11-01

    Validation experiments were conducted from June 2002 to November 2003, over four series of dives. The data recorded by … Eaton; A.J. Ward; D.J. Woodward; DRDC Toronto TR 2010-081; Defence R&D Canada – Toronto; November 2010. Introduction or background: The … weeks, which took place from June 2002 to November 2003. Doppler monitoring of participants for decompression purposes and continuous analysis of the gases …

  14. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g., ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system, and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. It therefore seems necessary to find appropriate validation techniques for ABMs. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  15. A Formal Approach to Empirical Dynamic Model Optimization and Validation

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G; Morelli, Eugene A.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    A framework was developed for the optimization and validation of empirical dynamic models subject to an arbitrary set of validation criteria. The validation requirements imposed upon the model, which may involve several sets of input-output data and arbitrary specifications in time and frequency domains, are used to determine if model predictions are within admissible error limits. The parameters of the empirical model are estimated by finding the parameter realization for which the smallest of the margins of requirement compliance is as large as possible. The uncertainty in the value of this estimate is characterized by studying the set of model parameters yielding predictions that comply with all the requirements. Strategies are presented for bounding this set, studying its dependence on admissible prediction error set by the analyst, and evaluating the sensitivity of the model predictions to parameter variations. This information is instrumental in characterizing uncertainty models used for evaluating the dynamic model at operating conditions differing from those used for its identification and validation. A practical example based on the short period dynamics of the F-16 is used for illustration.
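    A minimal sketch of the max-min estimation rule described above: choose parameters so that the smallest margin of requirement compliance is as large as possible. The model, data, and single requirement below are invented placeholders, not the F-16 short-period example.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 5.0, 100)
    y_meas = np.exp(-0.5 * t) * np.cos(2.0 * t)  # hypothetical response data
    tol = 0.1                                     # admissible error limit

    def margins(p):
        """One margin per validation requirement; positive means compliant.
        Here a single requirement: max absolute prediction error < tol."""
        damping, freq = p
        y_model = np.exp(-damping * t) * np.cos(freq * t)
        return np.array([tol - np.max(np.abs(y_model - y_meas))])

    # Maximize the smallest margin = minimize its negative (Nelder-Mead copes
    # with the nonsmooth max-min objective).
    res = minimize(lambda p: -margins(p).min(), x0=[0.3, 1.5], method="Nelder-Mead")
    print("estimate:", res.x, "worst-case margin:", margins(res.x)[0])
    ```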

  16. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere, and of their updates and upgrades, grows steadily, as does the number and character of the model inputs. Maintaining up-to-date validation of these models, in the face of this constant model evolution, is a necessary but very labor-intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME-related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results for L1 wind representation. I will discuss the semi-automated, web-based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and to support extensions of them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time-dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  17. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  18. Validation of the Poisson Stochastic Radiative Transfer Model

    NASA Technical Reports Server (NTRS)

    Zhuravleva, Tatiana; Marshak, Alexander

    2004-01-01

    A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - the cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud-top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

  19. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to create a practical tool providing model users with a structured view into the validation status of HE decision models, to address this trade-off. A Delphi panel was organized, and was completed by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation in the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  20. A detached eddy simulation model for the study of lateral separation zones along a large canyon-bound river

    NASA Astrophysics Data System (ADS)

    Alvarez, Laura V.; Schmeeckle, Mark W.; Grams, Paul E.

    2017-01-01

    Lateral flow separation occurs in rivers where banks exhibit strong curvature. In canyon-bound rivers, lateral recirculation zones are the principal storage of fine-sediment deposits. A parallelized, three-dimensional, turbulence-resolving model was developed to study the flow structures along lateral separation zones located in two pools along the Colorado River in Marble Canyon. The model employs the detached eddy simulation (DES) technique, which resolves turbulence structures larger than the grid spacing in the interior of the flow. The DES-3D model is validated using Acoustic Doppler Current Profiler flow measurements taken during the 2008 controlled flood release from Glen Canyon Dam. A point-to-point validation using a number of skill metrics, often employed in hydrological research, is proposed here for fluvial modeling. The validation results show the predictive capabilities of the DES model. The model reproduces the pattern and magnitude of the velocity in the lateral recirculation zone, including the size and position of the primary and secondary eddy cells and the return current. The lateral recirculation zone is open, with continuous import of fluid upstream of the point of reattachment and export by the recirculation return current downstream of the point of separation. Differences in magnitude and direction between near-bed and near-surface velocity vectors are found, resulting in an inward vertical spiral. The interaction between the recirculation return current and the main flow is dynamic, with large temporal changes in flow direction and magnitude. Turbulence structures with a predominantly vertical axis of vorticity are observed in the shear layer, becoming three-dimensional with no preferred orientation downstream.
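
    The paper's exact list of skill metrics is not reproduced in this abstract, but a plausible subset commonly used in hydrological validation is easy to sketch for point-to-point comparison of measured and simulated velocities:

        import numpy as np

        def skill_metrics(obs, sim):
            """Point-to-point skill metrics often used in hydrology
            (a plausible subset; the paper's exact list may differ)."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            err = sim - obs
            rmse = np.sqrt(np.mean(err**2))   # root-mean-square error
            bias = np.mean(err)               # mean bias
            nse = 1 - np.sum(err**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe
            r = np.corrcoef(obs, sim)[0, 1]   # linear correlation
            return {"RMSE": rmse, "bias": bias, "NSE": nse, "r": r}

        # Example: modeled vs. ADCP-measured velocity magnitudes (m/s).
        obs = [0.42, 0.55, 0.31, 0.60, 0.48]
        sim = [0.45, 0.50, 0.35, 0.58, 0.52]
        print(skill_metrics(obs, sim))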

  1. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random-effects meta-analysis methods. I2 statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results: Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I2 statistics and prediction intervals for c-statistics. Conclusion: This study illustrates how the performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
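
    As a rough sketch of the meta-analytic step, the snippet below pools hospital-specific c-statistics with DerSimonian-Laird random-effects weights and reports the I2 statistic; the values and variances are illustrative, not the study's data.

        import numpy as np

        def pool_random_effects(theta, var):
            """DerSimonian-Laird random-effects pooling of per-hospital
            estimates (e.g., c-statistics) with their variances, plus I^2."""
            theta, var = np.asarray(theta, float), np.asarray(var, float)
            w = 1.0 / var                              # fixed-effect weights
            theta_fe = np.sum(w * theta) / np.sum(w)
            Q = np.sum(w * (theta - theta_fe) ** 2)    # Cochran's Q
            k = len(theta)
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (Q - (k - 1)) / c)         # between-hospital variance
            w_re = 1.0 / (var + tau2)
            pooled = np.sum(w_re * theta) / np.sum(w_re)
            i2 = max(0.0, (Q - (k - 1)) / Q) * 100 if Q > 0 else 0.0
            return pooled, tau2, i2

        cstats = [0.74, 0.77, 0.73, 0.76, 0.75]
        variances = [0.0004, 0.0005, 0.0006, 0.0004, 0.0005]
        print(pool_random_effects(cstats, variances))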

  2. Quantification of antimalarial commodity needs: a contribution to updating the assumptions used to quantify supplies for the management of severe malaria cases in the Democratic Republic of the Congo

    PubMed Central

    Likwela, Joris Losimba; Otokoye, John Otshudiema

    2015-01-01

    Severe Plasmodium falciparum malaria is a major cause of death among children under 5 years of age in sub-Saharan Africa. Prompt treatment depends on the availability of appropriate medicines at the points of service delivery. The frequency of stock-outs of antimalarial commodities, particularly those used for severe malaria, made it necessary to update the quantification assumptions. Routine data collected by the national malaria control programme (PNLP) from 2007 to 2012 were compared with data reported by other African countries and used to guide discussions during a workshop organized by the PNLP and its technical and financial partners in order to reach a national consensus. The proportion of malaria cases reported as severe in the DRC has remained around a median of 7%, with a range of 6 to 9%. Apart from the proportion reported in Kenya (2%), African countries have reported proportions of severe cases varying between 5 and 7%. It appears that the proportion of 1% previously used for quantification in the DRC was an underestimate in the context of managing severe cases in the field. A consensus emerged around a proportion of 5%, on the understanding that capacity-building efforts would be undertaken to improve diagnosis at the points of service delivery. PMID:26213595

  3. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
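
    The calibration phase reduces, in its simplest form, to a Bayesian update of a parameter prior by the likelihood of calibration data. The one-parameter grid sketch below is only meant to make that step concrete; the prior, observation and noise level are invented, and the real framework operates on far richer models.

        import numpy as np

        def grid_posterior(theta, prior, likelihood):
            """Bayesian update on a 1-D parameter grid:
            posterior proportional to prior times likelihood."""
            post = prior * likelihood
            return post / np.trapz(post, theta)

        theta = np.linspace(0.0, 2.0, 400)                  # material parameter (toy)
        prior = np.exp(-0.5 * ((theta - 1.2) / 0.4) ** 2)   # micro-scale-informed prior
        obs, sigma = 0.95, 0.1                              # calibration observation
        likelihood = np.exp(-0.5 * ((theta - obs) / sigma) ** 2)
        posterior = grid_posterior(theta, prior, likelihood)
        print(theta[np.argmax(posterior)])                  # posterior mode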

  4. Validation of urban freeway models. [supporting datasets]

    DOT National Transportation Integrated Search

    2015-01-01

    The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...

  5. Developing rural palliative care: validating a conceptual model.

    PubMed

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  6. A method for estimating the settlement of fine-grained soils beneath railway embankments for high-speed lines

    NASA Astrophysics Data System (ADS)

    Said Alami, Soukaina; Reiffsteck, Philippe; Cuira, Fahd

    2018-02-01

    The need to control the deformation of the soils beneath embankments intended to support high-speed rail structures makes settlement a major issue for these projects. However, numerous difficulties arise, mainly related to the large number of uncertainties surrounding the phenomenon. Indeed, the geological and geotechnical characterization depends on borings and tests in which the risk of sample disturbance is significant and whose interpretation is often delicate. This article presents a calculation procedure that, in practice, allows the settlement under an embankment to be evaluated with good accuracy and that has been validated on a number of structures. To this end, corrections are introduced into the usual calculation methods, making it possible to approach the values measured on site.

  7. Multiple Versus Single Set Validation of Multivariate Models to Avoid Mistakes.

    PubMed

    Harrington, Peter de Boves

    2018-01-02

    Validation of multivariate models is of current importance for a wide range of chemical applications. Although important, it is neglected. The common practice is to use a single external validation set for evaluation. This approach is deficient and may mislead investigators with results that are specific to the single validation set of data. In addition, no statistics are available regarding the precision of a derived figure of merit (FOM). A statistical approach using bootstrapped Latin partitions is advocated. This validation method makes efficient use of the data because each object is used once for validation. The method was reviewed a decade earlier, but primarily for the optimization of chemometric models; this review presents the reasons it should be used for generalized statistical validation. Average FOMs with confidence intervals are reported, and powerful matched-sample statistics may be applied for comparing models and methods. Examples demonstrate the problems with single validation sets.
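
    A minimal sketch of the idea follows: each bootstrap re-randomizes a partition of the objects into disjoint validation sets, every object is predicted exactly once per bootstrap, and the spread of the figure of merit across bootstraps yields a confidence interval. True Latin partitions also stratify the splits so that each partition mirrors the response distribution, which this simplified version omits.

        import numpy as np

        def bootstrapped_latin_partitions(X, y, fit, fom, n_boot=10, n_part=4, seed=0):
            """Each bootstrap splits the data into n_part disjoint validation
            sets (every object validated exactly once per bootstrap); the
            FOM distribution across bootstraps gives a confidence interval."""
            rng = np.random.default_rng(seed)
            n = len(y)
            foms = []
            for _ in range(n_boot):
                idx = rng.permutation(n)
                parts = np.array_split(idx, n_part)
                preds = np.empty(n)
                for p in parts:
                    train = np.setdiff1d(idx, p)
                    model = fit(X[train], y[train])
                    preds[p] = model(X[p])
                foms.append(fom(y, preds))
            foms = np.asarray(foms)
            half_width = 1.96 * foms.std(ddof=1) / np.sqrt(n_boot)
            return foms.mean(), half_width  # average FOM, ~95% CI half-width

        # Toy usage with a least-squares line as the "model".
        X = np.random.default_rng(1).normal(size=(60, 3))
        y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * np.random.default_rng(2).normal(size=60)
        fit = lambda Xt, yt: (lambda Xv, b=np.linalg.lstsq(Xt, yt, rcond=None)[0]: Xv @ b)
        rmse = lambda yt, yp: float(np.sqrt(np.mean((yt - yp) ** 2)))
        print(bootstrapped_latin_partitions(X, y, fit, rmse))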

  8. Outward Bound Outcome Model Validation and Multilevel Modeling

    ERIC Educational Resources Information Center

    Luo, Yuan-Chun

    2011-01-01

    This study was intended to measure construct validity for the Outward Bound Outcomes Instrument (OBOI) and to predict outcome achievement from individual characteristics and course attributes using multilevel modeling. A sample of 2,340 participants was collected by Outward Bound USA between May and September 2009 using the OBOI. Two phases of…

  9. Validation of Community Models: Identifying Events in Space Weather Model Timelines

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    I develop and document a set of procedures which test the quality of predictions of solar wind speed and polarity of the interplanetary magnetic field (IMF) made by coupled models of the ambient solar corona and heliosphere. The Wang-Sheeley-Arge (WSA) model is used to illustrate the application of these validation procedures. I present an algorithm which detects transitions of the solar wind from slow to high speed. I also present an algorithm which processes the measured polarity of the outward directed component of the IMF. This removes high-frequency variations to expose the longer-scale changes that reflect IMF sector changes. I apply these algorithms to WSA model predictions made using a small set of photospheric synoptic magnetograms obtained by the Global Oscillation Network Group as input to the model. The results of this preliminary validation of the WSA model (version 1.6) are summarized.
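
    The two algorithms can be caricatured in a few lines: a median filter to suppress high-frequency polarity flips, and a hysteresis detector for slow-to-fast wind transitions. The window and thresholds below are illustrative assumptions, not the values used in the paper.

        import numpy as np

        def smooth_polarity(br, window=25):
            """Median-filter the IMF radial-field sign to suppress
            high-frequency flips and expose sector-boundary crossings."""
            sign = np.sign(br)
            half = window // 2
            padded = np.pad(sign, half, mode="edge")
            return np.array([np.median(padded[i:i + window]) for i in range(len(sign))])

        def detect_speed_transitions(v, v_slow=400.0, v_fast=500.0):
            """Flag slow-to-fast transitions: indices where the speed rises
            through v_fast after having been below v_slow (hysteresis)."""
            events, armed = [], False
            for i, vi in enumerate(v):
                if vi < v_slow:
                    armed = True          # in slow wind; arm the detector
                elif vi > v_fast and armed:
                    events.append(i)      # high-speed stream onset
                    armed = False
            return events

        t = np.arange(200)
        v = 380 + 150 * np.exp(-((t - 120) / 15.0) ** 2)  # synthetic stream
        print(detect_speed_transitions(v))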

  10. Career concerns among occupational physicians of the occupational medicine groups in Tunisia

    PubMed Central

    Merchaoui, Irtyah; Chouchène, Asma; Bouanène, Ines; Chaari, Néila; Zrafi, Wassim; Henchi, Adnène; Akrout, Mohamed; Amri, Charfeddine

    2017-01-01

    Introduction: Career dissatisfaction among occupational physicians can affect their performance and the quality of the services they provide. The objective of our study was to assess the job satisfaction of field occupational physicians across all of the occupational medicine groups (GMTs) of Tunisia and to identify its determinants. Methods: This was a national cross-sectional study of the occupational physicians of 22 GMTs, based on the validated SAPHORA JOB questionnaire. Results: 58% of the GMT occupational physicians were dissatisfied with their careers. Career satisfaction was statistically influenced by the number of companies in their charge (p=0.016), work organization (p=0.010), their feelings about the profession (p=0.011), salary (p<10^-3) and information on the regulations in force (p=0.047). Conclusion: Harmonization of the salary scales and career grades of GMT occupational physicians, based on a revision of the legislative texts, is indicated. Improving work organization and working conditions can foster fulfilment at work and better services. PMID:28819472

  11. Validating Computational Human Behavior Models: Consistency and Accuracy Issues

    DTIC Science & Technology

    2004-06-01

    includes a discussion of SME demographics, content, and organization of the datasets. This research generalizes data from two pilot studies and two base... meet requirements for validating the varied and complex behavioral models. Through a series of empirical studies, this research identifies subject...

  12. Adolescent Personality: A Five-Factor Model Construct Validation

    ERIC Educational Resources Information Center

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.

    2004-01-01

    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  13. The Dutch Linguistic Intraoperative Protocol: a valid linguistic approach to awake brain surgery.

    PubMed

    De Witte, E; Satoer, D; Robert, E; Colle, H; Verheyen, S; Visch-Brink, E; Mariën, P

    2015-01-01

    Intraoperative direct electrical stimulation (DES) is increasingly used in patients operated on for tumours in eloquent areas. Although a positive impact of DES on postoperative linguistic outcome is generally advocated, information about the neurolinguistic methods applied in awake surgery is scarce. We developed for the first time a standardised Dutch linguistic test battery (measuring phonology, semantics, syntax) to reliably identify the critical language zones in detail. A normative study was carried out in a control group of 250 native Dutch-speaking healthy adults. In addition, the clinical application of the Dutch Linguistic Intraoperative Protocol (DuLIP) was demonstrated by means of anatomo-functional models and five case studies. A set of DuLIP tests was selected for each patient depending on the tumour location and degree of linguistic impairment. DuLIP is a valid test battery for pre-, intraoperative and postoperative language testing and facilitates intraoperative mapping of eloquent language regions that are variably located. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. A Model for Estimating the Reliability and Validity of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Edmonston, Leon P.; Randall, Robert S.

    A decision model designed to determine the reliability and validity of criterion referenced measures (CRMs) is presented. General procedures which pertain to the model are discussed as to: Measures of relationship, Reliability, Validity (content, criterion-oriented, and construct validation), and Item Analysis. The decision model is presented in…

  15. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    NASA Astrophysics Data System (ADS)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for Quality of e-Government Services (QeGS). We build upon our previous work, in which a conceptual model was identified, and focus on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which was benchmarked with very positive results against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This forms the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  16. Modelling of enterobacterial loads to the Baie des Veys (Normandy, France).

    PubMed

    Lafforgue, Michel; Gerard, Laure; Vieillard, Celine; Breton, Marguerite

    2018-06-01

    The Baie des Veys (Normandy, France) has abundant stocks of shellfish (oyster and cockle farms). Water quality in the bay is affected by pollutant inputs from a 3500 km² watershed, notably occasional episodes of contamination by faecal coliforms. In order to characterise enterobacterial loads and develop a plan of action to improve the quality of seawater and shellfish in the bay, a two-stage modelling procedure was adopted. This focused on Escherichia coli and included a catchment model describing the E. coli releases and the transport and die-off of the bacteria up to the coast. The output from this model then served as input for a marine model used to determine the concentration of E. coli in seawater. A total of 60 scenarios were tested, including different wind, tidal, rainfall and temperature conditions and accidental pollution events, for both current situations and future scenarios. The modelling results highlighted the impact of rainfall on E. coli loadings to the sea, as well as the effects of sluice gates and tidal cycles, which dictated the use of an hourly timescale for the modelling process. The coupled models also made it possible to identify the origin of the enterobacteria found in shellfish harvesting areas, both in terms of the contributing watercourses and the sources of contamination of those watercourses. The tool can accordingly be used to optimise remedial action. Copyright © 2018 Elsevier GmbH. All rights reserved.
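
    The die-off component of such catchment-to-coast routing is typically a first-order decay with a temperature correction. The sketch below uses a Mancini-type adjustment with assumed coefficients, not the calibrated values of the study:

        import numpy as np

        def route_with_dieoff(load_upstream, travel_time_h, temp_c):
            """First-order die-off of E. coli during transport to the coast.
            The decay rate follows a common temperature correction
            (Mancini-type), with illustrative coefficients."""
            k20 = 0.8                          # per day at 20 C (assumed)
            k = k20 * 1.07 ** (temp_c - 20.0)  # temperature-adjusted rate
            return load_upstream * np.exp(-k * travel_time_h / 24.0)

        # Hourly loads (cfu/h) from a sub-catchment, 12 h of travel at 15 C:
        print(route_with_dieoff(np.array([1e9, 5e9, 2e9]), 12.0, 15.0))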

  17. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
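
    At its core, a hydrostatic column calculation relates wellhead pressure to the cavern pressure minus the head of the stacked fluid column. The sketch below shows that balance for an assumed nitrogen/oil/brine column; it ignores gas compressibility and temperature gradients, which a production-grade HCM would treat.

        G = 9.81  # m/s^2

        def wellhead_pressure(p_cavern_pa, layers):
            """Wellhead pressure from the cavern pressure minus the
            hydrostatic head of the fluid column. `layers` is a list of
            (density_kg_m3, height_m) tuples ordered from wellhead down
            to the cavern (e.g., nitrogen, crude oil, brine)."""
            head = sum(rho * G * h for rho, h in layers)
            return p_cavern_pa - head

        # Illustrative column: 300 m nitrogen over 400 m oil over 100 m brine.
        layers = [(90.0, 300.0), (850.0, 400.0), (1200.0, 100.0)]
        print(wellhead_pressure(15e6, layers) / 1e6, "MPa")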

  18. Longitudinal Models of Reliability and Validity: A Latent Curve Approach.

    ERIC Educational Resources Information Center

    Tisak, John; Tisak, Marie S.

    1996-01-01

    Dynamic generalizations of reliability and validity that will incorporate longitudinal or developmental models, using latent curve analysis, are discussed. A latent curve model formulated to depict change is incorporated into the classical definitions of reliability and validity. The approach is illustrated with sociological and psychological…

  19. Validation of Competences and Professionalisation of Teachers and Trainers = Validation des Acquis et Professionnalisation des Enseignants et Formateurs. CEDEFOP Dossier Series.

    ERIC Educational Resources Information Center

    de Blignieres-Legeraud, Anne; Bjornavold, Jens; Charraud, Anne-Marie; Gerard, Francoise; Diamanti, Stamatina; Freundlinger, Alfred; Bjerknes, Ellen; Covita, Horacio

    A workshop aimed to clarify under what conditions the validation of knowledge gained through experience can be considered a professionalizing factor for European Union teachers and trainers by creating a better link between experience and training and between vocational training and qualifications. Seven papers were presented in addition to an…

  20. Development and validation of a mass casualty conceptual model.

    PubMed

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
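
    The stopping rules can be expressed compactly. In the sketch below, the 70% agreement is taken as the fraction of ratings of 5-7 on the 7-point scale; that band is an assumption, since the abstract does not define it.

        import numpy as np

        def consensus_reached(curr, prev=None):
            """Check the study's stopping criteria for one Likert item:
            IQR <= 1 scale point, <15% shift in the response distribution
            between rounds, and >=70% agreement (assumed here: ratings 5-7)."""
            curr = np.asarray(curr)
            q1, q3 = np.percentile(curr, [25, 75])
            iqr_ok = (q3 - q1) <= 1
            agree_ok = np.mean(curr >= 5) >= 0.70
            stable_ok = True
            if prev is not None:
                prev = np.asarray(prev)
                hist = lambda a: np.bincount(a, minlength=8)[1:] / len(a)
                stable_ok = np.abs(hist(curr) - hist(prev)).max() < 0.15
            return iqr_ok and agree_ok and stable_ok

        round1 = [6, 6, 7, 5, 6, 5, 7, 6, 6, 5]
        round2 = [6, 6, 7, 6, 6, 5, 7, 6, 6, 6]
        print(consensus_reached(round2, round1))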

  1. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increase geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  2. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility, at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight test time and increase flight safety. The success of ASE models is determined by their ability to take into account varying flight conditions and the possibility of performing flight monitoring in the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control-relevant robust identification and model validation of aeroservoelastic structures. The closed-loop robust model identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop relevant way.

  3. Instrumental Response Model and Detrending for the Dark Energy Camera

    DOE PAGES

    Bernstein, G. M.; Abbott, T. M. C.; Desai, S.; ...

    2017-09-14

    We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the "brighter-fatter" nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within $$\approx 2$$ mmag and $$\approx 3$$ mas, respectively, of fundamental atmospheric and statistical limits. In conclusion, the DES techniques should be broadly applicable to wide-field imagers.
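
    Of the pipeline steps listed, the principal-components sky subtraction is the easiest to caricature: project each flattened exposure onto a basis of sky principal components (built elsewhere from many science exposures) and subtract the best-fit combination. The sketch below omits the source masking and fringe handling of the real pipeline.

        import numpy as np

        def pca_sky_subtract(image_vec, sky_templates):
            """Remove the least-squares combination of sky principal
            components from a flattened exposure (source masking omitted)."""
            coeffs, *_ = np.linalg.lstsq(sky_templates.T, image_vec, rcond=None)
            return image_vec - sky_templates.T @ coeffs

        rng = np.random.default_rng(0)
        templates = rng.normal(size=(3, 1000))            # 3 sky PCs, 1000 pixels
        sky = 2.0 * templates[0] - 0.5 * templates[2]
        image = sky + rng.normal(scale=0.01, size=1000)   # sky + faint noise
        residual = pca_sky_subtract(image, templates)
        print(np.std(residual))                           # close to the noise level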

  4. Instrumental Response Model and Detrending for the Dark Energy Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, G. M.; Abbott, T. M. C.; Desai, S.

    We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the "brighter-fatter" nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within $$\approx 2$$ mmag and $$\approx 3$$ mas, respectively, of fundamental atmospheric and statistical limits. In conclusion, the DES techniques should be broadly applicable to wide-field imagers.

  5. The desA and desB genes from Clostridium scindens ATCC 35704 encode steroid-17,20-desmolase.

    PubMed

    Devendran, Saravanan; Mythen, Sean M; Ridlon, Jason M

    2018-06-01

    Clostridium scindens is a gut microbe capable of removing the side-chain of cortisol, forming 11β-hydroxyandrostenedione. A cortisol-inducible operon (desABCD) was previously identified in C. scindens ATCC 35704 by RNA-Seq. The desC gene was shown to encode a cortisol 20α-hydroxysteroid dehydrogenase (20α-HSDH). The desD gene encodes a protein annotated as a member of the major facilitator family, predicted to function as a cortisol transporter. The desA and desB genes are annotated as N-terminal and C-terminal transketolases, respectively. We hypothesized that DesAB forms a complex and has steroid-17,20-desmolase activity. We cloned the desA and desB genes from C. scindens ATCC 35704 in pETDuet for overexpression in Escherichia coli. The purified recombinant DesAB was determined to be a 142 ± 5.4 kDa heterotetramer. We developed an enzyme-linked continuous spectrophotometric assay to quantify steroid-17,20-desmolase. This was achieved by coupling DesAB-dependent formation of 11β-hydroxyandrostenedione with the NADPH-dependent reduction of the steroid 17-keto group by a recombinant 17β-HSDH from the filamentous fungus Cochliobolus lunatus. The pH optimum for the coupled assay was 7.0, and the kinetic constants using cortisol as substrate were Km of 4.96 ± 0.57 µM and kcat of 0.87 ± 0.076 min⁻¹. Substrate-specificity studies revealed that rDesAB recognized substrates regardless of 11β-hydroxylation, but had an absolute requirement for 17,21-dihydroxy-20-ketosteroids. Copyright © 2018 Devendran et al.
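
    With the reported constants, the coupled-assay rate follows directly from Michaelis-Menten kinetics; the enzyme and substrate concentrations below are arbitrary examples, not values from the paper.

        def desab_rate(s_um, e_um, km=4.96, kcat=0.87):
            """Michaelis-Menten rate for DesAB side-chain cleavage using the
            reported constants (Km in uM, kcat in 1/min); returns uM/min."""
            return kcat * e_um * s_um / (km + s_um)

        # Rate at 10 uM cortisol with 0.1 uM enzyme:
        print(desab_rate(10.0, 0.1))  # ~0.058 uM/min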

  6. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  7. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  8. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scaleable solar sails for future space science missions. There are six validation objectives planned for the ST9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  9. Modeling and validating the cost and clinical pathway of colorectal cancer.

    PubMed

    Joranger, Paal; Nesbakken, Arild; Hoff, Geir; Sorbye, Halfdan; Oshaug, Arne; Aas, Eline

    2015-02-01

    Cancer is a major cause of morbidity and mortality, and colorectal cancer (CRC) is the third most common cancer in the world. The estimated costs of CRC treatment vary considerably, and if CRC costs in a model are based on empirically estimated total costs of stage I, II, III, or IV treatments, then they lack some flexibility to capture future changes in CRC treatment. The purpose was 1) to describe how to model CRC costs and survival and 2) to validate the model in a transparent and reproducible way. We applied a semi-Markov model with 70 health states that tracked age and time since specific health states (using tunnels and a 3-dimensional data matrix). The model parameters are based on an observational study at Oslo University Hospital (2049 CRC patients), the National Patient Register, literature, and expert opinion. The target population was patients diagnosed with CRC. The model followed patients diagnosed with CRC from the age of 70 until death or 100 years of age. The study took the perspective of health care payers. The model was validated for face validity, internal and external validity, and cross-validity. The validation showed a satisfactory match with other models and empirical estimates for both cost and survival time, without any preceding calibration of the model. The model can be used to 1) address a range of CRC-related themes (general model) such as survival and evaluation of the costs of treatment and prevention measures; 2) make predictions from intermediate to final outcomes; 3) estimate changes in resource use and costs due to changing guidelines; and 4) adjust for future changes in treatment and trends over time. The model is adaptable to other populations. © The Author(s) 2014.
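
    The tunnel-state device mentioned above is worth a concrete illustration: time-since-event dependence is encoded by chaining states that can only be entered in sequence. The toy 4-state cohort below uses invented transition probabilities, not the study's 70-state structure.

        import numpy as np

        def run_cohort(P, start, cycles):
            """Propagate a cohort through a Markov model. Tunnel states are
            ordinary states reachable only in sequence, which is how
            time-since-event dependence enters a (semi-)Markov cohort model."""
            trace = [start]
            for _ in range(cycles):
                trace.append(trace[-1] @ P)
            return np.array(trace)

        # Toy example: Well -> Sick(year 1) -> Sick(year 2+) -> Dead, with a
        # higher mortality in the year-1 tunnel state (assumed numbers).
        P = np.array([
            [0.90, 0.08, 0.00, 0.02],   # Well
            [0.00, 0.00, 0.85, 0.15],   # Sick, first year (tunnel state)
            [0.00, 0.00, 0.92, 0.08],   # Sick, later years
            [0.00, 0.00, 0.00, 1.00],   # Dead (absorbing)
        ])
        start = np.array([1.0, 0.0, 0.0, 0.0])
        trace = run_cohort(P, start, 30)
        print(trace[-1])                # state occupancy after 30 cycles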

  10. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    2012-01-01

    Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

  11. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
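
    The three stages map naturally onto a small explicit time-stepping loop: convective cooling of the liquid, latent-heat-limited solidification tracked by a progress variable, then cooling of the solid. All coefficients below are illustrative, not the cocoa-butter values of the study, and the nucleation and crystal-growth kinetics are collapsed into the progress variable.

        import numpy as np

        def droplet_history(T0, T_freeze, T_gas, h, area, m, cp, L, dt, t_end):
            """Explicit-Euler sketch of the three-stage freezing model:
            (1) convective cooling of the liquid droplet, (2) solidification
            at the freezing temperature with latent heat balancing convection,
            tracked by a progress variable f in [0, 1], (3) solid cooling."""
            T, f, out = T0, 0.0, []
            for t in np.arange(0.0, t_end, dt):
                q = h * area * (T - T_gas)              # convective heat loss (W)
                if T > T_freeze:                        # stage 1: liquid cooling
                    T = max(T - q / (m * cp) * dt, T_freeze)
                elif f < 1.0:                           # stage 2: solidification
                    f = min(1.0, f + q / (m * L) * dt)  # latent heat release
                else:                                   # stage 3: solid cooling
                    T -= q / (m * cp) * dt
                out.append((t, T, f))
            return out

        hist = droplet_history(T0=45.0, T_freeze=28.0, T_gas=5.0, h=100.0,
                               area=2.8e-6, m=5e-7, cp=2000.0, L=1.5e5,
                               dt=0.001, t_end=5.0)
        print(hist[-1])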

  12. Validation of recent geopotential models in Tierra Del Fuego

    NASA Astrophysics Data System (ADS)

    Gomez, Maria Eugenia; Perdomo, Raul; Del Cogliano, Daniel

    2017-10-01

    This work presents a validation study of global geopotential models (GGM) in the region of Fagnano Lake, located in the southern Andes. This is an excellent area for this type of validation because it is surrounded by the Andes Mountains, and there is no terrestrial gravity or GNSS/levelling data. However, there are mean lake level (MLL) observations, and its surface is assumed to be almost equipotential. Furthermore, in this article, we propose improved geoid solutions through the Residual Terrain Modelling (RTM) approach. Using a global geopotential model, the results achieved allow us to conclude that it is possible to use this technique to extend an existing geoid model to those regions that lack any information (neither gravimetric nor GNSS/levelling observations). As GGMs have evolved, our results have improved progressively. While the validation of EGM2008 with MLL data shows a standard deviation of 35 cm, GOCO05C shows a deviation of 13 cm, similar to the results obtained on land.
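
    The validation statistic itself is simple: if the lake surface is equipotential, geoid height minus ellipsoidal lake-surface height should be constant along the lake, so the standard deviation of that residual measures model error. A sketch with invented numbers:

        import numpy as np

        def validate_against_lake(geoid_heights, lake_level_heights):
            """Standard deviation of geoid-minus-lake-surface residuals
            along an (assumed) equipotential lake surface."""
            res = np.asarray(geoid_heights) - np.asarray(lake_level_heights)
            return res.std(ddof=1)

        # Illustrative numbers only (metres):
        N = [14.82, 14.91, 14.77, 14.95, 14.88]   # GGM geoid heights at 5 points
        h = [20.10, 20.22, 20.05, 20.31, 20.12]   # ellipsoidal lake-surface heights
        print(validate_against_lake(N, h))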

  13. Effects of secondary electrons on DNA

    NASA Astrophysics Data System (ADS)

    Boudaiffa, Badia

    Interactions of low-energy electrons (LEEs) are an important element of radiation science, particularly in the sequence of events occurring immediately after the interaction of ionizing radiation with a biological medium. It is well known that when such radiation deposits its energy in the cell, it produces a large number of secondary electrons (4 x 10^4/MeV), which are created along the track with initial kinetic energies well below 20 eV. However, there had never been direct measurements demonstrating the interaction of these very low energy electrons with DNA, mainly because of the experimental difficulties imposed by the complexity of the biological medium. In our laboratory, recent years have been devoted to the study of the fundamental phenomena induced by the impact of LEEs on various simple molecules (e.g., N2, CO, O2, H2O, NO, C2H4, C6H6, C2H12) and on a few complex molecules in their solid phase. Other work carried out recently on DNA bases and oligonucleotides has shown that LEEs produce molecular breaks in biomolecules. This work allowed us to develop techniques to reveal and understand the fundamental interactions of LEEs with molecules of biological interest, with the major objective of studying the direct effect of these particles on the DNA molecule. The surface-science techniques developed and used in the studies cited above can be extended and combined with classical biology methods to study DNA damage induced by LEE impact. Our experiments have shown the effectiveness of 3-20 eV electrons in inducing single- and double-strand breaks in DNA. For energies below 15 eV, these breaks are induced by the temporary localization of an electron on a molecular unit of DNA, which leads to the formation of a transient negative ion

  14. Objective measurement of the attenuation and occlusion effect of hearing protectors using multiple auditory steady-state responses

    NASA Astrophysics Data System (ADS)

    Valentin, Olivier

    According to the World Health Organization, the number of workers exposed daily to noise levels harmful to their hearing rose from 120 million in 1995 to 250 million in 2004. Even though noise reduction at the source should always be preferred, the solution widely used against occupational noise remains individual hearing protection. Unfortunately, workers do not always wear their hearing protectors, because it is difficult to provide a protector whose effective attenuation level is appropriate to an individual's work environment. In addition, occlusion of the ear canal alters the perception of speech, creating a discomfort that prompts workers to remove their protectors. Both problems exist because the current methods for measuring the occlusion effect and the attenuation are limited. Objective measurements based on intra-aural microphone measurements do not account for the direct transmission of sound to the cochlea by bone conduction. Subjective measurements at the hearing threshold are biased by the low-frequency masking effect induced by physiological noise. The main objective of this doctoral work is to improve the measurement of the attenuation and of the occlusion effect of intra-aural hearing protectors. The general approach consists of: (i) verifying whether it is possible to measure the attenuation of hearing protectors by recording multiple auditory steady-state responses (ASSRs) with and without a hearing protector (protocol 1), (ii) adapting this methodology to measure the occlusion effect induced by wearing intra-aural hearing protectors (protocol 2), and (iii) validating each protocol through measurements on human subjects. The results of protocol 1 demonstrate that ASSRs can be used to

  15. Manufacturing of conical parts for aerospace applications by flexible injection

    NASA Astrophysics Data System (ADS)

    Shebib Loiselle, Vincent

    Composite materials have been present in the nozzles of space motors since the 1960s. Today, the advent of three-dimensional fabrics provides an innovative solution to the delamination problem that used to limit the mechanical properties of these composites. The use of these fabrics, however, requires the design of better-adapted manufacturing processes. A new method for manufacturing composite parts for aerospace applications was studied throughout this work. It applies the principles of flexible injection (the Polyflex process) to the manufacture of thick conical parts. The validation part to be manufactured represents a scale model of a space-motor nozzle component. It is composed of a three-dimensional carbon-fibre reinforcement and a phenolic resin. The success of the project is defined by several criteria concerning the compaction and wrinkling of the reinforcement and the formation of porosities in the part produced. A large number of steps were required before the two validation parts could be manufactured. First, to meet the criterion on reinforcement compaction, a characterization tool was designed. The compaction study was carried out to obtain the information needed to understand the deformation of an axisymmetric 3D reinforcement. Next, the injection principle for the part was defined for this new process. To validate the proposed concepts, the permeability of the fibrous reinforcement and the viscosity of the resin had to be characterized. Using these data, a series of simulations of the flow during injection of the part were performed and an approximation of the filling time was calculated. After this step, the design of the nozzle mould was undertaken, supported by a mechanical simulation of its resistance to the manufacturing conditions. Likewise, several tools required for the manufacturing

  16. Use of very high spatial resolution airborne imagery for estimating the vigour of forest stands in northwestern New Brunswick

    NASA Astrophysics Data System (ADS)

    Louis, Ognel Pierre

    The goal of this study is to develop a tool for estimating the risk of vigour loss in the forest stands of the Gounamitz region in northwestern New Brunswick using forest inventory data and remote sensing data. To this end, a 100 m x 100 m marteloscope and 20 sampling plots were delineated. Within these, the risk of vigour loss was determined for the trees with a DBH greater than or equal to 9 cm. To characterize the risk of vigour loss, the spatial positions of the trees were recorded with a GPS, taking into account defects of the stems. To carry out this work, the vegetation and texture indices and the spectral bands of the airborne image were extracted and treated as independent variables. The risk of vigour loss obtained per tree species from the forest inventories was treated as the dependent variable. To obtain the area of the forest stands of the study region, a supervised classification of the images was performed with the maximum likelihood algorithm. The risk of vigour loss per tree type was then estimated using neural networks, specifically a multilayer perceptron: a network composed of 11 neurons in the input layer, corresponding to the independent variables, 35 neurons in the hidden layer and 4 neurons in the output layer. Prediction with the neural networks produces a confusion matrix that provides quantitative measures of the estimation, notably an overall classification accuracy of 91.7% for predicting the risk of vigour loss of the softwood stand and 89.7% for the hardwood stand. Evaluation of the performance of the neural networks gives an overall MSE value of 0.04, and a

  17. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  18. Magnetic fluctuations of two-dimensional electron gases: application to the superconducting compound La(2-x)Sr(x)CuO(4)

    NASA Astrophysics Data System (ADS)

    Benard, Pierre

    We present a study of the magnetic fluctuations of the normal phase of the superconducting copper oxide La_{2-x}Sr_{x}CuO_4. The compound is modelled by the two-dimensional Hubbard Hamiltonian with a second-neighbour hopping term (the tt'U model). The model is studied using the Generalized Random Phase Approximation (GRPA) and by including the effects of the renormalization of the Hubbard interaction by Brueckner-Kanamori diagrams. In the approach presented in this work, the maxima of the magnetic structure factor observed in neutron scattering experiments are associated with the lattice 2k_F anomalies of the structure factor of the non-interacting two-dimensional electron gas. These anomalies originate from scattering between particles located at points of the Fermi surface where the Fermi velocities are tangent, and they lead to divergences whose nature depends on the geometry of the Fermi surface in the vicinity of these points. These results are then applied to the tt'U model, of which the usual tU Hubbard model is a special case. In most cases, the interactions do not determine the position of the maxima of the structure factor. The role of the interaction is to increase the intensity of the structures of the magnetic structure factor that are associated with the magnetic instability of the system. These structures are often already present in the imaginary part of the non-interacting susceptibility. The intensity ratio between the absolute maxima and the other structures of the magnetic structure factor makes it possible to determine the ratio U_rn/U_c, which measures the proximity of a magnetic instability. The phase diagram is then studied in order to delimit the range of validity of the approximation. After a discussion of the collective modes and of the effect of a non-zero imaginary part of the self-energy, the origin of the energy scale of the magnetic fluctuations is examined.
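
    The instability criterion behind the ratio U_rn/U_c is the Stoner-type condition 1 = U*chi0(q): the (G)RPA susceptibility chi0/(1 - U*chi0) diverges when the interaction reaches U_c = 1/max_q chi0(q). The sketch below evaluates the static Lindhard function of the tt' band on a coarse k-grid with illustrative parameters, not the thesis's values.

        import numpy as np

        def lindhard_chi0(qx, qy, t=1.0, tp=-0.3, mu=-0.5, n=64, T=0.05, eta=0.02):
            """Static Lindhard susceptibility chi0(q) for the tt' band
            epsilon(k) = -2t(cos kx + cos ky) - 4t' cos kx cos ky,
            evaluated by a k-space sum at temperature T."""
            k = 2.0 * np.pi * np.arange(n) / n
            kx, ky = np.meshgrid(k, k)
            eps = lambda ax, ay: -2*t*(np.cos(ax) + np.cos(ay)) - 4*tp*np.cos(ax)*np.cos(ay)
            f = lambda e: 1.0 / (np.exp((e - mu) / T) + 1.0)
            e1, e2 = eps(kx, ky), eps(kx + qx, ky + qy)
            chi = (f(e1) - f(e2)) / (e2 - e1 + 1j * eta)  # eta regularizes e2 == e1
            return float(np.real(chi).mean())

        chi0 = lindhard_chi0(np.pi, np.pi)   # at the antiferromagnetic wave vector
        print("chi0(pi,pi) =", chi0, " Stoner U_c =", 1.0 / chi0)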

  19. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    PubMed

    Gomes, Anna; van der Wijk, Lars; Proost, Johannes H; Sinha, Bhanu; Touw, Daan J

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
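
    A back-of-the-envelope version of the endocarditis model's prediction step can be written down from the parameters quoted in the abstract, assuming steady-state bolus dosing in a one-compartment model (an oversimplification of the population-PK machinery); the MDPE/MDAPE computation follows one common sign convention.

        import numpy as np

        def predict_concentration(dose_mg, tau_h, crcl_l_h, lbm_kg, t_h):
            """Steady-state one-compartment gentamicin level using the
            abstract's parameters (0.277 L/h/70kg metabolic clearance,
            renal clearance = 0.698 x CrCL, Vd = 0.312 L/kg corrected lean
            body mass); infusion kinetics are ignored in this sketch."""
            cl = 0.277 * lbm_kg / 70.0 + 0.698 * crcl_l_h   # clearance (L/h)
            vd = 0.312 * lbm_kg                             # volume (L)
            ke = cl / vd                                    # elimination rate (1/h)
            c0 = dose_mg / vd / (1.0 - np.exp(-ke * tau_h)) # post-dose peak
            return c0 * np.exp(-ke * t_h)

        def mdpe_mdape(observed, predicted):
            """Median (absolute) prediction error, one common convention."""
            obs, pred = np.asarray(observed), np.asarray(predicted)
            pe = 100.0 * (pred - obs) / obs
            return np.median(pe), np.median(np.abs(pe))

        print(predict_concentration(dose_mg=240, tau_h=24, crcl_l_h=4.0,
                                    lbm_kg=60, t_h=8.0))
        print(mdpe_mdape([8.1, 7.5, 9.0], [7.9, 7.8, 8.6]))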

  20. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    PubMed Central

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70 kg metabolic clearance, 0.698 (±0.358) renal clearance as a fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise using the existing intensive-care model in clinical practice to avoid

  1. Predicting the ungauged basin: model validation and realism assessment

    NASA Astrophysics Data System (ADS)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights into model development, calibration strategies, data acquisition and uncertainty analysis. Because few studies of genuinely ungauged basins have been published, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment
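    The abstract does not reproduce the model itself; purely to illustrate the model class (a lumped rainfall-runoff model routed through an irrigation reservoir), here is a toy single-bucket sketch with synthetic forcing. The structure and every parameter are invented, not the authors' model.

```python
import numpy as np

# Toy lumped rainfall-runoff model feeding an irrigation reservoir.
def simulate(rain, pet, k=0.05, s_max=200.0, res_cap=500.0, demand=2.0):
    s, res = 50.0, 100.0                   # soil storage (mm), reservoir storage
    q_out, res_levels = [], []
    for p, e in zip(rain, pet):
        s = min(s - min(e, s) + p, s_max)  # evaporate, add rain, cap the bucket
        q = k * s                          # linear-reservoir runoff
        s -= q
        res = min(res + q, res_cap)        # route runoff into the reservoir
        res = max(res - demand, 0.0)       # irrigation withdrawal
        q_out.append(q)
        res_levels.append(res)
    return np.array(q_out), np.array(res_levels)

rng = np.random.default_rng(0)
rain = rng.gamma(0.5, 8.0, size=365)       # synthetic daily rainfall (mm)
pet = np.full(365, 4.0)                    # constant potential evaporation (mm)
q, res = simulate(rain, pet)
print(f"mean runoff {q.mean():.2f} mm/d, reservoir range {res.min():.0f}-{res.max():.0f}")
```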

  2. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.

  3. Aircraft Disinsection: A Guide for Military and Civilian Air Carriers (Desinsectisation des aeronefs: Un guide a l’intention des responsables des transports aeriens civils et militaires)

    DTIC Science & Technology

    1996-04-01

    regulations. Examples include Ceratitis capitata (Mediterranean fruit fly) and Rhagoletis pomonella (apple maggot). Quarantine regulations generally reduce the chances of a... introduction of insect pests into a country, while avoiding risks to crew health, to aircraft safety and to industry. This report examines the... various officials responsible for the regulation of introduced insect pests, for the registration of pesticides and for their use in

  4. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2^3) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
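    The model structure described (food intake combined with the probability of additive presence and the additive concentration, each entered as raw data or as a lognormal distribution) lends itself to a short Monte Carlo sketch. All distribution parameters and the MPL below are invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # simulated consumers

# One food group; all inputs invented. Food intake (g/day) and additive
# concentration (mg/kg) as lognormals; presence as a per-brand probability.
intake_g = rng.lognormal(mean=4.0, sigma=0.8, size=n)
present = rng.random(n) < 0.35
conc_mg_kg = rng.lognormal(mean=3.0, sigma=0.5, size=n)

additive_mg_day = intake_g / 1000 * present * conc_mg_kg

# Conservative point estimate: high-percentile intake with the additive
# assumed present at its maximum permitted level (MPL) in all foods.
MPL = 150.0  # mg/kg, invented
conservative = np.percentile(intake_g, 97.5) / 1000 * MPL

print(f"modelled P95 intake: {np.percentile(additive_mg_day, 95):.1f} mg/day")
print(f"conservative point estimate: {conservative:.1f} mg/day")
```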

  5. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
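    One simple way to score the "replication of historical production amounts and prices" mentioned above is a relative error between simulated and observed series; the paper's actual validation battery is richer. A sketch with invented numbers:

```python
import numpy as np

def relative_rmse(simulated, historical):
    """Relative RMSE between a simulated and a historical series:
    one simple score for 'replication of historical data'."""
    sim, hist = np.asarray(simulated, float), np.asarray(historical, float)
    return np.sqrt(np.mean((sim - hist) ** 2)) / hist.mean()

# Invented yearly sawnwood production (1000 m^3): model vs. statistics.
hist = [410, 425, 440, 430, 415, 400]
sim = [402, 430, 452, 425, 404, 395]
print(f"relative RMSE: {relative_rmse(sim, hist):.3f}")
```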

  6. CheS-Mapper 2.0 for visual validation of (Q)SAR models

    PubMed Central

    2014-01-01

    Background Sound statistical validation is important to evaluate and compare the overall performance of (Q)SAR models. However, classical validation does not support the user in better understanding the properties of the model or the underlying data. Even though a number of visualization tools for analyzing (Q)SAR information in small molecule datasets exist, integrated visualization methods that allow the investigation of model validation results are still lacking. Results We propose visual validation as an approach for the graphical inspection of (Q)SAR model validation results. The approach applies the 3D viewer CheS-Mapper, an open-source application for the exploration of small molecules in virtual 3D space. The present work describes the new functionalities in CheS-Mapper 2.0 that facilitate the analysis of (Q)SAR information and allow the visual validation of (Q)SAR models. The tool enables the comparison of model predictions to the actual activity in feature space. The approach is generic: it is model-independent and can handle physico-chemical and structural input features as well as quantitative and qualitative endpoints. Conclusions Visual validation with CheS-Mapper enables analyzing (Q)SAR information in the data and indicates how this information is employed by the (Q)SAR model. It reveals whether the endpoint is modeled too specifically or too generically and highlights common properties of misclassified compounds. Moreover, the researcher can use CheS-Mapper to inspect how the (Q)SAR model predicts activity cliffs. The CheS-Mapper software is freely available at http://ches-mapper.org. Graphical abstract Comparing actual and predicted activity values with CheS-Mapper.

  7. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  8. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. The cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard Index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard Index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
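    The comparison metric named here, the Jaccard index, is straightforward to compute on rasterized contours. A small sketch with made-up binary masks standing in for the CT iso-density and simulated vapor-concentration regions:

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard index |A AND B| / |A OR B| between two binary masks."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

# Invented 2D masks standing in for rasterized contours.
ct = np.zeros((50, 50), bool)
model = np.zeros((50, 50), bool)
ct[10:30, 10:30] = True       # CT iso-density region
model[14:34, 12:32] = True    # simulated vapor-concentration region
print(f"Jaccard index: {jaccard(ct, model):.2f}")
```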

  9. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  10. Development of a calorimetric method for measuring ac losses in high-critical-temperature superconducting tapes

    NASA Astrophysics Data System (ADS)

    Dolez, Patricia

    The research carried out in this doctoral project led to the development of an ac-loss measurement method for the study of high-critical-temperature superconductors. In choosing the principles of this method, we drew on earlier work on conventional superconductors in order to propose an alternative to the electrical technique, which at the start of this thesis suffered from problems related to the variation of the measured result with the position of the voltage contacts on the sample surface, and in order to measure ac losses under conditions simulating the reality of future industrial applications of superconducting tapes: in particular, this method uses the calorimetric technique, combined with a simultaneous, in situ calibration. The validity of the method was verified theoretically and experimentally: on the one hand, measurements were performed on Bi-2223 samples coated with silver or with a silver-gold alloy and compared with the theoretical predictions given by Norris, indicating the mainly hysteretic nature of the ac losses in our samples; on the other hand, an electrical measurement was carried out in situ, and its results agree perfectly with those given by our calorimetric method. We also compared the current and frequency dependence of the ac losses of a sample before and after it was damaged. These measurements seem to indicate a relation between the value of the exponent of the power law modeling the dependence of the losses on the current and the longitudinal inhomogeneities of the critical current induced by the damage. Moreover, the frequency dependence shows that, at the large transverse fractures created by the damage in the superconducting core, the current divides locally, roughly equally, among the few grains of material

  11. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling.

    PubMed

    Sauer, Bryan G; Singh, Kanwar P; Wagner, Barry L; Vanden Hoek, Matthew S; Twilley, Katherine; Cohn, Steven M; Shami, Vanessa M; Wang, Andrew Y

    2016-11-01

    Background and study aims: The projected increased demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiencies of an endoscopy center with the use of DES. Methods: We built a DES model of a five-procedure-room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations to run the endoscopy suite and evaluated outcomes associated with each change. The main outcome measures included adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflows, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and recovery rooms is nine for a five-procedure-room unit (a total of 3.4 preparation and recovery rooms per procedure room). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve on the efficiency of care and patient experience.
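    For flavor, below is a minimal sketch (not the authors' model) of a prep/procedure/recovery patient flow using the third-party simpy package, with the room counts from the sensitivity analysis but invented service times; patient cycle time is the tracked outcome.

```python
import itertools
import random

import simpy  # third-party DES library, assumed installed

random.seed(1)
PREP_T, PROC_T, RECOV_T = 20.0, 30.0, 40.0  # mean stage times (min), made up

def patient(env, prep, proc, recov, cycle_times):
    arrive = env.now
    with prep.request() as r:                  # wait for a preparation room
        yield r
        yield env.timeout(random.expovariate(1 / PREP_T))
    with proc.request() as r:                  # wait for a procedure room
        yield r
        yield env.timeout(random.expovariate(1 / PROC_T))
    with recov.request() as r:                 # wait for a recovery room
        yield r
        yield env.timeout(random.expovariate(1 / RECOV_T))
    cycle_times.append(env.now - arrive)       # total patient cycle time

def arrivals(env, prep, proc, recov, cycle_times):
    for _ in itertools.count():
        yield env.timeout(9)                   # one scheduled arrival per 9 min
        env.process(patient(env, prep, proc, recov, cycle_times))

env = simpy.Environment()
prep = simpy.Resource(env, capacity=8)         # 8 preparation rooms
proc = simpy.Resource(env, capacity=5)         # 5 procedure rooms
recov = simpy.Resource(env, capacity=9)        # 9 recovery rooms
cycle_times = []
env.process(arrivals(env, prep, proc, recov, cycle_times))
env.run(until=8 * 60)                          # simulate one 8-hour day
print(f"{len(cycle_times)} patients, mean cycle "
      f"{sum(cycle_times) / len(cycle_times):.0f} min")
```

    A real model would add blocking (a patient holds the procedure room until a recovery room frees up), empirically fitted rather than exponential durations, and staffing constraints.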

  12. PACIC Instrument: disentangling dimensions using published validation models.

    PubMed

    Iglesias, K; Burnand, B; Peytremann-Bridevaux, I

    2014-06-01

    To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument, and more specifically to test all published validation models using one single dataset and appropriate statistical tools. Validation study using data from a cross-sectional survey. A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). French version of the 20-item PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA, based on: (i) a Pearson estimator of the variance-covariance matrix, (ii) a polychoric correlation matrix, and (iii) a likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Medians of item responses varied between 1 and 4 (range 1-5), and the proportion of missing values ranged between 5.7 and 12.3%. Strong floor and ceiling effects were present. Even though the loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in complement to the consideration of single-item results, might be used instead of the five dimensions usually described. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  13. Cross-validation of an employee safety climate model in Malaysia.

    PubMed

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  14. Mapping of discs

    NASA Astrophysics Data System (ADS)

    Hameury, Jean-Marie

    2001-01-01

    Two techniques are frequently used to produce images of the accretion disc in an eclipsing binary: eclipse mapping and Doppler tomography. From the light curve, one can deduce the radial distribution of the effective temperature, assuming axial symmetry. On the other hand, from the variation of the line profile one can reconstruct an image in the velocity space, which can be converted into a real image if one knows the kinematics of the system.

  15. Eating disorders and cyclothymic temperament: a cross-sectional study of 107 Tunisian students

    PubMed Central

    Jaweher, Masmoudi; Sonda, Trabelsi; Uta, Ouali; Inès, Feki; Rim, Sallemi; Imene, Baati; Abdelaziz, Jaoua

    2014-01-01

    Introduction: The objectives of our study were to estimate the prevalence of eating disorders (ED) among young Tunisians and to study the relation between cyclothymic temperament and ED. Methods: We conducted a cross-sectional, descriptive and analytical study of 107 students of the Institut de Presse et des Sciences de l'Information de la Manouba, Tunisia. Eating disorders were assessed with the EAT-40 self-report questionnaire, in its version validated in Tunisia; it is the most widely used screening tool for eating disorders worldwide. Cyclothymic temperament was assessed with the validated Arabic version of the TEMPS-A. An accompanying epidemiological form collected sociodemographic and dietary data. Results: The prevalence of eating disorders was 24.3%. The percentage of students with a cyclothymic temperament score of 14 or more was 37.4%. An association was found between eating disorders and cyclothymic affective temperament, whether using the dimensional approach (p=0.005) or the categorical approach (p=0.046). Cyclothymic temperament doubled the risk of developing an eating disorder in female students (p=0.04). Conclusion: Eating disorders are frequent among our students, particularly females. Moreover, the presence of an associated cyclothymic temperament should doubly raise suspicion of membership in the bipolar spectrum and should prompt particular attention from the clinician in order to best define therapeutic strategies. PMID:25404977

  16. Preclinical mouse model to monitor live Muc5b-producing conjunctival goblet cell density under pharmacological treatments.

    PubMed

    Portal, Céline; Gouyer, Valérie; Gottrand, Frédéric; Desseyn, Jean-Luc

    2017-01-01

    Modifications of mucous cell density and of gel-forming mucin production are established hallmarks of mucosal diseases. Our aim was to develop and validate a mouse model to study live goblet cell density in pathological situations and under pharmacological treatments. We created a reporter mouse for the gel-forming mucin gene Muc5b. Muc5b-positive goblet cells were studied in the eye conjunctiva by immunohistochemistry and by probe-based confocal laser endomicroscopy (pCLE) in living mice. A dry eye syndrome (DES) model was induced by topical application of benzalkonium chloride (BAK), and recombinant interleukin (rIL) 13 was administered to reverse the goblet cell loss in the DES model. Almost 50% of all conjunctival goblet cells are Muc5b+ in unchallenged mice. The decreased density of the Muc5b+ conjunctival goblet cell population in the DES model reflects the overall conjunctival goblet cell loss. Ten days of BAK in one eye followed by 4 days without any treatment induced an 18.3% decrease in conjunctival goblet cell density. Four days of rIL13 application in the DES model restored normal goblet cell density. Muc5b is a biological marker of DES mouse models. We provide proof of concept that our model is unique and allows a better understanding of the mechanisms that regulate gel-forming mucin production/secretion and mucous cell differentiation in the conjunctiva of living mice, and that it can be used to test treatment compounds in mucosal disease models.

  17. Using airborne laser scanning profiles to validate marine geoid models

    NASA Astrophysics Data System (ADS)

    Julge, Kalev; Gruno, Anti; Ellmann, Artu; Liibusk, Aive; Oja, Tõnis

    2014-05-01

    Airborne laser scanning (ALS) is a remote sensing method that utilizes LiDAR (Light Detection And Ranging) technology. The datasets collected are important sources for a large range of scientific and engineering applications. Mostly, ALS is used to measure terrain surfaces for the compilation of Digital Elevation Models, but it can also be used in other applications. This contribution focuses on the use of an ALS system for measuring sea surface heights and validating gravimetric geoid models over marine areas. This is based on the ability of ALS to register echoes of the LiDAR pulse from the water surface. A case study was carried out to analyse the possibilities of validating marine geoid models using ALS profiles. A test area at the southern shores of the Gulf of Finland was selected for regional geoid validation. ALS measurements were carried out by the Estonian Land Board in spring 2013 at different altitudes and using different scan rates. The single-wavelength Leica ALS50-II laser scanner on board a small aircraft was used to determine the sea level (with respect to the GRS80 reference ellipsoid), which follows roughly the equipotential surface of the Earth's gravity field. For the validation, a high-resolution (1'x2') regional gravimetric GRAV-GEOID2011 model was used. This geoid model covers the entire area of Estonia and the surrounding waters of the Baltic Sea. The fit between the geoid model and GNSS/levelling data within the Estonian dry land revealed an RMS of residuals of ±1… ±2 cm. Note that such a fitting validation cannot proceed over marine areas. Therefore, an ALS observation-based methodology was developed to evaluate the quality of GRAV-GEOID2011 over marine areas. The accuracy of the acquired ALS datasets was analyzed, and an optimal width of the nadir corridor containing good-quality ALS data was determined. The impact of the ALS scan-angle range and flight altitude on the obtainable vertical accuracy was investigated as well. The quality of the point cloud is analysed by cross
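    The core computation, comparing ALS-derived sea-surface heights with geoid heights above the same ellipsoid, reduces to residual statistics along the profile. A sketch with made-up values; tides and dynamic sea-surface topography, which the real analysis must account for, are folded into a single optional offset:

```python
import numpy as np

def validate_geoid(als_heights, geoid_heights, sst_offset=0.0):
    """Residual statistics between ALS sea-surface heights and geoid
    heights, both relative to the same ellipsoid (here GRS80); the
    optional offset stands in for tides and dynamic topography."""
    res = np.asarray(als_heights) - sst_offset - np.asarray(geoid_heights)
    return res.mean(), res.std(), np.sqrt(np.mean(res**2))

# Invented profile values in metres (ellipsoidal heights).
als = [17.42, 17.45, 17.40, 17.48, 17.43]
geoid = [17.40, 17.41, 17.39, 17.44, 17.42]
bias, std, rms = validate_geoid(als, geoid)
print(f"bias {bias:.3f} m, std {std:.3f} m, RMS {rms:.3f} m")
```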

  18. Analysis of intergrain vortex pinning in polycrystalline yttrium barium copper oxide (YBa_2Cu_3O_7)

    NASA Astrophysics Data System (ADS)

    Fournier, Patrick

    The Generalized Critical State Model (GCSM) is used to describe the magnetic and transport properties of polycrystalline YBa_2Cu_3O_7. This empirical model relates the critical current density to the density of flux lines penetrating the intergrain region. Two measurement techniques are used to characterize our materials. The first consists of measuring the field at the center of a hollow cylinder as a function of the applied magnetic field, for temperatures between 20 and 85 K. By varying the wall thickness of the hollow cylinder, it is possible to follow the evolution of the hysteresis loops and to determine characteristic fields that vary with this dimension. By fitting the experimental results, we determine J_{co}, H_{o} and n, the parameters of the GCSM. The shape of the cylinders, whose length is comparable to their outer diameter, gives rise to a demagnetizing field that can be included in the theoretical model. This allows us to evaluate the screened volume fraction f_{g} as well as the demagnetizing factor N. We find that J_{co}, H_{o} and f_{g} depend on temperature, whereas n and N (for a fixed wall thickness) do not. The second technique consists of measuring the critical current of thin strips as a function of the applied field at different temperatures. We use a setup we developed that allows these measurements to be made in direct contact with the cooling liquid, i.e., in liquid nitrogen. We vary the temperature of the liquid by varying the pressure of the gas above the nitrogen bath. This method allows us to scan temperatures between 65 K and the critical temperature of the material (~92 K). We again fit the critical current curves as a function of applied field with the GCSM, to obtain its parameters once more. For three samples with different heat treatments, the parameters

  19. Discrete Event Simulation for Decision Modeling in Health Care: Lessons from Abdominal Aortic Aneurysm Screening

    PubMed Central

    Jones, Edmund; Masconi, Katya L.; Sweeting, Michael J.; Thompson, Simon G.; Powell, Janet T.

    2018-01-01

    Markov models are often used to evaluate the cost-effectiveness of new healthcare interventions, but they are sometimes not flexible enough to allow accurate modeling or investigation of alternative scenarios and policies. A Markov model previously demonstrated that a one-off invitation to screening for abdominal aortic aneurysm (AAA) for men aged 65 y in the UK, and subsequent follow-up of identified AAAs, was likely to be highly cost-effective at thresholds commonly adopted in the UK (£20,000 to £30,000 per quality-adjusted life-year). However, new evidence has emerged and the decision problem has evolved to include exploration of the circumstances under which AAA screening may be cost-effective, which the Markov model is not easily able to address. A new model to handle this more complex decision problem was needed, and the case of AAA screening thus provides an illustration of the relative merits of Markov models and discrete event simulation (DES) models. An individual-level DES model was built using the R programming language to reflect possible events and pathways of individuals invited to screening vs. those not invited. The model was validated against key events and cost-effectiveness, as observed in a large, randomized trial. Different screening protocol scenarios were investigated to demonstrate the flexibility of the DES. The case of AAA screening highlights the benefits of DES, particularly in the context of screening studies.

  20. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  1. Is the Acute NMDA Receptor Hypofunction a Valid Model of Schizophrenia?

    PubMed Central

    Adell, Albert; Jiménez-Sánchez, Laura; López-Gil, Xavier; Romón, Tamara

    2012-01-01

    Several genetic, neurodevelopmental, and pharmacological animal models of schizophrenia have been established. This short review examines the validity of one of the most used pharmacological models of the illness, i.e., the acute administration of N-methyl-D-aspartate (NMDA) receptor antagonists in rodents. In some cases, data on chronic or prenatal NMDA receptor antagonist exposure have been introduced for comparison. The face validity of acute NMDA receptor blockade is granted inasmuch as the hyperlocomotion and stereotypies induced by phencyclidine, ketamine, and MK-801 are regarded as a surrogate for the positive symptoms of schizophrenia. In addition, the loss of parvalbumin-containing cells (which is one of the most compelling findings in postmortem schizophrenia brain) following NMDA receptor blockade adds construct validity to this model. However, the lack of changes in glutamic acid decarboxylase (GAD67) is at variance with human studies. It is possible that changes in GAD67 are more reflective of the neurodevelopmental condition of schizophrenia. Finally, the model also has predictive validity, in that the behavioral and transmitter activations it produces in rodents are responsive to antipsychotic treatment. Overall, although not devoid of drawbacks, the acute administration of NMDA receptor antagonists can be considered a good model of schizophrenia bearing a satisfactory degree of validity. PMID:21965469

  2. Endocrine-disrupting effects of organochlorine pesticides.

    PubMed

    Charlier, C; Plomteux, G

    2002-01-01

    Xenoestrogens such as organochlorine pesticides are known to induce changes in reproductive development, function or behaviour in wildlife. Because these compounds are able to modify estrogen metabolism, or to compete with estradiol for binding to the estrogen receptor, these products may affect the risk of developing impaired fertility, precocious puberty or some kinds of cancer in man. The oldest account of a fight against pollution goes back to an Indian legend in which the deity Sing-bonga was bothered by the fumes from the furnaces in which the Asuras smelted their metals (1). The problem has obviously only grown since then, and the contamination of the Earth by numerous pollutants has become a major problem for our society today. The protection of our environment is a crucial issue that must be respected despite current economic pressure, and its importance will keep growing over the coming years, even if an objective and indisputable identification of what is essential, and must therefore be guaranteed on the planet as a priority, is difficult to pin down (2). "A bird in poor condition does not lay good eggs," said a Greek proverb. But it was only in the second half of the 20th century that toxicologists began to identify the worldwide effects on wildlife and livestock of the pollution emitted in the 19th century (3). The contemporary history of industrial pesticides begins around 1874 (synthesis of the organochlorines) and continues throughout these two centuries, through the synthesis of the organophosphates (1950), the carbamates (1970) and the pyrethroids (1975) (4). Dichlorodiphenyltrichloroethane (DDT) was first synthesized by a doctoral student, Othmer Zeidler. Production, taken up by the

  3. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
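    The degree of relatedness between development and validation samples can be quantified with a membership model: a classifier asked to tell the two samples apart from the case-mix variables. A c-statistic near 0.5 suggests a reproducibility setting; values near 1 suggest transportability. A sketch with fabricated case-mix data and scikit-learn (the framework itself is not tied to any library):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Fabricated case-mix: two predictors in a development and a validation sample.
dev = rng.normal([50.0, 1.0], [10.0, 0.3], size=(500, 2))
val = rng.normal([58.0, 1.2], [12.0, 0.4], size=(300, 2))

X = np.vstack([dev, val])
y = np.r_[np.zeros(len(dev)), np.ones(len(val))]

# Membership model: how well can the sample of origin be told apart?
m = LogisticRegression(max_iter=1000).fit(X, y)
c = roc_auc_score(y, m.predict_proba(X)[:, 1])
print(f"membership c-statistic: {c:.2f}")  # ~0.5 related, -> 1 distant case-mix
```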

  4. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations

  5. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  6. Validation of an Evaluation Model for Learning Management Systems

    ERIC Educational Resources Information Center

    Kim, S. W.; Lee, M. G.

    2008-01-01

    This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principle factors, was…

  7. Relation between the characteristics of school desk-benches and the anthropometric measurements of schoolchildren in Benin

    PubMed Central

    Falola, Stève Marjelin; Gouthon, Polycarpe; Falola, Jean-Marie; Fiogbe, Michel Armand; Nigan, Issiako Bio

    2014-01-01

    Introduction: School furniture and the seated posture in class are often implicated in the onset of spinal pain, thereby affecting the quality of the tasks performed by learners. No study had yet verified the degree of fit between the characteristics of the furniture and those of schoolchildren in Benin. The objective of this cross-sectional study was therefore to determine the relation between the dimensions of the desk-benches used in class and the anthropometric measurements of schoolchildren in Benin. Methods: The study was carried out with a probability sample of 678 schoolchildren aged 4 to 17 years. The anthropometric measurements of the schoolchildren and the lengths, widths and heights of the desk-benches were measured and then entered into the equations proposed in the literature. The percentages of values falling outside the acceptable limits derived from applying the equations were calculated. Results: The width and height of the desk-benches used by the schoolchildren were greater (p < 0.05) than the reference values recommended by the official bodies for the control and production of school furniture in Benin. Regardless of sex, there was a mismatch between the bench width and the buttock-popliteal length, and between the table height and the elbow-seat distance of the schoolchildren. Conclusion: The results suggest taking the evolution of schoolchildren's anthropometric measurements into account when manufacturing desk-benches, in order to promote good seated posture in class and reduce the risk of spinal disorders. PMID:25317232
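    The equations used are not reproduced in the abstract. As a stand-in, the sketch below encodes two mismatch criteria commonly cited in the school-furniture ergonomics literature; these are assumptions for illustration, not necessarily the equations applied in this study:

```python
import math

def seat_height_ok(popliteal_cm, seat_h_cm, shoe_corr=2.0):
    """One commonly cited fit criterion (an assumed stand-in): the seat
    height fits if (PH + corr)*cos(30 deg) <= height <= (PH + corr)*cos(5 deg),
    where PH is popliteal height and corr a shoe correction."""
    p = popliteal_cm + shoe_corr
    return p * math.cos(math.radians(30)) <= seat_h_cm <= p * math.cos(math.radians(5))

def seat_depth_ok(buttock_popliteal_cm, seat_depth_cm):
    """Seat depth between 80% and 95% of buttock-popliteal length."""
    return 0.80 * buttock_popliteal_cm <= seat_depth_cm <= 0.95 * buttock_popliteal_cm

print(seat_height_ok(38.0, 42.0), seat_depth_ok(43.0, 39.0))
```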

  8. Epidemiological profile of primary malignant salivary gland tumors: a report of 154 cases

    PubMed Central

    Setti, Khadija; Mouanis, Mohamed; Moumni, Abdelmounim; Maher, Mostafa; Harmouch, Amal

    2014-01-01

    Introduction: Salivary gland tumors are rare, accounting for 3 to 5% of head and neck tumors. The 2005 WHO classification distinguishes epithelial tumors, mesenchymal tumors, hematological tumors and secondary tumors. Methods: Our work is a retrospective study covering a 10-year period from January 2002 to January 2012. The inclusion criteria were age, sex, tumor site and histological type. Results: The annual incidence of primary malignant salivary gland tumors in our series was 15 cases per year. One hundred and fifty-four cases of primary malignant salivary gland tumors were collected, with no sex predominance (78 women (50.6%) and 76 men (49.4%)). The mean age was 60 years, with extremes of 4 and 83 years and a peak frequency between 51 and 70 years. Two thirds of the cases (65%) were located in the major glands, with 66 cases in the parotid (43%) and 34 cases in the submandibular gland (22%). Fifty-four patients had a malignant tumor of the minor salivary glands (35%), of which 61% were in the palate. No case of malignant tumor of the sublingual gland was recorded in our study. The predominant histological type in our series was adenoid cystic carcinoma, found in 43 patients (27.9%), followed by adenocarcinoma not otherwise specified in 37 patients (24%), mucoepidermoid carcinoma in 16 patients (10.4%) and low-grade polymorphous adenocarcinoma, also in 16 patients (10.4%). Conclusion: Malignant salivary gland tumors are a heterogeneous group of diseases, complex to characterize and of variable frequency. PMID:25120861

  9. Molprobity's ultimate rotamer-library distributions for model validation.

    PubMed

    Hintze, Bradley J; Lewis, Steven M; Richardson, Jane S; Richardson, David C

    2016-09-01

    Here we describe the updated MolProbity rotamer-library distributions derived from an order-of-magnitude larger and more stringently quality-filtered dataset of about 8000 (vs. 500) protein chains, and we explain the resulting changes and improvements to model validation as seen by users. To include only side-chains with satisfactory justification for their given conformation, we added residue-specific filters for electron-density value and model-to-density fit. The combined new protocol retains a million residues of data, while cleaning up false-positive noise in the multi-χ datapoint distributions. It enables unambiguous characterization of conformational clusters nearly 1000-fold less frequent than the most common ones. We describe examples of local interactions that favor these rare conformations, including the role of authentic covalent bond-angle deviations in enabling presumably strained side-chain conformations. Further, along with favored and outlier, an allowed category (0.3-2.0% occurrence in reference data) has been added, analogous to Ramachandran validation categories. The new rotamer distributions are used for current rotamer validation in MolProbity and PHENIX, and for rotamer choice in PHENIX model-building and refinement. The multi-dimensional χ distributions and Top8000 reference dataset are freely available on GitHub. These rotamers are termed "ultimate" because data sampling and quality are now fully adequate for this task, and also because we believe the future of conformational validation should integrate side-chain with backbone criteria. Proteins 2016; 84:1177-1189. © 2016 Wiley Periodicals, Inc.
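    The favored/allowed/outlier classification quoted above maps directly onto occurrence thresholds in the reference data (allowed being 0.3-2.0%). A minimal sketch using only the thresholds stated in the abstract:

```python
def rotamer_category(pct_occurrence):
    """Classify a rotamer by its occurrence in the reference data,
    using the thresholds quoted in the abstract (allowed: 0.3-2.0%)."""
    if pct_occurrence < 0.3:
        return "outlier"
    if pct_occurrence <= 2.0:
        return "allowed"
    return "favored"

for pct in (0.05, 0.8, 12.0):
    print(f"{pct:5.2f}% -> {rotamer_category(pct)}")
```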

  10. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  11. Institutional Effectiveness: A Model for Planning, Assessment & Validation.

    ERIC Educational Resources Information Center

    Truckee Meadows Community Coll., Sparks, NV.

    The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

  12. Caring for the children and adolescents of Canadian military families: special considerations

    PubMed Central

    Rowan-Legg, Anne

    2017-01-01

    Abstract: Military families face many stressors, such as frequent relocations, long periods of family separation, geographic isolation from the support network of the extended family, and deployment to very dangerous areas. The children and adolescents of military families experience the same developmental and motivational trajectories as their civilian counterparts, but they also contend with unusual developmental pressures and stressors imposed on them by the demands of military life. The effects of military life on families and children are beginning to be recognized and better characterized. Understanding the concerns specific to the children and adolescents of military families, and mobilizing the resources needed to support them, are essential to meeting their health needs.

  13. Canadian guidelines on the safe and effective use of opioids for chronic non-cancer pain

    PubMed Central

    Kahan, Meldon; Wilson, Lynn; Mailis-Gagnon, Angela; Srivastava, Anita

    2011-01-01

    Abstract: Objective: To present family physicians with a practical clinical summary on prescribing opioids to special populations, based on the recommendations made in the Canadian guidelines on the safe and effective use of opioids for chronic non-cancer pain. Quality of evidence: To produce the guidelines, the researchers carried out a critical synthesis of the medical literature, focusing specifically on studies of the effectiveness and safety of opioids in special populations. Main message: Family physicians can mitigate the risks of overdose, sedation, misuse and addiction through strategies tailored to the age and health status of patients. For patients at risk of addiction, opioids should be reserved for well-defined nociceptive or neuropathic pain that has not responded to first-line treatments. Opioids should be titrated slowly, with frequent dispensing and close monitoring for any sign of misuse. Suspected opioid addiction is managed with structured opioid therapy, methadone or buprenorphine treatment, or abstinence-based treatment. Patients with mood or anxiety disorders tend to have an attenuated analgesic response to opioids, are at higher risk of misuse, and often take sedatives that interact adversely with opioids. Precautions similar to those used with other high-risk patients should be taken. Opioids should be tapered gradually if the patient's pain remains severe even with an adequate trial of opioid therapy. In elderly people, the

  14. L'astronomie des Anciens

    NASA Astrophysics Data System (ADS)

    Nazé, Yaël

    2009-04-01

    Whatever the civilization to which they belong, human beings look to the sky for answers to their questions about their origin, their future and their purpose. The first merit of this book is to remind us that astronomy began this way, through the celestial myths imagined by the Ancients to explain the order of the world and their place within it. But past astronomical knowledge was far from negligible, and it was certainly not limited to the work of the Greeks alone: this is what the author shows through a fascinating investigation, from Stonehenge to Giza by way of Beijing and Mexico City, based on the study of ancient monuments and of the written sources still accessible. Mesopotamian tablets, Chinese annals, medieval chronicles and the like are moreover singularly useful to modern astronomers: how else can one trace the variations in the length of the day over the centuries, or uncover the nature of the explosion that struck so many observers in 1054? This book offers a magnificently illustrated journey through the ages, between astronomy and archaeology.

  15. Independent technical review and analysis of hydraulic modeling and hydrology under low-flow conditions of the Des Plaines River near Riverside, Illinois

    USGS Publications Warehouse

    Over, Thomas M.; Straub, Timothy D.; Hortness, Jon E.; Murphy, Elizabeth A.

    2012-01-01

    The U.S. Geological Survey (USGS) has operated a streamgage and published daily flows for the Des Plaines River at Riverside since Oct. 1, 1943. A HEC-RAS model has been developed to estimate the effect of the removal of Hofmann Dam near the gage on low-flow elevations in the reach approximately 3 miles upstream from the dam. The Village of Riverside, the Illinois Department of Natural Resources-Office of Water Resources (IDNR-OWR), and the U.S. Army Corps of Engineers-Chicago District (USACE-Chicago) are interested in verifying the performance of the HEC-RAS model for specific low-flow conditions, and in obtaining an estimate of selected daily flow quantiles and other low-flow statistics for a selected period of record that best represents current hydrologic conditions. Because the USGS publishes streamflow records for the Des Plaines River system and provides unbiased analyses of flows and stream hydraulic characteristics, the USGS served as an Independent Technical Reviewer (ITR) for this study.

  16. Validating a Technology Enhanced Student-Centered Learning Model

    ERIC Educational Resources Information Center

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  17. Implementation Cryptography Data Encryption Standard (DES) and Triple Data Encryption Standard (3DES) Method in Communication System Based Near Field Communication (NFC)

    NASA Astrophysics Data System (ADS)

    Ratnadewi; Pramono Adhie, Roy; Hutama, Yonatan; Saleh Ahmar, A.; Setiawan, M. I.

    2018-01-01

    Cryptography is a method used to create secure communication by transforming messages so that only the intended party can know their content. Two of the most commonly used cryptographic methods for protecting transmitted messages, especially text, are the DES and 3DES methods. This research explains the DES and 3DES cryptographic methods and their use for securing data stored in smart cards that operate in an NFC-based communication system. It covers how the DES and 3DES methods protect data, and the software engineering of an application written in C++ to realize and test the performance of both methods when writing encrypted data to smart cards and reading and decrypting data from them. The execution time for writing data to and reading data from a smart card is faster with the DES cryptography method than with 3DES.
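
    For illustration, a minimal sketch of the kind of 3DES encrypt/decrypt round trip used when writing a record to a smart card, here in Python with the PyCryptodome library (an assumed dependency; the study's own implementation is in C++, and its key management is not described in the abstract):

        # Hedged sketch: 3DES (CBC mode) round trip with PyCryptodome.
        from Crypto.Cipher import DES3
        from Crypto.Random import get_random_bytes
        from Crypto.Util.Padding import pad, unpad

        key = DES3.adjust_key_parity(get_random_bytes(24))   # 3-key 3DES
        iv = get_random_bytes(8)                              # DES block size: 8 bytes
        record = b"cardholder record for the NFC tag"         # illustrative payload

        ciphertext = DES3.new(key, DES3.MODE_CBC, iv).encrypt(pad(record, DES3.block_size))
        plaintext = unpad(DES3.new(key, DES3.MODE_CBC, iv).decrypt(ciphertext), DES3.block_size)
        assert plaintext == record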

  18. Finite Element Model Development and Validation for Aircraft Fuselage Structures

    NASA Technical Reports Server (NTRS)

    Buehrle, Ralph D.; Fleming, Gary A.; Pappa, Richard S.; Grosveld, Ferdinand W.

    2000-01-01

    The ability to extend the valid frequency range for finite element based structural dynamic predictions using detailed models of the structural components and attachment interfaces is examined for several stiffened aircraft fuselage structures. This extended dynamic prediction capability is needed for the integration of mid-frequency noise control technology. Beam, plate and solid element models of the stiffener components are evaluated. Attachment models between the stiffener and panel skin range from a line along the rivets of the physical structure to a constraint over the entire contact surface. The finite element models are validated using experimental modal analysis results. The increased frequency range results in a corresponding increase in the number of modes, modal density and spatial resolution requirements. In this study, conventional modal tests using accelerometers are complemented with Scanning Laser Doppler Velocimetry and Electro-Optic Holography measurements to further resolve the spatial response characteristics. Whenever possible, component and subassembly modal tests are used to validate the finite element models at lower levels of assembly. Normal mode predictions for different finite element representations of components and assemblies are compared with experimental results to assess the most accurate techniques for modeling aircraft fuselage type structures.
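
    When finite element modes are correlated with experimental modal analysis results, the modal assurance criterion (MAC) is the standard measure; the abstract does not name the exact metric used, so the following Python sketch shows the generic criterion under that assumption:

        # MAC between an FE mode shape and a measured mode shape, sampled
        # at the same degrees of freedom; 1.0 means perfectly correlated.
        import numpy as np

        def mac(phi_fe: np.ndarray, phi_test: np.ndarray) -> float:
            num = abs(np.vdot(phi_fe, phi_test)) ** 2
            den = np.vdot(phi_fe, phi_fe).real * np.vdot(phi_test, phi_test).real
            return float(num / den)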

  19. Verona Coding Definitions of Emotional Sequences (VR-CoDES): Conceptual framework and future directions.

    PubMed

    Piccolo, Lidia Del; Finset, Arnstein; Mellblom, Anneli V; Figueiredo-Braga, Margarida; Korsvold, Live; Zhou, Yuefang; Zimmermann, Christa; Humphris, Gerald

    2017-12-01

    To discuss the theoretical and empirical framework of VR-CoDES and potential future directions in research based on the coding system. The paper is based on a selective review of papers relevant to the construction and application of VR-CoDES. The VR-CoDES system is rooted in patient-centered and biopsychosocial models of healthcare consultations and in a functional approach to emotion theory. According to the VR-CoDES, emotional interaction is studied in terms of sequences consisting of an eliciting event, an emotional expression by the patient, and the immediate response by the clinician. The rationale for the emphasis on sequences, on detailed classification of cues and concerns, and on the choices of explicit vs. non-explicit responses and providing vs. reducing room for further disclosure, as basic categories of the clinician responses, is described. Results from research on VR-CoDES may help raise awareness of emotional sequences. Future directions in applying VR-CoDES in research may include studies on predicting patient and clinician behavior within the consultation, qualitative analyses of longer sequences including several VR-CoDES triads, and studies of effects of emotional communication on health outcomes. VR-CoDES may be applied to develop interventions to promote good handling of patients' emotions in healthcare encounters. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Finite Element Model and Validation of Nasal Tip Deformation

    PubMed Central

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian JF

    2016-01-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and, eventually, nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady-state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded states. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near-complete agreement in the immediate vicinity of the nasal tip, with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow. PMID:27633018

  1. Finite Element Model and Validation of Nasal Tip Deformation.

    PubMed

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and, eventually, nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady-state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded states. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near-complete agreement in the immediate vicinity of the nasal tip, with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
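
    A sketch of the point-cloud comparison described above, using SciPy; "average Hausdorff distance" is taken here as the mean symmetric nearest-neighbour distance, which is one common definition and an assumption about the paper's exact metric:

        # Mean symmetric nearest-neighbour distance between two surface
        # point clouds, e.g. photogrammetry vs. FE simulation (units: mm).
        import numpy as np
        from scipy.spatial import cKDTree

        def average_surface_distance(measured: np.ndarray, simulated: np.ndarray) -> float:
            d_ms, _ = cKDTree(simulated).query(measured)   # measured -> nearest simulated
            d_sm, _ = cKDTree(measured).query(simulated)   # simulated -> nearest measured
            return float((d_ms.mean() + d_sm.mean()) / 2.0)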

  2. Efficiency of endoscopy units can be improved with use of discrete event simulation modeling

    PubMed Central

    Sauer, Bryan G.; Singh, Kanwar P.; Wagner, Barry L.; Vanden Hoek, Matthew S.; Twilley, Katherine; Cohn, Steven M.; Shami, Vanessa M.; Wang, Andrew Y.

    2016-01-01

    Background and study aims: The projected increased demand for health services obligates healthcare organizations to operate efficiently. Discrete event simulation (DES) is a modeling method that allows for optimization of systems through virtual testing of different configurations before implementation. The objective of this study was to identify strategies to improve the daily efficiencies of an endoscopy center with the use of DES. Methods: We built a DES model of a five procedure room endoscopy unit at a tertiary-care university medical center. After validating the baseline model, we tested alternate configurations to run the endoscopy suite and evaluated outcomes associated with each change. The main outcome measures included adequate number of preparation and recovery rooms, blocked inflow, delay times, blocked outflows, and patient cycle time. Results: Based on a sensitivity analysis, the adequate number of preparation rooms is eight and recovery rooms is nine for a five procedure room unit (total 3.4 preparation and recovery rooms per procedure room). Simple changes to procedure scheduling and patient arrival times led to a modest improvement in efficiency. Increasing the preparation/recovery rooms based on the sensitivity analysis led to significant improvements in efficiency. Conclusions: By applying tools such as DES, we can model changes in an environment with complex interactions and find ways to improve the medical care we provide. DES is applicable to any endoscopy unit and would be particularly valuable to those who are trying to improve on the efficiency of care and patient experience. PMID:27853739
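
    A minimal discrete event simulation of such an endoscopy unit can be sketched with the SimPy library (an assumed tool; the paper does not name its simulation software, and the room counts below follow its sensitivity analysis while the service times are purely illustrative):

        # Hedged SimPy sketch: patients flow through prep, procedure, recovery.
        import random
        import simpy

        PREP, PROC, RECOV = 8, 5, 9          # preparation, procedure, recovery rooms

        def patient(env, rooms, cycle_times):
            arrive = env.now
            for resource, mean_minutes in zip(rooms, (20, 30, 45)):  # illustrative means
                with resource.request() as req:
                    yield req
                    yield env.timeout(random.expovariate(1.0 / mean_minutes))
            cycle_times.append(env.now - arrive)

        def arrivals(env, rooms, cycle_times, n_patients=40, spacing=10):
            for _ in range(n_patients):
                env.process(patient(env, rooms, cycle_times))
                yield env.timeout(spacing)   # one scheduled arrival every 10 min

        env = simpy.Environment()
        rooms = (simpy.Resource(env, PREP), simpy.Resource(env, PROC),
                 simpy.Resource(env, RECOV))
        cycle_times = []
        env.process(arrivals(env, rooms, cycle_times))
        env.run()
        print(f"mean patient cycle time: {sum(cycle_times) / len(cycle_times):.1f} min")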

  3. Validation and Trustworthiness of Multiscale Models of Cardiac Electrophysiology

    PubMed Central

    Pathmanathan, Pras; Gray, Richard A.

    2018-01-01

    Computational models of cardiac electrophysiology have a long history in basic science applications and device design and evaluation, but have significant potential for clinical applications in all areas of cardiovascular medicine, including functional imaging and mapping, drug safety evaluation, disease diagnosis, patient selection, and therapy optimisation or personalisation. For all stakeholders to be confident in model-based clinical decisions, cardiac electrophysiological (CEP) models must be demonstrated to be trustworthy and reliable. Credibility, that is, the belief in the predictive capability, of a computational model is primarily established by performing validation, in which model predictions are compared to experimental or clinical data. However, there are numerous challenges to performing validation for highly complex multi-scale physiological models such as CEP models. As a result, credibility of CEP model predictions is usually founded upon a wide range of distinct factors, including various types of validation results, underlying theory, evidence supporting model assumptions, evidence from model calibration, all at a variety of scales from ion channel to cell to organ. Consequently, it is often unclear, or a matter for debate, the extent to which a CEP model can be trusted for a given application. The aim of this article is to clarify potential rationale for the trustworthiness of CEP models by reviewing evidence that has been (or could be) presented to support their credibility. We specifically address the complexity and multi-scale nature of CEP models which makes traditional model evaluation difficult. In addition, we make explicit some of the credibility justification that we believe is implicitly embedded in the CEP modeling literature. Overall, we provide a fresh perspective to CEP model credibility, and build a depiction and categorisation of the wide-ranging body of credibility evidence for CEP models. This paper also represents a step

  4. Testing the validity of the International Atomic Energy Agency (IAEA) safety culture model.

    PubMed

    López de Castro, Borja; Gracia, Francisco J; Peiró, José M; Pietrantoni, Luca; Hernández, Ana

    2013-11-01

    This paper takes the first steps to empirically validate the widely used model of safety culture of the International Atomic Energy Agency (IAEA), composed of five dimensions, further specified by 37 attributes. To do so, three independent and complementary studies are presented. First, 290 students serve to collect evidence about the face validity of the model. Second, 48 experts in organizational behavior judge its content validity. And third, 468 workers in a Spanish nuclear power plant help to reveal how closely the theoretical five-dimensional model can be replicated. Our findings suggest that several attributes of the model may not be related to their corresponding dimensions. According to our results, a one-dimensional structure fits the data better than the five dimensions proposed by the IAEA. Moreover, the IAEA model, as it stands, seems to have rather moderate content validity and low face validity. Practical implications for researchers and practitioners are included. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. A detached eddy simulation model for the study of lateral separation zones along a large canyon-bound river

    USGS Publications Warehouse

    Alvarez, Laura V.; Schmeeckle, Mark W.; Grams, Paul E.

    2017-01-01

    Lateral flow separation occurs in rivers where banks exhibit strong curvature. In canyon-bound rivers, lateral recirculation zones are the principal storage of fine-sediment deposits. A parallelized, three-dimensional, turbulence-resolving model was developed to study the flow structures along lateral separation zones located in two pools along the Colorado River in Marble Canyon. The model employs the detached eddy simulation (DES) technique, which resolves turbulence structures larger than the grid spacing in the interior of the flow. The DES-3D model is validated using Acoustic Doppler Current Profiler flow measurements taken during the 2008 controlled flood release from Glen Canyon Dam. A point-to-point validation using a number of skill metrics, often employed in hydrological research, is proposed here for fluvial modeling. The validation results show predictive capabilities of the DES model. The model reproduces the pattern and magnitude of the velocity in the lateral recirculation zone, including the size and position of the primary and secondary eddy cells, and return current. The lateral recirculation zone is open, having continuous import of fluid upstream of the point of reattachment and export by the recirculation return current downstream of the point of separation. Differences in magnitude and direction of near-bed and near-surface velocity vectors are found, resulting in an inward vertical spiral. Interaction between the recirculation return current and the main flow is dynamic, with large temporal changes in flow direction and magnitude. Turbulence structures with a predominately vertical axis of vorticity are observed in the shear layer, becoming three-dimensional without preferred orientation downstream.
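
    The abstract cites skill metrics "often employed in hydrological research" without listing them; the Nash-Sutcliffe efficiency and root-mean-square error are typical examples of such point-to-point metrics, sketched here under that assumption:

        # Hedged sketch of two common hydrological skill metrics, applied to
        # paired model/ADCP velocity samples at matching points.
        import numpy as np

        def nash_sutcliffe(observed: np.ndarray, modeled: np.ndarray) -> float:
            """1.0 is a perfect match; 0.0 means no more skill than the observed mean."""
            return float(1.0 - np.sum((observed - modeled) ** 2)
                         / np.sum((observed - observed.mean()) ** 2))

        def rmse(observed: np.ndarray, modeled: np.ndarray) -> float:
            return float(np.sqrt(np.mean((observed - modeled) ** 2)))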

  6. Une nouvelle voie pour la conception des implants intervertébraux

    NASA Astrophysics Data System (ADS)

    Gradel, T.; Tabourot, L.; Arrieux, R.; Balland, P.

    2002-12-01

    The objective of our work is to design a new generation of interbody implants that adapt perfectly to the geometry of the vertebral endplates by deforming. To this end, we used a new approach that consists of fully simulating the manufacturing process, in this case deep drawing. By preserving the history of the loads applied to the material during forming, this simulation makes it possible to validate the implant's mechanical strength at the end of the cycle very precisely. In this study, we conducted two so-called "cooperative" analyses in parallel: one based on a phenomenological HILL-type model, the other on a multi-scale model accounting for more physical phenomena, in order to gain a good understanding of the material's behavior during deformation. We chose T40 (pure titanium) as the material for its good strength, its biocompatibility, and its radiological properties.

  7. Using the split Hopkinson pressure bar to validate material models.

    PubMed

    Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

    2014-08-28

    This paper gives a discussion of the use of the split Hopkinson pressure bar with particular reference to the requirements of materials modelling at QinetiQ. The aim is to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock wave loading. This means understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress-strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself also a valuable validation test of the material model. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
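
    To make the "classical analysis" referred to above concrete, the textbook one-dimensional SHPB data reduction is sketched below; this is not QinetiQ's validation method, and, as the abstract notes, its assumptions are generally violated for non-metals:

        # Classical one-wave SHPB reduction (textbook formulas, shown for context).
        import numpy as np

        def shpb_reduce(eps_r, eps_t, dt, c0, L_s, E_bar, A_bar, A_s):
            """eps_r, eps_t: reflected/transmitted bar strain signals (arrays);
            dt: sample interval (s); c0: bar wave speed (m/s); L_s: specimen
            length (m); E_bar, A_bar: bar modulus (Pa) and area (m^2);
            A_s: specimen area (m^2). Returns specimen strain and stress."""
            strain_rate = -2.0 * c0 * np.asarray(eps_r) / L_s
            strain = np.cumsum(strain_rate) * dt                 # integrate rate
            stress = E_bar * (A_bar / A_s) * np.asarray(eps_t)   # one-wave stress
            return strain, stress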

  8. Construct validity of the ovine model in endoscopic sinus surgery training.

    PubMed

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores (P < .001). Experience of the intermediate group was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  9. Community-wide validation of geospace model local K-index predictions to support model transition to operations

    NASA Astrophysics Data System (ADS)

    Glocer, A.; Rastätter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; Weigel, R. S.; McCollough, J.; Wing, S.

    2016-07-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.
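
    A sketch of common contingency-table skill scores for event forecasts such as local K-index threshold crossings; the definitions below are the generic ones, and the study's exact metric choices may differ:

        # Generic contingency-table metrics for dichotomous event forecasts.
        def skill_scores(hits, false_alarms, misses, correct_negatives):
            a, b, c, d = hits, false_alarms, misses, correct_negatives
            pod = a / (a + c)                    # probability of detection
            far = b / (a + b)                    # false alarm ratio
            hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
            return pod, far, hss                 # hss: Heidke skill score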

  10. Community-Wide Validation of Geospace Model Local K-Index Predictions to Support Model Transition to Operations

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Rastaetter, L.; Kuznetsova, M.; Pulkkinen, A.; Singer, H. J.; Balch, C.; Weimer, D.; Welling, D.; Wiltberger, M.; Raeder, J.; et al.

    2016-01-01

    We present the latest result of a community-wide space weather model validation effort coordinated among the Community Coordinated Modeling Center (CCMC), NOAA Space Weather Prediction Center (SWPC), model developers, and the broader science community. Validation of geospace models is a critical activity for both building confidence in the science results produced by the models and in assessing the suitability of the models for transition to operations. Indeed, a primary motivation of this work is supporting NOAA/SWPC's effort to select a model or models to be transitioned into operations. Our validation efforts focus on the ability of the models to reproduce a regional index of geomagnetic disturbance, the local K-index. Our analysis includes six events representing a range of geomagnetic activity conditions and six geomagnetic observatories representing midlatitude and high-latitude locations. Contingency tables, skill scores, and distribution metrics are used for the quantitative analysis of model performance. We consider model performance on an event-by-event basis, aggregated over events, at specific station locations, and separated into high-latitude and midlatitude domains. A summary of results is presented in this report, and an online tool for detailed analysis is available at the CCMC.

  11. Making Validated Educational Models Central in Preschool Standards.

    ERIC Educational Resources Information Center

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  12. Web Based Semi-automatic Scientific Validation of Models of the Corona and Inner Heliosphere

    NASA Astrophysics Data System (ADS)

    MacNeice, P. J.; Chulaki, A.; Taktakishvili, A.; Kuznetsova, M. M.

    2013-12-01

    Validation is a critical step in preparing models of the corona and inner heliosphere for future roles supporting either or both the scientific research community and the operational space weather forecasting community. Validation of forecasting quality tends to focus on a short list of key features in the model solutions, with an unchanging order of priority. Scientific validation exposes a much larger range of physical processes and features, and as the models evolve to better represent features of interest, the research community tends to shift its focus to other areas which are less well understood and modeled. Given the more comprehensive and dynamic nature of scientific validation, and the limited resources available to the community to pursue this, it is imperative that the community establish a semi-automated process which engages the model developers directly in an ongoing and evolving validation process. In this presentation we describe the ongoing design and development of a web-based facility to enable this type of validation of models of the corona and inner heliosphere, its application to the growing list of model results being generated, and the strategies we have been developing to account for model results that incorporate adaptively refined numerical grids.

  13. Stratégies facilitant les tests en pré-certification pour la robustesse à l'égard des radiations

    NASA Astrophysics Data System (ADS)

    Souari, Anis

    … permitting a realistic evaluation of the sensitivity of integrated circuits to radiation effects, in order to avoid sending non-robust circuits to the very costly certification phase. The circuits targeted by our work are SRAM-based field-programmable gate arrays (FPGAs), and the targeted radiation-induced fault type is the single event upset (SEU), in which a memory element flips from one logic state to its complement. SRAM-based FPGAs are increasingly in demand in the aerospace community thanks to their rapid prototyping and in-field reconfiguration capabilities, but they are vulnerable to radiation, with SEUs being the most frequent faults in SRAM-type memory elements. We propose a new emulation-based fault injection approach that mimics the effects of radiation on the FPGA configuration memory and generates results as faithful as possible to certification test results. This approach incorporates, into the test sequence generation procedure, the difference in sensitivity of configuration memory elements in the '1' state and the '0' state, observed in accelerated proton-beam tests at the renowned TRIUMF laboratory, in order to mimic the distribution of faults in the configuration memory. Validation experiments show that the proposed strategy is effective and generates realistic results. These results reveal that ignoring the sensitivity difference can lead to underestimating circuit sensitivity to radiation. With the same aim of optimizing the emulation-based fault injection procedure, namely pre-certification testing, we propose

  14. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    NASA Astrophysics Data System (ADS)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited local storage is one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important, and one way to maintain them is to use cryptographic techniques. The Data Encryption Standard (DES) is a block cipher algorithm used as a standard symmetric encryption algorithm. Here DES produces 8 cipher blocks that are combined into one ciphertext, but that ciphertext is weak against brute-force attacks. Therefore, the 8 cipher blocks are hidden in 8 random images using the Least Significant Bit (LSB) algorithm, from which the DES cipher results are later extracted and merged back into one.
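
    A sketch of the generic LSB embedding step in Python with NumPy (an assumed toolchain); the paper's exact mapping of the 8 DES cipher blocks onto 8 images is not specified, so this shows the technique on a single 8-bit grayscale cover image only:

        # Hedged sketch: hide ciphertext bits in the least significant bit
        # of pixel values, then recover them.
        import numpy as np

        def lsb_embed(cover: np.ndarray, payload: bytes) -> np.ndarray:
            """cover: uint8 image array; returns a stego copy with payload bits
            written into the LSB of the first len(payload)*8 pixels."""
            bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
            flat = cover.flatten()                  # flatten() returns a copy
            if bits.size > flat.size:
                raise ValueError("payload too large for cover image")
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
            return flat.reshape(cover.shape)

        def lsb_extract(stego: np.ndarray, n_bytes: int) -> bytes:
            bits = stego.flatten()[:n_bytes * 8] & 1
            return np.packbits(bits).tobytes()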

  15. Evaluation d’une grille de supervision des laboratoires des leishmanioses cutanées au Maroc

    PubMed Central

    El Mansouri, Bouchra; Amarir, Fatima; Hajli, Yamina; Fellah, Hajiba; Sebti, Faiza; Delouane, Bouchra; Sadak, Abderrahim; Adlaoui, El Bachir; Rhajaoui, Mohammed

    2017-01-01

    Introduction To evaluate a standardized control grid for laboratories diagnosing leishmaniasis, as a new supervision tool. Methods A pilot trial was conducted in seven provincial laboratories, in four provinces of Morocco, following the evolution of their performance every two years between 2006 and 2014. This study details the situation of the provincial laboratories before and after implementation of the supervision grid. In total, twenty-one grids were analyzed. Results In 2006, the results clearly showed inadequate laboratory performance: training needs (41.6%), staff performing skin sampling (25%), shortages of equipment and reagents (65%), non-compliant document and facility management (85%). Various corrective actions were carried out by the National Reference Laboratory for Leishmaniasis (LNRL) during the study period. In 2014, the LNRL recorded a marked improvement in laboratory performance. The needs in training, sampling quality, and provision of equipment and reagents had been met, and effective coordination was established between the LNRL and the provincial laboratories. Conclusion This demonstrates the effectiveness of the grid as a high-quality supervision tool, and as a cornerstone of any progress to be achieved in leishmaniasis control programs. PMID:29187922

  16. Identification et prise en charge des femmes ayant des antécédents familiaux de cancer du sein

    PubMed Central

    Heisey, Ruth; Carroll, June C.

    2016-01-01

    Abstract Objective To summarize the best evidence on strategies for identifying and managing women with a family history of breast cancer. Sources of information PubMed was searched for articles published between 2000 and 2016 using the following English keywords: breast cancer, guidelines, risk, family history, management, and magnetic resonance imaging screening. Most of the evidence is level II. Main message A good family history is essential when assessing breast cancer risk, in order to identify women who are candidates for referral to genetic counselling for possible genetic testing. Lives can be saved by offering women who carry a BRCA gene mutation risk-reducing surgery (prophylactic bilateral mastectomy, bilateral salpingo-oophorectomy). All women with a family history of breast cancer should be encouraged to stay active and to limit their alcohol consumption to less than 1 drink per day; some women are eligible for chemoprevention. Women whose lifetime risk of breast cancer is 20% to 25% or more should be offered enhanced screening with magnetic resonance imaging in addition to mammography. Conclusion Healthy living and chemoprevention (in candidates) could reduce the incidence of breast cancer; enhanced screening could lead to earlier detection. Referring women who carry a BRCA mutation for risk-reducing surgery saves lives. PMID:27737991

  17. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, depending on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive
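
    The deviation between an expected and an observed epidemic curve can be quantified in several ways; the sketch below computes a relative error in total intensity and an offset in peak week, two metrics consistent with, but not necessarily identical to, those used in the study:

        # Hedged sketch: error metrics between weekly epidemic curves.
        import numpy as np

        def forecast_errors(observed_weekly, predicted_weekly):
            obs = np.asarray(observed_weekly, dtype=float)
            pred = np.asarray(predicted_weekly, dtype=float)
            intensity_error = (pred.sum() - obs.sum()) / obs.sum()   # relative
            peak_week_offset = int(np.argmax(pred)) - int(np.argmax(obs))
            return intensity_error, peak_week_offset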

  18. Note des Éditeurs scientifiques

    NASA Astrophysics Data System (ADS)

    Averbuch, P.

    This series of articles is a review of experimental results on various molecular "fluids" in which cohesion is due to Van der Waals forces and hydrogen bonds, water being one such fluid. These results are presented so as to justify experimentally an original, non-extensive model of the properties of these fluids, and the whole takes the form of three articles describing the model, each followed by an article comparing it with the experimental results published by numerous authors. The non-extensive character of the physical properties of fluids is shocking and contrary to many established ideas; it seems to have only one argument in its favor: comparison with a number of experimental results large enough that the effect of chance can hardly be suspected. In particular, the discrepancies between measurement results obtained by different authors under different conditions are explained, and the seriousness and competence of the various experimenters are no longer called into question: only the interpretation of these results with an unsuitable extensive model is at issue. Since extensive models are used systematically, beyond physicists' experiments, in engineering calculations and in the modeling of working devices and of natural phenomena observed by everyone, it was necessary to explain why extensivity could be abandoned. The reasons for the practical success of extensive models are given, first in the case of nematics, then in that of ordinary liquids, and this is what makes the whole coherent, both with fine physical measurements and with everyday observations. The fact remains that if the interpretation given in this series of articles can be generalized, a theoretical justification of the model used becomes necessary. As for the properties of

  19. Applications of Composite Materials in Helicopter Construction (Les Applications des Materiaux Composite dans la Construction des Helicopteres),

    DTIC Science & Technology

    1983-11-21

    Translation title: Applications of Composite Materials in Helicopter Construction (Les Applications des Matériaux Composites dans la Construction des Hélicoptères). Beziac, Gilbert; International Symposium on Design and Use of Kevlar in Aircraft, Geneva, 12 October 1982.

  20. Forward ultrasonic model validation using wavefield imaging methods

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.

    2018-04-01

    The validation of forward ultrasonic wave propagation models in a complex titanium polycrystalline material system is accomplished using wavefield imaging methods. An innovative measurement approach is described that permits the visualization and quantitative evaluation of bulk elastic wave propagation and scattering behaviors in the titanium material for a typical focused immersion ultrasound measurement process. Results are provided for the determination and direct comparison of the ultrasonic beam's focal properties, mode-converted shear wave position and angle, and scattering and reflection from millimeter-sized microtexture regions (MTRs) within the titanium material. The approach and results are important with respect to understanding the root-cause backscatter signal responses generated in aerospace engine materials, where model-assisted methods are being used to understand the probabilistic nature of the backscatter signal content. Wavefield imaging methods are shown to be an effective means for corroborating and validating important forward model predictions in a direct manner using time- and spatially-resolved displacement field amplitude measurements.

  1. Kontinuierliche Wanddickenbestimmung und Visualisierung des linken Herzventrikels

    NASA Astrophysics Data System (ADS)

    Dornheim, Lars; Hahn, Peter; Oeltze, Steffen; Preim, Bernhard; Tönnies, Klaus D.

    To assess defects in cardiac function, the change in wall thickness of the left ventricle can be measured in temporal MRI acquisition sequences. At present, this assessment generally relies only on laboriously hand-crafted segmentations of the end-systole and end-diastole. We present a method for determining the wall thickness of the left ventricle and its change that is automatic except for the initialization of a starting point, based on a complete segmentation of the heart wall in all time steps by a dynamic three-dimensional shape model (stable spring-mass model, "Stabiles Feder-Masse-Modell"). In addition to the gray-value information of one time step, this model also uses the segmentations of the other time steps during segmentation, and it is constructed so that wall thicknesses can be measured and visualized directly. In this way, the local wall-thickness extrema are detected over the entire acquisition period, even when they do not fall at end-systole or end-diastole. The method was evaluated on six 4D cardiac MRI data sets and proved very robust with respect to the only interaction required.

  2. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
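
    A hedged sketch of the Monte Carlo step: uncertain inputs are propagated through a generic attenuation-factor relation (C_indoor = alpha x C_source) to yield a distribution of indoor air concentration. The distribution shapes and parameters below are placeholders, not values from the study:

        # Hedged Monte Carlo sketch of indoor air concentration uncertainty.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        c_source = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)   # ug/m3
        alpha = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)       # attenuation

        c_indoor = alpha * c_source
        p5, p50, p95 = np.percentile(c_indoor, [5, 50, 95])
        print(f"indoor concentration (ug/m3): median {p50:.3g}, "
              f"90% interval [{p5:.3g}, {p95:.3g}]")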

  3. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
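
    The comparison step can be sketched as follows; model_fn and software_fn are hypothetical stand-ins for the animated formal model (e.g., evaluated through PVSio) and the compiled implementation:

        # Hedged sketch: run both evaluators on random inputs and flag
        # outputs that differ beyond a tolerance.
        import math
        import random

        def disagreements(model_fn, software_fn, n_cases=1000, tol=1e-9):
            bad = []
            for _ in range(n_cases):
                x = random.uniform(-1e3, 1e3)        # randomly generated test case
                if not math.isclose(model_fn(x), software_fn(x),
                                    rel_tol=0.0, abs_tol=tol):
                    bad.append(x)
            return bad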

  4. SIX1 oncoprotein is necessary for abnormal uterine basal cell development in mice exposed neonatally to DES

    EPA Science Inventory

    In a classical model of latent hormonal carcinogenesis, exposing female mice on neonatal days 1-5 to the synthetic estrogen diethylstilbestrol (DES; 1 mg/kg/day) results in high incidence of uterine carcinoma. However, the biological mechanisms driving DES-induced carcinogenesis ...

  5. Calibration of Predictor Models Using Multiple Validation Experiments

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.
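
    In its simplest form, an Interval Predictor Model with a fixed-width linear band can be computed by linear programming: minimize the band width subject to every observation lying inside the band. The sketch below is a deliberate simplification of the IPM formulations in the paper, with SciPy assumed:

        # Hedged sketch: minimal-spread band a*x + b +/- w/2 containing all data.
        import numpy as np
        from scipy.optimize import linprog

        def fit_ipm(x: np.ndarray, y: np.ndarray):
            n = x.size
            c = [0.0, 0.0, 1.0]                  # variables [a, b, w]; minimize w
            ones = np.ones((n, 1))
            A_ub = np.block([[-x[:, None], -ones, -0.5 * ones],   # y <= a*x + b + w/2
                             [ x[:, None],  ones, -0.5 * ones]])  # a*x + b - w/2 <= y
            b_ub = np.concatenate([-y, y])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None), (None, None), (0.0, None)])
            a, b, w = res.x
            return a, b, w                       # band a*x + b +/- w/2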

  6. Validation of the measure automobile emissions model : a statistical analysis

    DOT National Transportation Integrated Search

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  7. Étude des perturbations conduites et rayonnées dans une cellule de commutation

    NASA Astrophysics Data System (ADS)

    Costa, F.; Forest, F.; Puzo, A.; Rojat, G.

    1993-12-01

    The principles used in static power conversion and the rising performance of new switching devices contribute to increasing the level of electromagnetic noise emitted by electronic converters. We studied how these perturbations are generated by a switching cell and coupled to its environment in both conducted and radiated modes. The cell can operate in hard-switching, zero-current-switching (ZCS), or zero-voltage-switching (ZVS) mode. We first outline the general problems of electromagnetic pollution in converters and its metrology, and then describe the experimental environment. We analyse the mechanisms by which parasitic signals are generated in a switching cell in relation to the electrical stresses and the switching mode. Simulated results, derived from the analytical models obtained, are compared with the experimental ones. We then present a method for calculating the E and H near fields analytically, which has been confirmed by experimental results. Finally, we present, in a synthetic manner, the main results obtained, relative to the switching mode and the electrical stresses, using a new characterization method. These results will allow designers to incorporate electromagnetic considerations into the design of a converter.

  8. Étude spectroscopique des collisions moléculaires (hydrogène-azote et hydrogène-oxygène) à des énergies de quelques MeV

    NASA Astrophysics Data System (ADS)

    Plante, Jacinthe

    1998-09-01

    The results presented here come from a systematic study of constant-velocity collisions between hydrogen projectiles (H+, H2+, and H3+ at 1 MeV/nucleon) and two gaseous targets (N2 and O2) at various pressures. The collisions are analyzed using emission spectra (from 400 Å to 6650 Å) and intensity/pressure plots. The spectra revealed lines of atomic nitrogen, molecular nitrogen, atomic oxygen, molecular oxygen, and atomic hydrogen. The hydrogen lines are observed only with the H2+ and H3+ projectiles, so the processes responsible for forming these lines are projectile fragmentation mechanisms. In conclusion, there is a notable difference between the projectiles and between the different pressures: the nitrogen and oxygen lines increase with pressure, while the atomic hydrogen lines show a nonlinear relationship with pressure.

  9. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  10. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  11. DES Science Portal: II- Creating Science-Ready Catalogs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fausti Neto, Angelo; et al.

    We present a novel approach for creating science-ready catalogs through a software infrastructure developed for the Dark Energy Survey (DES). We integrate the data products released by the DES Data Management and additional products created by the DES collaboration in an environment known as DES Science Portal. Each step involved in the creation of a science-ready catalog is recorded in a relational database and can be recovered at any time. We describe how the DES Science Portal automates the creation and characterization of lightweight catalogs for DES Year 1 Annual Release, and show its flexibility in creating multiple catalogs with different inputs and configurations. Finally, we discuss the advantages of this infrastructure for large surveys such as DES and the Large Synoptic Survey Telescope. The capability of creating science-ready catalogs efficiently and with full control of the inputs and configurations used is an important asset for supporting science analysis using data from large astronomical surveys.

  12. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  14. The Space Weather Modeling Framework (SWMF): Models and Validation

    NASA Astrophysics Data System (ADS)

    Gombosi, Tamas; Toth, Gabor; Sokolov, Igor; de Zeeuw, Darren; van der Holst, Bart; Ridley, Aaron; Manchester, Ward, IV

    In the last decade our group at the Center for Space Environment Modeling (CSEM) has developed the Space Weather Modeling Framework (SWMF), which efficiently couples together different models describing the interacting regions of the space environment. Many of these domain models (such as the global solar corona, the inner heliosphere, or the global magnetosphere) are based on MHD and are represented by our multiphysics code, BATS-R-US. SWMF is a powerful tool for coupling regional models describing the space environment from the solar photosphere to the bottom of the ionosphere. Presently, SWMF contains over a dozen components: the solar corona (SC), eruptive event generator (EE), inner heliosphere (IH), outer heliosphere (OH), solar energetic particles (SP), global magnetosphere (GM), inner magnetosphere (IM), radiation belts (RB), plasmasphere (PS), ionospheric electrodynamics (IE), polar wind (PW), upper atmosphere (UA) and lower atmosphere (LA). This talk will present an overview of SWMF, new results obtained with improved physics, as well as some validation studies.

  15. Using the verona coding definitions of emotional sequences (VR-CoDES) and health provider responses (VR-CoDES-P) in the dental context.

    PubMed

    Wright, Alice; Humphris, Gerry; Wanyonyi, Kristina L; Freeman, Ruth

    2012-10-01

    To determine whether cues, concerns and provider responses (as defined in the VR-CoDES and VR-CoDES-P manuals) are present in a dental context, whether they can be reliably coded, and whether additional guidance is required for their adoption. Thirteen patients in a dental practice setting were videoed in routine treatment sessions with either their dentist or hygienist and a dental nurse present. All utterances were coded using the Verona systems: the VR-CoDES and the VR-CoDES-P. Rates of cues, concerns and provider responses were described and reliability was tested. The VR-CoDES and VR-CoDES-P were successfully applied in the dental context. The intra-rater ICCs for the detection of cues and concerns and provider response were acceptable and above 0.75. A similar satisfactory result was found for the inter-rater reliability. The VR-CoDES and the VR-CoDES-P are applicable in the dental setting with minor supporting guidelines and show evidence of reliable coding. The VR-CoDES and the VR-CoDES-P may be helpful tools for analysing patient cues and concerns and the dental professionals' responses in the dental context. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
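
    The reliability figures quoted above are intraclass correlation coefficients. As a worked illustration, a two-way random-effects, absolute-agreement, single-rater ICC(2,1) can be computed from the standard ANOVA decomposition; the ratings below are invented:

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1): two-way random effects, absolute agreement, single rater.
        x: (n_subjects, k_raters) array of ratings."""
        n, k = x.shape
        grand = x.mean()
        ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        ss_err = ((x - x.mean(axis=1, keepdims=True)
                     - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                     + k * (ms_cols - ms_err) / n)

    # Hypothetical cue counts coded twice by the same rater (intra-rater check).
    ratings = np.array([[3, 3], [1, 2], [4, 4], [0, 0], [2, 2], [5, 4]])
    print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")  # above 0.75 = acceptable here
    ```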

  16. CFD Modeling Needs and What Makes a Good Supersonic Combustion Validation Experiment

    NASA Technical Reports Server (NTRS)

    Gaffney, Richard L., Jr.; Cutler, Andrew D.

    2005-01-01

    If a CFD code/model developer is asked what experimental data he wants to validate his code or numerical model, his answer will be: "Everything, everywhere, at all times." Since this is not possible, practical, or even reasonable, the developer must understand what can be measured within the limits imposed by the test article, the test location, the test environment and the available diagnostic equipment. At the same time, it is important for the experimentalist/diagnostician to understand what the CFD developer needs (as opposed to wants) in order to conduct a useful CFD validation experiment. If these needs are not known, it is possible to neglect easily measured quantities at locations needed by the developer, rendering the data set useless for validation purposes. It is also important for the experimentalist/diagnostician to understand what the developer is trying to validate so that the experiment can be designed to isolate (as much as possible) the effects of a particular physical phenomenon that is associated with the model to be validated. The probability of a successful validation experiment can be greatly increased if the two groups work together, each understanding the needs and limitations of the other.

  17. Reticulation des fibres lignocellulosiques

    NASA Astrophysics Data System (ADS)

    Landrevy, Christel

    The paper industry is developing value-added papers to cope with the economic crisis. The goal of this project is to improve current techniques for cross-linking the lignocellulosic fibres of paper pulp in order to produce a stronger paper. During traditional cross-linking reactions, many intra-fibre bonds form, which works against the anticipated improvement in the physical properties of the paper or material produced. To avoid forming these intra-fibre bonds, groups that cannot react with one another must be grafted onto the fibres. Cross-linking the fibres through a "click chemistry" reaction, the copper-catalysed Huisgen cycloaddition between an azide and a terminal alkyne (CuAAC), was one of the solutions found to address this problem. Moreover, adapting this reaction to aqueous media could favour its industrial use. The study undertaken in this project aims to optimize the CuAAC reaction and the intermediate reactions (propargylation, tosylation and azidation) on kraft pulp in aqueous media. To this end, the reactions were first adapted to aqueous media on microcrystalline cellulose to verify their feasibility, then transferred to kraft pulp, and the influence of parameters such as reaction time and the amount of reagents was studied. Second, the properties conferred on the paper by the reactions were assessed through a series of optical and physical paper tests. Keywords: click chemistry, Huisgen, CuAAC, propargylation, tosylation, azidation, cellulose, kraft pulp, aqueous medium, paper.

  18. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    NASA Astrophysics Data System (ADS)

    Davis, C.; Rozo, E.; Roodman, A.; Alarcon, A.; Cawthon, R.; Gatti, M.; Lin, H.; Miquel, R.; Rykoff, E. S.; Troxel, M. A.; Vielzeuf, P.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Carnero Rosell, A.; Carrasco Kind, M.; Carretero, J.; Castander, F. J.; Crocce, M.; Cunha, C. E.; D'Andrea, C. B.; da Costa, L. N.; Desai, S.; Diehl, H. T.; Doel, P.; Drlica-Wagner, A.; Fausti Neto, A.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gaztanaga, E.; Gerdes, D. W.; Giannantonio, T.; Gruen, D.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Jain, B.; James, D. J.; Jeltema, T.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Martini, P.; Melchior, P.; Ogando, R. L. C.; Plazas, A. A.; Romer, A. K.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla-Noarbe, I.; Smith, M.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Vikram, V.; Walker, A. R.; Wechsler, R. H.

    2018-06-01

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogues with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of Δz ˜ ±0.01. We forecast that our proposal can, in principle, control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Our results provide strong motivation to launch a programme to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.
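
    The recovery step at the heart of this technique can be sketched as follows: given measured angular cross-correlation amplitudes of the unknown sample with narrow reference redshift slices, and the reference autocorrelation to divide out the reference bias, the redshift distribution follows up to the unknown sample's own (assumed slowly varying) bias. The arrays below are placeholders, and this one-line estimator ignores the bias-evolution marginalization the paper performs:

    ```python
    import numpy as np

    # Sketch of the clustering-redshift recovery step: w_ur(z_i) is the
    # measured cross-correlation amplitude of the unknown sample with narrow
    # reference slices, w_rr(z_i) the reference autocorrelation used to
    # remove the reference bias. Inputs are made-up placeholder arrays.
    z = np.linspace(0.1, 0.9, 9)                                     # slice centres
    w_ur = np.array([0.8, 1.5, 2.4, 3.0, 2.6, 1.7, 0.9, 0.4, 0.2])   # measured
    w_rr = np.array([4.0, 3.6, 3.2, 3.0, 2.9, 2.8, 2.8, 2.7, 2.7])   # measured

    n_z = w_ur / np.sqrt(w_rr)     # proportional to n_u(z) * b_u(z)
    n_z /= np.trapz(n_z, z)        # normalize to unit integral

    mean_z = np.trapz(z * n_z, z)
    print(f"estimated mean redshift: {mean_z:.3f}")
    ```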

  19. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE PAGES

    Davis, C.; Rozo, E.; Roodman, A.; ...

    2018-03-26

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of $$\Delta z \sim \pm 0.01$$. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  20. Cross-correlation redshift calibration without spectroscopic calibration samples in DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, C.; Rozo, E.; Roodman, A.

    Galaxy cross-correlations with high-fidelity redshift samples hold the potential to precisely calibrate systematic photometric redshift uncertainties arising from the unavailability of complete and representative training and validation samples of galaxies. However, application of this technique in the Dark Energy Survey (DES) is hampered by the relatively low number density, small area, and modest redshift overlap between photometric and spectroscopic samples. We propose instead using photometric catalogs with reliable photometric redshifts for photo-z calibration via cross-correlations. We verify the viability of our proposal using redMaPPer clusters from the Sloan Digital Sky Survey (SDSS) to successfully recover the redshift distribution of SDSS spectroscopic galaxies. We demonstrate how to combine photo-z with cross-correlation data to calibrate photometric redshift biases while marginalizing over possible clustering bias evolution in either the calibration or unknown photometric samples. We apply our method to DES Science Verification (DES SV) data in order to constrain the photometric redshift distribution of a galaxy sample selected for weak lensing studies, constraining the mean of the tomographic redshift distributions to a statistical uncertainty of $$\Delta z \sim \pm 0.01$$. We forecast that our proposal can in principle control photometric redshift uncertainties in DES weak lensing experiments at a level near the intrinsic statistical noise of the experiment over the range of redshifts where redMaPPer clusters are available. Here, our results provide strong motivation to launch a program to fully characterize the systematic errors from bias evolution and photo-z shapes in our calibration procedure.

  1. DES13S2cmm: The first superluminous supernova from the Dark Energy Survey

    DOE PAGES

    Papadopoulos, A.; Plazas, A. A.; D'Andrea, C. B.; ...

    2015-03-23

    We present DES13S2cmm, the first spectroscopically-confirmed superluminous supernova (SLSN) from the Dark Energy Survey (DES). We briefly discuss the data and search algorithm used to find this event in the first year of DES operations, and outline the spectroscopic data obtained from the European Southern Observatory (ESO) Very Large Telescope to confirm its redshift (z = 0.663 ± 0.001 based on the host-galaxy emission lines) and likely spectral type (type I). Using this redshift, we find $$M_U^{\rm peak} = -21.05^{+0.10}_{-0.09}$$ for the peak, rest-frame U-band absolute magnitude, and find DES13S2cmm to be located in a faint, low-metallicity (sub-solar), low stellar-mass host galaxy (log(M/M⊙) = 9.3 ± 0.3), consistent with what is seen for other SLSNe-I. We compare the bolometric light curve of DES13S2cmm to fourteen similarly well-observed SLSNe-I in the literature and find it possesses one of the slowest declining tails (beyond +30 days rest frame past peak), and is the faintest at peak. Moreover, we find the bolometric light curves of all SLSNe-I studied herein possess a dispersion of only 0.2–0.3 magnitudes between +25 and +30 days after peak (rest frame) depending on redshift range studied; this could be important for ‘standardising’ such supernovae, as is done with the more common type Ia. We fit the bolometric light curve of DES13S2cmm with two competing models for SLSNe-I – the radioactive decay of ⁵⁶Ni, and a magnetar – and find that while the magnetar is formally a better fit, neither model provides a compelling match to the data. Although we are unable to conclusively differentiate between these two physical models for this particular SLSN-I, further DES observations of more SLSNe-I should break this degeneracy, especially if the light curves of SLSNe-I can be observed beyond 100 days in the rest frame of the supernova.
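
    Of the two competing models mentioned, the magnetar input luminosity has a simple closed form, L(t) = (E_p/t_p)/(1 + t/t_p)², for dipole spin-down. A minimal least-squares fit is sketched below; the light-curve points are invented, and a real SLSN fit would diffuse this input power through the ejecta (an Arnett-type convolution) rather than fit it directly:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def magnetar_input(t, E_p, t_p):
        """Magnetar dipole spin-down power: L(t) = (E_p/t_p) / (1 + t/t_p)^2.
        A full SLSN model would diffuse this through the ejecta; that step
        is omitted in this sketch."""
        return (E_p / t_p) / (1.0 + t / t_p) ** 2

    # Hypothetical bolometric light curve: rest-frame days, luminosity in erg/s.
    t_obs = np.array([10, 20, 30, 45, 60, 80, 100], dtype=float) * 86400.0
    L_obs = np.array([8.0, 5.5, 3.8, 2.2, 1.4, 0.8, 0.5]) * 1e43

    popt, pcov = curve_fit(magnetar_input, t_obs, L_obs,
                           p0=(1e51, 30 * 86400.0), maxfev=10000)
    E_p, t_p = popt
    print(f"E_p = {E_p:.2e} erg, t_p = {t_p / 86400:.1f} d")
    ```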

  2. A Validated Open-Source Multisolver Fourth-Generation Composite Femur Model.

    PubMed

    MacLeod, Alisdair R; Rose, Hannah; Gill, Harinderjit S

    2016-12-01

    Synthetic biomechanical test specimens are frequently used for preclinical evaluation of implant performance, often in combination with numerical modeling, such as finite-element (FE) analysis. Commercial and freely available FE packages are widely used with three FE packages in particular gaining popularity: abaqus (Dassault Systèmes, Johnston, RI), ansys (ANSYS, Inc., Canonsburg, PA), and febio (University of Utah, Salt Lake City, UT). To the best of our knowledge, no study has yet made a comparison of these three commonly used solvers. Additionally, despite the femur being the most extensively studied bone in the body, no freely available validated model exists. The primary aim of the study was to conduct a comparison of mesh convergence and strain prediction between the three solvers (abaqus, ansys, and febio) and to provide validated open-source models of a fourth-generation composite femur for use with all the three FE packages. Second, we evaluated the geometric variability around the femoral neck region of the composite femurs. Experimental testing was conducted using fourth-generation Sawbones® composite femurs instrumented with strain gauges at four locations. A generic FE model and four specimen-specific FE models were created from CT scans. The study found that the three solvers produced excellent agreement, with strain predictions being within an average of 3.0% for all the solvers (r2 > 0.99) and 1.4% for the two commercial codes. The average of the root mean squared error against the experimental results was 134.5% (r2 = 0.29) for the generic model and 13.8% (r2 = 0.96) for the specimen-specific models. It was found that composite femurs had variations in cortical thickness around the neck of the femur of up to 48.4%. For the first time, an experimentally validated, finite-element model of the femur is presented for use in three solvers. This model is freely available online along with all the supporting validation data.
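
    The agreement metrics used above can be reproduced in a few lines. The strain values below are invented, and the percentage RMSE shown is one plausible definition (RMSE relative to the mean absolute measured strain); the paper's exact normalization may differ:

    ```python
    import numpy as np

    # Agreement between FE-predicted and gauge-measured strains (made-up data).
    measured  = np.array([812.0, -645.0, 431.0, -1290.0])   # microstrain
    predicted = np.array([830.0, -661.0, 418.0, -1255.0])

    rmse_pct = (100.0 * np.sqrt(np.mean((predicted - measured) ** 2))
                / np.mean(np.abs(measured)))
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    print(f"RMSE = {rmse_pct:.1f}% of mean |strain|, r^2 = {r2:.3f}")
    ```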

  3. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    PubMed

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  4. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
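
    The kinematic overland flow equations at the core of KINEROS can be illustrated with a minimal explicit upwind scheme for ∂h/∂t + ∂q/∂x = r with q = αh^m (Manning: α = √S₀/n, m = 5/3). This is an illustration of the governing equations only, not KINEROS2's actual numerics, and all parameter values are invented:

    ```python
    import numpy as np

    # Minimal explicit kinematic-wave overland flow solver (upwind finite
    # difference). Illustrative only; not KINEROS2's actual scheme.
    L, nx = 100.0, 100           # plane length (m), number of cells
    dx = L / nx
    dt = 1.0                     # s (satisfies CFL for these parameters)
    S0, n_man = 0.01, 0.05       # slope (-), Manning roughness
    alpha, m = np.sqrt(S0) / n_man, 5.0 / 3.0
    rain = 50e-3 / 3600.0        # rainfall excess: 50 mm/h in m/s

    h = np.zeros(nx)             # flow depth (m)
    for step in range(600):      # simulate 10 minutes
        q = alpha * h ** m       # unit-width discharge, q = alpha * h^m
        dq_dx = np.empty(nx)
        dq_dx[0] = q[0] / dx     # upstream boundary: no inflow
        dq_dx[1:] = (q[1:] - q[:-1]) / dx
        h = np.maximum(h + dt * (rain - dq_dx), 0.0)

    print(f"outlet unit discharge after 10 min: {alpha * h[-1] ** m:.2e} m^2/s")
    ```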

  5. KINEROS2/AGWA: Model use, calibration and validation

    USGS Publications Warehouse

    Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.

    2012-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  6. Validation of elk resource selection models with spatially independent data

    Treesearch

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  7. A validated approach for modeling collapse of steel structures

    NASA Astrophysics Data System (ADS)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are
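
    A minimal sketch of the triaxiality bookkeeping described here: the stress triaxiality η = σ_mean/σ_vM is computed from the stress tensor, a fracture strain decreasing with η is assumed (the exponential form and its constants below are hypothetical, Johnson-Cook-like), and damage accumulates with plastic strain until the element is flagged for deletion:

    ```python
    import numpy as np

    def triaxiality(sig):
        """Stress triaxiality eta = sigma_mean / sigma_vonMises.
        sig: 3x3 Cauchy stress tensor."""
        mean = np.trace(sig) / 3.0
        dev = sig - mean * np.eye(3)
        vm = np.sqrt(1.5 * np.tensordot(dev, dev))  # von Mises stress
        return mean / vm

    def fracture_strain(eta, c1=1.6, c2=1.5):
        # Hypothetical Johnson-Cook-like dependence on triaxiality.
        return c1 * np.exp(-c2 * eta)

    sig = np.array([[400.0,  50.0,   0.0],
                    [ 50.0, 250.0,   0.0],
                    [  0.0,   0.0, 150.0]])   # MPa, example stress state
    eta = triaxiality(sig)

    damage, deps = 0.0, 1e-3                  # plastic strain increment per step
    while damage < 1.0:
        damage += deps / fracture_strain(eta) # linear damage accumulation
    print(f"eta = {eta:.2f}; element flagged for deletion once D >= 1")
    ```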

  8. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.
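
    The object being tested is the transition density of the observable series. A pointwise version of the comparison can be sketched with kernel estimates: the joint density of consecutive observations divided by the marginal gives a nonparametric transition density to hold against the parametric candidate. The AR(1) data below are simulated, and this sketch omits the simultaneous confidence envelope the paper constructs:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde, norm

    rng = np.random.default_rng(0)
    phi, sigma = 0.6, 1.0
    x = np.empty(2000)
    x[0] = 0.0
    for t in range(1, x.size):               # simulate an AR(1) "observable"
        x[t] = phi * x[t - 1] + sigma * rng.standard_normal()

    pairs = np.vstack([x[:-1], x[1:]])       # (x_t, x_{t+1}) pairs
    joint = gaussian_kde(pairs)              # kernel estimate of joint density
    marg = gaussian_kde(x[:-1])              # kernel estimate of marginal

    x0 = 0.5                                 # condition on x_t = 0.5
    ygrid = np.linspace(-3, 4, 141)
    pts = np.vstack([np.full_like(ygrid, x0), ygrid])
    p_hat = joint(pts) / marg([x0])          # nonparametric transition density
    p_par = norm.pdf(ygrid, loc=phi * x0, scale=sigma)  # parametric candidate

    print(f"max pointwise discrepancy: {np.abs(p_hat - p_par).max():.3f}")
    ```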

  9. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  10. A Historical Forcing Ice Sheet Model Validation Framework for Greenland

    NASA Astrophysics Data System (ADS)

    Price, S. F.; Hoffman, M. J.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Kalashnikova, I.; Neumann, T.; Nowicki, S.; Perego, M.; Salinger, A.

    2014-12-01

    We propose an ice sheet model testing and validation framework for Greenland for the years 2000 to the present. Following Perego et al. (2014), we start with a realistic ice sheet initial condition that is in quasi-equilibrium with climate forcing from the late 1990's. This initial condition is integrated forward in time while simultaneously applying (1) surface mass balance forcing (van Angelen et al., 2013) and (2) outlet glacier flux anomalies, defined using a new dataset of Greenland outlet glacier flux for the past decade (Enderlin et al., 2014). Modeled rates of mass and elevation change are compared directly to remote sensing observations obtained from GRACE and ICESat. Here, we present a detailed description of the proposed validation framework including the ice sheet model and model forcing approach, the model-to-observation comparison process, and initial results comparing model output and observations for the time period 2000-2013.

  11. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  12. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  13. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    DTIC Science & Technology

    2015-12-01

    Approved for public release; distribution is unlimited. Dissertation by Sang M. Sok, December 2015: Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems. The work highlights the importance of the conceptual model (CoM) and develops ICoMM in support of improving the structure of the CoM for both face and content validity.

  14. Etudes optiques de nouveaux materiaux laser: Des orthosilicates dopes a l'ytterbium: Le yttrium (lutetium,scandium) pentoxide de silicium

    NASA Astrophysics Data System (ADS)

    Denoyer, Aurelie

    … Optical measurements under a magnetic field were also performed in order to characterize the behaviour of these excitations when subjected to the Zeeman effect. Electron paramagnetic resonance completed this study of the Zeeman splitting along all orientations of the crystal. Finally, selective-excitation fluorescence and FT-Raman-induced fluorescence complete the description of the energy levels and reveal the existence of cooperative emission from two Yb3+ ions and of energy transfers. The results of this thesis make an original contribution to the field of new laser materials through the study and understanding of the fine interactions and microscopic properties of one material in particular. They lead both to possible applications in optics and lasers and to the understanding of fundamental aspects. This thesis demonstrated the interest of these matrices for use as solid-state lasers: a large crystal-field splitting favourable to quasi-three-level laser operation, and broad absorption bands (due to strong electron-phonon coupling and to satellite lines caused by an exchange interaction between two Yb3+ ions) that enable the generation of ultra-short laser pulses, laser tunability, etc. Moreover, laser miniaturization for integrated optics is possible thanks to thin films grown by liquid-phase epitaxy, for which we demonstrated very good structural quality and the possibility of adjusting certain parameters. We reconstructed the g tensor of the ground level (which provides valuable information on the wave functions) in order to help theorists devise a valid crystal-field model. Several energy-transfer mechanisms were identified: a site-to-site relaxation mechanism, a cooperative emission mechanism, and a

  15. Validation of the Continuum of Care Conceptual Model for Athletic Therapy

    PubMed Central

    Lafave, Mark R.; Butterwick, Dale; Eubank, Breda

    2015-01-01

    Utilization of conceptual models in field-based emergency care currently borrows from existing standards of medical and paramedical professions. The purpose of this study was to develop and validate a comprehensive conceptual model that could account for injuries ranging from nonurgent to catastrophic events including events that do not follow traditional medical or prehospital care protocols. The conceptual model should represent the continuum of care from the time of initial injury spanning to an athlete's return to participation in their sport. Finally, the conceptual model should accommodate both novices and experts in the AT profession. This paper chronicles the content validation steps of the Continuum of Care Conceptual Model for Athletic Therapy (CCCM-AT). The stages of model development were domain and item generation, content expert validation using a three-stage modified Ebel procedure, and pilot testing. Only the final stage of the modified Ebel procedure reached a priori 80% consensus on three domains of interest: (1) heading descriptors; (2) the order of the model; (3) the conceptual model as a whole. Future research is required to test the use of the CCCM-AT in order to understand its efficacy in teaching and practice within the AT discipline. PMID:26464897

  16. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
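
    The statistical-sampling route mentioned for variability and randomness is plain Monte Carlo: sample the uncertain inputs, push each sample through the model, and summarize the output distribution. The toy model and input distributions below are purely illustrative:

    ```python
    import numpy as np

    # Monte Carlo propagation of aleatoric input variability through a model.
    rng = np.random.default_rng(42)

    def model(E, load):
        return load / E                      # toy response: strain = stress / modulus

    E = rng.normal(200e9, 10e9, 100_000)     # material modulus, Pa (variable)
    load = rng.normal(50e6, 5e6, 100_000)    # applied stress, Pa (variable)

    y = model(E, load)
    lo, hi = np.percentile(y, [2.5, 97.5])
    print(f"mean response {y.mean():.3e}, 95% interval [{lo:.3e}, {hi:.3e}]")
    ```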

  17. Cross-Paradigm Simulation Modeling: Challenges and Successes

    DTIC Science & Technology

    2011-12-01

    Discrete-event simulation (DES) is a modeling method for stochastic, dynamic models … in which almost anything can be coded; models can be incredibly detailed. Most commercial DES software has a graphical interface which allows the user to … Although the above definition is the commonly accepted definition of DES, there are two different worldviews that dominate DES modeling today …

  18. I-15 San Diego, California, model validation and calibration report.

    DOT National Transportation Integrated Search

    2010-02-01

    The Integrated Corridor Management (ICM) initiative requires the calibration and validation of simulation models used in the Analysis, Modeling, and Simulation of Pioneer Site proposed integrated corridors. This report summarizes the results and proc...

  19. Mesures experimentales de l'impact des revetements hydrophobeset superhydrophobes sur la trainee et la portance d'un profil aerodynamique propre et glace

    NASA Astrophysics Data System (ADS)

    Villeneuve, Eric

    This project, carried out at the request of the Laboratoire International des Matériaux Antigivre (LIMA), aims to measure and experimentally define the impact of hydrophobic coatings on the drag and lift coefficients of a NACA 0012 airfoil. To this end, the LIMA aerodynamic balance first had to be upgraded to provide sufficient sensitivity for the project. Several improvements were made, such as replacing the load cells, reducing the number of load cells, and replacing the balance frame. Once these improvements were completed, repeatability, accuracy and sensitivity were validated to ensure the reliability of the results given by the balance. For the angles of attack studied with the coatings, -6° and 0°, the balance has a repeatability of ±2.06% at a Reynolds number of 360,000. To validate the sensitivity, tests at -6° and 0° angle of attack and Reynolds numbers of 360,000 and 500,000 were carried out with sandpapers. The results of these tests made it possible to draw trend curves of the drag coefficient of the NACA 0012 as a function of surface roughness and to set the sensitivity of the balance at ±8 μm. Five popular coatings were chosen for the experiments: Wearlon, Staclean, Hirec, Phasebreak and Nusil. The coatings were subjected to the same experimental conditions as the sandpapers, and an equivalent roughness was found by extrapolating the results. However, the equivalent surface roughnesses differ between -6° and 0°. Tests with Staclean and Hirec give drag coefficients equivalent to those of aluminium, whereas Wearlon, Nusil and Phasebreak give increases in the drag coefficient of 13%, 17% and 25% respectively relative to aluminium. For the lift coefficients, the balance does not detect

  20. Animal models of binge drinking, current challenges to improve face validity.

    PubMed

    Jeanblanc, Jérôme; Rolland, Benjamin; Gierski, Fabien; Martinetti, Margaret P; Naassila, Mickael

    2018-05-05

    Binge drinking (BD), i.e., consuming a large amount of alcohol in a short period of time, is a growing public health issue. Though no clear definition has been adopted worldwide, the speed of drinking seems to be a keystone of this behavior. Developing relevant animal models of BD is a priority for gaining a better characterization of the neurobiological and psychobiological mechanisms underlying this dangerous and harmful behavior. Until recently, preclinical research on BD was conducted mostly using forced administration of alcohol, but more recent studies have used scheduled access to alcohol to model more voluntary excessive intake and to achieve signs of intoxication that mimic the human behavior. The main challenges for future research are discussed regarding the need for good face validity, construct validity and predictive validity in animal models of BD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.
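
    The record does not spell out the two techniques, but one down-sampling approach consistent with the description — no interpolation, and measurement noise averaged down — is integer-factor block averaging. The sketch below is that stand-in, not necessarily the software's actual method:

    ```python
    import numpy as np

    # Integer-factor block averaging: no interpolation (hence no resampling
    # error in the usual sense), and uncorrelated measurement noise is
    # reduced by roughly 1/k per k-by-k block.
    def downsample_block_mean(surface, k):
        """Down-sample a 2-D surface height map by an integer factor k."""
        ny, nx = surface.shape
        ny, nx = ny - ny % k, nx - nx % k        # trim to a multiple of k
        s = surface[:ny, :nx]
        return s.reshape(ny // k, k, nx // k, k).mean(axis=(1, 3))

    rng = np.random.default_rng(1)
    truth = np.fromfunction(lambda i, j: np.sin(i / 40.0) * np.cos(j / 55.0),
                            (400, 600))
    measured = truth + rng.normal(0.0, 0.05, truth.shape)  # add noise

    small = downsample_block_mean(measured, 4)
    print(small.shape, float(small.std()))
    ```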

  2. Validation Assessment of a Glass-to-Metal Seal Finite-Element Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamison, Ryan Dale; Buchheit, Thomas E.; Emery, John M

    Sealing glasses are ubiquitous in high pressure and temperature engineering applications, such as hermetic feed-through electrical connectors. A common connector technology is the glass-to-metal seal, in which a metal shell compresses a sealing glass to create a hermetic seal. Though finite-element analysis has been used to understand and design glass-to-metal seals for many years, there has been little validation of these models. An indentation technique was employed to measure the residual stress on the surface of a simple glass-to-metal seal. Recently developed rate-dependent material models of both Schott 8061 and 304L VAR stainless steel have been applied to a finite-element model of the simple glass-to-metal seal. Model predictions of residual stress based on the evolution of material models are shown. These model predictions are compared to measured data. Validity of the finite-element predictions is discussed. It will be shown that the finite-element model of the glass-to-metal seal accurately predicts the mean residual stress in the glass near the glass-to-metal interface and is valid for this quantity of interest.

  3. Apport des neutrons à l'analyse structurale des composés partiellement désordonnés

    NASA Astrophysics Data System (ADS)

    Cousson, A.

    2003-02-01

    Crystallography is an extremely powerful tool that could be used by many scientists whose research topics are in fact very distant from it. The evolution of techniques in recent years has, for example, relegated small-molecule X-ray crystallography to a minor, service role. Some even seem to feel that all the necessary knowledge is embedded in the many software packages capable, on their own, of carrying a structural analysis through to a single correct result. It is desirable that everyone be able to carry out the structural study of the compound of interest, and of course necessary to understand what one is doing; the quality of the results and of their analysis depends on it. The purpose of this presentation is to show the specific contribution of single-crystal neutron diffraction to the study of disorder, in particular of hydrogen atoms, and its consequences for the understanding of physical properties, based on recent developments and examples.

  4. External validation of EPIWIN biodegradation models.

    PubMed

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation and an expert survey model for primary and ultimate biodegradation estimation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, in contrast to ready biodegradability. In view of the high environmental concern over persistent chemicals, and given that not-readily biodegradable chemicals far outnumber readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for not-ready biodegradability. However, the highest overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6.
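
    The evaluation logic here reduces to a confusion matrix over the two calls (readily vs. not-readily biodegradable), with false positives — substances predicted degradable that persist — as the costly error. A minimal sketch with invented labels:

    ```python
    import numpy as np

    # labels: 1 = readily biodegradable, 0 = not readily biodegradable
    experimental = np.array([0, 0, 1, 0, 1, 0, 0, 1, 0, 0])
    predicted    = np.array([0, 0, 1, 0, 0, 0, 1, 1, 0, 0])

    tp = np.sum((predicted == 1) & (experimental == 1))
    tn = np.sum((predicted == 0) & (experimental == 0))
    fp = np.sum((predicted == 1) & (experimental == 0))  # the costly error
    fn = np.sum((predicted == 0) & (experimental == 1))

    accuracy = (tp + tn) / experimental.size
    spec = tn / (tn + fp)   # accuracy on not-readily biodegradable substances
    sens = tp / (tp + fn)   # accuracy on readily biodegradable substances
    print(f"overall {accuracy:.0%}, not-ready {spec:.0%}, "
          f"ready {sens:.0%}, false positives {fp}")
    ```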

  5. Apport des moyens endoscopiques dans la dilatation des sténoses caustiques de l’œsophage

    PubMed Central

    Seydou, Togo; Abdoulaye, Ouattara Moussa; xing, Li; Zi, Sanogo Zimogo; sekou, Koumaré; Wen, Yang Shang; Ibrahim, Sankare; Sekou, Toure Cheik Ahmed; Boubacar, Maiga Ibrahim; Saye, Jacque; Jerome, Dakouo Dodino; Dantoumé, Toure Ousmane; Sadio, Yena

    2016-01-01

    Introduction: All symptomatic strictures of the oesophagus can be dilated endoscopically. We evaluate the contribution of endoscopic means in the management of oesophageal dilation for caustic stricture of the oesophagus (CSO) in Mali. Methods: This was a descriptive, prospective study carried out in the thoracic surgery department of the Hôpital du Mali. A total of 46 patient records were registered and divided into 4 groups according to the topography of the scar lesions. The number of cases of endoscopic assistance performed was determined in order to understand the contribution of endoscopic means to the success of dilation for CSO. For the 2 different dilation methods used, treatment outcome and cost were compared. Results: Upper gastrointestinal endoscopy (FOGD) was used in 19 cases (41.30%) of dilation with Savary-Guillard bougies and in 47.82% of cases of Lerut dilation. Video-laryngoscopy was used in 58.69% of cases of dilation with Lerut bougies. Passage of a metal guide and/or guidewire was performed in 39.13% of cases with video-laryngoscopy and in 58.68% with FOGD. Comparing the two methods, there was a significant difference in the occurrence of complications (p = 0.04075), general anaesthesia (p = 0.02287), accessibility of the method (p = 0.04805) and mortality (p = 0.00402). Conclusion: CSO is a serious and under-assessed condition in Mali. Endoscopic means contribute considerably to the success of oesophageal dilation for caustic stricture in the different methods used. PMID:27200129

  6. Ventilation tube insertion simulation: a literature review and validity assessment of five training models.

    PubMed

    Mahalingam, S; Awad, Z; Tolley, N S; Khemani, S

    2016-08-01

    The objective of this study was to identify and investigate the face and content validity of ventilation tube insertion (VTI) training models described in the literature. A review of the literature was carried out to identify articles describing VTI simulators. Feasible models were replicated and assessed by a group of experts. Postgraduate simulation centre. Experts were defined as surgeons who had performed at least 100 VTI on patients. Seventeen experts participated, ensuring sufficient statistical power for analysis. A standardised 18-item Likert-scale questionnaire was used. This addressed face validity (realism), global and task-specific content (suitability of the model for teaching) and curriculum recommendation. The search revealed eleven models, of which only five had associated validity data. Five models were found to be feasible to replicate. None of the tested models achieved face or global content validity. Only one model achieved task-specific validity, and hence, there was no agreement on curriculum recommendation. The quality of simulation models is moderate and there is room for improvement. There is a need for new models to be developed or existing ones to be refined in order to construct a more realistic training platform for VTI simulation. © 2015 John Wiley & Sons Ltd.

  7. A Validity Agenda for Growth Models: One Size Doesn't Fit All!

    ERIC Educational Resources Information Center

    Patelis, Thanos

    2012-01-01

    This is a keynote presentation given at AERA on developing a validity agenda for growth models in a large scale (e.g., state) setting. The emphasis of this presentation was to indicate that growth models and the validity agenda designed to provide evidence in supporting the claims to be made need to be personalized to meet the local or…

  8. Fissuration en relaxation des aciers inoxydables austénitiques au voisinage des soudures

    NASA Astrophysics Data System (ADS)

    Auzoux, Q.; Allais, L.; Gourgues, A. F.; Pineau, A.

    2003-03-01

    Intergranular cracks can develop in the vicinity of welds in austenitic stainless steels when they are reheated in the temperature range between 500°C and 700°C. At these temperatures, the post-weld residual stresses relax through viscoplastic deformation. These zones close to the weld can be so brittle that they cannot accommodate even this small deformation. To clarify which microstructural modifications lead to such embrittlement, the microstructures of these zones were examined, revealing residual work hardening responsible for a strong increase in hardness. A similar microstructure could be reproduced by solution treatment and quenching followed by rolling between 400°C and 600°C. Mechanical tests (tensile, creep and relaxation, on smooth and pre-cracked specimens) were carried out at 550°C and 600°C on these simulated heat-affected zones and on a solution-annealed reference state. They showed that work hardening reduces ductility in the intergranular fracture regime without qualitatively modifying the damage mechanism. During pre-deformation, deformation incompatibilities between grains would give rise to high local stresses that promote the nucleation of intergranular cavities.

  9. Objective validation of central sensitization in the rat UVB and heat rekindling model

    PubMed Central

    Weerasinghe, NS; Lumb, BM; Apps, R; Koutsikou, S; Murrell, JC

    2014-01-01

    Background The UVB and heat rekindling (UVB/HR) model shows potential as a translatable inflammatory pain model. However, the occurrence of central sensitization in this model, a fundamental mechanism underlying chronic pain, has been debated. Face, construct and predictive validity are key requisites of animal models; electromyogram (EMG) recordings were utilized to objectively demonstrate validity of the rat UVB/HR model. Methods The UVB/HR model was induced on the heel of the hind paw under anaesthesia. Mechanical withdrawal thresholds (MWTs) were obtained from biceps femoris EMG responses to a gradually increasing pinch at the mid hind paw region under alfaxalone anaesthesia, 96 h after UVB irradiation. MWT was compared between UVB/HR and SHAM-treated rats (anaesthetic only). Underlying central mechanisms in the model were pharmacologically validated by MWT measurement following intrathecal N-methyl-d-aspartate (NMDA) receptor antagonist, MK-801, or saline. Results Secondary hyperalgesia was confirmed by a significantly lower pre-drug MWT {mean [±standard error of the mean (SEM)]} in UVB/HR [56.3 (±2.1) g/mm2, n = 15] compared with SHAM-treated rats [69.3 (±2.9) g/mm2, n = 8], confirming face validity of the model. Predictive validity was demonstrated by the attenuation of secondary hyperalgesia by MK-801, where mean (±SEM) MWT was significantly higher [77.2 (±5.9) g/mm2 n = 7] in comparison with pre-drug [57.8 (±3.5) g/mm2 n = 7] and saline [57.0 (±3.2) g/mm2 n = 8] at peak drug effect. The occurrence of central sensitization confirmed construct validity of the UVB/HR model. Conclusions This study used objective outcome measures of secondary hyperalgesia to validate the rat UVB/HR model as a translational model of inflammatory pain. What's already known about this topic? Most current animal chronic pain models lack translatability to human subjects. Primary hyperalgesia is an established feature of the UVB/heat rekindling

  10. Comparaison des effets des irradiations γ, X et UV dans les fibres optiques

    NASA Astrophysics Data System (ADS)

    Girard, S.; Ouerdane, Y.; Baggio, J.; Boukenter, A.; Meunier, J.-P.; Leray, J.-L.

    2005-06-01

    Optical fibres have many advantages that encourage their integration into applications that must withstand the radiation environments associated with the civil, space and military domains. However, their exposure to radiation creates point defects in the pure or doped amorphous silica that makes up the different parts of the optical fibre. These defects cause, in particular, a transient increase in the linear attenuation of optical fibres, responsible for the degradation, or even the loss, of the signal propagating in them. In this article, we compare the effects of two types of irradiation: an X-ray pulse and a cumulative γ dose. The effects of these irradiations are then compared with those induced by ultraviolet exposure (244 nm) on the absorption properties of optical fibres. We show that similarities exist between these different excitations and that it is possible, under certain conditions, to use them to assess the ability of certain optical fibres to operate in a given nuclear environment.

  11. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    PubMed

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates the changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with Protease Inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions about treatment variability and the pattern of cholesterol reduction over time. The last recorded cholesterol level, the difference from the baseline, the average difference from the baseline and level evolution are the considered endpoints. Specific validation criteria based on a ±10% standardized distance in means and variances were used to compare the real and the simulated data. The validity criterion was met by all models for individual endpoints. However, only two models met the validity criterion when all endpoints were considered. The model based on the assumption that within-subjects variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance of ±1% or less. Simulation is a useful technique for calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
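
    One straightforward reading of the ±10% criterion is sketched below: standardized distances between real and simulated means and variances, both required to fall within the band. The data are simulated and the exact standardization used in the paper may differ:

    ```python
    import numpy as np

    def standardized_distances(real, sim):
        d_mean = (sim.mean() - real.mean()) / real.std(ddof=1)
        d_var = (sim.var(ddof=1) - real.var(ddof=1)) / real.var(ddof=1)
        return d_mean, d_var

    rng = np.random.default_rng(7)
    real = rng.normal(5.2, 1.0, 300)    # observed cholesterol change (mmol/L)
    sim = rng.normal(5.25, 1.05, 300)   # one simulated endpoint

    d_mean, d_var = standardized_distances(real, sim)
    ok = abs(d_mean) <= 0.10 and abs(d_var) <= 0.10
    print(f"d_mean={d_mean:+.3f}, d_var={d_var:+.3f}, valid={ok}")
    ```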

  12. Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict Solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line of sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.
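
    Skill scores of the kind reported here are typically of the form 1 − MSE(model)/MSE(reference). A minimal computation with invented wind-speed values, using a persistence-style reference forecast (the paper's precise reference choice may differ):

    ```python
    import numpy as np

    observed  = np.array([420.0, 480.0, 610.0, 550.0, 430.0, 390.0])  # km/s
    predicted = np.array([450.0, 500.0, 580.0, 520.0, 460.0, 410.0])
    reference = np.array([430.0, 440.0, 500.0, 600.0, 520.0, 430.0])  # persistence

    mse = lambda a, b: np.mean((a - b) ** 2)
    skill = 1.0 - mse(predicted, observed) / mse(reference, observed)
    print(f"skill score = {skill:.2f}  (>0 beats the reference forecast)")
    ```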

  13. Vers des boites quantiques a base de graphene

    NASA Astrophysics Data System (ADS)

    Branchaud, Simon

    Graphene is a carbon-based material that has been studied extensively since 2004. A very large number of articles have been published on the electronic, optical and mechanical properties of this material. This work concerns the study of conductance fluctuations in graphene, and the fabrication and characterization of nanostructures etched in sheets of this 2D crystal. Low-temperature magnetoresistance measurements were made near the charge neutrality point (CNP) as well as at high electron density. Two origins are found for the conductance fluctuations near the CNP: mesoscopic oscillations arising from quantum interference, and so-called quantum Hall fluctuations appearing at higher field (>0.5 T), which seem to follow the filling factors associated with graphene monolayers. The latter fluctuations are attributed to the charging of localized states and reveal a precursor to the quantum Hall effect, which itself does not appear below 2 T. The parameters characterizing the sample can be extracted from these data. At the end of this work, transport measurements are performed in graphene constrictions and islands in which quantum dots form. From these measurements, the important parameters of these quantum dots, such as their size and charging energy, are extracted.

  14. Development and validation of a cost-utility model for Type 1 diabetes mellitus.

    PubMed

    Wolowacz, S; Pearson, I; Shannon, P; Chubb, B; Gundgaard, J; Davies, M; Briggs, A

    2015-08-01

    To develop a health economic model to evaluate the cost-effectiveness of new interventions for Type 1 diabetes mellitus by their effects on long-term complications (measured through mean HbA1c) while capturing the impact of treatment on hypoglycaemic events. Through a systematic review, we identified complications associated with Type 1 diabetes mellitus and data describing the long-term incidence of these complications. An individual patient simulation model was developed and included the following complications: cardiovascular disease, peripheral neuropathy, microalbuminuria, end-stage renal disease, proliferative retinopathy, ketoacidosis, cataract, hypoglycaemia and adverse birth outcomes. Risk equations were developed from published cumulative incidence data and hazard ratios for the effect of HbA1c, age and duration of diabetes. We validated the model by comparing model predictions with observed outcomes from studies used to build the model (internal validation) and from other published data (external validation). We performed illustrative analyses for typical patient cohorts and a hypothetical intervention. Model predictions were within 2% of expected values in the internal validation and within 8% of observed values in the external validation (percentages represent absolute differences in the cumulative incidence). The model utilized high-quality, recent data specific to people with Type 1 diabetes mellitus. In the model validation, results deviated less than 8% from expected values. © 2014 Research Triangle Institute d/b/a RTI Health Solutions. Diabetic Medicine © 2014 Diabetes UK.
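
    As an illustration of how risk equations of this kind can combine a published cumulative incidence with a hazard ratio, here is a minimal Python sketch using the proportional-hazards relation S(t|x) = S0(t)^HR(x); all numbers and the reference HbA1c are hypothetical and not taken from the study.

      def adjusted_risk(baseline_cum_inc, hr_per_unit, hba1c, ref_hba1c=7.0):
          """Adjust a baseline cumulative incidence for a patient's HbA1c
          via a proportional-hazards relation: S(t | x) = S0(t) ** HR(x)."""
          hazard_ratio = hr_per_unit ** (hba1c - ref_hba1c)  # HR per 1% HbA1c
          survival = (1.0 - baseline_cum_inc) ** hazard_ratio
          return 1.0 - survival

      # Hypothetical: 12% baseline 10-year risk, HR 1.3 per 1% HbA1c, patient at 9%.
      print(round(adjusted_risk(0.12, 1.3, 9.0), 3))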

  15. Towards Automatic Validation and Healing of CityGML Models for Geometric and Semantic Consistency

    NASA Astrophysics Data System (ADS)

    Alam, N.; Wagner, D.; Wewetzer, M.; von Falkenhausen, J.; Coors, V.; Pries, M.

    2013-09-01

    A steadily growing number of application fields for large 3D city models have emerged in recent years. As in many other domains, data quality is recognized as a key factor for successful business, and quality management is mandatory in today's production chains. Automated domain-specific tools are widely used for validation of business-critical data, but the common standards defining correct geometric modelling are still not precise enough to provide a sound basis for data validation of 3D city models. Although the workflow for 3D city models is well established from data acquisition to processing, analysis and visualization, quality management is not yet a standard part of this workflow. Processing data sets with unclear specification leads to erroneous results and application defects, and we show that this problem persists even if the data are standard-compliant. Validation results of real-world city models are presented to demonstrate the potential of the approach. A tool to repair the errors detected during the validation process is under development; first results are presented and discussed. The goal is to heal defects of the models automatically and export a corrected CityGML model.
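
    One elementary geometric-consistency check of the kind such validators perform is verifying that every polygon ring is closed (first vertex equals last), with a trivial healing step when it is not. A minimal Python sketch of the idea, not the authors' tool:

      def ring_is_closed(ring, tol=1e-9):
          """A polygon ring is closed when its first and last vertices coincide."""
          (x0, y0, z0), (x1, y1, z1) = ring[0], ring[-1]
          return abs(x0 - x1) <= tol and abs(y0 - y1) <= tol and abs(z0 - z1) <= tol

      def heal_ring(ring):
          """Naive healing step: append the first vertex if the ring is open."""
          return ring if ring_is_closed(ring) else ring + [ring[0]]

      wall = [(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3)]   # open ring
      print(ring_is_closed(wall), heal_ring(wall)[-1])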

  16. Management of opioid use disorders in primary care

    PubMed Central

    Srivastava, Anita; Kahan, Meldon; Nader, Maya

    2017-01-01

    Summary. Objective: To advise physicians on the treatment options to recommend to specific patient populations: an abstinence-based approach, buprenorphine-naloxone maintenance treatment, or methadone maintenance treatment. Sources of information: A PubMed search was carried out, and data on the efficacy, safety, and side-effect profile of the abstinence-based approach, buprenorphine-naloxone treatment, and methadone treatment were identified in the literature. Observational and interventional studies were included. Main message: Methadone and buprenorphine-naloxone are substantially more effective than the abstinence-based approach. Methadone has a higher retention rate than buprenorphine-naloxone, while buprenorphine-naloxone carries a lower risk of overdose. Physicians should recommend methadone or buprenorphine-naloxone treatment rather than the abstinence-based approach to all patient groups (level I evidence). Methadone is preferable to buprenorphine-naloxone for patients at high risk of dropping out, such as injection opioid users (level I evidence). Youth and pregnant women who inject opioids should also receive methadone first (level III evidence). If buprenorphine-naloxone is prescribed first, the patient should be switched promptly to methadone if withdrawal symptoms, cravings, or opioid use persist despite an optimal dose of buprenorphine-naloxone (level II evidence). Buprenorphine-naloxone is recommended for socially stable users of oral prescription opioids, especially if they are employed or if their

  17. Modern modeling techniques had limited external validity in predicting mortality from traumatic brain injury.

    PubMed

    van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W

    2016-10-01

    Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs), and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC, 0.757), followed by the RF and SVM models (median validated AUC, 0.735 and 0.732, respectively). With each predictor set, the classification and regression tree models showed poor performance (median validated AUC <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values, 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
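
    The cross-study design (develop on one study, validate on each of the others) is straightforward to reproduce. A minimal Python sketch using scikit-learn, with logistic regression as the example technique; `datasets` as a list of (X, y) pairs is an assumed layout, not the authors' code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      def cross_study_aucs(datasets):
          """Fit on each study, validate on every other study; return all AUCs."""
          aucs = []
          for i, (X_dev, y_dev) in enumerate(datasets):
              model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
              for j, (X_val, y_val) in enumerate(datasets):
                  if i != j:
                      aucs.append(roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
          return np.array(aucs)

      # With 15 studies this yields 15 * 14 = 210 validations per technique.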

  18. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration brings the behavior of the collector model closer to that of a real collector, since some of the system parameters are uncertain; measured data are used during the calibration process. The calibrated model is then validated by comparing its results with those obtained during real operation of a collector at the Plataforma Solar de Almería (PSA).

  19. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid-based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effects, surface tension, and the type of ferrofluid-magnetic field coupling, on the accuracy of the model prediction.

  20. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor-fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probably faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops an efficient and systematic algorithm for its solution. It builds on the concept of analytical redundant relations (ARRs).
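
    The idea behind analytical redundant relations can be illustrated on a toy system: each relation ties several sensor readings together, a violated relation implicates the sensors it involves, and intersecting the implicated sets isolates the fault. A minimal Python sketch of this logical inference, not the developed algorithm itself:

      def violated(relations, readings, tol=1e-6):
          """Return, for each violated ARR, the set of sensors it implicates."""
          suspects = []
          for sensors, residual in relations:
              if abs(residual(readings)) > tol:
                  suspects.append(set(sensors))
          return suspects

      # Toy system: flow conservation f1 + f2 == f3, and a duplicate sensor f3 == f4.
      relations = [
          (("f1", "f2", "f3"), lambda r: r["f1"] + r["f2"] - r["f3"]),
          (("f3", "f4"),       lambda r: r["f3"] - r["f4"]),
      ]
      readings = {"f1": 2.0, "f2": 3.0, "f3": 6.0, "f4": 5.0}  # f3 is biased
      sets = violated(relations, readings)
      faulty = set.intersection(*sets) if sets else set()
      print(faulty)  # {'f3'}: the only sensor consistent with both violated ARRs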

  1. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    PubMed

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services and places where facilities, equipment, and human resources are concentrated, is of significant importance. The present research aims at developing a model for assessing hospitals' safety based on the principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 cases), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, Cronbach's alpha, the intraclass correlation coefficient (to measure test reliability), and the composite reliability coefficient were used to assess preliminary reliability. The relationships between variables and factors were confirmed at the 0.05 significance level by confirmatory factor analysis (CFA) and structural equation modeling (SEM) with Smart-PLS. R-squared and factor-loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) had the greatest weights in determining the inherent safety of a hospital, in that order. Moderation, simplification, and substitution thus carry more weight in inherent safety, while minimization carries the least, which may be due to its definition as minimizing the risk.
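
    Of the reliability statistics listed, Cronbach's alpha is the most widely used. A minimal Python sketch of its computation over a respondents-by-items score matrix; the data are illustrative, not the study's.

      import numpy as np

      def cronbach_alpha(items):
          """items: 2-D array, rows = respondents, columns = questionnaire items."""
          items = np.asarray(items, float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_vars / total_var)

      scores = np.array([[4, 5, 4], [2, 3, 3], [5, 5, 4], [3, 4, 4]])
      print(round(cronbach_alpha(scores), 3))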

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation, and a large amount of research has been done on load modeling. Most of the existing work focuses on developing load models, while little addresses formal load model verification and validation (V&V) methodologies or procedures, and most existing load model validation rests on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, rather than visual checks, quantifies the load model's accuracy and provides a confidence level in the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The method is demonstrated through analysis of field measurements collected from a utility system.
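
    One way to attach a confidence level to a quantitative accuracy measure, in the spirit of the proposed framework, is a bootstrap interval on the model's error against field measurements. A minimal Python sketch under that assumption; the abstract does not specify the framework's actual statistics.

      import numpy as np

      def rmse(measured, simulated):
          return float(np.sqrt(np.mean((np.asarray(measured) - np.asarray(simulated)) ** 2)))

      def bootstrap_rmse_ci(measured, simulated, n_boot=2000, level=0.95, seed=0):
          """Bootstrap confidence interval for the RMSE between field data and model output."""
          rng = np.random.default_rng(seed)
          measured, simulated = np.asarray(measured), np.asarray(simulated)
          idx = rng.integers(0, len(measured), size=(n_boot, len(measured)))
          stats = np.sqrt(np.mean((measured[idx] - simulated[idx]) ** 2, axis=1))
          lo, hi = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
          return rmse(measured, simulated), (float(lo), float(hi))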

  3. Determination dans les Installations au Sol des Parametres Aerodynamiques de Stabilite des Aeronefs (Determination in Ground Facilities of Aerodynamic Stability Parameters of Aircraft),

    DTIC Science & Technology

    1979-06-01

    OCR fragments only, translated from French: a table of contents ("INTRODUCTION ... CHAPTER I: ELEMENTS OF DYNAMICS ...") followed by the opening of the introduction: the determination of aerodynamic stability derivatives, in wind tunnels and ballistic firing ranges, on scale models ...; force measurement has been performed in the classical manner for more than twenty years by steel beams fitted with strain gauges, generally in ...

  4. MODELS FOR SUBMARINE OUTFALL - VALIDATION AND PREDICTION UNCERTAINTIES

    EPA Science Inventory

    This address reports on some efforts to verify and validate dilution models, including those found in Visual Plumes. This is done in the context of problem experience: a range of problems, including different pollutants such as bacteria; scales, including near-field and far-field...

  5. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, since it enables an informed prediction of their wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. This paper therefore addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. First, design waste causes were analyzed. Second, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source-reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out-waste strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    PubMed

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, much research work was done to develop deterministic models describing this separation process, but even today these models are not widely accepted in Germany: water authorities remain sceptical with regard to model validation and transferability. This paper examines whether that scepticism is justified. A framework proposal is presented for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes in stormwater and combined sewer overflow treatment. The proposal was applied to reputable publications on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research on sewer solids sedimentation and remobilization.

  7. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    PubMed

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. U.S. 75 Dallas, Texas, Model Validation and Calibration Report

    DOT National Transportation Integrated Search

    2010-02-01

    This report presents the model validation and calibration results of the Integrated Corridor Management (ICM) analysis, modeling, and simulation (AMS) for the U.S. 75 Corridor in Dallas, Texas. The purpose of the project was to estimate the benefits ...

  9. Canadian guidelines for the safe and effective use of opioids for chronic non-cancer pain

    PubMed Central

    Kahan, Meldon; Mailis-Gagnon, Angela; Wilson, Lynn; Srivastava, Anita

    2011-01-01

    Summary. Objective: To provide family physicians with a practical clinical summary of the Canadian guidelines on the safe and effective use of opioids for chronic non-cancer pain, produced by the National Opioid Use Guideline Group. Quality of evidence: To produce the guidelines, the researchers carried out a critical synthesis of the medical literature on the efficacy and safety of opioids for chronic non-cancer pain and drafted a series of recommendations. A panel of 49 expert clinicians from every region of Canada reviewed the draft and reached consensus on 24 recommendations. Main message: Screening for the risk of addiction is recommended before prescribing opioids. Weak opioids (codeine and tramadol) are recommended for mild to moderate pain that has not responded to first-line treatments. Oxycodone, hydromorphone, and morphine may be tried in patients who have not obtained relief from weaker opioids. A low starting dose and slow upward titration are recommended, together with close monitoring of the patient, who should first be informed. Physicians must watch for the onset of complications such as sleep apnea. The optimal dose is the one that improves function or reduces pain ratings by at least 30%. For the vast majority of patients, the optimal dose will be well below the equivalent of 200 mg of morphine per day. Gradual tapering is recommended for patients who have not responded to an adequate opioid trial. Conclusion: Opioids play an important role in the management of chronic non-cancer pain, but they must be prescribed with caution to limit potentially harmful effects. The new guidelines

  10. Development and Validation of a Predictive Model for Functional Outcome After Stroke Rehabilitation: The Maugeri Model.

    PubMed

    Scrutinio, Domenico; Lanzillo, Bernardo; Guida, Pietro; Mastropasqua, Filippo; Monitillo, Vincenzo; Pusineri, Monica; Formica, Roberto; Russo, Giovanna; Guarnaschelli, Caterina; Ferretti, Chiara; Calabrese, Gianluigi

    2017-12-01

    Prediction of outcome after stroke rehabilitation may help clinicians in decision-making and planning rehabilitation care. We developed and validated a predictive tool to estimate the probability of achieving improvement in physical functioning (model 1) and a level of independence requiring no more than supervision (model 2) after stroke rehabilitation. The models were derived from 717 patients admitted for stroke rehabilitation. We used multivariable logistic regression analysis to build each model. Then, each model was prospectively validated in 875 patients. Model 1 included age, time from stroke occurrence to rehabilitation admission, admission motor and cognitive Functional Independence Measure scores, and neglect. Model 2 included age, male gender, time since stroke onset, and admission motor and cognitive Functional Independence Measure scores. Both models demonstrated excellent discrimination. In the derivation cohort, the area under the curve was 0.883 (95% confidence interval, 0.858-0.910) for model 1 and 0.913 (95% confidence interval, 0.884-0.942) for model 2. The Hosmer-Lemeshow χ2 was 4.12 (P=0.249) and 1.20 (P=0.754), respectively. In the validation cohort, the area under the curve was 0.866 (95% confidence interval, 0.840-0.892) for model 1 and 0.850 (95% confidence interval, 0.815-0.885) for model 2. The Hosmer-Lemeshow χ2 was 8.86 (P=0.115) and 34.50 (P=0.001), respectively. Both improvement in physical functioning (hazard ratio, 0.43; 0.25-0.71; P=0.001) and a level of independence requiring no more than supervision (hazard ratio, 0.32; 0.14-0.68; P=0.004) were independently associated with improved 4-year survival. A calculator is freely available for download at https://goo.gl/fEAp81. This study provides researchers and clinicians with an easy-to-use, accurate, and validated predictive tool for potential application in rehabilitation research and stroke management. © 2017 American Heart Association, Inc.
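
    The two statistics reported for each model, the area under the ROC curve for discrimination and the Hosmer-Lemeshow χ2 for calibration, can be computed as follows. A minimal Python sketch using scikit-learn and SciPy, not the authors' code; the decile grouping is the usual convention.

      import numpy as np
      from scipy.stats import chi2
      from sklearn.metrics import roc_auc_score

      def hosmer_lemeshow(y_true, y_prob, groups=10):
          """Hosmer-Lemeshow chi-square and p-value over risk deciles."""
          order = np.argsort(y_prob)
          y_true, y_prob = np.asarray(y_true)[order], np.asarray(y_prob)[order]
          chi_sq = 0.0
          for bin_idx in np.array_split(np.arange(len(y_prob)), groups):
              obs, exp, n = y_true[bin_idx].sum(), y_prob[bin_idx].sum(), len(bin_idx)
              chi_sq += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
          return chi_sq, chi2.sf(chi_sq, df=groups - 2)

      def discrimination_and_calibration(y_true, y_prob):
          """AUC for discrimination, HL chi-square/p-value for calibration."""
          return roc_auc_score(y_true, y_prob), hosmer_lemeshow(y_true, y_prob)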

  11. Independent external validation of predictive models for urinary dysfunction following external beam radiotherapy of the prostate: Issues in model development and reporting.

    PubMed

    Yahya, Noorazrul; Ebert, Martin A; Bulsara, Max; Kennedy, Angel; Joseph, David J; Denham, James W

    2016-08-01

    Most predictive models are not sufficiently validated for prospective use. We performed independent external validation of published predictive models for urinary dysfunction following radiotherapy of the prostate. Multivariable models developed to predict atomised and generalised urinary symptoms, both acute and late, were considered for validation using a dataset representing 754 participants from the TROG 03.04-RADAR trial. Endpoints and features were harmonised to match the predictive models. Overall performance, calibration and discrimination were assessed. Fourteen models from four publications were validated. The discrimination of the predictive models in the independent external validation cohort, measured by the area under the receiver operating characteristic (ROC) curve, ranged from 0.473 to 0.695, generally lower than in internal validation; four models had an area under the ROC curve above 0.6. Shrinkage was required for all predictive models, with coefficients ranging from -0.309 (predicted probability inverse to the observed proportion) to 0.823. Predictive models that included baseline symptoms as a feature produced the highest discrimination. Two models produced a predicted probability of 0 and 1 for all patients. Predictive models vary in performance and transferability, illustrating the need for improvements in model development and reporting. Several models showed reasonable potential, but efforts should be increased to improve performance. Baseline symptoms should always be considered as potential features for predictive models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
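
    The shrinkage factors quoted above correspond to calibration slopes: the outcome is refit on the model's linear predictor in the validation data, a slope below 1 indicates predictions that are too extreme, and a negative slope indicates predictions inverse to the observed proportion. A minimal Python sketch of the estimate; the exact estimation method used in the paper is an assumption here.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def calibration_slope(linear_predictor, outcomes):
          """Slope of an (effectively unpenalized) logistic refit of the
          outcomes on the model's linear predictor; 1.0 means no shrinkage."""
          lp = np.asarray(linear_predictor, float).reshape(-1, 1)
          refit = LogisticRegression(C=1e6, max_iter=1000).fit(lp, outcomes)
          return float(refit.coef_[0, 0])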

  12. Conceptual Modeling (CM) for Military Modeling and Simulation (M&S) (Modelisation conceptuelle (MC) pour la modelisation et la simulation (M&S) militaires)

    DTIC Science & Technology

    2012-07-01

    Extraction fragments only, translated from French: ... of the modelling and simulation community and provide it with implementation guidance; and provide ... definition; relationship to standards; specification of a CM management process; specification of CM artifacts. Important considerations ... using the present guidance as a reference. VV&A (verification, validation, and acceptance) of CMs must be an integral part of ...

  13. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  14. Modeling the effects of argument length and validity on inductive and deductive reasoning.

    PubMed

    Rotello, Caren M; Heit, Evan

    2009-09-01

    In an effort to assess models of inductive reasoning and deductive reasoning, the authors, in 3 experiments, examined the effects of argument length and logical validity on evaluation of arguments. In Experiments 1a and 1b, participants were given either induction or deduction instructions for a common set of stimuli. Two distinct effects were observed: Induction judgments were more affected by argument length, and deduction judgments were more affected by validity. In Experiment 2, fluency was manipulated by displaying the materials in a low-contrast font, leading to increased sensitivity to logical validity. Several variants of 1-process and 2-process models of reasoning were assessed against the results. A 1-process model that assumed the same scale of argument strength underlies induction and deduction was not successful. A 2-process model that assumed separate, continuous informational dimensions of apparent deductive validity and associative strength gave the more successful account. (c) 2009 APA, all rights reserved.

  15. Reliability and validity of a Japanese version of the Cambridge depersonalization scale as a screening instrument for depersonalization disorder.

    PubMed

    Sugiura, Miyuki; Hirosawa, Masataka; Tanaka, Sumio; Nishi, Yasunobu; Yamada, Yasuyuki; Mizuno, Motoki

    2009-06-01

    The Cambridge Depersonalization Scale (CDS) is an instrument that has demonstrated reliability and validity in some countries for detecting depersonalization disorder under clinical conditions, but not yet in Japan under non-psychiatric conditions. The purposes of this study were to develop a Japanese version of the CDS (J-CDS) and to examine its reliability and validity as an instrument for screening depersonalization disorder under non-clinical conditions. The CDS was translated from English into Japanese and then back-translated into English by a native English-speaking American. After preparing the J-CDS, we examined its reliability and validity. Questionnaires composed of the J-CDS, the Dissociative Experience Scale (DES), the Zung self-rating scale and the Maudsley Obsessional-Compulsive Inventory were administered to 59 participants (12 patients with depersonalization disorder, 11 individuals who had recovered from depersonalization and 36 healthy controls). Cronbach's alpha and split-half reliability were 0.94 and 0.93, respectively. The J-CDS score in the depersonalization group was significantly higher than in the healthy control group. The J-CDS score was significantly correlated with the scores of the total DES and DES-depersonalization. The best compromise between the true-positive and false-negative rates was at a cut-off point of 60, yielding a sensitivity of 1.00 and a specificity of 0.96. In this study, the J-CDS showed good reliability and validity. The best cut-off point for distinguishing individuals with depersonalization disorder from individuals without psychiatric disorders is 60 points.
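
    Selecting the cut-off of 60 amounts to scanning candidate thresholds for the best trade-off between sensitivity and specificity. A minimal Python sketch using Youden's J as the selection rule; the rule and the data layout are assumptions for illustration, not the study's procedure.

      import numpy as np

      def sens_spec(scores, labels, cutoff):
          """Sensitivity and specificity of classifying score >= cutoff as a case."""
          scores, labels = np.asarray(scores), np.asarray(labels, bool)
          flagged = scores >= cutoff
          sens = (flagged & labels).sum() / labels.sum()
          spec = (~flagged & ~labels).sum() / (~labels).sum()
          return sens, spec

      def best_cutoff(scores, labels):
          """Cut-off maximizing Youden's J = sensitivity + specificity - 1."""
          return max(np.unique(scores),
                     key=lambda c: sum(sens_spec(scores, labels, c)))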

  16. Validating soil phosphorus routines in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  17. Criteria of validity for animal models of psychiatric disorders: focus on anxiety disorders and depression

    PubMed Central

    2011-01-01

    Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments

  18. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    NASA Astrophysics Data System (ADS)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of its earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, using macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, using past damage observations in the country. The ground-motion prediction relationship of Benouar (1994) proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to vulnerability indexes 10% to 40% larger for different building types compared with average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, based on the description of the building stock given by the World Housing Encyclopaedia and on local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by means of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for a client market portfolio align with the

  19. Theoretical study of structural fluctuations in organic compounds of reduced dimensionality

    NASA Astrophysics Data System (ADS)

    Dumoulin, Benoit

    ... the spin-Peierls instability then appears near 60 K. Our theoretical study shows that a model of interacting electrons of the "g-ology" type, allowing for umklapp processes, accounts well for the physical properties of this system. Finally, the third part of this thesis concerns the first quasi-one-dimensional organic compounds to have been synthesized: the compounds of the TTF-TCNQ family. Our theoretical study of the structural instabilities exhibited by these compounds is obviously not the first; its originality is that it takes into account the strong interactions between electrons present in these compounds. To account for such interactions, we chose the "Luttinger liquid" formulation, which allows us to better treat this so-called strong-coupling regime.

  20. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  1. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Published experimental results were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With these high accuracies, the developed model can reliably be used to predict water contents at different soil depths and temperatures.

  2. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high-fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate the flight characteristics of the simulation models. By and large, the pilots confirmed that the flight characteristics were similar to those of the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane, and identified some differences in pilot forces. The elevator hinge-moment model and the implementation of the control forces on the ICEFTD were identified as drivers of the pitch-ups and control-force issues, and will be an area for future work.

  3. MolProbity’s Ultimate Rotamer-Library Distributions for Model Validation

    PubMed Central

    Hintze, Bradley J.; Lewis, Steven M.; Richardson, Jane S.; Richardson, David C.

    2016-01-01

    Here we describe the updated MolProbity rotamer-library distributions derived from an order-of-magnitude larger and more stringently quality-filtered dataset of about 8000 (vs. 500) protein chains, and we explain the resulting changes and improvements to model validation as seen by users. To include only sidechains with satisfactory justification for their given conformation, we added residue-specific filters for electron-density value and model-to-density fit. The combined new protocol retains a million residues of data, while cleaning up false-positive noise in the multi-χ datapoint distributions. It enables unambiguous characterization of conformational clusters nearly 1000-fold less frequent than the most common ones. We describe examples of local interactions that favor these rare conformations, including the role of authentic covalent bond-angle deviations in enabling presumably strained sidechain conformations. Further, along with favored and outlier, an allowed category (0.3% to 2.0% occurrence in reference data) has been added, analogous to Ramachandran validation categories. The new rotamer distributions are used for current rotamer validation in MolProbity and PHENIX, and for rotamer choice in PHENIX model-building and refinement. The multi-dimensional χ distributions and Top8000 reference dataset are freely available on GitHub. These rotamers are termed “ultimate” because data sampling and quality are now fully adequate for this task, and also because we believe the future of conformational validation should integrate sidechain with backbone criteria. PMID:27018641

  4. Data-Driven Residential Load Modeling and Validation in GridLAB-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gotseff, Peter; Lundstrom, Blake

    Accurately characterizing the impacts of high penetrations of distributed energy resources (DER) on the electric distribution system has driven modeling methods from traditional static snap shots, often representing a critical point in time (e.g., summer peak load), to quasi-static time series (QSTS) simulations capturing all the effects of variable DER, associated controls and hence, impacts on the distribution system over a given time period. Unfortunately, the high time resolution DER source and load data required for model inputs is often scarce or non-existent. This paper presents work performed within the GridLAB-D model environment to synthesize, calibrate, and validate 1-second residential load models based on measured transformer loads and physics-based models suitable for QSTS electric distribution system modeling. The modeling and validation approach taken was to create a typical GridLAB-D model home that, when replicated to represent multiple diverse houses on a single transformer, creates a statistically similar load to a measured load for a given weather input. The model homes are constructed to represent the range of actual homes on an instrumented transformer: square footage, thermal integrity, heating and cooling system definition as well as realistic occupancy schedules. House model calibration and validation was performed using the distribution transformer load data and corresponding weather. The modeled loads were found to be similar to the measured loads for four evaluation metrics: 1) daily average energy, 2) daily average and standard deviation of power, 3) power spectral density, and 4) load shape.
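
    The four evaluation metrics can be computed from paired time series of measured and modeled transformer load. A minimal Python sketch, assuming 1-second samples in watts and treating the load-shape comparison as a correlation of power spectral densities; the exact statistics used in the study may differ.

      import numpy as np

      def load_validation_metrics(measured_w, modeled_w, dt_s=1.0):
          """Compare measured and modeled load on daily energy, power statistics,
          and power spectral density (load shape is usually a visual overlay)."""
          measured_w, modeled_w = np.asarray(measured_w, float), np.asarray(modeled_w, float)
          energy = lambda p: p.sum() * dt_s / 3.6e6          # kWh over the window
          psd = lambda p: np.abs(np.fft.rfft(p - p.mean())) ** 2
          return {
              "daily_energy_kwh": (energy(measured_w), energy(modeled_w)),
              "mean_power_w": (measured_w.mean(), modeled_w.mean()),
              "std_power_w": (measured_w.std(ddof=1), modeled_w.std(ddof=1)),
              "psd_correlation": float(np.corrcoef(psd(measured_w), psd(modeled_w))[0, 1]),
          }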

  5. Turbulence Modeling Validation, Testing, and Development

    NASA Technical Reports Server (NTRS)

    Bardina, J. E.; Huang, P. G.; Coakley, T. J.

    1997-01-01

    The primary objective of this work is to provide accurate numerical solutions for selected flow fields and to compare and evaluate the performance of selected turbulence models with experimental results. Four popular turbulence models have been tested and validated against experimental data of ten turbulent flows. The models are: (1) the two-equation k-omega model of Wilcox, (2) the two-equation k-epsilon model of Launder and Sharma, (3) the two-equation k-omega/k-epsilon SST model of Menter, and (4) the one-equation model of Spalart and Allmaras. The flows investigated are five free shear flows consisting of a mixing layer, a round jet, a plane jet, a plane wake, and a compressible mixing layer; and five boundary layer flows consisting of an incompressible flat plate, a Mach 5 adiabatic flat plate, a separated boundary layer, an axisymmetric shock-wave/boundary layer interaction, and an RAE 2822 transonic airfoil. The experimental data for these flows are well established and have been extensively used in model developments. The results are shown in the following four sections: Part A describes the equations of motion and boundary conditions; Part B describes the model equations, constants, parameters, boundary conditions, and numerical implementation; and Parts C and D describe the experimental data and the performance of the models in the free-shear flows and the boundary layer flows, respectively.

  6. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation processes will use new and improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  7. Criterion for evaluating the predictive ability of nonlinear regression models without cross-validation.

    PubMed

    Kaneko, Hiromasa; Funatsu, Kimito

    2013-09-23

    We propose predictive performance criteria for nonlinear regression models without cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error for the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, whereas cross-validation cannot be performed in such a situation. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and quantitative structural relationships, we confirm that the proposed criteria enable the predictive ability of the nonlinear regression models to be appropriately quantified.
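
    One reading of the proposed criteria: form the midpoints between each training sample and its k nearest neighbours, take the averaged endpoint targets as the surrogate truth there, and score the fitted model's predictions at those midpoints. A minimal Python sketch of that idea using scikit-learn, my interpretation rather than the authors' code.

      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def midpoint_criteria(model, X, y, k=5):
          """r^2 and RMSE of model predictions at midpoints between each sample
          and its k nearest neighbours, against the averaged endpoint targets."""
          X, y = np.asarray(X, float), np.asarray(y, float)
          nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
          _, idx = nn.kneighbors(X)                  # idx[:, 0] is the point itself
          mids, targets = [], []
          for i, neighbours in enumerate(idx):
              for j in neighbours[1:]:
                  mids.append((X[i] + X[j]) / 2.0)
                  targets.append((y[i] + y[j]) / 2.0)
          preds, targets = model.predict(np.array(mids)), np.array(targets)
          rmse = float(np.sqrt(np.mean((preds - targets) ** 2)))
          r2 = 1.0 - np.sum((preds - targets) ** 2) / np.sum((targets - targets.mean()) ** 2)
          return r2, rmse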

  8. Validation of Modelled Ice Dynamics of the Greenland Ice Sheet using Historical Forcing

    NASA Astrophysics Data System (ADS)

    Hoffman, M. J.; Price, S. F.; Howat, I. M.; Bonin, J. A.; Chambers, D. P.; Tezaur, I.; Kennedy, J. H.; Lenaerts, J.; Lipscomb, W. H.; Neumann, T.; Nowicki, S.; Perego, M.; Saba, J. L.; Salinger, A.; Guerber, J. R.

    2015-12-01

    Although ice sheet models are used for sea level rise projections, the degree to which these models have been validated by observations is fairly limited, due in part to the limited duration of the satellite observation era and the long adjustment time scales of ice sheets. Here we describe a validation framework for the Greenland Ice Sheet applied to the Community Ice Sheet Model by forcing the model annually with flux anomalies at the major outlet glaciers (Enderlin et al., 2014, observed from Landsat/ASTER/Operation IceBridge) and surface mass balance (van Angelen et al., 2013, calculated from RACMO2) for the period 1991-2012. The ice sheet model output is compared to ice surface elevation observations from ICESat and ice sheet mass change observations from GRACE. Early results show promise for assessing the performance of different model configurations. Additionally, we explore the effect of ice sheet model resolution on validation skill.

  9. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    PubMed

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

    The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study, conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012, of 6351 premature infants who received ROP examinations. Sensitivity and specificity for severe (early treatment of ROP [ETROP] type 1 or 2) ROP, and reduction in infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points (infants would have received examinations if they met all 3 criteria: birth weight, <1501 g; gestational age, <30 weeks; and WG-28, <650 g). Infants missing WG-28 information were included in a secondary analysis in which WG-28 was considered fewer than 650 g. Of 7438 infants in the G-ROP study, 3575 (48.1%) were girls, and maternal race/ethnicity was 2310 (31.1%) African American, 3615 (48.6%) white, 233 (3.1%) Asian, 40 (0.52%) American Indian/Alaskan Native, and 93 (1.3%) Pacific Islander. In the study cohort, 747 infants (11.8%) had type 1 or 2 ROP, 2068 (32.6%) had lower-grade ROP, and 3536 (55.6%) had no ROP. The CO-ROP model had a sensitivity of 96.9% (95% CI, 95.4%-97.9%) and a specificity of 40.9% (95% CI, 39.3%-42.5%). It missed 23 (3.1%) infants who developed severe ROP. The CO-ROP model would have reduced the number of infants who received examinations by 26.1% (95% CI, 25.0%-27.2%). The CO-ROP model demonstrated high
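
    The CO-ROP rule itself is a three-way conjunction of thresholds, and the reported sensitivity and specificity follow from applying it to each infant. A minimal Python sketch using the thresholds quoted in the abstract; the data layout is an assumption.

      def co_rop_flags_exam(birth_weight_g, gestational_age_wk, wg28_g):
          """CO-ROP: examine only if ALL three criteria are met."""
          return birth_weight_g < 1501 and gestational_age_wk < 30 and wg28_g < 650

      def sensitivity_specificity(infants):
          """infants: iterable of (birth_weight_g, ga_wk, wg28_g, severe_rop)."""
          tp = fn = tn = fp = 0
          for bw, ga, wg, severe in infants:
              flagged = co_rop_flags_exam(bw, ga, wg)
              if severe:
                  tp, fn = tp + flagged, fn + (not flagged)
              else:
                  fp, tn = fp + flagged, tn + (not flagged)
          return tp / (tp + fn), tn / (tn + fp)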

  10. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns

    PubMed Central

    Chérel, Guillaume; Cottineau, Clémentine; Reuillon, Romain

    2015-01-01

    Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e., making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model’s predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic. PMID:26368917
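
    At the core of Novelty Search is a score that rewards behaviours far from anything seen before: typically the mean distance from a pattern descriptor to its k nearest neighbours in an archive of previously seen patterns. A minimal Python sketch of that inner loop with a stand-in model, not the authors' Pattern Space Exploration implementation.

      import numpy as np

      def novelty(pattern, archive, k=15):
          """Mean Euclidean distance from a pattern descriptor to its k
          nearest neighbours in the archive of already-seen patterns."""
          if not archive:
              return np.inf
          dists = np.sort(np.linalg.norm(np.asarray(archive) - pattern, axis=1))
          return float(dists[:k].mean())

      # Schematic loop: keep parameter settings whose simulated output pattern
      # is far from everything in the archive, pushing search toward novelty.
      archive, rng = [], np.random.default_rng(0)
      for _ in range(200):
          params = rng.uniform(0, 1, size=3)
          pattern = np.array([params.sum(), params.prod()])  # stand-in for model output
          if novelty(pattern, archive) > 0.05:
              archive.append(pattern)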

  11. Validation of Fatigue Modeling Predictions in Aviation Operations

    NASA Technical Reports Server (NTRS)

    Gregory, Kevin; Martinez, Siera; Flynn-Evans, Erin

    2017-01-01

    Bio-mathematical fatigue models that predict levels of alertness and performance are one potential tool for use within integrated fatigue risk management approaches. A number of models have been developed that provide predictions based on acute and chronic sleep loss, circadian desynchronization, and sleep inertia. Some are publicly available and gaining traction in settings such as commercial aviation as a means of evaluating flight crew schedules for potential fatigue-related risks. Yet most models have not been rigorously evaluated and independently validated for the operations to which they are being applied, and many users are not fully aware of the limitations within which model results should be interpreted and applied.

  12. Optimization of vertical trajectories using the harmony search method

    NASA Astrophysics Data System (ADS)

    Ruby, Margaux

    Given global warming, solutions to reduce CO2 emissions are urgently needed. Trajectory optimization is one way to reduce fuel consumption during a flight. To determine the optimal trajectory of an aircraft, different algorithms have been developed. The goal of these algorithms is to minimize the total cost of a flight, which is directly related to fuel consumption and flight time; another parameter, called the cost index, enters the definition of the flight cost. Fuel consumption is obtained from performance data for each flight phase. In this thesis, the phases of a complete flight are studied: a climb phase, a cruise phase and a descent phase. "Step climbs", defined as 2,000-ft climbs during the cruise phase, are also studied. The algorithm developed in this thesis is a metaheuristic, called harmony search, that combines two types of search: local search and population-based search. The algorithm is inspired by the behaviour of musicians in a concert, or more precisely by the ability of the music to find its best harmony, which in optimization terms means the lowest cost. Various inputs, such as the aircraft weight, the destination, the initial aircraft speed and the number of iterations, must be provided to the algorithm so that it can determine the optimal solution, which is defined as: [climb speed, altitude, cruise speed, descent speed]. The algorithm was developed in MATLAB and tested for several destinations and several weights for a single aircraft type. For validation, the results obtained with this algorithm were first compared with the results obtained from an exhaustive search, which
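
    Harmony search keeps a memory of candidate solutions and builds each new "harmony" by drawing components from memory (with probability HMCR), perturbing them (with probability PAR), or sampling fresh values, replacing the worst memory entry whenever it is beaten. A minimal Python sketch over the four-variable solution vector described above, with a stand-in cost function rather than the thesis's performance-data-based flight cost (the thesis itself used MATLAB).

      import numpy as np

      def harmony_search(cost, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05,
                         iters=2000, seed=0):
          """Minimize cost over box bounds [(lo, hi), ...] with harmony search."""
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds, float).T
          dim = len(bounds)
          memory = rng.uniform(lo, hi, size=(hms, dim))   # harmony memory
          costs = np.array([cost(h) for h in memory])
          for _ in range(iters):
              # Memory consideration per component, else random sampling.
              from_mem = memory[rng.integers(hms, size=dim), np.arange(dim)]
              new = np.where(rng.random(dim) < hmcr, from_mem, rng.uniform(lo, hi))
              # Pitch adjustment: small moves scaled to the variable ranges.
              adjust = rng.random(dim) < par
              new = np.clip(new + adjust * rng.uniform(-bw, bw, dim) * (hi - lo), lo, hi)
              worst, c_new = costs.argmax(), cost(new)
              if c_new < costs[worst]:                    # replace worst harmony
                  memory[worst], costs[worst] = new, c_new
          return memory[costs.argmin()]

      # Stand-in cost over [climb speed, altitude, cruise speed, descent speed].
      fuel_proxy = lambda x: ((x[0] - 290)**2 + (x[1] - 36000)**2 / 1e6
                              + (x[2] - 450)**2 + (x[3] - 280)**2)
      best = harmony_search(fuel_proxy,
                            [(250, 340), (28000, 41000), (400, 500), (250, 320)])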

  13. Des recommandations probantes pour surveiller l’innocuité des antipsychotiques de deuxième génération chez les enfants et les adolescents

    PubMed Central

    Pringsheim, Tamara; Panagiotopoulos, Constadina; Davidson, Jana; Ho, Josephine

    2012-01-01

    BACKGROUND: In Canada, the use of antipsychotics, particularly second-generation antipsychotics (SGAs), has increased considerably over the past five years among children with mental health disorders. These medications have the potential to cause serious metabolic and neurological complications when used chronically. OBJECTIVE: To synthesize the evidence on the specific metabolic and neurological side effects associated with SGA use in children, and to provide evidence-based recommendations for monitoring these side effects. METHODS: The authors carried out a systematic review of controlled clinical trials of SGAs in children. Recommendations for the safety monitoring of SGAs were made according to a classification model based on the GRADE system (Grading of Recommendations Assessment, Development and Evaluation). When the evidence was insufficient, recommendations were based on consensus and expert opinion. A multidisciplinary consensus group reviewed all the relevant evidence and reached consensus on the recommendations. RESULTS: Evidence-based recommendations for monitoring the safety of SGAs are presented in these guidelines. The authors indicate the quality of the recommendations for specific physical examinations and laboratory tests for each SGA at defined time points. CONCLUSION: Multiple randomized controlled trials have evaluated the efficacy of many of the SGAs used to treat mental health disorders in children. However, their benefits are not without risks: both metabolic and neurological side effects are observed in children treated with

  14. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective: Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example: To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion: There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
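
    A drastically simplified sketch of the CDM idea (Python with sqlite3; the two-table schema below is a toy stand-in, far smaller than the real OMOP CDM): once every source is translated into one standard layout, the same analytic query runs unchanged against all of them.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE drug_exposure (person_id INT, drug_concept_id INT, start_date TEXT);
        CREATE TABLE condition_occurrence (person_id INT, condition_concept_id INT, start_date TEXT);
        """)
        con.executemany("INSERT INTO drug_exposure VALUES (?,?,?)",
                        [(1, 100, "2009-01-05"), (2, 100, "2009-02-01")])
        con.executemany("INSERT INTO condition_occurrence VALUES (?,?,?)",
                        [(1, 200, "2009-01-20")])

        # Count persons with the outcome within 30 days of exposure; the same SQL
        # works for every drug/outcome pair and every CDM-conformant source.
        n = con.execute("""
        SELECT COUNT(DISTINCT d.person_id)
        FROM drug_exposure d
        JOIN condition_occurrence c
          ON c.person_id = d.person_id
         AND julianday(c.start_date) - julianday(d.start_date) BETWEEN 0 AND 30
        WHERE d.drug_concept_id = 100 AND c.condition_concept_id = 200
        """).fetchone()[0]
        print(n)  # -> 1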

  15. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Technical Reports Server (NTRS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard; Hearty, Thomas

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model (Tinetti et al., 2006a,b). This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of approx. 100 pixels on the visible disk, and four categories of water clouds, which were defined using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to the Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square error of typically less than 3% for the multiwavelength lightcurves, and residuals of approx. 10% for the absolute brightness throughout the visible and NIR spectral range. We extend our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of approx. 7%, and temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated
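
    The fit quality quoted here is a plain root-mean-square statistic; a sketch of the comparison for one lightcurve (Python, with synthetic arrays standing in for the EPOXI data and the model output):

        import numpy as np

        def rms_percent_error(model, observed):
            """Root mean square of the relative residuals, in percent."""
            residuals = (model - observed) / observed
            return 100.0 * np.sqrt(np.mean(residuals ** 2))

        # Hypothetical 24-point diurnal lightcurves (arbitrary reflectance units).
        observed = 0.30 + 0.02 * np.sin(np.linspace(0.0, 2.0 * np.pi, 24))
        model = observed * (1.0 + np.random.default_rng(2).normal(0.0, 0.02, 24))
        print(f"{rms_percent_error(model, observed):.1f}%")   # ~2%, cf. the <3% quoted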

  16. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  17. Earth as an extrasolar planet: Earth model validation using EPOXI earth observations.

    PubMed

    Robinson, Tyler D; Meadows, Victoria S; Crisp, David; Deming, Drake; A'hearn, Michael F; Charbonneau, David; Livengood, Timothy A; Seager, Sara; Barry, Richard K; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M; McFadden, Lucy A; Wellnitz, Dennis D

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  18. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    PubMed Central

    Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward

  19. Computational Modeling and Validation for Hypersonic Inlets

    NASA Technical Reports Server (NTRS)

    Povinelli, Louis A.

    1996-01-01

    Hypersonic inlet research activity at NASA is reviewed. The basis for the paper is the experimental tests performed with three inlets: the NASA Lewis Research Center Mach 5, the McDonnell Douglas Mach 12, and the NASA Langley Mach 18. Both three-dimensional PNS and NS codes have been used to compute the flow within the three inlets. Modeling assumptions in the codes involve the turbulence model, the nature of the boundary layer, shock wave-boundary layer interaction, and the flow spilled to the outside of the inlet. Use of the codes and the experimental data are helping to develop a clearer understanding of the inlet flow physics and to focus on the modeling improvements required in order to arrive at validated codes.

  20. Influence du comportement des accompagnants sur le vécu des patients admis pour hémorragies digestives hautes au CHU campus de Lomé (Togo)

    PubMed Central

    Bagny, Aklesso; Dusabe, Angelique; Bouglouga, Oumboma; Lawson-ananisoh, Mawuli Late; Kaaga, Yeba Laconi; Djibril, Mohaman Awalou; Soedje, Kokou Mensah; Dassa, Simliwa Kolou; Redah, Datouda

    2014-01-01

    Introduction: Upper gastrointestinal (GI) bleeding is an emergency that often places patients in mortal danger, provoking anxiety and agitation. In this state, the patient depends on his or her attendants (accompanying family members) for care and for paying for treatment; however, a discrepancy between the urgency of the situation and the behaviour of the attendants has often been observed. The aim of this study was to describe the socioeconomic and psychological factors that may influence the behaviour of the attendants of patients admitted for upper GI bleeding, and to estimate the strength of the relationship between these behaviours and the associated factors on the one hand, and the experience of the patients admitted for upper GI bleeding on the other. Methods: This was a prospective study conducted from September 2010 to June 2011 (10 months). We used semi-structured interviews and direct observation to collect our data, which were processed by statistical methods and content analysis. Results: In the present study, the behaviour of the attendants of patients admitted for upper GI bleeding was mostly characterized by abandonment (84%) and lack of solicitude (80.2%). These behaviours were often driven by socioeconomic factors such as economic hardship (83.2%), intra-family conflicts (85.1%) and the attendants' representations of the disease (incurable illness or bewitchment) in 73.3% of cases. As for the patients, they experienced these behaviours as death threats or rejection (77.20%) and as devaluation or humiliation on the part of their attendants (70.30%). The results confirm the existence of a significant link between the attendants' behaviours and socioeconomic factors, between the attendants' behaviours and psychological factors, and between the experience of patients admitted for upper GI bleeding and the attendants' behaviours. Conclusion: Future studies should address the points

  1. Validation of Slosh Modeling Approach Using STAR-CCM+

    NASA Technical Reports Server (NTRS)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right-cylinder tank and a right cylinder with a single ring baffle.

  2. Quantitative impedance measurements for eddy current model validation

    NASA Astrophysics Data System (ADS)

    Khan, T. A.; Nakagawa, N.

    2000-05-01

    This paper reports on a series of laboratory-based impedance measurement data, collected by the use of a quantitatively accurate, mechanically controlled measurement station. The purpose of the measurement is to validate a BEM-based eddy current model against experiment. We have therefore selected two "validation probes," which are both split-D differential probes. Their internal structures and dimensions are extracted from x-ray CT scan data, and thus known within the measurement tolerance. A series of measurements was carried out using the validation probes and two Ti-6Al-4V block specimens, one containing two 1-mm-long fatigue cracks, and the other containing six EDM notches of a range of sizes. A motor-controlled XY scanner performed raster scans over the cracks, with the probe riding on the surface with a spring-loaded mechanism to maintain the lift-off. Both an impedance analyzer and a commercial EC instrument were used in the measurement. The probes were driven in both differential and single-coil modes for the specific purpose of model validation. The differential measurements were done exclusively with the eddyscope, while the single-coil data were taken with both the impedance analyzer and the eddyscope. From the single-coil measurements, we obtained the transfer function to translate the voltage output of the eddyscope into impedance values, and then used it to translate the differential measurement data into impedance results. The presentation will highlight the schematics of the measurement procedure, representative raw data, an explanation of the post-processing procedure, and a series of resulting 2D flaw impedance results. A noise estimation will also be given, in order to quantify the accuracy of these measurements and for use in probability-of-detection estimation. This work was supported by the NSF Industry/University Cooperative Research Program.
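
    The voltage-to-impedance translation described here amounts to estimating a transfer function from the paired single-coil measurements and applying it to the differential channel; a sketch under the simplifying assumption of a single complex scale factor (Python; all data below are synthetic):

        import numpy as np

        def fit_transfer(voltage, impedance):
            """Least-squares complex scale factor k such that impedance ~ k * voltage."""
            v, z = np.asarray(voltage), np.asarray(impedance)
            return np.vdot(v, z) / np.vdot(v, v)

        rng = np.random.default_rng(3)
        # Single-coil data taken with both instruments at the same probe positions.
        v_single = rng.normal(size=50) + 1j * rng.normal(size=50)   # eddyscope volts
        k_true = 2.0 - 0.5j                                         # unknown in practice
        z_single = k_true * v_single + 0.01 * rng.normal(size=50)   # analyzer ohms

        k = fit_transfer(v_single, z_single)                        # recovers ~k_true
        v_diff = rng.normal(size=50) + 1j * rng.normal(size=50)     # differential scan
        z_diff = k * v_diff                # differential voltages as impedance values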

  3. Validation of the factor structure of the adolescent dissociative experiences scale in a sample of trauma-exposed detained youth.

    PubMed

    Kerig, Patricia K; Charak, Ruby; Chaplo, Shannon D; Bennett, Diana C; Armour, Cherie; Modrowski, Crosby A; McGee, Andrew B

    2016-09-01

    The inclusion of a dissociative subtype in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) criteria for the diagnosis of posttraumatic stress disorder (PTSD) has highlighted the need for valid and reliable measures of dissociative symptoms across developmental periods. The Adolescent Dissociative Experiences Scale (A-DES) is one of the few measures validated for young persons, but previous studies have yielded inconsistent results regarding its factor structure. Further, research to date on the A-DES has been based upon nonclinical samples of youth or those without a known history of trauma. To address these gaps in the literature, the present study investigated the factor structure and construct validity of the A-DES in a sample of highly trauma-exposed youth involved in the juvenile justice system. A sample of 784 youth (73.7% boys) recruited from a detention center completed self-report measures of trauma exposure and the A-DES, a subset of whom (n = 212) also completed a measure of PTSD symptoms. Confirmatory factor analyses revealed a best-fitting 3-factor structure comprised of depersonalization or derealization, amnesia, and loss of conscious control, with configural and metric invariance across gender. Logistic regression analyses indicated that the depersonalization or derealization factor effectively distinguished between those youth who did and did not likely meet criteria for a diagnosis of PTSD, as well as those with PTSD who did and did not likely meet criteria for the dissociative subtype. These results provide support for the multidimensionality of the construct of posttraumatic dissociation and contribute to the understanding of the dissociative subtype of PTSD among adolescents. (PsycINFO Database Record (c) 2016 APA, all rights reserved)

  4. Dynamic modelling and experimental validation of three wheeled tilting vehicles

    NASA Astrophysics Data System (ADS)

    Amati, Nicola; Festini, Andrea; Pelizza, Luigi; Tonoli, Andrea

    2011-06-01

    The present paper describes the study of the stability in straight running of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of motorcycles. Field tests were carried out to validate the model and identify some critical parameters, such as the damping on the steering system. The stability analysis demonstrates that the lateral dynamic motions are characterised by vibration modes that are similar to those of a motorcycle. Additionally, it shows that the wobble mode is significantly affected by the castor trail, whereas it is only slightly affected by the dynamics of the front suspension. For the present case study, the frame compliance also has no influence on the weave and wobble.

  5. Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions

    DTIC Science & Technology

    2017-09-01

    Experimental Validation of Model Updating and Damage Detection via Eigenvalue Sensitivity Methods with Artificial Boundary Conditions, by Matthew D. Bouwense.

  6. Scramjet Thermal Management (Tenue thermique des superstatoreacteurs)

    DTIC Science & Technology

    2010-09-01

    Scramjet Thermal Management (Tenue thermique des superstatoréacteurs), Marc Bouchez, MBDA France (RTO-EN-AVT-185). The lecture series "HIGH SPEED PROPULSION: ENGINE DESIGN – INTEGRATION AND THERMAL MANAGEMENT" was given at the Von Karman Institute in September 2010 and at Wright University in Dayton (Ohio, USA) in December 2010. The present lecture addresses thermal management. It is based on published information, and details

  7. Selection, calibration, and validation of models of tumor growth.

    PubMed

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors is presented, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory
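
    The simplest member of the model list compared here is a reaction-diffusion (Fisher-KPP) equation for the tumor cell fraction, du/dt = D u_xx + k u (1 - u); a minimal 1-D explicit-time-step sketch (Python; parameters illustrative, not calibrated to the murine data):

        import numpy as np

        def fisher_kpp_step(u, D=0.05, k=0.5, dx=0.1, dt=0.01):
            """One explicit step of du/dt = D u_xx + k u (1 - u), crude no-flux ends."""
            lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
            lap[0] = (u[1] - u[0]) / dx**2
            lap[-1] = (u[-2] - u[-1]) / dx**2
            return u + dt * (D * lap + k * u * (1.0 - u))

        u = np.zeros(200)
        u[95:105] = 0.5                        # small initial lesion
        for _ in range(2000):                  # the tumor front spreads outward
            u = fisher_kpp_step(u)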

  8. Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.

    PubMed

    Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle

    2017-02-01

    To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods were used. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. The final sample was n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data had good fit with the specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.

  9. Classification of 13 DES supernova with OzDES

    NASA Astrophysics Data System (ADS)

    Macaulay, E.; Allam, S.; Tucker, D.; Asorey, J.; Davis, T. M.; Kremlin, A.; Lewis, G. F.; Lidman, C.; Martini, P.; Sommer, N. E.; Tucker, B. E.; Aldering, G.; Gupta, R.; Kim, A. G.; Thomas, R. C.; Barbary, K.; Bloom, J. S.; Goldstein, D.; Nugent, P.; Perlmutter, S.; Foley, R. J.; Pan, Y.-C.; Casas, R.; Castander, F. J.; Papadopoulos, A.; Morganson, E.; Desai, S.; Paech, K.; Smith, R. C.; Schubnell, M.; Moller, A.; Muthukrishna, D. R.; Yuan, F.; Zhang, B.; Hinton, S.; Parkinson, D.; Uddin, S.; Kessler, R.; Lasker, J.; Scolnic, D.; Brout, D. J.; D'Andrea, C.; Gladney, L.; March, M.; Sako, M.; Wolf, R. C.; Brown, P. J.; Krisciunas, K.; Suntzeff, N.; Nichol, R.; Maartens, R.; Childress, M.; Prajs, S.; Smith, M.; Sullivan, M.; Kovacs, E.; Kuhlmann, S.; Spinka, H.; Ahn, E.; Finley, D. A.; Frieman, J.; Marriner, J.; Wester, W.

    2018-01-01

    We report new spectroscopic classifications by OzDES of supernovae discovered by the Dark Energy Survey (ATEL #4668). The spectra (370-885nm) were obtained with the AAOmega Spectrograph (Saunders et al. 2004, SPIE, 5492, 389) and the 2dF fibre positioner at the Anglo-Australian Telescope (AAT).

  10. Classification of 26 DES supernova with OzDES

    NASA Astrophysics Data System (ADS)

    Calcino, J.; Davis, T. M.; Hoormann, J. K.; Asorey, J.; Glazebrook, K.; Carnero, A.; Lidman, C.; Martini, P.; Moller, A.; Sommer, N. E.; Sharp, R.; Tucker, B. E.; Barbary, K.; Bloom, J. S.; Goldstein, D.; Nugent, P.; Perlmutter, S.; Foley, R. J.; Pan, Y.-C.; Casas, R.; Castander, F. J.; Papadopoulos, A.; Morganson, E.; Desai, S.; Paech, K.; Smith, R. C.; Schubnell, M.; Muthukrishna, D. R.; Yuan, F.; Zhang, B.; Hinton, S.; Lewis, G. F.; Parkinson, D.; Uddin, S.; Kessler, R.; Lasker, J.; Scolnic, D.; Brout, D. J.; D'Andrea, C.; Gladney, L.; March, M.; Sako, M.; Wolf, R. C.; Brown, P. J.; Krisciunas, K.; Suntzeff, N.; Macaulay, E.; Nichol, R.; Maartens, R.; Childress, M.; Prajs, S.; Smith, M.; Sullivan, M.; Kovacs, E.; Kuhlmann, S.; Spinka, H.; Ahn, E.; Finley, D. A.; Frieman, J.; Marriner, J.; Wester, W.; Aldering, G.; Gupta, R.; Kim, A. G.; Thomas, R. C.

    2018-01-01

    We report new spectroscopic classifications by OzDES of supernovae discovered by the Dark Energy Survey (ATEL #4668). The spectra (370-885nm) were obtained with the AAOmega Spectrograph (Saunders et al. 2004, SPIE, 5492, 389) and the 2dF fibre positioner at the Anglo-Australian Telescope (AAT).

  11. Classification of 25 DES supernova with OzDES

    NASA Astrophysics Data System (ADS)

    Calcino, J.; Davis, T. M.; Hoormann, J. K.; Asorey, J.; Glazebrook, K.; Carnero, A.; Lidman, C.; Martini, P.; Moller, A.; Sommer, N. E.; Sharp, R.; Tucker, B. E.; Barbary, K.; Bloom, J. S.; Goldstein, D.; Nugent, P.; Perlmutter, S.; Foley, R. J.; Pan, Y.-C.; Casas, R.; Castander, F. J.; Papadopoulos, A.; Morganson, E.; Desai, S.; Paech, K.; Smith, R. C.; Schubnell, M.; Muthukrishna, D. R.; Yuan, F.; Zhang, B.; Hinton, S.; Lewis, G. F.; Parkinson, D.; Uddin, S.; Kessler, R.; Lasker, J.; Scolnic, D.; Brout, D. J.; D'Andrea, C.; Gladney, L.; March, M.; Sako, M.; Wolf, R. C.; Brown, P. J.; Krisciunas, K.; Suntzeff, N.; Macaulay, E.; Nichol, R.; Maartens, R.; Childress, M.; Prajs, S.; Smith, M.; Sullivan, M.; Kovacs, E.; Kuhlmann, S.; Spinka, H.; Ahn, E.; Finley, D. A.; Frieman, J.; Marriner, J.; Wester, W.; Aldering, G.; Gupta, R.; Kim, A. G.; Thomas, R. C.

    2018-01-01

    We report new spectroscopic classifications by OzDES of supernovae discovered by the Dark Energy Survey (ATEL #4668). The spectra (370-885nm) were obtained with the AAOmega Spectrograph (Saunders et al. 2004, SPIE, 5492, 389) and the 2dF fibre positioner at the Anglo-Australian Telescope (AAT).

  12. Classification of 18 DES supernova with OzDES

    NASA Astrophysics Data System (ADS)

    Swann, E. S.; Lewis, G. F.; Lidman, C.; Panther, F. H.; Sharp, R.; Sommer, N. E.; Tucker, B. E.; Muthukrishna, D.; Casas, R.; Castander, F. J.; Papadopoulos, A.; Morganson, E.; Desai, S.; Paech, K.; Smith, R. C.; Schubnell, M.; Moller, A. R.; Yuan, F.; Zhang, B.; Davis, T. M.; Hinton, S.; Asorey, J.; Uddin, S.; Kessler, R.; Lasker, J.; Scolnic, D.; Brout, D. J.; D'Andrea, C.; Gladney, L.; March, M.; Sako, M.; Wolf, R. C.; Brown, P. J.; Krisciunas, K.; Suntzeff, N.; Macaulay, E.; Nichol, R.; Maartens, R.; Childress, M.; Prajs, S.; Smith, M.; Sullivan, M.; Kovacs, E.; Kuhlmann, S.; Spinka, H.; Ahn, E.; Finley, D. A.; Frieman, J.; Marriner, J.; Wester, W.; Aldering, G.; Gupta, R.; Kim, A. G.; Thomas, R. C.; Barbary, K.; Bloom, J. S.; Goldstein, D.; Nugent, P.; Perlmutter, S.; Foley, R. J.; Pan, Y.-C.

    2018-06-01

    We report new spectroscopic classifications by OzDES of supernovae discovered by the Dark Energy Survey (ATEL #4668). The spectra (370-885nm) were obtained with the AAOmega Spectrograph (Saunders et al. 2004, SPIE, 5492, 389) and the 2dF fibre positioner at the Anglo-Australian Telescope (AAT).

  13. Classification of 17 DES supernova with OzDES

    NASA Astrophysics Data System (ADS)

    Muthukrishna, D.; Sharp, R. G.; Tucker, B. E.; Moller, A.; Sommer, N. E.; Asorey, J.; Lewis, G. F.; Lidman, C.; Mould, J.; Macaulay, E.; Maartens, R.; Kovacs, E.; Kuhlmann, S.; Spinka, H.; Ahn, E.; Finley, D. A.; Frieman, J.; Marriner, J.; Wester, W.; Aldering, G.; Gupta, R.; Kim, A. G.; Thomas, R. C.; Barbary, K.; Bloom, J. S.; Goldstein, D.; Nugent, P.; Perlmutter, S.; Foley, R. J.; Pan, Y.-C.; Casas, R.; Castander, F. J.; Papadopoulos, A.; Morganson, E.; Desai, S.; Paech, K.; Smith, R. C.; Schubnell, M.; Yuan, F.; Zhang, B.; Davis, T. M.; Hinton, S.; Parkinson, D.; Uddin, S.; Kessler, R.; Lasker, J.; Scolnic, D.; Brout, D. J.; D'Andrea, C.; Gladney, L.; March, M.; Sako, M.; Wolf, R. C.; Brown, P. J.; Krisciunas, K.; Suntzeff, N.; Nichol, R.; Childress, M.; Prajs, S.; Smith, M.; Sullivan, M.

    2017-09-01

    We report new spectroscopic classifications by OzDES of supernovae discovered by the Dark Energy Survey (ATEL #4668). The spectra (370-885nm) were obtained with the AAOmega Spectrograph (Saunders et al. 2004, SPIE, 5492, 389) and the 2dF fibre positioner at the Anglo-Australian Telescope (AAT).

  14. Summary of EASM Turbulence Models in CFL3D With Validation Test Cases

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Gatski, Thomas B.

    2003-01-01

    This paper summarizes the Explicit Algebraic Stress Model in k-omega form (EASM-ko) and in k-epsilon form (EASM-ke) in the Reynolds-averaged Navier-Stokes code CFL3D. These models have been actively used over the last several years in CFL3D, and have undergone some minor modifications during that time. Details of the equations and method for coding the latest versions of the models are given, and numerous validation cases are presented. This paper serves as a validation archive for these models.

  15. Rationality Validation of a Layered Decision Model for Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Zhang, Du

    2007-08-31

    We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

  16. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. This model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects, and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the problem solution to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic circumstances. To validate the model, engineers at the Aircraft Landing Dynamics facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
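
    A sketch of the nonlinear strut-force ingredients named above, i.e. a polytropic gas spring plus velocity-squared orifice damping (Python; all constants are illustrative placeholders, not the A-6 gear values):

        def strut_force(stroke, stroke_rate,
                        p0=2.0e6, A=0.01, V0=1.0e-3, n=1.3, c=5.0e4):
            """Oleo-pneumatic strut force: polytropic gas spring + v^2 orifice damping.

            p0, V0: initial gas pressure (Pa) and volume (m^3); A: pneumatic
            piston area (m^2); n: polytropic exponent; c: hydraulic damping
            coefficient (N s^2/m^2). All values are illustrative.
            """
            V = V0 - A * stroke                           # gas volume under compression
            gas = p0 * A * (V0 / V) ** n                  # polytropic spring force
            damping = c * stroke_rate * abs(stroke_rate)  # quadratic in stroke rate
            return gas + damping

        print(strut_force(0.05, -0.5))                    # e.g. rebound at mid-stroke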

  17. The Development and Validation of a New Land Surface Model for Regional and Global Climate Modeling

    NASA Astrophysics Data System (ADS)

    Lynch-Stieglitz, Marc

    1995-11-01

    A new land-surface scheme intended for use in mesoscale and global climate models has been developed and validated. The ground scheme consists of six soil layers. Diffusion and a modified tipping-bucket model govern heat and water flow, respectively. A three-layer snow model has been incorporated into a modified BEST vegetation scheme. TOPMODEL equations and Digital Elevation Model data are used to generate baseflow, which supports lowland saturated zones. Soil moisture heterogeneity represented by saturated lowlands subsequently impacts watershed evapotranspiration, the partitioning of surface fluxes, and the development of the storm hydrograph. Five years of meteorological and hydrological data from the Sleepers River watershed, located in the eastern highlands of Vermont where winter snow cover is significant, were then used to drive and validate the new scheme. Site validation data were sufficient to evaluate model performance with regard to various aspects of the watershed water balance, including snowpack growth/ablation, the spring snowmelt hydrograph, storm hydrographs, and the seasonal development of watershed evapotranspiration and soil moisture. By including topographic effects, not only are the main spring hydrographs and individual storm hydrographs adequately resolved, but the mechanisms generating runoff are consistent with current views of hydrologic processes. The seasonal movement of the mean water-table depth and the saturated area of the watershed are consistent with site data, and the overall model hydroclimatology, including the surface fluxes, seems reasonable.
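
    The TOPMODEL element is compact enough to sketch: baseflow decays exponentially with the mean water-table deficit, and a cell saturates where its topographic index exceeds a deficit-dependent threshold. A minimal Python sketch with illustrative parameter values (symbols follow the usual TOPMODEL notation):

        import numpy as np

        def topmodel_step(zbar, ti, T0=5.0, m=0.05, recharge=0.001):
            """One update of the mean water-table deficit zbar (m) on a watershed.

            ti: array of topographic index ln(a / tan(beta)) per DEM cell.
            Baseflow per unit area: qb = T0 * exp(-lam) * exp(-zbar / m).
            """
            lam = ti.mean()                              # watershed-average index
            qb = T0 * np.exp(-lam) * np.exp(-zbar / m)   # baseflow this step
            saturated = ti >= lam + zbar / m             # cells with local deficit <= 0
            zbar = zbar + qb - recharge                  # drainage deepens the deficit
            return zbar, qb, saturated.mean()            # deficit, flow, sat. fraction

        ti = np.random.default_rng(4).gamma(4.0, 1.5, 10000)   # synthetic index field
        zbar = 0.1
        for _ in range(10):
            zbar, qb, fsat = topmodel_step(zbar, ti)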

  18. Influence des défauts de la structure du verre sur la résistance mécanique des fibres optiques

    NASA Astrophysics Data System (ADS)

    Chmel, A.; Baptizmanski, V. V.; Kharshak, A. A.

    1992-12-01

    For silica (pure and doped) optical fibers prepared from preforms irradiated by thermal neutrons and Ar^+ ions, measurements of lifetime under transverse stress were carried out. It is found that neutron bombardment decreases the lifetime and strength of the fiber, while ion implantation increases these parameters. The influence of particle irradiation on the glass strength is explained by the generation of structural defects, which were observed by infrared and Raman spectroscopy. The direction of the change in the mechanical properties of the fibers is determined by the type of defects and their distribution over the fiber cross-section. Silica optical fiber preforms were exposed to thermal neutron beams and to 40 keV Ar^+ ion beams in order to generate structural defects in the bulk or in the surface layer of the samples, respectively. Fibers drawn from the irradiated and non-irradiated preforms were loaded in bending, and their time to rupture was determined under various tensile stresses at the outer surface of the bent fiber. A decrease in the time to rupture was observed after neutron bombardment and an increase after ion implantation. Analysis of the nature of the defects by infrared and Raman spectroscopy showed that ion irradiation mainly affected chemical bonds in a thin surface layer of the sample, whereas neutron irradiation deformed regions of the silica network throughout the cross-section of the preform and the fiber. The improvement in mechanical strength is explained by an increase in the mobility of the structural elements of the glass matrix following partial breaking of chemical bonds, and its decrease by the appearance of concentrators

  19. Etude de la dynamique des porteurs dans des nanofils de silicium par spectroscopie terahertz

    NASA Astrophysics Data System (ADS)

    Beaudoin, Alexandre

    This thesis presents a study of the electrical conduction properties and the temporal dynamics of charge carriers in silicon nanowires probed by terahertz radiation. The cases of unintentionally doped and n-type doped silicon nanowires are compared for different configurations of the experimental setup. Terahertz transmission spectroscopy measurements show that it is possible to detect the presence of dopants in the nanowires via their absorption of terahertz radiation (~1-12 meV). The difficulties of modelling the transmission of an electromagnetic pulse through a system of nanowires are also discussed. Differential detection, a modification of the terahertz spectroscopy system, is tested and its performance is compared with the standard characterization setup. Instructions and recommendations for implementing this type of measurement are included. The results of an optical pump-terahertz probe experiment are also presented. In this experiment, the charge carriers temporarily created following absorption of the optical pump (lambda ~ 800 nm) in the nanowires (the photocarriers) add to the carriers initially present and therefore increase the absorption of the terahertz radiation. First, the anisotropy of the terahertz absorption and of the optical pump absorption by the nanowires is demonstrated. Second, the photocarrier recombination time is studied as a function of the number of injected photocarriers. A hypothesis explaining the behaviours observed for the undoped and n-doped nanowires is presented. Third, the photoconductivity is extracted for the undoped and n-doped nanowires over a range of 0.5 to 2 THz. A fit to the photoconductivity allows the number of dopants in the n-doped nanowires to be estimated. Keywords: nanowire, silicon, terahertz, conductivity, spectroscopy, photoconductivity.

  20. Validation of landsurface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

    The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (in solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were to be alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g. vegetation properties, surface albedos and roughnesses, etc.) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g. "bucket" vs more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  1. A new simple local muscle recovery model and its theoretical and experimental validation.

    PubMed

    Ma, Liang; Zhang, Wei; Wu, Su; Zhang, Zhanwu

    2015-01-01

    This study was conducted to provide theoretical and experimental validation of a local muscle recovery model. Muscle recovery has been modeled in different empirical and theoretical approaches to determine work-rest allowances for musculoskeletal disorder (MSD) prevention. However, time-related parameters and individual attributes have not been sufficiently considered in conventional approaches. A new muscle recovery model was proposed by integrating time-related task parameters and individual attributes. Theoretically, this muscle recovery model was compared mathematically to other theoretical models. Experimentally, a total of 20 subjects participated in the experimental validation. Hand grip force recovery and shoulder joint strength recovery were measured after a fatiguing operation. The recovery profile was fitted using the recovery model, and individual recovery rates were calculated after fitting. Good fits (r² > .8) were found for all the subjects. Significant differences in recovery rates were found among different muscle groups (p < .05). The theoretical muscle recovery model was primarily validated by characterization of the recovery process after a fatiguing operation. The determined recovery rate may be useful to represent individual recovery attributes.
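
    Fitting an individual recovery rate of this kind typically reduces to nonlinear least squares on an exponential return toward baseline strength; a sketch with made-up measurements (Python/scipy; the paper's actual model form carries additional task and individual parameters):

        import numpy as np
        from scipy.optimize import curve_fit

        def recovery(t, R, f0):
            """Fraction of maximum strength after t minutes of rest, rising from
            the post-fatigue level f0 toward 1 at individual rate R."""
            return 1.0 - (1.0 - f0) * np.exp(-R * t)

        t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 15.0])                # rest, minutes
        strength = np.array([0.61, 0.70, 0.76, 0.85, 0.93, 0.98])    # made-up data
        (R_hat, f0_hat), _ = curve_fit(recovery, t, strength, p0=[0.2, 0.6])
        resid = strength - recovery(t, R_hat, f0_hat)
        r2 = 1.0 - resid.var() / strength.var()          # cf. the r^2 > .8 criterion
        print(R_hat, r2)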

  2. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases : a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed

  3. Detached-Eddy Simulation Based on the V2-F Model

    NASA Technical Reports Server (NTRS)

    Jee, Sol Keun; Shariff, Karim R.

    2012-01-01

    Detached-eddy simulation (DES) based on the v²-f Reynolds-averaged Navier-Stokes (RANS) model is developed and tested. The v²-f model incorporates the anisotropy of near-wall turbulence, which is absent in other RANS models commonly used in the DES community. The v²-f RANS model is modified so that the proposed v²-f-based DES formulation reduces to a transport equation for the subgrid-scale kinetic energy in isotropic turbulence. First, three coefficients in the elliptic relaxation equation are modified, and the modification is tested in channel flows with friction Reynolds numbers up to 2000. Then, the proposed v²-f DES formulation is derived. The constant C_DES required in the DES formulation was calibrated by simulating both decaying and statistically steady isotropic turbulence. After C_DES was calibrated, the v²-f DES formulation was tested for flow around a circular cylinder at a Reynolds number of 3900, in which case turbulence develops after separation. Simulations indicate that this model represents the turbulent wake nearly as accurately as the dynamic Smagorinsky model. Spalart-Allmaras-based DES is also included in the cylinder flow simulation for comparison.
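
    The defining DES ingredient is the replacement of the RANS length scale by min(l_RANS, C_DES * Delta); a sketch of how the calibrated constant enters the dissipation term (Python; an illustrative scalar form, not the full v²-f system):

        import numpy as np

        def des_dissipation(k, l_rans, delta, c_des=0.65):
            """DES-modified dissipation of turbulent kinetic energy k.

            l_des = min(l_rans, c_des * delta): near walls the RANS length scale
            wins (RANS mode); in detached regions the grid scale wins and the
            term reduces to the subgrid form eps = k**1.5 / (c_des * delta).
            c_des = 0.65 is a commonly quoted isotropic-turbulence calibration;
            the paper calibrates its own constant for the v2-f formulation.
            """
            l_des = np.minimum(l_rans, c_des * delta)
            return k ** 1.5 / l_des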

  4. Lessons learned from recent geomagnetic disturbance model validation activities

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns about the impact of geomagnetically induced currents on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly, there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on the past experiences and expands the collaborations to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on those experiences under the new delta-B working group.
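
    Delta-B model validation of this kind is typically scored with event-based contingency statistics on threshold crossings of dB/dt; a sketch of probability of detection and false-alarm ratio (Python; the threshold and windowing below are illustrative, not the GEM Challenge settings):

        import numpy as np

        def crossing_skill(obs, model, threshold, window=60):
            """POD and false-alarm ratio for threshold crossings of |dB/dt|.

            obs, model: 1-min time series; a window scores a 'hit' when both
            series exceed the threshold at least once within it.
            """
            nwin = len(obs) // window
            o = obs[:nwin * window].reshape(nwin, window).max(axis=1) > threshold
            m = model[:nwin * window].reshape(nwin, window).max(axis=1) > threshold
            hits = np.sum(o & m)
            misses = np.sum(o & ~m)
            false_alarms = np.sum(~o & m)
            pod = hits / (hits + misses) if hits + misses else np.nan
            far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
            return pod, far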

  5. Methodes iteratives paralleles: Applications en neutronique et en mecanique des fluides

    NASA Astrophysics Data System (ADS)

    Qaddouri, Abdessamad

    In this thesis, parallel computing is applied successively to neutronics and to fluid mechanics. In each of these two applications, iterative methods are used to solve the system of algebraic equations resulting from the discretization of the equations of the physical problem. In the neutronics problem, the computation of the collision probability (CP) matrices, as well as a multigroup iterative scheme using an inverse power method, are parallelized. In the fluid mechanics problem, a finite element code using a preconditioned GMRES-type iterative algorithm is parallelized. This thesis is presented as six articles followed by a conclusion. The first five articles deal with the neutronics applications and represent the evolution of our work in this field. This evolution goes through a parallel computation of the CP matrices and a parallel multigroup algorithm tested on a one-dimensional problem (article 1), then through two parallel algorithms, one multiregion and the other multigroup, tested on two-dimensional problems (articles 2-3). These first two steps are followed by the application of two acceleration techniques, neutron rebalancing and residual minimization, to the two parallel algorithms (article 4). Finally, the multigroup algorithm and the parallel computation of the CP matrices were implemented in the production code DRAGON, where the tests are more realistic and can be three-dimensional (article 5). The sixth article (article 6), devoted to the fluid mechanics application, deals with the parallelization of a finite element code, FES, in which the METIS graph partitioner and the PSPARSLIB library are used.
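
    The preconditioned GMRES kernel at the heart of the sixth article can be sketched in a few lines (Python/scipy, serial; the thesis distributes the work across processors using METIS partitions and PSPARSLIB, and the matrix below is a toy stand-in for the finite element system):

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import LinearOperator, gmres, spilu

        n = 1000                                          # toy 1-D Poisson-like system
        A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        ilu = spilu(A)                                    # incomplete-LU preconditioner
        M = LinearOperator((n, n), ilu.solve)
        x, info = gmres(A, b, M=M)
        assert info == 0                                  # 0 means converged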

  6. Temporal and external validation of a prediction model for adverse outcomes among inpatients with diabetes.

    PubMed

    Adderley, N J; Mallett, S; Marshall, T; Ghosh, S; Rayman, G; Bellary, S; Coleman, J; Akiboye, F; Toulis, K A; Nirantharakumar, K

    2018-06-01

    To temporally and externally validate our previously developed prediction model, which used data from University Hospitals Birmingham to identify inpatients with diabetes at high risk of adverse outcome (mortality or excessive length of stay), in order to demonstrate its applicability to other hospital populations within the UK. Temporal validation was performed using data from University Hospitals Birmingham and external validation was performed using data from both the Heart of England NHS Foundation Trust and Ipswich Hospital. All adult inpatients with diabetes were included. Variables included in the model were age, gender, ethnicity, admission type, intensive therapy unit admission, insulin therapy, albumin, sodium, potassium, haemoglobin, C-reactive protein, estimated GFR and neutrophil count. Adverse outcome was defined as excessive length of stay or death. Model discrimination in the temporal and external validation datasets was good. In temporal validation using data from University Hospitals Birmingham, the area under the curve was 0.797 (95% CI 0.785-0.810), sensitivity was 70% (95% CI 67-72) and specificity was 75% (95% CI 74-76). In external validation using data from Heart of England NHS Foundation Trust, the area under the curve was 0.758 (95% CI 0.747-0.768), sensitivity was 73% (95% CI 71-74) and specificity was 66% (95% CI 65-67). In external validation using data from Ipswich, the area under the curve was 0.736 (95% CI 0.711-0.761), sensitivity was 63% (95% CI 59-68) and specificity was 69% (95% CI 67-72). These results were similar to those for the internally validated model derived from University Hospitals Birmingham. The prediction model to identify patients with diabetes at high risk of developing an adverse event while in hospital performed well in temporal and external validation. The externally validated prediction model is a novel tool that can be used to improve care pathways for inpatients with diabetes. Further research to assess
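
    The headline statistics reported here (area under the ROC curve, and sensitivity and specificity at a chosen cut-point) are straightforward to reproduce on any validation dataset; a sketch with simulated data in place of the hospital records (Python/scikit-learn):

        import numpy as np
        from sklearn.metrics import confusion_matrix, roc_auc_score

        rng = np.random.default_rng(5)
        y = rng.integers(0, 2, 1000)                          # adverse outcome yes/no
        risk = np.clip(0.3 * y + rng.normal(0.35, 0.2, 1000), 0.0, 1.0)  # model score

        auc = roc_auc_score(y, risk)                          # discrimination
        pred = (risk >= 0.5).astype(int)                      # chosen cut-point
        tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        print(f"AUC={auc:.3f} sens={sensitivity:.2f} spec={specificity:.2f}")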

  7. Use of DES Modeling for Determining Launch Availability for SLS

    NASA Technical Reports Server (NTRS)

    Watson, Mike; Staton, Eric; Cates, Grant; Finn, Ron; Altino, Karen; Burns, Lee

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing new capabilities for human and scientific exploration beyond Earth's orbit. This effort includes the Space Shuttle derived Space Launch System (SLS), the Multi-Purpose Crew Vehicle (MPCV) "Orion", and the Ground Systems Development and Operations (GSDO). There are several requirements and Technical Performance Measures (TPMs) that have been levied by the Exploration Systems Development (ESD) upon the SLS, MPCV, and GSDO Programs including an integrated Launch Availability (LA) TPM. The LA TPM is used to drive into the SLS, Orion and GSDO designs a high confidence of successfully launching exploration missions that have narrow Earth departure windows. The LA TPM takes into consideration the reliability of the overall system (SLS, Orion and GSDO), natural environments, likelihood of a failure, and the time required to recover from an anomaly. A challenge with the LA TPM is the interrelationships between SLS, Orion, GSDO and the natural environments during launch countdown and launch delays that makes it impossible to develop an analytical solution for calculating the integrated launch probability. This paper provides an overview of how Discrete Event Simulation (DES) modeling was used to develop the LA TPM, how it was allocated down to the individual programs, and how the LA analysis is being used to inform and drive the SLS, Orion, and GSDO designs to ensure adequate launch availability for future human exploration.
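
    A minimal Monte Carlo sketch of the launch-availability idea: simulate repeated countdowns inside the Earth-departure window, drawing scrubs and recovery times at random, and count the fraction of campaigns that launch (Python; all probabilities and durations below are invented, not ESD numbers):

        import random

        def launch_campaign(window_days=14.0, p_scrub=0.25,
                            repair_days=(0.5, 3.0), seed=None):
            """Return True if a launch succeeds inside the Earth-departure window."""
            rng = random.Random(seed)
            day = 0.0
            while day < window_days:
                if rng.random() > p_scrub:        # countdown reaches liftoff
                    return True
                day += rng.uniform(*repair_days)  # scrub: draw a recovery time
            return False

        trials = 100_000
        availability = sum(launch_campaign(seed=i) for i in range(trials)) / trials
        print(f"launch availability ~ {availability:.3f}")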

  7. Use of DES Modeling for Determining Launch Availability for SLS

    NASA Technical Reports Server (NTRS)

    Staton, Eric; Cates, Grant; Finn, Ronald; Altino, Karen M.; Burns, K. Lee; Watson, Michael D.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) is developing new capabilities for human and scientific exploration beyond Earth's orbit. This effort includes the Space Shuttle derived Space Launch System (SLS), the Orion Multi-Purpose Crew Vehicle (MPCV), and the Ground Systems Development and Operations (GSDO). Several requirements and Technical Performance Measures (TPMs) have been levied by the Exploration Systems Development (ESD) upon the SLS, Orion, and GSDO Programs, including an integrated Launch Availability (LA) TPM. The LA TPM is used to build into the SLS, Orion, and GSDO designs a high confidence of successfully launching exploration missions that have narrow Earth departure windows. The LA TPM takes into consideration the reliability of the overall system (SLS, Orion, and GSDO), the natural environments, the likelihood of a failure, and the time required to recover from an anomaly. A challenge with the LA TPM is the interrelationships between SLS, Orion, GSDO, and the natural environments during launch countdown and launch delays, which make it impossible to develop an analytical solution for the integrated launch probability. This paper provides an overview of how Discrete Event Simulation (DES) modeling was used to develop the LA TPM, how it was allocated down to the individual programs, and how the LA analysis is being used to inform and drive the SLS, Orion, and GSDO designs to ensure adequate launch availability for future human exploration.
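
The point that these interrelationships defeat a closed-form solution is easy to appreciate even in a toy simulation: once random failures, recovery times, and a finite window interact, availability is most naturally estimated by sampling. A minimal Monte Carlo sketch follows, with all probabilities and durations invented as placeholders rather than program values.

```python
# Toy launch-availability estimate: repeated countdowns with a scrub
# probability and a fixed recovery time inside a finite launch window.
# p_scrub, repair_days, and window_days are made-up placeholders.
import random

def campaign_launches(window_days=10, p_scrub=0.2, repair_days=3):
    """Simulate one campaign; True if a launch occurs within the window."""
    day = 0
    while day < window_days:
        if random.random() > p_scrub:   # countdown succeeds
            return True
        day += repair_days              # anomaly: recover, then try again
    return False

random.seed(7)
trials = 100_000
availability = sum(campaign_launches() for _ in range(trials)) / trials
print(f"estimated launch availability: {availability:.3f}")
```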

  8. Differential Validation of a Path Analytic Model of University Dropout.

    ERIC Educational Resources Information Center

    Winteler, Adolf

    Tinto's conceptual schema of college dropout forms the theoretical framework for the development of a model of university student dropout intention. This study validated Tinto's model in two different departments within a single university. Analyses were conducted on a sample of 684 college freshmen in the Departments of Education and Economics. A…

  9. Exploring the Validity of Proposed Transgenic Animal Models of Attention-Deficit Hyperactivity Disorder (ADHD).

    PubMed

    de la Peña, June Bryan; Dela Peña, Irene Joy; Custodio, Raly James; Botanas, Chrislean Jun; Kim, Hee Jin; Cheong, Jae Hoon

    2018-05-01

    Attention-deficit/hyperactivity disorder (ADHD) is a common, behavioral, and heterogeneous neurodevelopmental condition characterized by hyperactivity, impulsivity, and inattention. Symptoms of this disorder are managed by treatment with methylphenidate, amphetamine, and/or atomoxetine. The cause of ADHD is unknown, but substantial evidence indicates that this disorder has a significant genetic component. Transgenic animals have become an essential tool in uncovering the genetic factors underlying ADHD. Although they cannot accurately reflect the human condition, they can provide insights into the disorder that cannot be obtained from human studies due to various limitations. An ideal animal model of ADHD must have face (similarity in symptoms), predictive (similarity in response to treatment or medications), and construct (similarity in etiology or underlying pathophysiological mechanism) validity. As the exact etiology of ADHD remains unclear, the construct validity of animal models of ADHD would always be limited. The proposed transgenic animal models of ADHD have substantially increased and diversified over the years. In this paper, we compiled and explored the validity of proposed transgenic animal models of ADHD. Each of the reviewed transgenic animal models has strengths and limitations. Some fulfill most of the validity criteria of an animal model of ADHD and have been extensively used, while there are others that require further validation. Nevertheless, these transgenic animal models of ADHD have provided and will continue to provide valuable insights into the genetic underpinnings of this complex disorder.

  10. Validations of Computational Weld Models: Comparison of Residual Stresses

    DTIC Science & Technology

    2010-08-01

    DRDC Atlantic CR 2009-222; Defence R&D Canada – Atlantic; August 2010. Introduction: When welding work is carried out to ... overlay welding of two rigid panels. The structures were fitted with thermocouples and strain gauges to record the variations

  11. Des Moines metropolitan area ITS strategic plan

    DOT National Transportation Integrated Search

    1998-08-19

    The Des Moines Area Metropolitan Planning Organization (MPO) completed an early deployment study for the Des Moines metropolitan area in late 1997. The purpose of the study was to develop a strategic plan for Intelligent Transportation Systems (ITS) deploymen...

  12. Temporal validation for Landsat-based volume estimation model

    Treesearch

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four-county study...

  13. Methode d’Identification des Forces Aerodynamiques Instationnaires sur les Essais en Vol, Validation Experimentale (Method of Mathematical Identification of Unsteady Airloads From Flight Measurements, Experimental Validation)

    DTIC Science & Technology

    2000-05-01

    In-flight recording of strain-gauge responses during maneuvers, illustrated by an example ... (sideslip, ..., control-surface deflections, ...). The measurements are directly the strain-gauge responses; a weighted least-squares criterion Z is minimized, and the more or less subjective weighting factors of the measurements are replaced

  14. Modeling and validating HL7 FHIR profiles using semantic web Shape Expressions (ShEx).

    PubMed

    Solbrig, Harold R; Prud'hommeaux, Eric; Grieve, Grahame; McKenzie, Lloyd; Mandel, Joshua C; Sharma, Deepak K; Jiang, Guoqian

    2017-03-01

    HL7 Fast Healthcare Interoperability Resources (FHIR) is an emerging open standard for the exchange of electronic healthcare information. FHIR resources are defined in a specialized modeling language. FHIR instances can currently be represented in either XML or JSON. The FHIR and Semantic Web communities are developing a third FHIR instance representation format in Resource Description Framework (RDF). Shape Expressions (ShEx), a formal RDF data constraint language, is a candidate for describing and validating the FHIR RDF representation. Create a FHIR to ShEx model transformation and assess its ability to describe and validate FHIR RDF data. We created the methods and tools that generate the ShEx schemas modeling the FHIR to RDF specification being developed by HL7 ITS/W3C RDF Task Force, and evaluated the applicability of ShEx in the description and validation of FHIR to RDF transformations. The ShEx models contributed significantly to workgroup consensus. Algorithmic transformations from the FHIR model to ShEx schemas and FHIR example data to RDF transformations were incorporated into the FHIR build process. ShEx schemas representing 109 FHIR resources were used to validate 511 FHIR RDF data examples from the Standards for Trial Use (STU 3) Ballot version. We were able to uncover unresolved issues in the FHIR to RDF specification and detect 10 types of errors and root causes in the actual implementation. The FHIR ShEx representations have been included in the official FHIR web pages for the STU 3 Ballot version since September 2016. ShEx can be used to define and validate the syntax of a FHIR resource, which is complementary to the use of RDF Schema (RDFS) and Web Ontology Language (OWL) for semantic validation. ShEx proved useful for describing a standard model of FHIR RDF data. The combination of a formal model and a succinct format enabled comprehensive review and automated validation. Copyright © 2017 Elsevier Inc. All rights reserved.
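
The core idea behind shape-based validation of RDF, checking that each resource carries the required predicates with the right cardinalities, can be illustrated in a few lines of Python with rdflib. This is a sketch of the concept only, not the ShEx language or the actual FHIR shapes; the tiny vocabulary and shape below are invented.

```python
# Minimal "shape"-style cardinality check over an RDF graph with rdflib.
# The ex: vocabulary and the shape definition are hypothetical examples.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/fhir/")

data = """
@prefix ex: <http://example.org/fhir/> .
ex:pat1 ex:resourceType "Patient" ;
        ex:birthDate "1970-01-01" .
"""

g = Graph()
g.parse(data=data, format="turtle")

# A "shape": predicate -> (min occurrences, max occurrences)
patient_shape = {EX.resourceType: (1, 1), EX.birthDate: (0, 1)}

def conforms(graph, node, shape):
    """True if node satisfies every cardinality constraint in the shape."""
    for pred, (lo, hi) in shape.items():
        n = len(list(graph.objects(node, pred)))
        if not lo <= n <= hi:
            return False
    return True

print(conforms(g, EX.pat1, patient_shape))   # True
```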

  15. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.

  16. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
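
The uncertainty-propagation step described here, sampling dispersed model properties and checking a derived modal quantity against a requirement band, can be sketched for a single lumped spring-mass mode as follows; the stiffness, mass, dispersions, and requirement band are assumed values, not SLS data.

```python
# Toy Monte Carlo propagation of property dispersions into a natural
# frequency, compared against a hypothetical requirement band.
import math
import random

def natural_freq_hz(k, m):
    """First natural frequency of a lumped spring-mass system."""
    return math.sqrt(k / m) / (2.0 * math.pi)

random.seed(1)
samples = [natural_freq_hz(random.gauss(1.0e6, 5.0e4),   # stiffness, N/m
                           random.gauss(100.0, 5.0))     # mass, kg
           for _ in range(10_000)]

req_lo, req_hi = 14.0, 18.0   # assumed requirement band, Hz
p_ok = sum(req_lo <= f <= req_hi for f in samples) / len(samples)
print(f"P(frequency within requirement band) = {p_ok:.3f}")
```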

  17. Radiative transfer model validations during the First ISLSCP Field Experiment

    NASA Technical Reports Server (NTRS)

    Frouin, Robert; Breon, Francois-Marie; Gautier, Catherine

    1990-01-01

    Two simple radiative transfer models, the 5S model based on Tanre et al. (1985, 1986) and the wide-band model of Morcrette (1984), are validated by comparing their outputs with concomitant radiosonde, aerosol turbidity, and radiation measurements and sky photographs obtained during the First ISLSCP Field Experiment. Results showed that the 5S model overestimated the short-wave irradiance by 13.2 W/sq m, whereas the Morcrette model underestimated the long-wave irradiance by 7.4 W/sq m.

  18. Heat Transfer Modeling and Validation for Optically Thick Alumina Fibrous Insulation

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran

    2009-01-01

    Combined radiation/conduction heat transfer through unbonded alumina fibrous insulation was modeled using the diffusion approximation for modeling the radiation component of heat transfer in the optically thick insulation. The validity of the heat transfer model was investigated by comparison to previously reported experimental effective thermal conductivity data over the insulation density range of 24 to 96 kg/cu m, with a pressure range of 0.001 to 750 torr (0.1 to 101.3×10³ Pa), and test sample hot side temperature range of 530 to 1360 K. The model was further validated by comparison to thermal conductivity measurements using the transient step heating technique on an insulation sample at a density of 144 kg/cu m over a pressure range of 0.001 to 760 torr, and temperature range of 290 to 1090 K.
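
Under the diffusion approximation named above, radiation through an optically thick medium behaves as an extra, strongly temperature-dependent conductivity added to solid/gas conduction. A minimal sketch, assuming the standard Rosseland form and placeholder property values rather than the paper's measured data:

```python
# Effective conductivity under the Rosseland diffusion approximation:
# k_eff = k_cond + 16 n^2 sigma T^3 / (3 beta_R). Property values are
# placeholders, not the alumina insulation data from the paper.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def k_effective(k_conduction, T, beta_R, n=1.0):
    """beta_R: Rosseland mean extinction coefficient, 1/m."""
    k_radiation = 16.0 * n**2 * SIGMA * T**3 / (3.0 * beta_R)
    return k_conduction + k_radiation

for T in (530.0, 900.0, 1360.0):   # K, spanning the reported test range
    print(f"T = {T:6.0f} K  k_eff = {k_effective(0.05, T, 5000.0):.4f} W/(m K)")
```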

  19. Evaluation of the knowledge, attitudes, and practices of the populations of the Benoye, Laoukassy, Moundou, and N'Djaména Sud health districts regarding canine rabies in Chad

    PubMed Central

    Mindekem, Rolande; Lechenne, Monique; Alfaroukh, Idriss Oumar; Moto, Daugla Doumagoum; Zinsstag, Jakob; Ouedraogo, Laurent Tinoaga; Salifou, Sahidou

    2017-01-01

    Introduction: Canine rabies remains a concern in Africa, including in Chad. This study assesses the knowledge, attitudes, and practices of the population, with a view to appropriate management of exposed persons and effective control. Methods: This was a descriptive cross-sectional study carried out in July and September 2015 in four health districts of Chad. Data were collected with a questionnaire administered to households recruited through three-stage random sampling. Results: A total of 2428 people were surveyed; the highest level of education was primary school (54.12%). Mean age was 36 ± 13.50 years. Respondents were farmers (35.17%), traders (18.04%), and homemakers (12.81%). Rabies was described as a disease transmitted from dogs to humans (41.43%), a disorder of the brain (41.27%), or undernourishment (10.26%). The cat was poorly recognized as a reservoir (13.84%) and vector (19.77%), as were scratches as a route of transmission (4.61%) and cat vaccination as a preventive measure (0.49%). First aid at home after a bite consisted of traditional practices (47.69%), wound washing (19.48%), or no action at all (20.43%). Households consulted human health services (78.50%), animal health services (5.35%), and traditional healers (27%). Conclusion: Communication on first aid at home after a bite, awareness of the cat as a reservoir and vector and of scratches as a route of transmission, and promotion of consulting veterinary services after a bite are needed. PMID:28761600

  1. SHERMAN, a shape-based thermophysical model. I. Model description and validation

    NASA Astrophysics Data System (ADS)

    Magri, Christopher; Howell, Ellen S.; Vervack, Ronald J.; Nolan, Michael C.; Fernández, Yanga R.; Marshall, Sean E.; Crowell, Jenna L.

    2018-03-01

    SHERMAN, a new thermophysical modeling package designed for analyzing near-infrared spectra of asteroids and other solid bodies, is presented. The model's features, the methods it uses to solve for surface and subsurface temperatures, and the synthetic data it outputs are described. A set of validation tests demonstrates that SHERMAN produces accurate output in a variety of special cases for which correct results can be derived from theory. These cases include a family of solutions to the heat equation for which thermal inertia can have any value and thermophysical properties can vary with depth and with temperature. An appendix describes a new approximation method for estimating surface temperatures within spherical-section craters, more suitable for modeling infrared beaming at short wavelengths than the standard method.

  2. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    PubMed

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  3. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  4. When is the Anelastic Approximation a Valid Model for Compressible Convection?

    NASA Astrophysics Data System (ADS)

    Alboussiere, T.; Curbelo, J.; Labrosse, S.; Ricard, Y. R.; Dubuffet, F.

    2017-12-01

    Compressible convection is ubiquitous in large natural systems such as planetary atmospheres and stellar and planetary interiors. Its modelling is notoriously more difficult than the case when the Boussinesq approximation applies. One reason for that difficulty was put forward by Ogura and Phillips (1961): the compressible equations generate sound waves with very short time scales which need to be resolved. This is why they introduced an anelastic model, based on an expansion of the solution around an isentropic hydrostatic profile. How accurate is that anelastic model? What are the conditions for its validity? To answer these questions, we have developed a numerical model for the full set of compressible equations and compared its solutions with those of the corresponding anelastic model. We considered a simple rectangular 2D Rayleigh-Bénard configuration and restricted the analysis to infinite Prandtl numbers. This choice is valid for convection in the mantles of rocky planets, but more importantly it leads to a zero Mach number, which removes the question of acoustic waves interfering with convection. In that simplified context, we used the entropy balances (that of the full set of equations and that of the anelastic model) to investigate the differences between exact and anelastic solutions. We found that the validity of the anelastic model is dictated by two conditions: first, the superadiabatic temperature difference must be small compared with the adiabatic temperature difference (as expected), ε = ΔT_SA / ΔT_a << 1; and second, the product of ε with the Nusselt number must also be small.
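
Restated compactly (notation reconstructed from the abstract, with Nu the Nusselt number), the two validity conditions read:

```latex
\epsilon = \frac{\Delta T_{\mathrm{SA}}}{\Delta T_{a}} \ll 1,
\qquad
\epsilon \, \mathrm{Nu} \ll 1 .
```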

  5. On the validation of a code and a turbulence model appropriate to circulation control airfoils

    NASA Technical Reports Server (NTRS)

    Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.

    1988-01-01

    A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.

  6. Use of the Ames Check Standard Model for the Validation of Wall Interference Corrections

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Amaya, M.; Flach, R.

    2018-01-01

    The new check standard model of the NASA Ames 11-ft Transonic Wind Tunnel was chosen for a future validation of the facility's wall interference correction system. The chosen validation approach takes advantage of the fact that test conditions experienced by a large model in the slotted part of the tunnel's test section will change significantly if a subset of the slots is temporarily sealed. Therefore, the model's aerodynamic coefficients have to be recorded, corrected, and compared for two different test section configurations in order to perform the validation. Test section configurations with highly accurate Mach number and dynamic pressure calibrations were selected for the validation. First, the model is tested with all test section slots in open configuration while keeping the model's center of rotation on the tunnel centerline. In the next step, slots on the test section floor are sealed and the model is moved to a new center of rotation that is 33 inches below the tunnel centerline. Then, the original angle of attack sweeps are repeated. Afterwards, wall interference corrections are applied to both test data sets and response surface models of the resulting aerodynamic coefficients in interference-free flow are generated. Finally, the response surface models are used to predict the aerodynamic coefficients for a family of angles of attack while keeping dynamic pressure, Mach number, and Reynolds number constant. The validation is considered successful if the corrected aerodynamic coefficients obtained from the related response surface model pair show good agreement. Residual differences between the corrected coefficient sets will be analyzed as well because they are an indicator of the overall accuracy of the facility's wall interference correction process.

  7. Optimization of the functional properties of shape memory alloys through the application of thermomechanical treatments

    NASA Astrophysics Data System (ADS)

    Demers, Vincent

    The objective of this project is to determine the rolling conditions and the heat-treatment temperature that maximize the functional properties of the Ti-Ni shape memory alloy. Specimens are characterized by calorimetry, optical microscopy, stress-generation and recoverable-strain measurements, and mechanical testing. For a single cycle, the use of a cold-work level e = 1.5, obtained with an applied tension force F_T = 0.1σ_y and a mineral oil, results in a straight, crack-free sample which, after annealing at 400°C, yields a nanostructured material whose functional properties are twice those of the same material with a polygonized structure. For repeated cycles, the same rolling conditions remain valid, but the optimum deformation level lies between e = 0.75 and 2 and depends in particular on the loading mode, the stabilization level, and the number of cycles to failure required by the application.

  8. Qualitative Validation of the IMM Model for ISS and STS Programs

    NASA Technical Reports Server (NTRS)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  9. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  10. Validation of the Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM).

    PubMed

    Willis, Michael; Johansen, Pierre; Nilsson, Andreas; Asseburg, Christian

    2017-03-01

    The Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM) was developed to address study questions pertaining to the cost-effectiveness of treatment alternatives in the care of patients with type 2 diabetes mellitus (T2DM). Naturally, the usefulness of a model is determined by the accuracy of its predictions. A previous version of ECHO-T2DM was validated against actual trial outcomes and the model predictions were generally accurate. However, there have been recent upgrades to the model, which modify model predictions and necessitate an update of the validation exercises. The objectives of this study were to extend the methods available for evaluating model validity, to conduct a formal model validation of ECHO-T2DM (version 2.3.0) in accordance with the principles espoused by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM), and secondarily to evaluate the relative accuracy of four sets of macrovascular risk equations included in ECHO-T2DM. We followed the ISPOR/SMDM guidelines on model validation, evaluating face validity, verification, cross-validation, and external validation. Model verification involved 297 'stress tests', in which specific model inputs were modified systematically to ascertain correct model implementation. Cross-validation consisted of a comparison between ECHO-T2DM predictions and those of the seminal National Institutes of Health model. In external validation, study characteristics were entered into ECHO-T2DM to replicate the clinical results of 12 studies (including 17 patient populations), and model predictions were compared to observed values using established statistical techniques as well as measures of average prediction error, separately for the four sets of macrovascular risk equations supported in ECHO-T2DM. Sub-group analyses were conducted for dependent vs. independent outcomes and for microvascular vs. macrovascular vs. mortality

  11. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.
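
A commodity-centric DES model of this kind can be sketched in a few lines with a general-purpose simulation library such as SimPy. The toy below is illustrative only, not the validated Arena model; every quantity, rate, and probability in it is a made-up placeholder.

```python
# Toy DES sketch: launch attempts draw a commodity (say, LH2) from a tank;
# a scrub triggers a 48-hour turnaround before the next attempt.
import random
import simpy

def launch_campaign(env, lh2, results):
    while True:
        yield lh2.get(30)            # each attempt consumes 30 units of LH2
        yield env.timeout(24)        # countdown duration, hours
        if random.random() < 0.3:    # scrub probability (assumed)
            results["scrubs"] += 1
            yield env.timeout(48)    # scrub turnaround, hours
        else:
            results["launches"] += 1
            return

random.seed(42)
results = {"launches": 0, "scrubs": 0}
env = simpy.Environment()
lh2 = simpy.Container(env, capacity=120, init=120)   # commodity inventory
env.process(launch_campaign(env, lh2, results))
env.run(until=24 * 14)               # two-week campaign window
print(results)
```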

  12. A new approach to the management of genital warts (condylomata)

    PubMed Central

    Lopaschuk, Catharine C.

    2013-01-01

    Summary. Objective: To review older and newer treatments for genital warts (condylomata) and to determine their appropriate use. Data sources: A literature search was performed in the following databases: MEDLINE, PubMed, EMBASE, the Cochrane Database of Systematic Reviews and Central Register of Controlled Trials, ACP Journal Club, and Trip. Bibliographies of retrieved articles were also reviewed. Clinical studies, qualitative review articles, consensus reports, and clinical practice guidelines were retained. Main message: Symptomatic warts are present in at least 1% of people aged 15 to 49 years, and it is estimated that up to 50% of people are infected with human papillomavirus at some point in their lives. Imiquimod and podophyllotoxin are two newer treatments for external genital warts that are less painful and can be applied by patients at home. In addition, the quadrivalent human papillomavirus vaccine has been shown to be effective in preventing genital warts and cervical cancer. The older treatment methods still have a place in certain situations, such as intravaginal, urethral, anal, or recalcitrant warts, or for pregnant patients. Conclusion: Newer treatments for external genital warts can reduce treatment-related pain and the number of office visits. The other treatment methods remain useful in certain situations.

  13. CMB lensing tomography with the DES Science Verification galaxies

    DOE PAGES

    Giannantonio, T.

    2016-01-07

    We measure the cross-correlation between the galaxy density in the Dark Energy Survey (DES) Science Verification data and the lensing of the cosmic microwave background (CMB) as reconstructed with the Planck satellite and the South Pole Telescope (SPT). When using the DES main galaxy sample over the full redshift range 0.2 < z_phot < 1.2, a cross-correlation signal is detected at 6σ and 4σ with SPT and Planck respectively. We then divide the DES galaxies into five photometric redshift bins, finding significant (>2σ) detections in all bins. Comparing to the fiducial Planck cosmology, we find the redshift evolution of the signal matches expectations, although the amplitude is consistently lower than predicted across redshift bins. We test for possible systematics that could affect our result and find no evidence for significant contamination. Finally, we demonstrate how these measurements can be used to constrain the growth of structure across cosmic time. We find the data are fit by a model in which the amplitude of structure in the z < 1.2 universe is 0.73 ± 0.16 times as large as predicted in the ΛCDM Planck cosmology, a 1.7σ deviation.

  14. Study of the Morphology and Kinematics of Forbidden-Line Emission around T Tauri Stars

    NASA Astrophysics Data System (ADS)

    Lavalley, Claudia

    2000-06-01

    Mass loss plays an essential role from the very first stages of star formation and appears to be intimately linked to the accretion of matter onto the star, probably through magnetic fields that convert accreted kinetic energy into ejection power. Classical T Tauri stars, a few million years old and showing low extinction, offer an excellent setting for studying the inner regions of stellar winds. In this work, I present the first studies of the morphology of the jets associated with the stars DG Tau, CW Tau, and RW Aur at an angular resolution of 0.1'', and of the two-dimensional kinematics of the [O I]λ6300Å, [N II]λ6583Å, and [S II]λλ6716,6731Å line emission in the DG Tau jet. These data were obtained with two completely new observing techniques that became available at the CFH telescope between 1994 and 1998 and are ideally suited to this problem: narrow-band imaging behind adaptive optics (PUEO), which provides very high angular resolution (~0.1''), and integral-field spectro-imaging (TIGRE/OASIS), which gives access to two-dimensional spatial and spectral information at high angular resolution (here ~0.5''-0.75'') and medium spectral resolution (100-170 km/s). The three jets studied, resolved for the first time from 55 AU of the star outward, have a similar width (30-35 AU) out to 100 AU and a morphology dominated by emission knots. The jets of the low-infrared-excess stars CW Tau and RW Aur are very similar to the two other jets from weakly embedded sources observed so far at the same spatial scale. The DG Tau jet, more disturbed than the other two and originating from a source that still has a substantial envelope, is also very similar to the only other jet associated with a still-embedded source…

  15. Addendum to validation of FHWA's Traffic Noise Model (TNM): phase 1

    DOT National Transportation Integrated Search

    2004-07-01

    The Federal Highway Administration (FHWA) is conducting a multiple-phase study to assess the accuracy of, and make recommendations on the use of, FHWA's Traffic Noise Model (TNM). The TNM Validation Study involves highway noise data collection and TNM modeling for the purpose of data com...

  16. Special Injection Molding Processes (Sonderverfahren des Spritzgießens)

    NASA Astrophysics Data System (ADS)

    Michaeli, Walther; Lettowsky, Christoph

    Injection molding is, alongside extrusion, the most important processing method for plastics [1]. The process has developed steadily from its origins at the end of the 19th century to the present day [2]. More recently, the number of complex applications requiring the targeted combination of different functionalities in one molded part has been rising. Standard injection molding can satisfy these requirements less and less, so the special injection molding processes are gaining importance [3]; there are now more than 100 of them. The user's task is to select from the multitude of possible processes one that meets the requirements and represents the optimal solution from both a technical and an economic point of view. This presupposes constant engagement with development trends in injection molding technology. The following section therefore gives an overview of the most important special injection molding processes.

  17. Systematic review of the effects of family meal frequency on psychosocial outcomes in young people

    PubMed Central

    Harrison, Megan E.; Norris, Mark L.; Obeid, Nicole; Fu, Maeghan; Weinstangel, Hannah; Sampson, Margaret

    2015-01-01

    Summary. Objective: To conduct a systematic review of the effects of frequent family meals on psychosocial outcomes in children and adolescents, and to examine whether outcomes differ by sex. Data sources: Studies were identified by searching MEDLINE (1948 to the last week of June 2011) and PsycINFO (1806 to the first week of July 2011) through the Ovid interface. The MeSH terms and keywords used, alone or in combination, were: family, meal, food intake, nutrition, diets, body weight, adolescent attitudes, eating behaviour, feeding behaviour, and eating disorders. Bibliographies of relevant articles were also reviewed. Study selection: The initial search yielded 1783 articles. To be included in the analysis, studies had to meet the following criteria: published in English in a peer-reviewed journal; involving children or adolescents; addressing the influence of family meals on psychosocial outcomes (e.g., substance use, eating disorders, depression) in children or adolescents; and having an appropriate study design, including acceptable statistical methods for the analysis of outcomes. Fourteen articles met the inclusion criteria. Two independent reviewers examined and analyzed the articles. Synthesis: Overall, the results indicate that the frequency of family meals is inversely related to eating disorders, alcohol and drug use, violent behaviour, and feelings of depression or suicidal thoughts in adolescents, and positively related to good self-esteem and academic success. …

  18. Modelling and validation of electromechanical shock absorbers

    NASA Astrophysics Data System (ADS)

    Tonoli, Andrea; Amati, Nicola; Girardello Detoni, Joaquim; Galluzzi, Renato; Gasparin, Enrico

    2013-08-01

    Electromechanical vehicle suspension systems represent a promising substitute to conventional hydraulic solutions. However, the design of electromechanical devices that are able to supply high damping forces without exceeding geometric dimension and mass constraints is a difficult task. All these challenges meet in off-road vehicle suspension systems, where the power density of the dampers is a crucial parameter. In this context, the present paper outlines a particular shock absorber configuration where a suitable electric machine and a transmission mechanism are utilised to meet off-road vehicle requirements. A dynamic model is used to represent the device. Subsequently, experimental tests are performed on an actual prototype to verify the functionality of the damper and validate the proposed model.

  19. Validation of the filament winding process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at a ±45° angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.
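
The temperature-prediction half of such a comparison reduces, in its simplest form, to transient conduction across the cylinder wall. A bare-bones explicit finite-difference sketch follows, with assumed material properties and oven temperature; this is not the WIND model or the test conditions.

```python
# 1-D explicit finite-difference conduction through a 0.285 in. thick wall,
# both faces held at an assumed oven temperature. Properties are placeholders.
alpha = 1.0e-7                # thermal diffusivity, m^2/s (assumed)
L, n = 0.00724, 30            # wall thickness (m) and node count
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha      # satisfies the explicit stability limit
T = [20.0] * n                # initial temperature, deg C
t, t_end = 0.0, 3600.0        # simulate one hour of cure
while t < t_end:
    T[0] = T[-1] = 120.0      # oven-side boundary temperature (assumed)
    T = [T[i] + alpha * dt / dx**2 * (T[i+1] - 2.0*T[i] + T[i-1])
         if 0 < i < n - 1 else T[i] for i in range(n)]
    t += dt
print(f"mid-wall temperature after {t_end/60:.0f} min: {T[n//2]:.1f} C")
```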

  1. Functional outcomes of flexor tendon injuries of the hand: a report of 90 cases

    PubMed Central

    Boussakri, Hassan; Azarkane, Mohamad; Elidrissi, Mohamad; Shimi, Mohamad; Elibrahimi, Abdelhalim; Elmrini, Abdelmajid

    2013-01-01

    The authors report a series of 90 patients presenting with flexor tendon lacerations of the hand, followed up for a mean of 8 months (range: 2 to 13 months). The injury was located in zone I in 12% of cases, zone II in 46%, zone III in 2%, and zones IV and V in 25%. For the thumb (13 patients), there were 10 cases in zone T2 and 3 cases in zone T3. The operative technique was a modified Kessler core suture combined with a running epitendinous suture. We obtained 54% very good results, 34% fair results, and 12% poor results. For the thumb, the results appear less good, with a 48% rate of poor results. Admittedly, the figures of this series are not as good as those of other series published in the literature. The factors influencing the results are, first, the systematic use of postoperative immobilization, as well as the mechanism of injury and location in zone II or in the thumb. Mechanical complications comprised 7% ruptures, all at the thumb, and 31% tendon adhesions (30 cases, of which 19 in zone II), along with infections (22%) and 4% cases of algodystrophy (complex regional pain syndrome). PMID:23847698

  2. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real-world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted, with multi-agency support and participation, in chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  3. Validation of the 'full reconnection model' of the sawtooth instability in KSTAR

    DOE PAGES

    Nam, Y. B.; Ko, J. S.; Choe, G. H.; ...

    2018-03-26

    In this paper, the central safety factor (q0) during the sawtooth oscillation has been measured with great accuracy with the motional Stark effect (MSE) system on KSTAR; the measured value was … However, this measurement alone cannot validate the disputed full and partial reconnection models definitively due to a non-trivial offset error (~0.05). A supplemental experiment on the excited m = 2 and m = 3 modes, which are extremely sensitive to the background q0 and the core magnetic shear, definitively validates the 'full reconnection model'. The radial position of the excited modes right after the crash, and the time evolution into the 1/1 kink mode before the crash in a sawtoothing plasma, suggest that … in the MHD quiescent period after the crash and … before the crash. Finally, additional measurement of the long-lived m = 3 and m = 5 modes in a non-sawtoothing discharge (presumably …) further validates the 'full reconnection model'.

  4. The tissue microarray data exchange specification: A document type definition to validate and enhance XML data

    PubMed Central

    Nohle, David G; Ayers, Leona W

    2005-01-01

    Background The Association for Pathology Informatics (API) Extensible Mark-up Language (XML) TMA Data Exchange Specification (TMA DES) proposed in April 2003 provides a community-based, open source tool for sharing tissue microarray (TMA) data in a common format. Each tissue core within an array has separate data including digital images; therefore an organized, common approach to produce, navigate and publish such data facilitates viewing, sharing and merging TMA data from different laboratories. The AIDS and Cancer Specimen Resource (ACSR) is an HIV/AIDS tissue bank consortium sponsored by the National Cancer Institute (NCI) Division of Cancer Treatment and Diagnosis (DCTD). The ACSR offers HIV-related malignancies and uninfected control tissues in microarrays (TMA) accompanied by de-identified clinical data to approved researchers. Exporting our TMA data into the proposed API specified format offers an opportunity to evaluate the API specification in an applied setting and to explore its usefulness. Results A document type definition (DTD) that governs the allowed common data elements (CDE) in TMA DES export XML files was written, tested and evolved and is in routine use by the ACSR. This DTD defines TMA DES CDEs which are implemented in an external file that can be supplemented by internal DTD extensions for locally defined TMA data elements (LDE). Conclusion ACSR implementation of the TMA DES demonstrated the utility of the specification and allowed application of a DTD to validate the language of the API specified XML elements and to identify possible enhancements within our TMA data management application. Improvements to the specification have additionally been suggested by our experience in importing other institutions' exported TMA data. Enhancements to TMA DES to remove ambiguous situations and clarify the data should be considered. Better specified identifiers and hierarchical relationships will make automatic use of the data possible. Our tool can be
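
The mechanics of DTD-driven validation described here can be sketched with lxml. The mini-DTD and records below are invented stand-ins, not the actual TMA DES common data elements.

```python
# Validate XML against a DTD with lxml; the schema here is a toy example.
from io import StringIO
from lxml import etree

dtd = etree.DTD(StringIO("""
<!ELEMENT tma_core (histology, image*)>
<!ELEMENT histology (#PCDATA)>
<!ELEMENT image EMPTY>
<!ATTLIST image href CDATA #REQUIRED>
"""))

good = etree.XML("<tma_core><histology>lymphoma</histology>"
                 "<image href='core1.jpg'/></tma_core>")
bad = etree.XML("<tma_core><image href='core1.jpg'/></tma_core>")

print(dtd.validate(good))   # True
print(dtd.validate(bad))    # False: required <histology> element is missing
print(dtd.error_log)        # human-readable explanation of the failure
```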

  5. Recent numerical developments in aeroelasticity at Dassault Aviation for the design of modern combat aircraft and business jets

    DTIC Science & Technology

    2003-03-01

    E. Garrigues, Th. Percheron, DASSAULT AVIATION, DGT/DTA/IAP, F-92214 Saint-Cloud Cedex, France. 1. Introduction ... flight data, rigid-body accelerations, and structural responses (strain gauges and accelerometers). (Figure: structural-model predictions adjusted against flight-test data.)

  6. Propeller aircraft interior noise model utilization study and validation

    NASA Technical Reports Server (NTRS)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  7. Discovery of the Lensed Quasar System DES J0408-5354

    DOE PAGES

    Lin, H.; Buckley-Geer, E.; Agnello, A.; ...

    2017-03-27

    We report the discovery and spectroscopic confirmation of the quad-like lensed quasar system DES J0408-5354 found in the Dark Energy Survey (DES) Year 1 (Y1) data. This system was discovered during a search for DES Y1 strong lensing systems using a method that identified candidates as red galaxies with multiple blue neighbors. DES J0408-5354 consists of a central red galaxy surrounded by three bright (i < 20) blue objects and a fourth red object. Subsequent spectroscopic observations using the Gemini South telescope confirmed that the three blue objects are indeed the lensed images of a quasar with redshift z = 2.375, and that the central red object is an early-type lensing galaxy with redshift z = 0.597. DES J0408-5354 is the first quad lensed quasar system to be found in DES and begins to demonstrate the potential of DES to discover and dramatically increase the sample size of these very rare objects.

  8. Validation of PV-RPM Code in the System Advisor Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
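
What a failure/repair distribution pair does in a simulation of this kind can be seen in a toy sketch like the following; the distribution choices and parameters are invented, not the SNL plant data.

```python
# Toy component-reliability simulation: exponential times to failure,
# uniform repair durations, counted over a fixed analysis period.
import random

random.seed(0)
YEARS = 25.0                  # analysis period (assumed)
MTBF_Y = 10.0                 # mean time between failures, years (assumed)
REPAIR_D = (5.0, 30.0)        # min/max repair time, days (assumed)

def failures_one_system():
    t, n = 0.0, 0
    while True:
        t += random.expovariate(1.0 / MTBF_Y)    # draw next time to failure
        if t > YEARS:
            return n
        n += 1
        t += random.uniform(*REPAIR_D) / 365.0   # downtime while under repair

runs = 10_000
mean_failures = sum(failures_one_system() for _ in range(runs)) / runs
print(f"expected failures per system over {YEARS:.0f} years: {mean_failures:.2f}")
```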

  10. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data are used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data are decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for denoising before being passed to linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  11. Spatial calibration and temporal validation of flow for regional scale hydrologic modeling

    USDA-ARS?s Scientific Manuscript database

    Physically based regional scale hydrologic modeling is gaining importance for planning and management of water resources. Calibration and validation of such a regional scale model are necessary before applying it for scenario assessment. However, in most regional scale hydrologic modeling, flow validat...

  12. Methods for Geometric Data Validation of 3D City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107: a valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges, no gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closedness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
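
    A minimal sketch of two of the polygon-level checks described above (closedness of the bounding linear ring and planarity), assuming polygons are given as lists of 3D vertices; the tolerance values and the best-fit-plane test are illustrative choices, not the CityDoctor implementation.

        import numpy as np

        def ring_is_closed(vertices, tol=1e-6):
            """The bounding linear ring must close: first vertex equals last vertex."""
            v = np.asarray(vertices, dtype=float)
            return np.linalg.norm(v[0] - v[-1]) <= tol

        def polygon_is_planar(vertices, tol=1e-3):
            """Planarity: maximum distance of any vertex from the best-fit plane."""
            v = np.asarray(vertices, dtype=float)
            centroid = v.mean(axis=0)
            # The plane normal is the singular vector of the smallest singular value.
            _, _, vt = np.linalg.svd(v - centroid)
            normal = vt[-1]
            return np.abs((v - centroid) @ normal).max() <= tol

        # Example: a unit square in the z = 0 plane, explicitly closed.
        square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
        print(ring_is_closed(square), polygon_is_planar(square))  # True True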

  13. Validity test and its consistency in the construction of patient loyalty model

    NASA Astrophysics Data System (ADS)

    Yanuar, Ferra

    2016-04-01

    The main objective of this study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data on the construction of a patient loyalty model. In the hypothesized model, service quality, patient satisfaction, and patient loyalty were determined simultaneously, each factor measured by several indicator variables. The respondents in this study were patients who had received healthcare at Puskesmas in Padang, West Sumatera. All 394 respondents with complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction, and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant in measuring their corresponding latent variable. Service quality was measured best by tangibles, patient satisfaction by satisfaction with service, and patient loyalty by good service quality. In the structural equations, the study found that patient loyalty was affected by patient satisfaction positively and directly, while service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. The study further showed that the validity values obtained were consistent, based on a simulation study using a bootstrap approach.
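
    A minimal illustration of the bootstrap consistency check mentioned above, applied to a generic validity statistic (here simply a Pearson correlation between an indicator and a composite score); the synthetic data and the statistic are placeholders, not the study's structural equation model estimates.

        import numpy as np

        rng = np.random.default_rng(0)

        def validity_stat(indicator, composite):
            """Placeholder validity measure: Pearson correlation."""
            return np.corrcoef(indicator, composite)[0, 1]

        # Synthetic stand-in data for one indicator and its latent composite.
        n = 394
        composite = rng.normal(size=n)
        indicator = 0.8 * composite + rng.normal(scale=0.6, size=n)

        estimate = validity_stat(indicator, composite)
        boot = [
            validity_stat(indicator[idx], composite[idx])
            for idx in (rng.integers(0, n, size=n) for _ in range(2000))
        ]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"estimate={estimate:.3f}, 95% bootstrap CI=({lo:.3f}, {hi:.3f})")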

  14. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
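
    For reference, the identity underlying the replica trick discussed above, in its standard textbook form (not quoted from the paper):

        $$ \langle \ln Z \rangle = \lim_{n \to 0} \frac{\langle Z^n \rangle - 1}{n} = \lim_{n \to 0} \frac{\partial}{\partial n} \ln \langle Z^n \rangle $$

    Here Z is the partition function and the angle brackets denote the average over quenched disorder; the replica analytic continuation is the step that extends ⟨Z^n⟩, computed for integer n, to real n in the neighborhood of zero.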

  15. Development and Validation of a Disease Severity Scoring Model for Pediatric Sepsis.

    PubMed

    Hu, Li; Zhu, Yimin; Chen, Mengshi; Li, Xun; Lu, Xiulan; Liang, Ying; Tan, Hongzhuan

    2016-07-01

    Multiple severity scoring systems have been devised and evaluated in adult sepsis, but a simplified scoring model for pediatric sepsis has not yet been developed. This study aimed to develop and validate a new scoring model to stratify the severity of pediatric sepsis, thus assisting the treatment of sepsis in children. Data from 634 consecutive patients who presented with sepsis at the Children's Hospital of Hunan Province in China in 2011-2013 were analyzed, with 476 patients placed in the training group and 158 patients in the validation group. Stepwise discriminant analysis was used to develop the accurate discriminant model. A simplified scoring model was generated using weightings defined by the discriminant coefficients. The discriminant ability of the model was tested by receiver operating characteristic (ROC) curves. The discriminant analysis showed that prothrombin time, D-dimer, total bilirubin, serum total protein, uric acid, PaO2/FiO2 ratio, and myoglobin were associated with severity of sepsis. These seven variables were assigned values of 4, 3, 3, 4, 3, 3, and 3, respectively, based on the standardized discriminant coefficients. Patients with higher scores had a higher risk of severe sepsis. The areas under the ROC curve (AROC) were 0.836 for the accurate discriminant model and 0.825 for the simplified scoring model in the validation group. The proposed disease severity scoring model for pediatric sepsis showed adequate discriminatory capacity and sufficient accuracy, which has important clinical significance in evaluating the severity of pediatric sepsis and predicting its progress.
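
    A toy sketch of how such a weighted scoring model is applied and evaluated. The weights are those quoted in the abstract; the 0/1 abnormality flags and the AROC demonstration data are invented for illustration (the paper derives the weights from standardized discriminant coefficients, and the abstract does not give cut-offs).

        import numpy as np

        # Weights from the abstract, one per variable (order as listed):
        # prothrombin time, D-dimer, total bilirubin, serum total protein,
        # uric acid, PaO2/FiO2 ratio, myoglobin.
        WEIGHTS = np.array([4, 3, 3, 4, 3, 3, 3])

        def severity_score(abnormal_flags):
            """Sum the weights of the variables flagged abnormal (0/1 each)."""
            return int(WEIGHTS @ np.asarray(abnormal_flags))

        def aroc(scores, labels):
            """Area under the ROC curve via the rank-sum identity."""
            scores, labels = np.asarray(scores, float), np.asarray(labels, int)
            pos, neg = scores[labels == 1], scores[labels == 0]
            greater = (pos[:, None] > neg[None, :]).sum()
            ties = (pos[:, None] == neg[None, :]).sum()  # ties count one half
            return (greater + 0.5 * ties) / (len(pos) * len(neg))

        print(severity_score([1, 1, 0, 1, 0, 1, 0]))  # 14
        print(aroc([14, 9, 3, 6], [1, 1, 0, 0]))      # 1.0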

  16. Crazy like a fox. Validity and ethics of animal models of human psychiatric disease.

    PubMed

    Rollin, Michael D H; Rollin, Bernard E

    2014-04-01

    Animal models of human disease play a central role in modern biomedical science. Developing animal models for human mental illness presents unique practical and philosophical challenges. In this article we argue that (1) existing animal models of psychiatric disease are not valid, (2) attempts to model syndromes are undermined by current nosology, (3) models of symptoms are rife with circular logic and anthropomorphism, (4) any model must make unjustified assumptions about subjective experience, and (5) any model deemed valid would be inherently unethical, for if an animal adequately models human subjective experience, then there is no morally relevant difference between that animal and a human.

  17. Explore: keys to better understanding matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellis, Jonathan R.

    2011-02-14

    Will the LHC upset theories of the infinitely small? Physicists would like the accelerator to shake the Standard Model. This theory of elementary particles and forces leaves many gray areas. The LHC and its experiments have been designed to illuminate them.

  18. Analysis of the possibilities of flux-weakening operation of permanent-magnet motors

    NASA Astrophysics Data System (ADS)

    Multon, Bernard; Lucidarme, Jean; Prévond, Laurent

    1995-05-01

    In this paper, we study the extension of the speed range of motors (or generators) with permanent-magnet excitation supplied by an electronic converter. The amplitudes of the phase voltage and current waveforms are limited by the electronic supply. The aim of this study is to achieve, over an extended speed range, a maximum power close to that at base speed. This requires a reduction of the airgap flux, so-called "flux weakening," above base speed. A parametric analysis of the motor's electromagnetic characteristics is made to show the influence of armature reaction and magnetic saliency on the speed range. We show that there exists an ideal condition for obtaining a theoretically unlimited constant-power speed range. Magnetic saliency also enhances the power factor, especially when L_d > L_q. As main hypotheses, we consider no saturation, sinusoidal e.m.f. and current waveforms, and a sinusoidal airgap flux density. Finally, we recapitulate the permanent-magnet rotor structures able to provide flux-weakening operation.

  19. Massive star formation in spiral galaxies

    NASA Astrophysics Data System (ADS)

    Lelievre, Mario

    The goal of this thesis is to describe the formation of massive stars in spiral galaxies of various morphological types. Deep Hα imaging combined with a robust method for identifying HII regions allowed the detection and measurement of the properties (position, size, luminosity, star formation rate) of numerous HII regions located in the inner disk (R < R25) of ten galaxies, but also at their periphery (R ≥ R25). In general, the distribution of HII regions shows no evidence of morphological structure at R < R25 (spiral arms, ring, bar) unless the analysis is limited to the largest or most luminous HII regions. The distribution of HII regions, as well as their size and luminosity, are however subject to strong selection effects that depend on the distance of the galaxies and must be corrected by bringing the sample to a common spatial resolution. The luminosity functions show that the brightest HII regions tend to form in the inner portion of the disk. Moreover, analysis of the slopes reveals a strong linear correlation with morphological type. No peak is observed in the luminosity functions at log L ≈ 37 that would reveal the transition between ionization-bounded and density-bounded HII regions. A cubic relation is obtained between the size and luminosity of HII regions, although this relation varies significantly between the inner disk and the periphery of a given galaxy. The density and dynamics of the gas and stars could significantly influence the stability of molecular clouds against gravitational collapse. On the one hand, the extent of the disk of HII regions for five galaxies of the sample coincides with that of the atomic hydrogen. On the other hand, by analyzing the stability of the galactic disks, we conclude…

  1. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
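
    A minimal sketch of the "internal-external cross-validation" idea described above: develop the model on all but one study, test it on the excluded study, and rotate. The data layout and the pooled logistic regression are illustrative assumptions; the separate per-study intercepts recommended in the conclusions would enter as study indicator covariates.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)

        # Synthetic IPD: five studies with different baseline risks (intercepts).
        studies = []
        for base in [-1.5, -1.0, -0.5, 0.0, 0.5]:
            x = rng.normal(size=(200, 2))                    # two predictors
            p = 1 / (1 + np.exp(-(base + x @ np.array([0.8, -0.5]))))
            studies.append((x, rng.binomial(1, p)))

        for held_out in range(len(studies)):
            train = [d for i, d in enumerate(studies) if i != held_out]
            X = np.vstack([x for x, _ in train])
            y = np.concatenate([yy for _, yy in train])
            model = LogisticRegression().fit(X, y)
            X_t, y_t = studies[held_out]
            print(f"held-out study {held_out}: accuracy={model.score(X_t, y_t):.3f}")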

  2. Breast cancer in Morocco: phenotypic profile of tumors

    PubMed Central

    Khalil, Ahmadaye Ibrahim; Bendahhou, Karima; Mestaghanmi, Houriya; Saile, Rachid; Benider, Abdellatif

    2016-01-01

    Breast cancer is the most common cancer in women and is among the leading causes of cancer-related mortality. The curability of this tumor type is increasing thanks to screening programs and therapeutic advances, which have certainly improved patient survival. But challenges remain, related to the phenotypic instability of cancer cells. The objective of this work is to study the phenotypic profile of breast cancer in patients treated at the Centre Mohammed VI pour le traitement des Cancers during the years 2013-2014. This is a cross-sectional study over two years, including the breast cancer cases managed at the Centre. Data were collected from patient records and analyzed with the Epi Info software. 1277 patients were treated at our centre; 99.5% of cases were female, and the mean age was 50.20 ± 11.34 years. The most frequent histological type was infiltrating ductal carcinoma (80.7% of cases). The diagnostic stage was early in 56.9% of cases. The most frequent molecular phenotype was luminal A (41.4% of cases); luminal B, HER2, and triple-negative phenotypes were found in 10.4%, 6.3%, and 11.2% of cases, respectively. Studying the tumor phenotype of breast cancer patients helps guide the clinician in the choice of treatment, and decision-makers in planning programs to fight this disease. PMID:28292037

  3. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.
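
    For context, a standard form of the two-dimensional nonlinear shallow water equations that inundation codes of this class solve (TUNA-RP's exact formulation, source terms, and friction law may differ):

        $$ \frac{\partial \eta}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0 $$

        $$ \frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x} + v \frac{\partial u}{\partial y} + g \frac{\partial \eta}{\partial x} + \frac{g n^2 u \sqrt{u^2 + v^2}}{h^{4/3}} = 0 $$

    with an analogous momentum equation for v, where η is the free-surface elevation, h the total water depth, (u, v) the depth-averaged velocities, g the gravitational acceleration, and n Manning's roughness coefficient; the wet-dry moving boundary algorithm decides which cells carry water at each time step.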

  4. Accreditation or Validation of Prior Experiential Learning: Knowledge and "Savoirs" in France-A Different Perspective?

    ERIC Educational Resources Information Center

    Pouget, Mireille; Osborne, Michael

    2004-01-01

    This article stems from the study of the process and application of Accreditation of Prior Experiential Learning (APEL) in the French higher education system, in France referred to as VAP (Validation des Acquis Professionnels). The paper seeks to review not only the context in which the concepts underpinning VAP in France have developed, but also…

  5. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself, that is, the characterization of the molecular architecture and the choice of interaction potentials and thus parameters which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and
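
    A generic statement of the Bayesian calibration step described above, with a Gaussian likelihood for the parameter-to-observation error (a standard template consistent with the abstract, not the paper's exact equations; the maximum-entropy priors and specific observables are not reproduced here):

        $$ \pi(\theta \mid d) \propto \pi(d \mid \theta)\,\pi(\theta), \qquad \pi(d \mid \theta) \propto \exp\left( -\tfrac{1}{2}\,(d - m(\theta))^{\mathsf{T}} \Sigma^{-1} (d - m(\theta)) \right) $$

    where θ collects the coarse-grained model parameters, d is the data supplied by the all-atom ("ground truth") model, m(θ) is the coarse-grained prediction, and Σ is the covariance of the observation error; plausibilities for selecting among candidate coarse-grained models then follow from the evidence of each model.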

  6. Conductivity in the two-dimensional Hubbard model at weak coupling

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic

    The two-dimensional (2D) Hubbard model is often considered the minimal model for the copper-oxide-based high-critical-temperature superconductors (cuprates). On a square lattice, this model exhibits the phases common to all cuprates: the antiferromagnetic phase, the superconducting phase, and the so-called pseudogap phase. It has no exact solution; however, several approximate methods allow its properties to be studied numerically. Optical and transport properties are well known in the cuprates and are therefore good candidates for validating a theoretical model and helping to better understand the physics of these materials. This thesis concerns the calculation of these properties for the 2D Hubbard model at weak to intermediate coupling. The calculation method used is the two-particle self-consistent (TPSC) approach, which is non-perturbative and includes the effect of spin and charge fluctuations at all wavelengths. The complete derivation of the expression for the conductivity in the TPSC approach is presented. This expression contains what are called vertex corrections, which account for correlations between quasiparticles. To make the numerical calculation of these corrections possible, algorithms using, among other things, fast Fourier transforms and cubic splines are developed. The calculations are done for the square lattice with nearest-neighbor hopping around the antiferromagnetic critical point. At dopings below the critical point, the optical conductivity shows a mid-infrared bump at low temperature, as observed in several cuprates. In the resistivity as a function of temperature, insulating behavior is found in the pseudogap when vertex corrections are neglected, and metallic behavior when they are taken into account. Near the critical point, the resistivity is linear in T at low temperature and becomes…

  7. Approaches to Validation of Models for Low Gravity Fluid Behavior

    NASA Technical Reports Server (NTRS)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data is described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed only through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

  8. High-Lift System Aerodynamics (L’Aerodynamique des Systems Hypersustentateurs)

    DTIC Science & Technology

    1993-09-01

    [Abstract not recoverable: only garbled OCR fragments of the French text survive, touching on incompressible flow over multi-element airfoils, physically originated discretization-scale effects of the same order as the boundary-layer thickness, and the consequences of stealth requirements for aircraft configuration design.]

  9. Validation workflow for a clinical Bayesian network model in multidisciplinary decision making in head and neck oncology treatment.

    PubMed

    Cypko, Mario A; Stoehr, Matthaeus; Kozniewski, Marcin; Druzdzel, Marek J; Dietz, Andreas; Berliner, Leonard; Lemke, Heinz U

    2017-11-01

    Oncological treatment is increasingly complex, and decision making in multidisciplinary teams is therefore becoming the key activity in clinical pathways. The increased complexity is related to the number and variability of possible treatment decisions that may be relevant to a patient. In this paper, we describe validation of a multidisciplinary cancer treatment decision model in the clinical domain of head and neck oncology. Probabilistic graphical models and corresponding inference algorithms, in the form of Bayesian networks (BNs), can support complex decision-making processes by providing mathematically reproducible and transparent advice. The quality of BN-based advice depends on the quality of the model; therefore, it is vital to validate the model before it is applied in practice. For an example BN subnetwork of laryngeal cancer with 303 variables, we evaluated 66 patient records. To validate the model on this dataset, a validation workflow was applied in combination with quantitative and qualitative analyses. In the subsequent analyses, we observed four sources of imprecise predictions: incorrect data, incomplete patient data, outvoting of relevant observations, and an incorrect model. These four problems were then solved by modifying the data and the model. The validation effort required is related to model complexity: for simpler models, the validation workflow is the same, although it may require fewer validation methods. Validation success depends on the model having a well-founded knowledge base. The remaining laryngeal cancer model may disclose additional sources of imprecise predictions.

  10. Controlling the informal market in the era of globalized trade: the case of antiretrovirals in Chile

    PubMed Central

    Brousselle, Astrid; Morales, Cristián

    2013-01-01

    New HIV/AIDS medications have created treatment-access needs that governments have not always managed to meet. The result has been the emergence of an informal market for ARVs. Through an analysis of the situation in Chile, we discuss the various supply channels, the consequences of the existence of such a market, and the possible means of reducing its undesirable effects. Both microeconomic and macroeconomic aspects of the market and of access to medications are addressed. PMID:23997580

  11. Validation of sea ice models using an uncertainty-based distance metric for multiple model variables

    DOE PAGES

    Urrego-Blanco, Jorge R.; Hunke, Elizabeth C.; Urban, Nathan M.; ...

    2017-04-01

    Here, we implement a variance-based distance metric (D_n) to objectively assess the skill of sea ice models when multiple output variables or uncertainties in both model predictions and observations need to be considered. The metric compares observation and model data pairs on common spatial and temporal grids, improving upon highly aggregated metrics (e.g., total sea ice extent or volume) by capturing the spatial character of model skill. The D_n metric is a gamma-distributed statistic that is more general than the χ2 statistic commonly used to assess model fit, which requires the assumption that the model is unbiased and can only incorporate observational error in the analysis. The D_n statistic does not assume that the model is unbiased, and allows the incorporation of multiple observational data sets for the same variable, and simultaneously for different variables, along with different types of variances that can characterize uncertainties in both observations and the model. This approach represents a step toward establishing a systematic framework for probabilistic validation of sea ice models. The methodology is also useful for model tuning, by using the D_n metric as a cost function and incorporating model parametric uncertainty as part of a scheme to optimize model functionality. We apply this approach to evaluate different configurations of the standalone Los Alamos sea ice model (CICE), encompassing the parametric uncertainty in the model, and to find new sets of model configurations that produce better agreement than previous configurations between model and observational estimates of sea ice concentration and thickness.
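
    A minimal sketch of a variance-weighted model-observation distance in the spirit of D_n; the exact definition in the paper is not reproduced here, so the aggregation below (mean of squared differences normalized by the combined model and observation variances over a common grid) should be read as one plausible illustrative form.

        import numpy as np

        def variance_based_distance(model, obs, var_model, var_obs):
            """Mean squared model-observation misfit, each grid point weighted by the
            combined (model + observation) variance so uncertain points count less."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            w = np.asarray(var_model, float) + np.asarray(var_obs, float)
            return float(np.mean((model - obs) ** 2 / w))

        # Example: sea ice concentration along a toy 1D transect.
        rng = np.random.default_rng(42)
        truth = np.linspace(1.0, 0.0, 50)
        obs = truth + rng.normal(scale=0.05, size=50)          # noisy observations
        model = truth + 0.1 * np.sin(np.linspace(0, 6, 50))    # biased model
        d = variance_based_distance(model, obs, var_model=0.02, var_obs=0.05**2)
        print(f"distance = {d:.2f} (smaller indicates better agreement)")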

  12. Environmental ethics as a transversal dimension of science and technology education: proposal of an educational model

    NASA Astrophysics Data System (ADS)

    Chavez, Milagros

    This thesis presents the trajectory and results of a research project whose overall objective is to develop an educational model integrating environmental ethics as a transversal dimension of science and technology education. Faced with the positivist paradigm still dominant in science teaching, it seemed useful to open a space for reflection and to propose, in the form of a formal model, a pedagogical orientation more in resonance with some of the fundamental concerns of our time: in particular, the relation of humans with their environment and, more specifically, the role of science in shaping such a relation through its contribution to the transformation of living conditions, to the point of compromising natural equilibria. Given this general problem, the objectives of the research are: (1) to define the paradigmatic, theoretical, and axiological elements of the educational model to be built, and (2) to define its strategic components. Theoretical-speculative in character, this research adopted the anasynthesis approach, situated within the critical perspective of educational research. The theoretical framework of the thesis was built around four pivotal concepts: educational model, science and technology education, educational transversality, and environmental ethics. These concepts were clarified from a textual corpus; then, on this basis, theoretical choices were made, from which a prototype of the model was elaborated. This prototype was then subjected to a double validation (by experts and by a trial run), with the aim of improving it and, from there, constructing an optimal model. The latter has two dimensions, theoretical-axiological and strategic. The first rests on a conception of science and technology education as an appropriation of…

  13. Development and validation of a mortality risk model for pediatric sepsis.

    PubMed

    Chen, Mengshi; Lu, Xiulan; Hu, Li; Liu, Pingping; Zhao, Wenjiao; Yan, Haipeng; Tang, Liang; Zhu, Yimin; Xiao, Zhenghui; Chen, Lizhang; Tan, Hongzhuan

    2017-05-01

    Pediatric sepsis is a burdensome public health problem. Assessing the mortality risk of pediatric sepsis patients, offering effective treatment guidance, and improving prognosis to reduce mortality rates are crucial. We extracted data derived from electronic medical records of pediatric sepsis patients that were collected during the first 24 hours after admission to the pediatric intensive care unit (PICU) of the Hunan Children's Hospital from January 2012 to June 2014. A total of 788 children were randomly divided into a training group (592, 75%) and a validation group (196, 25%). The risk factors for mortality among these patients were identified by conducting multivariate logistic regression in the training group. Based on the established logistic regression equation, the logit probabilities for all patients (in both groups) were calculated to verify the model's internal and external validity. According to the training group, 6 variables (brain natriuretic peptide, albumin, total bilirubin, D-dimer, lactate levels, and mechanical ventilation within 24 hours) were included in the final logistic regression model. The areas under the curve of the model were 0.854 (0.826, 0.881) and 0.844 (0.816, 0.873) in the training and validation groups, respectively. The Mortality Risk Model for Pediatric Sepsis established in this study showed acceptable accuracy in predicting the mortality risk of pediatric sepsis patients.
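
    A minimal sketch of the model-building recipe in this abstract: fit a logistic regression on a 75/25 training/validation split and report the area under the ROC curve on both splits. The six synthetic predictors below are placeholders for the study's clinical variables.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Placeholder stand-ins for the six predictors (BNP, albumin, total
        # bilirubin, D-dimer, lactate, mechanical ventilation within 24 h).
        n = 788
        X = rng.normal(size=(n, 6))
        logit = -2.0 + X @ np.array([0.9, -0.7, 0.5, 0.6, 0.8, 1.1])
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)
        model = LogisticRegression().fit(X_tr, y_tr)
        print(f"training AUC:   {roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]):.3f}")
        print(f"validation AUC: {roc_auc_score(y_va, model.predict_proba(X_va)[:, 1]):.3f}")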

  14. Filament winding cylinders. II - Validation of the process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  15. Transverse multimode operation of OPOs: from classical structures to quantum correlations

    NASA Astrophysics Data System (ADS)

    Martinelli, M.; Ducci, S.; Gigan, S.; Treps, N.; Maître, A.; Fabre, C.

    2002-06-01

    We demonstrate the formation of transverse structures on the beams emitted by a type-II optical parametric oscillator (OPO) in a confocal configuration. From a classical point of view, we establish the transverse multimode character of these structures. Through the study of the spatial correlations of the generated beams, we show that these structures are also multimode from a quantum point of view.

  16. Electronic Messaging for the 90s (Les Messageries Electroniques des Annees 90)

    DTIC Science & Technology

    1993-05-01

    [Abstract not recoverable: only garbled OCR fragments of the French text survive, touching on the circulation of information, search algorithms over data, remote-network performance relative to local processing, human/machine cooperation, and cooperation among computer manufacturers, telecommunications equipment manufacturers, and network operators.]

  17. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.

  18. Rational selection of training and test sets for the development of validated QSAR models

    NASA Astrophysics Data System (ADS)

    Golbraikh, Alexander; Shen, Min; Xiao, Zhiyan; Xiao, Yun-De; Lee, Kuo-Hsiung; Tropsha, Alexander

    2003-02-01

    Quantitative Structure-Activity Relationship (QSAR) models are used increasingly to screen chemical databases and/or virtual chemical libraries for potentially bioactive molecules. These developments emphasize the importance of rigorous model validation to ensure that the models have acceptable predictive power. Using the k nearest neighbors (kNN) variable selection QSAR method for the analysis of several datasets, we have demonstrated recently that the widely accepted leave-one-out (LOO) cross-validated R2 (q2) is an inadequate characteristic to assess the predictive ability of the models [Golbraikh, A., Tropsha, A. Beware of q2! J. Mol. Graphics Mod. 20, 269-276, (2002)]. Herein, we provide additional evidence that there exists no correlation between the values of q2 for the training set and accuracy of prediction (R2) for the test set and argue that this observation is a general property of any QSAR model developed with LOO cross-validation. We suggest that external validation using rationally selected training and test sets provides a means to establish a reliable QSAR model. We propose several approaches to the division of experimental datasets into training and test sets and apply them in QSAR studies of 48 functionalized amino acid anticonvulsants and a series of 157 epipodophyllotoxin derivatives with antitumor activity. We formulate a set of general criteria for the evaluation of predictive power of QSAR models.
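
    One common way to divide a dataset rationally rather than randomly, in the spirit of the approaches proposed above, is a maximin (Kennard-Stone-style) selection that spreads the training set across descriptor space. This sketch is a generic illustration, not the authors' specific division algorithms.

        import numpy as np

        def maximin_training_selection(X, n_train):
            """Pick n_train points spread over descriptor space (Kennard-Stone style):
            start from the two most distant points, then repeatedly add the point whose
            minimum distance to the already-selected set is largest."""
            X = np.asarray(X, float)
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            i, j = np.unravel_index(np.argmax(d), d.shape)
            selected = [int(i), int(j)]
            while len(selected) < n_train:
                min_dist = d[:, selected].min(axis=1)
                min_dist[selected] = -1.0      # never re-pick a selected point
                selected.append(int(np.argmax(min_dist)))
            return np.array(selected)

        # Example: 100 compounds with 5 descriptors, 70/30 training/test division.
        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 5))
        train_idx = maximin_training_selection(X, n_train=70)
        test_idx = np.setdiff1d(np.arange(len(X)), train_idx)
        print(len(train_idx), len(test_idx))  # 70 30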

  1. Predictive Model for Particle Residence Time Distributions in Riser Reactors. Part 1: Model Development and Validation

    DOE PAGES

    Foust, Thomas D.; Ziegler, Jack L.; Pannala, Sreekanth; ...

    2017-02-28

    In this computational study, we model the mixing of biomass pyrolysis vapor with solid catalyst in circulating riser reactors, with a focus on the determination of solid catalyst residence time distributions (RTDs). A comprehensive set of 2D and 3D simulations were conducted for a pilot-scale riser using the Eulerian-Eulerian two-fluid modeling framework with and without sub-grid-scale models for the gas-solids interaction. A validation test case was also simulated and compared to experiments, showing agreement in the pressure gradient and in the RTD mean and spread. It was found that, for accurate RTD prediction, the Johnson and Jackson partial-slip solids boundary condition was required for all models, and that a sub-grid model is useful so that ultra-high-resolution grids, which are very computationally intensive, are not required. Finally, we discovered a 2/3 scaling relation for the RTD mean and spread when comparing resolved 2D simulations to validated unresolved 3D sub-grid-scale model simulations.

  2. Validation of the French translation of the SURPS for a population of Quebec adolescents

    PubMed Central

    Castonguay-Jolin, Laura; Perrier-Ménard, Eveline; Castellanos-Ryan, Natalie; Parent, Sophie; Vitaro, Frank; Tremblay, Richard E; Garel, Patricia; Séguin, Jean R; Conrod, Patricia J

    2013-01-01

    Objective: The Substance Use Risk Profile Scale (SURPS) is an instrument for screening the personality characteristics that represent a risk for developing problematic substance use. The SURPS comprises 23 items assessing 4 dimensions and allows mental health practitioners to better target prevention. The SURPS has been validated in English Canada, the United Kingdom, China, and Sri Lanka; the objective of this study was to validate a French translation of the SURPS for French-speaking Quebec adolescents and to test its sensitivity in a clinical population. Method: Two hundred and two 15-year-olds from a community sample completed the SURPS along with measures of personality and substance use. The internal consistency, factor solution, and concurrent validity of the scale were evaluated. Forty adolescents (mean age 15.7 years) with a psychiatric diagnosis also completed the SURPS, and their scores were compared with the community-sample norms. Results: The French translation of the SURPS shows good internal consistency and a 4-factor solution similar to the original version. Its 4 subscales have good concurrent validity, and 3 of them are correlated with measures of psychoactive substance use. Finally, 95% of the participants in the clinical sample were identified as at risk according to the SURPS cut-off scores. Conclusion: The French version of the SURPS appears to be a valid and sensitive measure suitable for use with a French-speaking Quebec adolescent population. PMID:24099502

  3. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  4. Detached-Eddy Simulation Based on the v2-f Model

    NASA Technical Reports Server (NTRS)

    Jee, Sol Keun; Shariff, Karim

    2012-01-01

    Detached eddy simulation (DES) based on the v2-f RANS model is proposed. This RANS model incorporates the anisotropy of near-wall turbulence, which is absent in other RANS models commonly used in the DES community. In LES mode, the proposed DES formulation reduces to a transport equation for the subgrid-scale kinetic energy. The constant C_DES required by this model was calibrated by simulating isotropic turbulence. In the final paper, DES simulations of canonical separated flows will be presented.

  5. Active polarimetric imaging: military and dual-use applications

    NASA Astrophysics Data System (ADS)

    Goudail, François; Boffety, Matthieu; Leviandier, Luc; Vannier, Nicolas

    2017-12-01

    Active polarimetric imaging reveals contrasts that are invisible to the human eye and to conventional cameras. It can extend the decamouflage capabilities of active imaging systems and improve the detection of dangerous objects on tracks. It also has numerous dual-use applications in fields such as biomedical imaging and machine vision.

  6. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Alexander; Hawes, Frederick; Fox, Marsha

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single-band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation

  7. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurements uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

  8. Explicit validation of a surface shortwave radiation balance model over snow-covered complex terrain

    NASA Astrophysics Data System (ADS)

    Helbig, N.; Löwe, H.; Mayer, B.; Lehning, M.

    2010-09-01

    A model that computes the surface radiation balance for all sky conditions in complex terrain is presented. The spatial distribution of direct and diffuse sky radiation is determined from observations of incident global radiation, air temperature, and relative humidity at a single measurement location. Incident radiation under cloudless sky is spatially derived from a parameterization of the atmospheric transmittance. Direct and diffuse sky radiation for all sky conditions are obtained by decomposing the measured global radiation value. Spatial incident radiation values under all atmospheric conditions are computed by adjusting the spatial radiation values obtained from the parametric model with the radiation components obtained from the decomposition model at the measurement site. Topographic influences such as shading are accounted for. The radiosity approach is used to compute anisotropic terrain reflected radiation. Validations of the shortwave radiation balance model are presented in detail for a day with cloudless sky. For a day with overcast sky a first validation is presented. Validation of a section of the horizon line as well as of individual radiation components is performed with high-quality measurements. A new measurement setup was designed to determine terrain reflected radiation. There is good agreement between the measurements and the modeled terrain reflected radiation values as well as with incident radiation values. A comparison of the model with a fully three-dimensional radiative transfer Monte Carlo model is presented. That validation reveals a good agreement between modeled radiation values.
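
    A minimal sketch of the decomposition step described above: split measured global horizontal irradiance into diffuse and direct parts using a clearness-index correlation. The Erbs et al. (1982) diffuse-fraction correlation used here is a widely published choice adopted for illustration; the paper's own decomposition and transmittance parameterizations may differ.

        import numpy as np

        def erbs_diffuse_fraction(kt):
            """Diffuse fraction of global irradiance vs. clearness index kt
            (Erbs et al., 1982 piecewise correlation)."""
            kt = np.asarray(kt, float)
            mid = (0.9511 - 0.1604 * kt + 4.388 * kt**2
                   - 16.638 * kt**3 + 12.336 * kt**4)
            return np.where(kt <= 0.22, 1.0 - 0.09 * kt,
                            np.where(kt <= 0.80, mid, 0.165))

        def decompose_global(ghi, extraterrestrial_horizontal):
            """Split measured global horizontal irradiance into direct and diffuse."""
            kt = ghi / extraterrestrial_horizontal   # clearness index
            diffuse = erbs_diffuse_fraction(kt) * ghi
            return ghi - diffuse, diffuse            # (direct, diffuse)

        # Example: 650 W/m^2 measured, 1000 W/m^2 available at the top of atmosphere.
        direct, diffuse = decompose_global(650.0, 1000.0)
        print(f"direct = {direct:.0f} W/m^2, diffuse = {diffuse:.0f} W/m^2")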

  9. A Public-Private Partnership Develops and Externally Validates a 30-Day Hospital Readmission Risk Prediction Model

    PubMed Central

    Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat

    2013-01-01

    Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use rigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied to the patient cohort, which was then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, and overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation and internal and external validation post-recalibration (C-statistics of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. The ACC plans to continue its partnership to

  10. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles, and of their extensibility to a full-scale integrated subassembly model, is given. The independently verified and validated modeling methods are applied to the development of a MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.

  11. MT3DMS: Model use, calibration, and validation

    USGS Publications Warehouse

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.
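
    For readers unfamiliar with the underlying equation, the following minimal sketch (not MT3DMS itself, and with invented parameter values) integrates the one-dimensional advection-dispersion equation that MT3DMS generalizes to three dimensions and multiple species, using a first-order upwind scheme with periodic boundaries.

      import numpy as np

      # Illustrative 1-D explicit solution of dC/dt = D d2C/dx2 - v dC/dx.
      # Parameter values are arbitrary examples, not a calibrated model.
      def advect_disperse(c0, v=1.0, D=0.1, dx=1.0, dt=0.4, steps=100):
          c = np.asarray(c0, dtype=float)
          assert v * dt / dx <= 1.0, "Courant condition violated"
          for _ in range(steps):
              adv = -v * (c - np.roll(c, 1)) / dx                      # upwind, v > 0
              disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
              c = c + dt * (adv + disp)                                # periodic BCs
          return c

      c0 = np.zeros(100)
      c0[10] = 1.0                            # unit pulse of solute
      c = advect_disperse(c0)
      print("pulse centre moved to cell", int(c.argmax()))  # ~ cell 50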

  12. Static and Dynamic Disorder in Bacterial Light-Harvesting Complex LH2: A 2DES Simulation Study.

    PubMed

    Rancova, Olga; Abramavicius, Darius

    2014-07-10

    Two-dimensional coherent electronic spectroscopy (2DES) is a powerful technique for distinguishing homogeneous and inhomogeneous broadening contributions to the spectral line shapes of molecular transitions induced by environment fluctuations. Using an excitonic model of a double-ring LH2 aggregate, we perform simulations of its 2DES spectra and find that the model of a harmonic environment cannot provide a consistent set of parameters for two temperatures: 77 K and room temperature. This indicates the highly anharmonic nature of protein fluctuations for the pigments of the B850 ring. However, the fluctuations of the B800 ring pigments can be assumed to be harmonic in this temperature range.

  13. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. In our comparison, small differences in either QALYs or costs led to changes in
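
    As a minimal sketch of the health-economic bookkeeping involved (the strategy names and numbers below are invented, not the Oncotyrol or THETA results), incremental cost-effectiveness ratios are computed along the efficiency frontier of nondominated strategies:

      # Generic sketch: keep only nondominated strategies (QALYs strictly
      # increase as cost increases) and compute ICERs along the frontier.
      def frontier_with_icers(strategies):
          """strategies: list of (name, cost, QALYs) tuples."""
          best = []
          for name, cost, qaly in sorted(strategies, key=lambda s: s[1]):
              if not best or qaly > best[-1][2]:
                  best.append((name, cost, qaly))
          out = [(best[0][0], None)]  # cheapest strategy has no comparator
          for (n0, c0, q0), (n1, c1, q1) in zip(best, best[1:]):
              out.append((n1, (c1 - c0) / (q1 - q0)))  # ICER in $/QALY
          return out

      print(frontier_with_icers([("no test", 10_000, 9.0),
                                 ("21-gene assay", 14_000, 9.2)]))
      # -> [('no test', None), ('21-gene assay', 20000.0)]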

  14. Evaluation of antibiotic prescriptions in the emergency department of the Hôpital Militaire d'Instruction Mohammed V (HMIMV)

    PubMed Central

    Elbouti, Anass; Rafai, Mostafa; Chouaib, Naoufal; Jidane, Said; Belkouch, Ahmed; Bakkali, Hicham; Belyamani, Lahcen

    2016-01-01

    This study aimed to describe antibiotic prescribing practices, to assess their appropriateness and their compliance with usage guidelines, and to examine the factors likely to influence them. It is a cross-sectional evaluation of antibiotic prescriptions for 105 patients, carried out in the medical-surgical emergency department of the Hôpital Militaire d'Instruction Mohammed V in Rabat over a one-month period. Data were collected using a questionnaire recording demographic and history data, past medical history, known allergies, specific clinical examination findings, paraclinical data, and the detailed antibiotic prescription. The collected data were then evaluated by a referring physician responsible for identifying any treatment errors. Among the infections motivating antibiotic prescription, respiratory and urinary tract conditions ranked first; the antibiotic families most commonly used were penicillins, quinolones and cephalosporins. Seventy-four prescriptions (70.5%) were both appropriate and compliant, versus 9 prescriptions (8.6%) that were justified but not appropriate, and 6 prescriptions (5.7%) judged unjustified by the referring physician owing to the absence of infection. Evaluations of medical practice are rarely conducted in healthcare institutions; this study was undertaken in that context, in order to improve the appropriateness of our antibiotic prescriptions and to optimize their compliance with current recommendations. PMID:28292124

  15. Screening for cardiovascular disease among students of the University of Douala and the influence of physical and sporting activities

    PubMed Central

    Ewane, Marielle Epacka; Mandengue, Samuel Honoré; Priso, Eugene Belle; Tamba, Stéphane Moumbe; Ahmadou; Fouda, André Bita

    2012-01-01

    Introduction: Cardiovascular diseases (CVD) are one of the leading causes of mortality in developing countries. Screening for them among young people is a challenge in the fight against their spread. The aim of this study was to screen for these diseases in a population of young Cameroonian students. Methods: In April-May 2011, 2,658 students of the University of Douala (23.6 ± 2.9 years, M/F sex ratio = 0.9) took part in a free screening campaign for diabetes, arterial hypertension (HTN) and obesity. They also completed a survey assessing their level of physical and sporting activities (PSA). Results: 12.7% of participants had a blood pressure ≥ 140/90 mmHg, 3.6% were obese, and 0.9% had a blood glucose ≥ 1.26 g/L. Correlations were found between certain risk factors (diabetes, hypertension and obesity) and academic level (r = 0.366; p < 0.0001) on the one hand, and time spent watching television (r = 0.411; p < 0.0001) on the other. PSA was inversely correlated with age (r = -0.015; p < 0.0001) and with time spent watching television (r = -0.059; p = 0.002). Conclusion: The presence of CVD and of the risk factors highlighted in this study of Cameroonian students calls for prevention and education to combat them. PMID:22655111

  16. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation is necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  17. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
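
    A hedged sketch of what fitting a POD curve can look like in practice: model the probability of detection as a logistic function of log concentration and fit it to pooled detect/no-detect rates. The functional form and data below are illustrative assumptions, not the published model specification.

      import numpy as np
      from scipy.optimize import curve_fit

      # Illustrative POD curve: logistic in log10 concentration.
      def pod(logc, a, b):
          return 1.0 / (1.0 + np.exp(-(a + b * logc)))

      conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])     # analyte levels (invented)
      detected = np.array([0.05, 0.20, 0.60, 0.95, 1.0])  # observed detection rates
      params, _ = curve_fit(pod, np.log10(conc), detected, p0=[0.0, 1.0])
      print("POD at c = 5:", pod(np.log10(5.0), *params))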

  18. Use of DES in mildly separated internal flow: dimples in a turbulent channel

    NASA Astrophysics Data System (ADS)

    Tay, Chien Ming Jonathan; Khoo, Boo Cheong; Chew, Yong Tian

    2017-12-01

    Detached eddy simulation (DES) is investigated as a means to study an array of shallow dimples with depth-to-diameter ratios of 1.5% and 5% in a turbulent channel. The DES captures large-scale flow features relatively well, but it cannot predict skin friction accurately because of the modelled treatment of the near-wall flow. The current work therefore relies on the accuracy of DES in predicting large-scale flow features, and on its well-documented reliability in predicting regions of flow separation, to support the proposed mechanism whereby dimples reduce drag by introducing spanwise flow components near the wall through the addition of streamwise vorticity. Profiles of the turbulent energy budget show the stabilising effect of the dimples on the flow. The presence of flow separation, however, modulates the net drag reduction. Increasing the Reynolds number can reduce the size of the separated region, and experiments show that this increases the overall drag reduction.

  19. Decompression models: review, relevance and validation capabilities.

    PubMed

    Hugon, J

    2014-01-01

    For more than a century, several types of mathematical models have been proposed to describe tissue desaturation mechanisms in order to limit decompression sickness (DCS). These models are statistically assessed against DCS cases and, over time, have gradually incorporated the biophysics of bubble formation. This paper reviews this evolution and discusses its limitations. The review is organized around a comparison of the biophysical criteria and theoretical foundations of decompression models. The DCS-predictive capability is then analyzed to assess whether it could be improved by combining different approaches. Most operational decompression models have a neo-Haldanian form. Nevertheless, bubble modeling has been gaining popularity, and the amount of circulating bubbles has become a major output. By merging both views, it seems possible to build a relevant global decompression model that simulates bubble production while predicting DCS risk for all types of exposures and decompression profiles. A statistical approach combining both DCS and bubble detection databases has to be developed to calibrate such a global decompression model. Doppler ultrasound and DCS data are essential: i. to make the correlation and validation phases reliable; ii. to adjust biophysical criteria to best fit the observed bubble kinetics; and iii. to build a relevant risk function.
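
    To make "neo-Haldanian form" concrete, here is a minimal sketch of the exponential gas-loading kernel such models share; the compartment half-time and pressures below are illustrative values, not those of any operational table.

      import math

      # Haldane-style compartment: inert-gas tension relaxes exponentially
      # toward the ambient inspired pressure with a fixed half-time.
      def tissue_tension(p0, p_ambient, half_time_min, minutes):
          k = math.log(2) / half_time_min
          return p_ambient + (p0 - p_ambient) * math.exp(-k * minutes)

      # 5-min compartment after 10 min at an ambient nitrogen pressure of
      # 3.16 bar (all values illustrative):
      print(tissue_tension(p0=0.79, p_ambient=3.16, half_time_min=5, minutes=10))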

  20. DEVELOPMENT OF GUIDELINES FOR CALIBRATING, VALIDATING, AND EVALUATING HYDROLOGIC AND WATER QUALITY MODELS: ASABE ENGINEERING PRACTICE 621

    USDA-ARS?s Scientific Manuscript database

    Information to support application of hydrologic and water quality (H/WQ) models abounds, yet modelers commonly use arbitrary, ad hoc methods to conduct, document, and report model calibration, validation, and evaluation. Consistent methods are needed to improve model calibration, validation, and e...

  1. Translation and validation of the Spanish version of the "Echelle de Satisfaction des Besoins Psychologiques" in the sports context.

    PubMed

    Domínguez, Evelia; Martín, Patricia; Martín-Albo, José; Núñez, Juan L; León, Jaime

    2010-11-01

    The aim of the present research was to translate and analyze the psychometric properties of the Spanish version of the Satisfaction of Psychological Needs Scale, using a sample of 284 athletes (204 male and 78 female). Confirmatory factor analysis supported the correlated three-factor structure of the scale. Furthermore, the results showed evidence of convergent validity with the Basic Psychological Needs in Exercise Scale. Predictive validity was tested using a structural equation model in which a task-orientation climate predicted the three basic psychological needs and these, in turn, predicted intrinsic motivation. Likewise, we documented evidence of reliability, analyzed as internal consistency and temporal stability. The results partially support the use of the Spanish version of the scale in sports.

  2. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2013-11-20

    Fragments indexed from the report: Granger causality F-test validation; dynamic time warping for uneven temporal relationships ("many causal relationships are imperfectly ..."); mapping for dynamic feedback models ("Granger causality and DTW can identify causal relationships and consider complex temporal factors; however, many ..."); and a variant of the tf-idf algorithm (Manning, Raghavan, Schütze et al., 2008), typically used in search engines, to "score" features ("the (-log tf) in ...").

  3. Use of DES Modeling for Determining Launch Availability for SLS

    NASA Technical Reports Server (NTRS)

    Watson, Michael; Staton, Eric; Cates, Grant; Finn, Ronald; Altino, Karen M.; Burns, K. Lee

    2014-01-01

    (1) NASA is developing a new heavy lift launch system for human and scientific exploration beyond Earth orbit, comprising the Space Launch System (SLS), Orion Multi-Purpose Crew Vehicle (MPCV), and Ground Systems Development and Operations (GSDO); (2) the goal is to ensure high confidence of successfully launching the exploration missions, especially those that require multiple launches, have narrow Earth departure windows, and carry high investment costs; and (3) this presentation discusses the process used by a Cross-Program team to develop the Exploration Systems Development (ESD) Launch Availability (LA) Technical Performance Measure (TPM) and allocate it to each of the Programs through the use of Discrete Event Simulations (DES).

  4. Calibration and validation of a general infiltration model

    NASA Astrophysics Data System (ADS)

    Mishra, Surendra Kumar; Ranjan Kumar, Shashi; Singh, Vijay P.

    1999-08-01

    A general infiltration model proposed by Singh and Yu (1990) was calibrated and validated using a split sampling approach for 191 sets of infiltration data observed in the states of Minnesota and Georgia in the USA. Of the five model parameters, fc (the final infiltration rate), So (the available storage space) and exponent n were found to be more predictable than the other two parameters: m (exponent) and a (proportionality factor). A critical examination of the general model revealed that it is related to the Soil Conservation Service (1956) curve number (SCS-CN) method and its parameter So is equivalent to the potential maximum retention of the SCS-CN method and is, in turn, found to be a function of soil sorptivity and hydraulic conductivity. The general model was found to describe infiltration rate with time varying curve number.
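
    Since the abstract ties the storage parameter So to the potential maximum retention of the SCS-CN method, a minimal sketch of the standard SCS-CN relations may help; the curve number and rainfall depth below are invented examples.

      # Standard SCS-CN relations: potential maximum retention S from the
      # curve number CN, and direct runoff Q from rainfall P (depths in mm).
      def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
          s = 25400.0 / cn - 254.0          # potential maximum retention
          ia = ia_ratio * s                 # initial abstraction
          if p_mm <= ia:
              return 0.0
          return (p_mm - ia) ** 2 / (p_mm - ia + s)

      print(scs_cn_runoff(50.0, cn=80))  # ~13.8 mm runoff for a 50 mm storm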

  5. Acute Brain Dysfunction: Development and Validation of a Daily Prediction Model.

    PubMed

    Marra, Annachiara; Pandharipande, Pratik P; Shotwell, Matthew S; Chandrasekhar, Rameela; Girard, Timothy D; Shintani, Ayumi K; Peelen, Linda M; Moons, Karl G M; Dittus, Robert S; Ely, E Wesley; Vasilevskis, Eduard E

    2018-03-24

    The goal of this study was to develop and validate a dynamic risk model to predict daily changes in acute brain dysfunction (ie, delirium and coma), discharge, and mortality in ICU patients. Using data from a multicenter prospective ICU cohort, a daily acute brain dysfunction-prediction model (ABD-pm) was developed by using multinomial logistic regression that estimated 15 transition probabilities (from one of three brain function states [normal, delirious, or comatose] to one of five possible outcomes [normal, delirious, comatose, ICU discharge, or died]) using baseline and daily risk factors. Model discrimination was assessed by using predictive characteristics such as negative predictive value (NPV). Calibration was assessed by plotting empirical vs model-estimated probabilities. Internal validation was performed by using a bootstrap procedure. Data were analyzed from 810 patients (6,711 daily transitions). The ABD-pm included individual risk factors: mental status, age, preexisting cognitive impairment, baseline and daily severity of illness, and daily administration of sedatives. The model yielded very high NPVs for "next day" delirium (NPV: 0.823), coma (NPV: 0.892), normal cognitive state (NPV: 0.875), ICU discharge (NPV: 0.905), and mortality (NPV: 0.981). The model demonstrated outstanding calibration when predicting the total number of patients expected to be in any given state across predicted risk. We developed and internally validated a dynamic risk model that predicts the daily risk for one of three cognitive states, ICU discharge, or mortality. The ABD-pm may be useful for predicting the proportion of patients for each outcome state across entire ICU populations to guide quality, safety, and care delivery activities. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
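
    As a brief illustration of the reported metric (a generic sketch, not the study's code): the negative predictive value is the fraction of patients predicted to fall below a chosen risk threshold who indeed do not experience the outcome.

      # Generic NPV sketch at an illustrative risk cut-off.
      def npv(risks, outcomes, threshold=0.5):
          negatives = [(r, y) for r, y in zip(risks, outcomes) if r < threshold]
          if not negatives:
              return float("nan")
          true_neg = sum(1 for _, y in negatives if y == 0)
          return true_neg / len(negatives)

      # e.g. npv([0.9, 0.2, 0.4, 0.7], [1, 0, 0, 1]) -> 1.0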

  6. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    PubMed

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  7. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.
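
    The trade-off described in the two records above can be illustrated with a toy example; everything below (data-generating process, model choices, noise level) is invented. A linear and a nonlinear autoregressive fit are compared by AIC, so the nonlinear model's extra parameter must pay for itself.

      import numpy as np

      def aic(residuals, k):
          # Gaussian-likelihood AIC, up to an additive constant
          n = len(residuals)
          return n * np.log(np.mean(residuals**2)) + 2 * k

      rng = np.random.default_rng(0)
      x = np.empty(500); x[0] = 0.3
      for t in range(1, 500):  # noisy logistic-map series
          x[t] = np.clip(3.8 * x[t-1] * (1 - x[t-1])
                         + 0.001 * rng.standard_normal(), 0.0, 1.0)

      X, y = x[:-1], x[1:]
      lin = np.polyfit(X, y, 1)    # linear AR(1) fit
      quad = np.polyfit(X, y, 2)   # nonlinear (quadratic) fit
      print("AIC linear:   ", aic(y - np.polyval(lin, X), 2))
      print("AIC quadratic:", aic(y - np.polyval(quad, X), 3))
      # the quadratic model wins decisively here, as the data are
      # genuinely nonlinear; for a linear series it would not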

  8. Advance care planning for pediatric patients

    PubMed Central

    2008-01-01

    SUMMARY: Medical and technological advances have increased survival rates and improved the quality of life of infants, children and adolescents with chronic life-threatening conditions. Advance care planning includes the process of discussing life-sustaining treatments and establishing the goals of long-term care. Pediatric care providers have an ethical obligation to master this aspect of medical care. The present statement aims to help care providers discuss advance care planning for pediatric patients in a variety of situations. Advance care planning requires effective communication to clarify the goals of care and to reach agreement on which treatments are or are not appropriate for achieving those goals, including resuscitation and palliative measures.

  9. Design and validation of a model to predict early mortality in haemodialysis patients.

    PubMed

    Mauri, Joan M; Clèries, Montse; Vela, Emili

    2008-05-01

    Mortality and morbidity rates are higher in patients receiving haemodialysis therapy than in the general population. Detection of risk factors related to early death in these patients could aid clinical and administrative decision making. The aims of this study were (1) to identify risk factors (comorbidity and variables specific to haemodialysis) associated with death in the first year following the start of haemodialysis and (2) to design and validate a prognostic model to quantify the probability of death for each patient. An analysis was carried out on all patients starting haemodialysis treatment in Catalonia during the period 1997-2003 (n = 5738). The data source was the Renal Registry of Catalonia, a mandatory population registry. Patients were randomly divided into two samples: 60% (n = 3455) of the total were used to develop the prognostic model and the remaining 40% (n = 2283) to validate it. Logistic regression analysis was used to construct the model. One-year mortality in the total study population was 16.5%. The predictive model included the following variables: age, sex, primary renal disease, grade of functional autonomy, chronic obstructive pulmonary disease, malignant processes, chronic liver disease, cardiovascular disease, initial vascular access and malnutrition. The analyses showed adequate calibration for both the development sample and the validation sample (Hosmer-Lemeshow test: P = 0.97 and P = 0.49, respectively) as well as adequate discrimination (area under the ROC curve 0.78 in both cases). Risk factors implicated in mortality in the year following the start of haemodialysis have been determined and a prognostic model designed. The validated, easy-to-apply model quantifies individual patient risk attributable to various factors, some of them amenable to correction by directed interventions.

  10. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model.

    PubMed

    Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The RIM model with weights was then validated in a prospective study of 25 IHCS implementation cases. Orthogonal main effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor for the 1-year implementation outcome than the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting IHCS and sustaining the system in the long term.

  11. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers.

    PubMed

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    2010-09-01

    To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate predictors). First, principal component analysis was used to reduce the number of candidate predictors. Then, multivariable logistic regression analysis was used to develop the model. Internal validation and the extent of optimism were assessed with bootstrapping. External validation was studied in 390 independent Dutch bakery workers (validation set, prevalence of sensitization 20%). The prediction model contained the predictors nasoconjunctival symptoms, asthma symptoms, shortness of breath and wheeze, work-related upper and lower respiratory symptoms, and traditional bakery. The model showed good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.76 (0.75 after internal validation). Application of the model in the validation set gave reasonable discrimination (ROC curve area = 0.69) and good calibration after a small adjustment of the model intercept. A simple model with questionnaire items only can be used to stratify bakers according to their risk of sensitization to wheat allergens. Its use may increase the cost-effectiveness of (subsequent) medical surveillance.

  12. Model improvements and validation of TerraSAR-X precise orbit determination

    NASA Astrophysics Data System (ADS)

    Hackel, S.; Montenbruck, O.; Steigenberger, P.; Balss, U.; Gisinger, C.; Eineder, M.

    2017-05-01

    The radar imaging satellite mission TerraSAR-X requires precisely determined satellite orbits for validating geodetic remote sensing techniques. Since the achieved quality of the operationally derived, reduced-dynamic (RD) orbit solutions limits the capabilities of the synthetic aperture radar (SAR) validation, an effort is made to improve the estimated orbit solutions. This paper discusses the benefits of refined dynamical models on orbit accuracy as well as estimated empirical accelerations and compares different dynamic models in a RD orbit determination. Modeling aspects discussed in the paper include the use of a macro-model for drag and radiation pressure computation, the use of high-quality atmospheric density and wind models as well as the benefit of high-fidelity gravity and ocean tide models. The Sun-synchronous dusk-dawn orbit geometry of TerraSAR-X results in a particularly high correlation of solar radiation pressure modeling and estimated normal-direction positions. Furthermore, this mission offers a unique suite of independent sensors for orbit validation. Several parameters serve as quality indicators for the estimated satellite orbit solutions. These include the magnitude of the estimated empirical accelerations, satellite laser ranging (SLR) residuals, and SLR-based orbit corrections. Moreover, the radargrammetric distance measurements of the SAR instrument are selected for assessing the quality of the orbit solutions and compared to the SLR analysis. The use of high-fidelity satellite dynamics models in the RD approach is shown to clearly improve the orbit quality compared to simplified models and loosely constrained empirical accelerations. The estimated empirical accelerations are substantially reduced by 30% in tangential direction when working with the refined dynamical models. Likewise, the SLR residuals are reduced from -3 ± 17 to 2 ± 13 mm, and the SLR-derived normal-direction position corrections are reduced from 15 to 6 mm, obtained from

  13. Joint measurement of lensing-galaxy correlations using SPT and DES SV data

    DOE PAGES

    Baxter, E. J.

    2016-07-04

    We measure the correlation of galaxy lensing and cosmic microwave background lensing with a set of galaxies expected to trace the matter density field. The measurements are performed using pre-survey Dark Energy Survey (DES) Science Verification optical imaging data and millimeter-wave data from the 2500 square degree South Pole Telescope Sunyaev-Zel'dovich (SPT-SZ) survey. The two lensing-galaxy correlations are jointly fit to extract constraints on cosmological parameters, constraints on the redshift distribution of the lens galaxies, and constraints on the absolute shear calibration of DES galaxy lensing measurements. We show that an attractive feature of these fits is that they are fairly insensitive to the clustering bias of the galaxies used as matter tracers. The measurement presented in this work confirms that DES and SPT data are consistent with each other and with the currently favored $$\Lambda$$CDM cosmological model. In conclusion, it also demonstrates that the joint lensing-galaxy correlation measurement considered here contains a wealth of information that can be extracted using current and future surveys.

  14. Joint measurement of lensing-galaxy correlations using SPT and DES SV data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, E. J.

    We measure the correlation of galaxy lensing and cosmic microwave background lensing with a set of galaxies expected to trace the matter density field. The measurements are performed using pre-survey Dark Energy Survey (DES) Science Verification optical imaging data and millimeter-wave data from the 2500 square degree South Pole Telescope Sunyaev-Zel'dovich (SPT-SZ) survey. The two lensing-galaxy correlations are jointly fit to extract constraints on cosmological parameters, constraints on the redshift distribution of the lens galaxies, and constraints on the absolute shear calibration of DES galaxy lensing measurements. We show that an attractive feature of these fits is that they are fairly insensitive to the clustering bias of the galaxies used as matter tracers. The measurement presented in this work confirms that DES and SPT data are consistent with each other and with the currently favored $$\Lambda$$CDM cosmological model. In conclusion, it also demonstrates that the joint lensing-galaxy correlation measurement considered here contains a wealth of information that can be extracted using current and future surveys.

  15. Joint measurement of lensing–galaxy correlations using SPT and DES SV data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, E.; Clampitt, J.; Giannantonio, T.

    We measure the correlation of galaxy lensing and cosmic microwave background lensing with a set of galaxies expected to trace the matter density field. The measurements are performed using pre-survey Dark Energy Survey (DES) Science Verification optical imaging data and millimetre-wave data from the 2500 sq. deg. South Pole Telescope Sunyaev–Zel'dovich (SPT-SZ) survey. The two lensing–galaxy correlations are jointly fit to extract constraints on cosmological parameters, constraints on the redshift distribution of the lens galaxies, and constraints on the absolute shear calibration of DES galaxy-lensing measurements. We show that an attractive feature of these fits is that they are fairly insensitive to the clustering bias of the galaxies used as matter tracers. The measurement presented in this work confirms that DES and SPT data are consistent with each other and with the currently favoured Λ cold dark matter cosmological model. It also demonstrates that the joint lensing–galaxy correlation measurement considered here contains a wealth of information that can be extracted using current and future surveys.

  16. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
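
    To give a flavor of what a bit-for-bit evaluation entails, here is a generic sketch; it is not LIVVkit's actual API, and the field name is invented.

      import numpy as np

      # Generic regression check in the spirit of a bit-for-bit evaluation:
      # compare a model run against reference output and report where,
      # and by how much, the fields differ.
      def bit_for_bit(test, reference, name="thickness"):
          test, reference = np.asarray(test), np.asarray(reference)
          if np.array_equal(test, reference):
              return f"{name}: PASS (bit-for-bit)"
          diff = np.abs(test - reference)
          return (f"{name}: FAIL, {np.count_nonzero(diff)} cells differ, "
                  f"max abs diff {diff.max():.3e}")

      print(bit_for_bit([1.0, 2.0], [1.0, 2.0]))
      print(bit_for_bit([1.0, 2.0], [1.0, 2.5]))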

  17. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan

    Regional reliability organizations require power plants to validate the dynamic models that represent them, to ensure that power system studies are performed with the best representation of the components installed. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. Validation of the dynamic model of a WPP must be performed periodically. This is because the control parameters of the WTGs and of the other supporting components within a WPP may be modified to comply with new grid codes, or upgraded with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the types of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validation for WTGs and WPPs, the available recorded data that must be screened before being used for dynamic validation, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without understanding the correct process, the validations may lead to wrong representations of the modeled WTG and WPP.

  18. Characterization of coatings obtained by high-velocity thermal spraying from different WC-10Co-4Cr powders for aeronautical applications

    NASA Astrophysics Data System (ADS)

    Quintero Malpica, Alfonso

    High Velocity Oxy-Fuel (HVOF) thermal spray coatings are commonly used in the aerospace industry, notably at the project's industrial partner (Tecnickrome Aeronautique Inc), as replacements for hard chromium electroplated coatings owing to environmental concerns. The aim of this project was to find an alternative to the powder currently used for producing WC-10Co-4Cr coatings with HVOF thermal spray technology and the HVOF-JET KOTER™ III spray system. First, five powders, including the reference powder, with different particle size distributions were sprayed in order to identify which could be used with the HVOF-JET KOTER™ III system while keeping parameters (hydrogen flow, oxygen flow, powder feed rate and spray distance) similar to those used for the reference powder. The coatings obtained from the studied powders were evaluated against the coating acceptance criteria required by the main landing gear manufacturers. The tests covered thickness, adhesion, microstructure, microhardness, residual stresses and roughness. Based on the results, only two powders met all the properties required by the aerospace specifications. The influence of varying the spray distance on coating quality was then studied: five distances (100, 125, 150, 175 and 200 mm) were chosen for spraying the two selected powders. The resulting coatings showed similar properties (thickness, adhesion, microstructure, microhardness, residual stresses and roughness). It was found that the spray distance is an indirect parameter of the HVOF-JET KOTER™ III spray system and

  19. Analytical ion microscopy of biological tissues

    NASA Astrophysics Data System (ADS)

    Galle, P.

    Proposed in 1960 by R. Castaing and G. Slodzian, secondary ion emission microanalysis is a microanalytical method now widely used for the study of inert materials. The instrument, called the analytical ion microscope, can also be used for the study of biological specimens: images representing the distribution of a given stable or radioactive isotope in a tissue section are obtained with a resolution of 0.5 μm, and the method offers entirely new possibilities in biomedical research. Among its characteristics, two are of particular interest in biological research: its capacity for isotopic analysis, and its very high sensitivity, which makes possible for the first time the chemical analysis of elements at very low, even trace, concentrations in a microvolume, and their localization within a histological section.

  20. Validation and Application of a Dried Blood Spot Assay for Biofilm-Active Antibiotics Commonly Used for Treatment of Prosthetic Implant Infections

    PubMed Central

    Knippenberg, Ben; Page-Sharp, Madhu; Clark, Ben; Dyer, John; Batty, Kevin T.; Davis, Timothy M. E.

    2016-01-01

    Dried blood spot (DBS) antibiotic assays can facilitate pharmacokinetic (PK)/pharmacodynamic (PD) studies in situations where venous blood sampling is logistically difficult. We sought to develop, validate, and apply a DBS assay for rifampin (RIF), fusidic acid (FUS), and ciprofloxacin (CIP). These antibiotics are considered active against organisms in biofilms and are therefore commonly used for the treatment of infections associated with prosthetic implants. A liquid chromatography-mass spectroscopy DBS assay was developed and validated, including red cell partitioning and thermal stability for each drug and the rifampin metabolite desacetyl rifampin (Des-RIF). Plasma and DBS concentrations in 10 healthy adults were compared, and the concentration-time profiles were incorporated into population PK models. The limits of quantification for RIF, Des-RIF, CIP, and FUS in DBS were 15 μg/liter, 14 μg/liter, 25 μg/liter, and 153 μg/liter, respectively. Adjusting for hematocrit, red cell partitioning, and relative recovery, DBS-predicted plasma concentrations were comparable to measured plasma concentrations for each antibiotic (r > 0.95; P < 0.0001), and Bland-Altman plots showed no significant bias. The final population PK estimates of clearance, volume of distribution, and time above threshold MICs for measured and DBS-predicted plasma concentrations were comparable. These drugs were stable in DBSs for at least 10 days at room temperature and 1 month at 4°C. The present DBS antibiotic assays are robust and can be used as surrogates for plasma concentrations to provide valid PK and PK/PD data in a variety of clinical situations, including therapeutic drug monitoring or studies of implant infections. PMID:27270283
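
    A minimal sketch of the Bland-Altman agreement check used above for DBS-predicted versus measured plasma concentrations; the data values are invented.

      import numpy as np

      # Bland-Altman summary: bias (mean difference) and 95% limits of
      # agreement between two measurement methods.
      def bland_altman(measured, predicted):
          measured, predicted = np.asarray(measured), np.asarray(predicted)
          diff = predicted - measured
          bias = diff.mean()
          half_width = 1.96 * diff.std(ddof=1)
          return bias, (bias - half_width, bias + half_width)

      bias, limits = bland_altman([10.0, 8.0, 12.0], [9.5, 8.4, 12.2])
      print(f"bias = {bias:.2f}, 95% limits of agreement = {limits}")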

  1. Estimating ICU bed capacity using discrete event simulation.

    PubMed

    Zhu, Zhecheng; Hen, Bee Hoon; Teow, Kiok Liang

    2012-01-01

    The intensive care unit (ICU) in a hospital caters for critically ill patients. The number of ICU beds has a direct impact on many aspects of hospital performance. A lack of ICU beds may cause ambulance diversion and surgery cancellation, while an excess of ICU beds may waste resources. This paper aims to develop a discrete event simulation (DES) model to help healthcare service providers determine the ICU bed capacity that strikes a balance between service level and cost effectiveness. The DES model is developed to reflect the complex patient flow of the ICU system. Actual operational data, including emergency arrivals, elective arrivals and length of stay, are fed directly into the DES model to capture the variations in the system. The DES model is validated by open box and black box tests. The validated model is used to test two what-if scenarios of interest to the healthcare service providers: the number of ICU beds in service needed to meet a target rejection rate, and the extra ICU beds in service needed to meet demand growth. A 12-month period of actual operational data was collected from an ICU department with 13 ICU beds in service. Comparison between the simulation results and the actual situation shows that the DES model accurately captures the variations in the system and is flexible enough to simulate various what-if scenarios. DES helps healthcare service providers describe the current situation and simulate what-if scenarios for future planning.
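
    A minimal sketch of this kind of DES model, assuming the third-party simpy package and invented arrival and length-of-stay rates (the study fed real operational data into its model): patients arriving when all beds are occupied are rejected rather than queued.

      import random
      import simpy

      def patient(env, beds, stats):
          if beds.count < beds.capacity:
              with beds.request() as req:
                  yield req
                  yield env.timeout(random.expovariate(1 / 3.5))  # LOS, days (invented)
          else:
              stats["rejected"] += 1  # no free bed: patient diverted

      def arrivals(env, beds, stats, rate_per_day=3.0):  # rate invented
          while True:
              yield env.timeout(random.expovariate(rate_per_day))
              stats["arrivals"] += 1
              env.process(patient(env, beds, stats))

      random.seed(1)
      stats = {"arrivals": 0, "rejected": 0}
      env = simpy.Environment()
      beds = simpy.Resource(env, capacity=13)  # 13 beds, as in the study
      env.process(arrivals(env, beds, stats))
      env.run(until=365)                       # one simulated year
      print("rejection rate:", stats["rejected"] / stats["arrivals"])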

  2. Development and validation of a two-dimensional fast-response flood estimation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimates of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can be used to accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.

  3. Review and evaluation of performance measures for survival prediction models in external validation settings.

    PubMed

    Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z

    2017-04-18

    When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure be used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model be re-calibrated first. We also recommend that Royston's D be routinely reported to assess discrimination, since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings and is recommended for routine reporting. Our recommendation would be to use any of the predictive accuracy measures and to provide the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics
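
    As a concrete anchor for the concordance discussion, here is a generic sketch of Harrell's C for right-censored data (not the authors' implementation): a pair is comparable only when the shorter follow-up ended in an event, and it is concordant when the higher-risk patient failed first.

      def harrell_c(times, events, risks):
          """times: follow-up; events: 1=event, 0=censored; risks: model scores."""
          num = den = 0.0
          n = len(times)
          for i in range(n):
              for j in range(n):
                  if events[i] == 1 and times[i] < times[j]:  # comparable pair
                      den += 1
                      if risks[i] > risks[j]:
                          num += 1
                      elif risks[i] == risks[j]:
                          num += 0.5
          return num / den

      # e.g. harrell_c([2, 5, 9], [1, 1, 0], [0.9, 0.5, 0.1]) -> 1.0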

  4. The Safety Culture Enactment Questionnaire (SCEQ): Theoretical model and empirical validation.

    PubMed

    de Castro, Borja López; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2017-06-01

    This paper presents the Safety Culture Enactment Questionnaire (SCEQ), designed to assess the degree to which safety is an enacted value in the day-to-day running of nuclear power plants (NPPs). The SCEQ is based on a theoretical safety culture model that is manifested in three fundamental components of the functioning and operation of any organization: strategic decisions, human resources practices, and daily activities and behaviors. The extent to which the importance of safety is enacted in each of these three components provides information about the pervasiveness of the safety culture in the NPP. To validate the SCEQ and the model on which it is based, two separate studies were carried out with data collection in 2008 and 2014, respectively. In Study 1, the SCEQ was administered to the employees of two Spanish NPPs (N=533) belonging to the same company. Participants in Study 2 included 598 employees from the same NPPs, who completed the SCEQ and other questionnaires measuring different safety outcomes (safety climate, safety satisfaction, job satisfaction and risky behaviors). Study 1 comprised item formulation and examination of the factorial structure and reliability of the SCEQ. Study 2 tested internal consistency and provided evidence of factorial validity, validity based on relationships with other variables, and discriminant validity between the SCEQ and safety climate. Exploratory Factor Analysis (EFA) carried out in Study 1 revealed a three-factor solution corresponding to the three components of the theoretical model. Reliability analyses showed strong internal consistency for the three scales of the SCEQ, and each of the 21 items on the questionnaire contributed to the homogeneity of its theoretically developed scale. Confirmatory Factor Analysis (CFA) carried out in Study 2 supported the internal structure of the SCEQ; internal consistency of the scales was also supported. Furthermore, the three scales of the SCEQ showed the expected correlation

  5. Development of the Verona coding definitions of emotional sequences to code health providers' responses (VR-CoDES-P) to patient cues and concerns.

    PubMed

    Del Piccolo, Lidia; de Haes, Hanneke; Heaven, Cathy; Jansen, Jesse; Verheul, William; Bensing, Jozien; Bergvik, Svein; Deveugele, Myriam; Eide, Hilde; Fletcher, Ian; Goss, Claudia; Humphris, Gerry; Kim, Young-Mi; Langewitz, Wolf; Mazzi, Maria Angela; Mjaaland, Trond; Moretti, Francesca; Nübling, Matthias; Rimondini, Michela; Salmon, Peter; Sibbern, Tonje; Skre, Ingunn; van Dulmen, Sandra; Wissow, Larry; Young, Bridget; Zandbelt, Linda; Zimmermann, Christa; Finset, Arnstein

    2011-02-01

    To present a method to classify health provider responses to patient cues and concerns according to the VR-CoDES-CC (Del Piccolo et al. (2009) [2] and Zimmermann et al. (submitted for publication) [3]). The system permits sequence analysis and a detailed description of how providers handle patients' expressions of emotion. The Verona-CoDES-P system has been developed based on consensus views within the "Verona Network of Sequence Analysis". The different phases of the creation process are described in detail. A reliability study has been conducted on 20 interviews from a convenience sample of 104 psychiatric consultations. The VR-CoDES-P has two main classes of provider responses, corresponding to the degree of explicitness (yes/no) and space (yes/no) that is given by the health provider to each cue/concern expressed by the patient. The system can be further subdivided into 17 individual categories. Statistical analyses showed that the VR-CoDES-P is reliable (agreement 92.86%; Cohen's kappa 0.90 ± 0.04; p < 0.0001). Once validity and reliability are tested in different settings, the system should be applied to investigate the relationship between provider responses to patients' expressions of emotion and outcome variables. Research employing the VR-CoDES-P should be used to develop research-based approaches to maximize appropriate responses to patients' indirect and overt expressions of emotional needs. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
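
    A generic sketch of the reported reliability statistic (not the study's code): Cohen's kappa corrects raw inter-rater agreement, such as the 92.86% above, for the agreement expected by chance.

      from collections import Counter

      def cohens_kappa(rater_a, rater_b):
          n = len(rater_a)
          po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed
          ca, cb = Counter(rater_a), Counter(rater_b)
          pe = sum(ca[k] * cb[k] for k in ca) / n**2              # chance
          return (po - pe) / (1 - pe)

      print(cohens_kappa(list("AABBC"), list("AABBB")))  # -> 0.667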

  6. Integrated corridor management (ICM) analysis, modeling, and simulation (AMS) for Minneapolis site : model calibration and validation report.

    DOT National Transportation Integrated Search

    2010-02-01

    This technical report documents the calibration and validation of the baseline (2008) mesoscopic model for the I-394 Minneapolis, Minnesota, Pioneer Site. DynusT was selected as the mesoscopic model for analyzing operating conditions in the I-394 cor...

  7. Approche à l’endroit des blessures traumatiques à la main en soins primaires

    PubMed Central

    Cheung, Kevin; Hatchell, Alexandra; Thoma, Achilleas

    2013-01-01

    Abstract. Objective: To review the initial management of common traumatic hand injuries seen by primary care physicians. Sources of information: We examined recent clinical evidence and specialist literature identified through searches of the MEDLINE electronic database. Expert opinion was used to supplement recommendations in areas where scientific data were scarce. Main message: Primary care physicians are routinely called upon to manage patients with traumatic hand injuries. In the context of a clinical case, we review the assessment, diagnosis, and initial management of common hand traumas. The presentation and management of nail bed injuries, fingertip amputations, mallet finger, hand fractures, tendon lacerations, bites, and infectious tenosynovitis are also discussed. The principles of managing traumatic hand injuries include reduction and immobilization of fractures, post-reduction radiographic imaging, obtaining soft-tissue coverage, preventing and treating infection, and ensuring tetanus prophylaxis. Conclusion: Appropriate assessment and management of traumatic hand injuries are essential to prevent considerable long-term morbidity in an otherwise healthy population. Prompt recognition of injuries that require urgent or timely referral to a specialist hand surgeon is equally critical.

  8. Calibration and validation of toxicokinetic-toxicodynamic models for three neonicotinoids and some aquatic macroinvertebrates.

    PubMed

    Focks, Andreas; Belgers, Dick; Boerwinkel, Marie-Claire; Buijse, Laura; Roessink, Ivo; Van den Brink, Paul J

    2018-05-01

    Exposure patterns in ecotoxicological experiments often do not match the exposure profiles for which a risk assessment needs to be performed. This limitation can be overcome by using toxicokinetic-toxicodynamic (TKTD) models to predict effects under time-variable exposure. To use TKTD models in the environmental risk assessment of chemicals, the model must be calibrated and validated for specific compound-species combinations. In this study, the survival of macroinvertebrates after exposure to neonicotinoid insecticides was modelled using TKTD models from the General Unified Threshold models of Survival (GUTS) framework. The models were calibrated on existing survival data from acute or chronic tests under a static exposure regime. Validation experiments were performed for two sets of species-compound combinations: one set focused on the sensitivity of multiple species to a single compound, imidacloprid; the other on the effects of multiple compounds, the three neonicotinoids imidacloprid, thiacloprid and thiamethoxam, on the survival of a single species, the mayfly Cloeon dipterum. The calibrated models were used to predict survival over time, including uncertainty ranges, for the different time-variable exposure profiles used in the validation experiments. The comparison between observed and predicted survival showed that the accuracy of the model predictions was acceptable for four of the five tested species in the multiple-species data set. For compounds such as neonicotinoids, which are known to have the potential to show increased toxicity under prolonged exposure, the calibration and validation of TKTD models for survival should ideally consider calibration data from both acute and chronic tests.
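
    For illustration of the GUTS framework only (not the authors' code): a minimal sketch of the reduced stochastic-death variant (GUTS-RED-SD), with hypothetical parameter values, predicts survival under a pulsed exposure profile by integrating scaled damage and a hazard rate:

        import numpy as np

        def guts_red_sd(times, conc, kd, z, kk, hb=0.0):
            """Survival under time-variable exposure (GUTS-RED-SD, Euler steps).
            kd: dominant rate constant, z: damage threshold, kk: killing rate,
            hb: background hazard rate."""
            S = np.ones_like(times)
            D = 0.0    # scaled damage
            H = 0.0    # cumulative hazard
            for i in range(1, len(times)):
                dt = times[i] - times[i - 1]
                D += kd * (conc[i - 1] - D) * dt           # dD/dt = kd * (Cw - D)
                H += (kk * max(D - z, 0.0) + hb) * dt      # hazard above threshold
                S[i] = np.exp(-H)
            return S

        # Hypothetical pulsed profile: 2-day pulse at 10 ug/L, then clean water
        t = np.linspace(0.0, 10.0, 1001)
        cw = np.where(t < 2.0, 10.0, 0.0)
        surv = guts_red_sd(t, cw, kd=0.5, z=2.0, kk=0.1)
        print(f"predicted survival at day 10: {surv[-1]:.2f}")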

  9. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    Title-page excerpt: Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model. Thesis by John Haiducek (BS, Physics), 1st Lt, USAF, AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. Abstract (truncated): The High Energy Laser End-to-End

  10. Gene-environment interactions and construct validity in preclinical models of psychiatric disorders.

    PubMed

    Burrows, Emma L; McOmish, Caitlin E; Hannan, Anthony J

    2011-08-01

    The contributions of genetic risk factors to susceptibility for brain disorders are often so closely intertwined with environmental factors that studying genes in isolation cannot provide the full picture of pathogenesis. With recent advances in our understanding of psychiatric genetics and environmental modifiers, we are now in a position to develop more accurate animal models of psychiatric disorders that exemplify the complex interaction of genes and environment. Here, we consider some of the insights that have emerged from studying the relationship between defined genetic alterations and environmental factors in rodent models. A key issue in such animal models is the optimization of construct validity, at both genetic and environmental levels. Standard housing of laboratory mice and rats generally includes ad libitum food access and limited opportunity for physical exercise, leading to metabolic dysfunction under control conditions and thus reducing the validity of animal models with respect to clinical populations. A related issue, of specific relevance to neuroscientists, is that most standard-housed rodents have limited opportunity for sensory and cognitive stimulation, which in turn provides reduced incentive for complex motor activity. Decades of research using environmental enrichment have demonstrated beneficial effects on brain and behavior in both wild-type and genetically modified rodent models, relative to standard-housed littermate controls. One interpretation of such studies is that environmentally enriched animals more closely approximate average human levels of cognitive and sensorimotor stimulation, whereas the standard housing currently used in most laboratories models a more sedentary state of reduced mental and physical activity and abnormal stress levels. The use of such standard housing as a single environmental variable may limit the capacity for preclinical models to translate into successful clinical trials. Therefore, there is a need to

  11. Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.

    PubMed

    Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo

    2018-01-01

    This study aims to investigate the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as the outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement Scale, and the one-item Work Ability Score in combination with a proprietary item. The data were analysed by structural equation modelling. This study contributed to the literature by showing that: A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) both the health-impairment and motivational processes were associated with work ability (WA), and the results suggested that leadership may affect WA, in particular by securing task resources. In conclusion, the nomological validity of the COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may help transfer complex survey results and work-life theories to practitioners in the field.
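
    The study's SEM analysis cannot be reproduced from the abstract; purely as a loose illustration of the two JD-R processes feeding into work ability (hypothetical variable names and simulated data, ordinary least squares standing in for full SEM):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Simulated scale scores; in the real study these come from COPSOQ II items
        rng = np.random.default_rng(2)
        n = 1345
        df = pd.DataFrame({"demands": rng.normal(size=n),
                           "resources": rng.normal(size=n)})
        df["burnout"] = 0.5 * df.demands - 0.2 * df.resources + rng.normal(size=n)
        df["engagement"] = 0.6 * df.resources - 0.1 * df.demands + rng.normal(size=n)
        df["work_ability"] = 0.3 * df.engagement - 0.4 * df.burnout + rng.normal(size=n)

        # Health-impairment (burnout) and motivational (engagement) paths to WA
        X = sm.add_constant(df[["burnout", "engagement"]])
        print(sm.OLS(df["work_ability"], X).fit().params)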

  12. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the comparison error E (the difference between experiments and simulations) but also takes into account the risk to patient safety arising from E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) for velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by a direct comparison between CFD and experimental results using Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and
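
    A minimal sketch of the threshold-based comparison described above, with hypothetical shear-stress values (not the study's data) and assuming SciPy >= 1.6 for the one-sided alternative keyword:

        import numpy as np
        from scipy import stats

        threshold = 600.0                                    # assumed hemolysis threshold (Pa)
        S = np.array([120., 135., 150., 160., 145., 130.])   # CFD results
        D = np.array([125., 140., 148., 170., 150., 128.])   # experimental measurements

        comparison_error = np.abs(S - D)                     # |S - D|
        margin_to_threshold = np.abs(threshold - S)          # |Threshold - S|

        # One-sided paired t-test: is |S - D| statistically smaller than
        # |Threshold - S|, i.e. is the model error small relative to the margin?
        t_stat, p_val = stats.ttest_rel(comparison_error, margin_to_threshold,
                                        alternative="less")
        print(f"t = {t_stat:.2f}, p = {p_val:.4g}")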

  13. Methodes d'amas quantiques a temperature finie appliquees au modele de Hubbard

    NASA Astrophysics Data System (ADS)

    Plouffe, Dany

    Since their discovery in the 1980s, high critical temperature superconductors have attracted great interest in solid-state physics. Understanding the origin of the phases observed in these materials, such as superconductivity, has been one of the major challenges of theoretical solid-state physics over the past 25 years. One of the mechanisms proposed to explain these phenomena is the strong electron-electron interaction. The Hubbard model is one of the simplest models that accounts for these interactions. Despite the apparent simplicity of this model, some of its characteristics, including its phase diagram, are still not well established, despite several theoretical advances in recent years. This study is devoted to analysing numerical methods for computing various properties of the Hubbard model as a function of temperature. We describe methods (the VCA and the CPT) that allow the finite-temperature Green's function of an infinite system to be computed approximately from the Green's function calculated on a finite-size cluster. To compute these Green's functions, we use techniques that considerably reduce the numerical effort required to evaluate thermodynamic averages, by substantially reducing the space of states to be considered in these averages. Although this study aims first at developing cluster methods for solving the Hubbard model at finite temperature in a general way, and at studying the basic properties of this model, we apply it under conditions approaching those of high critical temperature superconductors. The methods presented in this study make it possible to trace a phase diagram for antiferromagnetism and superconductivity that shows several similarities to that of high-temperature superconductors. Keywords: Hubbard model, thermodynamics
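
    For context, using standard notation rather than anything specific to this thesis, the single-band Hubbard Hamiltonian reads:

        H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
            + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
            - \mu \sum_{i,\sigma} n_{i\sigma}

    where t is the nearest-neighbour hopping amplitude, U the on-site repulsion, and mu the chemical potential; cluster methods such as the VCA and CPT approximate the Green's function of this Hamiltonian on the infinite lattice from exact solutions on small clusters.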

  14. Validation of a dynamic linked segment model to calculate joint moments in lifting.

    PubMed

    de Looze, M P; Kingma, I; Bussmann, J B; Toussaint, H M

    1992-08-01

    A two-dimensional dynamic linked segment model was constructed and applied to a lifting activity. Reactive forces and moments were calculated by an instantaneous approach involving the application of Newtonian mechanics to individual adjacent rigid segments in succession. The analysis started once at the feet and once at a hands/load segment. The model was validated by comparing predicted external forces and moments at the feet or at the hands/load segment with actual values, which were either simultaneously measured (ground reaction force at the feet) or assumed to be zero (external moments at the feet and hands/load, and external forces other than gravity at the hands/load). In addition, the results of both procedures, in terms of joint moments, including the moment at the intervertebral disc between the fifth lumbar and first sacral vertebrae (L5-S1), were compared. A correlation of r = 0.88 between calculated and measured vertical ground reaction forces was found. The calculated external forces and moments at the hands showed only minor deviations from the expected zero level. The moments at L5-S1 calculated starting from the feet and starting from the hands/load yielded a coefficient of correlation of r = 0.99. However, moments calculated from the hands/load were 3.6% (averaged values) and 10.9% (peak values) higher. This difference is assumed to be due mainly to erroneous estimates of the positions of the centres of gravity and joint rotation centres. The estimated location of the L5-S1 rotation axis can affect the results significantly. Despite the numerous studies estimating the load on the low back during lifting on the basis of linked segment models, only a few attempts to validate these models have been made. This study is concerned with the validity of the presented linked segment model, and the results support it. Effects of several sources of error threatening the validity are discussed. Copyright © 1992. Published by Elsevier Ltd.
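
    The instantaneous Newtonian approach amounts to a segment-by-segment Newton-Euler recursion; a minimal two-dimensional sketch for a single segment, with hypothetical values (not the paper's model or data):

        import numpy as np

        def cross2(a, b):
            """z-component of the cross product of two 2D vectors."""
            return a[0] * b[1] - a[1] * b[0]

        def proximal_loads(m, I, a_com, alpha, r_prox, r_dist, F_dist, M_dist):
            """Force and moment at the proximal joint of one 2D rigid segment.
            r_prox / r_dist: vectors from the segment COM to the joints;
            F_dist / M_dist: loads transferred at the distal joint."""
            g = np.array([0.0, -9.81])
            F_prox = m * (a_com - g) - F_dist                 # Newton: sum F = m a
            M_prox = (I * alpha - M_dist                      # Euler: sum M = I alpha
                      - cross2(r_dist, F_dist) - cross2(r_prox, F_prox))
            return F_prox, M_prox

        # Hypothetical shank segment during a lift (SI units)
        F, M = proximal_loads(m=3.5, I=0.05,
                              a_com=np.array([0.2, 1.0]), alpha=2.0,
                              r_prox=np.array([0.0, 0.2]), r_dist=np.array([0.0, -0.2]),
                              F_dist=np.array([30.0, 700.0]), M_dist=15.0)
        print("proximal force:", F, "proximal moment:", M)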

  15. Prise en charge des hypertendus dans la ville de Cotonou (Bénin) en 2011: connaissances attitudes et pratiques des médecins généralistes

    PubMed Central

    Houenassi, Martin Dèdonougbo; David, Dokoui; Codjo, Léopold Houétondji; Attinsounon, Angelo Cossi; Alassani, Adebayo; Ahoui, Séraphin; Dovonou, Albert Comlan; Adoukonou, Thierry Armel; Dohou, Serge Hugues Mahougnon; Wanvoegbe, Armand; Agbodande, Anthelme

    2016-01-01

    Summary. Aim: This work aims to assess the knowledge, attitudes, and practices of general practitioners regarding the management of hypertension in Cotonou. Methods: This was a cross-sectional, descriptive study based on a multicentre survey conducted from 1 May 2011 to 31 July 2011. All volunteer general practitioners working in the private, public, and faith-based health centres of the city of Cotonou that authorized the study were recruited. The 7th report of the Joint National Committee (JNC7) was used as the reference standard for assessing the management of hypertensive patients. A tested and validated self-administered questionnaire addressed to the general practitioners was used to collect the data. Results: In total, 41 general practitioners from eight health facilities were included. Nearly half of the practitioners (48.8%) did not know the definition of hypertension. Only 25 practitioners (61.0%) could describe the conditions for measuring blood pressure. Ten practitioners (24.4%) were unable to list half of the tests in the minimum workup for hypertension. The majority (92.7%) were unfamiliar with the notion of global cardiovascular risk. The blood pressure target (BP ≤ 140/90 mmHg) was known by only 18 (43.9%) physicians. Lifestyle and dietary measures alone (82.9%) and monotherapy alone (70.7%) were the most commonly prescribed treatment modalities. The antihypertensive drug classes prescribed were mainly calcium channel blockers (82.9%), angiotensin-converting enzyme inhibitors (53.7%), and diuretics (36.6%). The practitioners referred hypertensive patients to cardiologists mainly for uncontrolled blood pressure (63.4%) and the onset of acute complications (56.1%). Conclusion: The general practitioners' knowledge of the management of

  16. L'Infection Nosocomiale en Reanimation des Brules

    PubMed Central

    Siah, S.; Belefqih, R.; Elouennass, M.; Fouadi, F.E.; Ihrai, I.

    2009-01-01

    Summary. Bacterial nosocomial infection being one of the main causes of morbidity and mortality in burn patients, we conducted a retrospective study of 84 patients hospitalized in the burns intensive care unit of the Mohammed V Military Teaching Hospital of Rabat over a three-year period, from 1 January 2001 to 31 December 2003. The criteria for nosocomial infection were those of the Center for Disease Control, Atlanta, 1988. Incidence rates were calculated. The infected population was compared with the non-infected population. The bacterial ecology of the unit was described, as was the antibiotic susceptibility profile. The study identified 87 nosocomial infections in 27 patients. The cumulative incidence was 103 infections per 1000 days of treatment. Regarding the characteristics of the bacterial infections, the infected sites were the skin (77%), the blood (13.8%), the urinary tract (8%), and the lungs (1.1%). The main organisms were Staphylococcus sp. (33.3%), Pseudomonas aeruginosa (23%), and Enterococcus faecalis and Acinetobacter (8%). The staphylococci were methicillin-resistant in 22.2% of cases. Pseudomonas and Acinetobacter were multiresistant (60%). In our study, the predictive factors for the occurrence of nosocomial infection retained after comparing the infected and non-infected populations were age, body mass index, abbreviated burn severity index, and initial fluid resuscitation. By isolating these parameters, we were able to establish an equation with predictive value for the occurrence of nosocomial infection in burn patients. PMID:21991158
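
    The abstract mentions a predictive equation built from these four factors without giving its form; purely as an illustration of how such an equation is commonly fitted (hypothetical data and scikit-learn logistic regression standing in for whatever method the authors used):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical predictors for 84 patients: age (yr), BMI, ABSI score,
        # initial fluid resuscitation (L); outcome: nosocomial infection yes/no
        rng = np.random.default_rng(3)
        n = 84
        X = np.column_stack([rng.normal(40, 15, n),    # age
                             rng.normal(25, 4, n),     # body mass index
                             rng.normal(6, 2, n),      # abbreviated burn severity index
                             rng.normal(8, 3, n)])     # initial fluid volume
        y = rng.integers(0, 2, n)

        model = LogisticRegression().fit(X, y)
        print("coefficients:", model.coef_[0], "intercept:", model.intercept_[0])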

  17. VR-CoDES and patient-centeredness. The intersection points between a measure and a concept.

    PubMed

    Del Piccolo, Lidia

    2017-11-01

    The Verona Coding Definitions of Emotional Sequences (VR-CoDES) system has been applied in a wide range of studies; in some of these, because of its attention to the healthcare provider's ability to respond to patient emotions, it has been used as a proxy for patient-centeredness. This paper discusses how the VR-CoDES can contribute to the broader concept of patient-centeredness, and what its limitations are. The VR-CoDES and the concept of patient-centeredness are briefly described, with the aim of identifying commonalities and distinctions. The VR-CoDES dimensions of explicit/non-explicit responding and providing or reducing space are analysed in relation to relevant aspects of patient-centred communication. Emotional aspects are encompassed within the patient-centeredness model, but they represent only one of the numerous dimensions that contribute to defining patient-centeredness, and explicit/non-explicit responding and providing or reducing space serve different functions during communication. The VR-CoDES can help operationalize the description of emotional aspects emerging in a consultation by leading coders to adopt a factual attitude in assessing how health providers react to patients' expressions of emotion. Further empirical work is needed to define which affective aspects and which dimensions of health provider responses are relevant and may contribute to patient-centeredness in different clinical settings. Copyright © 2017. Published by Elsevier B.V.

  18. IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haihua; Zou, Ling; Zhang, Hongbin

    2016-01-01

    This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton Krylov (JFNK) method, based on the similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated against the LOFT L3-1 test and benchmarked against RELAP5 results. RELAP-7 and RELAP5 produced almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models, owing to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate the advanced accumulator design.
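
    The JFNK idea, Newton iterations whose inner Krylov solves need only Jacobian-vector products approximated by finite differences, can be illustrated off the shelf; a minimal sketch using SciPy's newton_krylov on a toy nonlinear system (not RELAP-7's equations):

        import numpy as np
        from scipy.optimize import newton_krylov

        n = 50
        h = 1.0 / (n - 1)

        def residual(u):
            """Discretized Bratu-type problem u'' + exp(u) = 0, u(0) = u(1) = 0."""
            F = np.empty_like(u)
            F[0], F[-1] = u[0], u[-1]                    # Dirichlet boundaries
            F[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) + h**2 * np.exp(u[1:-1])
            return F

        # Newton outer iterations with Krylov inner solves; Jacobian-vector
        # products are approximated by finite differences, so no explicit
        # Jacobian is ever formed -- the separation of numerics from physics
        # that the paper highlights.
        u = newton_krylov(residual, np.zeros(n))
        print("max |F(u)| =", np.abs(residual(u)).max())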

  19. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2014-05-20

    Excerpts (fragmentary): ... but there can still be many recommendations generated. Therefore, the recommender results are displayed in a sortable table where each row is a ... reporting period. Since the synthesis graph can be complex and have many dependencies, the system must determine the order of evaluation of nodes, and ... validation failure, if any. 3.1. Automatic Feature Extraction: In many domains, causal models can often be more readily described as patterns of

  20. Croissance des couches minces et des multicouches de matériaux supraconducteurs H T_c : bilan et perspective

    NASA Astrophysics Data System (ADS)

    Contour, J. P.

    1994-11-01

    The main physical and chemical techniques for the epitaxial growth of high-T_c superconductor thin films are described, together with their in situ analysis facilities, and are discussed with respect to their cost, sophistication, and results (T_c, J_c, growth defects, thickness and composition uniformity, crystallinity, electronic applications...). Future trends for the growth machines are then examined in connection with the present results and with the development of superconductor electronics.