Sample records for model remains valid

  1. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus.

  2. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus. PMID:29755979

  3. Modeling mania in preclinical settings: a comprehensive review

    PubMed Central

    Sharma, Ajaykumar N.; Fries, Gabriel R.; Galvez, Juan F.; Valvassori, Samira S.; Soares, Jair C.; Carvalho, André F.; Quevedo, Joao

    2015-01-01

    The current pathophysiological understanding of mechanisms leading to onset and progression of bipolar manic episodes remains limited. At the same time, available animal models for mania have limited face, construct, and predictive validities. Additionally, these models fail to encompass recent pathophysiological frameworks of bipolar disorder (BD), e.g. neuroprogression. Therefore, there is a need to search for novel preclinical models for mania that could comprehensively address these limitations. Herein we review the history, validity, and caveats of currently available animal models for mania. We also review new genetic models for mania, namely knockout mice for genes involved in neurotransmission, synapse formation, and intracellular signaling pathways. Furthermore, we review recent trends in preclinical models for mania that may aid in the comprehension of mechanisms underlying the neuroprogressive and recurring nature of BD. In conclusion, the validity of animal models for mania remains limited. Nevertheless, novel (e.g. genetic) animal models as well as adaptation of existing paradigms hold promise. PMID:26545487

  4. Current Status of Simulation-based Training Tools in Orthopedic Surgery: A Systematic Review.

    PubMed

    Morgan, Michael; Aydin, Abdullatif; Salih, Alan; Robati, Shibby; Ahmed, Kamran

    To conduct a systematic review of orthopedic training and assessment simulators with reference to their level of evidence (LoE) and level of recommendation. Medline and EMBASE library databases were searched for English language articles published between 1980 and 2016, describing orthopedic simulators or validation studies of these models. All studies were assessed for LoE, and each model was subsequently awarded a level of recommendation using a modified Oxford Centre for Evidence-Based Medicine classification, adapted for education. A total of 76 articles describing orthopedic simulators met the inclusion criteria, 47 of which described at least 1 validation study. The most commonly identified models (n = 34) and validation studies (n = 26) were for knee arthroscopy. Construct validation was the most frequent validation study attempted by authors. In all, 62% (47 of 76) of the simulator studies described arthroscopy simulators, which also contained validation studies with the highest LoE. Orthopedic simulators are increasingly being subjected to validation studies, although the LoE of such studies generally remains low. There remains a lack of focus on nontechnical skills and on cost analyses of orthopedic simulators.

  5. Validating a Model of Effective Teaching Behaviour of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Maulana, Ridwan; Helms-Lorenz, Michelle; Van de Grift, Wim

    2017-01-01

    Although effective teaching behaviour is central for pupil outcomes, the extent to which pre-service teachers behave effectively in the classroom and how their behaviour relates to pupils' engagement remain unanswered. The present study aims to validate a theoretical model linking effective pre-service teaching behaviour and pupils' engagement,…

  6. Finding Furfural Hydrogenation Catalysts via Predictive Modelling

    PubMed Central

    Strassberger, Zea; Mooijman, Maurice; Ruijter, Eelco; Alberts, Albert H; Maldonado, Ana G; Orru, Romano V A; Rothenberg, Gadi

    2010-01-01

    We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes were synthesized and screened in the transfer hydrogenation of furfural to furfurol, with isopropyl alcohol as the hydrogen donor. The complexes gave varied yields, from 62% up to >99.9%, with no obvious structure/activity correlations. Control experiments proved that the carbene ligand remains coordinated to the ruthenium centre throughout the reaction. Deuterium-labelling studies showed a secondary isotope effect (kH:kD=1.5). Further mechanistic studies showed that this transfer hydrogenation follows the so-called monohydride pathway. Using these data, we built a predictive model for 13 of the catalysts, based on 2D and 3D molecular descriptors. We tested and validated the model using the remaining five catalysts (cross-validation, R2=0.913). Then, with this model, the conversion and selectivity were predicted for four completely new ruthenium-carbene complexes. These four catalysts were then synthesized and tested. The results were within 3% of the model's predictions, demonstrating the validity and value of predictive modelling in catalyst optimization. PMID:23193388

  7. Development and validation of a predictive model for excessive postpartum blood loss: A retrospective, cohort study.

    PubMed

    Rubio-Álvarez, Ana; Molina-Alarcón, Milagros; Arias-Arias, Ángel; Hernández-Martínez, Antonio

    2018-03-01

    Postpartum haemorrhage is one of the leading causes of maternal morbidity and mortality worldwide. Despite the use of uterotonic agents as a preventive measure, it remains a challenge to identify those women who are at increased risk of postpartum bleeding. The aim was to develop and validate a predictive model to assess the risk of excessive bleeding in women with vaginal birth. This was a retrospective cohort study conducted at "Mancha-Centro Hospital" (Spain). The predictive model was built on a derivation cohort of 2336 women between 2009 and 2011; for validation purposes, a prospective cohort of 953 women between 2013 and 2014 was employed. Women with antenatal fetal demise, multiple pregnancies and gestations under 35 weeks were excluded. We used multivariate analysis with binary logistic regression, ridge regression and areas under the Receiver Operating Characteristic curves to determine the predictive ability of the proposed model. There were 197 (8.43%) women with excessive bleeding in the derivation cohort and 63 (6.61%) women in the validation cohort. Predictive factors in the final model were: maternal age, primiparity, duration of the first and second stages of labour, neonatal birth weight and antepartum haemoglobin levels. The predictive ability of this model in the derivation cohort was 0.90 (95% CI: 0.85-0.93), while it remained 0.83 (95% CI: 0.74-0.92) in the validation cohort. This predictive model proved to have excellent predictive ability in the derivation cohort, and its validation in a later population equally showed good ability for prediction. This model can be employed to identify women with a higher risk of postpartum haemorrhage.
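
    A minimal sketch of the derivation/validation pattern this record describes, using scikit-learn. The file names and column names are hypothetical placeholders, not the study's data.

    ```python
    # Hypothetical sketch: fit a logistic regression on a derivation cohort
    # and report discrimination (ROC AUC) in derivation and validation cohorts.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    predictors = ["maternal_age", "primiparous", "first_stage_min",
                  "second_stage_min", "birth_weight_g", "antepartum_hb"]

    derivation = pd.read_csv("derivation_2009_2011.csv")   # n = 2336 (per abstract)
    validation = pd.read_csv("validation_2013_2014.csv")   # n = 953 (per abstract)

    # The L2 penalty stands in for the ridge-style shrinkage the abstract mentions.
    model = LogisticRegression(penalty="l2", max_iter=1000)
    model.fit(derivation[predictors], derivation["excessive_bleeding"])

    for name, cohort in [("derivation", derivation), ("validation", validation)]:
        auc = roc_auc_score(cohort["excessive_bleeding"],
                            model.predict_proba(cohort[predictors])[:, 1])
        print(f"{name} AUC: {auc:.2f}")
    ```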

  8. Evaluation of a Computational Model of Situational Awareness

    NASA Technical Reports Server (NTRS)

    Burdick, Mark D.; Shively, R. Jay; Rutkewski, Michael (Technical Monitor)

    2000-01-01

    Although the use of the psychological construct of situational awareness (SA) assists researchers in creating a flight environment that is safer and more predictable, its true potential remains untapped until a valid means of predicting SA a priori becomes available. Previous work proposed a computational model of SA (CSA) that sought to fill that void. The current line of research is aimed at validating that model. The results show that the model accurately predicted SA in a piloted simulation.

  9. Improving the Validity of Activity of Daily Living Dependency Risk Assessment

    PubMed Central

    Clark, Daniel O.; Stump, Timothy E.; Tu, Wanzhu; Miller, Douglas K.

    2015-01-01

    Objectives: Efforts to prevent activity of daily living (ADL) dependency may be improved through models that assess older adults’ dependency risk. We evaluated whether cognition and gait speed measures improve the predictive validity of interview-based models. Method: Participants were 8,095 self-respondents in the 2006 Health and Retirement Survey who were aged 65 years or over and independent in five ADLs. Incident ADL dependency was determined from the 2008 interview. Models were developed using a random two-thirds cohort and validated in the remaining one-third. Results: Compared to a c-statistic of 0.79 in the best interview model, the model including cognitive measures had c-statistics of 0.82 and 0.80, while the best fitting gait speed model had c-statistics of 0.83 and 0.79 in the development and validation cohorts, respectively. Conclusion: Two relatively brief models, one that requires an in-person assessment and one that does not, had excellent validity for predicting incident ADL dependency but did not significantly improve the predictive validity of the best fitting interview-based models. PMID:24652867

  10. Validation workflow for a clinical Bayesian network model in multidisciplinary decision making in head and neck oncology treatment.

    PubMed

    Cypko, Mario A; Stoehr, Matthaeus; Kozniewski, Marcin; Druzdzel, Marek J; Dietz, Andreas; Berliner, Leonard; Lemke, Heinz U

    2017-11-01

    Oncological treatment is becoming increasingly complex, and therefore decision making in multidisciplinary teams is becoming the key activity in clinical pathways. The increased complexity is related to the number and variability of possible treatment decisions that may be relevant to a patient. In this paper, we describe validation of a multidisciplinary cancer treatment decision in the clinical domain of head and neck oncology. Probabilistic graphical models and corresponding inference algorithms, in the form of Bayesian networks (BNs), can support complex decision-making processes by providing mathematically reproducible and transparent advice. The quality of BN-based advice depends on the quality of the model. Therefore, it is vital to validate the model before it is applied in practice. For an example BN subnetwork of laryngeal cancer with 303 variables, we evaluated 66 patient records. To validate the model on this dataset, a validation workflow was applied in combination with quantitative and qualitative analyses. In the subsequent analyses, we observed four sources of imprecise predictions: incorrect data, incomplete patient data, outvoting relevant observations, and an incorrect model. Finally, the four problems were solved by modifying the data and the model. The presented validation effort is related to the model complexity. For simpler models, the validation workflow is the same, although it may require fewer validation methods. The validation success is related to the model's well-founded knowledge base. The remaining laryngeal cancer model may disclose additional sources of imprecise predictions.

  11. Empirical Refinements of a Molecular Genetics Learning Progression: The Molecular Constructs

    ERIC Educational Resources Information Center

    Todd, Amber; Kenyon, Lisa

    2016-01-01

    This article describes revisions to four of the eight constructs of the Duncan molecular genetics learning progression [Duncan, Rogat, & Yarden, 2009]. As learning progressions remain hypothetical models until validated by multiple rounds of empirical studies, these revisions are an important step toward validating the progression. Our…

  12. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I² statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results: Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I² statistics and prediction intervals for c-statistics. Conclusion: This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
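
    The leave-one-hospital-out ("internal-external") validation loop described above can be sketched as follows. This is a hypothetical illustration: the data file, column names and the choice of logistic regression are assumptions, and the meta-analytic pooling step is only indicated in a comment.

    ```python
    # Hypothetical sketch of leave-one-hospital-out ("internal-external")
    # validation; data file and column names are invented.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    df = pd.read_csv("heart_failure_cohort.csv")      # one row per patient
    predictors = ["age", "sbp", "sodium", "creatinine"]

    aucs = []
    for hospital in df["hospital_id"].unique():
        derive = df[df["hospital_id"] != hospital]    # all other hospitals
        test = df[df["hospital_id"] == hospital]      # held-out hospital
        model = LogisticRegression(max_iter=1000)
        model.fit(derive[predictors], derive["death_1y"])
        # Assumes each held-out hospital has both outcomes; otherwise skip it.
        aucs.append(roc_auc_score(test["death_1y"],
                                  model.predict_proba(test[predictors])[:, 1]))

    # Hospital-specific c-statistics would then be pooled with a
    # random-effects meta-analysis to quantify geographic transportability.
    print(f"median c-statistic: {np.median(aucs):.2f}")
    ```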

  13. Beware of external validation! - A Comparative Study of Several Validation Techniques used in QSAR Modelling.

    PubMed

    Majumdar, Subhabrata; Basak, Subhash C

    2018-04-26

    Proper validation is an important aspect of QSAR modelling. External validation is one of the most widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but small n (i.e. n < p). Motivated by recent evidence of the inadequacies of external validation in estimating the true predictive capability of a statistical model, this paper performs an extensive and comparative study of this method with several other validation techniques. We compared four validation methods: leave-one-out (LOO), K-fold, external and multi-split validation, using statistical models built with LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics have high variation among different random splits of the data, and hence are not recommended for predictive QSAR models. LOO has the overall best performance among all validation methods applied in our scenario. Results from external validation are too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data.
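
    A minimal simulation in the spirit of this comparison, contrasting leave-one-out with repeated external (hold-out) validation for a LASSO model on an n < p dataset. The data are synthetic and the settings are illustrative, not those of the paper.

    ```python
    # Synthetic n < p example contrasting leave-one-out (LOO) with repeated
    # external (hold-out) validation for a LASSO model.
    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.metrics import r2_score
    from sklearn.model_selection import LeaveOneOut, cross_val_predict, train_test_split

    rng = np.random.default_rng(0)
    n, p = 95, 500                    # small n, large p, as in typical QSAR data
    X = rng.normal(size=(n, p))
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n)

    lasso = Lasso(alpha=0.1)

    # LOO: every sample is predicted by a model trained on all the others.
    loo_pred = cross_val_predict(lasso, X, y, cv=LeaveOneOut())
    print(f"LOO q2: {r2_score(y, loo_pred):.2f}")

    # External validation: repeat random 75/25 splits and watch the metric vary.
    scores = []
    for seed in range(20):
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=seed)
        scores.append(r2_score(yte, lasso.fit(Xtr, ytr).predict(Xte)))
    print(f"external R2: mean {np.mean(scores):.2f}, sd {np.std(scores):.2f}")
    ```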

  14. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    PubMed

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly, and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides, and more particularly caesium-137 (137Cs) and excess lead-210 (210Pb(ex)), has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from 137Cs and 210Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape. Such work could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate, as well as the assumptions of the models employed.

  15. The Twin-Cycle Experiential Learning Model: Reconceptualising Kolb's Theory

    ERIC Educational Resources Information Center

    Bergsteiner, Harald; Avery, Gayle C.

    2014-01-01

    Experiential learning styles remain popular despite criticisms about their validity, usefulness, fragmentation and poor definitions and categorisation. After examining four prominent models and building on Bergsteiner, Avery, and Neumann's suggestion of a dual cycle, this paper proposes a twin-cycle experiential learning model to overcome…

  16. Surrogates for numerical simulations; optimization of eddy-promoter heat exchangers

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1993-01-01

    Although the advent of fast and inexpensive parallel computers has rendered numerous previously intractable calculations feasible, many numerical simulations remain too resource-intensive to be directly inserted in engineering optimization efforts. An attractive alternative to direct insertion considers models for computational systems: the expensive simulation is invoked only to construct and validate a simplified input-output model, which then serves as a simulation surrogate in subsequent engineering optimization studies. A simple 'Bayesian-validated' statistical framework for the construction, validation, and purposive application of static computer simulation surrogates is presented. As an example, dissipation-transport optimization of laminar-flow eddy-promoter heat exchangers is considered: parallel spectral element Navier-Stokes calculations serve to construct and validate surrogates for the flowrate and Nusselt number; these surrogates then represent the originating Navier-Stokes equations in the ensuing design process.

  17. QSAR modeling of GPCR ligands: methodologies and examples of applications.

    PubMed

    Tropsha, A; Wang, S X

    2006-01-01

    GPCR ligands represent not only one of the major classes of current drugs but also a major continuing source of novel potent pharmaceutical agents. Because 3D structures of GPCRs as determined by experimental techniques are still unavailable, ligand-based drug discovery methods remain the major computational molecular modeling approaches to the analysis of growing data sets of tested GPCR ligands. This paper presents an overview of modern Quantitative Structure-Activity Relationship (QSAR) modeling. We discuss the critical issue of model validation and the strategy for applying successfully validated QSAR models to virtual screening of available chemical databases. We present several examples of applications of validated QSAR modeling approaches to GPCR ligands. We conclude with comments on exciting developments in the QSAR modeling of GPCR ligands that focus on the study of emerging data sets of compounds with dual or even multiple activities against two or more GPCRs.

  18. Towards practical application of sensors for monitoring animal health; design and validation of a model to detect ketosis.

    PubMed

    Steensels, Machteld; Maltz, Ephraim; Bahr, Claudia; Berckmans, Daniel; Antler, Aharon; Halachmi, Ilan

    2017-05-01

    The objective of this study was to design and validate a mathematical model to detect post-calving ketosis. The validation was conducted in four commercial dairy farms in Israel, on a total of 706 multiparous Holstein dairy cows: 203 cows clinically diagnosed with ketosis and 503 healthy cows. A logistic binary regression model was developed, where the dependent variable is categorical (healthy/diseased) and a set of explanatory variables were measured with existing commercial sensors: rumination duration, activity and milk yield of each individual cow. In a first validation step (within-farm), the model was calibrated on the database of each farm separately. Two-thirds of the sick cows and an equal number of healthy cows were randomly selected for model validation. The remaining one-third of the cows, which did not participate in the model validation, were used for model calibration. In order to overcome the random selection effect, this procedure was repeated 100 times. In a second (between-farms) validation step, the model was calibrated on one farm and validated on another farm. Within-farm accuracy, ranging from 74 to 79%, was higher than between-farm accuracy, ranging from 49 to 72%, in all farms. The within-farm sensitivities ranged from 78 to 90%, and specificities ranged from 71 to 74%. The between-farms sensitivities ranged from 65 to 95%. The developed model can be improved in future research by adding other explanatory variables or by exploring other models to achieve greater sensitivity and specificity.

  19. Exploring the Validity of Proposed Transgenic Animal Models of Attention-Deficit Hyperactivity Disorder (ADHD).

    PubMed

    de la Peña, June Bryan; Dela Peña, Irene Joy; Custodio, Raly James; Botanas, Chrislean Jun; Kim, Hee Jin; Cheong, Jae Hoon

    2018-05-01

    Attention-deficit/hyperactivity disorder (ADHD) is a common, heterogeneous behavioral neurodevelopmental condition characterized by hyperactivity, impulsivity, and inattention. Symptoms of this disorder are managed by treatment with methylphenidate, amphetamine, and/or atomoxetine. The cause of ADHD is unknown, but substantial evidence indicates that this disorder has a significant genetic component. Transgenic animals have become an essential tool in uncovering the genetic factors underlying ADHD. Although they cannot accurately reflect the human condition, they can provide insights into the disorder that cannot be obtained from human studies due to various limitations. An ideal animal model of ADHD must have face (similarity in symptoms), predictive (similarity in response to treatment or medications), and construct (similarity in etiology or underlying pathophysiological mechanism) validity. As the exact etiology of ADHD remains unclear, the construct validity of animal models of ADHD will always be limited. The proposed transgenic animal models of ADHD have substantially increased and diversified over the years. In this paper, we compiled and explored the validity of proposed transgenic animal models of ADHD. Each of the reviewed transgenic animal models has strengths and limitations. Some fulfill most of the validity criteria of an animal model of ADHD and have been extensively used, while others require further validation. Nevertheless, these transgenic animal models of ADHD have provided and will continue to provide valuable insights into the genetic underpinnings of this complex disorder.

  20. Towards a model-based patient selection strategy for proton therapy: External validation of photon-derived Normal Tissue Complication Probability models in a head and neck proton therapy cohort

    PubMed Central

    Blanchard, P; Wong, AJ; Gunn, GB; Garden, AS; Mohamed, ASR; Rosenthal, DI; Crutison, J; Wu, R; Zhang, X; Zhu, XR; Mohan, R; Amin, MV; Fuller, CD; Frank, SJ

    2017-01-01

    Objective: To externally validate head and neck cancer (HNC) photon-derived normal tissue complication probability (NTCP) models in patients treated with proton beam therapy (PBT). Methods: This prospective cohort consisted of HNC patients treated with PBT at a single institution. NTCP models were selected based on the availability of data for validation and evaluated using the leave-one-out cross-validated area under the curve (AUC) for the receiver operating characteristic curve. Results: 192 patients were included. The most prevalent tumor site was oropharynx (n=86, 45%), followed by sinonasal (n=28), nasopharyngeal (n=27) and parotid (n=27) tumors. Apart from the prediction of acute mucositis (reduction of AUC of 0.17), the models overall performed well. The validation (PBT) AUC and the published AUC were, respectively, 0.90 versus 0.88 for feeding tube 6 months post-PBT; 0.70 versus 0.80 for physician-rated dysphagia 6 months post-PBT; 0.70 versus 0.80 for dry mouth 6 months post-PBT; and 0.73 versus 0.85 for hypothyroidism 12 months post-PBT. Conclusion: While a drop in NTCP model performance was expected in PBT patients, the models showed robustness and remained valid. Further work is warranted, but these results support the validity of the model-based approach for treatment selection for HNC patients. PMID:27641784

  1. Early Detection of Increased Intracranial Pressure Episodes in Traumatic Brain Injury: External Validation in an Adult and in a Pediatric Cohort.

    PubMed

    Güiza, Fabian; Depreitere, Bart; Piper, Ian; Citerio, Giuseppe; Jorens, Philippe G; Maas, Andrew; Schuhmann, Martin U; Lo, Tsz-Yan Milly; Donald, Rob; Jones, Patricia; Maier, Gottlieb; Van den Berghe, Greet; Meyfroidt, Geert

    2017-03-01

    A model for early detection of episodes of increased intracranial pressure in traumatic brain injury patients has been previously developed and validated based on retrospective adult patient data from the multicenter Brain-IT database. The purpose of the present study is to validate this early detection model in different cohorts of recently treated adult and pediatric traumatic brain injury patients. Design: prognostic modeling; noninterventional, observational, retrospective study. The adult validation cohort comprised recent traumatic brain injury patients from San Gerardo Hospital in Monza (n = 50), Leuven University Hospital (n = 26), Antwerp University Hospital (n = 19), Tübingen University Hospital (n = 18), and Southern General Hospital in Glasgow (n = 8). The pediatric validation cohort comprised patients from neurosurgical and intensive care centers in Edinburgh and Newcastle (n = 79). No interventions were performed. The model's performance was evaluated with respect to discrimination, calibration, overall performance, and clinical usefulness. In the recent adult validation cohort, the model retained excellent performance, as in the original study. In the pediatric validation cohort, the model retained good discrimination and a positive net benefit, albeit with a performance drop in the remaining criteria. The obtained external validation results confirm the robustness of the model to predict future increased intracranial pressure events 30 minutes in advance, in adult and pediatric traumatic brain injury patients. These results are a large step toward an early warning system for increased intracranial pressure that can be generally applied. Furthermore, the sparseness of this model, which uses only two routinely monitored signals as inputs (intracranial pressure and mean arterial blood pressure), is an additional asset.

  2. Easy and low-cost identification of metabolic syndrome in patients treated with second-generation antipsychotics: artificial neural network and logistic regression models.

    PubMed

    Lin, Chao-Cheng; Bai, Ya-Mei; Chen, Jen-Yeu; Hwang, Tzung-Jeng; Chen, Tzu-Ting; Chiu, Hung-Wen; Li, Yu-Chuan

    2010-03-01

    Metabolic syndrome (MetS) is an important side effect of second-generation antipsychotics (SGAs). However, many SGA-treated patients with MetS remain undetected. In this study, we trained and validated artificial neural network (ANN) and multiple logistic regression models without biochemical parameters to rapidly identify MetS in patients receiving SGA treatment. A total of 383 patients with a diagnosis of schizophrenia or schizoaffective disorder (DSM-IV criteria) treated with SGAs for more than 6 months were investigated to determine whether they met the MetS criteria according to the International Diabetes Federation. The data for these patients were collected between March 2005 and September 2005. The input variables of the ANN and logistic regression were limited to demographic and anthropometric data only. All models were trained by randomly selecting two-thirds of the patient data and were internally validated with the remaining one-third of the data. The models were then externally validated with data from 69 patients from another hospital, collected between March 2008 and June 2008. The area under the receiver operating characteristic curve (AUC) was used to measure the performance of all models. Both the final ANN and logistic regression models had high accuracy (88.3% vs 83.6%), sensitivity (93.1% vs 86.2%), and specificity (86.9% vs 83.8%) in identifying MetS in the internal validation set. The mean ± SD AUC was high for both the ANN and logistic regression models (0.934 ± 0.033 vs 0.922 ± 0.035, P = .63). During external validation, a high AUC was still obtained for both models. Waist circumference and diastolic blood pressure were the common variables retained in both the final ANN and logistic regression models. Our study developed accurate ANN and logistic regression models to detect MetS in patients receiving SGA treatment. The models are likely to provide a noninvasive tool for large-scale screening of MetS in this group of patients.
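
    A sketch of the two screening models compared in this record: an artificial neural network and a logistic regression trained on demographic and anthropometric inputs only. File and column names are hypothetical, and the network architecture is an assumption rather than the one used in the study.

    ```python
    # Hypothetical sketch: ANN vs logistic regression for MetS screening from
    # demographic/anthropometric inputs only (assumed numerically coded).
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    df = pd.read_csv("sga_patients.csv")              # invented file name
    features = ["age", "sex", "weight_kg", "bmi", "waist_circ",
                "systolic_bp", "diastolic_bp"]
    X, y = df[features], df["metabolic_syndrome"]

    # Train on a random two-thirds, internally validate on the remaining third.
    Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=2 / 3,
                                          stratify=y, random_state=0)

    models = {"ANN": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000),
              "LR": LogisticRegression(max_iter=1000)}
    for name, model in models.items():
        model.fit(Xtr, ytr)
        auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
        print(f"{name} internal-validation AUC: {auc:.3f}")
    ```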

  3. Validation of Bioreactor and Human-on-a-Chip Devices for Chemical Safety Assessment.

    PubMed

    Rebelo, Sofia P; Dehne, Eva-Maria; Brito, Catarina; Horland, Reyk; Alves, Paula M; Marx, Uwe

    2016-01-01

    Equipment and device qualification and test assay validation in the field of tissue-engineered human organs for substance assessment remain formidable tasks, with only a few successful examples so far. The hurdles seem to increase with the growing complexity of the biological systems emulated by the respective models. Controlled single tissue or organ culture in bioreactors improves organ-specific functions and maintains their phenotypic stability for longer periods of time. The reproducibility attained with bioreactor operations is, per se, an advantage for the validation of safety assessment. Regulatory agencies have gradually altered the validation concept from exhaustive "product" characterization to rigorous and detailed process characterization, valuing reproducibility as a standard for validation. "Human-on-a-chip" technologies, which apply micro-physiological systems to the in vitro combination of miniaturized human organ equivalents into functional human micro-organisms, are nowadays thought to be the most elaborate solution created to date. They target the replacement of the current most complex models: laboratory animals. Therefore, we provide here a road map towards the validation of such "human-on-a-chip" models and qualification of their respective bioreactor and microchip equipment, along a path currently used for the respective animal models.

  4. Developing a Self-Scoring Comprehensive Instrument to Measure Rest's Four-Component Model of Moral Behavior: The Moral Skills Inventory.

    PubMed

    Chambers, David W

    2011-01-01

    One of the most extensively studied constructs in dental education is the four-component model of moral behavior proposed by James Rest and the set of instruments for measuring it developed by Rest, Muriel Bebeau, and others. Although significant associations have been identified between the four components Rest proposed (called here Moral Sensitivity, Moral Reasoning, Moral Integrity, and Moral Courage) and dental ethics courses and practitioners with disciplined licenses, there is no single instrument that measures all four components, and existing single-component instruments require professional scoring. This article describes the development and validation of a short, self-scoring instrument, the Moral Skills Inventory, that measures all four components. Evidence of face validity, test/retest reliability, and concurrent convergent and divergent predictive validity is demonstrated in three populations: dental students, clinical dental faculty members, and regents and officers of the American College of Dentists. Significant issues remain in developing the Rest four-component model for use in dental education and practice. Specifically, further construct validation research is needed to understand the nature of the components. In particular, it remains undetermined whether moral constructs are characteristics of individuals that drive behavior in specific situations or whether particular patterns of moral behavior, learned and used in response to individual circumstances, are summarized by researchers and then imputed to practitioners.

  5. Use of Latent Class Analysis to define groups based on validity, cognition, and emotional functioning.

    PubMed

    Morin, Ruth T; Axelrod, Bradley N

    Latent Class Analysis (LCA) was used to classify a heterogeneous sample of neuropsychology data. In particular, we used measures of performance validity, symptom validity, cognition, and emotional functioning to assess and describe latent groups of functioning in these areas. A dataset of 680 neuropsychological evaluation protocols was analyzed using LCA. Data were collected from evaluations performed for clinical purposes at an urban medical center. A four-class model emerged as the best fitting model of latent classes. The resulting classes were distinct based on measures of performance validity and symptom validity. Class A performed poorly on both performance and symptom validity measures. Class B had intact performance validity and heightened symptom reporting. The remaining two classes performed adequately on both performance and symptom validity measures, differing only in cognitive and emotional functioning. In general, performance invalidity was associated with worse cognitive performance, while symptom invalidity was associated with elevated emotional distress. LCA appears useful in identifying groups within a heterogeneous sample with distinct performance patterns. Further, the orthogonal nature of performance and symptom validities is supported.

  6. Modern modeling techniques had limited external validity in predicting mortality from traumatic brain injury.

    PubMed

    van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W

    2016-10-01

    Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs) and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC, 0.757), followed by RF and support vector machine models (median validated AUC, 0.735 and 0.732, respectively). With each predictor set, the classification and regression tree models showed poor performance (median validated AUC, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial.
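
    The cross-study external validation scheme (each study used once for development, then validated on the 14 others) can be sketched as a double loop. Everything below (file names, predictor columns, outcome label) is hypothetical, and only the logistic regression from the five compared techniques is shown.

    ```python
    # Hypothetical sketch of cross-study validation: develop on one study,
    # validate on each of the remaining studies.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    core = ["age", "motor_score", "pupil_reactivity"]    # core predictor set
    study_names = [f"study_{i:02d}" for i in range(1, 16)]
    studies = {name: pd.read_csv(f"{name}.csv") for name in study_names}

    results = []
    for dev_name, dev in studies.items():
        model = LogisticRegression(max_iter=1000).fit(dev[core], dev["mort_6m"])
        for val_name, val in studies.items():
            if val_name == dev_name:
                continue                                 # skip the development set
            auc = roc_auc_score(val["mort_6m"],
                                model.predict_proba(val[core])[:, 1])
            results.append((dev_name, val_name, auc))    # 15 x 14 validations

    print(pd.DataFrame(results, columns=["dev", "val", "auc"])["auc"].median())
    ```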

  7. Validating a mouse model of ovarian cancer for early detection through imaging | Division of Cancer Prevention

    Cancer.gov

    Despite advances in treatment strategies, ovarian cancer remains the deadliest gynecological malignancy and the 5th largest cancer killer in women. Located deep in the body, with few early symptoms and no effective screening technique, ovarian cancer has remained stubbornly difficult to understand, much less effectively combat. Ovarian cancer is almost always discovered at an advanced stage.

  8. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system-level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has long been the norm for thermal and structural V&V. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated that such an approach is possible.

  9. UT simulation using a fully automated 3D hybrid model: Application to planar backwall breaking defects inspection

    NASA Astrophysics Data System (ADS)

    Imperiale, Alexandre; Chatillon, Sylvain; Darmon, Michel; Leymarie, Nicolas; Demaldent, Edouard

    2018-04-01

    The high-frequency models gathered in the CIVA software allow fast computations and provide satisfactory quantitative predictions in a wide range of situations. However, the domain of validity of these models is limited, since they do not accurately predict the ultrasound response in configurations involving subwavelength complex phenomena. In addition, when modelling the inspection of backwall-breaking defects, an important challenge remains to capture the propagation of the creeping waves that are generated at the critical angle. Hybrid models combining numerical and asymptotic methods have already been shown to be an effective strategy to overcome these limitations in 2D [1]. However, 3D simulations remain a crucial issue for industrial applications because of the computational cost of the numerical solver. A dedicated three-dimensional high-order finite element model combined with a domain decomposition method has been recently proposed to tackle 3D limitations [2]. In this communication, we focus on the specific case of planar backwall-breaking defects, with an adapted coupling strategy in order to efficiently model the propagation of creeping waves. Numerical and experimental validations are proposed on various configurations.

  10. A dynamic multi-scale Markov model based methodology for remaining life prediction

    NASA Astrophysics Data System (ADS)

    Yan, Jihong; Guo, Chaozhong; Wang, Xing

    2011-05-01

    The ability to accurately predict the remaining life of partially degraded components is crucial in prognostics. In this paper, a performance degradation index is designed using multi-feature fusion techniques to represent the deterioration severity of facilities. Based on this indicator, an improved Markov model is proposed for remaining life prediction. The Fuzzy C-Means (FCM) algorithm is employed to perform state division for the Markov model in order to avoid the uncertainty of state division caused by the hard division approach. Considering the influence of both historical and real-time data, a dynamic prediction method is introduced into the Markov model via a weighted coefficient. Multi-scale theory is employed to solve the state division problem of multi-sample prediction. Consequently, a dynamic multi-scale Markov model is constructed. An experiment based on a Bently-RK4 rotor testbed was designed to validate the dynamic multi-scale Markov model; the experimental results illustrate the effectiveness of the methodology.
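
    The core Markov idea (discretized degradation states, an estimated transition matrix, and an expected time to absorption in the failure state) can be sketched in a few lines. The state sequence below is invented, and the sketch omits the paper's FCM state division, multi-feature fusion and dynamic weighting.

    ```python
    # Illustrative Markov remaining-life sketch: estimate a transition matrix
    # from a discretized degradation-state sequence, then compute the expected
    # number of steps to reach the absorbing failure state.
    import numpy as np

    states = np.array([0, 0, 0, 1, 1, 0, 1, 2, 1, 2, 2, 3, 2, 3, 3, 4])
    n = states.max() + 1                      # state 4 = failure (absorbing)

    counts = np.zeros((n, n))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1                     # tally observed transitions
    counts[-1, -1] = 1.0                      # make the failure state absorbing
    P = counts / counts.sum(axis=1, keepdims=True)

    # Expected hitting times for an absorbing chain: t = (I - Q)^-1 * 1,
    # where Q is the transient-to-transient block of P.
    Q = P[:-1, :-1]
    t = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    print("expected remaining life (steps) from states 0-3:", np.round(t, 1))
    ```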

  11. An atomic model of brome mosaic virus using direct electron detection and real-space optimization.

    PubMed

    Wang, Zhao; Hryc, Corey F; Bammes, Benjamin; Afonine, Pavel V; Jakana, Joanita; Chen, Dong-Hua; Liu, Xiangan; Baker, Matthew L; Kao, Cheng; Ludtke, Steven J; Schmid, Michael F; Adams, Paul D; Chiu, Wah

    2014-09-04

    Advances in electron cryo-microscopy have enabled structure determination of macromolecules at near-atomic resolution. However, structure determination, even using de novo methods, remains susceptible to model bias and overfitting. Here we describe a complete workflow for data acquisition, image processing, all-atom modelling and validation of brome mosaic virus, an RNA virus. Data were collected with a direct electron detector in integrating mode and an exposure beyond the traditional radiation damage limit. The final density map has a resolution of 3.8 Å as assessed by two independent data sets and maps. We used the map to derive an all-atom model with a newly implemented real-space optimization protocol. The validity of the model was verified by its match with the density map and a previous model from X-ray crystallography, as well as the internal consistency of models from independent maps. This study demonstrates a practical approach to obtain a rigorously validated atomic resolution electron cryo-microscopy structure.

  12. Prior Study of Cross-Cultural Validation of McGill Quality-of-Life Questionnaire in Mainland Mandarin Chinese Patients With Cancer.

    PubMed

    Hu, Liya; Li, Jingwen; Wang, Xu; Payne, Sheila; Chen, Yuan; Mei, Qi

    2015-11-01

    The validity of the McGill quality-of-life questionnaire (MQOLQ), which had already been used in multicultural palliative care settings including Hong Kong and Taiwan, remained unknown in mainland China. Eligible patients completed the translated Chinese version of the McGill questionnaire (MQOL-C), which had been examined before the study. Construct validity was preliminarily assessed through exploratory factor analysis, which extracted 4 factors constituting a new hypothesized model; confirmatory factor analysis then showed that the original model was better supported. Internal consistency of all the subscales was within 0.582 to 0.917. Furthermore, test-retest reliability ranged from 0.509 to 0.859, as determined by the Spearman rank correlation coefficient. Face validity and feasibility also confirmed the good validity of the MQOL-C. The MQOL-C shows satisfactory validity in mainland Chinese patients with cancer, although cultural differences should be considered when using it.

  13. Object-Oriented Modeling of an Energy Harvesting System Based on Thermoelectric Generators

    NASA Astrophysics Data System (ADS)

    Nesarajah, Marco; Frey, Georg

    This paper deals with the modeling of an energy harvesting system based on thermoelectric generators (TEGs), and the validation of the model by means of a test bench. TEGs are capable of improving the overall energy efficiency of energy systems, e.g. combustion engines or heating systems, by using the remaining waste heat to generate electrical power. Previously, a component-oriented model of the TEG itself was developed in the Modelica® language. With this model, any TEG can be described and simulated given the material properties and the physical dimensions. Now, this model has been extended with the surrounding components into a complete model of a thermoelectric energy harvesting system. In addition to the TEG, the model contains the cooling system, the heat source, and the power electronics. To validate the simulation model, a test bench was built and installed on an oil-fired household heating system. The paper reports results of the measurements and discusses the validity of the developed simulation models. Furthermore, the efficiency of the proposed energy harvesting system is derived, and possible improvements based on design variations tested in the simulation model are proposed.

  14. HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models

    NASA Astrophysics Data System (ADS)

    Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.

    2016-03-01

    A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.

  15. HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models

    NASA Astrophysics Data System (ADS)

    Melsen, L. A.; Teuling, A. J.; Torfs, P. J. J. F.; Uijlenhoet, R.; Mizukami, N.; Clark, M. P.

    2015-12-01

    A meta-analysis on 192 peer-reviewed articles reporting applications of the Variable Infiltration Capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.

  16. Validation of a computational knee joint model using an alignment method for the knee laxity test and computed tomography.

    PubMed

    Kang, Kyoung-Tak; Kim, Sung-Hwan; Son, Juhyun; Lee, Young Han; Koh, Yong-Gon

    2017-01-01

    Computational models have been identified as efficient techniques in the clinical decision-making process. However, in most previous studies computational models were validated using published data, and the kinematic validation of such models still remains a challenge. Recently, studies using medical imaging have provided a more accurate visualization of knee joint kinematics. The purpose of the present study was to perform kinematic validation of a subject-specific computational knee joint model by comparison with the subject's medical imaging under identical laxity conditions. The laxity test was applied in the anterior-posterior drawer under 90° flexion and in varus-valgus under 20° flexion, with a series of stress radiographs, a Telos device, and computed tomography. The loading condition in the computational subject-specific knee joint model was identical to the laxity test condition in the medical image. Our computational model showed knee laxity kinematic trends that were consistent with the computed tomography images, except for negligible differences because of the indirect application of the subject's in vivo material properties. Medical imaging based on computed tomography with the laxity test allowed us to measure not only the precise translation but also the rotation of the knee joint. This methodology will be beneficial in the validation of laxity tests for subject- or patient-specific computational models.

  17. Assessing the Current Status of Atmospheric Radiation Modelling: Progress, Challenges and the Needs for the Next Generation of Models

    NASA Astrophysics Data System (ADS)

    Joyce, C. J.; Tobiska, W. K.; Copeland, K.; Smart, D. F.; Shea, M. A.; Nowicki, S.; Atwell, W.; Benton, E. R.; Wilkins, R.; Hands, A.; Gronoff, G.; Meier, M. M.; Schwadron, N.

    2017-12-01

    Despite its potential for causing a wide range of harmful effects, including health hazards to airline passengers and damage to aircraft and satellite electronics, atmospheric radiation remains a relatively poorly defined risk, lacking sufficient measurements and modelling to fully evaluate the dangers posed. While our reliance on airline travel has increased dramatically over time, there remains an absence of international guidance and standards to protect aircraft passengers from potential health impacts due to radiation exposure. This subject has been gaining traction within the scientific community in recent years, with an expanding number of models with increasing capabilities being made available to evaluate atmospheric radiation hazards. We provide a general description of these modelling efforts, including the physics and methods used by the models, as well as their data inputs and outputs. We also discuss the current capacity for model validation via measurements and discuss the needs for the next generation of models, both in terms of their capabilities and the measurements required to validate them. This review of the status of atmospheric radiation modelling is part of a larger series of studies made as part of the SAFESKY program, with other efforts focusing on the underlying physics and implications, measurements and regulations/standards of atmospheric radiation.

  18. One Size Fits All? The Validity of a Composite Poverty Index Across Urban and Rural Households in South Africa.

    PubMed

    Steinert, Janina Isabel; Cluver, Lucie Dale; Melendez-Torres, G J; Vollmer, Sebastian

    2018-01-01

    Composite indices have been prominently used in poverty research. However, the validity of these indices remains subject to debate. This paper examines the validity of a common type of composite poverty index using data from a cross-sectional survey of 2477 households in urban and rural KwaZulu-Natal, South Africa. Multiple-group comparisons in structural equation modelling were employed to test differences in the measurement model across urban and rural groups. The analysis revealed substantial variations between urban and rural respondents, both in the conceptualisation of poverty and in the weights and importance assigned to individual poverty indicators. The validity of a 'one size fits all' measurement model can therefore not be confirmed. In consequence, it becomes virtually impossible to determine a household's poverty level relative to the full sample. Findings from our analysis have important practical implications in nuancing how we can sensitively use composite poverty indices to identify poor people.

  19. Critical analysis of 3-D organoid in vitro cell culture models for high-throughput drug candidate toxicity assessments.

    PubMed

    Astashkina, Anna; Grainger, David W

    2014-04-01

    Drug failure due to toxicity remains among the primary reasons for staggering drug attrition rates during clinical studies and post-marketing surveillance. Broader validation and use of improved next-generation 3-D cell culture models are expected to increase the predictive power and effectiveness of drug toxicological predictions. However, after decades of promising research, significant gaps remain in our collective ability to extract quality human toxicity information from in vitro data using 3-D cell and tissue models. Issues, challenges and future directions for the field to improve the predictive power and reliability of 3-D models in drug assays are reviewed. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Gear fatigue crack prognosis using embedded model, gear dynamic model and fracture mechanics

    NASA Astrophysics Data System (ADS)

    Li, C. James; Lee, Hyungdae

    2005-07-01

    This paper presents a model-based method that predicts the remaining useful life of a gear with a fatigue crack. The method consists of an embedded model to identify gear meshing stiffness from measured gear torsional vibration; an inverse method to estimate crack size from the estimated meshing stiffness; a gear dynamic model to simulate gear meshing dynamics and determine the dynamic load on the cracked tooth; and a fast crack propagation model to forecast the remaining useful life based on the estimated crack size and dynamic load. The fast crack propagation model was established to avoid repeated finite element method (FEM) calculations and to facilitate field deployment of the proposed method. Experimental studies were conducted to validate and demonstrate the feasibility of the proposed method for prognosis of a cracked gear.
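
    As a rough illustration of the fast crack propagation idea, the sketch below integrates Paris' law, da/dN = C(ΔK)^m, from the estimated crack size to a critical size to obtain a remaining-life estimate in load cycles. The constants, the ΔK(a) relation and the step size are hypothetical placeholders, not values from the study.

```python
import numpy as np

def paris_law_rul(a0, a_crit, delta_K, C=1e-11, m=3.0, cycles_per_step=1000):
    """Remaining useful life (load cycles) by integrating Paris' law
    da/dN = C * (delta_K(a))**m from crack size a0 to a_crit."""
    a, n = a0, 0
    while a < a_crit:
        da_dn = C * delta_K(a) ** m       # crack growth rate per cycle
        a += da_dn * cycles_per_step      # forward-Euler step
        n += cycles_per_step
    return n

# Toy stress-intensity range growing with the square root of crack size.
delta_K = lambda a: 20.0 * np.sqrt(a / 1e-3)   # MPa*sqrt(m), illustrative only
print(paris_law_rul(a0=1e-3, a_crit=5e-3, delta_K=delta_K))
```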

  1. Development and validation of instrument for ergonomic evaluation of tablet arm chairs

    PubMed Central

    Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira

    2016-01-01

    The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099

  2. Defining metrics of the Quasi-Biennial Oscillation in global climate models

    NASA Astrophysics Data System (ADS)

    Schenzinger, Verena; Osprey, Scott; Gray, Lesley; Butchart, Neal

    2017-06-01

    As the dominant mode of variability in the tropical stratosphere, the Quasi-Biennial Oscillation (QBO) has been subject to extensive research. Though there is a well-developed theory of this phenomenon being forced by wave-mean flow interaction, simulating the QBO adequately in global climate models still remains difficult. This paper presents a set of metrics to characterize the morphology of the QBO using a number of different reanalysis datasets and the FU Berlin radiosonde observation dataset. The same metrics are then calculated from Coupled Model Intercomparison Project 5 and Chemistry-Climate Model Validation Activity 2 simulations which included a representation of QBO-like behaviour to evaluate which aspects of the QBO are well captured by the models and which ones remain a challenge for future model development.

  3. Robustness of near-infrared calibration models for the prediction of milk constituents during the milking process.

    PubMed

    Melfsen, Andreas; Hartung, Eberhard; Haeussermann, Angelika

    2013-02-01

    The robustness of in-line raw milk analysis with near-infrared spectroscopy (NIRS) was tested with respect to the prediction of the raw milk contents fat, protein and lactose. Near-infrared (NIR) spectra of raw milk (n = 3119) were acquired on three different farms during the milking process of 354 milkings over a period of six months. Calibration models were calculated for: a random data set of each farm (fully random internal calibration); the first two thirds of the visits per farm (internal calibration); the whole datasets of two of the three farms (external calibration); and combinations of external and internal datasets. Validation was done either on the remaining data set per farm (internal validation) or on data from the remaining farms (external validation). Excellent calibration results were obtained when fully randomised internal calibration sets were used for milk analysis. In this case, RPD values of around ten, five and three were achieved for the prediction of fat, protein and lactose content, respectively. Farm-internal calibrations achieved much poorer prediction results, especially for the prediction of protein and lactose, with RPD values of around two and one, respectively. The prediction accuracy improved when validation was done on spectra of an external farm, mainly due to the higher sample variation in external calibration sets in terms of feeding diets and individual cow effects. The results showed that further improvements were achieved when additional farm information was added to the calibration set. One of the main requirements for a robust calibration model is the ability to predict milk constituents in unknown future milk samples. The robustness and quality of prediction increase with increasing variation of, e.g., feeding and cow-individual milk composition in the calibration model.
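
    As a minimal sketch of the calibration/validation splits described above, the following uses scikit-learn's PLSRegression on synthetic stand-in spectra and reports the standard error of prediction and the RPD (standard deviation of the reference values divided by the SEP); the data, split sizes and component count are arbitrary assumptions, not the study's.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# X: stand-in NIR spectra (samples x wavelengths); y: stand-in fat content.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 200))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=300)

train, test = np.arange(200), np.arange(200, 300)   # "internal" split
pls = PLSRegression(n_components=10).fit(X[train], y[train])
pred = pls.predict(X[test]).ravel()

sep = np.sqrt(np.mean((pred - y[test]) ** 2))   # standard error of prediction
rpd = np.std(y[test]) / sep                     # ratio of performance to deviation
print(f"SEP={sep:.3f}  RPD={rpd:.2f}")
```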

  4. The Five Cs of Positive Youth Development in Norway: Assessment and Associations with Positive and Negative Outcomes

    ERIC Educational Resources Information Center

    Holsen, Ingrid; Geldhof, John; Larsen, Torill; Aardal, Elisabeth

    2017-01-01

    As the field of positive youth development (PYD) emerges internationally, models of PYD designed for use in the US must be extended to diverse contexts. For instance, a robust body of evidence supports Lerner and colleagues' Five Cs Model of PYD in the US, but it remains unclear whether the Five Cs Model can validly capture positive development in…

  5. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
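
    A minimal sketch of the Monte Carlo idea described above, assuming synthetic two-group data in place of the NMR datasets: Gaussian noise of increasing scale is added and the PCA scores-space separation between group centroids (relative to the spread of the scores) is tracked as it deteriorates.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Two synthetic groups with a true separation along the first ten variables.
A = rng.normal(size=(30, 100))
A[:, :10] += 2.0
B = rng.normal(size=(30, 100))
X = np.vstack([A, B])

for noise in [0.0, 1.0, 2.0, 4.0]:
    Xn = X + rng.normal(scale=noise, size=X.shape)   # added Gaussian noise
    scores = PCA(n_components=2).fit_transform(Xn)
    sep = np.linalg.norm(scores[:30].mean(axis=0) - scores[30:].mean(axis=0))
    spread = scores.std(axis=0).mean()
    print(f"noise={noise:.1f}  scores-space separation={sep / spread:.2f}")
```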

  6. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand-pathway analysis was also performed to predict how the ligand enters and exits the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.

  7. Development of a physically-based planar inductors VHDL-AMS model for integrated power converter design

    NASA Astrophysics Data System (ADS)

    Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé

    2014-05-01

    Design of integrated power converters requires prototype-less approaches, with specific simulations for the investigation and validation process. Simulation relies on models of active and passive devices. Models of planar devices, for instance, are still not available in power simulator tools, which limits the simulation of integrated power systems. This paper focuses on the development of a physically-based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated on the Simplorer platform. The mixed simulation results have been favorably compared with practical measurements: the multi-domain simulation results and the measurement data are in close agreement.

  8. Tandem internal models execute motor learning in the cerebellum.

    PubMed

    Honda, Takeru; Nagao, Soichi; Hashimoto, Yuji; Ishikawa, Kinya; Yokota, Takanori; Mizusawa, Hidehiro; Ito, Masao

    2018-06-25

    In performing skillful movement, humans use predictions from internal models formed by repetition learning. However, the computational organization of internal models in the brain remains unknown. Here, we demonstrate that a computational architecture employing a tandem configuration of forward and inverse internal models enables efficient motor learning in the cerebellum. The model predicted learning adaptations observed in hand-reaching experiments in humans wearing a prism lens and explained the kinetic components of these behavioral adaptations. The tandem system also predicted a form of subliminal motor learning that was experimentally validated after training intentional misses of hand targets. Patients with cerebellar degeneration disease showed behavioral impairments consistent with tandemly arranged internal models. These findings validate computational tandemization of internal models in motor control and its potential uses in more complex forms of learning and cognition. Copyright © 2018 the Author(s). Published by PNAS.

  9. Domain of validity of the perturbative approach to femtosecond optical spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelin, Maxim F.; Rao, B. Jayachander; Nest, Mathias

    2013-12-14

    We have performed numerical nonperturbative simulations of transient absorption pump-probe responses for a series of molecular model systems. The resulting signals as a function of the laser field strength and the pump-probe delay time are compared with those obtained in the perturbative response function formalism. The simulations and their theoretical analysis indicate that the perturbative description remains valid up to moderately strong laser pulses, corresponding to a rather substantial depopulation (population) of the initial (final) electronic states.

  10. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work was performed in the areas of model development, code development, code validation and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code itself, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. All new models and code improvements were validated, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development and chemical kinetics development. It is expected that this work will continue under the new grant.

  11. Parental modelling of eating behaviours: observational validation of the Parental Modelling of Eating Behaviours scale (PARM).

    PubMed

    Palfreyman, Zoe; Haycraft, Emma; Meyer, Caroline

    2015-03-01

    Parents are important role models for their children's eating behaviours. This study aimed to further validate the recently developed Parental Modelling of Eating Behaviours Scale (PARM) by examining the relationships between maternal self-reports on the PARM with the modelling practices exhibited by these mothers during three family mealtime observations. Relationships between observed maternal modelling and maternal reports of children's eating behaviours were also explored. Seventeen mothers with children aged between 2 and 6 years were video recorded at home on three separate occasions whilst eating a meal with their child. Mothers also completed the PARM, the Children's Eating Behaviour Questionnaire and provided demographic information about themselves and their child. Findings provided validation for all three PARM subscales, which were positively associated with their observed counterparts on the observational coding scheme (PARM-O). The results also indicate that habituation to observations did not change the feeding behaviours displayed by mothers. In addition, observed maternal modelling was significantly related to children's food responsiveness (i.e., their interest in and desire for foods), enjoyment of food, and food fussiness. This study makes three important contributions to the literature. It provides construct validation for the PARM measure and provides further observational support for maternal modelling being related to lower levels of food fussiness and higher levels of food enjoyment in their children. These findings also suggest that maternal feeding behaviours remain consistent across repeated observations of family mealtimes, providing validation for previous research which has used single observations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Prognostic modelling options for remaining useful life estimation by industry

    NASA Astrophysics Data System (ADS)

    Sikorska, J. Z.; Hodkiewicz, M.; Ma, L.

    2011-07-01

    Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have had only limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel in selecting appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostics model classes to establish what makes them better suited to certain applications than to others, and summarises how each has been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are Knowledge-based (expert and fuzzy), Life expectancy (stochastic and statistical), Artificial Neural Networks, and Physical models.

  13. Guidelines for Use of the Approximate Beta-Poisson Dose-Response Model.

    PubMed

    Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

    2017-07-01

    For dose-response analysis in quantitative microbial risk assessment (QMRA), the exact beta-Poisson model is a two-parameter mechanistic dose-response model with parameters α>0 and β>0, which involves the Kummer confluent hypergeometric function. Evaluation of a hypergeometric function is a computational challenge. Denoting P_I(d) as the probability of infection at a given mean dose d, the widely used dose-response model P_I(d) = 1 - (1 + d/β)^(-α) is an approximate formula for the exact beta-Poisson model. Notwithstanding the required conditions α<β and β>1, issues related to the validity and approximation accuracy of this approximate formula have remained largely ignored in practice, partly because these conditions are too general to provide clear guidance. Consequently, this study proposes a probability measure Pr(0 < r < 1 | α̂, β̂) as a validity measure (r is a random variable that follows a gamma distribution; α̂ and β̂ are the maximum likelihood estimates of α and β in the approximate model), and the constraint condition β̂ > (22α̂)^0.50 for 0.02 < α̂ < 2 as a rule of thumb to ensure an accurate approximation (e.g., Pr(0 < r < 1 | α̂, β̂) > 0.99). This validity measure and rule of thumb were validated by application to all the completed beta-Poisson models (related to 85 data sets) from the QMRA community portal (QMRA Wiki). The results showed that the higher the probability Pr(0 < r < 1 | α̂, β̂), the better the approximation. The results further showed that, among the total 85 models examined, 68 models were identified as valid approximate model applications, all of which had a near-perfect match to the corresponding exact beta-Poisson model dose-response curve. © 2016 Society for Risk Analysis.
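
    A small sketch of the approximate model and the proposed diagnostics, under one reading of the abstract in which r follows a gamma distribution with shape α̂ and rate β̂; the parameter values are arbitrary.

```python
from scipy.stats import gamma

def approx_beta_poisson(d, alpha, beta):
    """Widely used approximation P_I(d) = 1 - (1 + d/beta)**(-alpha)."""
    return 1.0 - (1.0 + d / beta) ** (-alpha)

def validity_measure(alpha_hat, beta_hat):
    """Pr(0 < r < 1) for r ~ Gamma(shape=alpha_hat, rate=beta_hat);
    values near 1 suggest the approximation is trustworthy (assumed
    parameterization, see lead-in)."""
    return gamma.cdf(1.0, a=alpha_hat, scale=1.0 / beta_hat)

def rule_of_thumb(alpha_hat, beta_hat):
    """beta_hat > (22 * alpha_hat)**0.50 for 0.02 < alpha_hat < 2."""
    return 0.02 < alpha_hat < 2 and beta_hat > (22 * alpha_hat) ** 0.50

print(approx_beta_poisson(10.0, 0.15, 7.0))
print(validity_measure(0.15, 7.0), rule_of_thumb(0.15, 7.0))
```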

  14. The Bidimensional Impression Management Index (BIMI): measuring agentic and communal forms of impression management.

    PubMed

    Blasberg, Sabrina A; Rogers, Katherine H; Paulhus, Delroy L

    2014-01-01

    Measures of impression management have yet to incorporate two-factor models of person perception. The 2 primary factors are often labeled agency and communion. In Study 1, we assembled a new measure of impression management—the Bidimensional Impression Management Index (BIMI): It comprises 2 subscales designed specifically to tap agentic and communal content. Both subscales showed adequate alpha reliabilities under both honest and faking conditions. In Study 2, the BIMI was cross-validated in a new sample: The subscales remained relatively independent, and their reliabilities remained solid. A coherent pattern of personality correlates also supported the validities of both subscales. In Study 3, the differential sensitivity of the 2 subscales was demonstrated by manipulating the job type in simulated job applications. Implications and applications of the BIMI are discussed.

  15. Using the GLIMMIX Procedure in SAS 9.3 to Fit a Standard Dichotomous Rasch and Hierarchical 1-PL IRT Model

    ERIC Educational Resources Information Center

    Black, Ryan A.; Butler, Stephen F.

    2012-01-01

    Although Rasch models have been shown to be a sound methodological approach to develop and validate measures of psychological constructs for more than 50 years, they remain underutilized in psychology and other social sciences. Until recently, one reason for this underutilization was the lack of syntactically simple procedures to fit Rasch and…

  16. A risk-model for hospital mortality among patients with severe sepsis or septic shock based on German national administrative claims data

    PubMed Central

    Fleischmann-Struzek, Carolin; Rüddel, Hendrik; Reinhart, Konrad; Thomas-Rüddel, Daniel O.

    2018-01-01

    Background Sepsis is a major cause of preventable deaths in hospitals. Feasible and valid methods for comparing quality of sepsis care between hospitals are needed. The aim of this study was to develop a risk-adjustment model suitable for comparing sepsis-related mortality between German hospitals. Methods We developed a risk-model using national German claims data. Since these data are available with a time-lag of 1.5 years only, the stability of the model across time was investigated. The model was derived from inpatient cases with severe sepsis or septic shock treated in 2013 using logistic regression with backward selection and generalized estimating equations to correct for clustering. It was validated among cases treated in 2015. Finally, the model development was repeated in 2015. To investigate secular changes, the risk-adjusted trajectory of mortality across the years 2010–2015 was analyzed. Results The 2013 derivation sample consisted of 113,750 cases; the 2015 validation sample consisted of 134,851 cases. The model developed in 2013 showed good validity regarding discrimination (AUC = 0.74), calibration (observed mortality in 1st and 10th risk-decile: 11%-78%), and fit (R2 = 0.16). Validity remained stable when the model was applied to 2015 (AUC = 0.74, 1st and 10th risk-decile: 10%-77%, R2 = 0.17). There was no indication of overfitting of the model. The final model developed in year 2015 contained 40 risk-factors. Between 2010 and 2015 hospital mortality in sepsis decreased from 48% to 42%. Adjusted for risk-factors the trajectory of decrease was still significant. Conclusions The risk-model shows good predictive validity and stability across time. The model is suitable to be used as an external algorithm for comparing risk-adjusted sepsis mortality among German hospitals or regions based on administrative claims data, but secular changes need to be taken into account when interpreting risk-adjusted mortality. PMID:29558486

  17. A risk-model for hospital mortality among patients with severe sepsis or septic shock based on German national administrative claims data.

    PubMed

    Schwarzkopf, Daniel; Fleischmann-Struzek, Carolin; Rüddel, Hendrik; Reinhart, Konrad; Thomas-Rüddel, Daniel O

    2018-01-01

    Sepsis is a major cause of preventable deaths in hospitals. Feasible and valid methods for comparing quality of sepsis care between hospitals are needed. The aim of this study was to develop a risk-adjustment model suitable for comparing sepsis-related mortality between German hospitals. We developed a risk-model using national German claims data. Since these data are available with a time-lag of 1.5 years only, the stability of the model across time was investigated. The model was derived from inpatient cases with severe sepsis or septic shock treated in 2013 using logistic regression with backward selection and generalized estimating equations to correct for clustering. It was validated among cases treated in 2015. Finally, the model development was repeated in 2015. To investigate secular changes, the risk-adjusted trajectory of mortality across the years 2010-2015 was analyzed. The 2013 derivation sample consisted of 113,750 cases; the 2015 validation sample consisted of 134,851 cases. The model developed in 2013 showed good validity regarding discrimination (AUC = 0.74), calibration (observed mortality in 1st and 10th risk-decile: 11%-78%), and fit (R2 = 0.16). Validity remained stable when the model was applied to 2015 (AUC = 0.74, 1st and 10th risk-decile: 10%-77%, R2 = 0.17). There was no indication of overfitting of the model. The final model developed in year 2015 contained 40 risk-factors. Between 2010 and 2015 hospital mortality in sepsis decreased from 48% to 42%. Adjusted for risk-factors the trajectory of decrease was still significant. The risk-model shows good predictive validity and stability across time. The model is suitable to be used as an external algorithm for comparing risk-adjusted sepsis mortality among German hospitals or regions based on administrative claims data, but secular changes need to be taken into account when interpreting risk-adjusted mortality.
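
    As a generic sketch of this kind of risk-adjustment workflow (not the study's 40-factor model or its data), the following fits a logistic regression on a derivation sample, then checks discrimination (AUC) and calibration across risk deciles on a later validation sample, all on synthetic stand-in data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
# X: binary risk-factor flags per case; y: in-hospital death (synthetic).
X = rng.integers(0, 2, size=(5000, 12)).astype(float)
y = (rng.random(5000) < 1 / (1 + np.exp(-(X[:, :4].sum(axis=1) - 3)))).astype(int)

derive, validate = slice(0, 3000), slice(3000, 5000)   # e.g. 2013 vs 2015 cases
model = LogisticRegression(max_iter=1000).fit(X[derive], y[derive])
p = model.predict_proba(X[validate])[:, 1]

print("AUC:", round(roc_auc_score(y[validate], p), 3))   # discrimination
# Calibration: observed mortality in lowest vs highest predicted-risk decile.
deciles = np.quantile(p, np.linspace(0, 1, 11))
lo = y[validate][p <= deciles[1]].mean()
hi = y[validate][p >= deciles[9]].mean()
print(f"observed mortality, 1st vs 10th decile: {lo:.0%} vs {hi:.0%}")
```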

  18. Extracellular Neural Microstimulation May Activate Much Larger Regions than Expected by Simulations: A Combined Experimental and Modeling Study

    PubMed Central

    Joucla, Sébastien; Branchereau, Pascal; Cattaert, Daniel; Yvert, Blaise

    2012-01-01

    Electrical stimulation of the central nervous system has been widely used for decades for either fundamental research purposes or clinical treatment applications. Yet, very little is known regarding the spatial extent of an electrical stimulation. Although pioneering experimental studies reported that activation threshold currents (TCs) increase with the square of the neuron-to-electrode distance over a few hundred microns, there is no evidence that this quadratic law remains valid for larger distances. Moreover, nowadays, numerical simulation approaches have supplanted experimental studies for estimating TCs. However, model predictions have not yet been validated directly against experiments within a common paradigm. Here, we present a direct comparison between experimental determination and modeling prediction of TCs up to distances of several millimeters. First, we combined patch-clamp recording and microelectrode array stimulation in whole embryonic mouse spinal cords to determine TCs. Experimental thresholds did not follow a quadratic law beyond 1 millimeter, but rather tended to remain constant. We next built a combined finite element – compartment model of the same experimental paradigm to predict TCs. While theoretical TCs closely matched experimental TCs for distances <250 microns, they were highly overestimated for larger distances. This discrepancy remained even after modifications of the finite element model of the potential field, taking into account anisotropic, heterogeneous or dielectric properties of the tissue. In conclusion, these results show that the quadratic evolution of TCs does not always hold for large distances between the electrode and the neuron, and that classical models may underestimate volumes of tissue activated by electrical stimulation. PMID:22879886

  19. Validation of Model-Based Prognostics for Pneumatic Valves in a Demonstration Testbed

    DTIC Science & Technology

    2014-10-02

    predict end of life (EOL) and remaining useful life (RUL). The approach still follows the general estimation-prediction framework developed in the... atmosphere, with linearly increasing leak area: k̇_A2leak = c_leak (Eq. 16). We define valve end of life (EOL) through open/close time limits of the valves, as in... represents end of life (EOL), and ΔkE represents remaining useful life (RUL). For valves, timing requirements are provided that define the maximum

  20. Accelerated Aging in Electrolytic Capacitors for Prognostics

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan; Saha, Sankalita; Biswas, Gautam; Goebel, Kai Frank

    2012-01-01

    The focus of this work is the analysis of different degradation phenomena based on thermal overstress and electrical overstress accelerated aging systems, and the use of accelerated aging techniques for prognostics algorithm development. Results on thermal overstress and electrical overstress experiments are presented. In addition, preliminary results toward the development of physics-based degradation models are presented, focusing on the electrolyte evaporation failure mechanism. An empirical degradation model based on percentage capacitance loss under electrical overstress is presented and used in: (i) a Bayesian-based implementation of model-based prognostics using a discrete Kalman filter for health state estimation, and (ii) a dynamic system representation of the degradation model for forecasting and remaining useful life (RUL) estimation. A leave-one-out validation methodology is used to assess the validity of the methodology under the small-sample-size constraint. The results observed on the RUL estimation are consistent throughout the validation tests, comparing relative accuracy and prediction error. The inaccuracy of the model in representing the change in degradation behavior observed at the end of the test data is also consistent throughout the validation tests, indicating the need for a more detailed degradation model or for an algorithm that could estimate model parameters on-line. Based on the observed degradation process under different stress intensities with rest periods, the need for more sophisticated degradation models is further supported: the current degradation model does not represent the capacitance recovery over rest periods following an accelerated aging stress period.
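
    A minimal sketch of the discrete-Kalman-filter health-state idea on a synthetic degradation trace: a scalar state (percentage capacitance loss) is tracked under an assumed constant-rate empirical model and extrapolated to a failure threshold for an RUL estimate. The rate, noise levels and 20% threshold are invented for illustration.

```python
import numpy as np

# State: percentage capacitance loss, assumed to grow at a constant rate.
rate, q, r = 0.05, 1e-4, 0.25    # loss %/hour, process var, measurement var
x, P = 0.0, 1.0                  # initial state estimate and variance

rng = np.random.default_rng(3)
true = 0.05 * np.arange(200)                   # hidden degradation
meas = true + rng.normal(scale=0.5, size=200)  # noisy measured loss

for z in meas:
    x, P = x + rate, P + q                     # predict one hour ahead
    K = P / (P + r)                            # Kalman gain
    x, P = x + K * (z - x), (1 - K) * P        # update with measurement

threshold = 20.0                               # assumed failure criterion
rul_hours = max(threshold - x, 0.0) / rate     # extrapolate at assumed rate
print(f"state={x:.2f}% loss, RUL~{rul_hours:.0f} h")
```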

  1. Agent-based modeling of noncommunicable diseases: a systematic review.

    PubMed

    Nianogo, Roch A; Arah, Onyebuchi A

    2015-03-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application.

  2. Agent-Based Modeling of Noncommunicable Diseases: A Systematic Review

    PubMed Central

    Arah, Onyebuchi A.

    2015-01-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application. PMID:25602871

  3. Assessment of the Hypochondriasis Domain: The Multidimensional Inventory of Hypochondriacal Traits (MIHT)

    ERIC Educational Resources Information Center

    Longley, Susan L.; Watson, David; Noyes, Russell, Jr.

    2005-01-01

    Although hypochondriasis is associated with the costly use of unnecessary medical resources, this mental health problem remains largely neglected. A lack of clear conceptual models and valid measures has impeded accurate assessment and hindered progress. The Multidimensional Inventory of Hypochondriacal Traits (MIHT) addresses these deficiencies…

  4. A novel QSAR model of Salmonella mutagenicity and its application in the safety assessment of drug impurities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valencia, Antoni; Prous, Josep; Mora, Oscar

    As indicated in ICH M7 draft guidance, in silico predictive tools including statistically-based QSARs and expert analysis may be used as a computational assessment for bacterial mutagenicity for the qualification of impurities in pharmaceuticals. To address this need, we developed and validated a QSAR model to predict Salmonella t. mutagenicity (Ames assay outcome) of pharmaceutical impurities using Prous Institute's Symmetry℠, a new in silico solution for drug discovery and toxicity screening, and the Mold2 molecular descriptor package (FDA/NCTR). Data was sourced from public benchmark databases with known Ames assay mutagenicity outcomes for 7300 chemicals (57% mutagens). Of these data, 90% was used to train the model and the remaining 10% was set aside as a holdout set for validation. The model's applicability to drug impurities was tested using a FDA/CDER database of 951 structures, of which 94% were found within the model's applicability domain. The predictive performance of the model is acceptable for supporting regulatory decision-making with 84 ± 1% sensitivity, 81 ± 1% specificity, 83 ± 1% concordance and 79 ± 1% negative predictivity based on internal cross-validation, while the holdout dataset yielded 83% sensitivity, 77% specificity, 80% concordance and 78% negative predictivity. Given the importance of having confidence in negative predictions, an additional external validation of the model was also carried out, using marketed drugs known to be Ames-negative, and obtained 98% coverage and 81% specificity. Additionally, Ames mutagenicity data from FDA/CFSAN was used to create another data set of 1535 chemicals for external validation of the model, yielding 98% coverage, 73% sensitivity, 86% specificity, 81% concordance and 84% negative predictivity. - Highlights: • A new in silico QSAR model to predict Ames mutagenicity is described. • The model is extensively validated with chemicals from the FDA and the public domain. • Validation tests show desirable high sensitivity and high negative predictivity. • The model predicted 14 reportedly difficult to predict drug impurities with accuracy. • The model is suitable to support risk evaluation of potentially mutagenic compounds.
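
    For reference, the four performance figures quoted above can be computed from a binary confusion matrix as in the sketch below; these are the generic definitions, not the Symmetry℠ implementation.

```python
import numpy as np

def qsar_metrics(y_true, y_pred):
    """Sensitivity, specificity, concordance and negative predictivity
    for a binary Ames-mutagenicity call (1 = mutagen, 0 = non-mutagen)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "concordance": (tp + tn) / len(y_true),
        "negative_predictivity": tn / (tn + fn),
    }

print(qsar_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]))
```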

  5. Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Liu, S.; Ziwei, X.; Liang, S.

    2012-12-01

    The land surface evapotranspiration plays an important role in the surface energy balance and the water cycle. There have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted the widespread attention of researchers and managers. However, remote sensing technology still has many uncertainties arising from model mechanisms, model inputs, parameterization schemes, and scaling issues in regional estimation. Achieving remotely sensed evapotranspiration (RS_ET) with high confidence is required but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimations. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and different time-scales. An independent RS_ET validation using this method over the Hai River Basin, China, in 2002-2009 is presented as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and validation data such as water-balanced evapotranspiration, MODIS evapotranspiration products, precipitation, and land-use types. Validation at the local scale also gave good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared with multi-scale evapotranspiration measurements from eddy covariance (EC) systems and large aperture scintillometers (LAS), respectively, using a footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they perform well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also pay attention to the cross-validation of RS_ET derived from multi-source models. "The Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out in the middle reaches of the Heihe River Basin, China, in 2012. Flux measurements from an observation matrix composed of 22 EC systems and 4 LAS are used to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including an empirical statistical model, one-source and two-source models, a Penman-Monteith-based model, a Priestley-Taylor-based model, and a complementary-relationship-based model, are used to perform an intercomparison. The results from both RS_ET validation cases showed that the proposed validation methods are reasonable and feasible.

  6. Design and validation of a model to predict early mortality in haemodialysis patients.

    PubMed

    Mauri, Joan M; Clèries, Montse; Vela, Emili

    2008-05-01

    Mortality and morbidity rates are higher in patients receiving haemodialysis therapy than in the general population. Detection of risk factors related to early death in these patients could aid clinical and administrative decision making. The aims of this study were (1) to identify risk factors (comorbidity and variables specific to haemodialysis) associated with death in the first year following the start of haemodialysis and (2) to design and validate a prognostic model to quantify the probability of death for each patient. An analysis was carried out on all patients starting haemodialysis treatment in Catalonia during the period 1997-2003 (n = 5738). The data source was the Renal Registry of Catalonia, a mandatory population registry. Patients were randomly divided into two samples: 60% (n = 3455) of the total were used to develop the prognostic model and the remaining 40% (n = 2283) to validate the model. Logistic regression analysis was used to construct the model. One-year mortality in the total study population was 16.5%. The predictive model included the following variables: age, sex, primary renal disease, grade of functional autonomy, chronic obstructive pulmonary disease, malignant processes, chronic liver disease, cardiovascular disease, initial vascular access and malnutrition. The analyses showed adequate calibration for both the model-development sample and the validation sample (Hosmer-Lemeshow statistic 0.97 and P = 0.49, respectively) as well as adequate discrimination (area under the ROC curve 0.78 in both cases). Risk factors implicated in mortality at one year following the start of haemodialysis have been determined and a prognostic model designed. The validated, easy-to-apply model quantifies individual patient risk attributable to various factors, some of them amenable to correction by directed interventions.
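
    As a generic illustration of the calibration check used above, a Hosmer-Lemeshow-style statistic can be computed over risk deciles as sketched below; this is the textbook formulation run on synthetic data, not the study's code.

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow goodness-of-fit: chi-square over groups of cases
    sorted by predicted risk. Large p-values indicate adequate calibration."""
    order = np.argsort(p)
    y, p = np.asarray(y)[order], np.asarray(p)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(y)), groups):
        obs, exp, n = y[idx].sum(), p[idx].sum(), len(idx)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    return stat, chi2.sf(stat, groups - 2)

# Well-calibrated synthetic predictions: expect a large p-value.
rng = np.random.default_rng(4)
p = rng.random(2000)
y = (rng.random(2000) < p).astype(int)
print(hosmer_lemeshow(y, p))
```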

  7. Mathematical multi-scale model of the cardiovascular system including mitral valve dynamics. Application to ischemic mitral insufficiency

    PubMed Central

    2011-01-01

    Background Valve dysfunction is a common cardiovascular pathology. Despite significant clinical research, there is little formal study of how valve dysfunction affects overall circulatory dynamics. Validated models would offer the ability to better understand these dynamics and thus optimize diagnosis, as well as surgical and other interventions. Methods A cardiovascular and circulatory system (CVS) model has already been validated in silico and in several animal model studies. It accounts for valve dynamics using Heaviside functions to simulate a physiologically accurate "open on pressure, close on flow" law. However, it does not consider real-time valve opening dynamics and therefore does not fully capture valve dysfunction, particularly where the dysfunction involves partial closure. This research describes an updated version of this closed-loop CVS model that includes the progressive opening of the mitral valve and is defined over the full cardiac cycle. Results Simulations of the cardiovascular system with a healthy mitral valve are performed, and the global hemodynamic behaviour is studied and compared with previously validated results. The error between the resulting pressure-volume (PV) loops of the previously validated CVS model and the new CVS model that includes progressive mitral valve opening is assessed and remains within typical measurement error and variability. Simulations of ischemic mitral insufficiency are also performed. Pressure-volume loops, transmitral flow evolution and mitral valve aperture area evolution follow reported measurements in shape, amplitude and trends. Conclusions The resulting cardiovascular system model including mitral valve dynamics provides a foundation for clinical validation and the study of valvular dysfunction in vivo. The overall models and results could readily be generalised to other cardiac valves. PMID:21942971
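
    A minimal sketch of how an "open on pressure, close on flow" law might be coded with Heaviside gating; the function names and threshold conventions are assumptions, and the real CVS model embeds this logic in a full ODE system rather than a standalone function.

```python
def heaviside(x):
    """Unit step: 1 for positive argument, 0 otherwise."""
    return 1.0 if x > 0.0 else 0.0

def valve_state(open_now, dP, Q):
    """'Open on pressure, close on flow': an open valve stays open while
    forward flow Q is positive; a closed valve opens once the pressure
    gradient dP across it becomes positive."""
    if open_now:
        return heaviside(Q) == 1.0   # close as soon as flow reverses
    return heaviside(dP) == 1.0      # open once upstream pressure wins

# One step for a mitral valve, dP = atrial minus ventricular pressure.
print(valve_state(open_now=False, dP=2.0, Q=0.0))    # opens
print(valve_state(open_now=True, dP=-5.0, Q=-0.1))   # closes on back-flow
```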

  8. AHP-based spatial analysis of water quality impact assessment due to change in vehicular traffic caused by highway broadening in Sikkim Himalaya

    NASA Astrophysics Data System (ADS)

    Banerjee, Polash; Ghose, Mrinal Kanti; Pradhan, Ratika

    2018-05-01

    Spatial analysis of the water quality impact of highway projects in mountainous areas remains largely unexplored. A methodology is presented here for Spatial Water Quality Impact Assessment (SWQIA) due to highway-broadening-induced vehicular traffic change in the East district of Sikkim. The pollution load of the highway runoff was estimated using an Average Annual Daily Traffic-based empirical model in combination with a mass balance model to predict pollution in the rivers within the study area. Spatial interpolation and overlay analysis were used for impact mapping. An Analytic Hierarchy Process-based Water Quality Status Index was used to prepare a composite impact map. Model validation criteria, cross-validation criteria, and spatially explicit sensitivity analysis show that the SWQIA model is robust. The study shows that vehicular traffic is a significant contributor to water pollution in the study area. The model caters specifically to impact analysis of the project concerned and can serve as a decision-support aid for project stakeholders. The applicability of the SWQIA model needs to be explored and validated for a larger set of water quality parameters and project scenarios at a greater spatial scale.
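
    As a generic illustration of the AHP step, priority weights can be taken from the principal eigenvector of a pairwise-comparison matrix, with Saaty's consistency ratio as a sanity check; the comparison matrix and parameter names below are hypothetical, not those of the SWQIA study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix (principal
    eigenvector), plus the consistency ratio CR (< 0.1 is acceptable)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty's random index
    return w, ci / ri

# Hypothetical comparison of three water-quality parameters.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
print(w.round(3), round(cr, 3))
```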

  9. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems employed typically as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  10. Portable visible and near-infrared spectrophotometer for triglyceride measurements.

    PubMed

    Kobayashi, Takanori; Kato, Yukiko Hakariya; Tsukamoto, Megumi; Ikuta, Kazuyoshi; Sakudo, Akikazu

    2009-01-01

    An affordable and portable machine is required for the practical use of visible and near-infrared (Vis-NIR) spectroscopy. A portable fruit tester comprising a Vis-NIR spectrophotometer was modified for use in the transmittance mode and employed to quantify triglyceride levels in serum in combination with a chemometric analysis. Transmittance spectra collected in the 600- to 1100-nm region were subjected to a partial least-squares regression analysis and leave-one-out cross-validation to develop a chemometric model for predicting triglyceride concentrations in serum. The model yielded a coefficient of determination in cross-validation (R²VAL) of 0.7831 with a standard error of cross-validation (SECV) of 43.68 mg/dl. The detection limit of the model was 148.79 mg/dl. Furthermore, masked samples predicted by the model yielded a coefficient of determination in prediction (R²PRED) of 0.6856 with a standard error of prediction (SEP) and detection limit of 61.54 and 159.38 mg/dl, respectively. The portable Vis-NIR spectrophotometer may prove convenient for the measurement of triglyceride concentrations in serum, although obstacles remain before practical use, which are discussed.

  11. Prediction of early death among patients enrolled in phase I trials: development and validation of a new model based on platelet count and albumin.

    PubMed

    Ploquin, A; Olmos, D; Lacombe, D; A'Hern, R; Duhamel, A; Twelves, C; Marsoni, S; Morales-Barrera, R; Soria, J-C; Verweij, J; Voest, E E; Schöffski, P; Schellens, J H; Kramar, A; Kristeleit, R S; Arkenau, H-T; Kaye, S B; Penel, N

    2012-09-25

    Selecting patients with 'sufficient life expectancy' for Phase I oncology trials remains challenging. The Royal Marsden Hospital Score (RMS) previously identified high-risk patients as those with ≥2 of the following: albumin <35 g l⁻¹; LDH > upper limit of normal; >2 metastatic sites. This study developed an alternative prognostic model and compared its performance with that of the RMS. The primary end point was the 90-day mortality rate. The new model was developed from the same database as the RMS, but used Chi-squared Automatic Interaction Detection (CHAID). The ROC characteristics of both methods were then validated in an independent database of 324 patients enrolled in European Organization on Research and Treatment of Cancer Phase I trials of cytotoxic agents between 2000 and 2009. The CHAID method identified high-risk patients as those with albumin <33 g l⁻¹, or albumin ≥33 g l⁻¹ together with a platelet count ≥400,000 mm⁻³. In the validation data set, the rates of correctly classified patients were 0.79 vs 0.67 for the CHAID model and the RMS, respectively. The negative predictive values (NPV) were similar for the CHAID model and the RMS. The CHAID model and the RMS provided a similarly high NPV, but the CHAID model gave better accuracy in the validation set. Both the CHAID model and the RMS may improve the screening process in Phase I trials.

  12. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai

    2011-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.

  13. Towards A Model-Based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.

  14. 8 CFR 1212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the official record of each conviction, and any other documents relating to commutation of sentence... valid for a period not to exceed the validity of the biometric BCC for applications for admission at U.S... may remain valid. Although the waiver may remain valid, the non-biometric border crossing card portion...

  15. 8 CFR 1212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the official record of each conviction, and any other documents relating to commutation of sentence... valid for a period not to exceed the validity of the biometric BCC for applications for admission at U.S... may remain valid. Although the waiver may remain valid, the non-biometric border crossing card portion...

  16. 8 CFR 1212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the official record of each conviction, and any other documents relating to commutation of sentence... valid for a period not to exceed the validity of the biometric BCC for applications for admission at U.S... may remain valid. Although the waiver may remain valid, the non-biometric border crossing card portion...

  17. 8 CFR 1212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the official record of each conviction, and any other documents relating to commutation of sentence... valid for a period not to exceed the validity of the biometric BCC for applications for admission at U.S... may remain valid. Although the waiver may remain valid, the non-biometric border crossing card portion...

  18. A coarse grain model for protein-surface interactions

    NASA Astrophysics Data System (ADS)

    Wei, Shuai; Knotts, Thomas A.

    2013-09-01

    The interaction of proteins with surfaces is important in numerous applications in many fields—such as biotechnology, proteomics, sensors, and medicine—but fundamental understanding of how protein stability and structure are affected by surfaces remains incomplete. Over the last several years, molecular simulation using coarse grain models has yielded significant insights, but the formalisms used to represent the surface interactions have been rudimentary. We present a new model for protein surface interactions that incorporates the chemical specificity of both the surface and the residues comprising the protein in the context of a one-bead-per-residue, coarse grain approach that maintains computational efficiency. The model is parameterized against experimental adsorption energies for multiple model peptides on different types of surfaces. The validity of the model is established by its ability to quantitatively and qualitatively predict the free energy of adsorption and structural changes for multiple biologically-relevant proteins on different surfaces. The validation, done with proteins not used in parameterization, shows that the model produces remarkable agreement between simulation and experiment.
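
    As one illustration of the kind of one-bead-per-residue surface term such models use, the sketch below evaluates a common 9-3 Lennard-Jones wall potential with per-residue parameters; the functional form and all numbers are generic assumptions, not the parameterization presented in this paper.

```python
import numpy as np

def lj_9_3(z, eps, sigma):
    """9-3 Lennard-Jones wall potential for residue-surface interactions
    (the form obtained by integrating 12-6 LJ over a half-space); eps and
    sigma would be fit per residue/surface chemistry pair."""
    s = sigma / z
    return eps * ((2.0 / 15.0) * s**9 - s**3)

# Energy of a one-bead-per-residue chain above a flat surface at z = 0,
# with per-residue well depths standing in for chemical specificity.
z = np.array([0.5, 0.8, 1.2, 2.0])      # bead heights (nm), hypothetical
eps = np.array([1.0, 0.3, 0.7, 0.2])    # well depths (kT), hypothetical
sigma = 0.4                             # contact distance (nm), hypothetical
print(lj_9_3(z, eps, sigma).sum())      # total adsorption energy
```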

  19. SU-E-T-131: Artificial Neural Networks Applied to Overall Survival Prediction for Patients with Periampullary Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong, Y; Yu, J; Yeung, V

    Purpose: Artificial neural networks (ANN) can be used to discover complex relations within datasets to help with medical decision making. This study aimed to develop an ANN method to predict two-year overall survival of patients with peri-ampullary cancer (PAC) following resection. Methods: Data were collected from 334 patients with PAC following resection treated in our institutional pancreatic tumor registry between 2006 and 2012. The dataset contains 14 variables, including age, gender, T-stage, tumor differentiation, positive-lymph-node ratio, positive resection margins, chemotherapy, radiation therapy, and tumor histology. After censoring for two-year survival analysis, 309 patients were left, of which 44 patients (∼15%) were randomly selected to form the testing set. The remaining 265 cases were randomly divided into a training set (211 cases, ∼80% of 265) and a validation set (54 cases, ∼20% of 265) 20 times to build 20 ANN models. Each ANN has one hidden layer with 5 units. The 20 ANN models were ranked according to their concordance index (c-index) of prediction on the validation sets. To further improve prediction, the top 10% of ANN models were selected and their outputs averaged for prediction on the testing set. Results: By random division, the 44 cases in the testing set and the remaining 265 cases have approximately equal two-year survival rates, 36.4% and 35.5% respectively. The 20 ANN models, which were trained and validated on the 265 cases, yielded mean c-indexes of 0.59 and 0.63 on the validation sets and the testing set, respectively. The c-index was 0.72 when the two best ANN models (top 10%) were used in prediction on the testing set. The c-index of Cox regression analysis was 0.63. Conclusion: ANN improved survival prediction for patients with PAC. More patient data and further analysis of additional factors may be needed for a more robust model, which will help guide physicians in providing optimal post-operative care. This project was supported by a PA CURE Grant.
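
    The ensemble scheme (20 networks with one 5-unit hidden layer, top 10% averaged) can be sketched as follows on synthetic data; for a binary two-year outcome the c-index coincides with ROC AUC, which is used here. The covariates and outcome below are simulated stand-ins, not registry data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic cohort: 309 patients, 14 covariates, binary 2-year survival.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(309, 14))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=309) > 0).astype(int)

        X_dev, X_test, y_dev, y_test = train_test_split(
            X, y, test_size=44, random_state=0)

        models, scores = [], []
        for seed in range(20):                    # 20 random ~80/20 splits
            X_tr, X_val, y_tr, y_val = train_test_split(
                X_dev, y_dev, test_size=0.2, random_state=seed)
            net = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000,
                                random_state=seed).fit(X_tr, y_tr)
            models.append(net)
            scores.append(roc_auc_score(y_val, net.predict_proba(X_val)[:, 1]))

        # Average the outputs of the top 10% (2 of 20) models on the testing set.
        top = np.argsort(scores)[-2:]
        p_test = np.mean([models[i].predict_proba(X_test)[:, 1] for i in top], axis=0)
        print("ensemble test c-index:", round(roc_auc_score(y_test, p_test), 2))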

  20. Physiological time model of Scirpophaga incertulas (Lepidoptera: Pyralidae) in rice in Guandong Province, People's Republic of China.

    PubMed

    Stevenson, Douglass E; Feng, Ge; Zhang, Runjie; Harris, Marvin K

    2005-08-01

    Scirpophaga incertulas (Walker) (Lepidoptera: Pyralidae) is autochthonous and monophagous on rice, Oryza spp., which favors the development of a physiological time model using degree-days (degrees C) to establish a well-defined window during which adults will be present in fields. Model development of S. incertulas adult flight phenology used climatic data and historical field observations of S. incertulas from 1962 through 1988. Analysis of variance was used to evaluate 5,203 prospective models with starting dates ranging from 1 January (day 1) to 30 April (day 121) and base temperatures ranging from -3 through 18.5 degrees C. From six candidate models, which shared the lowest standard deviation of prediction error, a model with a base temperature of 10 degrees C starting on 19 January was selected for validation. Validation with linear regression evaluated the differences between predicted and observed events and showed the model consistently predicted phenological events of 10 to 90% cumulative flight activity within a 3.5-d prediction interval regarded as acceptable for pest management decision making. The degree-day phenology model developed here is expected to find field application in Guandong Province. Expansion to other areas of rice production will require field validation. We expect the degree-day characterization of the activity period to remain essentially intact, but the start day may vary based on climate and geographic location. The development and validation of this phenology model of S. incertulas, using procedures originally developed for pecan nut casebearer, Acrobasis nuxvorella Neunzig, shows the fungibility of this approach to developing prediction models for other insects.
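
    A degree-day accumulation of this kind can be computed as below, using the simple-average method with the 10 degrees C base and 19 January start selected above. The daily temperatures and the 500-DD event threshold are illustrative assumptions, not the paper's fitted values.

        import numpy as np

        def degree_days(tmax, tmin, base=10.0):
            """Simple-average method: mean daily temperature minus base, floored at 0."""
            return np.maximum((tmax + tmin) / 2.0 - base, 0.0)

        # Synthetic seasonal daily temperatures; a real application uses station data.
        rng = np.random.default_rng(0)
        days = np.arange(1, 366)                       # day of year
        tmin = 8 + 12 * np.sin((days - 100) / 365 * 2 * np.pi) + rng.normal(0, 2, 365)
        tmax = tmin + 8

        start = 19                                     # 19 January = day 19
        dd = np.cumsum(degree_days(tmax, tmin)[days >= start])

        # First day on which an illustrative 500 accumulated degree-days is reached.
        first = start + int(np.argmax(dd >= 500))
        print("500 DD reached on day of year:", first)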

  1. A new class of enhanced kinetic sampling methods for building Markov state models

    NASA Astrophysics Data System (ADS)

    Bhoutekar, Arti; Ghosh, Susmita; Bhattacharya, Swati; Chatterjee, Abhijit

    2017-10-01

    Markov state models (MSMs) and other related kinetic network models are frequently used to study the long-timescale dynamical behavior of biomolecular and materials systems. MSMs are often constructed bottom-up using brute-force molecular dynamics (MD) simulations when the model contains a large number of states and kinetic pathways that are not known a priori. However, the resulting network generally encompasses only parts of the configurational space, and regardless of any additional MD performed, several states and pathways will still remain missing. This implies that the duration for which the MSM can faithfully capture the true dynamics, which we term the validity time of the MSM, is always finite and unfortunately much shorter than the MD time invested to construct the model. A general framework that relates the kinetic uncertainty in the model to the validity time, missing states and pathways, network topology, and statistical sampling is presented. Performing additional calculations for frequently-sampled states/pathways may not alter the MSM validity time. A new class of enhanced kinetic sampling techniques is introduced that aims at targeting rare states/pathways that contribute most to the uncertainty so that the validity time is boosted in an effective manner. Examples including straightforward 1D energy landscapes, lattice models, and biomolecular systems are provided to illustrate the application of the method. Developments presented here will be of interest to the kinetic Monte Carlo community as well.
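
    The bottom-up construction step, counting lag-time transitions from a discrete state trajectory and row-normalizing, can be sketched as follows. The three-state system and trajectory are synthetic; the sparsely sampled rows are exactly where the kinetic uncertainty discussed above concentrates.

        import numpy as np

        def build_msm(traj, n_states, lag=1):
            """Row-stochastic transition matrix estimated from transition counts."""
            counts = np.zeros((n_states, n_states))
            for a, b in zip(traj[:-lag], traj[lag:]):
                counts[a, b] += 1
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

        rng = np.random.default_rng(0)
        true_T = np.array([[0.98, 0.02, 0.00],
                           [0.05, 0.90, 0.05],
                           [0.00, 0.02, 0.98]])
        traj = [0]
        for _ in range(5000):                  # brute-force "MD" surrogate
            traj.append(rng.choice(3, p=true_T[traj[-1]]))

        T_hat = build_msm(np.array(traj), 3)
        print(np.round(T_hat, 3))   # rarely visited rows carry most uncertainty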

  2. Towards Developing an Industry-Validated Food Technology Curriculum in Afghanistan

    ERIC Educational Resources Information Center

    Ebner, Paul; McNamara, Kevin; Deering, Amanda; Oliver, Haley; Rahimi, Mirwais; Faisal, Hamid

    2017-01-01

    Afghanistan remains an agrarian country with most analyses holding food production and processing as key to recovery. To date, however, there are no public or private higher education departments focused on food technology. To bridge this gap, Herat University initiated a new academic department conferring BS degrees in food technology. Models for…

  3. How Teachers Become Leaders: An Internationally Validated Theoretical Model of Teacher Leadership Development

    ERIC Educational Resources Information Center

    Poekert, Philip; Alexandrou, Alex; Shannon, Darbianne

    2016-01-01

    Teacher leadership is increasingly being touted as a practical response to guide teacher learning in school improvement and policy reform efforts. However, the field of research on teacher leadership in relation to post-compulsory educational development has been and remains largely atheoretical to date. This empirical study proposes a grounded…

  4. 8 CFR 212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... supplemented by the official record of each conviction, and any other documents relating to commutation of... Card, issued by the DOS shall be valid for a period not to exceed the validity of the biometric BCC for... is noted on the card may remain valid. Although the waiver may remain valid, the non-biometric border...

  5. 8 CFR 212.4 - Applications for the exercise of discretion under section 212(d)(1) and 212(d)(3).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... supplemented by the official record of each conviction, and any other documents relating to commutation of... Card, issued by the DOS shall be valid for a period not to exceed the validity of the biometric BCC for... is noted on the card may remain valid. Although the waiver may remain valid, the non-biometric border...

  6. The nature of generalized anxiety disorder and pathological worry: current evidence and conceptual models.

    PubMed

    Brown, T A

    1997-10-01

    Objective: To examine the nature and conceptualization of generalized anxiety disorder (GAD) and chronic worry as well as data bearing on the validity of GAD as a distinct diagnosis. Method: Narrative literature review. Results: Although a wealth of data have been obtained on the epidemiology, genetics, and nature of GAD, many important questions remain regarding the validity of current conceptual models of pathological worry and the discriminability of GAD from certain emotional disorders (for instance, mood disorders) and higher-order trait vulnerability dimensions (for example, negative affect). Conclusions: Because the constituent features of GAD are salient to current conceptual models of emotional disorders (for example, models that implicate negative affect or worry/anxious apprehension as vulnerability factors), research on the nature of GAD and its associated features should provide important information on the pathogenesis, course, and co-occurrence of the entire range of anxiety and mood disorders.

  7. Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction

    NASA Astrophysics Data System (ADS)

    Mach, J. C.; Budrow, C. J.; Pagan, D. C.; Ruff, J. P. C.; Park, J.-S.; Okasinski, J.; Beaudoin, A. J.; Miller, M. P.

    2017-05-01

    Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present work, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. The experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.

  8. Edible moisture barriers: how to assess their potential and limits in food product shelf-life extension?

    PubMed

    Bourlieu, C; Guillard, V; Vallès-Pamiès, B; Guilbert, S; Gontard, N

    2009-05-01

    Control of moisture transfer inside composite food products or between food and its environment remains today a major challenge in food preservation. A wide range of film-forming compounds is now available and facilitates tailoring moisture barriers with optimized functional properties. Despite these huge potentials, a realistic assessment of film or coating efficacy is still critical. Due to nonlinear water sorption isotherms, water-dependent diffusivities, and variations of physical state, modelling transport phenomena through edible barriers is complex. Water vapor permeability can hardly be considered an inherent property of films and only gives a relative indication of barrier efficacy. The formal or mechanistic models reported in the literature that describe the influence of testing conditions on the barrier properties of edible films are reviewed and discussed. Most of these models have been validated on a narrow range of conditions. Conversely, few original predictive models based on Fick's second law have been developed to assess shelf-life extension of food products including barriers. These models, assuming complex and realistic hypotheses, have been validated in various model foods. The development of nondestructive methods of moisture content measurement should speed up model validation and allow a better comprehension of moisture transfer through edible films.
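
    A minimal sketch of a Fick's-second-law moisture transfer calculation through a film is shown below, solved by explicit finite differences. The film thickness, effective diffusivity, and boundary moisture contents are illustrative assumptions rather than values from any of the reviewed models.

        import numpy as np

        L = 100e-6              # film thickness (m), assumed
        D = 1e-12               # effective moisture diffusivity (m^2/s), assumed
        n = 51
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / D    # explicit-scheme stability: dt <= dx^2 / (2D)

        m = np.zeros(n)         # moisture content profile (dry film initially)
        m[0] = 0.30             # humid side held at fixed moisture content
        m[-1] = 0.05            # dry side (e.g., crisp component of the food)

        # Forward-time, centered-space update of Fick's second law.
        for _ in range(20000):
            m[1:-1] += D * dt / dx**2 * (m[2:] - 2 * m[1:-1] + m[:-2])

        # Near-steady flux through the film characterizes barrier performance.
        flux = -D * (m[1] - m[0]) / dx
        print(f"moisture flux ~ {flux:.3e} (moisture units * m / s)")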

  9. Development of an anaerobic threshold (HRLT, HRVT) estimation equation using the heart rate threshold (HRT) during the treadmill incremental exercise test

    PubMed Central

    Ham, Joo-ho; Park, Hun-Young; Kim, Youn-ho; Bae, Sang-kon; Ko, Byung-hoon

    2017-01-01

    [Purpose] The purpose of this study was to develop a regression model to estimate the heart rate at the lactate threshold (HRLT) and the heart rate at the ventilatory threshold (HRVT) using the heart rate threshold (HRT), and to test the validity of the regression model. [Methods] We performed a graded exercise test with a treadmill in 220 normal individuals (men: 112, women: 108) aged 20–59 years. HRT, HRLT, and HRVT were measured in all subjects. A regression model was developed to estimate HRLT and HRVT using HRT with 70% of the data (men: 79, women: 76) through randomization (7:3), with the Bernoulli trial. The validity of the regression model developed with the remaining 30% of the data (men: 33, women: 32) was also examined. [Results] Based on the regression coefficient, we found that the independent variable HRT was a significant variable in all regression models. The adjusted R2 of the developed regression models averaged about 70%, and the standard error of estimation of the validity test results was 11 bpm, which is similar to that of the developed model. [Conclusion] These results suggest that HRT is a useful parameter for predicting HRLT and HRVT. PMID:29036765

  10. Development of an anaerobic threshold (HRLT, HRVT) estimation equation using the heart rate threshold (HRT) during the treadmill incremental exercise test.

    PubMed

    Ham, Joo-Ho; Park, Hun-Young; Kim, Youn-Ho; Bae, Sang-Kon; Ko, Byung-Hoon; Nam, Sang-Seok

    2017-09-30

    The purpose of this study was to develop a regression model to estimate the heart rate at the lactate threshold (HRLT) and the heart rate at the ventilatory threshold (HRVT) using the heart rate threshold (HRT), and to test the validity of the regression model. We performed a graded exercise test with a treadmill in 220 normal individuals (men: 112, women: 108) aged 20-59 years. HRT, HRLT, and HRVT were measured in all subjects. A regression model was developed to estimate HRLT and HRVT using HRT with 70% of the data (men: 79, women: 76) through randomization (7:3), with the Bernoulli trial. The validity of the regression model developed with the remaining 30% of the data (men: 33, women: 32) was also examined. Based on the regression coefficient, we found that the independent variable HRT was a significant variable in all regression models. The adjusted R2 of the developed regression models averaged about 70%, and the standard error of estimation of the validity test results was 11 bpm, which is similar to that of the developed model. These results suggest that HRT is a useful parameter for predicting HRLT and HRVT. ©2017 The Korean Society for Exercise Nutrition
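
    The split-sample procedure described in the two records above can be sketched as follows on synthetic data: fit HRLT ~ HRT on roughly 70% of subjects, then report the standard error of estimate on the held-out 30%. The relation and noise level below are assumptions for illustration only.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        hrt = rng.uniform(120, 180, 220)                 # heart rate threshold (bpm)
        hrlt = 0.9 * hrt + 15 + rng.normal(0, 11, 220)   # assumed relation + noise

        idx = rng.permutation(220)
        dev, val = idx[:155], idx[155:]                  # ~70% / ~30% split

        model = LinearRegression().fit(hrt[dev, None], hrlt[dev])
        pred = model.predict(hrt[val, None])
        see = np.sqrt(np.mean((hrlt[val] - pred) ** 2))  # standard error of estimate
        r2 = model.score(hrt[val, None], hrlt[val])
        print(f"validation SEE ~ {see:.1f} bpm, R2 = {r2:.2f}")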

  11. Agricultural Policy Environmental eXtender Simulation of Three Adjacent Row-Crop Watersheds in the Claypan Region.

    PubMed

    Anomaa Senaviratne, G M M M; Udawatta, Ranjith P; Baffaut, Claire; Anderson, Stephen H

    2013-01-01

    The Agricultural Policy Environmental eXtender (APEX) model is used to evaluate best management practices on pollutant loading in whole farms or small watersheds. The objectives of this study were to conduct a sensitivity analysis to determine the effect of model parameters on APEX output and to use the parameterized, calibrated, and validated model to evaluate long-term benefits of grass waterways. The APEX model was used to model three (East, Center, and West) adjacent field-size watersheds with claypan soils under a no-till corn (Zea mays L.)/soybean [Glycine max (L.) Merr.] rotation. Twenty-seven parameters were sensitive for crop yield, runoff, sediment, nitrogen (dissolved and total), and phosphorous (dissolved and total) simulations. The model was calibrated using measured event-based data from the Center watershed from 1993 to 1997 and validated with data from the West and East watersheds. Simulated crop yields were within ±13% of the measured yield. The model performance for event-based runoff was excellent, with calibration and validation r2 > 0.9 and Nash-Sutcliffe coefficients (NSC) > 0.8, respectively. Sediment and total nitrogen calibration results were satisfactory for larger rainfall events (>50 mm), with r2 > 0.5 and NSC > 0.4, but validation results remained poor, with NSC between 0.18 and 0.3. Total phosphorous was well calibrated and validated, with r2 > 0.8 and NSC > 0.7, respectively. The presence of grass waterways reduced annual total phosphorus loadings by 13 to 25%. The replicated study indicates that APEX provides a convenient and efficient tool to evaluate long-term benefits of conservation practices. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
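
    For reference, the Nash-Sutcliffe coefficient used above compares model error against the variance of the observations: 1 is a perfect fit, and 0 means the model is no better than predicting the observed mean. A minimal implementation, with illustrative runoff values:

        import numpy as np

        def nash_sutcliffe(obs, sim):
            """NSC = 1 - SSE / total variance of the observations."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        runoff_obs = [12.0, 30.5, 8.2, 55.1, 20.0]   # illustrative event runoff (mm)
        runoff_sim = [10.5, 33.0, 9.0, 50.2, 22.5]
        print(round(nash_sutcliffe(runoff_obs, runoff_sim), 2))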

  12. QSAR Modeling of Rat Acute Toxicity by Oral Exposure

    PubMed Central

    Zhu, Hao; Martin, Todd M.; Ye, Lin; Sedykh, Alexander; Young, Douglas M.; Tropsha, Alexander

    2009-01-01

    Few Quantitative Structure-Activity Relationship (QSAR) studies have successfully modeled large, diverse rodent toxicity endpoints. In this study, a comprehensive dataset of 7,385 compounds with their most conservative lethal dose (LD50) values has been compiled. A combinatorial QSAR approach has been employed to develop robust and predictive models of acute toxicity in rats caused by oral exposure to chemicals. To enable fair comparison between the predictive power of models generated in this study versus a commercial toxicity predictor, TOPKAT (Toxicity Prediction by Komputer Assisted Technology), a modeling subset of the entire dataset was selected that included all 3,472 compounds used in the TOPKAT’s training set. The remaining 3,913 compounds, which were not present in the TOPKAT training set, were used as the external validation set. QSAR models of five different types were developed for the modeling set. The prediction accuracy for the external validation set was estimated by determination coefficient R2 of linear regression between actual and predicted LD50 values. The use of the applicability domain threshold implemented in most models generally improved the external prediction accuracy but expectedly led to the decrease in chemical space coverage; depending on the applicability domain threshold, R2 ranged from 0.24 to 0.70. Ultimately, several consensus models were developed by averaging the predicted LD50 for every compound using all 5 models. The consensus models afforded higher prediction accuracy for the external validation dataset with the higher coverage as compared to individual constituent models. The validated consensus LD50 models developed in this study can be used as reliable computational predictors of in vivo acute toxicity. PMID:19845371

  13. Quantitative structure-activity relationship modeling of rat acute toxicity by oral exposure.

    PubMed

    Zhu, Hao; Martin, Todd M; Ye, Lin; Sedykh, Alexander; Young, Douglas M; Tropsha, Alexander

    2009-12-01

    Few quantitative structure-activity relationship (QSAR) studies have successfully modeled large, diverse rodent toxicity end points. In this study, a comprehensive data set of 7385 compounds with their most conservative lethal dose (LD50) values has been compiled. A combinatorial QSAR approach has been employed to develop robust and predictive models of acute toxicity in rats caused by oral exposure to chemicals. To enable fair comparison between the predictive power of models generated in this study versus a commercial toxicity predictor, TOPKAT (Toxicity Prediction by Komputer Assisted Technology), a modeling subset of the entire data set was selected that included all 3472 compounds used in TOPKAT's training set. The remaining 3913 compounds, which were not present in the TOPKAT training set, were used as the external validation set. QSAR models of five different types were developed for the modeling set. The prediction accuracy for the external validation set was estimated by the determination coefficient R2 of linear regression between actual and predicted LD50 values. The use of the applicability domain threshold implemented in most models generally improved the external prediction accuracy but expectedly led to the decrease in chemical space coverage; depending on the applicability domain threshold, R2 ranged from 0.24 to 0.70. Ultimately, several consensus models were developed by averaging the predicted LD50 for every compound using all five models. The consensus models afforded higher prediction accuracy for the external validation data set with the higher coverage as compared to individual constituent models. The validated consensus LD50 models developed in this study can be used as reliable computational predictors of in vivo acute toxicity.
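
    The consensus step, averaging per-compound predictions over whichever of the five models keep the compound inside their applicability domain, can be sketched as below. The predictions are random placeholders and NaN marks an out-of-domain compound; the point is that a compound only needs one in-domain model to stay covered, which is why consensus gains both accuracy and coverage.

        import numpy as np

        rng = np.random.default_rng(0)
        preds = rng.normal(2.5, 0.3, size=(5, 8))        # 5 models x 8 compounds (log LD50)
        preds[rng.random(preds.shape) < 0.2] = np.nan    # ~20% outside applicability domain

        consensus = np.nanmean(preds, axis=0)            # per-compound average, skipping NaN
        coverage = np.mean(~np.isnan(preds).all(axis=0)) # fraction of compounds covered
        print(np.round(consensus, 2), f"coverage = {coverage:.0%}")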

  14. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
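
    A sketch of the timing-based idea follows: fit a trend to the valve's opening times and extrapolate to a spec limit to get remaining life in cycles. The linear degradation, noise level, and 2.5 s limit are assumptions for illustration, not the testbed's physics model.

        import numpy as np

        # Synthetic opening times: a growing leak slows the valve a little each cycle.
        rng = np.random.default_rng(0)
        cycles = np.arange(1, 101)
        open_time = 1.5 + 0.004 * cycles + rng.normal(0, 0.02, 100)  # seconds

        slope, intercept = np.polyfit(cycles, open_time, 1)
        limit = 2.5                                  # opening-time spec limit (s), assumed
        eol_cycle = (limit - intercept) / slope      # cycle where the trend hits the limit
        print(f"RUL ~ {eol_cycle - cycles[-1]:.0f} cycles")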

  15. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  16. Cost Modeling for Space Optical Telescope Assemblies

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2011-01-01

    Parametric cost models are used to plan missions, compare concepts, and justify technology investments. This paper reviews an on-going effort to develop cost models for space telescopes. It summarizes the methodology used to develop the cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.

  17. Updating and prospective validation of a prognostic model for high sickness absence.

    PubMed

    Roelen, C A M; Heymans, M W; Twisk, J W R; van Rhenen, W; Pallesen, S; Bjorvatn, B; Moen, B E; Magerøy, N

    2015-01-01

    Objectives: To further develop and validate a Dutch prognostic model for high sickness absence (SA). Methods: Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by adding person-related (age, gender, marital status, children at home, and coping strategies), health-related (BMI, physical activity, smoking, and caffeine and alcohol intake), and work-related (job satisfaction, job demands, decision latitude, social support at work, and both work-to-family and family-to-work spillover) variables. The updated model was then prospectively validated for predictions at wave 3. Results: 1,557 (77%) nurses had complete data at wave 2 and 1,342 (65%) at wave 3. The risk of high SA was under-estimated by the Dutch model, but discrimination between high-risk and low-risk nurses was fair after re-calibration to the Norwegian data. Gender, marital status, BMI, physical activity, smoking, alcohol intake, job satisfaction, job demands, decision latitude, support at the workplace, and work-to-family spillover were identified as potential predictors of high SA. However, these predictors did not improve the model's discriminative ability, which remained fair at wave 3. Conclusions: The prognostic model correctly identifies 73% of Norwegian nurses at risk of high SA, although additional predictors are needed before the model can be used to screen working populations for risk of high SA.

  18. Are the Insomnia Severity Index and Pittsburgh Sleep Quality Index valid outcome measures for Cognitive Behavioral Therapy for Insomnia? Inquiry from the perspective of response shifts and longitudinal measurement invariance in their Chinese versions.

    PubMed

    Chen, Po-Yi; Jan, Ya-Wen; Yang, Chien-Ming

    2017-07-01

    The purpose of this study was to examine whether the Insomnia Severity Index (ISI) and Pittsburgh Sleep Quality Index (PSQI) are valid outcome measures for Cognitive Behavioral Therapy for Insomnia (CBT-I). Specifically, we tested whether the factorial parameters of the ISI and the PSQI could remain invariant against CBT-I, which is a prerequisite to using their change scores as an unbiased measure of the treatment outcome of CBT-I. A clinical data set including scores on the Chinese versions of the ISI and the PSQI obtained from 114 insomnia patients prior to and after a 6-week CBT-I program in Taiwan was analyzed. A series of measurement invariance (MI) tests were conducted to compare the factorial parameters of the ISI and the PSQI before and after the CBT-I treatment program. Most factorial parameters of the ISI remained invariant after CBT-I. However, the factorial model of the PSQI changed after CBT-I treatment. An extra loading with three residual correlations was added into the factorial model after treatment. The partial strong invariance of the ISI supports that it is a valid outcome measure for CBT-I. In contrast, various changes in the factor model of the PSQI indicate that it may not be an appropriate outcome measure for CBT-I. Some possible causes for the changes of the constructs of the PSQI following CBT-I are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Validation of SCIAMACHY and TOMS UV Radiances Using Ground and Space Observations

    NASA Technical Reports Server (NTRS)

    Hilsenrath, E.; Bhartia, P. K.; Bojkov, B. R.; Kowalewski, M.; Labow, G.; Ahmad, Z.

    2004-01-01

    Verification of stratospheric ozone recovery remains a high priority for environmental research and policy definition. Models predict an ozone recovery at a much lower rate than the measured depletion rate observed to date. Therefore, improved precision of the satellite and ground ozone observing systems is required over the long term to verify its recovery. We show that validation of satellite radiances from space and from the ground can be a very effective means for correcting long-term drifts of backscatter-type satellite measurements and can be used to cross-calibrate all BUV instruments in orbit (TOMS, SBUV/2, GOME, SCIAMACHY, OMI, GOME-2, OMPS). This method bypasses the retrieval algorithms that are normally used to validate and correct the satellite data for both satellite and ground based measurements. Radiance comparisons employ forward models and are inherently more accurate than inverse (retrieval) algorithms. This approach, however, requires well calibrated instruments and an accurate radiative transfer model that accounts for aerosols. TOMS and SCIAMACHY calibrations are checked to demonstrate this method and to demonstrate applicability for long-term trends.

  20. Exact Solution of the Markov Propagator for the Voter Model on the Complete Graph

    DTIC Science & Technology

    2014-07-01

    distribution of the random walk. This process can also be applied to other models, incomplete graphs, or to multiple dimensions. An advantage of this...since any multiple of an eigenvector remains an eigenvector. Without any loss, let b_k = 1. Now we can ascertain the explicit solution for b_j when k < j...this bound is valid for all initial probability distributions. However, without detailed information about the eigenvectors, we cannot extract more

  1. The Development of Valid Subtypes for Depression in Primary Care Settings

    PubMed Central

    Karasz, Alison

    2009-01-01

    A persistent theme in the debate on the classification of depressive disorders is the distinction between biological and environmental depressions. Despite decades of research, there remains little consensus on how to distinguish between depressive subtypes. This preliminary study describes a method that could be useful, if implemented on a larger scale, in the development of valid subtypes of depression in primary care settings, using explanatory models of depressive illness. Seventeen depressed Hispanic patients at an inner city general practice participated in explanatory model interviews. Participants generated illness narratives, which included details about symptoms, cause, course, impact, health seeking, and anticipated outcome. Two distinct subtypes emerged from the analysis. The internal model subtype was characterized by internal attributions, specifically the notion of an “injured self.” The external model subtype conceptualized depression as a reaction to life situations. Each subtype was associated with a distinct constellation of clinical features and health seeking experiences. Future directions for research using explanatory models to establish depressive subtypes are explored. PMID:18414123

  2. Design and validation of a comprehensive fecal incontinence questionnaire.

    PubMed

    Macmillan, Alexandra K; Merrie, Arend E H; Marshall, Roger J; Parry, Bryan R

    2008-10-01

    Fecal incontinence can have a profound effect on quality of life. Its prevalence remains uncertain because of stigma, lack of a consistent definition, and a dearth of validated measures. This study was designed to develop a valid clinical and epidemiologic questionnaire, building on current literature and expertise. Patients and experts undertook face validity testing. Construct validity, criterion validity, and test-retest reliability were then assessed. Construct validity comprised factor analysis and internal consistency of the quality of life scale. Known-groups validity was tested against 77 control subjects using regression models. Questionnaire results were compared with a stool diary for criterion validity. Test-retest reliability was calculated from repeated questionnaire completion. The questionnaire achieved good face validity. It was completed by 104 patients. The quality of life scale had four underlying traits (factor analysis) and high internal consistency (overall Cronbach alpha = 0.97). Patients and control subjects answered the questionnaire significantly differently (P < 0.01) in known-groups validity testing. Criterion validity assessment found mean differences close to zero. Median reliability for the whole questionnaire was 0.79 (range, 0.35-1). This questionnaire compares favorably with other available instruments, although the interpretation of stool consistency requires further research. Its sensitivity to treatment still needs to be investigated.

  3. A Stochastic Framework for Evaluating Seizure Prediction Algorithms Using Hidden Markov Models

    PubMed Central

    Wong, Stephen; Gardner, Andrew B.; Krieger, Abba M.; Litt, Brian

    2007-01-01

    Responsive, implantable stimulation devices to treat epilepsy are now in clinical trials. New evidence suggests that these devices may be more effective when they deliver therapy before seizure onset. Despite years of effort, prospective seizure prediction, which could improve device performance, remains elusive. In large part, this is explained by lack of agreement on a statistical framework for modeling seizure generation and a method for validating algorithm performance. We present a novel stochastic framework based on a three-state hidden Markov model (HMM) (representing interictal, preictal, and seizure states) with the feature that periods of increased seizure probability can transition back to the interictal state. This notion reflects clinical experience and may enhance interpretation of published seizure prediction studies. Our model accommodates clipped EEG segments and formalizes intuitive notions regarding statistical validation. We derive equations for type I and type II errors as a function of the number of seizures, duration of interictal data, and prediction horizon length, and we demonstrate the model's utility with a novel seizure detection algorithm that appeared to predict seizure onset. We propose this framework as a vital tool for designing and validating prediction algorithms and for facilitating collaborative research in this area. PMID:17021032
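
    The three-state chain with relapse from the preictal state back to interictal can be written down directly; the transition probabilities below are illustrative assumptions, not fitted values from the study.

        import numpy as np

        # States: 0 = interictal, 1 = preictal, 2 = seizure.
        # Key feature: preictal periods may fall back to interictal (row 1, col 0)
        # without producing a seizure, mirroring clinical experience.
        T = np.array([[0.995, 0.005, 0.000],
                      [0.200, 0.790, 0.010],
                      [0.500, 0.000, 0.500]])

        rng = np.random.default_rng(0)
        s, n_seizure_epochs = 0, 0
        for _ in range(100_000):             # e.g., one step per EEG epoch
            s = rng.choice(3, p=T[s])
            n_seizure_epochs += (s == 2)
        print("fraction of epochs in seizure state:", n_seizure_epochs / 100_000)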

  4. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

    An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of said nodes and tracks the state of at least one variable through a branch. When a variable of said branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
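
    The branch-pruning behavior can be illustrated with a small sketch, with node programming reduced to toy update functions and a simple tolerance predicate (both hypothetical stand-ins for the patent's programmable nodes):

        # Walk each branch, track a variable, and abandon the branch as invalid
        # the moment the variable leaves tolerance; collect the valid leaf paths.
        def solve(node, value, tolerance, path=()):
            value = node["update"](value)        # execute the node's programming
            if not tolerance(value):             # out of tolerance: prune branch
                return []
            path = path + (node["name"],)
            children = node.get("children", [])
            if not children:
                return [path]                    # leaf reached: valid solution
            valid = []
            for child in children:
                valid += solve(child, value, tolerance, path)
            return valid

        tree = {"name": "root", "update": lambda v: v,
                "children": [
                    {"name": "a", "update": lambda v: v + 3,
                     "children": [{"name": "a1", "update": lambda v: v + 3}]},
                    {"name": "b", "update": lambda v: v + 1,
                     "children": [{"name": "b1", "update": lambda v: v + 1}]}]}

        print(solve(tree, 0, lambda v: v <= 4))  # only branches staying <= 4 survive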

  5. Performance assessment of Large Eddy Simulation (LES) for modeling dispersion in an urban street canyon with tree planting

    NASA Astrophysics Data System (ADS)

    Moonen, P.; Gromke, C.; Dorer, V.

    2013-08-01

    The potential of a Large Eddy Simulation (LES) model to reliably predict near-field pollutant dispersion is assessed. To that end, detailed time-resolved numerical simulations of coupled flow and dispersion are conducted for a street canyon with tree planting. Different crown porosities are considered. The model performance is assessed in several steps, ranging from a qualitative comparison to measured concentrations, over statistical data analysis by means of scatter plots and box plots, up to the calculation of objective validation metrics. The extensive validation effort highlights and quantifies notable features and shortcomings of the model, which would otherwise remain unnoticed. The model performance is found to be spatially non-uniform. Closer agreement with measurement data is achieved near the canyon ends than for the central part of the canyon, and typical model acceptance criteria are satisfied more easily for the leeward than for the windward canyon wall. This demonstrates the need for rigorous model evaluation. Only quality-assured models can be used with confidence to support assessment, planning and implementation of pollutant mitigation strategies.

  6. Regulatory Disruption and Arbitrage in Health-Care Data Protection.

    PubMed

    Terry, Nicolas P

    This article explains how the structure of U.S. health-care data protection (specifically its sectoral and downstream properties) has led to a chronically uneven policy environment for different types of health-care data. It examines claims for health-care data protection exceptionalism and competing demands such as data liquidity. In conclusion, the article takes the position that health-care data exceptionalism remains a valid imperative and that even current concerns about data liquidity can be accommodated in an exceptional protective model. However, re-calibrating our protection of health-care data residing outside of the traditional health-care domain is challenging, currently even politically impossible. Notwithstanding, a hybrid model is envisioned, with the downstream HIPAA model remaining the dominant force within the health-care domain but supplemented by targeted upstream and point-of-use protections applying to health-care data in disrupted spaces.

  7. Construct validity evidence for the Male Role Norms Inventory-Short Form: A structural equation modeling approach using the bifactor model.

    PubMed

    Levant, Ronald F; Hall, Rosalie J; Weigold, Ingrid K; McCurdy, Eric R

    2016-10-01

    The construct validity of the Male Role Norms Inventory-Short Form (MRNI-SF) was assessed using a latent variable approach implemented with structural equation modeling (SEM). The MRNI-SF was specified as having a bifactor structure, and validation scales were also specified as latent variables. The latent variable approach had the advantages of separating effects of general and specific factors and controlling for some sources of measurement error. Data (N = 484) were from a diverse sample (38.8% men of color, 22.3% men of diverse sexualities) of community-dwelling and college men who responded to an online survey. The construct validity of the MRNI-SF General Traditional Masculinity Ideology factor was supported for all 4 of the proposed latent correlations with: (a) the Male Role Attitudes Scale; (b) the general factor of the Conformity to Masculine Norms Inventory-46; (c) the higher-order factor of the Gender Role Conflict Scale; and (d) the Personal Attributes Questionnaire-Masculinity Scale. Significant correlations with relevant other latent factors provided concurrent validity evidence for the MRNI-SF specific factors of Negativity toward Sexual Minorities, Importance of Sex, Restrictive Emotionality, and Toughness, with all 8 of the hypothesized relationships supported. However, 3 relationships concerning Dominance were not supported. (The construct validity of the remaining 2 MRNI-SF specific factors, Avoidance of Femininity and Self-Reliance through Mechanical Skills, was not assessed.) Comparisons were made, and meaningful differences noted, between the latent correlations emphasized in this study and their raw variable counterparts. Results are discussed in terms of the advantages of an SEM approach and the unique characteristics of the bifactor model. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Development of land use regression models for nitrogen dioxide, ultrafine particles, lung deposited surface area, and four other markers of particulate matter pollution in the Swiss SAPALDIA regions.

    PubMed

    Eeftens, Marloes; Meier, Reto; Schindler, Christian; Aguilera, Inmaculada; Phuleria, Harish; Ineichen, Alex; Davey, Mark; Ducret-Stich, Regina; Keidel, Dirk; Probst-Hensch, Nicole; Künzli, Nino; Tsai, Ming-Yi

    2016-04-18

    Land Use Regression (LUR) is a popular method to explain and predict spatial contrasts in air pollution concentrations, but LUR models for ultrafine particles, such as particle number concentration (PNC), are especially scarce. Moreover, no models have been previously presented for the lung deposited surface area (LDSA) of ultrafine particles. The additional value of ultrafine particle metrics has not been well investigated due to lack of exposure measurements and models. Air pollution measurements were performed in 2011 and 2012 in the eight areas of the Swiss SAPALDIA study at up to 40 sites per area for NO2 and at 20 sites in four areas for markers of particulate air pollution. We developed multi-area LUR models for biannual average concentrations of PM2.5, PM2.5 absorbance, PM10, PMcoarse, PNC and LDSA, as well as alpine, non-alpine and study-area specific models for NO2, using predictor variables which were available at a national level. Models were validated using leave-one-out cross-validation, as well as independent external validation with routine monitoring data. Model explained variance (R2) was moderate for the various PM mass fractions PM2.5 (0.57), PM10 (0.63) and PMcoarse (0.45), and was high for PM2.5 absorbance (0.81), PNC (0.87) and LDSA (0.91). Study-area specific LUR models for NO2 (R2 range 0.52-0.89) outperformed combined-area alpine (R2 = 0.53) and non-alpine (R2 = 0.65) models in terms of both cross-validation and independent external validation, and were better able to account for between-area variability. Predictor variables related to traffic and national dispersion model estimates were important predictors. LUR models for all pollutants captured spatial variability of long-term average concentrations, performed adequately in validation, and could be successfully applied to the SAPALDIA cohort. Dispersion model predictions or area indicators served well to capture the between-area variance. For NO2, applying study-area specific models was preferable over applying combined-area alpine/non-alpine models. Correlations between pollutants were higher in the model predictions than in the measurements, so it will remain challenging to disentangle their health effects.
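
    A minimal LUR sketch follows: regress measured NO2 on GIS-derived predictors and evaluate by leave-one-out cross-validation. The sites, the two predictors (a traffic indicator and a dispersion-model estimate), and the coefficients are synthetic stand-ins for the SAPALDIA variables.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(0)
        n_sites = 40
        traffic = rng.uniform(0, 1, n_sites)       # e.g., traffic load near the site
        dispersion = rng.uniform(0, 1, n_sites)    # national dispersion-model estimate
        no2 = 10 + 20 * traffic + 15 * dispersion + rng.normal(0, 3, n_sites)

        X = np.column_stack([traffic, dispersion])
        loo_pred = cross_val_predict(LinearRegression(), X, no2, cv=LeaveOneOut())

        # LOOCV R2: how well each site is predicted from all the others.
        ss_res = np.sum((no2 - loo_pred) ** 2)
        ss_tot = np.sum((no2 - no2.mean()) ** 2)
        print("LOOCV R2:", round(1 - ss_res / ss_tot, 2))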

  9. NLTE steady-state response matrix method.

    NASA Astrophysics Data System (ADS)

    Faussurier, G.; More, R. M.

    2000-05-01

    A connection between atomic kinetics and non-equilibrium thermodynamics has recently been established by using a collisional-radiative model modified to include line absorption. The calculated net emission can be expressed as a non-local thermodynamic equilibrium (NLTE) symmetric response matrix. In this paper, the connection is extended to both the average-atom model and Busquet's model (RAdiative-Dependent IOnization Model, RADIOM). The main properties of the response matrix still remain valid. The RADIOM source function found in the literature leads to a diagonal response matrix, stressing the absence of any frequency redistribution among the frequency groups at this order of calculation.

  10. Validity of Factors of the Psychopathy Checklist–Revised in Female Prisoners

    PubMed Central

    Kennealy, Patrick J.; Hicks, Brian M.; Patrick, Christopher J.

    2008-01-01

    The validity of the Psychopathy Checklist–Revised (PCL-R) has been examined extensively in men, but its validity for women remains understudied. Specifically, the correlates of the general construct of psychopathy and its components as assessed by PCL-R total, factor, and facet scores have yet to be examined in depth. Based on previous research conducted with male offenders, a large female inmate sample was used to examine the patterns of relations between total, factor, and facet scores on the PCL-R and various criterion variables. These variables include ratings of psychopathy based on Cleckley’s criteria, symptoms of antisocial personality disorder, and measures of substance use and abuse, criminal behavior, institutional misconduct, interpersonal aggression, normal range personality, intellectual functioning, and social background variables. Results were highly consistent with past findings in male samples and provide further evidence for the construct validity of the PCL-R two-factor and four-facet models across genders. PMID:17986651

  11. Predicting non-isometric fatigue induced by electrical stimulation pulse trains as a function of pulse duration

    PubMed Central

    2013-01-01

    Background: Our previous model of the non-isometric muscle fatigue that occurs during repetitive functional electrical stimulation included models of force, motion, and fatigue and accounted for applied load but not stimulation pulse duration. Our objectives were to: 1) further develop, 2) validate, and 3) present outcome measures for a non-isometric fatigue model that can predict the effect of a range of pulse durations on muscle fatigue. Methods: A computer-controlled stimulator sent electrical pulses to electrodes on the thighs of 25 able-bodied human subjects. Isometric and non-isometric non-fatiguing and fatiguing knee torques and/or angles were measured. Pulse duration (170–600 μs) was the independent variable. Measurements were divided into parameter identification and model validation subsets. Results: The fatigue model was simplified by removing two of three non-isometric parameters. The third remained a function of other model parameters. Between 66% and 77% of the variability in the angle measurements was explained by the new model. Conclusion: Muscle fatigue in response to different stimulation pulse durations can be predicted during non-isometric repetitive contractions. PMID:23374142

  12. Predictive 5-Year Survivorship Model of Cystic Fibrosis

    PubMed Central

    Liou, Theodore G.; Adler, Frederick R.; FitzSimmons, Stacey C.; Cahill, Barbara C.; Hibbs, Jonathan R.; Marshall, Bruce C.

    2007-01-01

    The objective of this study was to create a 5-year survivorship model to identify key clinical features of cystic fibrosis. Such a model could help researchers and clinicians to evaluate therapies, improve the design of prospective studies, monitor practice patterns, counsel individual patients, and determine the best candidates for lung transplantation. The authors used information from the Cystic Fibrosis Foundation Patient Registry (CFFPR), which has collected longitudinal data on approximately 90% of cystic fibrosis patients diagnosed in the United States since 1986. They developed multivariate logistic regression models by using data on 5,820 patients randomly selected from 11,630 in the CFFPR in 1993. Models were tested for goodness of fit and were validated for the remaining 5,810 patients for 1993. The validated 5-year survivorship model included age, forced expiratory volume in 1 second as a percentage of predicted normal, gender, weight-for-age z score, pancreatic sufficiency, diabetes mellitus, Staphylococcus aureus infection, Burkholderia cepacia infection, and annual number of acute pulmonary exacerbations. The model provides insights into the complex nature of cystic fibrosis and supplies a rigorous tool for clinical practice and research. PMID:11207152

  13. Proposal and validation of a new model to estimate survival for hepatocellular carcinoma patients.

    PubMed

    Liu, Po-Hong; Hsu, Chia-Yang; Hsia, Cheng-Yuan; Lee, Yun-Hsuan; Huang, Yi-Hsiang; Su, Chien-Wei; Lee, Fa-Yauh; Lin, Han-Chieh; Huo, Teh-Ia

    2016-08-01

    The survival of hepatocellular carcinoma (HCC) patients is heterogeneous. We aim to develop and validate a simple prognostic model to estimate survival for HCC patients (MESH score). A total of 3182 patients were randomised into derivation and validation cohorts. Multivariate analysis was used to identify independent predictors of survival in the derivation cohort. The validation cohort was employed to examine the prognostic capabilities. The MESH score allocated 1 point for each of the following parameters: large tumour (beyond Milan criteria), presence of vascular invasion or metastasis, Child-Turcotte-Pugh score ≥6, performance status ≥2, serum alpha-fetoprotein level ≥20 ng/ml, and serum alkaline phosphatase ≥200 IU/L, with a maximum of 6 points. In the validation cohort, significant survival differences were found across all MESH scores from 0 to 6 (all p < 0.01). The MESH system was associated with the highest homogeneity and lowest corrected Akaike information criterion compared with the Barcelona Clínic Liver Cancer, Hong Kong Liver Cancer (HKLC), Cancer of the Liver Italian Program, Taipei Integrated Scoring, and Model to Estimate Survival in Ambulatory HCC Patients systems. The prognostic accuracy of the MESH scores remained constant in patients with hepatitis B- or hepatitis C-related HCC. The MESH score can also discriminate survival for patients from early to advanced stages of HCC. This newly proposed simple and accurate survival model provides enhanced prognostic accuracy for HCC. The MESH system is a useful supplement to the BCLC and HKLC classification schemes in refining treatment strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.
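
    Because the score is a simple sum of six binary criteria, it can be written directly. The function below follows the cut-offs quoted above; the field names are illustrative, not a published API.

        # One point per adverse feature, 0-6 total, following the cut-offs above.
        def mesh_score(beyond_milan, vascular_invasion_or_mets, ctp_score,
                       performance_status, afp_ng_ml, alp_iu_l):
            return sum([
                beyond_milan,                   # tumour beyond Milan criteria
                vascular_invasion_or_mets,      # vascular invasion or metastasis
                ctp_score >= 6,                 # Child-Turcotte-Pugh score >= 6
                performance_status >= 2,
                afp_ng_ml >= 20,                # alpha-fetoprotein >= 20 ng/ml
                alp_iu_l >= 200,                # alkaline phosphatase >= 200 IU/L
            ])

        # Example: small tumour, no invasion, CTP 5, PS 0, AFP 35, ALP 150 -> 1 point
        print(mesh_score(False, False, 5, 0, 35, 150))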

  14. A Novel Model for Predicting Incident Moderate to Severe Anemia and Iron Deficiency in Patients with Newly Diagnosed Ulcerative Colitis.

    PubMed

    Khan, Nabeel; Patel, Dhruvan; Shah, Yash; Yang, Yu-Xiao

    2017-05-01

    Anemia and iron deficiency are common complications of ulcerative colitis (UC). We aimed to develop and internally validate a prediction model for the incidence of moderate to severe anemia and iron deficiency anemia (IDA) in newly diagnosed patients with UC. Multivariable logistic regression was performed among a nationwide cohort of patients who were newly diagnosed with UC in the VA health-care system. Model development was performed in a random two-thirds of the total cohort and then validated in the remaining one-third of the cohort. As candidate predictors, we examined routinely available data at the time of UC diagnosis, including demographics, medications, laboratory results, and endoscopy findings. A total of 789 patients met the inclusion criteria. For the outcome of moderate to severe anemia, age, albumin level, and mild anemia at UC diagnosis were the predictors selected for the model. The AUC for this model was 0.69 (95% CI 0.64-0.74). For the outcome of moderate to severe anemia with evidence of iron deficiency, the predictors included African-American ethnicity, mild anemia, age, and albumin level at UC diagnosis. The AUC was 0.76 (95% CI 0.69-0.82). Calibration was consistently good in all models (Hosmer-Lemeshow goodness-of-fit p > 0.05). The models performed similarly in the internal validation cohort. We developed and internally validated a prognostic model for predicting the risk of moderate to severe anemia and IDA among newly diagnosed patients with UC. This will help identify patients at high risk of these complications, who could benefit from surveillance and preventive measures.

  15. Safety assessment of immunomodulatory biologics: the promise and challenges of regulatory T-cell modulation.

    PubMed

    Ponce, Rafael A

    2011-01-01

    Regulatory T-cell (T(reg)) modulation is developing as an important therapeutic opportunity for the treatment of a number of important diseases, including cancer, autoimmunity, infection, and organ transplant rejection. However, as demonstrated with IL-2 and TGN-1412, our understanding of the complex immunological interactions that occur with T(reg) modulation in both non-clinical models and in patients remains limited and appears highly contextual. This lack of understanding will challenge our ability to identify the patient population who might derive the highest benefit from T(reg) modulation and creates special challenges as we transition these therapeutics from non-clinical models into humans. Thus, in vivo testing in the most representative animal model systems, with careful progress in the clinic, will remain critical in developing therapeutics targeting T(reg) and understanding their clinical utility. Moreover, toxicology models can inform some of the potential liabilities associated with T(reg) modulation, but not all, suggesting a continued need to explore and validate predictive models.

  16. Bayesian Framework Approach for Prognostic Studies in Electrolytic Capacitor under Thermal Overstress Conditions

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Goebel, Kai; Biswas, Gautam

    2012-01-01

    Electrolytic capacitors are used in several applications ranging from power supplies for safety-critical avionics equipment to power drivers for electro-mechanical actuators. Past experience shows that capacitors tend to degrade and fail faster when subjected to high electrical or thermal stress conditions during operation. This makes them good candidates for prognostics and health management. Model-based prognostics captures system knowledge in the form of physics-based models of components in order to obtain accurate predictions of end of life based on their current state of health and their anticipated future use and operational conditions. The focus of this paper is on deriving first-principles degradation models for thermal stress conditions and implementing a Bayesian framework for making remaining useful life predictions. Data collected from simultaneous experiments are used to validate the models. Our overall goal is to derive accurate models of capacitor degradation and use them to predict remaining useful life in DC-DC converters.

  17. Model selection for the North American Breeding Bird Survey: A comparison of methods

    USGS Publications Warehouse

    Link, William; Sauer, John; Niven, Daniel

    2017-01-01

    The North American Breeding Bird Survey (BBS) provides data for >420 bird species at multiple geographic scales over 5 decades. Modern computational methods have facilitated the fitting of complex hierarchical models to these data. It is easy to propose and fit new models, but little attention has been given to model selection. Here, we discuss and illustrate model selection using leave-one-out cross validation, and the Bayesian Predictive Information Criterion (BPIC). Cross-validation is enormously computationally intensive; we thus evaluate the performance of the Watanabe-Akaike Information Criterion (WAIC) as a computationally efficient approximation to the BPIC. Our evaluation is based on analyses of 4 models as applied to 20 species covered by the BBS. Model selection based on BPIC provided no strong evidence of one model being consistently superior to the others; for 14/20 species, none of the models emerged as superior. For the remaining 6 species, a first-difference model of population trajectory was always among the best fitting. Our results show that WAIC is not reliable as a surrogate for BPIC. Development of appropriate model sets and their evaluation using BPIC is an important innovation for the analysis of BBS data.

  18. Emotion modelling towards affective pathogenesis.

    PubMed

    Bas, James Le

    2009-12-01

    Objective: There is a need in psychiatry for models that integrate pathological states with normal systems. The interaction of arousal and emotion is the focus of an exploration of affective pathogenesis. Method: Given that the explicit causes of affective disorder remain nascent, methods of linking emotion and disorder are evaluated. Results: A network model of emotional families is presented, in which emotions exist as quantal gradients. Morbid emotional states are seen as the activation of distal emotion sites. The phenomenology of affective disorders is described with reference to this model. Recourse is made to non-linear dynamic theory. Conclusions: Metaphoric emotion models have face validity and may prove a useful heuristic.

  19. Monogenic Mouse Models of Autism Spectrum Disorders: Common Mechanisms and Missing Links

    PubMed Central

    Hulbert, Samuel W.; Jiang, Yong-hui

    2016-01-01

    Autism Spectrum Disorders (ASDs) present unique challenges in the fields of genetics and neurobiology because of the clinical and molecular heterogeneity underlying these disorders. Genetic mutations found in ASD patients provide opportunities to dissect the molecular and circuit mechanisms underlying autistic behaviors using animal models. Ongoing studies of genetically modified models have offered critical insight into possible common mechanisms arising from different mutations, but links between molecular abnormalities and behavioral phenotypes remain elusive. The challenges encountered in modeling autism in mice demand a new analytic paradigm that integrates behavioral analysis with circuit-level analysis in genetically modified models with strong construct validity. PMID:26733386

  20. A square-force cohesion model and its extraction from bulk measurements

    NASA Astrophysics Data System (ADS)

    Liu, Peiyuan; Lamarche, Casey; Kellogg, Kevin; Hrenya, Christine

    2017-11-01

    Cohesive particles remain poorly understood, with order-of-magnitude differences exhibited among prior physical predictions of agglomerate size. A major obstacle lies in the absence of robust models of particle-particle cohesion, thereby precluding accurate prediction of the behavior of cohesive particles. Rigorous cohesion models commonly contain parameters related to surface roughness, to which cohesion shows extreme sensitivity. However, both roughness measurement and its distillation into these model parameters are challenging. Accordingly, we propose a ``square-force'' model, where cohesive force remains constant until a cut-off separation. Via DEM simulations, we demonstrate the validity of the square-force model as a surrogate for more rigorous models, when its two parameters are selected to match the two key quantities governing dense and dilute granular flows, namely maximum cohesive force and critical cohesive energy, respectively. Perhaps more importantly, we establish a method to extract the parameters of the square-force model via defluidization, due to its ability to isolate the effects of the two parameters. Thus, instead of relying on complicated scans of individual grains, determination of particle-particle cohesion from simple bulk measurements becomes feasible. Dow Corning Corporation.

  1. Can we predict 4-year graduation in podiatric medical school using admission data?

    PubMed

    Sesodia, Sanjay; Molnar, David; Shaw, Graham P

    2012-01-01

    This study examined the predictive ability of educational background and demographic variables, available at the admission stage, to identify applicants who will graduate in 4 years from podiatric medical school. A logistic regression model was used to identify two predictors of 4-year graduation: age at matriculation and total Medical College Admission Test score. The model was cross-validated using a second independent sample from the same population. Cross-validation gives greater confidence that the results could be more generally applied. Total Medical College Admission Test score was the strongest predictor of 4-year graduation, with age at matriculation being a statistically significant but weaker predictor. Despite the model's capacity to predict 4-year graduation better than random assignment, a substantial amount of prediction error remained, suggesting that important predictors are missing from the model. Furthermore, the high rate of false positives makes it inappropriate to use age and Medical College Admission Test score as admission screens in an attempt to eliminate attrition by not accepting at-risk students.
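
    A hedged sketch of a two-predictor logistic model of this kind with cross-validation, on synthetic admissions data with invented effect sizes (scikit-learn; nothing here reproduces the study's actual coefficients).

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n = 400
      X = np.column_stack([rng.normal(24, 3, n),       # age at matriculation
                           rng.normal(25, 4, n)])      # total MCAT score
      logit = -4 + 0.15 * X[:, 1] - 0.05 * X[:, 0]     # assumed effect directions
      y = rng.random(n) < 1 / (1 + np.exp(-logit))     # 1 = graduated in 4 years

      # cross-validated accuracy of the two-predictor logistic model
      print("5-fold accuracy:", cross_val_score(LogisticRegression(), X, y, cv=5).mean())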

  2. Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction

    DOE PAGES

    Mach, J. C.; Budrow, C. J.; Pagan, D. C.; ...

    2017-03-15

    Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present paper, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. Finally, the experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.

  3. The implementation and validation of improved landsurface hydrology in an atmospheric general circulation model

    NASA Technical Reports Server (NTRS)

    Johnson, Kevin D.; Entekhabi, Dara; Eagleson, Peter S.

    1991-01-01

    Landsurface hydrological parameterizations are implemented in the NASA Goddard Institute for Space Studies (GISS) General Circulation Model (GCM). These parameterizations are: (1) runoff and evapotranspiration functions that include the effects of subgrid scale spatial variability and use physically based equations of hydrologic flux at the soil surface, and (2) a realistic soil moisture diffusion scheme for the movement of water in the soil column. A one-dimensional climate model with a complete hydrologic cycle is used to screen the basic sensitivities of the hydrological parameterizations before implementation into the full three-dimensional GCM. Results of the final simulation with the GISS GCM and the new landsurface hydrology indicate that the runoff rate, especially in the tropics, is significantly improved. As a result, the remaining components of the heat and moisture balance show comparable improvements when compared to observations. The validation of model results is carried out from the large global (ocean and landsurface) scale to the zonal, continental, and finally the finer river basin scales.

  4. Validating a Model for Welding Induced Residual Stress Using High-Energy X-ray Diffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mach, J. C.; Budrow, C. J.; Pagan, D. C.

    Integrated computational materials engineering (ICME) provides a pathway to advance performance in structures through the use of physically-based models to better understand how manufacturing processes influence product performance. As one particular challenge, consider that residual stresses induced in fabrication are pervasive and directly impact the life of structures. For ICME to be an effective strategy, it is essential that predictive capability be developed in conjunction with critical experiments. In the present paper, simulation results from a multi-physics model for gas metal arc welding are evaluated through x-ray diffraction using synchrotron radiation. A test component was designed with intent to develop significant gradients in residual stress, be representative of real-world engineering application, yet remain tractable for finely spaced strain measurements with positioning equipment available at synchrotron facilities. Finally, the experimental validation lends confidence to model predictions, facilitating the explicit consideration of residual stress distribution in prediction of fatigue life.

  5. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to the essential variables for gravity wave generation. We use the Weather Research and Forecasting (WRF) model to study the relationships of precipitation, latent heating/cooling and other cloud properties to the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for the use of these models in gravity wave parameterization development, since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional circulation and provides a new method for future development of realistic convective gravity wave parameterizations.

  6. Dutch population specific sex estimation formulae using the proximal femur.

    PubMed

    Colman, K L; Janssen, M C L; Stull, K E; van Rijn, R R; Oostra, R J; de Boer, H H; van der Merwe, A E

    2018-05-01

    Sex estimation techniques are frequently applied in forensic anthropological analyses of unidentified human skeletal remains. While morphological sex estimation methods are relatively robust to population differences, the classification accuracy of metric sex estimation methods is population-specific. No metric sex estimation method currently exists for the Dutch population. The purpose of this study is to create Dutch population specific sex estimation formulae by means of osteometric analyses of the proximal femur. Since the Netherlands lacks a representative contemporary skeletal reference population, 2D plane reconstructions, derived from clinical computed tomography (CT) data, were used as an alternative source for a representative reference sample. The first part of this study assesses the intra- and inter-observer error, or reliability, of twelve measurements of the proximal femur. The technical error of measurement (TEM) and relative TEM (%TEM) were calculated using 26 dry adult femora. In addition, the agreement, or accuracy, between the dry bone and CT-based measurements was determined by percent agreement. Only reliable and accurate measurements were retained for the logistic regression sex estimation formulae; a training set (n=86) was used to create the models while an independent testing set (n=28) was used to validate them. Due to high levels of multicollinearity, only single-variable models were created. Cross-validated classification accuracies ranged from 86% to 92%. The high cross-validated classification accuracies indicate that the developed formulae can contribute to the biological profile, and specifically to sex estimation, of unidentified human skeletal remains in the Netherlands. Furthermore, the results indicate that clinical CT data can be a valuable alternative source of data when representative skeletal collections are unavailable. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Validation of a Full-Immersion Simulation Platform for Percutaneous Nephrolithotomy Using Three-Dimensional Printing Technology.

    PubMed

    Ghazi, Ahmed; Campbell, Timothy; Melnyk, Rachel; Feng, Changyong; Andrusco, Alex; Stone, Jonathan; Erturk, Erdal

    2017-12-01

    The restriction of resident hours with an increasing focus on patient safety and a reduced caseload has impacted surgical training. A complex and complication prone procedure such as percutaneous nephrolithotomy (PCNL) with a steep learning curve may create an unsafe environment for hands-on resident training. In this study, we validate a high fidelity, inanimate PCNL model within a full-immersion simulation environment. Anatomically correct models of the human pelvicaliceal system, kidney, and relevant adjacent structures were created using polyvinyl alcohol hydrogels and three-dimensional-printed injection molds. All steps of a PCNL were simulated including percutaneous renal access, nephroscopy, and lithotripsy. Five experts (>100 caseload) and 10 novices (<20 caseload) from both urology (full procedure) and interventional radiology (access only) departments completed the simulation. Face and content validity were calculated using model ratings for similarity to the real procedure and usefulness as a training tool. Differences in performance among groups with various levels of experience using clinically relevant procedural metrics were used to calculate construct validity. The model was determined to have an excellent face and content validity with an average score of 4.5/5.0 and 4.6/5.0, respectively. There were significant differences between novice and expert operative metrics including mean fluoroscopy time, the number of percutaneous access attempts, and number of times the needle was repositioned. Experts achieved better stone clearance with fewer procedural complications. We demonstrated the face, content, and construct validity of an inanimate, full task trainer for PCNL. Construct validity between experts and novices was demonstrated using incorporated procedural metrics, which permitted the accurate assessment of performance. While hands-on training under supervision remains an integral part of any residency, this full-immersion simulation provides a comprehensive tool for surgical skills development and evaluation before hands-on exposure.

  8. Physics-based distributed snow models in the operational arena: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Winstral, A. H.; Jonas, T.; Schirmer, M.; Helbig, N.

    2017-12-01

    The demand for modeling tools robust to climate change and weather extremes, along with coincident increases in computational capabilities, has led to an increase in the use of physics-based snow models in operational applications. Current operational applications include the WSL-SLF's system across Switzerland, the ASO's in California, and the USDA-ARS's in Idaho. While the physics-based approaches offer many advantages, there remain limitations and modeling challenges. The most evident limitation remains computation times that often limit forecasters to a single, deterministic model run. Other limitations, however, remain less conspicuous amidst the assumption that these models require little to no calibration because of their foundation on physical principles. Yet all energy balance snow models seemingly contain parameterizations or simplifications of processes where validation data are scarce or present understanding is limited. At the research-basin scale where many of these models were developed, these modeling elements may prove adequate. However, when applied over large areas, spatially invariable parameterizations of snow albedo, roughness lengths and atmospheric exchange coefficients - all vital to determining the snowcover energy balance - become problematic. Moreover, as we apply models over larger grid cells, the representation of sub-grid variability such as the snow-covered fraction adds to the challenges. Here, we will demonstrate some of the major sensitivities of distributed energy balance snow models to particular model constructs, the need for advanced and spatially flexible methods and parameterizations, and prompt the community for open dialogue and future collaborations to further modeling capabilities.

  9. CFD Modeling of Superheated Fuel Sprays

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    2008-01-01

    An understanding of fuel atomization and vaporization behavior at superheat conditions is identified as a topic of importance in the design of modern supersonic engines. As a part of the NASA aeronautics initiative, we have undertaken an assessment study to establish the baseline accuracy of existing CFD models used in the evaluation of a flashing jet. In a first attempt towards attaining this goal, we have incorporated an existing superheat vaporization model into our spray solution procedure, with some improvements to combine the existing models valid at superheated conditions with the models valid at stable (non-superheat) evaporating conditions. The paper also reports some validation results based on experimental data obtained from the literature for a superheated spray generated by the sudden release of pressurized R134A from a cylindrical nozzle. The predicted profiles for both gas and droplet velocities show reasonable agreement with the measured data and exhibit a self-similar pattern similar to the correlation reported in the literature. Because of the uncertainty involved in the specification of the initial conditions, we have investigated the effect of the initial droplet size distribution on the validation results. The predicted results were found to be sensitive to the initial conditions used for the droplet size specification. However, it was shown that decent droplet size comparisons could be achieved with properly selected initial conditions. For the case considered, it is reasonable to assume that the present vaporization models are capable of providing a reasonable qualitative description of the two-phase jet characteristics generated by a flashing jet. However, there remains some uncertainty with regard to the specification of certain initial spray conditions, and there is a need for experimental data on separate gas and liquid temperatures in order to validate the vaporization models based on the Adachi correlation for a liquid involving R134A.

  10. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
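
    A toy illustration of the combination described: simulated annealing over a parameter, with each candidate scored by a statistical-model-checking step that estimates, from repeated stochastic simulations, the probability that a behavioural property holds. A plain Monte Carlo estimate stands in for the paper's sequential hypothesis testing, and the decay model, property and target probability are all invented.

      import numpy as np

      rng = np.random.default_rng(2)

      def satisfies(theta, n_runs=200):
          # Statistical model checking stand-in: Monte Carlo estimate of the
          # probability that a noisy exponential decay ends below 0.5 at t = 10.
          x_end = np.exp(-theta * 10) + rng.normal(0, 0.05, n_runs)
          return (x_end < 0.5).mean()

      target = 0.9                        # probability implied by the observed facts
      theta, cur_err, temp = 0.01, abs(satisfies(0.01) - target), 1.0
      for _ in range(300):                # simulated annealing over the parameter
          cand = abs(theta + rng.normal(0, 0.02))
          err = abs(satisfies(cand) - target)
          if err < cur_err or rng.random() < np.exp(-(err - cur_err) / temp):
              theta, cur_err = cand, err
          temp *= 0.99
      print("estimated parameter:", round(theta, 4), "residual:", round(cur_err, 3))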

  11. The psychometric properties of a shortened Dutch version of the consequences scale used in the Core Alcohol and Drug Survey.

    PubMed

    De Bruyn, Sara; Wouters, Edwin; Ponnet, Koen; Van Damme, Joris; Van Hal, Guido

    2017-01-01

    Alcohol and drug misuse among college students has been studied extensively and has been clearly identified as a public health problem. In the general population, alcohol misuse remains one of the leading causes of disease, disability and death worldwide. Conducting research on alcohol misuse requires valid and reliable instruments to measure its consequences. One scale that is often used is the consequences scale in the Core Alcohol and Drug Survey (CADS). However, psychometric studies on the CADS are rare, and the ones that do exist report varying results. This article aims to address this imbalance by examining the psychometric properties of a Dutch version of the CADS in a large sample of Flemish university and college students. The analyses are based on data collected by the inter-university project 'Head in the clouds', measuring alcohol use among students. In total, 19,253 students participated (22.1% response rate). The CADS scale was measured using 19 consequences, and participants were asked how often they had experienced these on a 6-point scale. Firstly, the factor structure of the CADS was examined. Two models from the literature were compared by performing confirmatory factor analyses (CFA) and were adapted if necessary. Secondly, we assessed the composite reliability as well as the convergent, discriminant and concurrent validity. The two-factor model, identifying personal consequences (had a hangover; got nauseated or vomited; missed a class) and social consequences (got into an argument or fight; been criticized by someone I know; done something I later regretted; been hurt or injured) was indicated to be the best model, having both a good model fit and an acceptable composite reliability. In addition, construct validity was evaluated to be acceptable, with good discriminant validity, although the convergent validity of the factor measuring 'social consequences' could be improved. Concurrent validity was evaluated as good. In deciding which model best represents the data, it is crucial that not only the model fit is evaluated, but the importance of factor reliability and validity issues is also taken into account. The two-factor model, identifying personal consequences and social consequences, was concluded to be the best model. This shortened Dutch version of the CADS (CADS_D) is a useful tool to screen alcohol-related consequences among college students.
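
    As a small aside on the reliability figure used here, composite reliability can be computed directly from standardized CFA loadings under the usual uncorrelated-errors assumption; the loadings below are invented, not the study's estimates.

      import numpy as np

      def composite_reliability(loadings):
          # CR = (sum l)^2 / ((sum l)^2 + sum(1 - l^2)) for standardized
          # loadings, assuming uncorrelated measurement errors
          l = np.asarray(loadings, dtype=float)
          num = l.sum() ** 2
          return num / (num + (1 - l ** 2).sum())

      # illustrative loadings for a 'personal consequences' factor (made up)
      print(round(composite_reliability([0.72, 0.65, 0.58]), 3))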

  12. Nonlinear dynamics of planetary gears using analytical and finite element models

    NASA Astrophysics Data System (ADS)

    Ambarisha, Vijaya Kumar; Parker, Robert G.

    2007-05-01

    Vibration-induced gear noise and dynamic loads remain key concerns in many transmission applications that use planetary gears. Tooth separations at large vibrations introduce nonlinearity in geared systems. The present work examines the complex, nonlinear dynamic behavior of spur planetary gears using two models: (i) a lumped-parameter model, and (ii) a finite element model. The two-dimensional (2D) lumped-parameter model represents the gears as lumped inertias, the gear meshes as nonlinear springs with tooth contact loss and periodically varying stiffness due to changing tooth contact conditions, and the supports as linear springs. The 2D finite element model is developed from a unique finite element-contact analysis solver specialized for gear dynamics. Mesh stiffness variation excitation, corner contact, and gear tooth contact loss are all intrinsically considered in the finite element analysis. The dynamics of planetary gears show a rich spectrum of nonlinear phenomena. Nonlinear jumps, chaotic motions, and period-doubling bifurcations occur when the mesh frequency or any of its higher harmonics are near a natural frequency of the system. Responses from the dynamic analysis using analytical and finite element models are successfully compared qualitatively and quantitatively. These comparisons validate the effectiveness of the lumped-parameter model to simulate the dynamics of planetary gears. Mesh phasing rules to suppress rotational and translational vibrations in planetary gears are valid even when nonlinearity from tooth contact loss occurs. These mesh phasing rules, however, are not valid in the chaotic and period-doubling regions.

  13. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states in the field and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method, which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM model was validated using both k-fold cross-validation and emergent patterns validation and was tested for two different scenarios, with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system, and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  14. Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling

    PubMed Central

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states in the field and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method, which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM model was validated using both k-fold cross-validation and emergent patterns validation and was tested for two different scenarios, with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system, and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047

  15. Visualization of the variability of 3D statistical shape models by animation.

    PubMed

    Lamecker, Hans; Seebass, Martin; Lange, Thomas; Hege, Hans-Christian; Deuflhard, Peter

    2004-01-01

    Models of the 3D shape of anatomical objects and knowledge about their statistical variability are of great benefit in many computer-assisted medical applications like image analysis, therapy or surgery planning. Statistical shape models have successfully been applied to automate the task of image segmentation. The generation of 3D statistical shape models requires the identification of corresponding points on two shapes. This remains a difficult problem, especially for shapes of complicated topology. In order to interpret and validate variations encoded in a statistical shape model, visual inspection is of great importance. This work describes the generation and interpretation of statistical shape models of the liver and the pelvic bone.

  16. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial share of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both to anticipate future pest problems under a changing climate and to develop effective integrated management plans. Here we present Tiko'n, Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models against field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.

  17. Worldwide multi-model intercomparison of clear-sky solar irradiance predictions

    NASA Astrophysics Data System (ADS)

    Ruiz-Arias, Jose A.; Gueymard, Christian A.; Cebecauer, Tomas

    2017-06-01

    Accurate modeling of solar radiation in the absence of clouds is highly important because solar power production peaks during cloud-free situations. The conventional validation approach for clear-sky solar radiation models relies on the comparison between model predictions and ground observations. Therefore, this approach is limited to locations with availability of high-quality ground observations, which are scarce worldwide. As a consequence, many areas of interest for, e.g., solar energy development still remain sub-validated. Here, a worldwide inter-comparison of the global horizontal irradiance (GHI) and direct normal irradiance (DNI) calculated by a number of appropriate clear-sky solar radiation models is proposed, without direct intervention of any weather or solar radiation ground-based observations. The model inputs are all gathered from atmospheric reanalyses covering the globe. The model predictions are compared to each other and only their relative disagreements are quantified. The largest differences between model predictions are found over central and northern Africa, the Middle East, and all over Asia. This coincides with areas of high aerosol optical depth and highly varying aerosol size distribution. Overall, the differences in modeled DNI are found to be about twice as large as for GHI. It is argued that the prevailing weather regimes (most importantly, aerosol conditions) over regions exhibiting substantial divergences are not adequately parameterized by all models. Further validation and scrutiny using conventional methods based on ground observations should be pursued in priority over those specific regions to correctly evaluate the performance of clear-sky models, and to select those that can be recommended for solar concentrating applications in particular.
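
    A minimal sketch of the kind of ground-truth-free intercomparison described, quantifying only the relative disagreement between model predictions; the GHI values are toy numbers for three hypothetical models, not the study's.

      import numpy as np

      def relative_disagreement(preds):
          # Pairwise relative RMS difference (%) between model outputs; no
          # ground truth enters, matching the intercomparison setup.
          p = np.asarray(preds, dtype=float)
          m = p.shape[0]
          d = np.zeros((m, m))
          for i in range(m):
              for j in range(i + 1, m):
                  rel = (p[i] - p[j]) / (0.5 * (p[i] + p[j]))
                  d[i, j] = d[j, i] = 100 * np.sqrt(np.mean(rel ** 2))
          return d

      # toy clear-sky GHI predictions from three hypothetical models (W/m^2)
      ghi = [[850, 900, 780, 950], [830, 920, 800, 940], [870, 880, 760, 970]]
      print(relative_disagreement(ghi).round(1))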

  18. The Singularity Mystery Associated with a Radially Continuous Maxwell Viscoelastic Structure

    NASA Technical Reports Server (NTRS)

    Fang, Ming; Hager, Bradford H.

    1995-01-01

    The singularity problem associated with a radially continuous Maxwell viscoelastic structure is investigated. A special tool called the isolation function is developed. Results calculated using the isolation function show that the discrete model assumption is no longer valid when the viscoelastic parameter becomes a continuous function of radius. Continuous variations in the upper mantle viscoelastic parameter are especially powerful in destroying the mode-like structures. The contribution of the singularities to the load Love numbers is sensitive to the convexity of the viscoelastic parameter models. The difference between the vertical response and the horizontal response found in layered viscoelastic parameter models remains with continuous models.

  19. Developing a dengue forecast model using machine learning: A case study in China.

    PubMed

    Guo, Pi; Liu, Tao; Zhang, Qin; Wang, Li; Xiao, Jianpeng; Zhang, Qingying; Luo, Ganfeng; Li, Zhihao; He, Jianfeng; Zhang, Yonghui; Ma, Wenjun

    2017-10-01

    In China, dengue remains an important public health issue with expanded areas and increased incidence recently. Accurate and timely forecasts of dengue incidence in China are still lacking. We aimed to use the state-of-the-art machine learning algorithms to develop an accurate predictive model of dengue. Weekly dengue cases, Baidu search queries and climate factors (mean temperature, relative humidity and rainfall) during 2011-2014 in Guangdong were gathered. A dengue search index was constructed for developing the predictive models in combination with climate factors. The observed year and week were also included in the models to control for the long-term trend and seasonality. Several machine learning algorithms, including the support vector regression (SVR) algorithm, step-down linear regression model, gradient boosted regression tree algorithm (GBM), negative binomial regression model (NBM), least absolute shrinkage and selection operator (LASSO) linear regression model and generalized additive model (GAM), were used as candidate models to predict dengue incidence. Performance and goodness of fit of the models were assessed using the root-mean-square error (RMSE) and R-squared measures. The residuals of the models were examined using the autocorrelation and partial autocorrelation function analyses to check the validity of the models. The models were further validated using dengue surveillance data from five other provinces. The epidemics during the last 12 weeks and the peak of the 2014 large outbreak were accurately forecasted by the SVR model selected by a cross-validation technique. Moreover, the SVR model had the consistently smallest prediction error rates for tracking the dynamics of dengue and forecasting the outbreaks in other areas in China. The proposed SVR model achieved a superior performance in comparison with other forecasting techniques assessed in this study. The findings can help the government and community respond early to dengue epidemics.
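
    A hedged sketch of an SVR forecasting pipeline of this general shape, with features standing in for the search index, climate variables, year and week, and hyperparameters tuned by cross-validation; data and settings are synthetic placeholders.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import GridSearchCV
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(3)
      X = rng.random((200, 6))      # search index, temp, humidity, rain, year, week
      y = 50 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 2, 200)  # toy weekly cases

      train, test = slice(0, 160), slice(160, 200)   # keep test weeks out of tuning
      svr = GridSearchCV(SVR(kernel="rbf"),
                         {"C": [1, 10, 100], "gamma": ["scale", 0.1]}, cv=5)
      svr.fit(X[train], y[train])
      rmse = mean_squared_error(y[test], svr.predict(X[test])) ** 0.5
      print("held-out RMSE:", round(float(rmse), 2))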

  20. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long-range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of program and schedule constraints, a single modal test of the SLS will be performed while it is bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from the modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UP, UQ, and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules of thumb"; this research seeks to arrive at more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so that same uncertainty can be used in propagating from the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation metric will provide a quantified confidence and probability of success for the final SLS dynamics model, which will be critical for a successful launch program, and can be applied in the many other industries where an accurate dynamic model is required.
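
    A stripped-down illustration of the limit-state idea, not specific to the SLS: propagate an assumed dispersion in a predicted modal frequency by Monte Carlo sampling and report the probability of violating a hypothetical plus/minus 5% frequency requirement. All numbers are invented.

      import numpy as np

      rng = np.random.default_rng(4)
      f_nominal = 2.8                                   # Hz, made-up mode frequency
      f_model = rng.normal(f_nominal, 0.06, 100_000)    # dispersed model predictions
      lo, hi = 0.95 * f_nominal, 1.05 * f_nominal       # hypothetical requirement band

      # limit-state g < 0 means the requirement is violated
      g = np.minimum(f_model - lo, hi - f_model)
      print("P(requirement violated):", (g < 0).mean())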

  1. [Vis-NIR spectroscopic pattern recognition combined with SG smoothing applied to breed screening of transgenic sugarcane].

    PubMed

    Liu, Gui-Song; Guo, Hao-Song; Pan, Tao; Wang, Ji-Hua; Cao, Gan

    2014-10-01

    Based on Savitzky-Golay (SG) smoothing screening, principal component analysis (PCA) combined with separately supervised linear discriminant analysis (LDA) and unsupervised hierarchical clustering analysis (HCA) was used for non-destructive visible and near-infrared (Vis-NIR) detection for breed screening of transgenic sugarcane. A random and stability-dependent framework of calibration, prediction, and validation was proposed. A total of 456 samples of sugarcane leaves in the elongating stage were collected from the field, composed of 306 transgenic (positive) samples containing the Bt and Bar genes and 150 non-transgenic (negative) samples. A total of 156 samples (50 negative and 106 positive) were randomly selected as the validation set; the remaining samples (100 negative and 200 positive, 300 in total) were used as the modeling set, and the modeling set was then subdivided into calibration (50 negative and 100 positive, 150 samples) and prediction sets (50 negative and 100 positive, 150 samples) 50 times. The number of SG smoothing points was expanded, while some higher-derivative modes were removed because of their small absolute values, leaving a total of 264 smoothing modes for screening. The pairwise combinations of the first three principal components were used, and the optimal combination of principal components was selected according to model performance. Based on all divisions of calibration and prediction sets and all SG smoothing modes, the SG-PCA-LDA and SG-PCA-HCA models were established, and the model parameters were optimized based on the average prediction effect over all divisions to ensure modeling stability. Finally, model validation was performed on the validation set. With SG smoothing, the modeling accuracy and stability of PCA-LDA and PCA-HCA were significantly improved. For the optimal SG-PCA-LDA model, the recognition rates of positive and negative validation samples were 94.3% and 96.0%; the corresponding rates were 92.5% and 98.0% for the optimal SG-PCA-HCA model. Vis-NIR spectroscopic pattern recognition combined with SG smoothing could be used for accurate recognition of transgenic sugarcane leaves, and provides a convenient screening method for transgenic sugarcane breeding.
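
    A compact sketch of the SG-PCA-LDA chain on synthetic spectra: Savitzky-Golay smoothing, PCA compression, then LDA classification scored by cross-validation. Window length, polynomial order and data are placeholder choices, not the paper's optimized settings.

      import numpy as np
      from scipy.signal import savgol_filter
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import FunctionTransformer

      rng = np.random.default_rng(5)                 # toy Vis-NIR spectra
      y = rng.integers(0, 2, 300)                    # transgenic vs non-transgenic
      X = rng.normal(0, 1, (300, 600)) + y[:, None] * np.linspace(0, 0.5, 600)

      sg = FunctionTransformer(
          lambda s: savgol_filter(s, window_length=11, polyorder=2, axis=1))
      model = make_pipeline(sg, PCA(n_components=3), LinearDiscriminantAnalysis())
      print("CV recognition rate:", cross_val_score(model, X, y, cv=5).mean())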

  2. Laboratory parameter-based machine learning model for excluding non-alcoholic fatty liver disease (NAFLD) in the general population.

    PubMed

    Yip, T C-F; Ma, A J; Wong, V W-S; Tse, Y-K; Chan, H L-Y; Yuen, P-C; Wong, G L-H

    2017-08-01

    Non-alcoholic fatty liver disease (NAFLD) affects 20%-40% of the general population in developed countries and is an increasingly important cause of hepatocellular carcinoma. Electronic medical records facilitate large-scale epidemiological studies, but existing NAFLD scores often require clinical and anthropometric parameters that may not be captured in those databases. We aimed to develop and validate a laboratory parameter-based machine learning model to detect NAFLD in the general population. We randomly divided 922 subjects from a population screening study into training and validation groups; NAFLD was diagnosed by proton-magnetic resonance spectroscopy. On the basis of machine learning from 23 routine clinical and laboratory parameters after elastic net regularisation, we evaluated the logistic regression, ridge regression, AdaBoost and decision tree models. The areas under the receiver-operating characteristic curve (AUROC) of the models in the validation group were compared. Six predictors including alanine aminotransferase, high-density lipoprotein cholesterol, triglyceride, haemoglobin A1c, white blood cell count and the presence of hypertension were selected. The NAFLD ridge score achieved AUROCs of 0.87 (95% CI 0.83-0.90) and 0.88 (0.84-0.91) in the training and validation groups respectively. Using dual cut-offs of 0.24 and 0.44, the NAFLD ridge score achieved 92% (86%-96%) sensitivity and 90% (86%-93%) specificity with corresponding negative and positive predictive values of 96% (91%-98%) and 69% (59%-78%), and 87% overall accuracy among the 70% of classifiable subjects in the validation group; 30% of subjects remained indeterminate. The NAFLD ridge score is a simple and robust reference comparable to existing NAFLD scores for excluding NAFLD patients in epidemiological studies. © 2017 John Wiley & Sons Ltd.
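
    A minimal sketch of the dual cut-off logic on a ridge-penalized logistic score, reusing the abstract's 0.24/0.44 cut-offs but with entirely synthetic predictors and labels.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      X = rng.normal(size=(922, 6))         # six routine laboratory predictors
      lp = X @ np.array([1.0, -0.8, 0.9, 0.7, 0.3, 0.5])
      y = rng.random(922) < 1 / (1 + np.exp(-lp))

      model = LogisticRegression(penalty="l2", C=1.0).fit(X, y)  # L2 = ridge penalty
      p = model.predict_proba(X)[:, 1]
      label = np.where(p < 0.24, "excluded",
                       np.where(p > 0.44, "flagged", "indeterminate"))
      print({k: int((label == k).sum()) for k in np.unique(label)})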

  3. Impact of correlation of predictors on discrimination of risk models in development and external populations.

    PubMed

    Kundu, Suman; Mazumdar, Madhu; Ferket, Bart

    2017-04-19

    The area under the ROC curve (AUC) of risk models is known to be influenced by differences in case-mix and effect size of predictors. The impact of heterogeneity in correlation among predictors has, however, been underinvestigated. We sought to evaluate how correlation among predictors affects the AUC in development and external populations. We simulated hypothetical populations using two different methods based on the means, standard deviations, and correlation of two continuous predictors. In the first approach, the distribution and correlation of predictors were assumed for the total population. In the second approach, these parameters were modeled conditional on disease status. In both approaches, multivariable logistic regression models were fitted to predict disease risk in individuals. Each risk model developed in a population was validated in the remaining populations to investigate external validity. For both approaches, we observed that the magnitude of the AUC in the development and external populations depends on the correlation among predictors. Lower AUCs were estimated in scenarios of both strong positive and negative correlation, depending on the direction of predictor effects and the simulation method. However, when adjusted effect sizes of predictors were specified in opposite directions, increasingly negative correlation consistently improved the AUC. AUCs in external validation populations were higher or lower than in the derivation cohort, even in the presence of similar predictor effects. Discrimination of risk prediction models should be assessed in various external populations with different correlation structures to make better inferences about model generalizability.
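
    The core simulation idea is easy to reproduce in miniature: draw two predictors with a chosen correlation, generate a logistic outcome, and observe how the AUC of the true linear predictor moves with the correlation. Effect sizes and sample size below are arbitrary.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      def auc_under_correlation(rho, beta=(1.0, 1.0), n=50_000, seed=7):
          # two standard-normal predictors with correlation rho, logistic outcome
          rng = np.random.default_rng(seed)
          X = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
          lp = X @ np.asarray(beta)
          y = rng.random(n) < 1 / (1 + np.exp(-lp))
          return roc_auc_score(y, lp)   # AUC of the true linear predictor

      for rho in (-0.6, 0.0, 0.6):
          print(rho, round(auc_under_correlation(rho), 3))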

  4. Bridging the gap between computation and clinical biology: validation of cable theory in humans

    PubMed Central

    Finlay, Malcolm C.; Xu, Lei; Taggart, Peter; Hanson, Ben; Lambiase, Pier D.

    2013-01-01

    Introduction: Computerized simulations of cardiac activity have significantly contributed to our understanding of cardiac electrophysiology, but techniques for simulations based on patient-acquired data remain in their infancy. We sought to integrate data acquired from human electrophysiological studies into patient-specific models, and validated this approach by testing whether electrophysiological responses to sequential premature stimuli could be predicted in a quantitatively accurate manner. Methods: Eleven patients with structurally normal hearts underwent electrophysiological studies. Semi-automated analysis was used to reconstruct activation and repolarization dynamics for each electrode. These S2 extrastimulus data were used to inform individualized models of cardiac conduction, including a novel derivation of conduction velocity restitution. Activation dynamics of multiple premature extrastimuli were then predicted from this model and compared against measured patient data as well as data derived from the ten-Tusscher cell-ionic model. Results: Activation dynamics following a premature S3 were significantly different from those after an S2. Patient-specific models demonstrated accurate prediction of the S3 activation wave (Pearson's R2 = 0.90, median error 4%). Examination of the modeled conduction dynamics allowed inferences into the spatial dispersion of activation delay. Further validation was performed against data from the ten-Tusscher cell-ionic model, with our model accurately recapitulating predictions of repolarization times (R2 = 0.99). Conclusions: Simulations based on clinically acquired data can be used to successfully predict complex activation patterns following sequential extrastimuli. Such modeling techniques may be useful as a method of incorporating clinical data into predictive models. PMID:24027527

  5. Development of Macroscale Models of UO 2 Fuel Sintering and Densification using Multiscale Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenquist, Ian; Tonks, Michael

    2016-10-01

    Light water reactor fuel pellets are fabricated using sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO2 in the MARMOT code, and validate it by comparing to published sintering data. We will then add the capability to capture irradiation effects into the model, and use it to develop a mechanistic model of densification that will go into the BISON code and add another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO2. The results of the phase field model will be validated by comparing to data from field-assisted sintering. Tasks over three years: 1. Develop a sintering model for UO2 in MARMOT 2. Expand the model to account for irradiation effects 3. Develop a mechanistic macroscale model of densification for BISON

  6. GIS-aided Statistical Landslide Susceptibility Modeling And Mapping Of Antipolo Rizal (Philippines)

    NASA Astrophysics Data System (ADS)

    Dumlao, A. J.; Victor, J. A.

    2015-09-01

    Slope instability associated with heavy rainfall or earthquakes is a familiar geotechnical problem in the Philippines. The main objective of this study is to perform a detailed landslide susceptibility assessment of Antipolo City. The statistical method of assessment used was logistic regression. Landslide inventory was done through interpretation of aerial photographs and satellite images with corresponding field verification. In this study, morphologic and non-morphologic factors contributing to landslide occurrence and their corresponding spatial relationships were considered. The analysis of landslide susceptibility was implemented in a Geographic Information System (GIS). The 17,320 randomly selected datasets were divided into training and test data sets. K-fold cross-validation was done with k = 5: the model is fitted five times, each time on k-1 folds, with the remaining fold serving as the validation data set, and the AUROC of each model is computed on its corresponding validation set. The AUROCs of the five models are 0.978, 0.977, 0.977, 0.974, and 0.979, respectively, implying that the models are effective in correctly predicting the occurrence and non-occurrence of landslide activity. Field verification was also done. The landslide susceptibility map was then generated from the model. It is classified into four categories: low, moderate, high and very high susceptibility. The study also shows that almost 40% of Antipolo City has been assessed to be potentially dangerous in terms of landslide occurrence.

  7. Quantitative studies on structure-DPPH• scavenging activity relationships of food phenolic acids.

    PubMed

    Jing, Pu; Zhao, Shu-Juan; Jian, Wen-Jie; Qian, Bing-Jun; Dong, Ying; Pang, Jie

    2012-11-01

    Phenolic acids are potent antioxidants, yet the quantitative structure-activity relationships of phenolic acids remain unclear. The purpose of this study was to establish 3D-QSAR models able to predict phenolic acids with high DPPH• scavenging activity and to understand their structure-activity relationships. The model was established using a training set of compounds with cross-validated q2 = 0.638/0.855, non-cross-validated r2 = 0.984/0.986, standard error of estimate = 0.236/0.216, and F = 139.126/208.320 for the best CoMFA/CoMSIA models. The predictive ability of the models was validated with the correlation coefficient r2(pred) = 0.971/0.996 (>0.6) for each model. Additionally, the contour map results suggested that structural characteristics of phenolic acids favorable for high DPPH• scavenging activity might include: (1) bulky and/or electron-donating substituent groups on the phenol ring; (2) electron-donating groups at the meta-position and/or hydrophobic groups at the meta-/ortho-position; (3) hydrogen-bond donor/electron-donating groups at the ortho-position. The results have been confirmed based on structural analyses of phenolic acids and their DPPH• scavenging data from eight recent publications. The findings may provide deeper insight into the antioxidant mechanisms and provide useful information for selecting phenolic acids for free radical scavenging properties.
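
    For reference, the cross-validated q2 reported for CoMFA/CoMSIA models is a leave-one-out statistic. A generic version follows, with PLS regression standing in for the 3D-QSAR field analysis and random descriptors in place of real ones.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneOut

      def q2_loo(X, y, n_components=2):
          # leave-one-out q2 = 1 - PRESS / total sum of squares
          press = 0.0
          for train, test in LeaveOneOut().split(X):
              pls = PLSRegression(n_components=n_components).fit(X[train], y[train])
              press += (y[test][0] - pls.predict(X[test]).ravel()[0]) ** 2
          return 1 - press / ((y - y.mean()) ** 2).sum()

      rng = np.random.default_rng(8)
      X = rng.normal(size=(30, 10))                   # toy descriptor matrix
      y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.3, 30)
      print("q2:", round(q2_loo(X, y), 3))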

  8. A biomarker-based risk score to predict death in patients with atrial fibrillation: the ABC (age, biomarkers, clinical history) death risk score

    PubMed Central

    Hijazi, Ziad; Oldgren, Jonas; Lindbäck, Johan; Alexander, John H; Connolly, Stuart J; Eikelboom, John W; Ezekowitz, Michael D; Held, Claes; Hylek, Elaine M; Lopes, Renato D; Yusuf, Salim; Granger, Christopher B; Siegbahn, Agneta; Wallentin, Lars

    2018-01-01

    Abstract Aims In atrial fibrillation (AF), mortality remains high despite effective anticoagulation. A model predicting the risk of death in these patients is currently not available. We developed and validated a risk score for death in anticoagulated patients with AF including both clinical information and biomarkers. Methods and results The new risk score was developed and internally validated in 14 611 patients with AF randomized to apixaban vs. warfarin for a median of 1.9 years. External validation was performed in 8548 patients with AF randomized to dabigatran vs. warfarin for 2.0 years. Biomarker samples were obtained at study entry. Variables significantly contributing to the prediction of all-cause mortality were assessed by Cox-regression. Each variable obtained a weight proportional to the model coefficients. There were 1047 all-cause deaths in the derivation and 594 in the validation cohort. The most important predictors of death were N-terminal pro B-type natriuretic peptide, troponin-T, growth differentiation factor-15, age, and heart failure, and these were included in the ABC (Age, Biomarkers, Clinical history)-death risk score. The score was well-calibrated and yielded higher c-indices than a model based on all clinical variables in both the derivation (0.74 vs. 0.68) and validation cohorts (0.74 vs. 0.67). The reduction in mortality with apixaban was most pronounced in patients with a high ABC-death score. Conclusion A new biomarker-based score for predicting risk of death in anticoagulated AF patients was developed, internally and externally validated, and well-calibrated in two large cohorts. The ABC-death risk score performed well and may contribute to overall risk assessment in AF. ClinicalTrials.gov identifier NCT00412984 and NCT00262600 PMID:29069359

  9. The influence of preburial insect access on the decomposition rate.

    PubMed

    Bachmann, Jutta; Simmons, Tal

    2010-07-01

    This study compared total body score (TBS) in buried remains (35 cm depth) with and without insect access prior to burial. Sixty rabbit carcasses were exhumed at 50 accumulated degree day (ADD) intervals. Weight loss, TBS, intra-abdominal decomposition, carcass/soil interface temperature, and below-carcass soil pH were recorded and analyzed. Results showed significant differences (p < 0.001) in decomposition rates between carcasses with and without insect access prior to burial. An approximately 30% enhanced decomposition rate with insects was observed. TBS was the most valid tool in postmortem interval (PMI) estimation. All other variables showed only weak relationships to decomposition stages, adding little value to PMI estimation. Although progress in estimating the PMI for surface remains has been made, no previous studies have accomplished this for buried remains. This study builds a framework to which further comparable studies can contribute, to produce predictive models for PMI estimation in buried human remains.
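
    A rough sketch of how TBS and accumulated degree days could feed a PMI estimate: fit ADD against TBS, then convert through an assumed mean daily temperature. The data points and temperature are invented, not the study's.

      import numpy as np

      # invented exhumation data: total body score and accumulated degree days
      tbs = np.array([5.0, 8.0, 11.0, 14.0, 17.0, 20.0])
      add = np.array([50.0, 100.0, 150.0, 200.0, 250.0, 300.0])

      slope, intercept = np.polyfit(tbs, add, 1)       # ADD as a function of TBS
      mean_daily_temp = 12.0                           # assumed site temperature (C)
      add_est = slope * 13 + intercept                 # new remains scoring TBS = 13
      print("estimated PMI (days):", round(add_est / mean_daily_temp, 1))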

  10. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I + II + III supernatant in human albumin separation

    NASA Astrophysics Data System (ADS)

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-01

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I + II + III (FI + II + III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure the product's quality. First, a qualitative model was established by using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately by using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP) and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L and 5.47 for HA, respectively. The results showed that the established models could give a rapid and accurate measurement of the content of TP and HA. The results of this study indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI + II + III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale by using NIRS.
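
    A hedged sketch of the PLS-R workflow with the RMSEP and RPD figures of merit quoted here, on toy spectra; the component count and data are placeholders, not the calibrated model.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(9)
      X = rng.normal(size=(120, 400))                         # toy NIR spectra
      y = 2 * X[:, 50] + X[:, 200] + rng.normal(0, 0.1, 120)  # reference TP (g/L)

      train, test = slice(0, 90), slice(90, 120)
      pls = PLSRegression(n_components=5).fit(X[train], y[train])
      resid = y[test] - pls.predict(X[test]).ravel()
      rmsep = float(np.sqrt(np.mean(resid ** 2)))
      rpd = float(y[test].std(ddof=1)) / rmsep       # ratio of performance to deviation
      print(f"RMSEP = {rmsep:.3f} g/L, RPD = {rpd:.2f}")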

  11. Near infrared spectroscopy combined with multivariate analysis for monitoring the ethanol precipitation process of fraction I+II+III supernatant in human albumin separation.

    PubMed

    Li, Can; Wang, Fei; Zang, Lixuan; Zang, Hengchang; Alcalà, Manel; Nie, Lei; Wang, Mingyu; Li, Lian

    2017-03-15

    Nowadays, as a powerful process analytical tool, near infrared spectroscopy (NIRS) has been widely applied in process monitoring. In the present work, NIRS combined with multivariate analysis was used to monitor the ethanol precipitation process of fraction I+II+III (FI+II+III) supernatant in human albumin (HA) separation, to achieve qualitative and quantitative monitoring at the same time and assure product quality. First, a qualitative model was established using principal component analysis (PCA) with samples from 6 of 8 normal batches, and evaluated with the remaining 2 normal batches and 3 abnormal batches. The results showed that the first principal component (PC1) score chart could be successfully used for fault detection and diagnosis. Then, two quantitative models were built with 6 of 8 normal batches to determine the content of total protein (TP) and HA separately using a partial least squares regression (PLS-R) strategy, and the models were validated with the 2 remaining normal batches. The determination coefficient of validation (Rp2), root mean square error of cross validation (RMSECV), root mean square error of prediction (RMSEP), and ratio of performance deviation (RPD) were 0.975, 0.501 g/L, 0.465 g/L, and 5.57 for TP, and 0.969, 0.530 g/L, 0.341 g/L, and 5.47 for HA, respectively. The results showed that the established models could give a rapid and accurate measurement of the content of TP and HA. The results of this study indicated that NIRS is an effective tool and could be successfully used for simultaneous qualitative and quantitative monitoring of the ethanol precipitation process of FI+II+III supernatant. This research has significant reference value for assuring the quality and improving the recovery ratio of HA at industrial scale using NIRS. Copyright © 2016 Elsevier B.V. All rights reserved.
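
    A minimal sketch of the quantitative step described in both records above: fit a PLS regression of analyte concentration on NIR spectra and report RMSEP and RPD on a held-out set. The spectra, concentrations, and the three-component setting are synthetic assumptions, not the study's calibration.

    ```python
    # PLS regression of a concentration on synthetic "NIR spectra",
    # evaluated with RMSEP and RPD on held-out batches.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    n_cal, n_val, n_wavelengths = 120, 40, 200

    def make_batch(n):
        conc = rng.uniform(10, 40, n)  # e.g. TP in g/L (synthetic)
        basis = np.sin(np.linspace(0, 3 * np.pi, n_wavelengths))
        spectra = conc[:, None] * basis + rng.normal(0, 0.5, (n, n_wavelengths))
        return spectra, conc

    X_cal, y_cal = make_batch(n_cal)  # calibration, "6 of 8 normal batches"
    X_val, y_val = make_batch(n_val)  # validation, remaining batches

    pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
    y_pred = pls.predict(X_val).ravel()

    rmsep = np.sqrt(np.mean((y_val - y_pred) ** 2))
    rpd = np.std(y_val, ddof=1) / rmsep  # ratio of performance to deviation
    print(f"RMSEP = {rmsep:.3f} g/L, RPD = {rpd:.2f}")
    ```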

  12. Examining the Relations Among the DSM-5 Alternative Model of Personality, the Five-Factor Model, and Externalizing and Internalizing Behavior.

    PubMed

    Sleep, Chelsea E; Hyatt, Courtland S; Lamkin, Joanna; Maples-Keller, Jessica L; Miller, Joshua D

    2017-01-26

    Given long-standing criticisms of the DSM's reliance on categorical models of psychopathology, including the poor reliability and validity of personality-disorder diagnoses, the American Psychiatric Association (APA) published an alternative model (AM) of personality disorders in Section III of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; APA, 2013), which, in part, comprises 5 pathological trait domains based on the 5-factor model (FFM). However, the empirical profiles and discriminant validity of the AM traits remain in question. We recruited a sample of undergraduates (N = 340) for the current study to compare the relations found between a measure of the DSM-5 AM traits (i.e., the Personality Inventory for DSM-5; PID-5; Krueger, Derringer, Markon, Watson, & Skodol, 2012) and a measure of the FFM (i.e., the International Personality Item Pool; IPIP; Goldberg, 1999) in relation to externalizing and internalizing symptoms. In general, the domains from the 2 measures were significantly related and demonstrated similar patterns of relations with these criteria, such that Antagonism/low Agreeableness and Disinhibition/low Conscientiousness were related to externalizing behaviors, whereas Negative Affectivity/Neuroticism was most significantly related to internalizing symptoms. However, the PID-5 demonstrated large interrelations among its domains and poorer discriminant validity than the IPIP. These results provide additional support that the conception of the trait model included in the DSM-5 AM is an extension of the FFM, but highlight some of the issues that arise due to the PID-5's more limited discriminant validity. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Is questionnaire-based sitting time inaccurate and can it be improved? A cross-sectional investigation using accelerometer-based sitting time

    PubMed Central

    Gupta, Nidhi; Christiansen, Caroline Stordal; Hanisch, Christiana; Bay, Hans; Burr, Hermann; Holtermann, Andreas

    2017-01-01

    Objectives To investigate the differences between questionnaire-based and accelerometer-based sitting time, and to develop a model for improving the accuracy of questionnaire-based sitting time for predicting accelerometer-based sitting time. Methods 183 workers in a cross-sectional study reported sitting time per day using a single question during the measurement period, and wore 2 Actigraph GT3X+ accelerometers on the thigh and trunk for 1–4 working days to determine their actual sitting time per day using the validated Acti4 software. Least squares regression models were fitted with questionnaire-based sitting time and other self-reported predictors to predict accelerometer-based sitting time. Results Questionnaire-based and accelerometer-based average sitting times were ≈272 and ≈476 min/day, respectively. A low Pearson correlation (r=0.32), high mean bias (204.1 min) and wide limits of agreement (549.8 to −139.7 min) between questionnaire-based and accelerometer-based sitting time were found. The prediction model based on questionnaire-based sitting time explained 10% of the variance in accelerometer-based sitting time. Inclusion of 9 self-reported predictors in the model increased the explained variance to 41%, with 10% optimism using a resampling bootstrap validation. Based on a split validation analysis, the prediction model developed on ≈75% of the workers (n=132) reduced the mean and the SD of the difference between questionnaire-based and accelerometer-based sitting time by 64% and 42%, respectively, in the remaining 25% of the workers. Conclusions This study indicates that questionnaire-based sitting time has low validity and that a prediction model can be one solution to materially improve the precision of questionnaire-based sitting time. PMID:28093433
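
    The correction-model idea generalizes readily: regress the accelerometer-based criterion on the questionnaire response plus additional self-reported predictors, then estimate the optimism of the apparent R² by bootstrap, as in the sketch below. All data are synthetic and the predictor set and effect sizes are hypothetical.

    ```python
    # Sketch: predict accelerometer-based sitting time from self-reports,
    # with a bootstrap estimate of R^2 optimism. Data are synthetic.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    n = 183
    questionnaire = rng.normal(272, 90, n)   # min/day, self-reported
    extra = rng.normal(size=(n, 3))          # other self-reported predictors
    accel = (200 + 0.5 * questionnaire
             + extra @ np.array([30.0, 20.0, 10.0])
             + rng.normal(0, 80, n))

    X = np.column_stack([questionnaire, extra])
    model = LinearRegression().fit(X, accel)
    apparent_r2 = model.score(X, accel)

    optimism = []
    for _ in range(200):
        idx = rng.integers(0, n, n)          # bootstrap resample
        m = LinearRegression().fit(X[idx], accel[idx])
        optimism.append(m.score(X[idx], accel[idx]) - m.score(X, accel))
    print(f"apparent R^2 = {apparent_r2:.2f}, "
          f"optimism-corrected ≈ {apparent_r2 - np.mean(optimism):.2f}")
    ```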

  14. Modeling Major Adverse Outcomes of Pediatric and Adult Patients With Congenital Heart Disease Undergoing Cardiac Catheterization: Observations From the NCDR IMPACT Registry (National Cardiovascular Data Registry Improving Pediatric and Adult Congenital Treatment).

    PubMed

    Jayaram, Natalie; Spertus, John A; Kennedy, Kevin F; Vincent, Robert; Martin, Gerard R; Curtis, Jeptha P; Nykanen, David; Moore, Phillip M; Bergersen, Lisa

    2017-11-21

    Risk standardization for adverse events after congenital cardiac catheterization is needed to equitably compare patient outcomes among different hospitals as a foundation for quality improvement. The goal of this project was to develop a risk-standardization methodology to adjust for patient characteristics when comparing major adverse outcomes in the NCDR's (National Cardiovascular Data Registry) IMPACT Registry (Improving Pediatric and Adult Congenital Treatment). Between January 2011 and March 2014, 39 725 consecutive patients within IMPACT undergoing cardiac catheterization were identified. Given the heterogeneity of interventional procedures for congenital heart disease, new procedure-type risk categories were derived with empirical data and expert opinion, as were markers of hemodynamic vulnerability. A multivariable hierarchical logistic regression model to identify patient and procedural characteristics predictive of a major adverse event or death after cardiac catheterization was derived in 70% of the cohort and validated in the remaining 30%. The rate of major adverse event or death was 7.1% and 7.2% in the derivation and validation cohorts, respectively. Six procedure-type risk categories and 6 independent indicators of hemodynamic vulnerability were identified. The final risk adjustment model included procedure-type risk category, number of hemodynamic vulnerability indicators, renal insufficiency, single-ventricle physiology, and coagulation disorder. The model had good discrimination, with a C-statistic of 0.76 and 0.75 in the derivation and validation cohorts, respectively. Model calibration in the validation cohort was excellent, with a slope of 0.97 (standard error, 0.04; P value [for difference from 1] = 0.53) and an intercept of 0.007 (standard error, 0.12; P value [for difference from 0] = 0.95). The creation of a validated risk-standardization model for adverse outcomes after congenital cardiac catheterization can support reporting of risk-adjusted outcomes in the IMPACT Registry as a foundation for quality improvement. © 2017 American Heart Association, Inc.
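
    The calibration slope and intercept quoted above can be estimated by regressing observed outcomes on the logit of the predicted risk. A minimal sketch follows, with simulated, perfectly calibrated predictions standing in for the registry model.

    ```python
    # Calibration slope/intercept: logistic regression of outcomes on the
    # logit of predicted risk. Slope near 1 and intercept near 0 indicate
    # good calibration. Predictions here are simulated.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 5000
    p_pred = rng.uniform(0.01, 0.30, n)  # model-predicted event risk
    y = rng.random(n) < p_pred           # outcomes consistent with predictions

    logit = np.log(p_pred / (1 - p_pred))
    fit = sm.Logit(y.astype(float), sm.add_constant(logit)).fit(disp=0)
    intercept, slope = fit.params
    print(f"calibration intercept = {intercept:.3f}, slope = {slope:.3f}")
    ```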

  15. A Supervised Learning Process to Validate Online Disease Reports for Use in Predictive Models.

    PubMed

    Patching, Helena M M; Hudson, Laurence M; Cooke, Warrick; Garcia, Andres J; Hay, Simon I; Roberts, Mark; Moyes, Catherine L

    2015-12-01

    Pathogen distribution models that predict spatial variation in disease occurrence require data from a large number of geographic locations to generate disease risk maps. Traditionally, this process has used data from public health reporting systems; however, using online reports of new infections could speed up the process dramatically. Data from both public health systems and online sources must be validated before they can be used, but no mechanisms exist to validate data from online media reports. We have developed a supervised learning process to validate geolocated disease outbreak data in a timely manner. The process uses three input features: the data source and two metrics derived from the location of each disease occurrence. These metrics capture the probability of disease occurrence at that location, based on environmental and socioeconomic factors, and the distance within or outside the current known disease extent. The process also uses validation scores, generated by disease experts who review a subset of the data, to build a training data set. The aim of the supervised learning process is to generate validation scores that can be used as weights going into the pathogen distribution model. After analyzing the three input features and testing the performance of alternative processes, we selected a cascade of ensembles comprising logistic regressors. Parameter values for the training data subset size, number of predictors, and number of layers in the cascade were tested before the process was deployed. The final configuration was tested using data for two contrasting diseases (dengue and cholera), and 66%-79% of data points were assigned a validation score. The remaining data points are scored by the experts, and the results inform the training data set for the next set of predictors, as well as going to the pathogen distribution model. The new supervised learning process has been implemented within our live site and is being used to validate the data that our system uses to produce updated predictive disease maps on a weekly basis.
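
    A schematic of the cascade idea, under assumed feature definitions and confidence thresholds: each layer scores the records it is confident about and passes the remainder downstream, with the final residue going to expert review. This is a simplified stand-in (a single classifier reused with decreasing thresholds), not the deployed system of retrained per-layer ensembles.

    ```python
    # Cascade-of-logistic-regressors schematic: confident points are scored,
    # unsure points fall through to the next layer, residue goes to experts.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    n = 2000
    # Three illustrative features standing in for: data-source reliability,
    # environmental/socioeconomic suitability, distance to the known extent.
    X = rng.normal(size=(n, 3))
    y = (X @ np.array([1.2, 0.8, -1.0]) + rng.normal(0, 1, n)) > 0  # expert labels

    X_train, y_train, X_new = X[:500], y[:500], X[500:]
    clf = LogisticRegression().fit(X_train, y_train)

    remaining = X_new
    for layer, conf in enumerate([0.9, 0.8, 0.7], start=1):
        if len(remaining) == 0:
            break
        proba = clf.predict_proba(remaining)[:, 1]
        confident = (proba >= conf) | (proba <= 1 - conf)
        print(f"layer {layer}: scored {confident.sum()} of {len(remaining)} points")
        remaining = remaining[~confident]  # unsure points cascade onward
    print(f"{len(remaining)} points left for expert review")
    ```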

  16. Validation of Models Used to Inform Colorectal Cancer Screening Guidelines: Accuracy and Implications.

    PubMed

    Rutter, Carolyn M; Knudsen, Amy B; Marsh, Tracey L; Doria-Rose, V Paul; Johnson, Eric; Pabiniak, Chester; Kuntz, Karen M; van Ballegooijen, Marjolein; Zauber, Ann G; Lansdorp-Vogelaar, Iris

    2016-07-01

    Microsimulation models synthesize evidence about disease processes and interventions, providing a method for predicting long-term benefits and harms of prevention, screening, and treatment strategies. Because models often require assumptions about unobservable processes, assessing a model's predictive accuracy is important. We validated 3 colorectal cancer (CRC) microsimulation models against outcomes from the United Kingdom Flexible Sigmoidoscopy Screening (UKFSS) Trial, a randomized controlled trial that examined the effectiveness of one-time flexible sigmoidoscopy screening to reduce CRC mortality. The models incorporate different assumptions about the time from adenoma initiation to development of preclinical and symptomatic CRC. Analyses compare model predictions to study estimates across a range of outcomes to provide insight into the accuracy of model assumptions. All 3 models accurately predicted the relative reduction in CRC mortality 10 years after screening (predicted hazard ratios, with 95% percentile intervals: 0.56 [0.44, 0.71], 0.63 [0.51, 0.75], 0.68 [0.53, 0.83]; estimated with 95% confidence interval: 0.56 [0.45, 0.69]). Two models with longer average preclinical duration accurately predicted the relative reduction in 10-year CRC incidence. Two models with longer mean sojourn time accurately predicted the number of screen-detected cancers. All 3 models predicted too many proximal adenomas among patients referred to colonoscopy. Model accuracy can only be established through external validation. Analyses such as these are therefore essential for any decision model. Results supported the assumptions that the average time from adenoma initiation to development of preclinical cancer is long (up to 25 years), and mean sojourn time is close to 4 years, suggesting the window for early detection and intervention by screening is relatively long. Variation in dwell time remains uncertain and could have important clinical and policy implications. © The Author(s) 2016.

  17. Model invariance across genders of the Broad Autism Phenotype Questionnaire.

    PubMed

    Broderick, Neill; Wade, Jordan L; Meyer, J Patrick; Hull, Michael; Reeve, Ronald E

    2015-10-01

    ASD is one of the most heritable neuropsychiatric disorders, though a comprehensive account of its genetic liability remains elusive. To facilitate genetic research, researchers employ the concept of the broad autism phenotype (BAP), a milder presentation of traits in undiagnosed relatives. Research suggests that the BAP Questionnaire (BAPQ) demonstrates psychometric properties superior to other self-report measures. To examine evidence regarding the validity of the BAPQ, the current study used confirmatory factor analysis to test the assumption of model invariance across genders. Results of the current study upheld model invariance at each level of parameter constraint; however, model fit indices suggested limited goodness-of-fit between the proposed model and the sample. Exploratory analyses investigated alternative factor structure models but ultimately supported the proposed three-factor structure model.

  18. VALIDATION OF THE CORONAL THICK TARGET SOURCE MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleishman, Gregory D.; Xu, Yan; Nita, Gelu N.

    2016-01-10

    We present detailed 3D modeling of a dense, coronal thick-target X-ray flare using the GX Simulator tool, photospheric magnetic measurements, and microwave imaging and spectroscopy data. The developed model offers a remarkable agreement between the synthesized and observed spectra and images in both X-ray and microwave domains, which validates the entire model. The flaring loop parameters are chosen to reproduce the emission measure, temperature, and the nonthermal electron distribution at low energies derived from the X-ray spectral fit, while the remaining parameters, unconstrained by the X-ray data, are selected so as to match the microwave images and total power spectra. The modeling suggests that the accelerated electrons are trapped in the coronal part of the flaring loop, but away from where the magnetic field is minimal, and, thus, demonstrates that the data are clearly inconsistent with electron magnetic trapping in the weak diffusion regime mediated by the Coulomb collisions. Thus, the modeling supports the interpretation of the coronal thick-target sources as sites of electron acceleration in flares and supplies us with a realistic 3D model with physical parameters of the acceleration region and flaring loop.

  19. [Effect of the ISS Russian segment configuration on the service module radiation environment].

    PubMed

    Mitrikas, V G

    2011-01-01

    Mathematical modeling of variations in the Service Module radiation environment as a function of ISS Russian segment (RS) configuration was carried out using models of the RS modules and a spherical humanoid phantom. ISS reconfiguration significantly impacted only the phantom placed in the transfer compartment (ExT). The Radiation Safety Service prohibition on cosmonauts staying in this compartment during solar flare events remains valid. In all other locations, dose estimates that do not take ISS RS reconfiguration into account carry a larger error than estimates that do.

  20. In Vitro Simulation and Validation of the Circulation with Congenital Heart Defects

    PubMed Central

    Figliola, Richard S.; Giardini, Alessandro; Conover, Tim; Camp, Tiffany A.; Biglino, Giovanni; Chiulli, John; Hsia, Tain-Yen

    2010-01-01

    Despite the recent advances in computational modeling, experimental simulation of the circulation with congenital heart defects using mock flow circuits remains an important tool for device testing and for detailing the probable flow consequences of surgical and interventional corrections. Validated mock circuits can be applied to qualify the results from novel computational models. New mathematical tools, coupled with advanced clinical imaging methods, allow for improved assessment of experimental circuit performance relative to human function, as well as the potential for patient-specific adaptation. In this review, we address the development of three in vitro mock circuits specific for studies of congenital heart defects. Performance of an in vitro right heart circulation circuit through a series of verification and validation exercises is described, including correlations with animal studies and quantification of the effects of circuit inertiance on test results. We present our experience in the design of mock circuits suitable for investigations of the characteristics of the Fontan circulation. We use one such mock circuit to evaluate the accuracy of Doppler predictions in the presence of aortic coarctation. PMID:21218147

  1. Simulation of hydrodynamics and solute transport in the Pamlico River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad; Robbins, Jeanne C.

    1995-01-01

    An investigation was conducted to characterize flow, circulation, and solute transport in the Pamlico River estuary, North Carolina. The study included a detailed field-measurement program and the calibration, validation, and application of a physically realistic numerical model of hydrodynamics and transport. Water level, salinity, water temperature, wind speed and direction, and current data were collected during March 1988 through September 1992, and were used to characterize physical conditions in the estuary. Data from preexisting streamflow gaging stations and meteorological stations were also used. A two-dimensional vertically averaged hydrodynamic and solute transport model was applied to the 48-kilometer study reach. The model domain was discretized into 5,620 separate 200- by 200-meter computational cells. Model calibration was achieved through adjustment of parameters for June 14-30, 1991. Data from selected periods in 1989 and 1991 were used for model validation. Water levels used for model calibration and validation ranged from -0.052 to 0.698 meter; salinities ranged from 0.1 to 13.1 parts per thousand; and wind speeds ranged from calm to 22 meters per second. The model was tested for stratified and unstratified conditions. Simulated and observed data were used to evaluate model performance. The calibrated model was applied for selected periods in 1989 and 1991. Instantaneous flows were simulated at each boundary and at mid-estuary. Circulation patterns were characterized using vector plots, particle tracking, and solute transport. Particle tracks showed that materials released at mid-estuary may remain in the system for 25 days or longer.

  2. Validity and reliability of the Multidimensional Body Image Scale in Malaysian university students.

    PubMed

    Gan, W Y; Mohd, Nasir M T; Siti, Aishah H; Zalilah, M S

    2012-12-01

    This study aimed to evaluate the validity and reliability of the Multidimensional Body Image Scale (MBIS), a seven-factor, 62-item scale developed for Malaysian female adolescents. This scale was evaluated among male and female Malaysian university students. A total of 671 university students (52.2% women and 47.8% men) completed a self-administered questionnaire on the MBIS, Eating Attitude Test-26, and Rosenberg Self-Esteem Scale. Their height and weight were measured. Confirmatory factor analysis showed that the 62-item MBIS had poor fit to the data, chi2/df = 4.126, p < 0.001, CFI = 0.808, SRMR = 0.070, RMSEA = 0.068 (90% CI = 0.067, 0.070). After re-specification of the model, the model fit improved with 46 items remaining, chi2/df = 3.346, p < 0.001, CFI = 0.903, SRMR = 0.053, RMSEA = 0.059 (90% CI = 0.057, 0.061), and the model showed good fit to the data for men and women separately. This 46-item MBIS had good internal consistency in both men (Cronbach's alpha = 0.88) and women (Cronbach's alpha = 0.92). In terms of construct validity, it showed positive correlations with disordered eating and body weight status, but a negative correlation with self-esteem. Also, this scale discriminated well between participants with and without disordered eating. The MBIS-46 demonstrated good reliability and validity for the evaluation of body image among university students. Further studies need to be conducted to confirm the validation results of the 46-item MBIS.
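
    As a sanity check on the reported fit statistics, RMSEA can be reconstructed directly from the chi2/df ratio and the sample size, since RMSEA = sqrt((chi2 - df) / (df (N - 1))) = sqrt((chi2/df - 1) / (N - 1)); a short sketch:

    ```python
    # Reconstructing RMSEA from the reported chi2/df ratios and N = 671.
    import math

    def rmsea_from_ratio(chi2_over_df, n):
        # RMSEA = sqrt(max(chi2/df - 1, 0) / (N - 1))
        return math.sqrt(max(chi2_over_df - 1.0, 0.0) / (n - 1))

    n = 671
    print(f"62-item model: RMSEA ≈ {rmsea_from_ratio(4.126, n):.3f}")  # reported 0.068
    print(f"46-item model: RMSEA ≈ {rmsea_from_ratio(3.346, n):.3f}")  # reported 0.059
    ```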

  3. Strategies for carbohydrate model building, refinement and validation

    PubMed Central

    2017-01-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein–sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade. PMID:28177313

  4. Strategies for carbohydrate model building, refinement and validation.

    PubMed

    Agirre, Jon

    2017-02-01

    Sugars are the most stereochemically intricate family of biomolecules and present substantial challenges to anyone trying to understand their nomenclature, reactions or branched structures. Current crystallographic programs provide an abstraction layer allowing inexpert structural biologists to build complete protein or nucleic acid model components automatically either from scratch or with little manual intervention. This is, however, still not generally true for sugars. The need for carbohydrate-specific building and validation tools has been highlighted a number of times in the past, concomitantly with the introduction of a new generation of experimental methods that have been ramping up the production of protein-sugar complexes and glycoproteins for the past decade. While some incipient advances have been made to address these demands, correctly modelling and refining carbohydrates remains a challenge. This article will address many of the typical difficulties that a structural biologist may face when dealing with carbohydrates, with an emphasis on problem solving in the resolution range where X-ray crystallography and cryo-electron microscopy are expected to overlap in the next decade.

  5. A pilot study: the development of a culturally tailored Malaysian Diabetes Education Module (MY-DEMO) based on the Health Belief Model.

    PubMed

    Ahmad, Badariah; Ramadas, Amutha; Kia Fatt, Quek; Md Zain, Anuar Zaini

    2014-04-08

    Diabetes education and self-care remains the cornerstone of diabetes management. There are many structured diabetes modules available in the United Kingdom, Europe and the United States of America. Contrastingly, few structured and validated diabetes modules are available in Malaysia. This pilot study aims to develop and validate diabetes education material suitable for and tailored to a multicultural society like Malaysia. The theoretical framework of this module was founded on the Health Belief Model (HBM). The participants were assessed using 6-item pre- and post-test questionnaires that measured some of the known HBM constructs, namely cues to action, perceived severity and perceived benefit. Data were analysed using PASW Statistics 18.0. The pre- and post-test questionnaires were administered to 88 participants (31 males). In general, there was a significant increase in the total score at post-test (97.34 ± 6.13%) compared to pre-test (92.80 ± 12.83%) (p < 0.05) and a significant increase in excellent scores (>85%) at post-test (84.1%) compared to pre-test (70.5%) (p < 0.05). There was an improvement in post-test score in 4 of the 6 items tested. The remaining 2 items, which measured perceived severity and cues to action, had poorer post-test scores. The preliminary results from this pilot study suggest that the contextualised content embedded within MY-DEMO may be suitable for integration with existing diabetes education programmes. This was the first known validated diabetes education programme available in the Malay language.

  6. Development of Image Segmentation Methods for Intracranial Aneurysms

    PubMed Central

    Qian, Yi; Morgan, Michael

    2013-01-01

    Though providing vital means for the visualization, diagnosis, and quantification of decision-making processes for the treatment of vascular pathologies, vascular segmentation remains a process that continues to be marred by numerous challenges. In this study, we validate segmentations of eight aneurysms using two existing methods: the Region Growing Threshold and the Chan-Vese model. These methods were evaluated by comparison with the results of a manual segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method in order to overcome the existing problems. Across the divergent segmentation methods, the volumes of the aneurysm models differed by up to 24%. The local arterial anatomy of the aneurysms was likewise found to significantly influence the results of these simulations. In contrast, the volume differences calculated with the TLS method remained relatively low, at only around 5%, thereby revealing inherent limitations in the application of cerebrovascular segmentation. The proposed TLS method holds the potential for utilisation in automatic aneurysm segmentation without the setting of a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, thereby allowing for more accurate and efficient simulations of medical imagery. PMID:23606905

  7. Classification and regression tree (CART) model to predict pulmonary tuberculosis in hospitalized patients.

    PubMed

    Aguiar, Fabio S; Almeida, Luciana L; Ruffino-Netto, Antonio; Kritski, Afranio Lineu; Mello, Fernanda Cq; Werneck, Guilherme L

    2012-08-07

    Tuberculosis (TB) remains a public health issue worldwide. The lack of specific clinical symptoms to diagnose TB makes the correct decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. We studied 290 patients admitted with clinical suspicion of TB. Diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with positive sputum smear) and HIV/AIDS was present in 56.9% of patients. The validated CART model showed sensitivity, specificity, positive predictive value and negative predictive value of 60.00%, 76.16%, 33.33%, and 90.55%, respectively. The AUC was 79.70%. The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for prediction of TB diagnosis was the chest radiograph result. Prospective validation is still necessary, but our model offers an alternative for decision making on whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
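
    A minimal sketch of such a CART triage workflow: fit a classification tree on a derivation sample, then report sensitivity, specificity, predictive values, and AUC on a separate validation sample. The data, tree depth, and prevalence are synthetic assumptions; in the study, the chest radiograph result was the dominant predictor.

    ```python
    # CART-style derivation/validation split with the metrics quoted above.
    # All data are synthetic placeholders.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score, confusion_matrix

    rng = np.random.default_rng(6)
    n = 580
    chest_xray = rng.integers(0, 2, n)   # 1 = radiograph suggestive of TB
    other = rng.normal(size=(n, 3))      # other clinical covariates
    tb = rng.random(n) < (0.05 + 0.45 * chest_xray)

    X = np.column_stack([chest_xray, other])
    X_dev, y_dev, X_val, y_val = X[:290], tb[:290], X[290:], tb[290:]

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20,
                                  random_state=0).fit(X_dev, y_dev)

    pred = tree.predict(X_val)
    tn, fp, fn, tp = confusion_matrix(y_val, pred).ravel()
    print(f"sensitivity = {tp / (tp + fn):.2%}, specificity = {tn / (tn + fp):.2%}")
    print(f"PPV = {tp / (tp + fp):.2%}, NPV = {tn / (tn + fn):.2%}")
    print(f"AUC = {roc_auc_score(y_val, tree.predict_proba(X_val)[:, 1]):.2%}")
    ```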

  8. Coupled Hydro-Mechanical Constitutive Model for Vegetated Soils: Validation and Applications

    NASA Astrophysics Data System (ADS)

    Switala, Barbara Maria; Veenhof, Rick; Wu, Wei; Askarinejad, Amin

    2016-04-01

    It is well known that the presence of vegetation influences slope stability. However, the quantitative assessment of this contribution remains challenging. It is essential to develop a numerical model which combines mechanical root reinforcement and root water uptake, and allows modelling of rainfall-induced landslides on vegetated slopes. Therefore a novel constitutive formulation is proposed, based on the modified Cam-clay model for unsaturated soils. Mechanical root reinforcement is modelled by introducing a new constitutive parameter, which governs the evolution of the Cam-clay failure surface with the degree of root reinforcement. Evapotranspiration is modelled in terms of the root water uptake, defined as a sink term in the water flow continuity equation. The original concept is extended to different shapes of the root architecture in three dimensions, and combined with the mechanical model. The model is implemented in the research finite element code Comes-Geo and in the commercial software Abaqus. The formulation is tested through a series of numerical examples, which allow validation of the concept. The direct shear test and the triaxial test are modelled in order to test the performance of the mechanical part of the model. To validate the hydrological part of the constitutive formulation, evapotranspiration from a vegetated box is simulated and compared with the experimental results. The numerical results exhibit good agreement with the experimental data. The implemented model is capable of reproducing the results of basic geotechnical laboratory tests. Moreover, the constitutive formulation can be used to model rainfall-induced landslides on vegetated slopes, taking into account the most important factors influencing slope stability (root reinforcement and evapotranspiration).

  9. Dynamic analysis of rotor flex-structure based on nonlinear anisotropic shell models

    NASA Astrophysics Data System (ADS)

    Bauchau, Olivier A.; Chiang, Wuying

    1991-05-01

    In this paper an anisotropic shallow shell model is developed that accommodates transverse shearing deformations and arbitrarily large displacements and rotations, while strains are assumed to remain small. Two kinematic models are developed: the first uses two degrees of freedom to locate the direction of the normal to the shell's midplane, the second uses three. The latter model allows for automatic compatibility of the shell model with beam models. The shell model is validated by comparing its predictions with several benchmark problems. In actual helicopter rotor blade problems, the shell model of the flex structure is shown to give very different results compared to beam models. The lead-lag and torsion modes in particular are strongly affected, whereas flapping modes seem to be less affected.

  10. Assessment of published models and prognostic variables in epithelial ovarian cancer at Mayo Clinic

    PubMed Central

    Hendrickson, Andrea Wahner; Hawthorne, Kieran M.; Goode, Ellen L.; Kalli, Kimberly R.; Goergen, Krista M.; Bakkum-Gamez, Jamie N.; Cliby, William A.; Keeney, Gary L.; Visscher, Dan W.; Tarabishy, Yaman; Oberg, Ann L.; Hartmann, Lynn C.; Maurer, Matthew J.

    2015-01-01

    Objectives Epithelial ovarian cancer (EOC) is an aggressive disease in which first-line therapy consists of a surgical staging/debulking procedure and platinum-based chemotherapy. There is significant interest in clinically applicable, easy-to-use prognostic tools to estimate risk of recurrence and overall survival. In this study we used a large prospectively collected cohort of women with EOC to validate currently published models and assess prognostic variables. Methods Women with invasive ovarian, peritoneal, or fallopian tube cancer diagnosed between 2000 and 2011 and prospectively enrolled into the Mayo Clinic Ovarian Cancer registry were identified. Demographics and known prognostic markers as well as epidemiologic exposure variables were abstracted from the medical record and collected via questionnaire. Six previously published models of overall and recurrence-free survival were assessed for external validity. In addition, predictors of outcome were assessed in our dataset. Results Previously published models validated with a range of c-statistics (0.587-0.827), though application of models containing variables that are not part of routine practice was somewhat limited by missing data; utilization of all applicable models and comparison of results is suggested. Examination of prognostic variables identified only the presence of ascites and ASA score as independent predictors of prognosis in our dataset, albeit with marginal gain in prognostic information, after accounting for stage and debulking. Conclusions Existing prognostic models for newly diagnosed EOC showed acceptable calibration in our cohort for clinical application. However, modeling of prospective variables in our dataset reiterates that stage and debulking remain the most important predictors of prognosis in this setting. PMID:25620544

  11. An Empirical Comparison of Different Models of Active Aging in Canada: The International Mobility in Aging Study

    PubMed Central

    Ahmed, Tamer; Filiatrault, Johanne; Yu, Hsiu-Ting; Zunzunegui, Maria Victoria

    2017-01-01

    Abstract Purpose: Active aging is a concept that lacks consensus. The WHO defines it as a holistic concept that encompasses the overall health, participation, and security of older adults. Fernández-Ballesteros and colleagues propose a similar concept but omit security and include mood and cognitive function. To date, researchers attempting to validate conceptual models of active aging have obtained mixed results. The goal of this study was to examine the validity of existing models of active aging with epidemiological data from Canada. Methods: The WHO model of active aging and the psychological model of active aging developed by Fernández-Ballesteros and colleagues were tested with confirmatory factor analysis. The data used included 799 community-dwelling older adults between 65 and 74 years old, recruited from the patient lists of family physicians in Saint-Hyacinthe, Quebec and Kingston, Ontario. Results: Neither model could be validated in the sample of Canadian older adults. Although a concept of healthy aging can be modeled adequately, social participation and security did not fit a latent factor model. A simple binary index indicated that 27% of older adults in the sample did not meet the active aging criteria proposed by the WHO. Implications: Our results suggest that active aging might represent a human rights policy orientation rather than an empirical measurement tool to guide research among older adult populations. Binary indexes of active aging may serve to highlight what remains to be improved about the health, participation, and security of growing populations of older adults. PMID:26350153

  12. The implementation and validation of improved land-surface hydrology in an atmospheric general circulation model

    NASA Technical Reports Server (NTRS)

    Johnson, Kevin D.; Entekhabi, Dara; Eagleson, Peter S.

    1993-01-01

    New land-surface hydrologic parameterizations are implemented into the NASA Goddard Institute for Space Studies (GISS) General Circulation Model (GCM). These parameterizations are: 1) runoff and evapotranspiration functions that include the effects of subgrid-scale spatial variability and use physically based equations of hydrologic flux at the soil surface and 2) a realistic soil moisture diffusion scheme for the movement of water and the root sink in the soil column. A one-dimensional climate model with a complete hydrologic cycle is used to screen the basic sensitivities of the hydrological parameterizations before implementation into the full three-dimensional GCM. Results of the final simulation with the GISS GCM and the new land-surface hydrology indicate that the runoff rate, especially in the tropics, is significantly improved. As a result, the remaining components of the heat and moisture balance show similar improvements when compared to observations. The validation of model results is carried out from the large global (ocean and land-surface) scale to the zonal, continental, and finally the regional river basin scales.

  13. Modeling Natural Anti-Inflammatory Compounds by Molecular Topology

    PubMed Central

    Galvez-Llompart, María; Zanni, Riccardo; García-Domenech, Ramón

    2011-01-01

    One of the main pharmacological problems today in the treatment of chronic inflammation diseases consists of the fact that anti-inflammatory drugs usually exhibit side effects. Natural products offer great hope in the identification of bioactive lead compounds and their development into drugs for treating inflammatory diseases. Computer-aided drug design has proved to be a very useful tool for discovering new drugs and, specifically, Molecular Topology has become a good technique for such a goal. A topological-mathematical model, obtained by linear discriminant analysis, has been developed for the search for new anti-inflammatory natural compounds. An external validation was carried out with the remaining compounds (those not used in building the model). Finally, a virtual screening of natural products was performed and 74 compounds showed actual anti-inflammatory activity. Of these, 54 had been previously described as anti-inflammatory in the literature. This can be seen as a plus for the model validation and as a reinforcement of the role of Molecular Topology as an efficient tool for the discovery of new anti-inflammatory natural compounds. PMID:22272145

  14. Cross-Validation of the Work Organization Assessment Questionnaire Across Genders: A Study in the Australian Health Care Sector.

    PubMed

    Karimi, Leila; Karanika-Murray, Maria; Meyer, Denny

    2016-03-01

    The main aim of this study was to examine the measurement invariance of the Work Organization Assessment Questionnaire (WOAQ) across genders in a group of health care employees, using bifactor modeling. There is very limited research that uses invariance testing of bifactor models, despite their usefulness. Establishing the validity of the WOAQ in this way is important for demonstrating its relevance for both men and women. A bifactor modeling procedure was used here to examine the validity of the WOAQ with a sample of 946 paramedics employed in a large Australian organization in the health care sector. The results of this study show that the WOAQ has good psychometric properties across genders in health care settings. In addition, there were significant mean differences between men and women in their perceptions of "quality of relationships with colleagues," and "reward and recognition." There were no differences between men and women in the remaining factors: "quality of relationships with the management," "quality of relationships with colleagues," and "quality of the physical environment." The use of bifactor modeling to establish the cross-validity of the WOAQ across male and female paramedics adds to evidence for the measure's good psychometric properties. The findings confirm those of previous research that has used higher-order confirmatory factor analysis. Moreover, mean differences between men and women were found to be significant in two of the five WOAQ subscales. These findings have practical implications for health care organizations, in terms of assessing work characteristics and developing activities to support the health and well-being of their employees.

  15. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, which depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability. PMID:23755236

  16. Up-scaling of multi-variable flood loss models from objects to land use units at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai; Merz, Bruno

    2016-05-01

    Flood risk management increasingly relies on risk analyses, including loss modelling. Most of the flood loss models usually applied in standard practice have in common that complex damaging processes are described by simple approaches like stage-damage functions. Novel multi-variable models significantly improve loss estimation on the micro-scale and may also be advantageous for large-scale applications. However, more input parameters also reveal additional uncertainty, even more so in upscaling procedures for meso-scale applications, where the parameters need to be estimated on a regional, area-wide basis. To gain more knowledge about the challenges associated with the up-scaling of multi-variable flood loss models, the following approach is applied: single- and multi-variable micro-scale flood loss models are up-scaled and applied on the meso-scale, namely on the basis of ATKIS land-use units. Application and validation are undertaken in 19 municipalities that were affected during the 2002 flood of the River Mulde in Saxony, Germany, by comparison with official loss data provided by the Saxon Relief Bank (SAB). In this meso-scale, case-study-based model validation, most multi-variable models show smaller errors than the uni-variable stage-damage functions. The results show the suitability of the up-scaling approach and, in accordance with micro-scale validation studies, that multi-variable models are an improvement in flood loss modelling also on the meso-scale. However, uncertainties remain high, stressing the importance of uncertainty quantification. Thus, the development of probabilistic loss models, like BT-FLEMO used in this study, which inherently provide uncertainty information, is the way forward.

  17. Validating the Use of Deep Learning Neural Networks for Correction of Large Hydrometric Datasets

    NASA Astrophysics Data System (ADS)

    Frazier, N.; Ogden, F. L.; Regina, J. A.; Cheng, Y.

    2017-12-01

    Collection and validation of Earth systems data can be time consuming and labor intensive. In particular, high resolution hydrometric data, including rainfall and streamflow measurements, are difficult to obtain due to a multitude of complicating factors. Measurement equipment is subject to clogs, environmental disturbances, and sensor drift. Manual intervention is typically required to identify, correct, and validate these data. Weirs can become clogged and the pressure transducer may float or drift over time. We typically employ a graphical tool called Time Series Editor to manually remove clogs and sensor drift from the data. However, this process is highly subjective and requires hydrological expertise. Two different people may produce two different data sets. To use these data for scientific discovery and model validation, a more consistent method is needed to process this field data. Deep learning neural networks have proved to be excellent mechanisms for recognizing patterns in data. We explore the use of Recurrent Neural Networks (RNN) to capture the patterns in the data over time using various gating mechanisms (LSTM and GRU), network architectures, and hyper-parameters to build an automated data correction model. We also explore the amount of manually corrected training data required to train the network for reasonable accuracy. The benefits of this approach are that the time to process a data set is significantly reduced and the results are 100% reproducible after training is complete. Additionally, we train the RNN and calibrate a physically-based hydrological model against the same portion of data. Both the RNN and the model are applied to the remaining data using a split-sample methodology. Performance of the machine learning model is evaluated for plausibility by comparison with the output of the hydrological model, and this analysis identifies potential periods where additional investigation is warranted.
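
    One way to realize the RNN correction model described above is a GRU that maps the raw sensor series to its manually corrected counterpart. The PyTorch sketch below is a minimal stand-in: the architecture, sizes, and toy "clog" artifacts are hypothetical, not the study's configuration.

    ```python
    # GRU sequence-to-sequence corrector: raw series in, corrected series out.
    # Toy data and sizes; a real model would train on manually corrected records.
    import torch
    import torch.nn as nn

    class Corrector(nn.Module):
        def __init__(self, n_features=1, hidden=32):
            super().__init__()
            self.gru = nn.GRU(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):          # x: (batch, time, features)
            out, _ = self.gru(x)
            return self.head(out)      # corrected value at every time step

    torch.manual_seed(0)
    clean = torch.sin(torch.linspace(0, 12, 400)).reshape(1, -1, 1)
    raw = clean + (torch.rand_like(clean) < 0.05).float() * 0.5  # synthetic clogs

    model = Corrector()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for epoch in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(raw), clean)
        loss.backward()
        opt.step()
    print(f"final training MSE: {loss.item():.4f}")
    ```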

  18. Integrated Technology Rotor Methodology Assessment Workshop

    NASA Technical Reports Server (NTRS)

    Mcnulty, Michael J. (Editor); Bousman, William G. (Editor)

    1988-01-01

    The conference proceedings contain 14 formal papers and the results of two panel discussions. In addition, a transcript of the discussion that followed the paper presentations and panels is included. The papers are of two kinds. The first seven papers were directed specifically to the correlation of industry and government mathematical models with data for rotorcraft stability from six experiments. The remaining seven papers dealt with related topics in the prediction of rotor aeroelastic or aeromechanical stability. The first of the panels provided an evaluation of the correlation that was shown between the mathematical models and the experimental data. The second panel addressed the general problems of the validation of mathematical models.

  19. Fault detection and diagnosis in an industrial fed-batch cell culture process.

    PubMed

    Gunther, Jon C; Conner, Jeremy S; Seborg, Dale E

    2007-01-01

    A flexible process monitoring method was applied to industrial pilot plant cell culture data for the purpose of fault detection and diagnosis. Data from 23 batches, 20 normal operating conditions (NOC) and three abnormal, were available. A principal component analysis (PCA) model was constructed from 19 NOC batches, and the remaining NOC batch was used for model validation. Subsequently, the model was used to successfully detect (both offline and online) abnormal process conditions and to diagnose the root causes. This research demonstrates that data from a relatively small number of batches (approximately 20) can still be used to monitor for a wide range of process faults.
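
    A compact sketch of the PCA-based monitoring step in the spirit of this study: fit PCA on normal-operating-condition (NOC) batch summaries, then flag batches whose Hotelling T² statistic exceeds an empirical control limit. The batch data, component count, and limit are illustrative assumptions.

    ```python
    # PCA fault detection with a Hotelling T^2 statistic on synthetic batches.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    noc = rng.normal(0, 1, (19, 10))   # 19 NOC batches x 10 process variables
    pca = PCA(n_components=3).fit(noc)

    def hotelling_t2(batch):
        # T^2 = sum over components of score^2 / component variance
        scores = pca.transform(batch.reshape(1, -1)).ravel()
        return float(np.sum(scores ** 2 / pca.explained_variance_))

    limit = np.percentile([hotelling_t2(b) for b in noc], 99)   # empirical limit
    abnormal = rng.normal(0, 1, 10) + np.array([4.0] + [0.0] * 9)  # fault in var 1
    print(f"T^2 = {hotelling_t2(abnormal):.1f} vs control limit {limit:.1f}")
    ```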

  20. Modeling nanomaterial environmental fate in aquatic systems.

    PubMed

    Dale, Amy L; Casman, Elizabeth A; Lowry, Gregory V; Lead, Jamie R; Viparelli, Enrica; Baalousha, Mohammed

    2015-03-03

    Mathematical models improve our fundamental understanding of the environmental behavior, fate, and transport of engineered nanomaterials (NMs, chemical substances or materials roughly 1-100 nm in size) and facilitate risk assessment and management activities. Although today's large-scale environmental fate models for NMs are a considerable improvement over early efforts, a gap still remains between the experimental research performed to date on the environmental fate of NMs and its incorporation into models. This article provides an introduction to the current state of the science in modeling the fate and behavior of NMs in aquatic environments. We address the strengths and weaknesses of existing fate models, identify the challenges facing researchers in developing and validating these models, and offer a perspective on how these challenges can be addressed through the combined efforts of modelers and experimentalists.

  1. External Validity of the New York University Caregiver Intervention: Key Caregiver Outcomes Across Multiple Demonstration Projects.

    PubMed

    Fauth, Elizabeth B; Jackson, Mark A; Walberg, Donna K; Lee, Nancy E; Easom, Leisa R; Alston, Gayle; Ramos, Angel; Felten, Kristen; LaRue, Asenath; Mittelman, Mary

    2017-06-01

    The Administration on Aging funded six New York University Caregiver Intervention (NYUCI) demonstration projects, a counseling/support intervention targeting dementia caregivers and families. Three sites (Georgia, Utah, Wisconsin) pooled data to inform external validity in nonresearch settings. This study (a) assesses collective changes over time, and (b) compares outcomes across sites on caregiver burden, depressive symptoms, satisfaction with social support, family conflict, and quality of life. Data included baseline/preintervention (N = 294) and follow-up visits (approximately 4, 8, 12 months). Linear mixed models showed that social support satisfaction increased (p < .05) and family conflict decreased (p < .05; Cohen's d = 0.49 and 0.35, respectively). Marginally significant findings emerged for quality of life increases (p = .05) and burden decreases (p < .10). Depressive symptoms remained stable. Slopes did not differ much by site. NYUCI demonstrated external validity in nonresearch settings across diverse caregiver samples.

  2. A satellite relative motion model including J2 and J3 via Vinti's intermediary

    NASA Astrophysics Data System (ADS)

    Biria, Ashley D.; Russell, Ryan P.

    2018-03-01

    Vinti's potential is revisited for analytical propagation of the main satellite problem, this time in the context of relative motion. A particular version of Vinti's spheroidal method is chosen that is valid for arbitrary elliptical orbits, encapsulating J2, J3, and generally a partial J4 in an orbit propagation theory without recourse to perturbation methods. As a child of Vinti's solution, the proposed relative motion model inherits these properties. Furthermore, the problem is solved in oblate spheroidal elements, leading to large regions of validity for the linearization approximation. After offering several enhancements to Vinti's solution, including boosts in accuracy and removal of some singularities, the proposed model is derived and subsequently reformulated so that Vinti's solution is piecewise differentiable. While the model is valid for the critical inclination and nonsingular in the element space, singularities remain in the linear transformation from Earth-centered inertial coordinates to spheroidal elements when the eccentricity is zero or for nearly equatorial orbits. The new state transition matrix is evaluated against numerical solutions including the J2 through J5 terms for a wide range of chief orbits and separation distances. The solution is also compared with side-by-side simulations of the original Gim-Alfriend state transition matrix, which considers the J2 perturbation. Code for computing the resulting state transition matrix and associated reference frame and coordinate transformations is provided online as supplementary material.

  3. Differentiation and identification of grape-associated black aspergilli using Fourier transform infrared (FT-IR) spectroscopic analysis of mycelia.

    PubMed

    Kogkaki, Efstathia A; Sofoulis, Manos; Natskoulis, Pantelis; Tarantilis, Petros A; Pappas, Christos S; Panagou, Efstathios Z

    2017-10-16

    The purpose of this study was to evaluate the potential of FT-IR spectroscopy as a high-throughput method for rapid differentiation among the ochratoxigenic species Aspergillus carbonarius and the non-ochratoxigenic or low-toxigenic species of the Aspergillus niger aggregate, namely A. tubingensis and A. niger, isolated previously from grapes of Greek vineyards. A total of 182 isolates of A. carbonarius, A. tubingensis, and A. niger were analyzed using FT-IR spectroscopy. The first derivatives of specific spectral regions (3002-2801 cm⁻¹, 1773-1550 cm⁻¹, and 1286-952 cm⁻¹) were chosen and evaluated with respect to absorbance values. The average spectra of 130 fungal isolates were used for model calibration based on discriminant analysis, and the remaining 52 spectra were used for external model validation. This methodology achieved a total accuracy of 98.8% in both model calibration and validation. The per-class accuracy for A. carbonarius was 95.3% and 100% for model calibration and validation, respectively, whereas for the A. niger aggregate the per-class accuracy amounted to 100% in both cases. The obtained results indicated that FT-IR could become a promising, fast, reliable and low-cost tool for the discrimination and differentiation of closely related fungal species. Copyright © 2017 Elsevier B.V. All rights reserved.
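
    A minimal sketch of the classification pipeline: first-derivative preprocessing of spectra followed by linear discriminant analysis, with a 130/52 calibration/validation split mirroring the study design. The spectra here are synthetic, and plain LDA stands in for whichever discriminant variant was actually used.

    ```python
    # First-derivative preprocessing + LDA with a calibration/validation split.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(8)
    n_per_class, n_points = 91, 300
    base = np.linspace(0, 2 * np.pi, n_points)

    def spectra(shift):
        # Synthetic "FT-IR spectra" for one species, shifted and noisy
        return np.sin(base + shift) + rng.normal(0, 0.3, (n_per_class, n_points))

    X = np.vstack([spectra(0.0), spectra(0.4)])   # two fungal groups, 182 total
    y = np.repeat([0, 1], n_per_class)

    X_deriv = np.gradient(X, axis=1)              # first-derivative preprocessing

    idx = rng.permutation(len(y))
    train, test = idx[:130], idx[130:]            # 130 calibration / 52 validation
    lda = LinearDiscriminantAnalysis().fit(X_deriv[train], y[train])
    print(f"validation accuracy: {lda.score(X_deriv[test], y[test]):.1%}")
    ```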

  4. Nonlinear-drifted Brownian motion with multiple hidden states for remaining useful life prediction of rechargeable batteries

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Zhao, Yang; Yang, Fangfang; Tsui, Kwok-Leung

    2017-09-01

    Brownian motion with adaptive drift has attracted much attention in prognostics because its first hitting time is highly relevant to remaining useful life prediction and it follows the inverse Gaussian distribution. Besides linear degradation modeling, nonlinear-drifted Brownian motion has been developed to model nonlinear degradation. Moreover, the first hitting time distribution of the nonlinear-drifted Brownian motion has been approximated by time-space transformation. In previous studies, the drift coefficient was the only hidden state used in state space modeling of the nonlinear-drifted Brownian motion. Besides the drift coefficient, parameters of the nonlinear function used in the nonlinear-drifted Brownian motion should be treated as additional hidden states of state space modeling to make the nonlinear-drifted Brownian motion more flexible. In this paper, a prognostic method based on nonlinear-drifted Brownian motion with multiple hidden states is proposed and then applied to predict the remaining useful life of rechargeable batteries. Twenty-six sets of rechargeable battery degradation samples are analyzed to validate the effectiveness of the proposed prognostic method. Moreover, comparisons with a standard particle filter based prognostic method, a spherical cubature particle filter based prognostic method and two classic Bayesian prognostic methods are conducted to highlight the superiority of the proposed prognostic method. Results show that the proposed prognostic method has lower average prediction errors than the particle filter based prognostic methods and the classic Bayesian prognostic methods for battery remaining useful life prediction.
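
    The first-hitting-time logic underlying such prognostics can be sketched for the constant-drift case: for X(t) = μt + σW(t) and failure threshold D, the hitting time from level x is inverse Gaussian with mean (D − x)/μ. The snippet below checks that analytical mean remaining useful life (RUL) against Monte Carlo paths; parameters are illustrative, and the drift is fixed rather than tracked as a hidden state as in the paper.

    ```python
    # Constant-drift Brownian degradation: analytical vs. Monte Carlo mean RUL.
    import numpy as np

    mu, sigma, D = 0.02, 0.05, 1.0       # drift, diffusion, failure threshold
    x_now, t_step = 0.4, 0.01            # current degradation level, time step

    # Inverse Gaussian mean hitting time for constant drift: E[T] = (D - x)/mu
    print(f"analytical mean RUL: {(D - x_now) / mu:.1f}")

    rng = np.random.default_rng(9)
    n_paths, max_steps = 2000, 100000
    hits = np.full(n_paths, np.nan)
    x = np.full(n_paths, x_now)
    alive = np.ones(n_paths, dtype=bool)
    for step in range(1, max_steps + 1):
        # Euler increment for the still-degrading paths
        x[alive] += mu * t_step + sigma * np.sqrt(t_step) * rng.normal(size=alive.sum())
        newly_hit = alive & (x >= D)
        hits[newly_hit] = step * t_step
        alive &= ~newly_hit
        if not alive.any():
            break
    print(f"Monte Carlo mean RUL: {np.nanmean(hits):.1f}")
    ```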

  5. Human genetics as a model for target validation: finding new therapies for diabetes.

    PubMed

    Thomsen, Soren K; Gloyn, Anna L

    2017-06-01

    Type 2 diabetes is a global epidemic with major effects on healthcare expenditure and quality of life. Currently available treatments are inadequate for the prevention of comorbidities, yet progress towards new therapies remains slow. A major barrier is the insufficiency of traditional preclinical models for predicting drug efficacy and safety. Human genetics offers a complementary model to assess causal mechanisms for target validation. Genetic perturbations are 'experiments of nature' that provide a uniquely relevant window into the long-term effects of modulating specific targets. Here, we show that genetic discoveries over the past decades have accurately predicted (now known) therapeutic mechanisms for type 2 diabetes. These findings highlight the potential for use of human genetic variation for prospective target validation, and establish a framework for future applications. Studies into rare, monogenic forms of diabetes have also provided proof-of-principle for precision medicine, and the applicability of this paradigm to complex disease is discussed. Finally, we highlight some of the limitations that are relevant to the use of genome-wide association studies (GWAS) in the search for new therapies for diabetes. A key outstanding challenge is the translation of GWAS signals into disease biology and we outline possible solutions for tackling this experimental bottleneck.

  6. Characterization of shrubland ecosystem components as continuous fields in the northwest United States

    USGS Publications Warehouse

    Xian, George Z.; Homer, Collin G.; Rigge, Matthew B.; Shi, Hua; Meyer, Debbie

    2015-01-01

    Accurate and consistent estimates of shrubland ecosystem components are crucial to a better understanding of ecosystem conditions in arid and semiarid lands. An innovative approach was developed by integrating multiple sources of information to quantify shrubland components as continuous-field products within the National Land Cover Database (NLCD). The approach consists of several procedures: field sample collection, high-resolution mapping of shrubland components using WorldView-2 imagery and regression tree models, Landsat 8 radiometric balancing and phenological mosaicking, medium-resolution estimation of shrubland components across different climate zones using the Landsat 8 phenological mosaics and regression tree models, and product validation. Fractional covers of nine shrubland components were estimated: annual herbaceous, bare ground, big sagebrush, herbaceous, litter, sagebrush, shrub, sagebrush height, and shrub height. Our study area included the footprint of six Landsat 8 scenes in the northwestern United States. Results show that most components correlate significantly with validation data, have small normalized root-mean-square errors, and correspond well with expected ecological gradients. While some uncertainties remain in the height estimates, the model formulated in this study provides a cross-validated, unbiased, and cost-effective approach to quantifying shrubland components at a regional scale and advances knowledge of the horizontal and vertical variability of these components.

  7. Predicting Retention Times of Naturally Occurring Phenolic Compounds in Reversed-Phase Liquid Chromatography: A Quantitative Structure-Retention Relationship (QSRR) Approach

    PubMed Central

    Akbar, Jamshed; Iqbal, Shahid; Batool, Fozia; Karim, Abdul; Chan, Kim Wei

    2012-01-01

    Quantitative structure-retention relationships (QSRRs) have been successfully developed for naturally occurring phenolic compounds in a reversed-phase liquid chromatographic (RPLC) system. A total of 1519 descriptors were calculated from the optimized structures of the molecules using the MOPAC2009 and DRAGON software packages. The data set of 39 molecules was divided into training and external validation sets. For feature selection and mapping we used step-wise multiple linear regression (SMLR), unsupervised forward selection followed by step-wise multiple linear regression (UFS-SMLR), and artificial neural networks (ANN). Stable and robust models with significant predictive ability in terms of validation statistics were obtained, with no indication of chance correlation. The ANN models were found to be better than the remaining two approaches. HNar, IDM, Mp, GATS2v, DISP and 3D-MoRSE (signals 22, 28 and 32) descriptors, based on van der Waals volume, electronegativity, mass and polarizability at the atomic level, were found to have significant effects on the retention times. The possible implications of these descriptors in RPLC are discussed. All the models proved well able to predict the retention times of phenolic compounds and showed remarkable validation, robustness, stability and predictive performance. PMID:23203132
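
    A minimal sketch of the modeling pattern described above (multiple linear regression on a few selected descriptors, checked against an external validation set) is given below, with synthetic descriptor values standing in for the MOPAC/DRAGON output.

```python
# Illustrative QSRR sketch: multiple linear regression on selected descriptors,
# evaluated on an external validation set. All values are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n_molecules, n_selected = 39, 6                 # data-set size from the abstract
X = rng.normal(size=(n_molecules, n_selected))  # e.g., HNar, IDM, Mp, ... (stand-ins)
true_coef = rng.normal(size=n_selected)
t_r = X @ true_coef + rng.normal(scale=0.3, size=n_molecules)  # retention times

X_tr, X_ext, y_tr, y_ext = train_test_split(X, t_r, test_size=0.25, random_state=1)
qsrr = LinearRegression().fit(X_tr, y_tr)
print("external-validation R^2:", r2_score(y_ext, qsrr.predict(X_ext)))
```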

  8. Comparing flood loss models of different complexity

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno

    2013-04-01

    Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both in the data basis and in the methodological approaches used for the development of flood loss models. Despite this, flood loss models remain an important source of uncertainty, and their temporal and spatial transferability is still limited. This contribution investigates the predictive capability of flood loss models of different complexity in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explanatory variables, are learned from a set of damage records obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from post-event surveys. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, and novel model approaches derived using the data-mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explanatory variables.

  9. Fate of ethanol during cooking of liquid foods prepared with alcoholic beverages: Theory and experimental studies.

    PubMed

    Snitkjær, Pia; Ryapushkina, Julia; Skovenborg, Erik; Astrup, Arne; Bech, Lene Mølskov; Jensen, Morten Georg; Risbo, Jens

    2017-09-01

    To obtain an understanding of the ethanol loss during cooking of liquid foods containing alcoholic beverages, ethanol concentration was measured as a function of time and remaining volume in meat stocks prepared with wine and beer. A mathematical model describing the decline in volatile compounds during heating of simple liquid foods was derived. The experimental results and the model show that concentration of ethanol at any given time is determined by the initial concentration and a power law function of the remaining volume fraction. The power law function is found to be independent of factors like pot dimensions and temperature. When using a lid to cover the pot during cooking, the model was still valid but the ethanol concentrations decreased more steeply, corresponding to a higher exponent. The results provide a theoretical and empirical guideline for predicting the ethanol concentration in cooked liquid foods. Copyright © 2017 Elsevier Ltd. All rights reserved.
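
    The relationship described above can be written compactly as follows; the symbols are ours, chosen for illustration rather than taken from the paper.

```latex
\[
C(t) \;=\; C_0 \left( \frac{V(t)}{V_0} \right)^{k}
\]
% C_0, V_0: initial ethanol concentration and liquid volume; V(t): remaining
% volume; k: power-law exponent, found to be independent of pot dimensions
% and temperature, but larger when the pot is covered with a lid.
```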

  10. Prognostics of slurry pumps based on a moving-average wear degradation index and a general sequential Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Tse, Peter W.

    2015-05-01

    Slurry pumps are commonly used in oil-sand mining for pumping mixtures of abrasive liquids and solids. These operations cause constant wear of slurry pump impellers, which results in the breakdown of the slurry pumps. This paper develops a prognostic method for estimating the remaining useful life of slurry pump impellers. First, a moving-average wear degradation index is proposed to assess the performance degradation of the slurry pump impeller. Second, the state space model of the proposed health index is constructed. A general sequential Monte Carlo method is employed to derive the parameters of the state space model. The remaining useful life of the slurry pump impeller is estimated by extrapolating the established state space model to a specified alert threshold. Data collected from an industrial oil sand pump were used to validate the developed method. The results show that the accuracy of the developed method improves as more data become available.
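
    The sketch below illustrates the two ingredients named above: a moving-average degradation index, and remaining-useful-life estimation by extrapolating a fitted trend to an alert threshold. A simple linear-trend extrapolation stands in for the paper's sequential-Monte-Carlo state-space model; the signal and threshold are synthetic.

```python
# Moving-average degradation index + extrapolation to an alert threshold.
# Linear trend is a stand-in for the state-space model; data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
raw = 0.01 * np.arange(300) + rng.normal(scale=0.05, size=300)  # noisy wear signal

def moving_average_index(x, window=20):
    """Smooth the raw condition indicator with a simple moving average."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

index = moving_average_index(raw)
threshold = 4.0                       # hypothetical alert threshold

# Fit a linear trend to the recent history and extrapolate to the threshold.
t = np.arange(len(index))
slope, intercept = np.polyfit(t[-50:], index[-50:], deg=1)
t_hit = (threshold - intercept) / slope   # time at which the trend crosses it
rul = t_hit - t[-1]
print(f"estimated remaining useful life: {rul:.0f} samples")
```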

  11. Real-Time Prognostics of a Rotary Valve Actuator

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew

    2015-01-01

    Valves are used in many domains and often have system-critical functions. As such, it is important to monitor the health of valves and their actuators and predict remaining useful life. In this work, we develop a model-based prognostics approach for a rotary valve actuator. Due to limited observability of the component with multiple failure modes, a lumped damage approach is proposed for estimation and prediction of damage progression. In order to support the goal of real-time prognostics, an approach to prediction is developed that does not require online simulation to compute remaining life; rather, a function mapping the damage state to remaining useful life is found offline, so that predictions can be made quickly online with a single function evaluation. Simulation results demonstrate the overall methodology, validating the lumped damage approach and demonstrating real-time prognostics.
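
    The key idea, precomputing the damage-to-RUL mapping offline so that the online step is a single cheap function evaluation, can be sketched as follows. The damage-growth model below is a made-up placeholder, not the paper's actuator model.

```python
# Offline: tabulate time-to-failure over a grid of damage states.
# Online: one interpolation call instead of an online simulation.
import numpy as np

FAILURE_DAMAGE = 1.0

def simulate_time_to_failure(d0, rate=0.002, dt=1.0):
    """Offline: march a (placeholder) damage-growth model until failure."""
    d, t = d0, 0.0
    while d < FAILURE_DAMAGE:
        d += rate * (1.0 + d) * dt   # hypothetical nonlinear damage progression
        t += dt
    return t

damage_grid = np.linspace(0.0, 0.99, 200)
rul_table = np.array([simulate_time_to_failure(d) for d in damage_grid])

def predict_rul(damage_estimate):
    """Online: a single fast lookup/interpolation."""
    return np.interp(damage_estimate, damage_grid, rul_table)

print("RUL at damage 0.35:", predict_rul(0.35))
```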

  12. Toward On-line Parameter Estimation of Concentric Tube Robots Using a Mechanics-based Kinematic Model

    PubMed Central

    Jang, Cheongjae; Ha, Junhyoung; Dupont, Pierre E.; Park, Frank Chongwoo

    2017-01-01

    Although existing mechanics-based models of concentric tube robots have been experimentally demonstrated to approximate the actual kinematics, determining accurate estimates of model parameters remains difficult due to the complex relationship between the parameters and available measurements. Further, because the mechanics-based models neglect some phenomena like friction, nonlinear elasticity, and cross section deformation, it is also not clear if model error is due to model simplification or to parameter estimation errors. The parameters of the superelastic materials used in these robots can be slowly time-varying, necessitating periodic re-estimation. This paper proposes a method for estimating the mechanics-based model parameters using an extended Kalman filter as a step toward on-line parameter estimation. Our methodology is validated through both simulation and experiments. PMID:28717554

  13. Validation of the Asthma Illness Representation Scale-Spanish (AIRS-S).

    PubMed

    Sidora-Arcoleo, Kimberly Joan; Feldman, Jonathan; Serebrisky, Denise; Spray, Amanda

    2010-05-01

    To expand knowledge surrounding parental illness representations (IRs) of their children's asthma, it is imperative that culturally appropriate survey instruments are developed and validated for use in clinical and research settings. The Asthma Illness Representation Scale (AIRS) provides a structured assessment of the key components of asthma IRs, allowing the health care provider (HCP) to quickly identify areas of discordance with the professional model of asthma management. The English AIRS was developed and validated in a geographically and ethnically diverse sample. The authors present the validation results for the AIRS-S (Spanish) from a sample of Mexican and Puerto Rican parents. The AIRS was translated and back-translated per approved methodologies. Factor analysis, internal reliability, external validity, and 2-week test-retest reliability (on a subsample) were carried out and the results compared with the validated English version. Data were obtained from 80 Spanish-speaking Mexican and Puerto Rican parents of children with asthma. The sample was recruited from two school-based health centers and a free medical clinic in Phoenix, Arizona, and a hospital-based asthma clinic in Bronx, New York. The original Nature of Asthma Symptoms, Facts About Asthma, and Attitudes Towards Medication Use subscales emerged. The remaining factors were a mixture of items with no coherent or theoretical distinction between them. Interpretation of the results is limited by not meeting the minimum requirement of 5 observations per item. Cronbach's alpha coefficients for the total score (alpha = .77) and the majority of subscales (alpha range = .53-.77) were acceptable and consistent with the English version. Parental reports of a positive relationship with the HCP significantly predicted AIRS scores congruent with the professional model; longer asthma duration was associated with beliefs aligned with the lay model; and AIRS scores congruent with the professional model were related to lower asthma severity. Stability of AIRS-S scores over 2 weeks was demonstrated. The AIRS-S is a culturally appropriate instrument that can be used by HCPs to ascertain Spanish-speaking parents' asthma illness beliefs and assess discordance with the professional model of asthma management. This information can be used by the HCP when discussing parents' asthma management strategies for their children during clinical encounters.

  14. A Review of Hemolysis Prediction Models for Computational Fluid Dynamics.

    PubMed

    Yu, Hai; Engel, Sebastian; Janiga, Gábor; Thévenin, Dominique

    2017-07-01

    Flow-induced hemolysis is a crucial issue for many biomedical applications; in particular, it is an essential issue for the development of blood-transporting devices such as left ventricular assist devices and other types of blood pumps. In order to estimate red blood cell (RBC) damage in blood flows, many models have been proposed in the past. Most models have been validated by their respective authors, but the accuracy and the validity range of these models remain unclear. In this work, the most established hemolysis models compatible with computational fluid dynamics of full-scale devices are described and assessed by comparison against two selected reference experiments: a simple rheometric flow and a more complex hemodialytic flow through a needle. The quantitative comparisons show very large deviations in the hemolysis predictions, depending on the model and the model parameters. In light of the current results, two simple power-law models deliver the best compromise between computational efficiency and obtained accuracy. Finally, hemolysis has been computed in an axial blood pump. The reconstructed geometry of a HeartMate II shows that hemolysis occurs mainly at the tip and leading edge of the rotor blades, as well as at the leading edge of the diffusor vanes. © 2017 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
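
    The power-law family referred to above is conventionally written as below. The specific constants differ between published correlations, so the form is shown schematically.

```latex
\[
HI \;=\; \frac{\Delta Hb}{Hb} \;=\; C\,\tau^{\alpha}\, t^{\beta}
\]
% HI: hemolysis index (released over total hemoglobin); tau: scalar shear
% stress; t: exposure time; C, alpha, beta: empirical constants fitted to
% rheometric experiments (values vary between correlations).
```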

  15. Landslide susceptibility modeling in a landslide prone area in Mazandaran Province, north of Iran: a comparison between GLM, GAM, MARS, and M-AHP methods

    NASA Astrophysics Data System (ADS)

    Pourghasemi, Hamid Reza; Rossi, Mauro

    2017-10-01

    Landslides are identified as one of the most important natural hazards in many areas throughout the world. The essential purpose of this study is to compare general linear model (GLM), general additive model (GAM), multivariate adaptive regression spline (MARS), and modified analytical hierarchy process (M-AHP) models and to assess their performance for landslide susceptibility modeling in the west of Mazandaran Province, Iran. First, landslides were identified by interpreting aerial photographs and through extensive field work. In total, 153 landslides were identified in the study area. Among these, 105 landslides were randomly selected as training data and the remaining 48 (30%) cases were used for validation. Afterward, based on a literature review of 220 scientific papers covering the period 2005-2012, eleven conditioning factors were selected: lithology, land use, distance from rivers, distance from roads, distance from faults, slope angle, slope aspect, altitude, topographic wetness index (TWI), plan curvature, and profile curvature. The Certainty Factor (CF) model was used for managing uncertainty in rule-based systems and for evaluating the correlation between the dependent variable (landslides) and the independent variables. Finally, landslide susceptibility zonations were produced using the GLM, GAM, MARS, and M-AHP models. For model evaluation, the area under the curve (AUC) method was used, and both success and prediction rate curves were calculated. The AUC values for GLM, GAM, and MARS were 90.50%, 88.90%, and 82.10% for the training data and 77.52%, 70.49%, and 78.17% for the validation data, respectively. The landslide susceptibility map produced using M-AHP showed a training AUC of 77.82% and a validation AUC of 82.77%. Based on the overall assessment, the proposed approaches showed reasonable results for landslide susceptibility mapping in the study area. Moreover, the results showed that the M-AHP model performed slightly better in prediction than the MARS, GLM, and GAM models. These algorithms can be very useful for landslide susceptibility and hazard mapping and for land-use planning at the regional scale.
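
    As a minimal sketch of the GLM branch of this workflow, the snippet below fits a logistic regression on synthetic "conditioning factors" and reports training and validation AUC; the data, split and factor effects are invented for illustration.

```python
# Logistic regression (a GLM) with a train/validation split scored by ROC AUC.
# Eleven synthetic conditioning factors stand in for the mapped ones.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_cells, n_factors = 2000, 11
X = rng.normal(size=(n_cells, n_factors))
logit = (X @ rng.normal(size=n_factors)) * 0.8
y = rng.random(n_cells) < 1 / (1 + np.exp(-logit))   # synthetic landslide labels

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=3)
glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("training AUC:  ", roc_auc_score(y_tr, glm.predict_proba(X_tr)[:, 1]))
print("validation AUC:", roc_auc_score(y_val, glm.predict_proba(X_val)[:, 1]))
```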

  16. Prognostic models for predicting posttraumatic seizures during acute hospitalization, and at 1 and 2 years following traumatic brain injury.

    PubMed

    Ritter, Anne C; Wagner, Amy K; Szaflarski, Jerzy P; Brooks, Maria M; Zafonte, Ross D; Pugh, Mary Jo V; Fabio, Anthony; Hammond, Flora M; Dreer, Laura E; Bushnik, Tamara; Walker, William C; Brown, Allen W; Johnson-Greene, Doug; Shea, Timothy; Krellman, Jason W; Rosenthal, Joseph A

    2016-09-01

    Posttraumatic seizures (PTS) are well-recognized acute and chronic complications of traumatic brain injury (TBI). Risk factors have been identified, but considerable variability in who develops PTS remains. Existing PTS prognostic models are not widely adopted for clinical use and do not reflect current trends in injury, diagnosis, or care. We aimed to develop and internally validate preliminary prognostic regression models to predict PTS during acute care hospitalization, and at year 1 and year 2 postinjury. Prognostic models predicting PTS during acute care hospitalization and year 1 and year 2 post-injury were developed using a recent (2011-2014) cohort from the TBI Model Systems National Database. Potential PTS predictors were selected based on previous literature and biologic plausibility. Bivariable logistic regression identified variables with a p-value < 0.20 that were used to fit initial prognostic models. Multivariable logistic regression modeling with backward-stepwise elimination was used to determine reduced prognostic models and to internally validate using 1,000 bootstrap samples. Fit statistics were calculated, correcting for overfitting (optimism). The prognostic models identified sex, craniotomy, contusion load, and pre-injury limitation in learning/remembering/concentrating as significant PTS predictors during acute hospitalization. Significant predictors of PTS at year 1 were subdural hematoma (SDH), contusion load, craniotomy, craniectomy, seizure during acute hospitalization, duration of posttraumatic amnesia, preinjury mental health treatment/psychiatric hospitalization, and preinjury incarceration. Year 2 significant predictors were similar to those of year 1: SDH, intraparenchymal fragment, craniotomy, craniectomy, seizure during acute hospitalization, and preinjury incarceration. Corrected concordance (C) statistics were 0.599, 0.747, and 0.716 for acute hospitalization, year 1, and year 2 models, respectively. The prognostic model for PTS during acute hospitalization did not discriminate well. Year 1 and year 2 models showed fair to good predictive validity for PTS. Cranial surgery, although medically necessary, requires ongoing research regarding potential benefits of increased monitoring for signs of epileptogenesis, PTS prophylaxis, and/or rehabilitation/social support. Future studies should externally validate models and determine clinical utility. Wiley Periodicals, Inc. © 2016 International League Against Epilepsy.
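
    The internal-validation procedure named above, bootstrap optimism correction, can be sketched as follows: refit the model on each bootstrap sample, compare its apparent performance with its performance on the original data, and subtract the average optimism. Data and the 200-resample count here are illustrative (the study used 1,000).

```python
# Bootstrap optimism correction of an apparent AUC; synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n, p = 400, 5
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n)) > 0

def auc_of_fit(X_fit, y_fit, X_eval, y_eval):
    m = LogisticRegression(max_iter=1000).fit(X_fit, y_fit)
    return roc_auc_score(y_eval, m.predict_proba(X_eval)[:, 1])

apparent = auc_of_fit(X, y, X, y)
optimisms = []
for _ in range(200):
    idx = rng.integers(0, n, size=n)                      # bootstrap resample
    boot_apparent = auc_of_fit(X[idx], y[idx], X[idx], y[idx])
    boot_test = auc_of_fit(X[idx], y[idx], X, y)          # score on original data
    optimisms.append(boot_apparent - boot_test)

print("optimism-corrected AUC:", apparent - np.mean(optimisms))
```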

  17. RWEN: Response-Weighted Elastic Net For Prediction of Chemosensitivity of Cancer Cell Lines. | Office of Cancer Genomics

    Cancer.gov

    Motivation: In recent years there have been several efforts to generate sensitivity profiles of collections of genomically characterized cell lines to panels of candidate therapeutic compounds. These data provide the basis for the development of in silico models of sensitivity based on cellular, genetic, or expression biomarkers of cancer cells. However, a remaining challenge is an efficient way to identify accurate sets of biomarkers to validate.
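
    Purely as an illustration of the response-weighting idea, the sketch below fits an elastic net in which samples are weighted by a function of the response (up-weighting the most sensitive cell lines). This is a plausible stand-in, not the published RWEN algorithm; the weighting function and all data are invented, and recent scikit-learn versions are assumed for the sample_weight argument.

```python
# Elastic net with response-based sample weights; synthetic data throughout.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(5)
n_lines, n_markers = 120, 500           # cell lines x candidate biomarkers
X = rng.normal(size=(n_lines, n_markers))
beta = np.zeros(n_markers)
beta[:10] = rng.normal(size=10)         # ten "true" biomarkers
y = X @ beta + rng.normal(scale=0.5, size=n_lines)   # drug-sensitivity score

# Hypothetical weights: up-weight sensitive lines (low y).
w = 1.0 / (1.0 + np.exp(y))

enet = ElasticNet(alpha=0.1, l1_ratio=0.5)
enet.fit(X, y, sample_weight=w)
selected = np.flatnonzero(enet.coef_)
print("biomarkers with nonzero coefficients:", selected[:20])
```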

  18. Quantifying Hydroperiod, Fire and Nutrient Effects on the Composition of Plant Communities in Marl Prairie of the Everglades: a Joint Probability Method Based Model

    NASA Astrophysics Data System (ADS)

    Zhai, L.

    2017-12-01

    Plant communities can be simultaneously affected by human activities and climate change, and quantifying and predicting this combined effect with an appropriate model framework validated by field data is complex, but very useful for conservation management. Plant communities in the Everglades provide a unique setting in which to develop and validate such a model framework, because they are experiencing both intensive human activities (such as altered hydroperiod from drainage and restoration projects, nutrients from upstream agriculture, prescribed fire, etc.) and climate change (such as warming, changing precipitation patterns, sea level rise, etc.). More importantly, previous research attention has focused on plant communities of the slough ecosystem (including ridge, slough and their tree islands); very few studies consider the marl prairie ecosystem. Compared with the slough ecosystem, which remains flooded almost year-round, marl prairie has a relatively short hydroperiod (inundated only during the wet season). Therefore, plant communities of the marl prairie may be more strongly affected by hydroperiod change. In addition to hydroperiod, fire and nutrients also affect the plant communities of the marl prairie. Therefore, to quantify the combined effects of water level, fire, and nutrients on the composition of these plant communities, we are developing a vegetation dynamics model based on a joint probability method. The model is being validated with field data on changes of vegetation assemblages along environmental gradients in the marl prairie. Our poster presents preliminary data from this ongoing project.

  19. Spatial Interpretation of Tower, Chamber and Modelled Terrestrial Fluxes in a Tropical Forest Plantation

    NASA Astrophysics Data System (ADS)

    Whidden, E.; Roulet, N.

    2003-04-01

    Interpretation of a site-average terrestrial flux may be complicated in the presence of inhomogeneities. Inhomogeneity may invalidate the basic assumptions of aerodynamic flux measurement. Chamber measurements may miss or misinterpret important temporal or spatial anomalies. Models may smooth over important nonlinearities depending on the scale of application. Although inhomogeneity is usually seen as a design problem, many sites have spatial variance that may have a large impact on net flux, and in many cases a large homogeneous surface is unrealistic. The sensitivity and validity of a site-average flux are investigated for an inhomogeneous site. Directional differences are used to evaluate the validity of aerodynamic methods and the computation of a site-average tower flux. Empirical and modelling methods are used to interpret the spatial controls on flux. An ecosystem model, Ecosys, is used to assess spatial length scales appropriate to the ecophysiological controls. A diffusion model is used to compare tower, chamber, and model data by spatially weighting contributions within the tower footprint. Diffusion-model weighting is also used to improve tower flux estimates by producing footprint-averaged ecological parameters (soil moisture, soil temperature, etc.). Although uncertainty remains in the validity of the measurement methods and the accuracy of diffusion models, a detailed spatial interpretation is required at an inhomogeneous site. Agreement in flux estimates between methods improves with spatial interpretation, showing its importance for estimating a site-average flux. Small-scale temporal and spatial anomalies may be relatively unimportant to the overall flux, but accounting for medium-scale differences in ecophysiological controls is necessary. A combination of measurements and modelling can be used to define the appropriate time and length scales of significant nonlinearity due to inhomogeneity.

  20. Gathering Validity Evidence for Surgical Simulation: A Systematic Review.

    PubMed

    Borgersen, Nanna Jo; Naur, Therese M H; Sørensen, Stine M D; Bjerrum, Flemming; Konge, Lars; Subhi, Yousif; Thomsen, Ann Sofia S

    2018-06-01

    To identify current trends in the use of validity frameworks in surgical simulation, to provide an overview of the evidence behind the assessment of technical skills in all surgical specialties, and to present recommendations and guidelines for future validity studies. Validity evidence for assessment tools used in the evaluation of surgical performance is of paramount importance to ensure valid and reliable assessment of skills. We systematically reviewed the literature by searching 5 databases (PubMed, EMBASE, Web of Science, PsycINFO, and the Cochrane Library) for studies published from January 1, 2008, to July 10, 2017. We included original studies evaluating simulation-based assessments of health professionals in surgical specialties and extracted data on surgical specialty, simulator modality, participant characteristics, and the validity framework used. Data were synthesized qualitatively. We identified 498 studies with a total of 18,312 participants. Publications involving validity assessments in surgical simulation more than doubled from 2008 to 2010 (∼30 studies/year) to 2014 to 2016 (∼70 to 90 studies/year). Only 6.6% of the studies used the recommended contemporary validity framework (Messick). The majority of studies used outdated frameworks such as face validity. Significant differences were identified across surgical specialties. The evaluated assessment tools were mostly inanimate or virtual reality simulation models. An increasing number of studies have gathered validity evidence for simulation-based assessments in surgical specialties, but the use of outdated frameworks remains common. To address the current practice, this paper presents guidelines on how to use the contemporary validity framework when designing validity studies.

  1. Recent advances in modeling and simulation of the exposure and response of tungsten to fusion energy conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marian, Jaime; Becquart, Charlotte S.; Domain, Christophe

    2017-06-09

    Under the anticipated operating conditions for demonstration magnetic fusion reactors beyond ITER, structural materials will be exposed to unprecedented conditions of irradiation, heat flux, and temperature. While such extreme environments remain inaccessible experimentally, computational modeling and simulation can provide qualitative and quantitative insights into materials response and complement the available experimental measurements with carefully validated predictions. For plasma-facing components such as the first wall and the divertor, tungsten (W) has been selected as the best candidate material due to its superior high-temperature and irradiation properties. In this paper we provide a review of recent efforts in computational modeling of W, both as a plasma-facing material exposed to He deposition and as a bulk structural material subjected to fast neutron irradiation. We use a multiscale modeling approach, commonly used as the materials modeling paradigm, to define the outline of the paper and highlight recent advances using several classes of techniques and their interconnection. We highlight several of the most salient findings obtained via computational modeling and point out a number of remaining challenges and future research directions.

  2. Validating a Predictive Model of Acute Advanced Imaging Biomarkers in Ischemic Stroke.

    PubMed

    Bivard, Andrew; Levi, Christopher; Lin, Longting; Cheng, Xin; Aviv, Richard; Spratt, Neil J; Lou, Min; Kleinig, Tim; O'Brien, Billy; Butcher, Kenneth; Zhang, Jingfen; Jannes, Jim; Dong, Qiang; Parsons, Mark

    2017-03-01

    Advanced imaging to identify tissue pathophysiology may provide more accurate prognostication than the clinical measures currently used in stroke. This study aimed to derive and validate a predictive model for functional outcome based on acute clinical and advanced imaging measures. A database of prospectively collected patients with ischemic stroke, assessed for thrombolysis within 4.5 hours of onset at 5 centers and imaged with computed tomographic perfusion and computed tomographic angiography before a treatment decision, was analyzed. Individual variable cut points were derived from a classification and regression tree analysis. The optimal cut points for each assessment variable were then used in a backward logistic regression to predict modified Rankin Scale (mRS) scores of 0 to 1 and 5 to 6. The variables remaining in the models were then assessed using a receiver operating characteristic curve analysis. Overall, 1519 patients were included in the study, 635 in the derivation cohort and 884 in the validation cohort. The model was highly accurate at predicting mRS score of 0 to 1 in all patients considered for thrombolysis therapy (area under the curve [AUC] 0.91), in those who were treated (AUC 0.88) and in those with recanalization (AUC 0.89). Likewise, the model was highly accurate at predicting mRS score of 5 to 6 in all patients considered for thrombolysis therapy (AUC 0.91), in those who were treated (AUC 0.89) and in those with recanalization (AUC 0.91). The odds ratio for thrombolysed patients who met the model criteria achieving mRS score of 0 to 1 was 17.89 (4.59-36.35, P<0.001), and for mRS score of 5 to 6 it was 8.23 (2.57-26.97, P<0.001). This study has derived and validated a highly accurate model for predicting patient outcome after ischemic stroke. © 2017 American Heart Association, Inc.

  3. Diagnostic Value of Combining Tumor and Inflammatory Markers in Lung Cancer

    PubMed Central

    Yoon, Ho Il; Kwon, Oh-Ran; Kang, Kyung Nam; Shin, Yong Sung; Shin, Ho Sang; Yeon, Eun Hee; Kwon, Keon Young; Hwang, Ilseon; Jeon, Yoon Kyung; Kim, Yongdai; Kim, Chul Woo

    2016-01-01

    Background Despite major advances in lung cancer treatment, early detection remains the most promising way of improving outcomes. To detect lung cancer in earlier stages, many serum biomarkers have been tested. Unfortunately, no single biomarker can reliably detect lung cancer. We combined a set of 2 tumor markers and 4 inflammatory or metabolic markers and tried to validate the diagnostic performance in lung cancer. Methods We collected serum samples from 355 lung cancer patients and 590 control subjects and divided them into training and validation datasets. After measuring serum levels of 6 biomarkers (human epididymis secretory protein 4 [HE4], carcinoembryonic antigen [CEA], regulated on activation, normal T cell expressed and secreted [RANTES], apolipoprotein A2 [ApoA2], transthyretin [TTR], and secretory vascular cell adhesion molecule-1 [sVCAM-1]), we tested various sets of biomarkers for their diagnostic performance in lung cancer. Results In a training dataset, the area under the curve (AUC) values were 0.821 for HE4, 0.753 for CEA, 0.858 for RANTES, 0.867 for ApoA2, 0.830 for TTR, and 0.552 for sVCAM-1. A model using all 6 biomarkers and age yielded an AUC value of 0.986 and sensitivity of 93.2% (cutoff at specificity 94%). Applying this model to the validation dataset showed similar results. The AUC value of the model was 0.988, with sensitivity of 93.33% and specificity of 92.00% at the same cutoff point used in the validation dataset. Analyses by stages and histologic subtypes all yielded similar results. Conclusions Combining multiple tumor and systemic inflammatory markers proved to be a valid strategy in the diagnosis of lung cancer. PMID:27722145

  4. Diagnostic Value of Combining Tumor and Inflammatory Markers in Lung Cancer.

    PubMed

    Yoon, Ho Il; Kwon, Oh-Ran; Kang, Kyung Nam; Shin, Yong Sung; Shin, Ho Sang; Yeon, Eun Hee; Kwon, Keon Young; Hwang, Ilseon; Jeon, Yoon Kyung; Kim, Yongdai; Kim, Chul Woo

    2016-09-01

    Despite major advances in lung cancer treatment, early detection remains the most promising way of improving outcomes. To detect lung cancer in earlier stages, many serum biomarkers have been tested. Unfortunately, no single biomarker can reliably detect lung cancer. We combined a set of 2 tumor markers and 4 inflammatory or metabolic markers and tried to validate the diagnostic performance in lung cancer. We collected serum samples from 355 lung cancer patients and 590 control subjects and divided them into training and validation datasets. After measuring serum levels of 6 biomarkers (human epididymis secretory protein 4 [HE4], carcinoembryonic antigen [CEA], regulated on activation, normal T cell expressed and secreted [RANTES], apolipoprotein A2 [ApoA2], transthyretin [TTR], and secretory vascular cell adhesion molecule-1 [sVCAM-1]), we tested various sets of biomarkers for their diagnostic performance in lung cancer. In a training dataset, the area under the curve (AUC) values were 0.821 for HE4, 0.753 for CEA, 0.858 for RANTES, 0.867 for ApoA2, 0.830 for TTR, and 0.552 for sVCAM-1. A model using all 6 biomarkers and age yielded an AUC value of 0.986 and sensitivity of 93.2% (cutoff at specificity 94%). Applying this model to the validation dataset showed similar results. The AUC value of the model was 0.988, with sensitivity of 93.33% and specificity of 92.00% at the same cutoff point used in the validation dataset. Analyses by stages and histologic subtypes all yielded similar results. Combining multiple tumor and systemic inflammatory markers proved to be a valid strategy in the diagnosis of lung cancer.

  5. Theory-Based Parameterization of Semiotics for Measuring Pre-literacy Development

    NASA Astrophysics Data System (ADS)

    Bezruczko, N.

    2013-09-01

    A probabilistic model was applied to the problem of measuring pre-literacy in young children. First, semiotic philosophy and contemporary cognition research were conceptually integrated to establish theoretical foundations for rating 14 characteristics of children's drawings and narratives (N = 120). The ratings were then transformed with a Rasch model, which estimated linear item parameter values that accounted for 79 percent of rater variance. Principal Components Analysis of the item residual matrix confirmed that the variance remaining after item calibration was largely unsystematic. Validation analyses found positive correlations between the semiotic measures and preschool literacy outcomes. Practical implications of a semiotics dimension for preschool practice are discussed.
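
    For reference, the dichotomous form of the Rasch model used in such calibrations is given below; the 14 rated characteristics may in practice call for a polytomous (rating-scale) variant, so this is the simplest case, with conventional symbols rather than the author's.

```latex
\[
P(X_{ni} = 1 \mid \theta_n, b_i) \;=\; \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
\]
% theta_n: child n's latent pre-literacy level; b_i: difficulty of rated
% characteristic i. Estimates of theta and b lie on a common linear (logit)
% scale, which is what makes the calibrated item parameter values "linear".
```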

  6. Multi-Scale Human Respiratory System Simulations to Study Health Effects of Aging, Disease, and Inhaled Substances

    NASA Astrophysics Data System (ADS)

    Kunz, Robert; Haworth, Daniel; Dogan, Gulkiz; Kriete, Andres

    2006-11-01

    Three-dimensional, unsteady simulations of multiphase flow, gas exchange, and particle/aerosol deposition in the human lung are reported. Surface data for human tracheo-bronchial trees are derived from CT scans and are used to generate three-dimensional CFD meshes for the first several generations of branching. One-dimensional meshes for the remaining generations down to the respiratory units are generated using branching algorithms based on those proposed in the literature, and a zero-dimensional respiratory unit (pulmonary acinus) model is attached at the end of each terminal bronchiole. The process is automated to facilitate rapid model generation. The model is exercised through multiple breathing cycles to compute the spatial and temporal variations in flow, gas exchange, and particle/aerosol deposition. The depth of the 3D/1D transition (at branching generation n) is a key parameter and can be varied. High-fidelity models (large n) are run on massively parallel distributed-memory clusters, and are used to generate physical insight and to calibrate/validate the 1D and 0D models. Suitably validated lower-order models (small n) can be run on single-processor PCs with run times that allow model-based clinical intervention for individual patients.

  7. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment

    PubMed Central

    Hashim, Mazlan

    2015-01-01

    This research presents the results of the GIS-based statistical models for generation of landslide susceptibility mapping using geographic information system (GIS) and remote-sensing data for Cameron Highlands area in Malaysia. Ten factors including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten related factors were identified by using GIS-based statistical models including analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map which has a total of 92 landslide locations was created based on numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% was used for validation purpose. The validation results using the Relative landslide density index (R-index) and Receiver operating characteristic (ROC) demonstrated that the SMCE model (accuracy is 96%) is better in prediction than AHP (accuracy is 91%) and WLC (accuracy is 89%) models. These landslide susceptibility maps would be useful for hazard mitigation purpose and regional planning. PMID:25898919

  8. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment.

    PubMed

    Shahabi, Himan; Hashim, Mazlan

    2015-04-22

    This research presents the results of the GIS-based statistical models for generation of landslide susceptibility mapping using geographic information system (GIS) and remote-sensing data for Cameron Highlands area in Malaysia. Ten factors including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten related factors were identified by using GIS-based statistical models including analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map which has a total of 92 landslide locations was created based on numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% was used for validation purpose. The validation results using the Relative landslide density index (R-index) and Receiver operating characteristic (ROC) demonstrated that the SMCE model (accuracy is 96%) is better in prediction than AHP (accuracy is 91%) and WLC (accuracy is 89%) models. These landslide susceptibility maps would be useful for hazard mitigation purpose and regional planning.

  9. Experimental verification of a thermal equivalent circuit dynamic model on an extended range electric vehicle battery pack

    NASA Astrophysics Data System (ADS)

    Ramotar, Lokendra; Rohrauer, Greg L.; Filion, Ryan; MacDonald, Kathryn

    2017-03-01

    The development of a dynamic thermal battery model for hybrid and electric vehicles is realized. A thermal equivalent circuit model is created which aims to capture and understand the heat propagation from the cells through the entire pack and to the environment using a production vehicle battery pack for model validation. The inclusion of production hardware and the liquid battery thermal management system components into the model considers physical and geometric properties to calculate thermal resistances of components (conduction, convection and radiation) along with their associated heat capacity. Various heat sources/sinks comprise the remaining model elements. Analog equivalent circuit simulations using PSpice are compared to experimental results to validate internal temperature nodes and heat rates measured through various elements, which are then employed to refine the model further. Agreement with experimental results indicates the proposed method allows for a comprehensive real-time battery pack analysis at little computational expense when compared to other types of computer based simulations. Elevated road and ambient conditions in Mesa, Arizona are simulated on a parked vehicle with varying quiescent cooling rates to examine the effect on the diurnal battery temperature for longer term static exposure. A typical daily driving schedule is also simulated and examined.
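
    A thermal equivalent circuit of the kind described above can be sketched, in much reduced form, as a lumped RC network: nodes with heat capacities connected by thermal resistances, integrated in time. The two-node example below (one cell node, one case node cooled by coolant) uses invented parameter values, not the pack's.

```python
# Two-node thermal RC network integrated with explicit Euler; illustrative only.
import numpy as np

C_cell, C_case = 800.0, 2000.0   # heat capacities [J/K]
R_cc = 0.5                       # cell-to-case conduction resistance [K/W]
R_cool = 0.2                     # case-to-coolant resistance [K/W]
T_cool = 25.0                    # coolant temperature [degC]
Q_gen = 15.0                     # cell heat generation [W]

T = np.array([40.0, 30.0])       # initial [T_cell, T_case]
dt, t_end = 1.0, 3600.0          # 1 s steps for one hour

for _ in range(int(t_end / dt)):
    q_cc = (T[0] - T[1]) / R_cc          # cell -> case heat flow [W]
    q_cool = (T[1] - T_cool) / R_cool    # case -> coolant heat flow [W]
    T[0] += dt * (Q_gen - q_cc) / C_cell
    T[1] += dt * (q_cc - q_cool) / C_case

print(f"temperatures after 1 h: T_cell={T[0]:.1f} C, T_case={T[1]:.1f} C")
```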

  10. Behavioural change models for infectious disease transmission: a systematic review (2010–2015)

    PubMed Central

    2016-01-01

    We review behavioural change models (BCMs) for infectious disease transmission in humans. Following the Cochrane collaboration guidelines and the PRISMA statement, our systematic search and selection yielded 178 papers covering the period 2010–2015. We observe an increasing trend in published BCMs, frequently coupled to (re)emergence events, and propose a categorization by distinguishing how information translates into preventive actions. Behaviour is usually captured by introducing information as a dynamic parameter (76/178) or by introducing an economic objective function, either with (26/178) or without (37/178) imitation. Approaches using information thresholds (29/178) and exogenous behaviour formation (16/178) are also popular. We further classify according to disease, prevention measure, transmission model (with 81/178 population, 6/178 metapopulation and 91/178 individual-level models) and the way prevention impacts transmission. We highlight the minority (15%) of studies that use any real-life data for parametrization or validation and note that BCMs increasingly use social media data and generally incorporate multiple sources of information (16/178), multiple types of information (17/178) or both (9/178). We conclude that individual-level models are increasingly used and useful to model behaviour changes. Despite recent advancements, we remain concerned that most models are purely theoretical and lack representative data and a validation process. PMID:28003528

  11. Estimating the clinical and economic benefit associated with incremental improvements in sustained virologic response in chronic hepatitis C.

    PubMed

    McEwan, Phil; Ward, Thomas; Bennett, Hayley; Kalsekar, Anupama; Webster, Samantha; Brenner, Michael; Yuan, Yong

    2015-01-01

    Hepatitis C virus (HCV) infection is one of the principal causes of chronic liver disease. Successful treatment significantly decreases the risk of hepatic morbidity and mortality. The current standard of care achieves sustained virologic response (SVR) rates of 40-80%; however, the HCV therapy landscape is rapidly evolving. The objective of this study was to quantify the clinical and economic benefit associated with increasing levels of SVR. A published Markov model (MONARCH) that simulates the natural history of hepatitis C over a lifetime horizon was used. Discounted and non-discounted life-years (LYs), quality-adjusted life-years (QALYs) and costs of complication management were estimated for various plausible SVR rates. To demonstrate the robustness of the projections obtained, the model was validated against ten UK-specific HCV studies. QALY estimates ranged from 18.0 years for those treated successfully in fibrosis stage F0 to 7.5 years (discounted) for patients in fibrosis stage F4 who remain untreated. Predicted QALY gains per 10% improvement in SVR ranged from 0.23 (F0) to 0.64 (F4) discounted, and from 0.58 (F0) to 1.35 (F4) non-discounted, in 40-year-old patients. In those aged 40, projected discounted HCV-related costs are minimised with successful treatment in F0/F1 (at approximately £300), increasing to £49,300 in F4 patients who remain untreated. Validation of the model against published UK cost-effectiveness studies produced R² goodness-of-fit statistics of 0.988, 0.978 and 0.973 for total costs, QALYs and incremental cost-effectiveness ratios, respectively. Projecting the long-term clinical and economic consequences associated with chronic hepatitis C is a necessary requirement for the evaluation of new treatments. The principal analysis demonstrates the significant impact on expected costs, LYs and QALYs associated with increasing SVR, and a validation analysis demonstrated the robustness of the results reported.
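
    A Markov cohort model of this general type can be sketched as below: annual transitions between health states, with per-state costs and utilities accumulated under discounting. The states, transition probabilities, costs and utilities here are invented for illustration and bear no relation to the MONARCH model's actual inputs.

```python
# Toy Markov cohort model: discounted costs and QALYs over a lifetime horizon.
import numpy as np

states = ["F-mild", "F-severe", "decompensated", "dead"]
# Annual transition matrix (rows sum to 1); values are illustrative.
P = np.array([
    [0.90, 0.08, 0.01, 0.01],
    [0.00, 0.88, 0.09, 0.03],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
])
cost = np.array([300.0, 1500.0, 12000.0, 0.0])   # cost per year in each state
utility = np.array([0.85, 0.70, 0.45, 0.0])      # QALY weight per year
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0, 0.0])          # everyone starts F-mild
total_cost = total_qaly = 0.0
for year in range(50):                           # approximate lifetime horizon
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ utility
    cohort = cohort @ P                          # advance the cohort one year

print(f"discounted cost: {total_cost:.0f}, discounted QALYs: {total_qaly:.2f}")
```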

  12. A pilot study: the development of a culturally tailored Malaysian Diabetes Education Module (MY-DEMO) based on the Health Belief Model

    PubMed Central

    2014-01-01

    Background: Diabetes education and self-care remain the cornerstone of diabetes management. There are many structured diabetes modules available in the United Kingdom, Europe and the United States of America. By contrast, few structured and validated diabetes modules are available in Malaysia. This pilot study aims to develop and validate diabetes education material suitable for, and tailored to, a multicultural society like Malaysia. Methods: The theoretical framework of this module was founded on the Health Belief Model (HBM). The participants were assessed using 6-item pre- and post-test questionnaires that measured some of the known HBM constructs, namely cues to action, perceived severity and perceived benefit. Data were analysed using PASW Statistics 18.0. Results: The pre- and post-test questionnaires were administered to 88 participants (31 males). In general, there was a significant increase in the total score at post-test (97.34 ± 6.13%) compared to pre-test (92.80 ± 12.83%) (p < 0.05) and a significant increase in excellent scores (>85%) at post-test (84.1%) compared to pre-test (70.5%) (p < 0.05). There was an improvement in post-test scores in 4 of the 6 items tested; the remaining 2 items, which measured perceived severity and cues to action, had poorer post-test scores. Conclusions: The preliminary results from this pilot study suggest that the contextualised content material embedded within MY-DEMO may be suitable for integration with existing diabetes education programmes. This was the first known validated diabetes education programme available in the Malay language. PMID:24708715

  13. Validation of the AMC-71 Mobility Model. Appendix A: Vehicle Data. Appendix B: Location and Description of Test Sites. Appendix C: Definitions of Terrain Terms and Procedures Used to Collect Terrain Data for Validation Tests. Appendix D: Basic Terrain Data

    DTIC Science & Technology

    1976-03-01

    [OCR fragments; only partially recoverable] ...vegetation of the area, including maples, poplars, and pines with scattered low shrubs and blueberry patches growing in loamy sand (SP-SM) with some... mostly lichens, grasses, and blueberry bushes along with scattered currant bushes as vegetative cover. Also, of the remaining 2204 ft, 568 ft was open... [a garbled terrain-unit profile table follows in the original and is omitted]

  14. Alzheimer's Disease Diagnosis in Individual Subjects using Structural MR Images: Validation Studies

    PubMed Central

    Vemuri, Prashanthi; Gunter, Jeffrey L.; Senjem, Matthew L.; Whitwell, Jennifer L.; Kantarci, Kejal; Knopman, David S.; Boeve, Bradley F.; Petersen, Ronald C.; Jack, Clifford R.

    2008-01-01

    OBJECTIVE To develop and validate a tool for Alzheimer's disease (AD) diagnosis in individual subjects using support vector machine (SVM)-based classification of structural MR (sMR) images. BACKGROUND Libraries of sMR scans of clinically well-characterized subjects can be harnessed for the purpose of diagnosing new incoming subjects. METHODS 190 patients with probable AD were age- and gender-matched with 190 cognitively normal (CN) subjects. Three different classification models were implemented: Model I uses tissue densities obtained from sMR scans to give a STructural Abnormality iNDex (STAND) score; Models II and III use tissue densities as well as covariates (demographics and Apolipoprotein E genotype) to give an adjusted STAND (aSTAND) score. Data from 140 AD and 140 CN subjects were used for training. The SVM parameter optimization and training were done by four-fold cross-validation (CV). The remaining independent sample of 50 AD and 50 CN subjects was used to obtain a minimally biased estimate of the generalization error of the algorithm. RESULTS The CV accuracy of the Model II and Model III aSTAND scores was 88.5% and 89.3%, respectively, and the developed models generalized well on the independent test datasets. The anatomic patterns best differentiating the groups were consistent with the known distribution of neurofibrillary AD pathology. CONCLUSIONS This paper presents preliminary evidence that application of SVM-based classification of an individual sMR scan relative to a library of scans can provide useful information in individual subjects for diagnosis of AD. Including demographic and genetic information in the classification algorithm slightly improves diagnostic accuracy. PMID:18054253
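
    The train/validate protocol described above, cross-validated tuning on a training set followed by scoring on an untouched independent test set, can be sketched as follows. The features stand in for the paper's tissue densities plus covariates; all data and the group-difference offset are synthetic.

```python
# SVM tuned by four-fold CV, then evaluated on a held-out independent set.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
n, p = 380, 50                      # 190 AD + 190 CN subjects, p features
X = rng.normal(size=(n, p))
y = np.repeat([0, 1], n // 2)       # 0 = cognitively normal, 1 = AD
X[y == 1, :5] += 0.7                # hypothetical group difference

# 280 for training/CV, 100 held out, mirroring the 140+140 / 50+50 split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=100, stratify=y, random_state=6)

svm = make_pipeline(StandardScaler(), SVC(kernel="linear"))
grid = GridSearchCV(svm, {"svc__C": [0.01, 0.1, 1, 10]}, cv=4).fit(X_tr, y_tr)
print("4-fold CV accuracy:       ", grid.best_score_)
print("independent test accuracy:", grid.score(X_te, y_te))
```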

  15. Application of Koopmans' theorem for density functional theory to full valence-band photoemission spectroscopy modeling.

    PubMed

    Li, Tsung-Lung; Lu, Wen-Cai

    2015-10-05

    In this work, Koopmans' theorem for Kohn-Sham density functional theory (KS-DFT) is applied to photoemission spectra (PES) modeling over the entire valence band. To examine the validity of this application, a PES modeling scheme is developed to facilitate a full valence-band comparison of theoretical PES spectra with experiments. The PES model incorporates the variations of electron ionization cross-sections over atomic orbitals and a linear dispersion of spectral broadening widths. KS-DFT simulations of pristine rubrene (5,6,11,12-tetraphenyltetracene) and a potassium-rubrene complex are performed, and the simulation results are used as the input to the PES models. Two conclusions are reached. First, decompositions of the theoretical total spectra show that the electron dissociated from the potassium mainly remains on the backbone and has little effect on the electronic structures of the phenyl side groups. This and other electronic-structure results deduced from the spectral decompositions had previously been obtained qualitatively with the anionic approximation to potassium-rubrene complexes. The qualitative validity of the anionic approximation is thus verified. Second, comparison of the theoretical PES with the experiments shows that the full-scale simulations combined with the PES modeling methods greatly enhance the agreement on spectral shapes over the anionic approximation. This agreement of the theoretical PES spectra with the experiments over the full valence band can be regarded, to some extent, as a collective validation of the application of Koopmans' theorem for KS-DFT to valence-band PES, at least for this hydrocarbon and its alkali-adsorbed complex. Copyright © 2015 Elsevier B.V. All rights reserved.
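
    A schematic of such a PES model, as we read the description above and in our own notation (the paper's exact expressions may differ), is:

```latex
\[
I(E) \;=\; \sum_{i \,\in\, \mathrm{occ}} \sigma_i \; G\!\left(E + \varepsilon_i;\, w(\varepsilon_i)\right)
\]
% epsilon_i: occupied Kohn-Sham eigenvalue, with Koopmans' theorem identifying
% the binding energy as -epsilon_i; sigma_i: ionization cross-section weight
% from the orbital's atomic composition; G: normalized broadening profile whose
% width w varies linearly with binding energy (the "linear dispersion of
% spectral broadening widths").
```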

  16. Development and validation of a questionnaire to evaluate the factors influencing training transfer among nursing professionals.

    PubMed

    Bai, Yangjing; Li, Jiping; Bai, Yangjuan; Ma, Weiguang; Yang, Xiangyu; Ma, Fang

    2018-02-13

    Most organizations invest in training to improve human capital and maximize profitability. Yet it is reported, in industry and in nursing alike, that training effectiveness is constrained by inadequate transfer of training; the underlying reasons for this transfer problem remain unknown, and there is a lack of tools to measure it. The purpose of this study was to develop and validate a questionnaire to evaluate the factors influencing training transfer (FITT) among nursing professionals. The questionnaire was developed by item generation through interviews with nurses and a literature review. The FITT was validated in terms of content validity through expert reviews. Psychometric properties of the final instrument were assessed in a sample of 960 nurses with training experience. The content validity of the instrument was as follows: the IR was 0.8095; 51 items on the 63-item scale had I-CVIs of 1.0 and the remaining 12 items had I-CVIs of 0.88; the S-CVI/UA was 0.976 and the S-CVI/Ave was 0.977. For the exploratory step, principal axis factoring (PAF) was selected. Parallel analysis was used to decide the number of factors to extract, and the oblimin rotation method was used. Exploratory factor analysis identified a five-factor solution including 53 items, accounting for 68.23% of the total variance. The confirmatory factor analysis showed some support for this five-factor model. The findings demonstrate high internal consistency (Cronbach's alpha = .965). This study indicates that the FITT is a valid and reliable instrument for assessing the factors influencing training transfer among nursing professionals. The FITT can be used to assess individual perceptions of catalysts and barriers to the transfer of training among nursing professionals, which can help promote training transfer and training effectiveness in the workplace.
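
    The internal-consistency statistic reported above, Cronbach's alpha, has a standard formula and can be computed directly from an items-by-respondents score matrix; the sketch below uses synthetic data with the study's dimensions (960 respondents, 53 retained items).

```python
# Cronbach's alpha from an (n_respondents, n_items) score matrix.
import numpy as np

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(7)
trait = rng.normal(size=(960, 1))                       # latent trait per nurse
items = trait + rng.normal(scale=0.8, size=(960, 53))   # 53 correlated items
print("Cronbach's alpha:", round(cronbach_alpha(items), 3))
```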

  17. An Approach for Validating Actinide and Fission Product Burnup Credit Criticality Safety Analyses: Criticality (k_eff) Predictions

    DOE PAGES

    Scaglione, John M.; Mueller, Don E.; Wagner, John C.

    2014-12-01

    One of the most important remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of the depletion and criticality calculations used in the safety evaluation—in particular, the availability and use of applicable measured data to support validation, especially for fission products (FPs). Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of a clear technical basis or approach for use of the data. This paper describes a validation approach for commercial spent nuclear fuel (SNF) criticality safety (k_eff) evaluations based on best-available data and methods, and applies the approach to representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The criticality validation approach utilizes not only available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion program to support validation of the principal actinides, but also calculated sensitivities, nuclear data uncertainties, and the limited available FP LCE data to predict and verify individual biases for relevant minor actinides and FPs. The results demonstrate that (a) sufficient critical experiment data exist to adequately validate k_eff calculations via conventional validation approaches for the primary actinides, (b) sensitivity-based critical experiment selection is more appropriate for generating accurate application model bias and uncertainty, and (c) calculated sensitivities and nuclear data uncertainties can be used for generating conservative estimates of bias for minor actinides and FPs. Results based on SCALE 6.1 and the ENDF/B-VII.0 cross-section libraries indicate that a conservative estimate of the bias for the minor actinides and FPs is 1.5% of their worth within the application model. Finally, this paper provides a detailed description of the approach and its technical bases, describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models, and provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data.

  18. A regression-based 3-D shoulder rhythm.

    PubMed

    Xu, Xu; Lin, Jia-hua; McGorry, Raymond W

    2014-03-21

    In biomechanical modeling of the shoulder, it is important to know the orientation of each bone in the shoulder girdle when estimating the loads on each musculoskeletal element. However, because of the soft tissue overlying the bones, it is difficult to accurately derive the orientation of the clavicle and scapula using surface markers during dynamic movement. The purpose of this study was to develop two regression models that predict the orientation of the clavicle and the scapula. The first regression model uses humerus orientation and individual factors such as age, gender, and anthropometry data as predictors; the second includes only the humerus orientation. Thirty-eight participants performed 118 static postures covering the volume of right-hand reach. The orientations of the thorax, clavicle, scapula and humerus were measured with a motion tracking system. Regression analysis was performed on the Euler angles decomposed from the orientation of each bone for 26 randomly selected participants, and the regression models were then validated with the remaining 12 participants. The results indicate that for the first model, the r² of the predicted orientation of the clavicle and the scapula ranged between 0.31 and 0.65, and the RMSE obtained from the validation dataset ranged from 6.92° to 10.39°. For the second model, the r² ranged between 0.19 and 0.57, and the RMSE obtained from the validation dataset ranged from 6.62° to 11.13°. The derived regression-based shoulder rhythm could be useful in future biomechanical modeling of the shoulder. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
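
    A minimal sketch of the second (humerus-only) model follows: a linear regression predicting one scapular Euler angle from the three humerus Euler angles, validated on held-out participants as in the study design. The synthetic data, coefficients, and single-angle simplification are assumptions; the paper fits separate regressions per Euler angle.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(1)
n_subj, n_postures = 38, 118
groups = np.repeat(np.arange(n_subj), n_postures)          # participant labels
X = rng.uniform(-90, 90, (n_subj * n_postures, 3))         # humerus Euler angles (deg)
y = 0.4 * X[:, 0] - 0.2 * X[:, 2] + rng.normal(0, 7, len(X))  # one scapular angle

# Split by participant (26 train / 12 validation), mirroring the study
gss = GroupShuffleSplit(n_splits=1, train_size=26, random_state=1)
train_idx, test_idx = next(gss.split(X, y, groups))

model = LinearRegression().fit(X[train_idx], y[train_idx])
rmse = mean_squared_error(y[test_idx], model.predict(X[test_idx])) ** 0.5
print(f"validation RMSE = {rmse:.2f} deg, "
      f"r2 = {model.score(X[test_idx], y[test_idx]):.2f}")
```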

  19. Gastro-esophageal reflux disease symptoms and demographic factors as a pre-screening tool for Barrett's esophagus.

    PubMed

    Liu, Xinxue; Wong, Angela; Kadri, Sudarshan R; Corovic, Andrej; O'Donovan, Maria; Lao-Sirieix, Pierre; Lovat, Laurence B; Burnham, Rodney W; Fitzgerald, Rebecca C

    2014-01-01

    Barrett's esophagus (BE) occurs as a consequence of reflux and is a risk factor for esophageal adenocarcinoma. The current "gold standard" for diagnosing BE is endoscopy, which remains prohibitively expensive and impractical as a population screening tool. We aimed to develop a pre-screening tool to aid decision making for diagnostic referrals. A prospective (training) cohort of 1603 patients attending for endoscopy was used to identify risk factors and develop a risk prediction model. Factors associated with BE in the univariate analysis were selected to develop prediction models that were validated in an independent, external cohort of 477 non-BE patients referred for endoscopy with symptoms of reflux or dyspepsia. Two prediction models were developed separately, one for columnar-lined epithelium (CLE) of any length and one using a stricter definition of intestinal metaplasia (IM) with segments ≥ 2 cm, with areas under the ROC curve (AUC) of 0.72 (95%CI: 0.67-0.77) and 0.81 (95%CI: 0.76-0.86), respectively. The two prediction models included demographics (age, sex), symptoms (heartburn, acid reflux, chest pain, abdominal pain) and medication for "stomach" symptoms. These two models were validated in the independent cohort with AUCs of 0.61 (95%CI: 0.54-0.68) and 0.64 (95%CI: 0.52-0.77) for CLE and IM ≥ 2 cm, respectively. We have identified and validated two prediction models for CLE and IM ≥ 2 cm. Both models have fair prediction accuracy and can select out around 20% of individuals unlikely to benefit from investigation for Barrett's esophagus. Such prediction models have the potential to generate useful cost savings for BE screening among the symptomatic population.
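
    A minimal sketch of this kind of pre-screening model follows: a logistic regression on demographics, symptoms, and "stomach" medication, scored by AUC. The variable names mirror the abstract, but the synthetic data and coefficients are placeholders, not the PubMed cohort.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 1603
train = pd.DataFrame({
    "age": rng.integers(25, 85, n),
    "male": rng.integers(0, 2, n),
    "heartburn": rng.integers(0, 2, n),
    "acid_reflux": rng.integers(0, 2, n),
    "chest_pain": rng.integers(0, 2, n),
    "abdominal_pain": rng.integers(0, 2, n),
    "stomach_medication": rng.integers(0, 2, n),
})
# Synthetic outcome with an assumed age/sex/reflux effect
logit = 0.04 * (train["age"] - 55) + 0.8 * train["male"] + 0.6 * train["acid_reflux"] - 1.5
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(train, y)
print(f"training AUC = {roc_auc_score(y, model.predict_proba(train)[:, 1]):.2f}")
```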

  20. An Empirical Comparison of Different Models of Active Aging in Canada: The International Mobility in Aging Study.

    PubMed

    Bélanger, Emmanuelle; Ahmed, Tamer; Filiatrault, Johanne; Yu, Hsiu-Ting; Zunzunegui, Maria Victoria

    2017-04-01

    Active aging is a concept that lacks consensus. The WHO defines it as a holistic concept that encompasses the overall health, participation, and security of older adults. Fernández-Ballesteros and colleagues propose a similar concept but omit security and include mood and cognitive function. To date, researchers attempting to validate conceptual models of active aging have obtained mixed results. The goal of this study was to examine the validity of existing models of active aging with epidemiological data from Canada. The WHO model of active aging and the psychological model of active aging developed by Fernández-Ballesteros and colleagues were tested with confirmatory factor analysis. The data used included 799 community-dwelling older adults between 65 and 74 years old, recruited from the patient lists of family physicians in Saint-Hyacinthe, Quebec and Kingston, Ontario. Neither model could be validated in the sample of Canadian older adults. Although a concept of healthy aging can be modeled adequately, social participation and security did not fit a latent factor model. A simple binary index indicated that 27% of older adults in the sample did not meet the active aging criteria proposed by the WHO. Our results suggest that active aging might represent a human rights policy orientation rather than an empirical measurement tool to guide research among older adult populations. Binary indexes of active aging may serve to highlight what remains to be improved about the health, participation, and security of growing populations of older adults. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Dynamic vehicle-track interaction in switches and crossings and the influence of rail pad stiffness - field measurements and validation of a simulation model

    NASA Astrophysics Data System (ADS)

    Pålsson, Björn A.; Nielsen, Jens C. O.

    2015-06-01

    A model for simulation of dynamic interaction between a railway vehicle and a turnout (switch and crossing, S&C) is validated versus field measurements. In particular, the implementation and accuracy of viscously damped track models with different complexities are assessed. The validation data come from full-scale field measurements of dynamic track stiffness and wheel-rail contact forces in a demonstrator turnout that was installed as part of the INNOTRACK project with funding from the European Union Sixth Framework Programme. Vertical track stiffness at nominal wheel loads, in the frequency range up to 20 Hz, was measured using a rolling stiffness measurement vehicle (RSMV). Vertical and lateral wheel-rail contact forces were measured by an instrumented wheel set mounted in a freight car featuring Y25 bogies. The measurements were performed for traffic in both the through and diverging routes, and in the facing and trailing moves. The full set of test runs was repeated with different types of rail pad to investigate the influence of rail pad stiffness on track stiffness and contact forces. It is concluded that impact loads on the crossing can be reduced by using more resilient rail pads. To allow for vehicle dynamics simulations at low computational cost, the track models are discretised space-variant mass-spring-damper models that move with each wheel set of the vehicle model. Acceptable agreement between simulated and measured vertical contact forces at the crossing can be obtained when the standard GENSYS track model is extended with one ballast/subgrade mass under each rail. This model can be tuned to capture the large phase delay in dynamic track stiffness at low frequencies, as measured by the RSMV, while remaining sufficiently resilient at higher frequencies.
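
    To make the lumped track model concrete, the sketch below computes the complex dynamic stiffness, magnitude and phase, of a two-layer spring-damper arrangement (rail pad over a ballast/subgrade mass) across the RSMV frequency range. The layer topology and all parameter values are illustrative assumptions, not the tuned GENSYS model of the demonstrator turnout.

```python
import numpy as np

# Illustrative parameters: rail pad and ballast/subgrade layers per rail
k_pad, c_pad = 120e6, 25e3      # pad stiffness [N/m] and damping [Ns/m]
k_bal, c_bal = 80e6, 80e3       # ballast/subgrade stiffness and damping
m_bal = 1500.0                  # ballast/subgrade mass [kg]

f = np.linspace(0.1, 20.0, 200)           # RSMV frequency range [Hz]
w = 2 * np.pi * f
k1 = k_pad + 1j * w * c_pad               # complex pad stiffness
k2 = k_bal + 1j * w * c_bal               # complex ballast stiffness

# Railhead force over railhead displacement for pad -> mass -> ballast chain
K = k1 * (k2 - m_bal * w**2) / (k1 + k2 - m_bal * w**2)

i = np.argmin(abs(f - 2.0))               # inspect a low frequency
print(f"stiffness at 2 Hz: {abs(K[i]) / 1e6:.1f} MN/m, "
      f"phase {np.degrees(np.angle(K[i])):.1f} deg")
```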

  2. A Confirmatory Factor Analysis of the Student Evidence-Based Practice Questionnaire (S-EBPQ) in an Australian sample.

    PubMed

    Beccaria, Lisa; Beccaria, Gavin; McCosker, Catherine

    2018-03-01

    It is crucial that nursing students develop skills and confidence in using Evidence-Based Practice principles early in their education. This should be assessed with valid tools; however, to date, few measures have been developed and applied to the student population. The aim was to examine the structural validity of the Student Evidence-Based Practice Questionnaire (S-EBPQ) with an Australian online nursing student cohort, using a cross-sectional construct-validity design. Three hundred and forty-five undergraduate nursing students from an Australian regional university were recruited across two semesters. Confirmatory Factor Analysis was used to examine structural validity and resulted in a good-fitting model based on a revised 20-item tool. The S-EBPQ remains a psychometrically robust measure of evidence-based practice use, attitudes, and knowledge and skills, and can be applied in an online Australian student context. The findings of this study provide further evidence of the reliability and four-factor structure of the S-EBPQ. Further refinement of the tool may yield improvements in structural validity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Developing a dengue forecast model using machine learning: A case study in China

    PubMed Central

    Zhang, Qin; Wang, Li; Xiao, Jianpeng; Zhang, Qingying; Luo, Ganfeng; Li, Zhihao; He, Jianfeng; Zhang, Yonghui; Ma, Wenjun

    2017-01-01

    Background: In China, dengue remains an important public health issue, with expanded areas and increased incidence in recent years. Accurate and timely forecasts of dengue incidence in China are still lacking. We aimed to use state-of-the-art machine learning algorithms to develop an accurate predictive model of dengue. Methodology/Principal findings: Weekly dengue cases, Baidu search queries and climate factors (mean temperature, relative humidity and rainfall) during 2011–2014 in Guangdong were gathered. A dengue search index was constructed for developing the predictive models in combination with climate factors. The observed year and week were also included in the models to control for long-term trend and seasonality. Several machine learning algorithms, including the support vector regression (SVR) algorithm, a step-down linear regression model, the gradient boosted regression tree algorithm (GBM), a negative binomial regression model (NBM), the least absolute shrinkage and selection operator (LASSO) linear regression model and a generalized additive model (GAM), were used as candidate models to predict dengue incidence. Performance and goodness of fit of the models were assessed using root-mean-square error (RMSE) and R-squared measures. The residuals of the models were examined using autocorrelation and partial autocorrelation function analyses to check the validity of the models. The models were further validated using dengue surveillance data from five other provinces. The epidemics during the last 12 weeks and the peak of the 2014 large outbreak were accurately forecasted by the SVR model selected by a cross-validation technique. Moreover, the SVR model had the consistently smallest prediction error rates for tracking the dynamics of dengue and forecasting outbreaks in other areas of China. Conclusion and significance: The proposed SVR model achieved superior performance in comparison with the other forecasting techniques assessed in this study. The findings can help the government and community respond early to dengue epidemics. PMID:29036169
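
    The sketch below mirrors the SVR setup at a small scale: weekly counts predicted from a search index, climate covariates, and year/week terms, with hyperparameters chosen by time-series cross-validation and the last 12 weeks held out. All data are synthetic and the feature construction is an assumption, not the paper's pipeline.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_weeks = 208                                   # four years of weekly data
week = np.arange(n_weeks) % 52
X = np.column_stack([
    rng.random(n_weeks),                        # search index (scaled, synthetic)
    25 + 5 * np.sin(2 * np.pi * week / 52),     # mean temperature
    rng.uniform(60, 90, n_weeks),               # relative humidity
    rng.gamma(2.0, 10.0, n_weeks),              # rainfall
    np.arange(n_weeks) // 52,                   # year (long-term trend)
    week,                                       # week of year (seasonality)
])
y = 10 + 30 * np.sin(2 * np.pi * week / 52).clip(0) + rng.poisson(3, n_weeks)

model = make_pipeline(StandardScaler(), SVR())
grid = GridSearchCV(model,
                    {"svr__C": [1, 10, 100], "svr__epsilon": [0.1, 1.0]},
                    cv=TimeSeriesSplit(n_splits=5),
                    scoring="neg_root_mean_squared_error").fit(X[:-12], y[:-12])

pred = grid.predict(X[-12:])                    # forecast the final 12 weeks
rmse = np.sqrt(np.mean((pred - y[-12:]) ** 2))
print(f"best params {grid.best_params_}, 12-week RMSE = {rmse:.1f}")
```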

  4. Confirmatory factor analysis of the female sexual function index.

    PubMed

    Opperman, Emily A; Benson, Lindsay E; Milhausen, Robin R

    2013-01-01

    The Female Sexual Functioning Index (Rosen et al., 2000) was designed to assess the key dimensions of female sexual functioning using six domains: desire, arousal, lubrication, orgasm, satisfaction, and pain. A full-scale score was proposed to represent women's overall sexual function. The fifth revision of the Diagnostic and Statistical Manual (DSM) is currently underway and includes a proposal to combine desire and arousal problems. The objective of this article was to evaluate and compare four models of the Female Sexual Functioning Index: (a) a single-factor model, (b) a six-factor model, (c) a second-order factor model, and (d) a five-factor model combining the desire and arousal subscales. Cross-sectional, observational data from 85 women were used to conduct a confirmatory factor analysis of the Female Sexual Functioning Index. Local and global goodness-of-fit measures, the chi-square test of differences, squared multiple correlations, and regression weights were used. The single-factor model fit was not acceptable. The original six-factor model was confirmed, and good model fit was also found for the second-order and five-factor models. Delta chi-square tests of differences supported best fit for the six-factor model, validating usage of the six domains. However, if revisions are made in DSM-5, the Female Sexual Functioning Index can adapt to reflect these changes and remain a valid assessment tool for women's sexual functioning, as the five-factor structure was also supported.
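
    The delta chi-square comparison used here is mechanically simple, as the sketch below shows for two nested models (the five-factor model is the more restrictive one). The fit statistics are placeholders, not values from the paper.

```python
from scipy.stats import chi2

# Placeholder fit statistics for two nested CFA models
chisq_6, df_6 = 210.4, 174   # less restrictive six-factor model
chisq_5, df_5 = 240.9, 179   # more restrictive five-factor model

d_chisq, d_df = chisq_5 - chisq_6, df_5 - df_6
p = chi2.sf(d_chisq, d_df)   # upper-tail probability of the difference
print(f"delta chi2 = {d_chisq:.1f} on {d_df} df, p = {p:.4f}")
# p < .05 here would indicate the six-factor model fits significantly better
```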

  5. Towards Personalized Cardiology: Multi-Scale Modeling of the Failing Heart

    PubMed Central

    Amr, Ali; Neumann, Dominik; Georgescu, Bogdan; Seegerer, Philipp; Kamen, Ali; Haas, Jan; Frese, Karen S.; Irawati, Maria; Wirsz, Emil; King, Vanessa; Buss, Sebastian; Mereles, Derliz; Zitron, Edgar; Keller, Andreas; Katus, Hugo A.; Comaniciu, Dorin; Meder, Benjamin

    2015-01-01

    Background: Despite modern pharmacotherapy and advanced implantable cardiac devices, the overall prognosis and quality of life of heart failure (HF) patients remain poor. This is in part due to insufficient patient stratification and a lack of individualized therapy planning, resulting in less effective treatments and a significant number of non-responders. Methods and Results: State-of-the-art clinical phenotyping was acquired, including magnetic resonance imaging (MRI) and biomarker assessment. An individualized, multi-scale model of heart function covering cardiac anatomy, electrophysiology, biomechanics and hemodynamics was estimated using a robust framework. The model was computed on n=46 HF patients, showing for the first time that advanced multi-scale models can be fitted consistently on large cohorts. Novel multi-scale parameters derived from the model for all cases were analyzed and compared against clinical parameters, cardiac imaging, lab tests and survival scores to evaluate the explicative power of the model and its potential for better patient stratification. Model validation was pursued by comparing clinical parameters that were not used in the fitting process against model parameters. Conclusion: This paper illustrates how advanced multi-scale models can complement cardiovascular imaging and how they could be applied in patient care. Based on the obtained results, it becomes conceivable that, after thorough validation, such heart failure models could be applied for patient management and therapy planning in the future, as we illustrate in one patient of our cohort who received CRT-D implantation. PMID:26230546

  6. Development and validation of response markers to predict survival and pleurodesis success in patients with malignant pleural effusion (PROMISE): a multicohort analysis.

    PubMed

    Psallidas, Ioannis; Kanellakis, Nikolaos I; Gerry, Stephen; Thézénas, Marie Laëtitia; Charles, Philip D; Samsonova, Anastasia; Schiller, Herbert B; Fischer, Roman; Asciak, Rachelle; Hallifax, Robert J; Mercer, Rachel; Dobson, Melissa; Dong, Tao; Pavord, Ian D; Collins, Gary S; Kessler, Benedikt M; Pass, Harvey I; Maskell, Nick; Stathopoulos, Georgios T; Rahman, Najib M

    2018-06-13

    The prevalence of malignant pleural effusion is increasing worldwide, but prognostic biomarkers to plan treatment and to understand the underlying mechanisms of disease progression remain unidentified. The PROMISE study was designed with the objectives to discover, validate, and prospectively assess biomarkers of survival and pleurodesis response in malignant pleural effusion and build a score that predicts survival. In this multicohort study, we used five separate and independent datasets from randomised controlled trials to investigate potential biomarkers of survival and pleurodesis. Mass spectrometry-based discovery was used to investigate pleural fluid samples for differential protein expression in patients from the discovery group with different survival and pleurodesis outcomes. Clinical, radiological, and biological variables were entered into least absolute shrinkage and selection operator regression to build a model that predicts 3-month mortality. We evaluated the model using internal and external validation. 17 biomarker candidates of survival and seven of pleurodesis were identified in the discovery dataset. Three independent datasets (n=502) were used for biomarker validation. All pleurodesis biomarkers failed, and gelsolin, macrophage migration inhibitory factor, versican, and tissue inhibitor of metalloproteinases 1 (TIMP1) emerged as accurate predictors of survival. Eight variables (haemoglobin, C-reactive protein, white blood cell count, Eastern Cooperative Oncology Group performance status, cancer type, pleural fluid TIMP1 concentrations, and previous chemotherapy or radiotherapy) were validated and used to develop a survival score. Internal validation with bootstrap resampling and external validation with 162 patients from two independent datasets showed good discrimination (C statistic values of 0·78 [95% CI 0·72-0·83] for internal validation and 0·89 [0·84-0·93] for external validation of the clinical PROMISE score). To our knowledge, the PROMISE score is the first prospectively validated prognostic model for malignant pleural effusion that combines biological and clinical parameters to accurately estimate 3-month mortality. It is a robust, clinically relevant prognostic score that can be applied immediately, provide important information on patient prognosis, and guide the selection of appropriate management strategies. European Respiratory Society, Medical Research Funding-University of Oxford, Slater & Gordon Research Fund, and Oxfordshire Health Services Research Committee Research Grants. Copyright © 2018 Elsevier Ltd. All rights reserved.
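
    A minimal sketch of the score-building step follows: an L1-penalised (LASSO-type) logistic model for 3-month mortality with a bootstrap optimism correction of the apparent C statistic, in the spirit of the internal validation described. The synthetic data, the number of bootstraps, and the sklearn-based formulation are assumptions; the paper's actual modeling is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
n, p = 300, 8                          # patients x predictors (Hb, CRP, WBC, ECOG, ...)
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] - 0.8 * X[:, 3])))).astype(int)

model = LogisticRegressionCV(penalty="l1", solver="liblinear", cv=5).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap optimism: refit on resamples, compare resample AUC to original-data AUC
optimism = []
for _ in range(100):
    idx = rng.integers(0, n, n)
    boot = LogisticRegressionCV(penalty="l1", solver="liblinear", cv=5).fit(X[idx], y[idx])
    optimism.append(roc_auc_score(y[idx], boot.predict_proba(X[idx])[:, 1])
                    - roc_auc_score(y, boot.predict_proba(X)[:, 1]))

print(f"apparent C = {apparent:.2f}, "
      f"optimism-corrected C = {apparent - np.mean(optimism):.2f}")
```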

  7. Integrated Approach to Inform the New York City Water Supply System Coupling SAR Remote Sensing Observations and the SWAT Watershed Model

    NASA Astrophysics Data System (ADS)

    Tesser, D.; Hoang, L.; McDonald, K. C.

    2017-12-01

    Efforts to improve municipal water supply systems increasingly rely on an ability to elucidate the variables that drive hydrologic dynamics within large watersheds. However, fundamental model variables such as precipitation, soil moisture, evapotranspiration, and soil freeze/thaw state remain difficult to measure empirically across large, heterogeneous watersheds. Satellite remote sensing presents a method to validate these spatially and temporally dynamic variables as well as to better inform the watershed models that monitor the water supply for many of the planet's most populous urban centers. PALSAR 2 L-band, Sentinel 1 C-band, and SMAP L-band scenes covering the Cannonsville branch of the New York City (NYC) water supply watershed were obtained for the period March 2015 - October 2017. The SAR data provide information on soil moisture, freeze/thaw state, seasonal surface inundation, and variable source areas within the study site. Integrating the remote sensing products with watershed model outputs and ground survey data improves the representation of related processes in the Soil and Water Assessment Tool (SWAT) utilized to monitor the NYC water supply. PALSAR 2 supports accurate mapping of the extent of variable source areas, while Sentinel 1 presents a method to model the timing and magnitude of snowmelt runoff events. The SMAP active radar soil moisture product directly validates SWAT outputs at the subbasin level. This blended approach verifies the distribution of soil wetness classes within the watershed that delineate Hydrologic Response Units (HRUs) in the modified SWAT-Hillslope. The research expands the ability to model the NYC water supply source beyond a subset of the watershed while also providing high-resolution information across a larger spatial scale. The global availability of these remote sensing products provides a method to capture fundamental hydrologic variables in regions where current modeling efforts and in situ data remain limited.

  8. Using Resin-Based 3D Printing to Build Geometrically Accurate Proxies of Porous Sedimentary Rocks.

    PubMed

    Ishutov, Sergey; Hasiuk, Franciszek J; Jobe, Dawn; Agar, Susan

    2018-05-01

    Three-dimensional (3D) printing is capable of transforming intricate digital models into tangible objects, allowing geoscientists to replicate the geometry of 3D pore networks of sedimentary rocks. We provide a refined method for building scalable pore-network models ("proxies") using stereolithography 3D printing that can be used in repeated flow experiments (e.g., core flooding, permeametry, porosimetry). Typically, this workflow involves two steps, model design and 3D printing. In this study, we explore how the addition of post-processing and validation can reduce uncertainty in 3D-printed proxy accuracy (the difference of the proxy geometry from the digital model). Post-processing is a multi-step cleaning of porous proxies involving pressurized ethanol flushing and oven drying. Proxies are validated by (1) helium porosimetry and (2) digital measurements of porosity from thin-section images of 3D-printed proxies. 3D printer resolution was determined by measuring the smallest open channel in 3D-printed "gap test" wafers. This resolution (400 µm) was insufficient to reproduce the porosity of Fontainebleau sandstone (∼13%) from computed tomography data at the sample's natural scale, so proxies were printed at 15-, 23-, and 30-fold magnifications to validate the workflow. Helium porosities of the 3D-printed proxies differed from digital calculations by up to 7 percentage points. Results improved after pressurized flushing with ethanol (e.g., the porosity difference was reduced to ∼1 percentage point), though uncertainties remain regarding the nature of sub-micron "artifact" pores imparted by the 3D printing process. This study shows the benefits of including post-processing and validation in any workflow to produce porous rock proxies. © 2017, National Ground Water Association.
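
    The digital porosity check reduces to a pore-fraction calculation on a segmented image, as the sketch below illustrates. The synthetic grayscale "image", the threshold, and the helium reference value are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(5)
image = rng.normal(loc=0.6, scale=0.2, size=(512, 512))  # synthetic grayscale, pores darker

pore_mask = image < 0.35                 # illustrative threshold separating pore from solid
digital_porosity = pore_mask.mean()      # pore pixels / total pixels

helium_porosity = 0.13                   # assumed helium porosimetry value (Fontainebleau-like)
diff_pct_points = abs(digital_porosity - helium_porosity) * 100
print(f"digital = {digital_porosity:.3f}, helium = {helium_porosity:.3f}, "
      f"difference = {diff_pct_points:.1f} percentage points")
```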

  9. Validation of landsurface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

    The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g., vegetation properties, surface albedos and roughnesses) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g., "bucket" vs. more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  10. Landslide susceptibility mapping at Hoa Binh province (Vietnam) using an adaptive neuro-fuzzy inference system and GIS

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Pradhan, Biswajeet; Lofman, Owe; Revhaug, Inge; Dick, Oystein B.

    2012-08-01

    The objective of this study is to investigate a potential application of the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Geographic Information System (GIS) as a relatively new approach for landslide susceptibility mapping in the Hoa Binh province of Vietnam. Firstly, a landslide inventory map with a total of 118 landslide locations was constructed from various sources. The inventory was then randomly split into a training dataset of 70% (82 landslide locations) used to train the models, with the remaining 30% (36 landslide locations) reserved for validation. Ten landslide conditioning factors (slope, aspect, curvature, lithology, land use, soil type, rainfall, distance to roads, distance to rivers, and distance to faults) were considered in the analysis. The hybrid learning algorithm and six different membership functions (Gaussmf, Gauss2mf, Gbellmf, Sigmf, Dsigmf, Psigmf) were applied to generate the landslide susceptibility maps. The validation dataset, which was not used in the ANFIS modeling process, was used to validate the landslide susceptibility maps using the prediction rate method. The validation results showed that the area under the curve (AUC) for the six ANFIS models varies from 0.739 to 0.848, indicating that prediction capability depends on the membership functions used in the ANFIS. The models with Sigmf (0.848) and Gaussmf (0.825) showed the highest prediction capability. These results show that landslide susceptibility mapping in the Hoa Binh province of Vietnam using the ANFIS approach is viable: the performance of the approach was quite satisfactory, with the zones determined on the map representing relative susceptibility.

  11. Surgical simulation: a urological perspective.

    PubMed

    Wignall, Geoffrey R; Denstedt, John D; Preminger, Glenn M; Cadeddu, Jeffrey A; Pearle, Margaret S; Sweet, Robert M; McDougall, Elspeth M

    2008-05-01

    Surgical education is changing rapidly as several factors, including budget constraints and medicolegal concerns, limit opportunities for urological trainees. New methods of skills training, such as low-fidelity bench trainers and virtual reality simulators, offer new avenues for surgical education. In addition, surgical simulation has the potential to allow practicing surgeons to develop new skills and maintain those they already possess. Based on a literature review, we provide an overview of the background, current status and future directions of surgical simulators as they pertain to urology. Surgical simulators are in various stages of development and validation. Several have undergone extensive validation studies and are in use in surgical curricula. While virtual reality simulators offer the potential to more closely mimic reality and present entire operations, low-fidelity simulators remain useful in skills training, particularly for novices and junior trainees. Surgical simulation remains in its infancy. However, the potential to shorten learning curves for difficult techniques and to practice surgery without risk to patients continues to drive the development of increasingly advanced and realistic models. Surgical simulation is an exciting area of surgical education. The future is bright, as advancements in computing and graphical capabilities offer new innovations in simulator technology. Simulators must continue to undergo rigorous validation studies to ensure that time spent by trainees on bench trainers and virtual reality simulators translates into improved surgical skills in the operating room.

  12. 28 CFR 25.5 - Validation and data integrity of records in the system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    28 Judicial Administration, 2010-07-01. INFORMATION SYSTEMS: The National Instant Criminal Background Check System, § 25.5 Validation and data integrity of records in the system. ... verify that the information provided to the NICS Index remains valid and correct. (b) Each data source...

  13. Global parameterization and validation of a two-leaf light use efficiency model for predicting gross primary production across FLUXNET sites: TL-LUE Parameterization and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Yanlian; Wu, Xiaocui; Ju, Weimin

    2016-04-01

    We present the first extended validation of satellite microwave (MW) liquid water path (LWP) retrievals for low nonprecipitating clouds, from four operational sensors, against ship-borne observations from a three-channel MW radiometer collected along ship transects over the northeast Pacific during May–August 2013. Satellite MW retrievals have an overall correlation of 0.84 with ship observations and a bias of 9.3 g/m2. The bias for broken cloud scenes increases linearly with water vapor path and remains below 17.7 g/m2. In contrast, satellite MW LWP is unbiased in overcast scenes, with correlations up to 0.91, demonstrating that the retrievals are accurate and reliable under these conditions. Satellite MW retrievals produce a diurnal cycle amplitude consistent with ship-based observations (33 g/m2). Observations taken aboard extended ship cruises to evaluate not only satellite MW LWP but also LWP derived from visible/infrared sensors offer a new way to validate this important property over vast oceanic regions.

  14. Artificial neural networks predict the incidence of portosplenomesenteric venous thrombosis in patients with acute pancreatitis.

    PubMed

    Fei, Y; Hu, J; Li, W-Q; Wang, W; Zong, G-Q

    2017-03-01

    Essentials: Predicting the occurrence of portosplenomesenteric vein thrombosis (PSMVT) is difficult. We studied 72 patients with acute pancreatitis. Artificial neural network modeling was more accurate than logistic regression in predicting PSMVT. Additional predictive factors may be incorporated into artificial neural networks. Objective: To construct and validate artificial neural networks (ANNs) for predicting the occurrence of PSMVT and to compare the predictive ability of the ANNs with that of logistic regression. Methods: The ANN and logistic regression models were constructed using simple clinical and laboratory data from 72 acute pancreatitis (AP) patients. The models were first trained on 48 randomly chosen patients and validated on the remaining 24. The accuracy and performance characteristics were compared between the two approaches using SPSS 17.0. Results: The training and validation sets did not differ on any of the 11 variables. After training, the back-propagation network training error converged to 1 × 10^-20, and the network retained excellent pattern recognition ability. When the ANN model was applied to the validation set, it showed a sensitivity of 80%, specificity of 85.7%, positive predictive value of 77.6% and negative predictive value of 90.7%; the accuracy was 83.3%. Differences were found between the ANN and logistic regression models in these parameters (10.0% [95% CI, -14.3 to 34.3%], 14.3% [95% CI, -8.6 to 37.2%], 15.7% [95% CI, -9.9 to 41.3%], 11.8% [95% CI, -8.2 to 31.8%], and 22.6% [95% CI, -1.9 to 47.1%], respectively). When ANN modeling was used to identify PSMVT, the area under the receiver operating characteristic curve was 0.849 (95% CI, 0.807-0.901), demonstrating better overall properties than logistic regression modeling (AUC = 0.716; 95% CI, 0.679-0.761). Conclusions: ANN modeling was a more accurate tool than logistic regression for predicting the occurrence of PSMVT following AP. More clinical factors or biomarkers may be incorporated into ANN modeling to improve its predictive ability. © 2016 International Society on Thrombosis and Haemostasis.
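
    A minimal sketch of this kind of head-to-head comparison follows: a small neural network versus logistic regression on 11 variables, trained on 48 patients and validated on 24, reporting sensitivity, specificity and AUC. The synthetic data and the sklearn MLP stand in for the paper's clinical dataset and back-propagation network.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(72, 11))                                  # 72 patients x 11 variables
y = (rng.random(72) < 1 / (1 + np.exp(-1.2 * X[:, 0]))).astype(int)
Xtr, ytr, Xte, yte = X[:48], y[:48], X[48:], y[48:]            # 48 train / 24 validation

models = [("ANN", MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)),
          ("LR", LogisticRegression())]
for name, clf in models:
    clf.fit(Xtr, ytr)
    tn, fp, fn, tp = confusion_matrix(yte, clf.predict(Xte)).ravel()
    auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
    print(f"{name}: sens={tp/(tp+fn):.2f} spec={tn/(tn+fp):.2f} AUC={auc:.2f}")
```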

  15. Creation of a novel simulator for minimally invasive neurosurgery: fusion of 3D printing and special effects.

    PubMed

    Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R

    2017-07-01

    OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.

  16. Biomechanical implications of lumbar spinal ligament transection.

    PubMed

    Von Forell, Gregory A; Bowden, Anton E

    2014-11-01

    Many lumbar spine surgeries either intentionally or inadvertently damage or transect spinal ligaments. The purpose of this work was to quantify the previously unknown biomechanical consequences of isolated spinal ligament transection on the remaining spinal ligaments (stress transfer), vertebrae (bone remodelling stimulus) and intervertebral discs (disc pressure) of the lumbar spine. A finite element model of the full lumbar spine was developed and validated against experimental data and tested in the primary modes of spinal motion in the intact condition. Once a ligament was removed, stress increased in the remaining spinal ligaments and changes occurred in vertebral strain energy, but disc pressure remained similar. All major biomechanical changes occurred at the same spinal level as the transected ligament, with minor changes at adjacent levels. This work demonstrates that iatrogenic damage to spinal ligaments disturbs the load sharing within the spinal ligament network and may induce significant clinically relevant changes in the spinal motion segment.

  17. Is questionnaire-based sitting time inaccurate and can it be improved? A cross-sectional investigation using accelerometer-based sitting time.

    PubMed

    Gupta, Nidhi; Christiansen, Caroline Stordal; Hanisch, Christiana; Bay, Hans; Burr, Hermann; Holtermann, Andreas

    2017-01-16

    To investigate the differences between questionnaire-based and accelerometer-based sitting time, and to develop a model for improving the accuracy of questionnaire-based sitting time in predicting accelerometer-based sitting time. In this cross-sectional study, 183 workers reported sitting time per day using a single question during the measurement period, and wore two Actigraph GT3X+ accelerometers on the thigh and trunk for 1-4 working days to determine their actual sitting time per day using the validated Acti4 software. Least squares regression models were fitted with questionnaire-based sitting time and other self-reported predictors to predict accelerometer-based sitting time. Questionnaire-based and accelerometer-based average sitting times were ≈272 and ≈476 min/day, respectively. A low Pearson correlation (r=0.32), high mean bias (204.1 min) and wide limits of agreement (549.8 to -139.7 min) between questionnaire-based and accelerometer-based sitting time were found. The prediction model based on questionnaire-based sitting time explained 10% of the variance in accelerometer-based sitting time. Inclusion of 9 self-reported predictors in the model increased the explained variance to 41%, with 10% optimism using a resampling bootstrap validation. Based on a split validation analysis, the prediction model developed on ≈75% of the workers (n=132) reduced the mean and SD of the difference between questionnaire-based and accelerometer-based sitting time by 64% and 42%, respectively, in the remaining 25% of the workers. This study indicates that questionnaire-based sitting time has low validity and that a prediction model can be one solution to materially improve the precision of questionnaire-based sitting time. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
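
    The agreement statistics reported here (Pearson correlation, mean bias, and 95% limits of agreement) are straightforward to compute, as the sketch below shows on synthetic data whose parameters loosely echo the reported magnitudes; none of it is the study's data.

```python
import numpy as np

rng = np.random.default_rng(9)
acc = rng.normal(476, 90, 183)             # accelerometer sitting time (min/day), synthetic
quest = acc - rng.normal(204, 170, 183)    # questionnaire underestimates, synthetic

r = np.corrcoef(quest, acc)[0, 1]          # Pearson correlation
diff = acc - quest
bias = diff.mean()                          # mean bias
sd = diff.std(ddof=1)
loa = (bias + 1.96 * sd, bias - 1.96 * sd)  # 95% limits of agreement (upper, lower)
print(f"r = {r:.2f}, bias = {bias:.0f} min, LoA = ({loa[0]:.0f}, {loa[1]:.0f}) min")
```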

  18. Rat Genome and Model Resources.

    PubMed

    Shimoyama, Mary; Smith, Jennifer R; Bryda, Elizabeth; Kuramoto, Takashi; Saba, Laura; Dwinell, Melinda

    2017-07-01

    Rats remain a major model for studying disease mechanisms and discovery, validation, and testing of new compounds to improve human health. The rat's value continues to grow as indicated by the more than 1.4 million publications (second to human) at PubMed documenting important discoveries using this model. Advanced sequencing technologies, genome modification techniques, and the development of embryonic stem cell protocols ensure the rat remains an important mammalian model for disease studies. The 2004 release of the reference genome has been followed by the production of complete genomes for more than two dozen individual strains utilizing NextGen sequencing technologies; their analyses have identified over 80 million variants. This explosion in genomic data has been accompanied by the ability to selectively edit the rat genome, leading to hundreds of new strains through multiple technologies. A number of resources have been developed to provide investigators with access to precision rat models, comprehensive datasets, and sophisticated software tools necessary for their research. Those profiled here include the Rat Genome Database, PhenoGen, Gene Editing Rat Resource Center, Rat Resource and Research Center, and the National BioResource Project for the Rat in Japan. © The Author 2017. Published by Oxford University Press.

  19. Investigating the underlying mechanisms of aberrant behaviors in bipolar disorder from patients to models

    PubMed Central

    van Enkhuizen, Jordy; Geyer, Mark A.; Minassian, Arpi; Perry, William; Henry, Brook L.; Young, Jared W.

    2015-01-01

    Psychiatric patients with bipolar disorder suffer from states of depression and mania, during which a variety of symptoms are present. Current treatments are limited and neurocognitive deficits in particular often remain untreated. Targeted therapies based on the biological mechanisms of bipolar disorder could fill this gap and benefit patients and their families. Developing targeted therapies would benefit from appropriate animal models which are challenging to establish, but remain a vital tool. In this review, we summarize approaches to create a valid model relevant to bipolar disorder. We focus on studies that use translational tests of multivariate exploratory behavior, sensorimotor gating, decision-making under risk, and attentional functioning to discover profiles that are consistent between patients and rodent models. Using this battery of translational tests, similar behavior profiles in bipolar mania patients and mice with reduced dopamine transporter activity have been identified. Future investigations should combine other animal models that are biologically relevant to the neuropsychiatric disorder with translational behavioral assessment as outlined here. This methodology can be utilized to develop novel targeted therapies that relieve symptoms for more patients without common side effects caused by current treatments. PMID:26297513

  20. Zebrafish xenograft models of cancer and metastasis for drug discovery.

    PubMed

    Brown, Hannah K; Schiavone, Kristina; Tazzyman, Simon; Heymann, Dominique; Chico, Timothy Ja

    2017-04-01

    Patients with metastatic cancer suffer the highest rate of cancer-related death, but existing animal models of metastasis have disadvantages that limit our ability to understand this process. The zebrafish is increasingly used for cancer modelling, particularly xenografting of human cancer cell lines, and drug discovery, and may provide novel scientific and therapeutic insights. However, this model system remains underexploited. Areas covered: The authors discuss the advantages and disadvantages of the zebrafish xenograft model for the study of cancer, metastasis and drug discovery. They summarise previous work investigating the metastatic cascade, such as tumour-induced angiogenesis, intravasation, extravasation, dissemination and homing, invasion at secondary sites, assessing metastatic potential and evaluation of cancer stem cells in zebrafish. Expert opinion: The practical advantages of zebrafish for basic biological study and drug discovery are indisputable. However, their ability to sufficiently reproduce and predict the behaviour of human cancer and metastasis remains unproven. For this to be resolved, novel mechanisms must be discovered in zebrafish that are subsequently validated in humans, and therapeutic interventions that modulate cancer favourably in zebrafish must successfully translate to human clinical studies. In the meantime, more work is required to establish the most informative methods in zebrafish.

  1. Raman spectroscopy-based screening of IgM positive and negative sera for dengue virus infection

    NASA Astrophysics Data System (ADS)

    Bilal, M.; Saleem, M.; Bilal, Maria; Ijaz, T.; Khan, Saranjam; Ullah, Rahat; Raza, A.; Khurram, M.; Akram, W.; Ahmed, M.

    2016-11-01

    A statistical method based on Raman spectroscopy for the screening of immunoglobulin M (IgM) in dengue virus (DENV) infected human sera is presented. In total, 108 sera samples were collected and their antibody indexes (AI) for IgM were determined through enzyme-linked immunosorbent assay (ELISA). Raman spectra of these samples were acquired using a 785 nm excitation laser. Seventy-eight Raman spectra were selected at random for the development of a statistical model using partial least squares (PLS) regression, while the remaining 30 were used for testing the developed model. An R-square (r²) value of 0.929 was obtained using the leave-one-sample-out (LOO) cross-validation method, supporting the validity of the model, which accounts for the molecular changes related to IgM concentration and describes their role in infection. A graphical user interface (GUI) platform has been developed to run the multivariate model for the prediction of the AI of IgM for blindly tested samples, and excellent agreement has been found between model-predicted and clinically determined values. Parameters such as sensitivity, specificity, accuracy, and area under the receiver operator characteristic (ROC) curve for these tested samples are also reported to visualize model performance.
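
    The chemometric core of this pipeline, PLS regression from spectra to antibody index with leave-one-out cross-validation, can be sketched as below. The synthetic spectra, the single informative channel, and the number of PLS components are assumptions for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
spectra = rng.normal(size=(78, 1024))                # 78 training spectra x 1024 Raman shifts
ai = spectra[:, 100] * 2 + rng.normal(0, 0.1, 78)    # antibody index (synthetic linear link)

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, ai, cv=LeaveOneOut())

ss_res = np.sum((ai - pred.ravel()) ** 2)
ss_tot = np.sum((ai - ai.mean()) ** 2)
print(f"LOO cross-validated r2 = {1 - ss_res / ss_tot:.3f}")
```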

  2. Finite element strategies to satisfy clinical and engineering requirements in the field of percutaneous valves.

    PubMed

    Capelli, Claudio; Biglino, Giovanni; Petrini, Lorenza; Migliavacca, Francesco; Cosentino, Daria; Bonhoeffer, Philipp; Taylor, Andrew M; Schievano, Silvia

    2012-12-01

    Finite element (FE) modelling can be a very resourceful tool in the field of cardiovascular devices. To ensure result reliability, FE models must be validated experimentally against physical data. Their clinical application (e.g., patients' suitability, morphological evaluation) also requires fast simulation process and access to results, while engineering applications need highly accurate results. This study shows how FE models with different mesh discretisations can suit clinical and engineering requirements for studying a novel device designed for percutaneous valve implantation. Following sensitivity analysis and experimental characterisation of the materials, the stent-graft was first studied in a simplified geometry (i.e., compliant cylinder) and validated against in vitro data, and then in a patient-specific implantation site (i.e., distensible right ventricular outflow tract). Different meshing strategies using solid, beam and shell elements were tested. Results showed excellent agreement between computational and experimental data in the simplified implantation site. Beam elements were found to be convenient for clinical applications, providing reliable results in less than one hour in a patient-specific anatomical model. Solid elements remain the FE choice for engineering applications, albeit more computationally expensive (>100 times). This work also showed how information on device mechanical behaviour differs when acquired in a simplified model as opposed to a patient-specific model.

  3. Model-based prognostics for batteries which estimates useful life and uses a probability density function

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor)

    2012-01-01

    This invention develops a mathematical model to describe battery behavior during individual discharge cycles as well as over its cycle life. The basis for the form of the model has been linked to the internal processes of the battery and validated using experimental data. Effects of temperature and load current have also been incorporated into the model. Subsequently, the model has been used in a Particle Filtering framework to make predictions of remaining useful life for individual discharge cycles as well as for cycle life. The prediction performance was found to be satisfactory as measured by performance metrics customized for prognostics for a sample case. The work presented here provides initial steps towards a comprehensive health management solution for energy storage devices.
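
    To make the particle-filtering idea concrete, here is a minimal single-cycle sketch: particles carry a capacity state and a fade rate, weights are updated against a noisy capacity measurement, and RUL is read off each resampled particle's projected crossing of an end-of-life threshold. The linear fade model, threshold, and all constants are illustrative assumptions, not the patented battery model.

```python
import numpy as np

rng = np.random.default_rng(8)
n_particles, eol = 1000, 0.7                       # end-of-life at 70% capacity

state = rng.normal(1.0, 0.01, n_particles)         # capacity fraction per particle
decay = np.clip(rng.normal(0.005, 0.001, n_particles), 1e-4, None)  # fade per cycle
weights = np.full(n_particles, 1.0 / n_particles)

measurement, noise_sd = 0.94, 0.01                 # capacity observed this cycle
state -= decay                                     # propagate one discharge cycle
weights *= np.exp(-0.5 * ((measurement - state) / noise_sd) ** 2)  # Gaussian likelihood
weights /= weights.sum()

# Systematic resampling, then per-particle RUL to threshold crossing
u = (rng.random() + np.arange(n_particles)) / n_particles
idx = np.searchsorted(np.cumsum(weights), u)
state, decay = state[idx], decay[idx]
rul = (state - eol) / decay                        # cycles until capacity hits EOL

print(f"median RUL = {np.median(rul):.0f} cycles, 90% interval = "
      f"({np.percentile(rul, 5):.0f}, {np.percentile(rul, 95):.0f})")
```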

  4. Diagnosing the impact of alternative calibration strategies on coupled hydrologic models

    NASA Astrophysics Data System (ADS)

    Smith, T. J.; Perera, C.; Corrigan, C.

    2017-12-01

    Hydrologic models represent a significant tool for understanding, predicting, and responding to the impacts of water on society and of society on water resources and, as such, are used extensively in water resources planning and management. Given this important role, the validity and fidelity of hydrologic models are imperative. While extensive attention has been paid to improving hydrologic models through better process representation, better parameter estimation, and better uncertainty quantification, significant challenges remain. In this study, we explore a number of competing model calibration scenarios for simple, coupled snowmelt-runoff models to better understand the sensitivity and variability of parameterizations and their impact on model performance, robustness, fidelity, and transferability. Our analysis highlights the sensitivity of coupled snowmelt-runoff model parameterizations to alterations in calibration approach, underscores the concept of information content in hydrologic modeling, and provides insight into potential strategies for improving model robustness and fidelity.

  5. A network model of genomic hormone interactions underlying dementia and its translational validation through serendipitous off-target effect

    PubMed Central

    2013-01-01

    Background While the majority of studies have focused on the association between sex hormones and dementia, emerging evidence supports the role of other hormone signals in increasing dementia risk. However, due to the lack of an integrated view on mechanistic interactions of hormone signaling pathways associated with dementia, molecular mechanisms through which hormones contribute to the increased risk of dementia has remained unclear and capacity of translating hormone signals to potential therapeutic and diagnostic applications in relation to dementia has been undervalued. Methods Using an integrative knowledge- and data-driven approach, a global hormone interaction network in the context of dementia was constructed, which was further filtered down to a model of convergent hormone signaling pathways. This model was evaluated for its biological and clinical relevance through pathway recovery test, evidence-based analysis, and biomarker-guided analysis. Translational validation of the model was performed using the proposed novel mechanism discovery approach based on ‘serendipitous off-target effects’. Results Our results reveal the existence of a well-connected hormone interaction network underlying dementia. Seven hormone signaling pathways converge at the core of the hormone interaction network, which are shown to be mechanistically linked to the risk of dementia. Amongst these pathways, estrogen signaling pathway takes the major part in the model and insulin signaling pathway is analyzed for its association to learning and memory functions. Validation of the model through serendipitous off-target effects suggests that hormone signaling pathways substantially contribute to the pathogenesis of dementia. Conclusions The integrated network model of hormone interactions underlying dementia may serve as an initial translational platform for identifying potential therapeutic targets and candidate biomarkers for dementia-spectrum disorders such as Alzheimer’s disease. PMID:23885764

  6. Derivation and validation of a discharge disposition predicting model after acute stroke.

    PubMed

    Tseng, Hung-Pin; Lin, Feng-Jenq; Chen, Pi-Tzu; Mou, Chih-Hsin; Lee, Siu-Pak; Chang, Chun-Yuan; Chen, An-Chih; Liu, Chung-Hsiang; Yeh, Chung-Hsin; Tsai, Song-Yen; Hsiao, Yu-Jen; Lin, Ching-Huang; Hsu, Shih-Pin; Yu, Shih-Chieh; Hsu, Chung-Y; Sung, Fung-Chang

    2015-06-01

    Discharge disposition planning is vital for poststroke patients. We investigated clinical factors associated with discharging patients to nursing homes, using Taiwan Stroke Registry data collected from 39 major hospitals. We randomly assigned 21,575 stroke inpatients registered from 2006 to 2008 into derivation and validation groups at a 3-to-1 ratio. We used the derivation group to develop a prediction model by measuring cumulative risk scores associated with potential predictors: age, sex, hypertension, diabetes mellitus, heart diseases, stroke history, snoring, main caregivers, stroke types, and National Institutes of Health Stroke Scale (NIHSS). The probability of nursing home care and the odds ratio (OR) of nursing home care relative to home care were measured across cumulative risk scores for the prediction. The area under the receiver operating characteristic curve (AUROC) was used to assess model discrimination in the validation group. Except for hypertension, all the potential predictors were significant independent predictors of disposition to nursing home care after hospital discharge. The risk increased sharply with age and NIHSS. Patients with a cumulative risk score of 15 or more had an OR of 86.4 for nursing home disposition. The AUROC plots showed similar areas under the curves for the derivation group (.86; 95% confidence interval [CI], .85-.87) and the validation group (.84; 95% CI, .83-.86). The cumulative risk score is an easy-to-estimate tool for preparing stroke patients and their families for disposition at discharge. Copyright © 2015 National Stroke Association. Published by Elsevier Inc. All rights reserved.
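
    Cumulative-risk-score models of this kind are simple to operationalize: integer points per risk factor are summed and the total is mapped to a probability. The sketch below shows the mechanics; the point values, cutoffs, and logistic calibration are invented for illustration, not the weights derived from the registry.

```python
import numpy as np

# Hypothetical point assignments per risk factor (illustration only)
points = {"age>=75": 4, "male": 1, "diabetes": 1, "heart_disease": 1,
          "prior_stroke": 2, "NIHSS>=15": 6, "no_family_caregiver": 3}

def risk_score(patient):
    """Sum points for the risk factors present in a patient record."""
    return sum(v for k, v in points.items() if patient.get(k))

patient = {"age>=75": True, "NIHSS>=15": True, "prior_stroke": True}
score = risk_score(patient)
prob = 1 / (1 + np.exp(-(-4.0 + 0.35 * score)))   # illustrative logistic calibration
print(f"score = {score}, predicted P(nursing home disposition) = {prob:.2f}")
```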

  7. PASTIS: Bayesian extrasolar planet validation - I. General framework, models, and performance

    NASA Astrophysics Data System (ADS)

    Díaz, R. F.; Almenara, J. M.; Santerne, A.; Moutou, C.; Lethuillier, A.; Deleuil, M.

    2014-06-01

    A large fraction of the smallest transiting planet candidates discovered by the Kepler and CoRoT space missions cannot be confirmed by a dynamical measurement of the mass using currently available observing facilities. To establish their planetary nature, the concept of planet validation has been advanced. This technique compares the probability of the planetary hypothesis against that of all reasonably conceivable alternative false positive (FP) hypotheses. The candidate is considered as validated if the posterior probability of the planetary hypothesis is sufficiently larger than the sum of the probabilities of all FP scenarios. In this paper, we present PASTIS, the Planet Analysis and Small Transit Investigation Software, a tool designed to perform a rigorous model comparison of the hypotheses involved in the problem of planet validation, and to fully exploit the information available in the candidate light curves. PASTIS self-consistently models the transit light curves and follow-up observations. Its object-oriented structure offers a large flexibility for defining the scenarios to be compared. The performance is explored using artificial transit light curves of planets and FPs with a realistic error distribution obtained from a Kepler light curve. We find that data support the correct hypothesis strongly only when the signal is high enough (transit signal-to-noise ratio above 50 for the planet case) and remain inconclusive otherwise. PLAnetary Transits and Oscillations of stars (PLATO) shall provide transits with high enough signal-to-noise ratio, but to establish the true nature of the vast majority of Kepler and CoRoT transit candidates additional data or strong reliance on hypotheses priors is needed.

  8. Remaining lifetime modeling using State-of-Health estimation

    NASA Astrophysics Data System (ADS)

    Beganovic, Nejra; Söffker, Dirk

    2017-08-01

    Technical systems and their components undergo gradual degradation over time. Continuous degradation is reflected in decreased system reliability and unavoidably leads to system failure. Continuous evaluation of State-of-Health (SoH) is therefore essential, both to guarantee at least the lifetime specified by the manufacturer and, ideally, to extend it. A precondition for lifetime extension is accurate estimation of SoH as well as estimation and prediction of Remaining Useful Lifetime (RUL). For this purpose, lifetime models describing the relation between system/component degradation and consumed lifetime have to be established. This contribution discusses the modeling and selection of suitable lifetime models from a database based on current SoH conditions. The main contribution is the development of new modeling strategies able to describe complex relations between measurable system variables, the related system degradation, and RUL. Two approaches, with their accompanying advantages and disadvantages, are introduced and compared. Both can model stochastic aging processes of a system by simultaneously adapting RUL models to the current SoH. The first approach requires a priori knowledge about aging processes in the system and accurate estimation of SoH; the SoH estimate is conditioned on tracking the damage actually accumulated in the system, so that particular model parameters are defined according to a priori assumptions about the system's aging. Prediction accuracy in this case depends strongly on the accuracy of the SoH estimate, and the model carries a high number of degrees of freedom. The second approach does not require a priori knowledge about the system's aging: model parameters are defined by a multi-objective optimization procedure, prediction accuracy depends less on the estimated SoH, and the model has fewer degrees of freedom. Both approaches rely on previously developed lifetime models, each corresponding to a predefined SoH. In the first approach, model selection is aided by a state-machine-based algorithm; in the second, models are selected when predefined thresholds are exceeded. The approach is applied to data generated from tribological systems. The accuracy of the proposed models/approaches is discussed, along with their advantages and disadvantages, by calculating the Root Squared Error (RSE), Mean Squared Error (MSE), and Absolute Error (ABE). The approach is verified using cross-fold validation, exchanging training and test data. The newly introduced data-driven parametric models can be established easily and provide detailed information about remaining useful/consumed lifetime, valid for systems under constant load with stochastically occurring damage.
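
    A minimal sketch of the second (threshold-based) strategy under stated assumptions: a library of pre-fitted lifetime models indexed by SoH bands, a selector that switches models when the SoH estimate crosses a threshold, and the error measures named above. Model forms, thresholds, and the error definitions (the abstract does not define RSE/MSE/ABE precisely) are illustrative stand-ins.

      import numpy as np

      # Hypothetical library of lifetime models, one per SoH band:
      # each maps operating time to predicted RUL (linear stand-ins).
      models = {
          "healthy":  lambda t: 1000.0 - 0.8 * t,
          "degraded": lambda t: 800.0 - 1.2 * t,
          "critical": lambda t: 500.0 - 2.0 * t,
      }
      THRESHOLDS = [(0.7, "healthy"), (0.4, "degraded"), (0.0, "critical")]

      def select_model(soh):
          """Pick the lifetime model whose SoH band contains the current estimate."""
          for lower, name in THRESHOLDS:
              if soh >= lower:
                  return models[name]

      def errors(rul_true, rul_pred):
          r = np.asarray(rul_true) - np.asarray(rul_pred)
          return {"RSE": float(np.sqrt(np.sum(r**2))),   # root of summed squares
                  "MSE": float(np.mean(r**2)),
                  "ABE": float(np.mean(np.abs(r)))}

      rul_pred = [select_model(s)(t) for s, t in [(0.9, 100), (0.5, 300), (0.2, 200)]]
      print(errors([900, 450, 120], rul_pred))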

  9. A new extranodal scoring system based on the prognostically relevant extranodal sites in diffuse large B-cell lymphoma, not otherwise specified treated with chemoimmunotherapy.

    PubMed

    Hwang, Hee Sang; Yoon, Dok Hyun; Suh, Cheolwon; Huh, Jooryung

    2016-08-01

    Extranodal involvement is a well-known prognostic factor in patients with diffuse large B-cell lymphomas (DLBCL). Nevertheless, the prognostic impact of the extranodal scoring system included in the conventional international prognostic index (IPI) has been questioned in an era where rituximab treatment has become widespread. We investigated the prognostic impacts of individual sites of extranodal involvement in 761 patients with DLBCL who received rituximab-based chemoimmunotherapy. Subsequently, we established a new extranodal scoring system based on extranodal sites, showing significant prognostic correlation, and compared this system with conventional scoring systems, such as the IPI and the National Comprehensive Cancer Network-IPI (NCCN-IPI). An internal validation procedure, using bootstrapped samples, was also performed for both univariate and multivariate models. Using multivariate analysis with a backward variable selection, we found nine extranodal sites (the liver, lung, spleen, central nervous system, bone marrow, kidney, skin, adrenal glands, and peritoneum) that remained significant for use in the final model. Our newly established extranodal scoring system, based on these sites, was better correlated with patient survival than standard scoring systems, such as the IPI and the NCCN-IPI. Internal validation by bootstrapping demonstrated an improvement in model performance of our modified extranodal scoring system. Our new extranodal scoring system, based on the prognostically relevant sites, may improve the performance of conventional prognostic models of DLBCL in the rituximab era and warrants further external validation using large study populations.
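
    The internal validation step (bootstrapped assessment of model performance) can be sketched generically. The following is a Harrell-style optimism-correction loop on synthetic data, not the paper's code; the nine predictor columns stand in for the extranodal-site flags, and AUC substitutes for whatever performance statistic the authors used.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(300, 9))             # stand-in for 9 extranodal-site flags
      y = (X[:, 0] + X[:, 1] + rng.normal(size=300)) > 0

      model = LogisticRegression().fit(X, y)
      apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

      # Optimism estimate: refit on each bootstrap sample and compare its
      # performance on the bootstrap sample vs. on the original data.
      optimism = []
      for _ in range(200):
          idx = rng.integers(0, len(y), len(y))
          m = LogisticRegression().fit(X[idx], y[idx])
          auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
          auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
          optimism.append(auc_boot - auc_orig)

      print("optimism-corrected AUC:", apparent - np.mean(optimism))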

  10. Ozone delignification of pine and eucalyptus kraft pulps. 1: Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simoes, R.M.S.; Castro, J.A.A.M.

    1999-12-01

    The kinetics of ozone delignification of unbleached pine and eucalyptus kraft pulps is studied at ultralow consistency in a stirred reactor. Ozone consumption was monitored with sensors located in both the liquid and gas phases of the reacting medium, and the results confirm the expectations, i.e., the very high oxidation rates. The experiments were carried out following two different approaches that give rise to very different ozone concentration profiles in the pulp suspension and to significant improvements in the statistical contents of the experimental data. In the development of the delignification model, special attention was paid to its validation, and thus different sets of data for training and validation were used, leading to high levels of confidence in the model. As far as the delignification is concerned, its rate can be described for both pulps by a pseudohomogeneous model with partial orders of 1 and 2 for ozone and lignin contents, respectively. However, the remaining parameters of the kinetic model are markedly different for the two pulps. The effect of temperature on the delignification rate is small and can be characterized by an activation energy close to 20 kJ/mol for both pulps.
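
    The reported rate law can be written as -d[L]/dt = k(T)·[O3]·[L]^2, with k(T) following Arrhenius behavior (Ea close to 20 kJ/mol). A small sketch integrating this pseudohomogeneous model; the reference rate constant, concentrations, and units are illustrative, not the fitted values from the paper:

      import numpy as np

      R, EA = 8.314, 20e3            # J/(mol K); Ea ~ 20 kJ/mol as reported
      K_REF, T_REF = 1.0e-2, 298.0   # hypothetical rate constant at the reference T

      def k(T):
          """Arrhenius temperature dependence relative to the reference point."""
          return K_REF * np.exp(-EA / R * (1.0 / T - 1.0 / T_REF))

      def delignify(L0, ozone, T, t_end, dt=0.1):
          """Explicit Euler integration of -dL/dt = k(T) * [O3] * L**2."""
          L, t = L0, 0.0
          while t < t_end:
              L -= k(T) * ozone * L**2 * dt
              t += dt
          return L

      # Lignin content remaining after 60 time units (toy inputs).
      print(delignify(L0=15.0, ozone=2.0, T=313.0, t_end=60.0))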

  11. An externally validated model for predicting long-term survival after exercise treadmill testing in patients with suspected coronary artery disease and a normal electrocardiogram.

    PubMed

    Lauer, Michael S; Pothier, Claire E; Magid, David J; Smith, S Scott; Kattan, Michael W

    2007-12-18

    The exercise treadmill test is recommended for risk stratification among patients with intermediate to high pretest probability of coronary artery disease. Posttest risk stratification is based on the Duke treadmill score, which includes only functional capacity and measures of ischemia. To develop and externally validate a post-treadmill-test multivariable mortality prediction rule for adults with suspected coronary artery disease and normal electrocardiograms. Prospective cohort study conducted from September 1990 to May 2004. Exercise treadmill laboratories in a major medical center (derivation set) and a separate HMO (validation set). 33,268 patients in the derivation set and 5821 in the validation set. All patients had normal electrocardiograms and were referred for evaluation of suspected coronary artery disease. The derivation set patients were followed for a median of 6.2 years. A nomogram-illustrated model was derived on the basis of variables easily obtained in the stress laboratory, including age; sex; history of smoking, hypertension, diabetes, or typical angina; and exercise findings of functional capacity, ST-segment changes, symptoms, heart rate recovery, and frequent ventricular ectopy in recovery. The derivation data set included 1619 deaths. Although both the Duke treadmill score and our nomogram-illustrated model were significantly associated with death (P < 0.001), the nomogram was better at discrimination (concordance index for right-censored data, 0.83 vs. 0.73) and calibration. We reclassified many patients with intermediate- to high-risk Duke treadmill scores as low risk on the basis of the nomogram. The model also predicted 3-year mortality rates well in the validation set: Based on an optimal cut-point for a negative predictive value of 0.97, derivation and validation rates were, respectively, 1.7% and 2.5% below the cut-point and 25% and 29% above the cut-point. Blood test-based measures or left ventricular ejection fraction were not included. The nomogram can be applied only to patients with a normal electrocardiogram. Clinical utility remains to be tested. A simple nomogram based on easily obtained pretest and exercise test variables predicted all-cause mortality in adults with suspected coronary artery disease and normal electrocardiograms.
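
    The discrimination statistic quoted above (concordance index for right-censored data, 0.83 vs. 0.73) can be computed with the lifelines package. A minimal sketch on toy survival data; the risk scores and censoring pattern are made up:

      import numpy as np
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(2)
      risk = rng.normal(size=200)                  # model risk score per patient
      time = rng.exponential(10 / np.exp(risk))    # survival times shaped by risk
      observed = rng.random(200) < 0.3             # True = death observed, False = censored

      # concordance_index expects higher predictions to mean longer survival,
      # so a higher-risk-means-earlier-death score is passed negated.
      print("c-index:", concordance_index(time, -risk, observed))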

  12. Animal models of listeriosis: a comparative review of the current state of the art and lessons learned

    PubMed Central

    2012-01-01

    Listeriosis is a leading cause of hospitalization and death due to foodborne illness in the industrialized world. Animal models have played fundamental roles in elucidating the pathophysiology and immunology of listeriosis, and will almost certainly continue to be integral components of the research on listeriosis. Data derived from animal studies helped for example characterize the importance of cell-mediated immunity in controlling infection, allowed evaluation of chemotherapeutic treatments for listeriosis, and contributed to quantitative assessments of the public health risk associated with L. monocytogenes contaminated food commodities. Nonetheless, a number of pivotal questions remain unresolved, including dose-response relationships, which represent essential components of risk assessments. Newly emerging data about species-specific differences have recently raised concern about the validity of most traditional animal models of listeriosis. However, considerable uncertainty about the best choice of animal model remains. Here we review the available data on traditional and potential new animal models to summarize currently recognized strengths and limitations of each model. This knowledge is instrumental for devising future studies and for interpreting current data. We deliberately chose a historical, comparative and cross-disciplinary approach, striving to reveal clues that may help predict the ultimate value of each animal model in spite of incomplete data. PMID:22417207

  13. Animal models of listeriosis: a comparative review of the current state of the art and lessons learned.

    PubMed

    Hoelzer, Karin; Pouillot, Régis; Dennis, Sherri

    2012-03-14

    Listeriosis is a leading cause of hospitalization and death due to foodborne illness in the industrialized world. Animal models have played fundamental roles in elucidating the pathophysiology and immunology of listeriosis, and will almost certainly continue to be integral components of the research on listeriosis. Data derived from animal studies helped for example characterize the importance of cell-mediated immunity in controlling infection, allowed evaluation of chemotherapeutic treatments for listeriosis, and contributed to quantitative assessments of the public health risk associated with L. monocytogenes contaminated food commodities. Nonetheless, a number of pivotal questions remain unresolved, including dose-response relationships, which represent essential components of risk assessments. Newly emerging data about species-specific differences have recently raised concern about the validity of most traditional animal models of listeriosis. However, considerable uncertainty about the best choice of animal model remains. Here we review the available data on traditional and potential new animal models to summarize currently recognized strengths and limitations of each model. This knowledge is instrumental for devising future studies and for interpreting current data. We deliberately chose a historical, comparative and cross-disciplinary approach, striving to reveal clues that may help predict the ultimate value of each animal model in spite of incomplete data.

  14. Predicting protein-binding regions in RNA using nucleotide profiles and compositions.

    PubMed

    Choi, Daesik; Park, Byungkyu; Chae, Hanju; Lee, Wook; Han, Kyungsook

    2017-03-14

    Motivated by the increased amount of data on protein-RNA interactions and the availability of complete genome sequences of several organisms, many computational methods have been proposed to predict binding sites in protein-RNA interactions. However, most computational methods are limited to finding RNA-binding sites in proteins instead of protein-binding sites in RNAs. Predicting protein-binding sites in RNA is more challenging than predicting RNA-binding sites in proteins. Recent computational methods for finding protein-binding sites in RNAs have several drawbacks for practical use. We developed a new support vector machine (SVM) model for predicting protein-binding regions in mRNA sequences. The model uses sequence profiles constructed from log-odds scores of mono- and di-nucleotides and nucleotide compositions. The model was evaluated by standard 10-fold cross validation, leave-one-protein-out (LOPO) cross validation and independent testing. Since actual mRNA sequences have more non-binding regions than protein-binding regions, we tested the model on several datasets with different ratios of protein-binding regions to non-binding regions. The best performance of the model was obtained in a balanced dataset of positive and negative instances. 10-fold cross validation with a balanced dataset achieved a sensitivity of 91.6%, a specificity of 92.4%, an accuracy of 92.0%, a positive predictive value (PPV) of 91.7%, a negative predictive value (NPV) of 92.3% and a Matthews correlation coefficient (MCC) of 0.840. LOPO cross validation showed a lower performance than the 10-fold cross validation, but the performance remains high (87.6% accuracy and 0.752 MCC). In testing the model on independent datasets, it achieved an accuracy of 82.2% and an MCC of 0.656. Testing of our model and other state-of-the-art methods on the same dataset showed that our model is better than the others. Sequence profiles of log-odds scores of mono- and di-nucleotides were much more powerful features than nucleotide compositions in finding protein-binding regions in RNA sequences. However, a slight performance gain was obtained when the sequence profiles were used together with nucleotide compositions. These are preliminary results of ongoing research, but demonstrate the potential of our approach as a powerful predictor of protein-binding regions in RNA. The program and supporting data are available at http://bclab.inha.ac.kr/RBPbinding.
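
    A rough sketch of one half of the feature construction plus SVM training described above: mono- and di-nucleotide compositions fed to a kernel SVM. The log-odds profile features are omitted for brevity, and the windows, labels, and window length are toy values, not the paper's data.

      import numpy as np
      from itertools import product
      from sklearn.svm import SVC

      BASES = "ACGU"
      DINUC = ["".join(p) for p in product(BASES, repeat=2)]

      def composition(seq):
          """Mono- and di-nucleotide composition of one RNA window."""
          mono = [seq.count(b) / len(seq) for b in BASES]
          di = [sum(seq[i:i + 2] == d for i in range(len(seq) - 1)) / (len(seq) - 1)
                for d in DINUC]
          return mono + di

      # Toy windows labelled 1 (protein-binding) / 0 (non-binding).
      windows = ["ACGUACGUAC", "GGGGCCCCGG", "AUAUAUAUAU", "CCGGCCGGCC"]
      labels = [1, 0, 1, 0]

      X = np.array([composition(w) for w in windows])
      clf = SVC(kernel="rbf").fit(X, labels)
      print(clf.predict([composition("ACGUAUAUAC")]))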

  15. The predictive validity of three versions of the MCAT in relation to performance in medical school, residency, and licensing examinations: a longitudinal study of 36 classes of Jefferson Medical College.

    PubMed

    Callahan, Clara A; Hojat, Mohammadreza; Veloski, Jon; Erdmann, James B; Gonnella, Joseph S

    2010-06-01

    The Medical College Admission Test (MCAT) has undergone several revisions for content and validity since its inception. With another comprehensive review pending, this study examines changes in the predictive validity of the MCAT's three recent versions. Study participants were 7,859 matriculants in 36 classes entering Jefferson Medical College between 1970 and 2005; 1,728 took the pre-1978 version of the MCAT; 3,032 took the 1978-1991 version, and 3,099 took the post-1991 version. MCAT subtest scores were the predictors, and performance in medical school, attrition, scores on the medical licensing examinations, and ratings of clinical competence in the first year of residency were the criterion measures. No significant improvement in validity coefficients was observed for performance in medical school or residency. Validity coefficients for all three versions of the MCAT in predicting Part I/Step 1 remained stable (in the mid-0.40s, P < .01). A systematic decline was observed in the validity coefficients of the MCAT versions in predicting Part II/Step 2. It started at 0.47 for the pre-1978 version, decreased to between 0.42 and 0.40 for the 1978-1991 versions, and to 0.37 for the post-1991 version. Validity coefficients for the MCAT versions in predicting Part III/Step 3 remained near 0.30. These were generally larger for women than men. Although the findings support the short- and long-term predictive validity of the MCAT, opportunities to strengthen it remain. Subsequent revisions should increase the test's ability to predict performance on United States Medical Licensing Examination Step 2 and must minimize the differential validity for gender.

  16. DEVELOPMENT AND VALIDATION OF 'SURE': A PATIENT REPORTED OUTCOME MEASURE (PROM) FOR RECOVERY FROM DRUG AND ALCOHOL DEPENDENCE.

    PubMed

    Neale, Joanne; Vitoratou, Silia; Finch, Emily; Lennon, Paul; Mitcheson, Luke; Panebianco, Daria; Rose, Diana; Strang, John; Wykes, Til; Marsden, John

    2016-08-01

    Patient Reported Outcome Measures (PROMs) assess health status and health-related quality of life from the patient/service user perspective. Our study aimed to: i. develop a PROM for recovery from drug and alcohol dependence that has good face and content validity, acceptability and usability for people in recovery; ii. evaluate the psychometric properties and factorial structure of the new PROM ('SURE'). Item development included Delphi groups, focus groups, and service user feedback on draft versions of the new measure. A 30-item beta version was completed by 575 service users (461 in person [IP] and 114 online [OL]). Analyses comprised rating scale evaluation, assessment of psychometric properties, factorial structure, and differential item functioning. The beta measure had good face and content validity. Nine items were removed due to low stability, low factor loading, low construct validity or high complexity. The remaining 21 items were re-scaled (Rasch model analyses). Exploratory and confirmatory factor analyses revealed 5 factors: substance use, material resources, outlook on life, self-care, and relationships. The MIMIC model indicated 95% metric invariance across the IP and OL samples, and 100% metric invariance for gender. Internal consistency and test-retest reliability were confirmed. The 5 factors correlated positively with the corresponding WHOQOL-BREF and ARC subscales, and score differences between participant sub-groups confirmed discriminative validity. 'SURE' is a psychometrically valid, quick and easy-to-complete outcome measure, developed with unprecedented input from people in recovery. It can be used alongside, or instead of, existing outcome tools. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Construct Validity of the Societal Outreach Scale (SOS).

    PubMed

    Fike, David S; Denton, Jason; Walk, Matt; Kish, Jennifer; Gorman, Ira

    2018-04-01

    The American Physical Therapy Association (APTA) has been working toward a vision of increasing professional focus on societal-level health. However, performance of social responsibility and related behaviors by physical therapists remains relatively poorly integrated into practice. Promoting a focus on societal outreach is necessary for all health care professionals to impact the health of their communities. The objective was to document the validity of the 14-item Societal Outreach Scale (SOS) for use with practicing physical therapists. This study used a cross-sectional survey. The SOS was transmitted via email to all therapists who were licensed and practicing in 10 states in the United States that were purposefully selected to assure a broad representation. A sample of 2612 usable responses was received. Factor analysis was applied to assess construct validity of the instrument. Of alternate models, a 3-factor model best demonstrated goodness of fit with the sample data according to conventional indices (standardized root mean squared residual = .03, comparative fit index = .96, root mean square error of approximation = .06). The 3 factors measured by the SOS were labeled Societal-Level Health Advocacy, Community Engagement/Social Integration, and Political Engagement. Internal consistency reliability was 0.7 for all factors. The 3-factor SOS demonstrated acceptable validity and reliability. Though the sample included a broad representation of physical therapists, this was a single cross-sectional study. Additional confirmatory factor analysis, reliability testing, and word refinement of the tool are warranted. Given the construct validity and reliability of the 3-factor SOS, it is recommended for use as a validated instrument to measure physical therapists' performance of social responsibility and related behaviors.

  18. A review of simulation platforms in surgery of the temporal bone.

    PubMed

    Bhutta, M F

    2016-10-01

    Surgery of the temporal bone is a high-risk activity in an anatomically complex area. Simulation enables rehearsal of such surgery. The traditional simulation platform is the cadaveric temporal bone, but in recent years other simulation platforms have been created, including plastic and virtual reality platforms. To undertake a review of simulation platforms for temporal bone surgery, specifically assessing their educational value in terms of validity and in enabling transition to surgery. Systematic qualitative review. Search of the Pubmed, CINAHL, BEI and ERIC databases. Assessment of reported outcomes in terms of educational value. A total of 49 articles were included, covering cadaveric, animal, plastic and virtual simulation platforms. Cadaveric simulation is highly rated as an educational tool, but there may be a ceiling effect on educational outcomes after drilling 8-10 temporal bones. Animal models show significant anatomical variation from man. Plastic temporal bone models offer much potential, but at present lack sufficient anatomical or haptic validity. Similarly, virtual reality platforms lack sufficient anatomical or haptic validity, but with technological improvements they are advancing rapidly. At present, cadaveric simulation remains the best platform for training in temporal bone surgery. Technological advances enabling improved materials or modelling mean that in the future plastic or virtual platforms may become comparable to cadaveric platforms, and also offer additional functionality including patient-specific simulation from CT data. © 2015 John Wiley & Sons Ltd.

  19. Cholinergic stimulation enhances Bayesian belief updating in the deployment of spatial attention.

    PubMed

    Vossel, Simone; Bauer, Markus; Mathys, Christoph; Adams, Rick A; Dolan, Raymond J; Stephan, Klaas E; Friston, Karl J

    2014-11-19

    The exact mechanisms whereby the cholinergic neurotransmitter system contributes to attentional processing remain poorly understood. Here, we applied computational modeling to psychophysical data (obtained from a spatial attention task) under a psychopharmacological challenge with the cholinesterase inhibitor galantamine (Reminyl). This allowed us to characterize the cholinergic modulation of selective attention formally, in terms of hierarchical Bayesian inference. In a placebo-controlled, within-subject, crossover design, 16 healthy human subjects performed a modified version of Posner's location-cueing task in which the proportion of validly and invalidly cued targets (percentage of cue validity, % CV) changed over time. Saccadic response speeds were used to estimate the parameters of a hierarchical Bayesian model to test whether cholinergic stimulation affected the trial-wise updating of probabilistic beliefs that underlie the allocation of attention or whether galantamine changed the mapping from those beliefs to subsequent eye movements. Behaviorally, galantamine led to a greater influence of probabilistic context (% CV) on response speed than placebo. Crucially, computational modeling suggested this effect was due to an increase in the rate of belief updating about cue validity (as opposed to the increased sensitivity of behavioral responses to those beliefs). We discuss these findings with respect to cholinergic effects on hierarchical cortical processing and in relation to the encoding of expected uncertainty or precision. Copyright © 2014 the authors.
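
    The study's hierarchical Bayesian model is beyond a few lines, but the core idea of trial-wise updating of beliefs about cue validity can be illustrated with a simpler beta-Bernoulli filter. The decay parameter, which lets beliefs track a changing % CV, is an illustrative stand-in for the model's learning-rate mechanics; a faster updating rate (the effect attributed to galantamine) corresponds here to a smaller decay, i.e., a shorter memory.

      # Trial-wise updating of the belief that a cue is valid, with exponential
      # forgetting so the estimate can track changes in % cue validity (% CV).
      def run_trials(outcomes, decay=0.95, a=1.0, b=1.0):
          beliefs = []
          for valid in outcomes:           # 1 = validly cued trial, 0 = invalid
              a = decay * a + valid
              b = decay * b + (1 - valid)
              beliefs.append(a / (a + b))  # posterior mean of cue validity
          return beliefs

      # A block of 80% CV followed by a block of 20% CV.
      trials = [1]*8 + [0]*2 + [0]*8 + [1]*2
      for t, p in zip(trials, run_trials(trials)):
          print(t, round(p, 2))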

  20. Development of a Breast Cancer Awareness Scale for Thai Women: Moving towards a Validated Measure.

    PubMed

    Rakkapao, Nitchamon; Promthet, Supannee; Moore, Malcolm A; Hurst, Cameron P

    2016-01-01

    Breast cancer is a major health problem among women around the world. Recent developments in screening and treatment have greatly improved the prognosis of patients with breast cancer in developed countries. However, in developing countries breast cancer mortality remains high. Breast cancer awareness is a first and important step in reducing breast cancer mortality. The development of a validated instrument to measure breast cancer awareness is crucial for the understanding and implementation of suitable health education programs to facilitate early detection and minimize mortality. The objective of this study was to develop an instrument for the assessment of breast cancer awareness in Thai women. This methodological study was conducted in two stages: (1) literature searches and semi-structured interviews were conducted to generate items of the breast cancer awareness scale (B-CAS), which were subsequently examined for content and face validity, and (2) an exploration of the factor structure of the resulting instrument and an examination of its reliability. Data were collected using a self-administered questionnaire in Thai women aged 20-64 in August 2015. A total of 219 women (response rate 97.4%) participated in this validation study. The B-CAS contains five domains with 53 items on breast cancer awareness: 1) knowledge of risk factors, 2) knowledge of signs and symptoms, 3) attitude to breast cancer prevention, 4) barriers to breast screening, and 5) health behavior related to breast cancer awareness. Items with a content validity index <0.80 were excluded, and the factor structure for the remaining items reflected the hypothesized five-factor model. The scale based on all retained items showed strong internal consistency reliability (Cronbach's α = 0.86). The B-CAS has good psychometric properties for assessing breast cancer awareness in women. It can be used to examine breast cancer awareness in Thai women and could lead to the development and evaluation of suitable educational interventions for raising breast cancer awareness. Future research should focus on further validating the B-CAS, including an assessment of construct and criterion-based validity.
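
    Cronbach's alpha, the internal-consistency statistic quoted above (α = 0.86), follows directly from item and total-score variances. A minimal computation on a toy item matrix (the simulated responses are illustrative only):

      import numpy as np

      def cronbach_alpha(items):
          """items: (respondents x items) matrix of scored responses."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      rng = np.random.default_rng(3)
      latent = rng.normal(size=(100, 1))                          # shared trait
      responses = latent + rng.normal(scale=0.8, size=(100, 10))  # 10 correlated items
      print(cronbach_alpha(responses))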

  1. The Earthquake Source Inversion Validation (SIV) - Project: Summary, Status, Outlook

    NASA Astrophysics Data System (ADS)

    Mai, P. M.

    2017-12-01

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, this kinematic source inversion is ill-posed and returns non-unique solutions, as seen for instance in multiple source models for the same earthquake, obtained by different research teams, that often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversions and to understand strengths and weaknesses of various methods, the Source Inversion Validation (SIV) project developed a set of forward-modeling exercises and inversion benchmarks. Several research teams then use these validation exercises to test their codes and methods, but also to develop and benchmark new approaches. In this presentation I will summarize the SIV strategy, the existing benchmark exercises and corresponding results. Using various waveform-misfit criteria and newly developed statistical comparison tools to quantify source-model (dis)similarities, the SIV platform is able to rank solutions and identify particularly promising source inversion approaches. Existing SIV exercises (with related data and descriptions) and all computational tools remain available via the open online collaboration platform; additional exercises and benchmark tests will be uploaded once they are fully developed. I encourage source modelers to use the SIV benchmarks for developing and testing new methods. The SIV efforts have already led to several promising new techniques for tackling the earthquake-source imaging problem. I expect that future SIV benchmarks will provide further innovations and insights into earthquake source kinematics that will ultimately help to better understand the dynamics of the rupture process.

  2. Effect of a Diffusion Zone on Fatigue Crack Propagation in Layered FGMs

    NASA Astrophysics Data System (ADS)

    Hauber, Brett; Brockman, Robert; Paulino, Glaucio

    2008-02-01

    Research into functionally graded materials (FGMs) has led to advances in our ability to analyze cracks. However, two prominent aspects remain relatively unexplored: 1) development and validation of modeling methods for fatigue crack propagation in FGMs, and 2) experimental validation of stress intensity models in engineered materials such as two phase monolithic and graded materials. This work addresses some of these problems for a limited set of conditions, material systems (e.g., Ti/TiB), and material gradients. Numerical analyses are conducted for single edge notch bend (SENB) specimens. Stress intensity factors are computed using the specialized finite element code I-Franc (Illinois Fracture Analysis Code), which is tailored for both homogeneous and graded materials, as well as Franc2DL and ABAQUS. Crack extension is considered by means of specified crack increments, together with fatigue evaluations to predict crack propagation life. Results will be used to determine linear material gradient parameters that are significant for prediction of fatigue crack growth behavior.
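
    The fatigue-life step described above (specified crack increments with per-cycle growth evaluations) can be sketched with a standard Paris-law integration. The constants C and m, the geometry factor, the loads, and the crack lengths below are placeholders, not the Ti/TiB values from the study, and the geometry factor is held constant for simplicity.

      import math

      C, M = 1e-11, 3.0         # hypothetical Paris-law constants: da/dN = C * dK**M
      Y = 1.12                  # geometry factor for an edge crack (assumed constant)
      DSIGMA = 150.0            # stress range, MPa
      A0, AC = 1e-3, 10e-3      # initial and critical crack lengths, m

      a, cycles, da = A0, 0.0, 1e-5
      while a < AC:
          dK = Y * DSIGMA * math.sqrt(math.pi * a)   # stress intensity range, MPa*sqrt(m)
          dadN = C * dK**M                           # crack growth per cycle
          cycles += da / dadN                        # cycles to grow by one increment
          a += da

      print(f"predicted life: {cycles:.3e} cycles")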

  3. Development and initial validation of an Aviation Safety Climate Scale.

    PubMed

    Evans, Bronwyn; Glendon, A Ian; Creed, Peter A

    2007-01-01

    A need was identified for a consistent set of safety climate factors to provide a basis for aviation industry benchmarking. Six broad safety climate themes were identified from the literature and consultations with industry safety experts. Items representing each of the themes were prepared and administered to 940 Australian commercial pilots. Data from half of the sample (N=468) were used in an exploratory factor analysis that produced a 3-factor model of Management commitment and communication, Safety training and equipment, and Maintenance. A confirmatory factor analysis on the remaining half of the sample showed the 3-factor model to be an adequate fit to the data. The results of this study have produced a scale of safety climate for aviation that is both reliable and valid. This study developed a tool to assess the level of perceived safety climate, specifically of pilots, but may also, with minor modifications, be used to assess other groups' perceptions of safety climate.

  4. Lidar ceilometer observations and modeling of a fireworks plume in Vancouver, British Columbia

    NASA Astrophysics Data System (ADS)

    van der Kamp, Derek; McKendry, Ian; Wong, May; Stull, Roland

    Observations of a plume emanating from a 30-min duration pyrotechnic display with a lidar ceilometer are described for an urban setting in complex, coastal terrain. Advection of the plume across the ceilometer occurred at a mean height of 250 m AGL. The plume traveled downwind at ~3 m s-1, and at a distance of 8 km downwind, was ~100 m in vertical thickness with particulate matter (PM) concentrations of order 30-40 μg m-3. Surface PM observations from surrounding urban monitoring stations suggest that the plume was not mixed to ground over the urban area. Plume trajectories at ~250 m simulated by three numerical models all traveled to the northeast of the ceilometer location. Horizontal plume dispersion estimates suggest that the model trajectories were too far north to accommodate the likely lateral plume spread necessary to explain the ceilometer observations. This poor agreement between near surface observations and model output is consistent with previous mesoscale model validations in this region of complex urbanized terrain, and suggests that despite improvements in mesoscale model resolution, there remains an urgent need to improve upstream initial conditions over the Pacific Ocean, data assimilation over complex terrain, the representation of urban areas in mesoscale models, and to further validate such models for nocturnal applications in complex settings.

  5. Genesis and Evolution of the Skyrme Model from 1954 TO the Present

    NASA Astrophysics Data System (ADS)

    Sanyuk, Valery I.

    Not widely known facts on the genesis of the Skyrme model are presented in a historical survey, based on Skyrme's earliest papers and on his own published remembrance. We consider the evolution of Skyrme's model description of nuclear matter from the "Mesonic Fluid" model up to its final version, known as the baryon model. We pay special tribute to some well-known ideas in contemporary particle physics which one can find in Skyrme's earlier papers, such as: Nuclear Democracy, the Solitonic Mechanism, the Nonlinear Realization of Chiral Symmetry, Topological Charges, Fermi-Bose Transmutations, etc. It is curious to note in the final version of the Skyrme model gleams of Kelvin's "Vortex Atoms" theory. In conclusion we make a brief analysis of the validity of Skyrme's conjectures in view of recent results and pinpoint some questions which still remain.

  6. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The output of these models is sensitive to the data used in them as well as the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.
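
    One simple way to relax the perfect-following assumption, in the spirit of the abstract (though not necessarily the authors' formulation), is a first-order lag plus a dead band on the dispatch signal. The time constant and dead band below are illustrative values.

      import numpy as np

      def respond(setpoints, dt=4.0, tau=30.0, deadband=0.5, p0=100.0):
          """Imperfect generator: first-order lag toward each control signal,
          ignoring corrections smaller than the dead band (values in MW)."""
          p, out = p0, []
          for sp in setpoints:
              err = sp - p
              if abs(err) > deadband:
                  p += (1 - np.exp(-dt / tau)) * err
              out.append(p)
          return out

      signal = [100, 110, 110, 105, 105, 120, 120, 120]   # AGC setpoints, MW
      actual = respond(signal)
      print([round(x, 1) for x in actual])                # lags the requested values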

  7. Application and Validation of Remaining Service Interval Framework for Pavements

    DOT National Transportation Integrated Search

    2016-10-01

    The pavement remaining service interval (RSI) terminology was developed to remove confusion caused by the multitude of meanings assigned to the various forms of pavement remaining service life (RSL). The RSI concept considers the complete maintenance...

  8. Review and assessment of turbulence models for hypersonic flows

    NASA Astrophysics Data System (ADS)

    Roy, Christopher J.; Blottner, Frederick G.

    2006-10-01

    Accurate aerodynamic prediction is critical for the design and optimization of hypersonic vehicles. Turbulence modeling remains a major source of uncertainty in the computational prediction of aerodynamic forces and heating for these systems. The first goal of this article is to update the previous comprehensive review of hypersonic shock/turbulent boundary-layer interaction experiments published in 1991 by Settles and Dodson (Hypersonic shock/boundary-layer interaction database. NASA CR 177577, 1991). In their review, Settles and Dodson developed a methodology for assessing experiments appropriate for turbulence model validation and critically surveyed the existing hypersonic experiments. We limit the scope of our current effort by considering only two-dimensional (2D)/axisymmetric flows in the hypersonic flow regime where calorically perfect gas models are appropriate. We extend the prior database of recommended hypersonic experiments (on four 2D and two 3D shock-interaction geometries) by adding three new geometries. The first two geometries, the flat plate/cylinder and the sharp cone, are canonical, zero-pressure gradient flows which are amenable to theory-based correlations, and these correlations are discussed in detail. The third geometry added is the 2D shock impinging on a turbulent flat plate boundary layer. The current 2D hypersonic database for shock-interaction flows thus consists of nine experiments on five different geometries. The second goal of this study is to review and assess the validation usage of various turbulence models on the existing experimental database. Here we limit the scope to one- and two-equation turbulence models where integration to the wall is used (i.e., we omit studies involving wall functions). A methodology for validating turbulence models is given, followed by an extensive evaluation of the turbulence models on the current hypersonic experimental database. A total of 18 one- and two-equation turbulence models are reviewed, and results of turbulence model assessments for the six models that have been extensively applied to the hypersonic validation database are compiled and presented in graphical form. While some of the turbulence models do provide reasonable predictions for the surface pressure, the predictions for surface heat flux are generally poor, and often in error by a factor of four or more. In the vast majority of the turbulence model validation studies we review, the authors fail to adequately address the numerical accuracy of the simulations (i.e., discretization and iterative error) and the sensitivities of the model predictions to freestream turbulence quantities or near-wall y+ mesh spacing. We recommend new hypersonic experiments be conducted which (1) measure not only surface quantities but also mean and fluctuating quantities in the interaction region and (2) provide careful estimates of both random experimental uncertainties and correlated bias errors for the measured quantities and freestream conditions. For the turbulence models, we recommend that a wide-range of turbulence models (including newer models) be re-examined on the current hypersonic experimental database, including the more recent experiments. Any future turbulence model validation efforts should carefully assess the numerical accuracy and model sensitivities. In addition, model corrections (e.g., compressibility corrections) should be carefully examined for their effects on a standard, low-speed validation database. 
Finally, as new experiments or direct numerical simulation data become available with information on mean and fluctuating quantities, they should be used to improve the turbulence models and thus increase their predictive capability.
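
    The y+ sensitivity noted above depends on the first-cell wall distance. A standard flat-plate estimate recovers the spacing needed for a target y+; note that the power-law skin-friction correlation used here is incompressible, so for actual hypersonic conditions it is only a rough starting point, and the freestream values are placeholders rather than a specific test case.

      import math

      def wall_spacing(y_plus, rho, U, mu, x):
          """First-cell height for a target y+ via a flat-plate correlation."""
          re_x = rho * U * x / mu
          cf = 0.0576 * re_x ** (-0.2)           # power-law skin-friction estimate
          tau_w = 0.5 * cf * rho * U**2          # wall shear stress
          u_tau = math.sqrt(tau_w / rho)         # friction velocity
          return y_plus * mu / (rho * u_tau)     # y = y+ * nu / u_tau

      # Illustrative wind-tunnel-like freestream conditions.
      print(wall_spacing(y_plus=1.0, rho=0.05, U=1500.0, mu=1.8e-5, x=0.5))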

  9. Directional Ocean Wave Spectra

    DTIC Science & Technology

    1991-01-01

    [OCR-garbled search snippet; recoverable fragments only:] ... between the wave height time series from the different LEWEX [models] ... Data Report, Programa de Clima Maritimo, Madrid ... inclusion of the nonlinear azimuthal cutoff factor, with summation over the velocity-bunching index m, remains a valid approximation for the ... buoy observations ... Guillaume, A., "VAG - Modèle de prévision de l'état de la mer en eau profonde" [VAG - a deep-water sea-state prediction model] ... an analysis of the evolution of the direc[tional spectra] ...

  10. Development and validation of a predictive score for perioperative transfusion in patients with hepatocellular carcinoma undergoing liver resection.

    PubMed

    Wang, Hai-Qing; Yang, Jian; Yang, Jia-Yin; Wang, Wen-Tao; Yan, Lu-Nan

    2015-08-01

    Liver resection is a major surgery requiring perioperative blood transfusion. Predicting the need for blood transfusion for patients undergoing liver resection is of great importance. The present study aimed to develop and validate a model for predicting transfusion requirement in HBV-related hepatocellular carcinoma patients undergoing liver resection. A total of 1543 consecutive liver resections were included in the study. A randomly selected sample set of 1080 cases (70% of the study cohort) was used to develop a predictive score for transfusion requirement, and the remaining 30% (n=463) was used to validate the score. Based on the preoperative and predictable intraoperative parameters, logistic regression was used to identify risk factors and to create an integer score for the prediction of transfusion requirement. Extrahepatic procedure, major liver resection, hemoglobin level and platelet count were identified as independent predictors of transfusion requirement by logistic regression analysis. A score system integrating these 4 factors was stratified into three groups which could predict the risk of transfusion, with rates of 11.4%, 24.7% and 57.4% for low, moderate and high risk, respectively. The prediction model appeared accurate, with good discriminatory abilities, generating an area under the receiver operating characteristic curve of 0.736 in the development set and 0.709 in the validation set. We have developed and validated an integer-based risk score to predict perioperative transfusion for patients undergoing liver resection in a high-volume surgical center. This score makes it possible to identify patients at high risk and may alter transfusion practices.

  11. The Chinese version of hospital anxiety and depression scale: Psychometric properties in Chinese cancer patients and their family caregivers.

    PubMed

    Li, Qiuping; Lin, Yi; Hu, Caiping; Xu, Yinghua; Zhou, Huiya; Yang, Liping; Xu, Yongyong

    2016-12-01

    The Hospital Anxiety and Depression Scale (HADS) acts as one of the most frequently used self-reported measures in cancer practice. The evidence for construct validity of the HADS, however, remains inconclusive. The objective of this study is to evaluate the psychometric properties of the Chinese version of the HADS (C-HADS) in terms of construct validity, internal consistency reliability, and concurrent validity in dyads of Chinese cancer patients and their family caregivers. This was a cross-sectional study, conducted in multiple centers: one hospital in each of the seven different administrative regions in China from October 2014 to May 2015. A total of 641 dyads, consisting of cancer patients and family caregivers, completed a survey assessing their demographic and background information, anxiety and depression using the C-HADS, and quality of life (QOL) using the Chinese version of the SF-12. Data analysis methods included descriptive statistics, confirmatory factor analysis (CFA), and Pearson correlations. The two-factor and one-factor models offered the best (and adequate) fit to the data for cancer patients and family caregivers, respectively. The comparison of the two-factor and single-factor models supports the assumed two-factor construct of the C-HADS. The overall scale and two subscales of the C-HADS in both cancer patients and family caregivers had good internal consistency and acceptable concurrent validity. The Chinese version of the HADS may be a reliable and valid screening tool, as indicated by its original two-factor structure. The finding supports the assumed two-factor construct of the HADS. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A numerical analysis of the aortic blood flow pattern during pulsed cardiopulmonary bypass.

    PubMed

    Gramigna, V; Caruso, M V; Rossi, M; Serraino, G F; Renzulli, A; Fragomeni, G

    2015-01-01

    In the modern era, stroke remains a main cause of morbidity after cardiac surgery despite continuing improvements in the cardiopulmonary bypass (CPB) techniques. The aim of the current work was to numerically investigate the blood flow in aorta and epiaortic vessels during standard and pulsed CPB, obtained with the intra-aortic balloon pump (IABP). A multi-scale model, realized coupling a 3D computational fluid dynamics study with a 0D model, was developed and validated with in vivo data. The presence of IABP improved the flow pattern directed towards the epiaortic vessels with a mean flow increase of 6.3% and reduced flow vorticity.
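
    The 0D side of such a multi-scale model is typically a lumped-parameter (Windkessel) circuit. A minimal two-element example with illustrative parameter values and a toy inflow waveform, not the coupling scheme used in the study:

      import numpy as np

      # Two-element Windkessel: C * dP/dt = Q(t) - P / R
      R, C = 1.0, 1.5          # peripheral resistance (mmHg*s/mL), compliance (mL/mmHg)
      dt, T = 1e-3, 0.8        # time step (s), cardiac period (s)

      def inflow(t):
          """Half-sine ejection for the first third of each beat, zero otherwise."""
          phase = t % T
          return 300.0 * np.sin(np.pi * phase / (T / 3)) if phase < T / 3 else 0.0

      P, pressures = 80.0, []
      for n in range(int(5 * T / dt)):                # integrate 5 beats
          t = n * dt
          P += dt / C * (inflow(t) - P / R)
          pressures.append(P)

      print(f"min/max pressure: {min(pressures):.1f} / {max(pressures):.1f} mmHg")

    In a full multi-scale setup, a circuit like this would supply outlet boundary pressures to the 3D CFD domain at each time step, and receive the computed outlet flows in return.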

  13. Sexual Assertiveness Scale (SAS) for women: development and validation.

    PubMed

    Morokoff, P J; Quina, K; Harlow, L L; Whitmire, L; Grimley, D M; Gibson, P R; Burkholder, G J

    1997-10-01

    Four studies were conducted to develop and validate the Sexual Assertiveness Scale (SAS), a measure of sexual assertiveness in women that consists of factors measuring initiation, refusal, and pregnancy-sexually transmitted disease prevention assertiveness. A total of 1,613 women from both university and community populations were studied. Confirmatory factor analyses demonstrated that the 3 factors remained stable across samples of university and community women. A structural model was tested in 2 samples, indicating that sexual experience, anticipated negative partner response, and self-efficacy are consistent predictors of sexual assertiveness. Sexual assertiveness was found to be somewhat related to relationship satisfaction, power, and length. The community sample was retested after 6 months and 1 year to establish test-retest reliability. The SAS provides a reliable instrument for assessing and understanding women's sexual assertiveness.

  14. Integrated Primary Care Readiness and Behaviors Scale: Development and validation in behavioral health professionals.

    PubMed

    Blaney, Cerissa L; Redding, Colleen A; Paiva, Andrea L; Rossi, Joseph S; Prochaska, James O; Blissmer, Bryan; Burditt, Caitlin T; Nash, Justin M; Bayley, Keri Dotson

    2018-03-01

    Although integrated primary care (IPC) is growing, several barriers remain. Better understanding of behavioral health professionals' (BHPs') readiness for and engagement in IPC behaviors could improve IPC research and training. This study developed measures of IPC behaviors and stage of change. The sample included 319 licensed, practicing BHPs with a range of interests and experience with IPC. Sequential measurement development procedures, with split-half cross-validation were conducted. Exploratory principal components analyses (N = 152) and confirmatory factor analyses (N = 167) yielded a 12-item scale with 2 factors: consultation/practice management (CPM) and intervention/knowledge (IK). A higher-order Integrated Primary Care Behavior Scale (IPCBS) model showed good fit to the data, and excellent internal consistencies. The multivariate analysis of variance (MANOVA) on the IPCBS demonstrated significant large-sized differences across stage and behavior groups. The IPCBS demonstrated good psychometric properties and external validation, advancing research, education, and training for IPC practice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  15. Development and validation of the Medical Home Care Coordination Survey for assessing care coordination in the primary care setting from the patient and provider perspectives.

    PubMed

    Zlateva, Ianita; Anderson, Daren; Coman, Emil; Khatri, Khushbu; Tian, Terrence; Fifield, Judith

    2015-06-07

    Community health centers are increasingly embracing the Patient Centered Medical Home (PCMH) model to improve quality, access to care, and patient experience while reducing healthcare costs. Care coordination (CC) is an important element of the PCMH model, but implementation and measurability of CC remains a problem within the outpatient setting. Assessing CC is an integral component of quality monitoring in health care systems. This study developed and validated the Medical Home Care Coordination Survey (MHCCS), to fill the gap in assessing CC in primary care from the perspectives of patients and their primary healthcare teams. We conducted a review of relevant literature and existing care coordination instruments identified by bibliographic search and contact with experts. After identifying all care coordination domains that could be assessed by primary healthcare team members and patients, we developed a conceptual model. Potentially appropriate items from existing published CC measures, along with newly developed items, were matched to each domain for inclusion. A modified Delphi approach was used to establish content validity. Primary survey data was collected from 232 patients with care transition and/or complex chronic illness needs from the Community Health Center, Inc. and from 164 staff members from 12 community health centers across the country via mail, phone and online survey. The MHCCS was validated for internal consistency, reliability, discriminant and convergent validity. This study was conducted at the Community Health Center, Inc. from January 15, 2012 to July 15, 2014. The 13-item MHCCS - Patient and the 32-item MHCCS - Healthcare Team were developed and validated. Exploratory Structural Equation Modeling was used to test the hypothesized domain structure. Four CC domains were confirmed from the patient group and eight were confirmed from the primary healthcare team group. All domains had high reliability (Cronbach's α scores were above 0.8). Patients experience the ultimate output of care coordination services, but primary healthcare staff members are best primed to perceive many of the structural elements of care coordination. The proactive measurement and monitoring of the core domains from both perspectives provides a richer body of information for the continuous improvement of care coordination services. The MHCCS shows promise as a valid and reliable assessment of these CC efforts.

  16. General regression neural network model for behavior of Salmonella on chicken meat during cold storage.

    PubMed

    Oscar, Thomas P

    2014-05-01

    A study was undertaken to investigate and model behavior of Salmonella on chicken meat during cold storage at constant temperatures. Chicken meat (white, dark, or skin) portions (0.75 cm³) were inoculated with a single strain of Salmonella Typhimurium DT104 (2.8 log) followed by storage for 0 to 8 d at -8, 0, 8, 12, 14, or 16 °C for model development and at -4, 4, 10, or 14 °C for model validation. A general regression neural network model was developed with commercial software. Performance of the model was considered acceptable when the proportion of residuals (observed - predicted) in an acceptable prediction zone (pAPZ) from -1 log (fail-safe) to 0.5 logs (fail-dangerous) was ≥ 0.7. Growth of Salmonella Typhimurium DT104 on chicken meat was observed at 12, 14, and 16 °C and was highest on dark meat, intermediate on skin, and lowest on white meat. At lower temperatures (-8 to 10 °C) Salmonella Typhimurium DT104 remained at initial levels throughout 8 d of storage, except at 4 °C where there was a small (0.4 log) but significant decline. The model had acceptable performance (pAPZ = 0.929) for dependent data (n = 482) and acceptable performance (pAPZ = 0.923) for independent data (n = 235). Results indicated that it is important to include type of meat as an independent variable in the model and that the model provided valid predictions of the behavior of Salmonella Typhimurium DT104 on chicken skin, white, and dark meat during storage for 0 to 8 d at constant temperatures from -8 to 16 °C. A model for predicting behavior of Salmonella on chicken meat during cold storage was developed and validated. The model will help the chicken industry to better predict and manage this risk to public health. Journal of Food Science © 2014 Institute of Food Technologists® No claim to original US government works.
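
    The acceptance criterion above is straightforward to compute: count the residuals falling inside the asymmetric zone and compare the proportion with 0.7. A minimal version (the observation and prediction values are toy numbers):

      import numpy as np

      def pAPZ(observed, predicted, lo=-1.0, hi=0.5):
          """Proportion of residuals (observed - predicted, log units) inside the
          acceptable prediction zone: -1 log (fail-safe) to 0.5 log (fail-dangerous)."""
          r = np.asarray(observed) - np.asarray(predicted)
          return float(np.mean((r >= lo) & (r <= hi)))

      obs = [2.8, 3.1, 2.5, 4.0, 2.9]     # log CFU observations (toy values)
      pred = [2.9, 3.0, 3.2, 3.6, 2.8]
      p = pAPZ(obs, pred)
      print(p, "acceptable" if p >= 0.7 else "not acceptable")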

  17. Spatial Modeling of Flood Duration in Amazonian Floodplains Through Radar Remote Sensing and Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    Ferreira-Ferreira, J.; Francisco, M. S.; Silva, T. S. F.

    2017-12-01

    Amazon floodplains play an important role in biodiversity maintenance and provide important ecosystem services. Flood duration is the prime factor modulating biogeochemical cycling in Amazonian floodplain systems, as well as influencing ecosystem structure and function. However, due to the absence of accurate terrain information, fine-scale hydrological modeling is still not possible for most of the Amazon floodplains, and little is known regarding the spatio-temporal behavior of flooding in these environments. Our study presents a new approach for spatial modeling of flood duration, using Synthetic Aperture Radar (SAR) and Generalized Linear Modeling. Our focal study site was Mamirauá Sustainable Development Reserve, in the Central Amazon. We acquired a series of L-band ALOS-1/PALSAR Fine-Beam mosaics, chosen to capture the widest possible range of river stage heights at regular intervals. We then mapped flooded area on each image, and used the resulting binary maps as the response variable (flooded/non-flooded) for multiple logistic regression. Explanatory variables were the accumulated precipitation over the 15 days prior to each image acquisition, the water stage height recorded at the Mamirauá lake gauging station on each acquisition date, the Euclidean distance from the nearest drainage, and slope, terrain curvature, profile curvature, planform curvature and Height Above the Nearest Drainage (HAND) derived from the 30-m SRTM DEM. Model results were validated with water levels recorded by ten pressure transducers installed within the floodplains, from 2014 to 2016. The most accurate model included water stage height and HAND as explanatory variables, yielding an RMSE of ±38.73 days of flooding per year when compared to the ground validation sites. The largest disagreements were 57 days and 83 days for two validation sites, while the remaining locations achieved absolute errors lower than 38 days. In five out of nine validation sites, the model predicted flood durations with disagreements lower than 20 days. The method extends our current capability to answer relevant scientific questions regarding floodplain ecological structure and functioning, and allows forecasting of ecological and biogeochemical alterations under climate change scenarios, using readily available datasets.
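
    A compact sketch of the mapping logic: fit a logistic model on per-pixel predictors, then estimate flood duration for a pixel by summing predicted flood probabilities over a year of stage heights. The predictors are reduced to the two retained in the best model (stage height and HAND), and all data below are synthetic placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 2000
      stage = rng.uniform(0, 10, n)                  # river stage height (m)
      hand = rng.uniform(0, 15, n)                   # Height Above Nearest Drainage (m)
      X = np.column_stack([stage, hand])
      flooded = (stage - hand + rng.normal(0, 1, n)) > 0   # synthetic labels

      glm = LogisticRegression().fit(X, flooded)

      # Flood duration for one pixel: expected flooded days over a year of
      # daily stage heights, with HAND fixed for that pixel.
      stages_year = 5 + 4 * np.sin(2 * np.pi * np.arange(365) / 365)
      pixel = np.column_stack([stages_year, np.full(365, 3.0)])   # HAND = 3 m
      print("expected flood duration:", glm.predict_proba(pixel)[:, 1].sum(), "days")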

  18. Fission Fragment Mass Distributions and Total Kinetic Energy Release of 235-Uranium and 238-Uranium in Neutron-Induced Fission at Intermediate and Fast Neutron Energies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, Dana Lynn

    2015-11-12

    This Ph.D. dissertation describes a measurement of the change in mass distributions and average total kinetic energy (TKE) release with increasing incident neutron energy for fission of 235U and 238U. Although fission was discovered over seventy-five years ago, open questions remain about the physics of the fission process. The energy of the incident neutron, En, changes the division of energy release in the resulting fission fragments; however, the details of energy partitioning remain ambiguous because the nucleus is a many-body quantum system. Creating a full theoretical model is difficult and experimental data to validate existing models are lacking. Additional fission measurements will lead to higher-quality models of the fission process, therefore improving applications such as the development of next-generation nuclear reactors and defense. This work also paves the way for precision experiments such as the Time Projection Chamber (TPC) for fission cross section measurements and the Spectrometer for Ion Determination in Fission (SPIDER) for precision mass yields.

  19. Performance comparison of LUR and OK in PM2.5 concentration mapping: a multidimensional perspective

    PubMed Central

    Zou, Bin; Luo, Yanqing; Wan, Neng; Zheng, Zhong; Sternberg, Troy; Liao, Yilan

    2015-01-01

    Methods of Land Use Regression (LUR) modeling and Ordinary Kriging (OK) interpolation have been widely used to offset the shortcomings of PM2.5 data observed at sparse monitoring sites. However, traditional point-based performance evaluation strategy for these methods remains stagnant, which could cause unreasonable mapping results. To address this challenge, this study employs ‘information entropy’, an area-based statistic, along with traditional point-based statistics (e.g. error rate, RMSE) to evaluate the performance of LUR model and OK interpolation in mapping PM2.5 concentrations in Houston from a multidimensional perspective. The point-based validation reveals significant differences between LUR and OK at different test sites despite the similar end-result accuracy (e.g. error rate 6.13% vs. 7.01%). Meanwhile, the area-based validation demonstrates that the PM2.5 concentrations simulated by the LUR model exhibits more detailed variations than those interpolated by the OK method (i.e. information entropy, 7.79 vs. 3.63). Results suggest that LUR modeling could better refine the spatial distribution scenario of PM2.5 concentrations compared to OK interpolation. The significance of this study primarily lies in promoting the integration of point- and area-based statistics for model performance evaluation in air pollution mapping. PMID:25731103
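
    The area-based statistic is Shannon entropy over the mapped concentration values: a surface with more fine-scale variation spreads over more histogram bins and scores higher. A minimal computation that makes the LUR-vs-OK comparison concrete (bin edges and the two toy surfaces are illustrative, not the Houston data):

      import numpy as np

      def map_entropy(values, edges=np.linspace(0, 30, 65)):
          """Shannon entropy (bits) of the distribution of mapped PM2.5 values,
          computed over fixed concentration bins so surfaces are comparable."""
          counts, _ = np.histogram(values, bins=edges)
          p = counts[counts > 0] / counts.sum()
          return float(-(p * np.log2(p)).sum())

      rng = np.random.default_rng(5)
      lur_surface = rng.normal(12, 4, 10_000)            # detailed variation (toy LUR map)
      ok_surface = 12 + 0.5 * rng.normal(size=10_000)    # smoother surface (toy OK map)
      print(map_entropy(lur_surface), ">", map_entropy(ok_surface))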

  20. Evolution of female multiple mating: A quantitative model of the “sexually selected sperm” hypothesis

    PubMed Central

    Bocedi, Greta; Reid, Jane M

    2015-01-01

    Explaining the evolution and maintenance of polyandry remains a key challenge in evolutionary ecology. One appealing explanation is the sexually selected sperm (SSS) hypothesis, which proposes that polyandry evolves due to indirect selection stemming from positive genetic covariance with male fertilization efficiency, and hence with a male's success in postcopulatory competition for paternity. However, the SSS hypothesis relies on verbal analogy with “sexy-son” models explaining coevolution of female preferences for male displays, and explicit models that validate the basic SSS principle are surprisingly lacking. We developed analogous genetically explicit individual-based models describing the SSS and “sexy-son” processes. We show that the analogy between the two is only partly valid, such that the genetic correlation arising between polyandry and fertilization efficiency is generally smaller than that arising between preference and display, resulting in less reliable coevolution. Importantly, indirect selection was too weak to cause polyandry to evolve in the presence of negative direct selection. Negatively biased mutations on fertilization efficiency did not generally rescue runaway evolution of polyandry unless realized fertilization was highly skewed toward a single male, and coevolution was even weaker given random mating order effects on fertilization. Our models suggest that the SSS process is, on its own, unlikely to generally explain the evolution of polyandry. PMID:25330405

  1. Automatic paper sliceform design from 3D solid models.

    PubMed

    Le-Nguyen, Tuong-Vu; Low, Kok-Lim; Ruiz, Conrado; Le, Sang N

    2013-11-01

    A paper sliceform or lattice-style pop-up is a form of papercraft that uses two sets of parallel paper patches slotted together to make a foldable structure. The structure can be folded flat, as well as fully opened (popped up) to make the two sets of patches orthogonal to each other. Automatic design of paper sliceforms is still not supported by existing computational models and remains a challenge. We propose novel geometric formulations of valid paper sliceform designs that consider the stability, flat-foldability and physical realizability of the designs. Based on a set of sufficient construction conditions, we also present an automatic algorithm for generating valid sliceform designs that closely depict the given 3D solid models. By approximating the input models using a set of generalized cylinders, our method significantly reduces the search space for stable and flat-foldable sliceforms. To ensure the physical realizability of the designs, the algorithm automatically generates slots or slits on the patches such that no two cycles embedded in two different patches interlock with each other. This guarantees local pairwise assemblability between patches, which is empirically shown to lead to global assemblability. Our method has been demonstrated on a number of example models, and the output designs have been successfully made into real paper sliceforms.

  2. Ecological validity of cost-effectiveness models of universal HPV vaccination: A systematic literature review.

    PubMed

    Favato, Giampiero; Easton, Tania; Vecchiato, Riccardo; Noikokyris, Emmanouil

    2017-05-09

    The protective (herd) effect of the selective vaccination of pubertal girls against human papillomavirus (HPV) implies a high probability that one of the two partners involved in intercourse is immunised, hence protecting the other from this sexually transmitted infection. The dynamic transmission models used to inform immunisation policy should include consideration of sexual behaviours and population mixing in order to demonstrate an ecological validity, whereby the scenarios modelled remain faithful to the real-life social and cultural context. The primary aim of this review is to test the ecological validity of the universal HPV vaccination cost-effectiveness modelling available in the published literature. The research protocol related to this systematic review has been registered in the International Prospective Register of Systematic Reviews (PROSPERO: CRD42016034145). Eight published economic evaluations were reviewed. None of the studies showed due consideration of the complexities of human sexual behaviour and the impact this may have on the transmission of HPV. Our findings indicate that all the included models might be affected by a different degree of ecological bias, which implies an inability to reflect the natural demographic and behavioural trends in their outcomes and, consequently, to accurately inform public healthcare policy. In particular, ecological bias has the effect of over-estimating the preference-based outcomes of selective immunisation. A relatively small (15-20%) over-estimation of quality-adjusted life years (QALYs) gained with selective immunisation programmes could induce a significant error in the estimate of cost-effectiveness of universal immunisation, by inflating its incremental cost-effectiveness ratio (ICER) beyond the acceptability threshold. The results modelled here demonstrate the limitations of the cost-effectiveness studies for HPV vaccination, and highlight the concern that public healthcare policy might have been built upon incomplete studies.
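
    The sensitivity of the ICER to a modest QALY over-estimate can be shown with two lines of arithmetic. The figures below are hypothetical, chosen only to illustrate how a 20% bias in the comparator's QALYs can push the incremental ratio across a typical acceptability threshold.

    ```python
    # Hypothetical sketch: ICER = incremental cost / incremental QALYs.
    def icer(delta_cost: float, delta_qaly: float) -> float:
        return delta_cost / delta_qaly

    extra_cost = 40_000.0               # universal minus selective programme cost
    qaly_universal = 10.0               # QALYs gained, universal immunisation
    qaly_selective_true = 8.0           # QALYs gained, selective immunisation
    qaly_selective_biased = 8.0 * 1.20  # the same with 20% ecological over-estimation

    print(icer(extra_cost, qaly_universal - qaly_selective_true))    # 20000.0 per QALY
    print(icer(extra_cost, qaly_universal - qaly_selective_biased))  # 100000.0 per QALY
    ```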

  3. Nonlinear finite element model updating for damage identification of civil structures using batch Bayesian estimation

    NASA Astrophysics Data System (ADS)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.; de Callafon, Raymond A.

    2017-02-01

    This paper presents a framework for structural health monitoring (SHM) and damage identification of civil structures. This framework integrates advanced mechanics-based nonlinear finite element (FE) modeling and analysis techniques with a batch Bayesian estimation approach to estimate time-invariant model parameters used in the FE model of the structure of interest. The framework uses the input excitation and dynamic response of the structure and updates a nonlinear FE model of the structure to minimize the discrepancies between predicted and measured response time histories. The updated FE model can then be interrogated to detect, localize, classify, and quantify the state of damage and predict the remaining useful life of the structure. As opposed to recursive estimation methods, in the batch Bayesian estimation approach the entire time histories of the input excitation and output response of the structure are used as a batch of data to estimate the FE model parameters through a number of iterations. In the case of a non-informative prior, the batch Bayesian method leads to an extended maximum likelihood (ML) estimation method that jointly estimates the time-invariant model parameters and the measurement noise amplitude. The extended ML estimation problem is solved efficiently using a gradient-based interior-point optimization algorithm. Gradient-based optimization algorithms require the FE response sensitivities with respect to the model parameters to be identified. The FE response sensitivities are computed accurately and efficiently using the direct differentiation method (DDM). The estimation uncertainties are evaluated based on the Cramér-Rao lower bound (CRLB) theorem by computing the exact Fisher information matrix using the FE response sensitivities with respect to the model parameters. The accuracy of the proposed uncertainty quantification approach is verified using a sampling approach based on the unscented transformation. Two validation studies, based on realistic structural FE models of a bridge pier and a moment-resisting steel frame, are performed to validate the performance and accuracy of the presented nonlinear FE model updating approach and demonstrate its application to SHM. These validation studies show the excellent performance of the proposed framework for SHM and damage identification even in the presence of high measurement noise and/or far-off initial estimates of the model parameters. Furthermore, the detrimental effects of input measurement noise on the performance of the proposed framework are illustrated and quantified through one of the validation studies.
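
    As a toy analogue of the batch estimation step, the sketch below identifies two time-invariant parameters of a single-degree-of-freedom oscillator from a noisy response time history by maximum likelihood, which reduces to least squares under Gaussian noise, using a gradient-based optimizer. The oscillator stands in for the nonlinear FE model; all names and values are illustrative, and the DDM response sensitivities are replaced here by the optimizer's numerical gradients.

    ```python
    # Hypothetical sketch: batch ML estimation of model parameters.
    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import minimize

    t = np.linspace(0.0, 5.0, 500)

    def response(params, t):
        k, c = params                    # stiffness and damping to identify
        def rhs(y, t):
            x, v = y
            return [v, -k * x - c * v]   # free vibration of a SDOF oscillator
        return odeint(rhs, [1.0, 0.0], t)[:, 0]

    rng = np.random.default_rng(2)
    measured = response([4.0, 0.3], t) + rng.normal(0.0, 0.02, t.size)

    def neg_log_like(params):            # Gaussian noise => least squares
        r = measured - response(params, t)
        return 0.5 * np.sum(r ** 2)

    est = minimize(neg_log_like, x0=[2.0, 0.1], method="L-BFGS-B",
                   bounds=[(0.1, 10.0), (0.01, 2.0)])
    print("estimated stiffness and damping:", est.x)
    ```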

  4. Modelling the distributions and spatial coincidence of bluetongue vectors Culicoides imicola and the Culicoides obsoletus group throughout the Iberian peninsula.

    PubMed

    Calvete, C; Estrada, R; Miranda, M A; Borrás, D; Calvo, J H; Lucientes, J

    2008-06-01

    Data obtained by a Spanish national surveillance programme in 2005 were used to develop climatic models for predicting the distribution of the bluetongue virus (BTV) vectors Culicoides imicola Kieffer (Diptera: Ceratopogonidae) and the Culicoides obsoletus group Meigen throughout the Iberian peninsula. Models were generated using logistic regression to predict the probability of species occurrence at an 8-km spatial resolution. Predictor variables included the annual mean values and seasonalities of a remotely sensed normalized difference vegetation index (NDVI), a sun index, interpolated precipitation and temperature. Using an information-theoretic paradigm based on Akaike's criterion, a set of best models accounting for 95% of model selection certainty was selected and used to generate an average predictive model for each vector. The predictive performances (i.e. the discrimination capacity and calibration) of the average models were evaluated by both internal and external validation. External validation was achieved by comparing average model predictions with surveillance programme data obtained in 2004 and 2006. The discriminatory capacity of both models was reasonably high. The estimated areas under the receiver operating characteristic (ROC) curve (AUC) were 0.78 and 0.70 for the C. imicola and C. obsoletus group models, respectively, in external validation, and 0.81 and 0.75, respectively, in internal validation. The predictions of both models were in close agreement with the observed distribution patterns of both vectors. Both models, however, showed a systematic bias in their predicted probability of occurrence: observed occurrence was systematically overestimated for C. imicola and underestimated for the C. obsoletus group. The average models were used to determine the areas of spatial coincidence of the two vectors. Although their spatial distributions were highly complementary, areas of spatial coincidence were identified, mainly in Portugal and in the southwest of peninsular Spain. In a hypothetical scenario in which both Culicoides members had similar vectorial capacity for a BTV strain, these areas should be considered of special epidemiological concern because any epizootic event could be intensified by the consecutive vector activity of the two species during the year; consequently, the probability of BTV spreading to the remaining areas occupied by both vectors might also be higher.
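
    A compressed sketch of the information-theoretic step follows: fit candidate logistic models, convert AIC differences into Akaike weights, and use the weights for an averaged prediction. Predictor names echo the paper, but the data and candidate set are synthetic assumptions.

    ```python
    # Hypothetical sketch: AIC-based averaging of logistic occurrence models.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 400
    ndvi, temp, precip = rng.normal(size=(3, n))
    p = 1.0 / (1.0 + np.exp(-(0.8 * ndvi + 1.2 * temp)))
    y = (rng.random(n) < p).astype(int)   # simulated vector occurrence

    candidates = {"ndvi": [ndvi],
                  "ndvi+temp": [ndvi, temp],
                  "ndvi+temp+precip": [ndvi, temp, precip]}
    aic = {}
    for name, cols in candidates.items():
        X = sm.add_constant(np.column_stack(cols))
        aic[name] = sm.Logit(y, X).fit(disp=0).aic

    # Akaike weights: w_i proportional to exp(-delta_AIC_i / 2)
    best = min(aic.values())
    raw = {k: np.exp(-(v - best) / 2.0) for k, v in aic.items()}
    weights = {k: v / sum(raw.values()) for k, v in raw.items()}
    print(weights)  # the averaged model weights each candidate's prediction by these
    ```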

  5. The potential for machine learning algorithms to improve and reduce the cost of 3-dimensional printing for surgical planning.

    PubMed

    Huff, Trevor J; Ludwig, Parker E; Zuniga, Jorge M

    2018-05-01

    3D-printed anatomical models play an important role in medical and research settings. The recent successes of 3D anatomical models in healthcare have led many institutions to adopt the technology. However, there remain several issues that must be addressed before it can become more widespread. Of importance are the problems of cost and time of manufacturing. Machine learning (ML) could be utilized to solve these issues by streamlining the 3D modeling process through rapid medical image segmentation and improved patient selection and image acquisition. The current challenges, potential solutions, and future directions for ML and 3D anatomical modeling in healthcare are discussed. Areas covered: This review covers research articles in the field of machine learning as related to 3D anatomical modeling. Topics discussed include automated image segmentation, cost reduction, and related time constraints. Expert commentary: ML-based segmentation of medical images could potentially improve the process of 3D anatomical modeling. However, until more research is done to validate these technologies in clinical practice, their impact on patient outcomes will remain unknown. We have the necessary computational tools to tackle the problems discussed. The difficulty now lies in our ability to collect sufficient data.

  6. Personalized Estimate of Chemotherapy-Induced Nausea and Vomiting: Development and External Validation of a Nomogram in Cancer Patients Receiving Highly/Moderately Emetogenic Chemotherapy.

    PubMed

    Hu, Zhihuang; Liang, Wenhua; Yang, Yunpeng; Keefe, Dorothy; Ma, Yuxiang; Zhao, Yuanyuan; Xue, Cong; Huang, Yan; Zhao, Hongyun; Chen, Likun; Chan, Alexandre; Zhang, Li

    2016-01-01

    Chemotherapy-induced nausea and vomiting (CINV) occurs in over 30% of cancer patients receiving highly/moderately emetogenic chemotherapy (HEC/MEC). The currently recommended antiemetic therapy is based merely on the emetogenic level of the chemotherapy, regardless of the patient's individual risk factors. It is, therefore, critical to develop an approach for personalized management of CINV in the era of precision medicine. A number of variables are involved in the development of CINV. In the present study, we pooled the data from 2 multi-institutional investigations of CINV due to HEC/MEC treatment in Asian countries. Demographic and clinical variables of 881 patients were prospectively collected as defined previously, and 862 of them had full documentation of the variables of interest. The data of 548 patients from Chinese institutions were used to identify variables associated with CINV using a multivariate logistic regression model and then construct a personalized prediction nomogram, while the remaining 314 patients from outside China (Singapore, South Korea, and Taiwan) entered the external validation set. The C-index was used to measure the discrimination ability of the model. The predictors in the final model included sex, age, alcohol consumption, history of vomiting during pregnancy, history of motion sickness, body surface area, emetogenicity of chemotherapy, and antiemetic regimens. The C-index was 0.67 (95% CI, 0.62-0.72) for the training set and 0.65 (95% CI, 0.58-0.72) for the validation set. The C-index was higher than that of any single predictor, including the emetogenic level of chemotherapy according to current antiemetic guidelines. Calibration curves showed good agreement between predicted and actual occurrence of CINV. This easy-to-use prediction model was based on chemotherapeutic regimens as well as the patient's individual risk factors. The prediction accuracy of CINV occurrence in this nomogram was well validated by an independent data set. It could facilitate the assessment of individual risk, and thus improve the personalized management of CINV.
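
    The discrimination statistic used above can be spelled out in a few lines: for a binary outcome, the C-index is the probability that a randomly chosen patient who developed CINV received a higher predicted risk than one who did not, which coincides with the ROC AUC. The scores below are simulated to land near the paper's reported range.

    ```python
    # Hypothetical sketch: C-index (concordance) for a binary outcome.
    import numpy as np

    def c_index(risk: np.ndarray, event: np.ndarray) -> float:
        pos, neg = risk[event == 1], risk[event == 0]
        diff = pos[:, None] - neg[None, :]          # all case-control pairs
        return float(((diff > 0).sum() + 0.5 * (diff == 0).sum())
                     / (len(pos) * len(neg)))

    rng = np.random.default_rng(4)
    event = rng.integers(0, 2, 300)                  # 1 = CINV occurred
    risk = 0.2 * event + rng.normal(0.0, 0.3, 300)   # weakly informative scores
    print(round(c_index(risk, event), 2))            # typically around 0.65-0.70
    ```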

  7. Empirically Derived Dehydration Scoring and Decision Tree Models for Children With Diarrhea: Assessment and Internal Validation in a Prospective Cohort Study in Dhaka, Bangladesh

    PubMed Central

    Glavis-Bloom, Justin; Modi, Payal; Nasrin, Sabiha; Rege, Soham; Chu, Chieh; Schmid, Christopher H; Alam, Nur H

    2015-01-01

    Introduction: Diarrhea remains one of the most common and most deadly conditions affecting children worldwide. Accurately assessing dehydration status is critical to determining treatment course, yet no clinical diagnostic models for dehydration have been empirically derived and validated for use in resource-limited settings. Methods: In the Dehydration: Assessing Kids Accurately (DHAKA) prospective cohort study, a random sample of children under 5 with acute diarrhea was enrolled between February and June 2014 in Bangladesh. Local nurses assessed children for clinical signs of dehydration on arrival, and then serial weights were obtained as subjects were rehydrated. For each child, the percent weight change with rehydration was used to classify subjects with severe dehydration (>9% weight change), some dehydration (3–9%), or no dehydration (<3%). Clinical variables were then entered into logistic regression and recursive partitioning models to develop the DHAKA Dehydration Score and DHAKA Dehydration Tree, respectively. Models were assessed for their accuracy using the area under their receiver operating characteristic curve (AUC) and for their reliability through repeat clinical exams. Bootstrapping was used to internally validate the models. Results: A total of 850 children were enrolled, with 771 included in the final analysis. Of the 771 children included in the analysis, 11% were classified with severe dehydration, 45% with some dehydration, and 44% with no dehydration. Both the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant AUCs of 0.79 (95% CI = 0.74, 0.84) and 0.76 (95% CI = 0.71, 0.80), respectively, for the diagnosis of severe dehydration. Additionally, the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant positive likelihood ratios of 2.0 (95% CI = 1.8, 2.3) and 2.5 (95% CI = 2.1, 2.8), respectively, and significant negative likelihood ratios of 0.23 (95% CI = 0.13, 0.40) and 0.28 (95% CI = 0.18, 0.44), respectively, for the diagnosis of severe dehydration. Both models demonstrated 90% agreement between independent raters and good reproducibility using bootstrapping. Conclusion: This study is the first to empirically derive and internally validate accurate and reliable clinical diagnostic models for dehydration in a resource-limited setting. After external validation, frontline providers may use these new tools to better manage acute diarrhea in children. PMID:26374802
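
    The likelihood ratios reported above follow directly from sensitivity and specificity, as the short sketch below shows. The 2x2 counts are hypothetical and merely chosen to give values near those of the DHAKA models.

    ```python
    # Hypothetical sketch: diagnostic likelihood ratios from a 2x2 table.
    def likelihood_ratios(tp: int, fn: int, fp: int, tn: int):
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return sens / (1.0 - spec), (1.0 - sens) / spec   # LR+, LR-

    lr_pos, lr_neg = likelihood_ratios(tp=70, fn=15, fp=190, tn=496)
    print("LR+ =", round(lr_pos, 2), "LR- =", round(lr_neg, 2))
    ```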

  8. Empirically Derived Dehydration Scoring and Decision Tree Models for Children With Diarrhea: Assessment and Internal Validation in a Prospective Cohort Study in Dhaka, Bangladesh.

    PubMed

    Levine, Adam C; Glavis-Bloom, Justin; Modi, Payal; Nasrin, Sabiha; Rege, Soham; Chu, Chieh; Schmid, Christopher H; Alam, Nur H

    2015-08-18

    Diarrhea remains one of the most common and most deadly conditions affecting children worldwide. Accurately assessing dehydration status is critical to determining treatment course, yet no clinical diagnostic models for dehydration have been empirically derived and validated for use in resource-limited settings. In the Dehydration: Assessing Kids Accurately (DHAKA) prospective cohort study, a random sample of children under 5 with acute diarrhea was enrolled between February and June 2014 in Bangladesh. Local nurses assessed children for clinical signs of dehydration on arrival, and then serial weights were obtained as subjects were rehydrated. For each child, the percent weight change with rehydration was used to classify subjects with severe dehydration (>9% weight change), some dehydration (3-9%), or no dehydration (<3%). Clinical variables were then entered into logistic regression and recursive partitioning models to develop the DHAKA Dehydration Score and DHAKA Dehydration Tree, respectively. Models were assessed for their accuracy using the area under their receiver operating characteristic curve (AUC) and for their reliability through repeat clinical exams. Bootstrapping was used to internally validate the models. A total of 850 children were enrolled, with 771 included in the final analysis. Of the 771 children included in the analysis, 11% were classified with severe dehydration, 45% with some dehydration, and 44% with no dehydration. Both the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant AUCs of 0.79 (95% CI = 0.74, 0.84) and 0.76 (95% CI = 0.71, 0.80), respectively, for the diagnosis of severe dehydration. Additionally, the DHAKA Dehydration Score and DHAKA Dehydration Tree had significant positive likelihood ratios of 2.0 (95% CI = 1.8, 2.3) and 2.5 (95% CI = 2.1, 2.8), respectively, and significant negative likelihood ratios of 0.23 (95% CI = 0.13, 0.40) and 0.28 (95% CI = 0.18, 0.44), respectively, for the diagnosis of severe dehydration. Both models demonstrated 90% agreement between independent raters and good reproducibility using bootstrapping. This study is the first to empirically derive and internally validate accurate and reliable clinical diagnostic models for dehydration in a resource-limited setting. After external validation, frontline providers may use these new tools to better manage acute diarrhea in children.

  9. Non local-thermodynamical-equilibrium effects in the simulation of laser-produced plasmas

    NASA Astrophysics Data System (ADS)

    Klapisch, M.; Bar-Shalom, A.; Oreg, J.; Colombant, D.

    1998-05-01

    Local thermodynamic equilibrium (LTE) breaks down in directly or indirectly driven laser plasmas because of sharp gradients, energy deposition, etc. For modeling non-LTE effects in hydrodynamical simulations, Busquet's model [Phys. Fluids B 5, 4191 (1993)] is very convenient and efficient. It uses off-line generated LTE opacities and equations of state via an effective, radiation-dependent ionization temperature Tz. An overview of the model is given. The results are compared with an elaborate collisional radiative model based on superconfigurations. The agreement for the average charge Z* and opacities is surprisingly good, even more so when the plasma is immersed in a radiation field. Some remaining discrepancy at low density is attributed to dielectronic recombination. Improvement appears possible, especially for emissivities, because the concept of ionization temperature seems to be validated.

  10. The BTBR mouse model of idiopathic autism – current view on mechanisms

    PubMed Central

    Meyza, K. Z.; Blanchard, D. C.

    2017-01-01

    Autism spectrum disorder (ASD) is the most commonly diagnosed neurodevelopmental disorder, with current estimates of more than 1% of affected children across nations. The patients form a highly heterogeneous group with only the behavioral phenotype in common. The genetic heterogeneity is reflected in a plethora of animal models representing multiple mutations found in families of affected children. Despite many years of scientific effort, for the majority of cases the genetic cause remains elusive. It is therefore crucial to include well-validated models of idiopathic autism in studies searching for potential therapeutic agents. One of these models is the BTBR T+Itpr3tf/J mouse. The current review summarizes data gathered in recent research on potential molecular mechanisms responsible for the autism-like behavioral phenotype of this strain. PMID:28167097

  11. Predicting the heat of vaporization of iron at high temperatures using time-resolved laser-induced incandescence and Bayesian model selection

    NASA Astrophysics Data System (ADS)

    Sipkens, Timothy A.; Hadwin, Paul J.; Grauer, Samuel J.; Daun, Kyle J.

    2018-03-01

    Competing theories have been proposed to account for how the latent heat of vaporization of liquid iron varies with temperature, but experimental confirmation remains elusive, particularly at high temperatures. We propose time-resolved laser-induced incandescence measurements on iron nanoparticles combined with Bayesian model plausibility, as a novel method for evaluating these relationships. Our approach scores the explanatory power of candidate models, accounting for parameter uncertainty, model complexity, measurement noise, and goodness-of-fit. The approach is first validated with simulated data and then applied to experimental data for iron nanoparticles in argon. Our results justify the use of Román's equation to account for the temperature dependence of the latent heat of vaporization of liquid iron.
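
    In the same spirit as the paper's model-plausibility scoring, though far cruder, the sketch below compares a constant latent heat against a temperature-dependent one on simulated data using BIC as a cheap proxy for Bayesian evidence, penalizing the extra parameter. Both functional forms and all numbers are illustrative assumptions, not the paper's candidate models or data.

    ```python
    # Hypothetical sketch: penalized comparison of two latent-heat models.
    import numpy as np
    from scipy.optimize import curve_fit

    def constant_hv(T, a):
        return np.full_like(T, a)        # latent heat independent of temperature

    def linear_hv(T, a, b):
        return a - b * T                 # latent heat falling with temperature

    rng = np.random.default_rng(5)
    T = np.linspace(2500.0, 4000.0, 40)                             # kelvin
    data = linear_hv(T, 7.0e6, 350.0) + rng.normal(0, 5e4, T.size)  # J/kg

    def bic(model, p0):
        popt, _ = curve_fit(model, T, data, p0=p0)
        rss = float(np.sum((data - model(T, *popt)) ** 2))
        n, k = T.size, len(popt)
        return n * np.log(rss / n) + k * np.log(n)                  # lower is better

    print("constant:", round(bic(constant_hv, [6e6])))
    print("temperature-dependent:", round(bic(linear_hv, [6e6, 100.0])))
    ```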

  12. End-of-Discharge and End-of-Life Prediction in Lithium-Ion Batteries with Electrochemistry-Based Aging Models

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.

    2016-01-01

    As batteries become increasingly prevalent in complex systems such as aircraft and electric cars, monitoring and predicting battery state of charge and state of health becomes critical. In order to accurately predict the remaining battery power to support system operations for informed operational decision-making, age-dependent changes in dynamics must be accounted for. Using an electrochemistry-based model, we investigate how key parameters of the battery change as aging occurs, and develop models to describe aging through these key parameters. Using these models, we demonstrate how we can (i) accurately predict end-of-discharge for aged batteries, and (ii) predict the end-of-life of a battery as a function of anticipated usage. The approach is validated through an experimental set of randomized discharge profiles.
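
    A deliberately simple stand-in for the two prediction tasks is sketched below: a square-root capacity-fade law replaces the paper's electrochemistry-based aging model, end of discharge follows from the remaining capacity at a given load, and end of life is the first cycle below 80% of fresh capacity. All parameters are illustrative.

    ```python
    # Hypothetical sketch: EOD and EOL prediction under a toy aging law.
    import numpy as np

    Q0 = 2.2      # fresh capacity, Ah
    k = 0.004     # fade coefficient per sqrt(cycle)

    def capacity(cycle: int) -> float:
        return Q0 * (1.0 - k * np.sqrt(cycle))

    # (i) End of discharge: hours remaining at a constant load for an aged cell
    cycle, load_amps = 400, 1.1
    print("EOD after", round(capacity(cycle) / load_amps, 2), "hours")

    # (ii) End of life: first cycle where capacity drops below 80% of fresh
    eol = next(n for n in range(1, 10_000) if capacity(n) < 0.8 * Q0)
    print("EOL at cycle", eol)   # 2501 under these toy numbers
    ```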

  13. Prediction of overall survival in stage II and III colon cancer beyond TNM system: a retrospective, pooled biomarker study.

    PubMed

    Dienstmann, R; Mason, M J; Sinicrope, F A; Phipps, A I; Tejpar, S; Nesbakken, A; Danielsen, S A; Sveen, A; Buchanan, D D; Clendenning, M; Rosty, C; Bot, B; Alberts, S R; Milburn Jessup, J; Lothe, R A; Delorenzi, M; Newcomb, P A; Sargent, D; Guinney, J

    2017-05-01

    TNM staging alone does not accurately predict outcome in colon cancer (CC) patients who may be eligible for adjuvant chemotherapy. It is unknown to what extent the molecular markers microsatellite instability (MSI) and mutations in BRAF or KRAS improve prognostic estimation in multivariable models that include detailed clinicopathological annotation. After imputation of missing-at-random data, a subset of patients accrued in phase 3 trials with adjuvant chemotherapy (n = 3016), N0147 (NCT00079274) and PETACC3 (NCT00026273), was aggregated to construct multivariable Cox models for 5-year overall survival that were subsequently validated internally in the remaining clinical trial samples (n = 1499), and also externally in different population cohorts of chemotherapy-treated (n = 949) or -untreated (n = 1080) CC patients, and an additional series without treatment annotation (n = 782). TNM staging, MSI and BRAFV600E mutation status remained independent prognostic factors in multivariable models across clinical trial cohorts and observational studies. Concordance indices increased from 0.61-0.68 in the TNM-alone model to 0.63-0.71 in models with added molecular markers, 0.65-0.73 with clinicopathological features and 0.66-0.74 with all covariates. In validation cohorts with complete annotation, the integrated time-dependent AUC rose from 0.64 for the TNM-alone model to 0.67 for models that included clinicopathological features, with or without molecular markers. In patient cohorts that received adjuvant chemotherapy, the relative proportion of variance explained (R2) by TNM, clinicopathological features and molecular markers was on average 65%, 25% and 10%, respectively. Incorporation of MSI, BRAFV600E and KRAS mutation status into overall survival models with TNM staging improves the ability to precisely prognosticate in stage II and III CC patients, but only modestly increases prediction accuracy in multivariable models that include clinicopathological features, particularly in chemotherapy-treated patients.

  14. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

    Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware-dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware-in-the-loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate the goal of the effort: the new HIL configurations have similar functionality and performance compared to the baseline C-MAPSS40k system.

  15. Neutral meson properties under an external magnetic field in nonlocal chiral quark models

    NASA Astrophysics Data System (ADS)

    Gómez Dumm, D.; Izzo Villafañe, M. F.; Scoccola, N. N.

    2018-02-01

    We study the behavior of neutral meson properties in the presence of a static uniform external magnetic field in the context of nonlocal chiral quark models. The formalism is worked out introducing Ritus transforms of Dirac fields, which allow one to obtain closed analytical expressions for the π0 and σ meson masses and for the π0 decay constant. Numerical results for these observables are quoted for various parametrizations. In particular, the behavior of the π0 meson mass with the magnetic field is found to be in good agreement with lattice QCD results. It is also seen that the Goldberger-Treiman and Gell-Mann-Oakes-Renner chiral relations remain valid within these models in the presence of the external magnetic field.
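
    For reference, the two chiral relations the abstract reports as remaining valid read, in their standard vacuum (zero-field) forms, as below; the paper's statement is that suitable analogues continue to hold at nonzero magnetic field in these nonlocal models.

    ```latex
    % Goldberger-Treiman relation (textbook nucleon-level form)
    g_{\pi NN}\, f_\pi = g_A\, M_N
    % Gell-Mann-Oakes-Renner relation, to leading order in the quark masses
    m_\pi^2\, f_\pi^2 = -\,(m_u + m_d)\, \langle \bar{q} q \rangle
    ```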

  16. Novel application and serial evaluation of tissue-engineered portal vein grafts in a murine model.

    PubMed

    Maxfield, Mark W; Stacy, Mitchel R; Kurobe, Hirotsugu; Tara, Shuhei; Yi, Tai; Cleary, Muriel A; Zhuang, Zhen W; Rodriguez-Davalos, Manuel I; Emre, Sukru H; Iwakiri, Yasuko; Shinoka, Toshiharu; Breuer, Christopher K

    2017-12-01

    Surgical management of pediatric extrahepatic portal vein obstruction requires meso-Rex bypass using autologous or synthetic grafts. Tissue-engineered vascular grafts (TEVGs) provide an alternative, but no validated animal models using portal TEVGs exist. Herein, we preclinically assess TEVGs as portal vein bypass grafts. TEVGs were implanted as portal vein interposition conduits in SCID-beige mice, monitored by ultrasound and micro-computed tomography, and histologically assessed postmortem at 12 months. TEVGs remained patent for 12 months. Histologic analysis demonstrated formation of neovessels that resembled native portal veins, with similar content of smooth muscle cells, collagen type III and elastin. TEVGs are feasible portal vein conduits in a murine model. Further preclinical evaluation of TEVGs may facilitate pediatric clinical translation.

  17. Finite-element analysis of NiTi wire deflection during orthodontic levelling treatment

    NASA Astrophysics Data System (ADS)

    Razali, M. F.; Mahmud, A. S.; Mokhtar, N.; Abdullah, J.

    2016-02-01

    Finite-element analysis is an important product development tool in the medical devices industry for the design and failure analysis of devices. This tool helps device designers to quickly explore various design options, optimize specific designs and gain a deeper insight into how a device actually performs. In this study, three-dimensional finite-element models of a superelastic nickel-titanium arch wire engaged in a three-bracket system were developed. The aim was to measure the effect of the binding friction developed at the wire-bracket interface on the remaining recovery force available for tooth movement. Uniaxial and three-bracket bending tests were modelled and validated against experimental work. The predictions made by the three-bracket bending model show good agreement with the experimental results.

  18. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... validation. The RSPP must require the identification of verification and validation methods for the... to be used in the verification and validation process, consistent with appendix C to this part. The... information. (3) If no action is taken on the petition within 180 days, the petition remains pending for...

  19. What Counts as Validity Evidence? Examples and Prevalence in a Systematic Review of Simulation-Based Assessment

    ERIC Educational Resources Information Center

    Cook, David A.; Zendejas, Benjamin; Hamstra, Stanley J.; Hatala, Rose; Brydges, Ryan

    2014-01-01

    Ongoing transformations in health professions education underscore the need for valid and reliable assessment. The current standard for assessment validation requires evidence from five sources: content, response process, internal structure, relations with other variables, and consequences. However, researchers remain uncertain regarding the types…

  20. Fisher's geometrical model emerges as a property of complex integrated phenotypic networks.

    PubMed

    Martin, Guillaume

    2014-05-01

    Models relating phenotype space to fitness (phenotype-fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher's geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model "from first principles" is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher's model's assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
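
    The landscape the derivation arrives at is easy to simulate directly, as in the sketch below: mutations are isotropic Gaussian displacements in an n-dimensional phenotype space with a single fitness peak, and their selection coefficients follow from the change in log fitness. Dimension, mutation scale, and the parent's position are illustrative choices.

    ```python
    # Hypothetical sketch: mutation fitness effects in Fisher's geometric model.
    import numpy as np

    rng = np.random.default_rng(6)
    n_traits, sigma_mut, n_mut = 10, 0.1, 100_000

    z0 = np.zeros(n_traits)
    z0[0] = 0.5                                   # parent displaced from the optimum

    def log_fitness(z):
        return -0.5 * np.sum(z ** 2, axis=-1)     # Gaussian peak at the origin

    mutants = z0 + rng.normal(0.0, sigma_mut, (n_mut, n_traits))
    s = log_fitness(mutants) - log_fitness(z0)    # selection coefficients
    print("mean effect:", s.mean(), "fraction beneficial:", (s > 0).mean())
    ```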

  1. Modeling Droplet Heat and Mass Transfer during Spray Bar Pressure Control of the Multipurpose Hydrogen Test Bed (MHTB) Tank in Normal Gravity

    NASA Technical Reports Server (NTRS)

    Kartuzova, O.; Kassemi, M.

    2016-01-01

    A CFD model for simulating pressure control in cryogenic storage tanks through the injection of a subcooled liquid into the ullage is presented and applied to the 1g MHTB spray bar cooling experiments. An Eulerian-Lagrangian approach is utilized to track the spray droplets and capture the interaction between the discrete droplets and the continuous ullage phase. The spray model is coupled with the VOF model by performing particle tracking in the ullage, removing particles from the ullage when they reach the interface, and then adding their contributions to the liquid. A new model for calculating the droplet-ullage heat and mass transfer is developed. In this model, a droplet is allowed to warm up to the saturation temperature corresponding to the ullage vapor pressure, after which it evaporates while remaining at the saturation temperature. The droplet model is validated against the results of the MHTB spray-bar cooling experiments with 50% and 90% tank fill ratios. The predictions of the present T-sat based model are compared with those of a previously developed kinetic-based droplet mass transfer model. The predictions of the two models regarding the evolving tank pressure and temperature distributions, as well as the droplets' trajectories and temperatures, are examined and compared in detail. Finally, the ullage pressure and local vapor and liquid temperature evolutions are validated against the corresponding data provided by the MHTB spray bar mixing experiment.
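
    A lumped-parameter toy version of the droplet model just described is sketched below: the droplet first warms toward saturation, then evaporates at the saturation temperature with all incoming heat consumed as latent heat. Property values are placeholders, not the hydrogen properties used in the MHTB simulations.

    ```python
    # Hypothetical sketch: two-phase droplet heating/evaporation history.
    h, A = 150.0, 1e-6       # heat transfer coefficient (W/m^2/K), droplet area (m^2)
    m, cp = 1e-7, 9700.0     # droplet mass (kg), liquid specific heat (J/kg/K)
    h_fg = 4.5e5             # latent heat of vaporization (J/kg)
    T_u, T_sat = 24.0, 21.0  # ullage and saturation temperatures (K)

    T, dt = 19.0, 1e-3
    for step in range(300_000):
        if T < T_sat:
            T += h * A * (T_u - T) / (m * cp) * dt      # sensible heating phase
        else:
            m -= h * A * (T_u - T_sat) / h_fg * dt      # evaporation at T_sat
            if m <= 0.0:
                print("droplet consumed at t =", round(step * dt, 1), "s")
                break
    ```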

  2. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  3. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyuk, V.I.

    Not widely known facts on the genesis of the Skyrme model are presented in a historical survey, based on Skyrme's earliest papers and on his own published remembrance. This paper considers the evolution of Skyrme's model description of nuclear matter from the Mesonic Fluid model up to its final version, known as the baryon model. We pay special tribute to some well-known ideas in contemporary particle physics which one can find in Skyrme's earlier papers, such as: Nuclear Democracy, the Solitonic Mechanism, the Nonlinear Realization of Chiral Symmetry, Topological Charges, Fermi-Bose Transmutations, etc. It is curious to note in the final version of the Skyrme model gleams of Kelvin's Vortex Atoms theory. In conclusion we make a brief analysis of the validity of Skyrme's conjectures in view of recent results and pinpoint some questions which still remain.

  5. Measuring meaning and peace with the FACIT-spiritual well-being scale: distinction without a difference?

    PubMed

    Peterman, Amy H; Reeve, Charlie L; Winford, Eboni C; Cotton, Sian; Salsman, John M; McQuellon, Richard; Tsevat, Joel; Campbell, Cassie

    2014-03-01

    The Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being Scale (FACIT-Sp; Peterman, Fitchett, Brady, Hernandez, & Cella, 2002) has become a widely used measure of spirituality; however, there remain questions about its specific factor structure and the validity of scores from its separate scales. Specifically, it remains unclear whether the Meaning and Peace scales denote distinct factors. The present study addresses previous limitations by examining the extent to which the Meaning and Peace scales relate differentially to a variety of physical and mental health variables across 4 sets of data from adults with a number of chronic health conditions. Although a model with separate but correlated factors fit the data better, discriminant validity analyses indicated limited differences in the pattern of associations each scale showed with a wide array of commonly used health and quality-of-life measures. In total, the results suggest that people may distinguish between the concepts of Meaning and Peace, but the observed relations with health outcomes are primarily due to variance shared between the 2 factors. Additional research is needed to better understand the separate and joint role of Meaning and Peace in the quality of life of people with chronic illness.

  6. Measuring Meaning and Peace With the FACIT–Spiritual Well-Being Scale: Distinction Without a Difference?

    PubMed Central

    Peterman, Amy H.; Reeve, Charlie L.; Winford, Eboni C.; Salsman, John M.; Tsevat, Joel; Cotton, Sian; McQuellon, Richard; Campbell, Cassie

    2014-01-01

    The Functional Assessment of Chronic Illness Therapy–Spiritual Well-Being Scale (FACIT–Sp; Peterman, Fitchett, Brady, Hernandez, & Cella, 2002) has become a widely used measure of spirituality; however, there remain questions about its specific factor structure and the validity of scores from its separate scales. Specifically, it remains unclear whether the Meaning and Peace scales denote distinct factors. The present study addresses previous limitations by examining the extent to which the Meaning and Peace scales relate differentially to a variety of physical and mental health variables across 4 sets of data from adults with a number of chronic health conditions. Although a model with separate but correlated factors fit the data better, discriminant validity analyses indicated limited differences in the pattern of associations each scale showed with a wide array of commonly used health and quality-of-life measures. In total, the results suggest that people may distinguish between the concepts of Meaning and Peace, but the observed relations with health outcomes are primarily due to variance shared between the 2 factors. Additional research is needed to better understand the separate and joint role of Meaning and Peace in the quality of life of people with chronic illness. PMID:24188147

  7. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1)

    PubMed Central

    Lingner, Thomas; Kataya, Amr R. A.; Reumann, Sigrun

    2012-01-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences.1 As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals. PMID:22415050

  8. Experimental and statistical post-validation of positive example EST sequences carrying peroxisome targeting signals type 1 (PTS1).

    PubMed

    Lingner, Thomas; Kataya, Amr R A; Reumann, Sigrun

    2012-02-01

    We recently developed the first algorithms specifically for plants to predict proteins carrying peroxisome targeting signals type 1 (PTS1) from genome sequences. As validated experimentally, the prediction methods are able to correctly predict unknown peroxisomal Arabidopsis proteins and to infer novel PTS1 tripeptides. The high prediction performance is primarily determined by the large number and sequence diversity of the underlying positive example sequences, which mainly derived from EST databases. However, a few constructs remained cytosolic in experimental validation studies, indicating sequencing errors in some ESTs. To identify erroneous sequences, we validated subcellular targeting of additional positive example sequences in the present study. Moreover, we analyzed the distribution of prediction scores separately for each orthologous group of PTS1 proteins, which generally resembled normal distributions with group-specific mean values. The cytosolic sequences commonly represented outliers of low prediction scores and were located at the very tail of a fitted normal distribution. Three statistical methods for identifying outliers were compared in terms of sensitivity and specificity. Their combined application allows elimination of erroneous ESTs from positive example data sets. This new post-validation method will further improve the prediction accuracy of both PTS1 and PTS2 protein prediction models for plants, fungi, and mammals.

  9. Measuring acuity of the approximate number system reliably and validly: the evaluation of an adaptive test procedure

    PubMed Central

    Lindskog, Marcus; Winman, Anders; Juslin, Peter; Poom, Leo

    2013-01-01

    Two studies investigated the reliability and predictive validity of commonly used measures and models of Approximate Number System (ANS) acuity. Study 1 investigated reliability by both an empirical approach and a simulation of the maximum obtainable reliability under ideal conditions. Results showed that common measures of the Weber fraction (w) are reliable only when using a substantial number of trials, even under ideal conditions. Study 2 compared different purported measures of ANS acuity in terms of convergent and predictive validity in a within-subjects design and evaluated an adaptive test using the ZEST algorithm. Results showed that the adaptive measure can reduce the number of trials needed to reach acceptable reliability. Only direct tests with non-symbolic numerosity discriminations of stimuli presented simultaneously were related to arithmetic fluency. This correlation remained when controlling for general cognitive ability and perceptual speed. Further, the purported indirect measure of ANS acuity in terms of the Numeric Distance Effect (NDE) was not reliable and showed no sign of predictive validity. The non-symbolic NDE for reaction time was significantly related to direct w estimates in a direction contrary to expectation. Easier stimuli were found to be more reliable, but only harder (7:8 ratio) stimuli contributed to predictive validity. PMID:23964256
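
    The direct measures discussed above rest on the standard ANS discrimination model, in which the probability of a correct response is a cumulative normal in the scaled difference of the two numerosities; fitting the Weber fraction w by maximum likelihood is sketched below. Trial counts, ratios, and the true w are illustrative, and the reliability point is visible here: the estimate tightens only as trials accumulate.

    ```python
    # Hypothetical sketch: ML estimation of the ANS Weber fraction w.
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize_scalar

    def p_correct(n1, n2, w):
        return norm.cdf(np.abs(n1 - n2) / (w * np.sqrt(n1 ** 2 + n2 ** 2)))

    rng = np.random.default_rng(7)
    n1 = rng.integers(8, 25, 300)
    n2 = np.round(n1 * rng.choice([7 / 8, 3 / 4, 1 / 2], 300)).astype(int)
    correct = rng.random(300) < p_correct(n1, n2, 0.25)   # simulated observer

    def nll(w):
        p = np.clip(p_correct(n1, n2, w), 1e-9, 1 - 1e-9)
        return -np.sum(np.where(correct, np.log(p), np.log(1 - p)))

    fit = minimize_scalar(nll, bounds=(0.05, 1.0), method="bounded")
    print("estimated w:", round(fit.x, 3))
    ```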

  10. On the Connection between Kinetic Monte Carlo and the Burton-Cabrera-Frank Theory

    NASA Astrophysics Data System (ADS)

    Patrone, Paul; Margetis, Dionisios; Einstein, T. L.

    2013-03-01

    In the many years since it was first proposed, the Burton-Cabrera-Frank (BCF) model of step-flow has been experimentally established as one of the cornerstones of surface physics. However, many questions remain regarding the underlying physical processes and theoretical assumptions that give rise to the BCF theory. In this work, we formally derive the BCF theory from an atomistic, kinetic Monte Carlo model of the surface in 1+1 dimensions with one step. Our analysis (i) shows how the BCF theory describes a surface with a low density of adsorbed atoms, and (ii) establishes a set of near-equilibrium conditions ensuring that the theory remains valid for all times. Support for PP was provided by the NIST-ARRA Fellowship Award No. 70NANB10H026 through UMD. Support for TLE and PP was also provided by the CMTC at UMD, with ancillary support from the UMD MRSEC. Support for DM was provided by NSF DMS0847587 at UMD.

  11. Multi-response calibration of a conceptual hydrological model in the semiarid catchment of Wadi al Arab, Jordan

    NASA Astrophysics Data System (ADS)

    Rödiger, T.; Geyer, S.; Mallast, U.; Merz, R.; Krause, P.; Fischer, C.; Siebert, C.

    2014-02-01

    A key factor for sustainable management of groundwater systems is the accurate estimation of groundwater recharge. Hydrological models are common tools for such estimations and are widely used. As such models need to be calibrated against measured values, the absence of adequate data can be problematic. We present a nested multi-response calibration approach for a semi-distributed hydrological model in the semi-arid catchment of Wadi al Arab in Jordan, where runoff data are only sparsely available. The basic idea of the calibration approach is to use diverse observations in a nested strategy, in which sub-parts of the model are calibrated to various observation data types in a consecutive manner. First, the different available data sources have to be screened for the information content of processes, e.g. whether data sources contain information on mean values or on spatial or temporal variability, for the entire catchment or only for sub-catchments. In a second step, the information content has to be mapped to the relevant model components that represent these processes. Then the data source is used to calibrate the respective subset of model parameters, while the remaining model parameters are kept unchanged. This mapping is repeated for the other available data sources. In this study, the gauged spring discharge (GSD) method, flash flood observations and data from the chloride mass balance (CMB) are used to derive plausible parameter ranges for the conceptual hydrological model J2000g. The water table fluctuation (WTF) method is used to validate the model, and results from a benchmark model run with a priori parameter values from the literature are compared. The estimated recharge rates of the calibrated model deviate by less than ±10% from the estimates derived from the WTF method. Larger differences are visible in years with high uncertainties in the rainfall input data. During validation, the calibrated model produces better results than the model with only a priori parameter values. The model with a priori parameter values from the literature tends to overestimate recharge rates by up to 30%, particularly in the wet winter of 1991/1992. An overestimation of groundwater recharge, and hence of the available water resources, clearly endangers reliable water resource management in water-scarce regions. The proposed nested multi-response approach may help to better predict water resources despite data scarcity.
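
    One of the calibration targets above reduces to a one-line steady-state balance: the chloride mass balance estimates recharge as R = P * Cl_P / Cl_GW, with P the mean annual precipitation and Cl_P, Cl_GW the chloride concentrations of rainfall and groundwater. The numbers below are placeholders, not Wadi al Arab measurements.

    ```python
    # Hypothetical sketch: chloride mass balance (CMB) recharge estimate.
    precip_mm_per_yr = 450.0          # mean annual precipitation
    cl_rain_mg_per_l = 4.0            # chloride in rainfall
    cl_groundwater_mg_per_l = 60.0    # chloride in groundwater

    recharge = precip_mm_per_yr * cl_rain_mg_per_l / cl_groundwater_mg_per_l
    print("CMB recharge estimate:", round(recharge, 1), "mm/yr")  # 30.0 mm/yr
    ```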

  12. Discovery and Validation of Novel Expression Signature for Postcystectomy Recurrence in High-Risk Bladder Cancer

    PubMed Central

    Lam, Lucia L.; Ghadessi, Mercedeh; Erho, Nicholas; Vergara, Ismael A.; Alshalalfa, Mohammed; Buerki, Christine; Haddad, Zaid; Sierocinski, Thomas; Triche, Timothy J.; Skinner, Eila C.; Davicioni, Elai; Daneshmand, Siamak; Black, Peter C.

    2014-01-01

    Background Nearly half of muscle-invasive bladder cancer patients succumb to their disease following cystectomy. Selecting candidates for adjuvant therapy is currently based on clinical parameters with limited predictive power. This study aimed to develop and validate genomic-based signatures that can better identify patients at risk for recurrence than clinical models alone. Methods Transcriptome-wide expression profiles were generated using 1.4 million feature-arrays on archival tumors from 225 patients who underwent radical cystectomy and had muscle-invasive and/or node-positive bladder cancer. Genomic (GC) and clinical (CC) classifiers for predicting recurrence were developed on a discovery set (n = 133). Performances of GC, CC, an independent clinical nomogram (IBCNC), and genomic-clinicopathologic classifiers (G-CC, G-IBCNC) were assessed in the discovery and independent validation (n = 66) sets. GC was further validated on four external datasets (n = 341). Discrimination and prognostic abilities of classifiers were compared using area under receiver-operating characteristic curves (AUCs). All statistical tests were two-sided. Results A 15-feature GC was developed on the discovery set with area under curve (AUC) of 0.77 in the validation set. This was higher than individual clinical variables, IBCNC (AUC = 0.73), and comparable to CC (AUC = 0.78). Performance was improved upon combining GC with clinical nomograms (G-IBCNC, AUC = 0.82; G-CC, AUC = 0.86). G-CC high-risk patients had elevated recurrence probabilities (P < .001), with GC being the best predictor by multivariable analysis (P = .005). Genomic-clinicopathologic classifiers outperformed clinical nomograms by decision curve and reclassification analyses. GC performed the best in validation compared with seven prior signatures. GC markers remained prognostic across four independent datasets. Conclusions The validated genomic-based classifiers outperform clinical models for predicting postcystectomy bladder cancer recurrence. This may be used to better identify patients who need more aggressive management. PMID:25344601

  13. Development and initial validation of a content taxonomy for patient records in general dentistry

    PubMed Central

    Acharya, Amit; Hernandez, Pedro; Thyvalikakath, Thankam; Ye, Harold; Song, Mei; Schleyer, Titus

    2013-01-01

    Objective Develop and validate an initial content taxonomy for patient records in general dentistry. Methods Phase 1–Obtain 95 de-identified patient records from 11 general dentists in the United States. Phase 2–Extract individual data fields (information items), both explicit (labeled) and implicit (unlabeled), from records, and organize into categories mirroring original field context. Phase 3–Refine raw list of information items by eliminating duplicates/redundancies and focusing on general dentistry. Phase 4–Validate all items regarding inclusion and importance using a two-round Delphi study with a panel of 22 general dentists active in clinical practice, education, and research. Results Analysis of 76 patient records from 9 dentists, combined with previous work, yielded a raw list of 1,509 information items. Refinement reduced this list to 1,107 items, subsequently rated by the Delphi panel. The final model contained 870 items, with 761 (88%) rated as mandatory. In Round 1, 95% (825) of the final items were accepted, in Round 2 the remaining 5% (45). Only 45 items on the initial list were rejected and 192 (or 17%) remained equivocal. Conclusion Grounded in the reality of clinical practice, our proposed content taxonomy represents a significant advance over existing guidelines and standards by providing a granular and comprehensive information representation for general dental patient records. It offers a significant foundational asset for implementing an interoperable health information technology infrastructure for general dentistry. PMID:23838618

  14. The temporal stability and predictive validity of pupils' causal attributions for difficult classroom behaviour.

    PubMed

    Lambert, Nathan; Miller, Andy

    2010-12-01

    Recent studies have investigated the causal attributions for difficult pupil behaviour made by teachers, pupils, and parents but none have investigated the temporal stability or predictive validity of these attributions. This study examines the causal attributions made for difficult classroom behaviour by students on two occasions 30 months apart. The longitudinal stability of these attributions is considered as is the predictive validity of the first set of attributions in relation to teachers' later judgments about individual students' behaviour. Two hundred and seventeen secondary school age pupils (114 males, 103 females) provided data on the two occasions. Teachers also rated each student's behaviour at the two times. A questionnaire listing 63 possible causes of classroom misbehaviour was delivered to pupils firstly when they were in Year 7 (aged 11-12) and then again, 30 months later. Responses were analysed through exploratory factor analysis (EFA). Additionally, teachers were asked to rate the standard of behaviour of each of the students on the two occasions. EFA of the Years 7 and 10 data indicated that pupils' attributions yielded broadly similar five-factor models with the perceived relative importance of these factors remaining the same. Analysis also revealed a predictive relationship between pupils' attributions regarding the factor named culture of misbehaviour in Year 7, and teachers' judgments of their standard of behaviour in Year 10. The present study suggests that young adolescents' causal attributions for difficult classroom behaviour remain stable over time and are predictive of teachers' later judgments about their behaviour.

  15. Making Progress and Gaining Momentum in Global 3Rs Efforts: How the European Pharmaceutical Industry Is Contributing

    PubMed Central

    Fleetwood, Gill; Chlebus, Magda; Coenen, Joachim; Dudoignon, Nicolas; Lecerf, Catherine; Maisonneuve, Catherine; Robinson, Sally

    2015-01-01

    Animal research together with other investigational methods (computer modeling, in vitro tests, etc.) remains an indispensable part of the pharmaceutical research and development process. The European pharmaceutical industry recognizes the responsibilities inherent in animal research and is committed to applying and enhancing 3Rs principles. New nonsentient, ex vivo, and in vitro methods are developed every day and contribute to reducing and, in some instances, replacing in vivo studies. Their utility is, however, limited by the extent of our current knowledge and understanding of complex biological systems. Until validated alternative ways to model these complex interactions become available, animals remain indispensable in research and safety testing. In the interim, scientists continue to look for ways to reduce the number of animals needed to obtain valid results, refine experimental techniques to enhance animal welfare, and replace animals with other research methods whenever feasible. As research goals foster increasing cross-sector and international collaboration, momentum is growing to enhance and coordinate scientific innovation globally—beyond a single company, stakeholder group, sector, region, or country. The implementation of 3Rs strategies can be viewed as an integral part of this continuously evolving science, demonstrating the link between science and welfare, benefiting both the development of new medicines and animal welfare. This goal is one of the key objectives of the Research and Animal Welfare working group of the European Federation of Pharmaceutical Industries and Associations. PMID:25836966

  16. Making progress and gaining momentum in global 3Rs efforts: how the European pharmaceutical industry is contributing.

    PubMed

    Fleetwood, Gill; Chlebus, Magda; Coenen, Joachim; Dudoignon, Nicolas; Lecerf, Catherine; Maisonneuve, Catherine; Robinson, Sally

    2015-03-01

    Animal research together with other investigational methods (computer modeling, in vitro tests, etc.) remains an indispensable part of the pharmaceutical research and development process. The European pharmaceutical industry recognizes the responsibilities inherent in animal research and is committed to applying and enhancing 3Rs principles. New nonsentient, ex vivo, and in vitro methods are developed every day and contribute to reducing and, in some instances, replacing in vivo studies. Their utility is, however, limited by the extent of our current knowledge and understanding of complex biological systems. Until validated alternative ways to model these complex interactions become available, animals remain indispensable in research and safety testing. In the interim, scientists continue to look for ways to reduce the number of animals needed to obtain valid results, refine experimental techniques to enhance animal welfare, and replace animals with other research methods whenever feasible. As research goals foster increasing cross-sector and international collaboration, momentum is growing to enhance and coordinate scientific innovation globally—beyond a single company, stakeholder group, sector, region, or country. The implementation of 3Rs strategies can be viewed as an integral part of this continuously evolving science, demonstrating the link between science and welfare, benefiting both the development of new medicines and animal welfare. This goal is one of the key objectives of the Research and Animal Welfare working group of the European Federation of Pharmaceutical Industries and Associations.

  17. The craving withdrawal model for alcoholism: towards the DSM-V. Improving the discriminant validity of alcohol use disorder diagnosis.

    PubMed

    de Bruijn, Carla; van den Brink, Wim; de Graaf, Ron; Vollebergh, Wilma A M

    2005-01-01

    To compare the discriminant validity of the DSM-IV and the ICD-10 classification of alcohol use disorders (AUD) with an alternative classification, the craving withdrawal model (CWM). CWM requires craving and withdrawal for the diagnosis of alcohol dependence and raises the alcohol abuse threshold to two DSM-IV AUD criteria. Data were derived from The Netherlands Mental Health Survey and Incidence Study, a large representative sample of the general Dutch population. In the present study, only non-abstinent subjects were included (n=6041). Three diagnostic systems (DSM-IV, ICD-10, and CWM) were compared using the following discriminant variables: alcohol intake, psychiatric comorbidity, functional status, familial alcohol problems, and treatment sought. The year prevalence of CWM alcohol dependence was lower than the prevalence of ICD-10 and DSM-IV dependence (0.3% vs 1.4% and 1.4%). The year prevalence of abuse was similar for CWM and DSM-IV (4.7% and 4.9%), but lower for ICD-10 harmful use (1.7%). DSM-IV resulted in a poor distinction between normality and abuse, and ICD-10 resulted in a poor distinction between harmful use and dependence. In contrast, the CWM distinctions between normality and abuse, and between abuse and dependence, were significant for most of the discriminant variables. This study indicates that CWM improves the discriminant validity of AUD diagnoses. The predictive validity of the CWM for alcohol and other substance use disorders remains to be studied.
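
    The CWM decision rule as described above is simple enough to state directly in code. This is a hypothetical encoding for illustration only, not a clinical instrument; the field names are invented.

```python
# Hypothetical encoding of the CWM rule from the abstract: dependence requires
# both craving and withdrawal; abuse requires at least two DSM-IV AUD criteria.
from dataclasses import dataclass

@dataclass
class AUDProfile:
    craving: bool
    withdrawal: bool
    n_dsm_iv_criteria: int  # count of endorsed DSM-IV AUD criteria

def cwm_diagnosis(p: AUDProfile) -> str:
    if p.craving and p.withdrawal:
        return "dependence"
    if p.n_dsm_iv_criteria >= 2:
        return "abuse"
    return "no AUD"

print(cwm_diagnosis(AUDProfile(craving=True, withdrawal=True, n_dsm_iv_criteria=4)))   # dependence
print(cwm_diagnosis(AUDProfile(craving=True, withdrawal=False, n_dsm_iv_criteria=2)))  # abuse
```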

  18. Prognostic nomogram and score to predict overall survival in locally advanced untreated pancreatic cancer (PROLAP)

    PubMed Central

    Vernerey, Dewi; Huguet, Florence; Vienot, Angélique; Goldstein, David; Paget-Bailly, Sophie; Van Laethem, Jean-Luc; Glimelius, Bengt; Artru, Pascal; Moore, Malcolm J; André, Thierry; Mineur, Laurent; Chibaudel, Benoist; Benetkiewicz, Magdalena; Louvet, Christophe; Hammel, Pascal; Bonnetain, Franck

    2016-01-01

    Background: The management of locally advanced pancreatic cancer (LAPC) patients remains controversial. Better discrimination for overall survival (OS) at diagnosis is needed. We address this issue by developing and validating a prognostic nomogram and a score for OS in LAPC (PROLAP). Methods: Analyses were derived from 442 LAPC patients enrolled in the LAP07 trial. The prognostic ability of 30 baseline parameters was evaluated using univariate and multivariate Cox regression analyses. Performance assessment and internal validation of the final model were done with Harrell's C-index, calibration plot and bootstrap sample procedures. On the basis of the final model, a prognostic nomogram and a score were developed, and externally validated in 106 consecutive LAPC patients treated in Besançon Hospital, France. Results: Age, pain, tumour size, albumin and CA 19-9 were independent prognostic factors for OS. The final model had good calibration, acceptable discrimination (C-index=0.60) and robust internal validity. The PROLAP score has the potential to delineate three different prognosis groups with median OS of 15.4, 11.7 and 8.5 months (log-rank P<0.0001). The score's ability to discriminate OS was externally confirmed in 63 (59%) patients with complete clinical data derived from a data set of 106 consecutive LAPC patients; median OS of 18.3, 14.1 and 7.6 months for the three groups (log-rank P<0.0001). Conclusions: The PROLAP nomogram and score can accurately predict OS before initiation of induction chemotherapy in untreated LAPC patients. They may help to optimise clinical trials design and might offer the opportunity to define risk-adapted strategies for LAPC management in the future. PMID:27404456
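
    For readers unfamiliar with the machinery behind such nomograms, the sketch below fits a Cox proportional hazards model and reports Harrell's C-index using the lifelines library; the data are synthetic, and only the five covariate names follow the abstract.

```python
# Sketch of a Cox model plus C-index of the kind underlying the PROLAP
# nomogram. Data are invented; covariate names mirror the abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 442
df = pd.DataFrame({
    "age": rng.normal(63, 9, n),
    "pain": rng.integers(0, 2, n),
    "tumour_size_mm": rng.normal(35, 10, n),
    "albumin_g_l": rng.normal(38, 5, n),
    "log_ca19_9": rng.normal(5, 2, n),
    "os_months": rng.exponential(12, n),  # placeholder survival times
    "event": rng.integers(0, 2, n),       # 1 = death observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="event")
print(cph.concordance_index_)  # Harrell's C-index, cf. the reported 0.60
```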

  19. Assessing health-related quality of life in Japanese children with a chronic condition: validation of the DISABKIDS chronic generic module.

    PubMed

    Sasaki, Hatoko; Kakee, Naoko; Morisaki, Naho; Mori, Rintaro; Ravens-Sieberer, Ulrike; Bullinger, Monika

    2018-05-02

    This study examined the reliability and validity of the Japanese versions of the DISABKIDS-37 generic modules, a tool for assessing the health-related quality of life (HRQOL) of children and adolescents with a chronic condition. The study was conducted using a sample of 123 children/adolescents with a chronic medical condition, aged 8-18 years, and their parents. Focus interviews were performed to ensure content validity after translation. Classical psychometric tests were used to assess reliability and scale intercorrelations. The factor structure was examined with confirmatory factor analysis (CFA). Convergent validity was assessed by the correlation between the total score and the sub-scales of DISABKIDS-37 as well as the total score of KIDSCREEN-10. Both the children/adolescent and parent versions of the score showed good to high internal consistency, and the test-retest reliability correlations were r = 0.91 or above. The CFA revealed that the modified models for all domains fit better than the original 37-item scale model for both self-report and proxy-report. Moderate to high positive correlations were found for the associations within DISABKIDS-37 sub-scales and between the subscales and total score, except for the treatment sub-scale, which correlated weakly with the remaining sub-scales. The total score of the child-reported version of KIDSCREEN-10 correlated significantly and positively with the total score and all the sub-scales of the child-reported version of DISABKIDS-37 except the Treatment sub-scale in adolescents. The modified models of the Japanese version of the DISABKIDS generic module were psychometrically robust enough to assess the HRQOL of children with a chronic condition.

  20. The New York Sepsis Severity Score: Development of a Risk-Adjusted Severity Model for Sepsis.

    PubMed

    Phillips, Gary S; Osborn, Tiffany M; Terry, Kathleen M; Gesten, Foster; Levy, Mitchell M; Lemeshow, Stanley

    2018-05-01

    In accordance with Rory's Regulations, hospitals across New York State developed and implemented protocols for sepsis recognition and treatment to reduce variations in evidence-informed care and preventable mortality. The New York Department of Health sought to develop a risk assessment model for accurate and standardized hospital mortality comparisons of adult septic patients across institutions using case-mix adjustment. Retrospective evaluation of prospectively collected data. Data from 43,204 severe sepsis and septic shock patients from 179 hospitals across New York State were evaluated. Prospective data were submitted to a database from January 1, 2015, to December 31, 2015. None. Maximum likelihood logistic regression was used to estimate model coefficients used in the New York State risk model. The mortality probability was estimated using a logistic regression model. Variables to be included in the model were determined as part of the model-building process. Interactions between variables were included if they made clinical sense and if their p values were less than 0.05. Model development used a random sample of 90% of available patients and was validated using the remaining 10%. Hosmer-Lemeshow goodness of fit p values were considerably greater than 0.05, suggesting good calibration. Areas under the receiver operating characteristic curve in the developmental and validation subsets were 0.770 (95% CI, 0.765-0.775) and 0.773 (95% CI, 0.758-0.787), respectively, indicating good discrimination. Development and validation datasets had similar distributions of estimated mortality probabilities. Mortality increased with rising age, comorbidities, and lactate. The New York Sepsis Severity Score accurately estimated the probability of hospital mortality in severe sepsis and septic shock patients. It performed well with respect to calibration and discrimination. This sepsis-specific model provides an accurate, comprehensive method for standardized mortality comparison of adult patients with severe sepsis and septic shock.
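
    A minimal sketch of the development/validation workflow described above: logistic regression fitted on a random 90% of cases and scored by AUC on the held-out 10%. The features are synthetic stand-ins for the case-mix variables; this is not the New York State model itself.

```python
# Logistic regression with a 90/10 development/validation split and AUC on
# each subset, mirroring the reported workflow on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=43204, n_features=20, random_state=0)
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.10, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
for name, Xs, ys in [("development", X_dev, y_dev), ("validation", X_val, y_val)]:
    auc = roc_auc_score(ys, model.predict_proba(Xs)[:, 1])
    print(f"{name} AUC: {auc:.3f}")
```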

  1. Using deep RNA sequencing for the structural annotation of the laccaria bicolor mycorrhizal transcriptome.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larsen, P. E.; Trivedi, G.; Sreedasyam, A.

    2010-07-06

    Accurate structural annotation is important for prediction of function and required for in vitro approaches to characterize or validate the gene expression products. Despite significant efforts in the field, determination of the gene structure from genomic data alone is a challenging and inaccurate process. The ease of acquisition of transcriptomic sequence provides a direct route to identify expressed sequences and determine the correct gene structure. We developed methods to utilize RNA-seq data to correct errors in the structural annotation and extend the boundaries of current gene models using assembly approaches. The methods were validated with a transcriptomic data set derived from the fungus Laccaria bicolor, which develops a mycorrhizal symbiotic association with the roots of many tree species. Our analysis focused on the subset of 1501 gene models that are differentially expressed in the free living vs. mycorrhizal transcriptome and are expected to be important elements related to carbon metabolism, membrane permeability and transport, and intracellular signaling. Of the set of 1501 gene models, 1439 (96%) successfully generated modified gene models in which all error flags were successfully resolved and the sequences aligned to the genomic sequence. The remaining 4% (62 gene models) either had deviations from transcriptomic data that could not be spanned or generated sequence that did not align to genomic sequence. The outcome of this process is a set of high confidence gene models that can be reliably used for experimental characterization of protein function. 69% of expressed mycorrhizal JGI 'best' gene models deviated from the transcript sequence derived by this method. The transcriptomic sequence enabled correction of a majority of the structural inconsistencies and resulted in a set of validated models for 96% of the mycorrhizal genes. The method described here can be applied to improve gene structural annotation in other species, provided that there is a sequenced genome and a set of gene models.

  2. The Use of a Mesoscale Climate Model to Validate the Nocturnal Carbon Flux over a Forested Site

    NASA Astrophysics Data System (ADS)

    Werth, D.; Parker, M.; Kurzeja, R.; Leclerc, M.; Watson, T.

    2007-12-01

    The Savannah River National Laboratory is initiating a comprehensive carbon dioxide monitoring and modeling program in collaboration with the University of Georgia and the Brookhaven National Laboratory. One of the primary goals is to study the dynamics of carbon dioxide in the stable nocturnal boundary layer (NBL) over a forested area of the Savannah River Site in southwest South Carolina. In the NBL, eddy flux correlation is less effective in determining the release of CO2 due to respiration. Theoretically, however, the flux can be inferred by measuring the build-up of CO2 in the stable layer throughout the night. This method of monitoring the flux will be validated and studied in more detail with both observations and the results of a high-resolution regional climate model. The experiment will involve two phases. First, an artificial tracer will be released into the forest boundary layer and observed through an array of sensors and at a flux tower. The event will be simulated with the RAMS climate model run at very high resolution. Ideally, the tracer will remain trapped within the stable layer and accumulate at rates which will allow us to infer the release rate, and this should compare well to the actual release rate. If an unknown mechanism allows the tracer to escape, the model simulation would be used to reveal it. In the second phase, carbon fluxes will be measured overnight through accumulation in the overlying layer. The RAMS model will be coupled with the SiB carbon model to simulate the nocturnal cycle of carbon dynamics, and this will be compared to the data collected during the night. As with the tracer study, the NBL method of flux measurement will be validated against the model. The RAMS-SiB coupled model has been run over the SRS at high resolution to simulate the NBL, and results from simulations of both phases of the project will be presented.
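
    The accumulation method rests on a simple stable-layer budget: if the respired CO2 stays trapped in a layer of depth h, the surface flux is roughly F ≈ h · dC/dt. A back-of-envelope sketch with invented numbers:

```python
# Box-model estimate of nocturnal respiration flux from CO2 build-up in a
# stable layer: F ~ h * dC/dt. Depth and concentrations are illustrative.
import numpy as np

h = 100.0                              # assumed stable-layer depth (m)
t_hours = np.arange(0, 9)              # overnight sampling times (h)
co2_mg_m3 = 720 + 6.0 * t_hours        # hypothetical layer-mean CO2 (mg m^-3)

dC_dt = np.polyfit(t_hours * 3600.0, co2_mg_m3, 1)[0]  # mg m^-3 s^-1
flux = h * dC_dt                       # mg CO2 m^-2 s^-1
print(f"Inferred respiration flux: {flux * 1000:.1f} ug m^-2 s^-1")
```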

  3. Modeling breath-enhanced jet nebulizers to estimate pulmonary drug deposition.

    PubMed

    Wee, Wallace B; Leung, Kitty; Coates, Allan L

    2013-12-01

    Predictable delivery of aerosol medication for a given patient and drug-device combination is crucial, both for therapeutic effect and to avoid toxicity. The gold standard for measuring pulmonary drug deposition (PDD) is gamma scintigraphy. However, these techniques expose patients to radiation, are complicated, and are relevant for only one patient and drug-device combination, making them less available. Alternatively, in vitro experiments have been used as a surrogate to estimate in vivo performance, but this is time-consuming and has few "in vitro to in vivo" correlations for therapeutics delivered by inhalation. An alternative method for determining inhaled mass and PDD is proposed by deriving and validating a mathematical model for the individual breathing patterns of normal subjects and drug-device operating parameters. This model was evaluated for patients with cystic fibrosis (CF). This study comprises three stages: mathematical model derivation, in vitro testing, and in vivo validation. The model was derived from an idealized patient's respiration cycle and the steady-state operating characteristics of a drug-device combination. The model was tested under in vitro dynamic conditions that varied tidal volume, inspiration-to-expiration time, and breaths per minute. This approach was then extended to incorporate additional physiological parameters (dead space, aerodynamic particle size distribution) and validated against in vivo nuclear medicine data in predicting PDD in both normal subjects and those with CF. The model shows strong agreement with in vitro testing. In vivo testing with normal subjects yielded good agreement, but less agreement for patients with chronic obstructive lung disease and bronchiectasis from CF. The mathematical model was successful in accommodating a wide range of breathing patterns and drug-device combinations. Furthermore, the model has demonstrated its effectiveness in predicting the amount of aerosol delivered to "normal" subjects. However, challenges remain in predicting deposition in obstructive lung disease.
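
    The paper's model is not reproduced here, but the basic accounting it builds on can be sketched: inhaled mass scales with the device's steady-state output and the inspiratory duty cycle, and deposition is further discounted for dead space and particle size. All values and fractions below are illustrative assumptions, not the authors' parameters.

```python
# Deliberately simplified inhaled-mass / deposition estimate for a jet
# nebulizer. The published model also folds in dead space and the full
# aerodynamic particle size distribution; every value here is assumed.
output_rate_mg_min = 1.5        # steady-state drug output of the device (mg/min)
t_insp, t_total = 1.5, 4.0      # inspiration time and breath period (s)
nebulization_min = 10.0         # treatment duration (min)
dead_space_fraction = 0.3       # inhaled aerosol lost to dead space (assumed)
deposition_fraction = 0.6       # fraction of lung-delivered drug deposited (assumed)

duty_cycle = t_insp / t_total   # fraction of the breath spent inhaling
inhaled_mass = output_rate_mg_min * nebulization_min * duty_cycle
pdd = inhaled_mass * (1 - dead_space_fraction) * deposition_fraction
print(f"Inhaled mass ~{inhaled_mass:.1f} mg, predicted deposition ~{pdd:.1f} mg")
```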

  4. The relevance of human stem cell-derived organoid models for epithelial translational medicine

    PubMed Central

    Hynds, Robert E.; Giangreco, Adam

    2014-01-01

    Epithelial organ remodeling is a major contributing factor to worldwide death and disease, costing healthcare systems billions of dollars every year. Despite this, most fundamental epithelial organ research fails to produce new therapies and mortality rates for epithelial organ diseases remain unacceptably high. In large part, this failure in translating basic epithelial research into clinical therapy is due to a lack of relevance in existing preclinical models. To correct this, new models are required that improve preclinical target identification, pharmacological lead validation, and compound optimization. In this review, we discuss the relevance of human stem cell-derived, three-dimensional organoid models for addressing each of these challenges. We highlight the advantages of stem cell-derived organoid models over existing culture systems, discuss recent advances in epithelial tissue-specific organoids, and present a paradigm for using organoid models in human translational medicine. PMID:23203919

  5. Development of a MELCOR Sodium Chemistry (NAC) Package - FY17 Progress.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    This report describes the status of the development of the MELCOR Sodium Chemistry (NAC) package. This development is based on the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR. In the past three years, the sodium equation of state as a working fluid from the nuclear fusion safety research and from the SIMMER code has been implemented into MELCOR. The chemistry models from the CONTAIN-LMR code, such as the spray and pool fire models, have also been implemented into MELCOR. This report describes the implemented models and the issues encountered. Model descriptions and input descriptions are provided. Development testing of the spray and pool fire models is described, including the code-to-code comparison with CONTAIN-LMR. The report ends with an expected timeline for the remaining models to be implemented, such as the atmosphere chemistry, sodium-concrete interactions, and experimental validation tests.

  6. Statistically Qualified Neuro-Analytic system and Method for Process Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    1998-11-04

    An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.

  7. Genetic and Environmental Influences of General Cognitive Ability: Is g a valid latent construct?

    PubMed Central

    Panizzon, Matthew S.; Vuoksimaa, Eero; Spoon, Kelly M.; Jacobson, Kristen C.; Lyons, Michael J.; Franz, Carol E.; Xian, Hong; Vasilopoulos, Terrie; Kremen, William S.

    2014-01-01

    Despite an extensive literature, the “g” construct remains a point of debate. Different models explaining the observed relationships among cognitive tests make distinct assumptions about the role of g in relation to those tests and specific cognitive domains. Surprisingly, these different models and their corresponding assumptions are rarely tested against one another. In addition to the comparison of distinct models, a multivariate application of the twin design offers a unique opportunity to test whether there is support for g as a latent construct with its own genetic and environmental influences, or whether the relationships among cognitive tests are instead driven by independent genetic and environmental factors. Here we tested multiple distinct models of the relationships among cognitive tests utilizing data from the Vietnam Era Twin Study of Aging (VETSA), a study of middle-aged male twins. Results indicated that a hierarchical (higher-order) model with a latent g phenotype, as well as specific cognitive domains, was best supported by the data. The latent g factor was highly heritable (86%), and accounted for most, but not all, of the genetic effects in specific cognitive domains and elementary cognitive tests. By directly testing multiple competing models of the relationships among cognitive tests in a genetically-informative design, we are able to provide stronger support than in prior studies for g being a valid latent construct. PMID:24791031

  8. Perceived experiences of atheist discrimination: Instrument development and evaluation.

    PubMed

    Brewster, Melanie E; Hammer, Joseph; Sawyer, Jacob S; Eklund, Austin; Palamar, Joseph

    2016-10-01

    The present 2 studies describe the development and initial psychometric evaluation of a new instrument, the Measure of Atheist Discrimination Experiences (MADE), which may be used to examine the minority stress experiences of atheist people. Items were created from prior literature, revised by a panel of expert researchers, and assessed psychometrically. In Study 1 (N = 1,341 atheist-identified people), an exploratory factor analysis with 665 participants suggested the presence of 5 related dimensions of perceived discrimination. However, bifactor modeling via confirmatory factor analysis and model-based reliability estimates with data from the remaining 676 participants affirmed the presence of a strong "general" factor of discrimination and mixed to poor support for substantive subdimensions. In Study 2 (N = 1,057 atheist-identified people), another confirmatory factor analysis and model-based reliability estimates strongly supported the bifactor model from Study 1 (i.e., 1 strong "general" discrimination factor) and poor support for subdimensions. Across both studies, the MADE general factor score demonstrated evidence of good reliability (i.e., Cronbach's alphas of .94 and .95; omega hierarchical coefficients of .90 and .92), convergent validity (i.e., with stigma consciousness, β = .56; with awareness of public devaluation, β = .37), and preliminary evidence for concurrent validity (i.e., with loneliness β = .18; with psychological distress β = .27). Reliability and validity evidence for the MADE subscale scores was not sufficient to warrant future use of the subscales. Limitations and implications for future research and clinical work with atheist individuals are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Rating long-term care facilities on pressure ulcer development: importance of case-mix adjustment.

    PubMed

    Berlowitz, D R; Ash, A S; Brandeis, G H; Brand, H K; Halpern, J L; Moskowitz, M A

    1996-03-15

    To determine the importance of case-mix adjustment in interpreting differences in rates of pressure ulcer development in Department of Veterans Affairs long-term care facilities. A sample assembled from the Patient Assessment File, a Veterans Affairs administrative database, was used to derive predictors of pressure ulcer development; the resulting model was validated in a separate sample. Facility-level rates of pressure ulcer development, both unadjusted and adjusted for case mix using the predictive model, were compared. Department of Veterans Affairs long-term care facilities. The derivation sample consisted of 31,150 intermediate medicine and nursing home residents who were initially free of pressure ulcers and were institutionalized between October 1991 and April 1993. The validation sample consisted of 17,946 residents institutionalized from April 1993 to October 1993. Development of a stage 2 or greater pressure ulcer. Eleven factors predicted pressure ulcer development. Validated performance properties of the resulting model were good. Model-predicted rates of pressure ulcer development at individual long-term care facilities varied from 1.9% to 6.3%, and observed rates ranged from 0% to 10.9%. Case-mix-adjusted rates and ranks of facilities differed considerably from unadjusted ratings. For example, among five facilities that were identified as high outliers on the basis of unadjusted rates, two remained as outliers after adjustment for case mix. Long-term care facilities differ in case mix. Adjustments for case mix result in different judgments about facility performance and should be used when facility incidence rates are compared.
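
    The case-mix adjustment amounts to comparing each facility's observed rate with the rate its resident mix predicts (indirect standardization). A sketch with simulated residents, where `p_model` stands in for the validated risk model's predictions:

```python
# Indirect standardization sketch: flag facilities on observed/expected
# pressure ulcer ratios rather than raw rates. All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
facility = rng.integers(0, 5, size=2000)        # 5 hypothetical facilities
p_model = rng.uniform(0.01, 0.08, size=2000)    # per-resident predicted risk
ulcer = rng.random(2000) < p_model              # simulated outcomes

overall = ulcer.mean()
for f in range(5):
    mask = facility == f
    observed, expected = ulcer[mask].mean(), p_model[mask].mean()
    adjusted = overall * observed / expected    # indirectly standardized rate
    print(f"facility {f}: raw {observed:.3f}, adjusted {adjusted:.3f}")
```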

  10. Towards the development of improved tests for negative symptoms of schizophrenia in a validated animal model.

    PubMed

    Sahin, Ceren; Doostdar, Nazanin; Neill, Joanna C

    2016-10-01

    Negative symptoms in schizophrenia remain an unmet clinical need. There is no licensed treatment specifically for this debilitating aspect of the disorder and effect sizes of new therapies are too small to make an impact on quality of life and function. Negative symptoms are multifactorial but often considered in terms of two domains, expressive deficit incorporating blunted affect and poverty of speech and avolition incorporating asociality and lack of drive. There is a clear need for improved understanding of the neurobiology of negative symptoms which can be enabled through the use of carefully validated animal models. While there are several tests for assessing sociability in animals, tests for blunted affect in schizophrenia are currently lacking. Two paradigms have recently been developed for assessing negative affect of relevance to depression in rats. Here we assess their utility for studying negative symptoms in schizophrenia using our well validated model for schizophrenia of sub-chronic (sc) treatment with Phencyclidine (PCP) in adult female rats. Results demonstrate that sc PCP treatment produces a significant negative affect bias in response to a high value reward in the optimistic and affective bias tests. Our results are not easily explained by the known cognitive deficits induced by sc PCP and support the hypothesis of a negative affective bias in this model. We suggest that further refinement of these two tests will provide a means to investigate the neurobiological basis of negative affect in schizophrenia, thus supporting the assessment of efficacy of new targets for this currently untreated symptom domain. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Integral Full Core Multi-Physics PWR Benchmark with Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forget, Benoit; Smith, Kord; Kumar, Shikhar

    In recent years, the importance of modeling and simulation has been highlighted extensively in the DOE research portfolio, with concrete examples in nuclear engineering in the CASL and NEAMS programs. These research efforts and similar efforts worldwide aim at the development of high-fidelity multi-physics analysis tools for the simulation of current and next-generation nuclear power reactors. As with all analysis tools, verification and validation are essential to guarantee proper functioning of the software and methods employed. The current approach relies mainly on the validation of single-physics phenomena (e.g. critical experiment, flow loops, etc.) and there is a lack of relevant multiphysics benchmark measurements that are necessary to validate high-fidelity methods being developed today. This work introduces a new multi-cycle full-core Pressurized Water Reactor (PWR) depletion benchmark based on two operational cycles of a commercial nuclear power plant that provides a detailed description of fuel assemblies, burnable absorbers, in-core fission detectors, core loading and re-loading patterns. This benchmark enables analysts to develop extremely detailed reactor core models that can be used for testing and validation of coupled neutron transport, thermal-hydraulics, and fuel isotopic depletion. The benchmark also provides measured reactor data for Hot Zero Power (HZP) physics tests, boron letdown curves, and three-dimensional in-core flux maps from 58 instrumented assemblies. The benchmark description is now available online and has been used by many groups. However, much work remains to be done on the quantification of uncertainties and modeling sensitivities. This work aims to address these deficiencies and make this benchmark a true non-proprietary international benchmark for the validation of high-fidelity tools. This report details the BEAVRS uncertainty quantification for the first two cycles of operation and serves as the final report of the project.

  12. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis.

    PubMed

    Till, Kevin; Jones, Ben L; Cobley, Stephen; Morley, David; O'Hara, John; Chapman, Chris; Cooke, Carlton; Beggs, Clive B

    2016-01-01

    Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players was collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification.
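
    The core of the pipeline (standardize the test battery, orthogonalize it with SVD, then run ROC analysis on a component score) can be sketched as follows. The data are synthetic stand-ins for the 257-player matrix, and the first singular component is used purely for illustration; its sign and interpretation would need checking against real loadings.

```python
# SVD + ROC sketch: project standardized anthropometric/fitness data onto
# the first singular component and pick a cut-point by Youden's J.
import numpy as np
from sklearn.metrics import roc_curve
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(257, 12))           # players x test battery (synthetic)
professional = rng.random(257) < 0.15    # future attainment label (synthetic)

Z = StandardScaler().fit_transform(X)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
score = U[:, 0] * s[0]                   # projection on the first component

fpr, tpr, thresholds = roc_curve(professional, score)
best = np.argmax(tpr - fpr)              # Youden's J
print(f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```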

  13. Identifying Talent in Youth Sport: A Novel Methodology Using Higher-Dimensional Analysis

    PubMed Central

    Till, Kevin; Jones, Ben L.; Cobley, Stephen; Morley, David; O'Hara, John; Chapman, Chris; Cooke, Carlton; Beggs, Clive B.

    2016-01-01

    Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players was collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification. PMID:27224653

  14. Exploring predictive performance: A reanalysis of the geospace model transition challenge

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Anderson, B. J.; Crowley, G.; Pulkkinen, A. A.; Rastätter, L.

    2017-01-01

    The Pulkkinen et al. (2013) study evaluated the ability of five different geospace models to predict surface dB/dt as a function of upstream solar drivers. This was an important step in the assessment of research models for predicting and ultimately preventing the damaging effects of geomagnetically induced currents. Many questions remain concerning the capabilities of these models. This study presents a reanalysis of the Pulkkinen et al. (2013) results in an attempt to better understand the models' performance. The range of validity of the models is determined by examining the conditions corresponding to the empirical input data. It is found that the empirical conductance models on which global magnetohydrodynamic models rely are frequently used outside the limits of their input data. The prediction error for the models is sorted as a function of solar driving and geomagnetic activity. It is found that all models show a bias toward underprediction, especially during active times. These results have implications for future research aimed at improving operational forecast models.

  15. The German version of the Posttraumatic Stress Disorder Checklist for DSM-5 (PCL-5): psychometric properties and diagnostic utility.

    PubMed

    Krüger-Gottschalk, Antje; Knaevelsrud, Christine; Rau, Heinrich; Dyer, Anne; Schäfer, Ingo; Schellong, Julia; Ehring, Thomas

    2017-11-28

    The Posttraumatic Stress Disorder (PTSD) Checklist (PCL, now PCL-5) has recently been revised to reflect the new diagnostic criteria of the disorder. A clinical sample of trauma-exposed individuals (N = 352) was assessed with the Clinician Administered PTSD Scale for DSM-5 (CAPS-5) and the PCL-5. Internal consistencies and test-retest reliability were computed. To investigate diagnostic accuracy, we calculated receiver operating characteristic curves. Confirmatory factor analyses (CFA) were performed to analyze the structural validity. Results showed high internal consistency (α = .95), high test-retest reliability (r = .91) and a high correlation with the total severity score of the CAPS-5, r = .77. In addition, the recommended cutoff of 33 on the PCL-5 showed high diagnostic accuracy when compared to the diagnosis established by the CAPS-5. CFAs comparing the DSM-5 model with alternative models (the three-factor solution and the dysphoria, anhedonia, externalizing behavior, and hybrid models) to account for the structural validity of the PCL-5 remained inconclusive. Overall, the findings show that the German PCL-5 is a reliable instrument with good diagnostic accuracy. However, more research evaluating the underlying factor structure is needed.
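
    The two reliability statistics quoted above are easy to compute directly; the sketch below implements Cronbach's alpha from its definition and a test-retest Pearson r, on simulated 20-item PCL-5-like responses.

```python
# Cronbach's alpha (k/(k-1) * (1 - sum of item variances / total-score
# variance)) and test-retest r on simulated 20-item responses.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
severity = rng.normal(size=352)                             # latent trait
t1 = severity[:, None] + 0.5 * rng.normal(size=(352, 20))   # 20 items, time 1
t2 = severity[:, None] + 0.5 * rng.normal(size=(352, 20))   # retest

print(f"alpha = {cronbach_alpha(t1):.2f}")
print(f"test-retest r = {np.corrcoef(t1.sum(1), t2.sum(1))[0, 1]:.2f}")
```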

  16. Role of multiple cusps in tooth fracture.

    PubMed

    Barani, Amir; Bush, Mark B; Lawn, Brian R

    2014-07-01

    The role of multiple cusps in the biomechanics of human molar tooth fracture is analysed. A model with four cusps at the bite surface replaces the single dome structure used in previous simulations. Extended finite element modelling, with provision to embed longitudinal cracks into the enamel walls, enables full analysis of crack propagation from initial extension to final failure. The cracks propagate longitudinally around the enamel side walls from starter cracks placed either at the top surface (radial cracks) or at the tooth base (margin cracks). A feature of the crack evolution is its stability, meaning that extension occurs steadily with increasing applied force. Predictions from the model are validated by comparison with experimental data from earlier publications, in which crack development was followed in situ during occlusal loading of extracted human molars. The results show a substantial increase in critical forces to produce longitudinal fractures with the number of cuspal contacts, indicating a capacity for an individual tooth to spread the load during mastication. It is argued that explicit critical force equations derived in previous studies remain valid, at the least as a means for comparing the capacity for teeth of different dimensions to sustain high bite forces. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The application of molecular topology for ulcerative colitis drug discovery.

    PubMed

    Bellera, Carolina L; Di Ianni, Mauricio E; Talevi, Alan

    2018-01-01

    Although the therapeutic arsenal against ulcerative colitis has greatly expanded (including the revolutionary advent of biologics), there remain patients who are refractory to current medications while the safety of the available therapeutics could also be improved. Molecular topology provides a theoretic framework for the discovery of new therapeutic agents in a very efficient manner, and its applications in the field of ulcerative colitis have slowly begun to flourish. Areas covered: After discussing the basics of molecular topology, the authors review QSAR models focusing on validated targets for the treatment of ulcerative colitis, entirely or partially based on topological descriptors. Expert opinion: The application of molecular topology to ulcerative colitis drug discovery is still very limited, and many of the existing reports seem to be strictly theoretic, with no experimental validation or practical applications. Interestingly, mechanism-independent models based on phenotypic responses have recently been reported. Such models are in agreement with the recent interest raised by network pharmacology as a potential solution for complex disorders. These and other similar studies applying molecular topology suggest that some therapeutic categories may present a 'topological pattern' that goes beyond a specific mechanism of action.

  18. Effect of soccer shoe upper on ball behaviour in curve kicks

    PubMed Central

    Ishii, Hideyuki; Sakurai, Yoshihisa; Maruyama, Takeo

    2014-01-01

    New soccer shoes have been developed by considering various concepts related to kicking, such as curving a soccer ball. However, the effects of shoes on ball behaviour remain unclear. In this study, by using a finite element simulation, we investigated the factors that affect ball behaviour immediately after impact in a curve kick. Five experienced male university soccer players performed one curve kick. We developed a finite element model of the foot and ball and evaluated the validity of the model by comparing the finite element results for the ball behaviour immediately after impact with the experimental results. The launch angle, ball velocity, and ball rotation in the finite element analysis were all in general agreement with the experimental results. Using the validated finite element model, we simulated the ball behaviour. The simulation results indicated that the larger the foot velocity immediately before impact, the larger the ball velocity and ball rotation. Furthermore, the Young's modulus of the shoe upper and the coefficient of friction between the shoe upper and the ball had little effect on the launch angle, ball velocity, and ball rotation. The results of this study suggest that the shoe upper does not significantly influence ball behaviour. PMID:25266788

  19. Validation of the Schizotypal Personality Questionnaire-Brief Form in adolescents.

    PubMed

    Fonseca-Pedrero, Eduardo; Paíno-Piñeiro, Mercedes; Lemos-Giráldez, Serafín; Villazón-García, Ursula; Muñiz, José

    2009-06-01

    The main objective of the study was to validate the Schizotypal Personality Questionnaire-Brief (SPQ-B) in a sample of non-clinical adolescents. In addition, the schizotypal personality structure and differences in the dimensions of schizotypy according to gender and age are analyzed. The sample comprises 1683 students, 818 males (48.6%), with a mean age of 15.9 years (SD=1.2). The results showed that the SPQ-B had adequate psychometric properties. Internal consistency of the subscales and total score ranged from 0.61 to 0.81. Confirmatory factor analyses indicated that the three-factor model (positive, negative, and disorganized) and the four-factor model (positive, paranoid, negative, and disorganized) fit reasonably well in comparison to the remaining models. With regard to gender and age, statistically significant differences were found due to age but not to gender. In line with previous literature, the results confirmed the multi-factor structure of the schizotypal personality in non-clinical adolescent populations. Future studies could use the SPQ-B as a screening self-report of rapid and efficient application for the detection of adolescents vulnerable to the development of schizophrenia-spectrum disorders in the general population, in genetically high-risk samples and in clinical studies.

  20. Application of Machine-Learning Models to Predict Tacrolimus Stable Dose in Renal Transplant Recipients

    NASA Astrophysics Data System (ADS)

    Tang, Jie; Liu, Rong; Zhang, Yue-Li; Liu, Mou-Ze; Hu, Yong-Fang; Shao, Ming-Jie; Zhu, Li-Jun; Xin, Hua-Wen; Feng, Gui-Wen; Shang, Wen-Jun; Meng, Xiang-Guang; Zhang, Li-Rong; Ming, Ying-Zi; Zhang, Wei

    2017-02-01

    Tacrolimus has a narrow therapeutic window and considerable variability in clinical use. Our goal was to compare the performance of multiple linear regression (MLR) and eight machine learning techniques in pharmacogenetic algorithm-based prediction of tacrolimus stable dose (TSD) in a large Chinese cohort. A total of 1,045 renal transplant patients were recruited, 80% of which were randomly selected as the “derivation cohort” to develop dose-prediction algorithm, while the remaining 20% constituted the “validation cohort” to test the final selected algorithm. MLR, artificial neural network (ANN), regression tree (RT), multivariate adaptive regression splines (MARS), boosted regression tree (BRT), support vector regression (SVR), random forest regression (RFR), lasso regression (LAR) and Bayesian additive regression trees (BART) were applied and their performances were compared in this work. Among all the machine learning models, RT performed best in both derivation [0.71 (0.67-0.76)] and validation cohorts [0.73 (0.63-0.82)]. In addition, the ideal rate of RT was 4% higher than that of MLR. To our knowledge, this is the first study to use machine learning models to predict TSD, which will further facilitate personalized medicine in tacrolimus administration in the future.
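
    A minimal sketch of the 80/20 derivation/validation comparison, pitting a regression tree against multiple linear regression on synthetic data; the real study used pharmacogenetic covariates and compared nine algorithms.

```python
# Regression tree vs. MLR on an 80/20 split, scored by R^2 on both cohorts.
# Features and outcome are synthetic stand-ins for the TSD prediction task.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1045, n_features=10, noise=20.0, random_state=0)
X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.20, random_state=0)

for name, model in [("MLR", LinearRegression()),
                    ("regression tree", DecisionTreeRegressor(max_depth=4, random_state=0))]:
    model.fit(X_dev, y_dev)
    print(f"{name}: derivation R2 {model.score(X_dev, y_dev):.2f}, "
          f"validation R2 {model.score(X_val, y_val):.2f}")
```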

  1. Effect of soccer shoe upper on ball behaviour in curve kicks

    NASA Astrophysics Data System (ADS)

    Ishii, Hideyuki; Sakurai, Yoshihisa; Maruyama, Takeo

    2014-08-01

    New soccer shoes have been developed by considering various concepts related to kicking, such as curving a soccer ball. However, the effects of shoes on ball behaviour remain unclear. In this study, by using a finite element simulation, we investigated the factors that affect ball behaviour immediately after impact in a curve kick. Five experienced male university soccer players performed one curve kick. We developed a finite element model of the foot and ball and evaluated the validity of the model by comparing the finite element results for the ball behaviour immediately after impact with the experimental results. The launch angle, ball velocity, and ball rotation in the finite element analysis were all in general agreement with the experimental results. Using the validated finite element model, we simulated the ball behaviour. The simulation results indicated that the larger the foot velocity immediately before impact, the larger the ball velocity and ball rotation. Furthermore, the Young's modulus of the shoe upper and the coefficient of friction between the shoe upper and the ball had little effect on the launch angle, ball velocity, and ball rotation. The results of this study suggest that the shoe upper does not significantly influence ball behaviour.

  2. Direct yaw moment control and power consumption of in-wheel motor vehicle in steady-state turning

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takao; Katsuyama, Etsuo; Sugiura, Hideki; Ono, Eiichi; Yamamoto, Masaki

    2017-01-01

    Driving force distribution control is one of the characteristic performance aspects of in-wheel motor vehicles, and various methods have been developed to control direct yaw moment while turning. However, while these controls significantly enhance vehicle dynamic performance, the additional power required to control vehicle motion still remains to be clarified. This paper constructs new formulae, based on a simple bicycle model, for the mechanism by which direct yaw moment alters the cornering resistance and mechanical power of all wheels, including the electrical losses of the motors and inverters. These formulations were validated against an actual test vehicle equipped with in-wheel motors in steady-state turning. The validated theory was also applied to a comparison of several different driving force distribution mechanisms from the standpoint of innate mechanical power.

  3. GIS-based groundwater potential mapping using boosted regression tree, classification and regression tree, and random forest machine learning models in Iran.

    PubMed

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Dixon, Barnali

    2016-01-01

    Groundwater is considered one of the most valuable fresh water resources. The main objective of this study was to produce groundwater spring potential maps in the Koohrang Watershed, Chaharmahal-e-Bakhtiari Province, Iran, using three machine learning models: boosted regression tree (BRT), classification and regression tree (CART), and random forest (RF). Thirteen hydrological-geological-physiographical (HGP) factors that influence locations of springs were considered in this research. These factors include slope degree, slope aspect, altitude, topographic wetness index (TWI), slope length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, lithology, land use, drainage density, and fault density. Subsequently, groundwater spring potential was modeled and mapped using CART, RF, and BRT algorithms. The predicted results from the three models were validated using the receiver operating characteristic curve (ROC). Of the 864 springs identified, 605 (≈70%) were used for spring potential mapping, while the remaining 259 (≈30%) were used for model validation. The area under the curve (AUC) for the BRT model was 0.8103, and the AUCs for CART and RF were 0.7870 and 0.7119, respectively. Therefore, it was concluded that the BRT model produced the best prediction results in predicting locations of springs, followed by the CART and RF models. Geospatially integrated BRT, CART, and RF methods proved to be useful in generating the spring potential map (SPM) with reasonable accuracy.
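
    A compact sketch of the same three-model comparison on a 70/30 split, with scikit-learn's gradient boosting standing in for BRT and a depth-limited decision tree for CART; the 13 conditioning factors are replaced by a synthetic matrix.

```python
# BRT/CART/RF-style comparison by validation AUC on a 70/30 split.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=864, n_features=13, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30, random_state=0)

models = {"BRT": GradientBoostingClassifier(random_state=0),
          "CART": DecisionTreeClassifier(max_depth=5, random_state=0),
          "RF": RandomForestClassifier(random_state=0)}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(f"{name}: AUC = {roc_auc_score(y_te, m.predict_proba(X_te)[:, 1]):.3f}")
```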

  4. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods.

    PubMed

    Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-12-01

    While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (January 2003–February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCEs. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62%, respectively). DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity are more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Statistical Patterns of Ionospheric Convection Derived From Mid-Latitude, High-Latitude, and Polar SuperDARN HF Radar Observations

    NASA Astrophysics Data System (ADS)

    Thomas, E. G.; Shepherd, S. G.

    2017-12-01

    Global patterns of ionospheric convection have been widely studied in terms of the interplanetary magnetic field (IMF) magnitude and orientation in both the Northern and Southern Hemispheres using observations from the Super Dual Auroral Radar Network (SuperDARN). The dynamic range of driving conditions under which existing SuperDARN statistical models are valid is currently limited to periods when the high-latitude convection pattern remains above about 60° geomagnetic latitude. Cousins and Shepherd [2010] found this to correspond to intervals when the solar wind electric field Esw < 4.1 mV/m and IMF Bz is negative. Conversely, under northward IMF conditions (Bz > 0) the high-latitude radars often experience difficulties in measuring convection above about 85° geomagnetic latitude. In this presentation, we introduce a new statistical model of ionospheric convection which is valid for much more dominant IMF Bz conditions than was previously possible by including velocity measurements from the newly constructed tiers of radars in the Northern Hemisphere at midlatitudes and in the polar cap. This new model (TS17) is compared to previous statistical models derived from high-latitude SuperDARN observations (RG96, PSR10, CS10) and its impact on instantaneous Map Potential solutions is examined.

  6. Validation of systems biology derived molecular markers of renal donor organ status associated with long term allograft function.

    PubMed

    Perco, Paul; Heinzel, Andreas; Leierer, Johannes; Schneeberger, Stefan; Bösmüller, Claudia; Oberhuber, Rupert; Wagner, Silvia; Engler, Franziska; Mayer, Gert

    2018-05-03

    Donor organ quality affects long term outcome after renal transplantation. A variety of prognostic molecular markers is available, yet their validity often remains undetermined. A network-based molecular model reflecting donor kidney status based on transcriptomics data and molecular features reported in scientific literature to be associated with chronic allograft nephropathy was created. Significantly enriched biological processes were identified and representative markers were selected. An independent kidney pre-implantation transcriptomics dataset of 76 organs was used to predict estimated glomerular filtration rate (eGFR) values twelve months after transplantation using available clinical data and marker expression values. The best-performing regression model solely based on the clinical parameters donor age, donor gender, and recipient gender explained 17% of variance in post-transplant eGFR values. The five molecular markers EGF, CD2BP2, RALBP1, SF3B1, and DDX19B representing key molecular processes of the constructed renal donor organ status molecular model in addition to the clinical parameters significantly improved model performance (p-value = 0.0007) explaining around 33% of the variability of eGFR values twelve months after transplantation. Collectively, molecular markers reflecting donor organ status significantly add to prediction of post-transplant renal function when added to the clinical parameters donor age and gender.
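
    The reported gain (17% to roughly 33% of eGFR variance explained) is an incremental-value comparison between nested regression models; a sketch on synthetic data, where only the five gene names come from the abstract:

```python
# Nested-model comparison: clinical covariates alone vs. clinical covariates
# plus five marker expression values, scored by R^2 for 12-month eGFR.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 76
clinical = rng.normal(size=(n, 3))   # donor age, donor gender, recipient gender
markers = rng.normal(size=(n, 5))    # EGF, CD2BP2, RALBP1, SF3B1, DDX19B
egfr = (50 - 4 * clinical[:, 0] + markers @ rng.normal(size=5)
        + rng.normal(scale=8, size=n))  # simulated outcome

X_full = np.hstack([clinical, markers])
base = LinearRegression().fit(clinical, egfr)
full = LinearRegression().fit(X_full, egfr)
print(f"clinical only:      R2 = {base.score(clinical, egfr):.2f}")
print(f"clinical + markers: R2 = {full.score(X_full, egfr):.2f}")
```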

  7. Probing the structure of Leishmania major DHFR TS and structure based virtual screening of peptide library for the identification of anti-leishmanial leads.

    PubMed

    Rajasekaran, Rajalakshmi; Chen, Yi-Ping Phoebe

    2012-09-01

    Leishmaniasis, a multi-faceted disease, is considered to be one of the world's major communicable diseases and demands exhaustive research and control measures. The substantial data on these protozoan parasites have not been utilized completely to develop potential therapeutic strategies against leishmaniasis. Dihydrofolate reductase thymidylate synthase (DHFR-TS) plays a major role in the infective state of the parasite, and hence DHFR-TS-based drugs remain of much interest to researchers working on leishmaniasis. Although crystal structures of DHFR-TS from different species including Plasmodium falciparum and Trypanosoma cruzi are available, the experimentally determined structure of the Leishmania major DHFR-TS has not yet been reported in the Protein Data Bank. A high-quality three-dimensional structure of L. major DHFR-TS has been modeled through the homology modeling approach. The carefully refined and energy-minimized structure of the modeled protein was validated using a number of structure validation programs to confirm its quality. The modeled protein structure was used in a structure-based virtual screening process to identify a potential lead structure against DHFR-TS. The lead molecule identified has a binding affinity of 0.51 nM and clearly exhibits drug-like properties.
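
    For context, a dissociation constant maps onto a standard binding free energy through ΔG = RT ln(Kd); a short worked conversion for the reported 0.51 nM lead (the relation is standard thermodynamics; the 298 K temperature is an assumption):

      import math

      R = 8.314462618e-3   # gas constant, kJ/(mol*K)
      T = 298.15           # assumed temperature, K

      def kd_to_dG(kd_molar):
          """Standard binding free energy dG = R*T*ln(Kd);
          more negative means tighter binding."""
          return R * T * math.log(kd_molar)

      print(f"{kd_to_dG(0.51e-9):.1f} kJ/mol")   # about -53 kJ/mol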

  8. A method for the inline measurement of milk gel firmness using an optical sensor.

    PubMed

    Arango, O; Castillo, M

    2018-05-01

    At present, selection of cutting time during cheesemaking is based on subjective methods, which has effects on product homogeneity and has prevented complete automation of cheesemaking. In this work, a new method for inline monitoring of curd firmness is presented. The method consisted of developing a model that correlates the backscatter ratio of near infrared light during milk coagulation with the rheological storage modulus. The model was developed through a factorial design with 2 factors: protein concentration (3.4 and 5.1%) and coagulation temperature (30 and 40°C). Each treatment was replicated 3 times; the model was calibrated with the first replicate and validated using the remaining 2 replicates. The coagulation process was simultaneously monitored using an optical sensor and small-amplitude oscillatory rheology. The model was calibrated and successfully validated at the different protein concentrations and coagulation temperatures studied, predicting the evolution of storage modulus during milk coagulation with coefficient of determination values >0.998 and standard error of prediction values <3.4 Pa. The results demonstrated that the proposed method allows inline monitoring of curd firming in cheesemaking and cutting of the curd at the proper firmness for each type of cheese. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
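
    A minimal sketch of the calibrate-then-validate procedure described above, with synthetic data standing in for one treatment (the linear form, noise level, and all values are assumptions, not the authors' exact model):

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic stand-in: backscatter ratio rising during coagulation,
      # storage modulus G' (Pa) increasing roughly linearly with it.
      ratio_cal = np.linspace(1.0, 1.6, 50)                      # replicate 1
      G_cal = 120.0 * (ratio_cal - 1.0) + rng.normal(0, 1.0, 50)
      ratio_val = np.linspace(1.0, 1.6, 50)                      # replicates 2-3
      G_val = 120.0 * (ratio_val - 1.0) + rng.normal(0, 1.0, 50)

      # Calibrate on the first replicate, validate on the held-out data
      slope, intercept = np.polyfit(ratio_cal, G_cal, 1)
      G_pred = slope * ratio_val + intercept

      ss_res = np.sum((G_val - G_pred) ** 2)
      ss_tot = np.sum((G_val - G_val.mean()) ** 2)
      print(f"R^2 = {1 - ss_res / ss_tot:.4f}")                         # paper: >0.998
      print(f"SEP = {np.sqrt(np.mean((G_val - G_pred) ** 2)):.2f} Pa")  # paper: <3.4 Pa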

  9. An evidence-based framework for predicting the impact of differing autotroph-heterotroph thermal sensitivities on consumer–prey dynamics

    PubMed Central

    Yang, Zhou; Zhang, Lu; Zhu, Xuexia; Wang, Jun; Montagnes, David J S

    2016-01-01

    Increased temperature accelerates vital rates, influencing microbial population and wider ecosystem dynamics, for example, the predicted increases in cyanobacterial blooms associated with global warming. However, heterotrophic and mixotrophic protists, which are dominant grazers of microalgae, may be more thermally sensitive than autotrophs, and thus prey could be suppressed as temperature rises. Theoretical and meta-analyses have begun to address this issue, but an appropriate framework linking experimental data with theory is lacking. Using ecophysiological data to develop a novel model structure, we provide the first validation of this thermal sensitivity hypothesis: increased temperature improves the consumer's ability to control the autotrophic prey. Specifically, the model accounts for temperature effects on auto- and mixotrophs and ingestion, growth and mortality rates, using an ecologically and economically important system (cyanobacteria grazed by a mixotrophic flagellate). Once established, we show the model to be a good predictor of temperature impacts on consumer–prey dynamics by comparing simulations with microcosm observations. Then, through simulations, we indicate our conclusions remain valid, even with large changes in bottom-up factors (prey growth and carrying capacity). In conclusion, we show that rising temperature could, counterintuitively, reduce the propensity for microalgal blooms to occur and, critically, provide a novel model framework for needed, continued assessment. PMID:26684731
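
    A minimal sketch of the modeling idea, with temperature entering through generic Q10 scaling and the consumer's ingestion rate made more thermally sensitive than prey growth (functional forms and all parameter values are illustrative assumptions, not the published model):

      import numpy as np
      from scipy.integrate import solve_ivp

      def q10(rate20, T, q=2.0):
          """Scale a rate measured at 20 C to temperature T with a Q10 rule."""
          return rate20 * q ** ((T - 20.0) / 10.0)

      def consumer_prey(t, y, T):
          P, C = y                         # prey (cyanobacteria), consumer
          r = q10(0.6, T, q=1.8)           # prey growth rate, 1/d
          g = q10(0.8, T, q=2.5)           # ingestion rate (more T-sensitive)
          m = q10(0.1, T, q=2.0)           # consumer mortality, 1/d
          K, e, h = 1e6, 0.3, 1e5          # carrying capacity, efficiency, half-sat
          graze = g * P / (h + P) * C
          return [r * P * (1 - P / K) - graze, e * graze - m * C]

      sol = solve_ivp(consumer_prey, (0, 60), [1e4, 1e2], args=(25.0,))
      print(sol.y[:, -1])   # warmer runs: grazer control of the prey strengthens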

  10. External validation of a risk assessment model for venous thromboembolism in the hospitalised acutely-ill medical patient (VTE-VALOURR).

    PubMed

    Mahan, Charles E; Liu, Yang; Turpie, A Graham; Vu, Jennifer T; Heddle, Nancy; Cook, Richard J; Dairkee, Undaleeb; Spyropoulos, Alex C

    2014-10-01

    Venous thromboembolic (VTE) risk assessment remains an important issue in hospitalised, acutely-ill medical patients, and several VTE risk assessment models (RAM) have been proposed. The purpose of this large retrospective cohort study was to externally validate the IMPROVE RAM using a large database of three acute care hospitals. We studied 41,486 hospitalisations (28,744 unique patients) with 1,240 VTE hospitalisations (1,135 unique patients) in the VTE cohort and 40,246 VTE-free hospitalisations (27,609 unique patients) in the control cohort. After chart review, 139 unique VTE patients were identified, along with 278 randomly selected matched patients in the control cohort. The RAM comprises seven independent VTE risk factors identified in the derivation cohort. In the validation cohort, the incidence of VTE was 0.20% (95% confidence interval [CI] 0.18-0.22), 1.04% (95% CI 0.88-1.25), and 4.15% (95% CI 2.79-8.12) in the low, moderate, and high VTE risk groups, respectively, compared to rates of 0.45%, 1.3%, and 4.74% in the three risk categories of the derivation cohort. For the derivation and validation cohorts, the percentages of patients at low, moderate and high VTE risk were 68.6% vs 63.3%, 24.8% vs 31.1%, and 6.5% vs 5.5%, respectively. Overall, the area under the receiver-operator characteristic curve for the validation cohort was 0.7731. In conclusion, the IMPROVE RAM can accurately identify medical patients at low, moderate, and high VTE risk. This will tailor future thromboprophylactic strategies in this population as well as identify particularly high VTE risk patients in whom multimodal or more intensive prophylaxis may be beneficial.

  11. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    PubMed

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models (area under the receiver operating characteristic curve [AUC] of the cohort differences model: 0.85), signaling that the validation leans towards transferability. Two out of three models had a lower AUC in validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
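
    A minimal sketch of the cohort differences idea, collapsed here to a cross-validated logistic classifier over a shared feature set (the learner, features, and data below are assumptions, not the authors' implementation):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(1)
      X_train = rng.normal(0.0, 1.0, size=(200, 5))   # stand-in training cohort
      X_valid = rng.normal(0.4, 1.0, size=(154, 5))   # shifted validation cohort

      # Label each patient by cohort and ask how separable the cohorts are.
      X = np.vstack([X_train, X_valid])
      cohort = np.r_[np.zeros(len(X_train)), np.ones(len(X_valid))].astype(int)

      probs = cross_val_predict(LogisticRegression(max_iter=1000), X, cohort,
                                cv=5, method="predict_proba")[:, 1]
      auc = roc_auc_score(cohort, probs)
      # AUC near 0.5 -> similar cohorts (reproducibility is being tested);
      # AUC near 1.0 -> different cohorts (transferability is being tested).
      print(f"cohort differences AUC: {auc:.2f}")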

  12. Sleuthing the Isolated Compact Stars

    NASA Astrophysics Data System (ADS)

    Drake, J. J.

    2004-08-01

    In the early 1990s, isolated thermally-emitting neutron stars accreting from the interstellar medium were predicted to show up in their thousands in the ROSAT soft X-ray all-sky survey. The glut of sources would provide unprecedented opportunities for probing the equation of state of ultra-dense matter. Only seven objects have been firmly identified to date. The reasons for this discrepancy are discussed and recent high resolution X-ray spectroscopic observations of these objects are described. Spectra of the brightest of the isolated neutron star candidates, RX J1856.5-3754, continue to present interpretational difficulties for current neutron star model atmospheres and alternative models are briefly discussed. RX J1856.5-3754 remains a valid quark star candidate.

  13. The Interaction of Sexual Validation, Criminal Justice Involvement, and Sexually Transmitted Infection Risk Among Adolescent and Young Adult Males.

    PubMed

    Matson, Pamela A; Towe, Vivian; Ellen, Jonathan M; Chung, Shang-En; Sherman, Susan G

    2018-03-01

    Young men who have been involved with the criminal justice system are more likely to have concurrent sexual partners, a key driver of sexually transmitted infections. The value men place on having sexual relationships to validate themselves may play an important role in understanding this association. Data were from a household survey. Young men (N = 132), aged 16 to 24 years, self-reported whether they ever spent time in jail or juvenile detention and whether they had sexual partnerships that overlapped in time. A novel scale, "Validation through Sex and Sexual Relationships" (VTSSR), assessed the importance young men place on sex and sexual relationships (α = 0.91). Weighted logistic regression accounted for the sampling design. The mean (SD) VTSSR score was 23.7 (8.8) with no differences by race. Both criminal justice involvement (CJI) (odds ratio [OR], 3.69; 95% confidence interval [CI], 1.12-12.1) and sexual validation (OR, 1.10; 95% CI, 1.04-1.16) were associated with increased odds of concurrency; however, CJI did not remain associated with concurrency in the fully adjusted model. There was effect modification: CJI was associated with concurrency among those who scored high on sexual validation (OR, 9.18; 95% CI, 1.73-48.6); however, there was no association among those who scored low on sexual validation. Racial differences were observed between CJI and concurrency, but not between sexual validation and concurrency. Sexual validation may be an important driver of concurrency for men who have been involved with the criminal justice system. Study findings have important implications for how sexual validation may explain racial differences in rates of concurrency.

  14. Accuracy and generalizability of using automated methods for identifying adverse events from electronic health record data: a validation study protocol.

    PubMed

    Rochefort, Christian M; Buckeridge, David L; Tanguay, Andréanne; Biron, Alain; D'Aragon, Frédérick; Wang, Shengrui; Gallix, Benoit; Valiquette, Louis; Audet, Li-Anne; Lee, Todd C; Jayaraman, Dev; Petrucci, Bruno; Lefebvre, Patricia

    2017-02-16

    Adverse events (AEs) in acute care hospitals are frequent and associated with significant morbidity, mortality, and costs. Measuring AEs is necessary for quality improvement and benchmarking purposes, but current detection methods lack accuracy, efficiency, and generalizability. The growing availability of electronic health records (EHR) and the development of natural language processing techniques for encoding narrative data offer an opportunity to develop potentially better methods. The purpose of this study is to determine the accuracy and generalizability of using automated methods for detecting three high-incidence and high-impact AEs from EHR data: a) hospital-acquired pneumonia, b) ventilator-associated event, and c) central line-associated bloodstream infection. This validation study will be conducted among medical, surgical and ICU patients admitted between 2013 and 2016 to the Centre hospitalier universitaire de Sherbrooke (CHUS) and the McGill University Health Centre (MUHC), which has both French and English sites. A random 60% sample of CHUS patients will be used for model development purposes (cohort 1, development set). Using a random sample of these patients, a reference standard assessment of their medical charts will be performed. Multivariate logistic regression and the area under the curve (AUC) will be employed to iteratively develop and optimize three automated AE detection models (i.e., one per AE of interest) using EHR data from the CHUS. These models will then be validated on a random sample of the remaining 40% of CHUS patients (cohort 1, internal validation set) using chart review to assess accuracy. The most accurate models developed and validated at the CHUS will then be applied to EHR data from a random sample of patients admitted to the MUHC French site (cohort 2) and English site (cohort 3), a critical requirement given the use of narrative data, and accuracy will be assessed using chart review. Generalizability will be determined by comparing AUCs from cohorts 2 and 3 to those from cohort 1. This study will likely produce more accurate and efficient measures of AEs. These measures could be used to assess the incidence rates of AEs, evaluate the success of preventive interventions, or benchmark performance across hospitals.

  15. Computational Fluid Dynamics and Additive Manufacturing to Diagnose and Treat Cardiovascular Disease.

    PubMed

    Randles, Amanda; Frakes, David H; Leopold, Jane A

    2017-11-01

    Noninvasive engineering models are now being used for diagnosing and planning the treatment of cardiovascular disease. Techniques in computational modeling and additive manufacturing have matured concurrently, and results from simulations can inform and enable the design and optimization of therapeutic devices and treatment strategies. The emerging synergy between large-scale simulations and 3D printing is having a two-fold benefit: first, 3D printing can be used to validate the complex simulations, and second, the flow models can be used to improve treatment planning for cardiovascular disease. In this review, we summarize and discuss recent methods and findings for leveraging advances in both additive manufacturing and patient-specific computational modeling, with an emphasis on new directions in these fields and remaining open questions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. An animal model for tinnitus.

    PubMed

    Jastreboff, P J; Brennan, J F; Sasaki, C T

    1988-03-01

    Subjective tinnitus remains obscure, widespread, and without apparent cure. In the absence of a suitable animal model, past investigations took place in humans, resulting in studies that were understandably restricted by the nature of human investigation. Within this context, the development of a valid animal model would be considered a major breakthrough in this field of investigation. Our results showed changes in the spontaneous activity of single neurons in the inferior colliculus, consistent with abnormally increased neuronal activity within the auditory pathways after manipulations known to produce tinnitus in man. A procedure based on a Pavlovian conditioned suppression paradigm was recently developed that allows us to measure tinnitus behaviorally in conscious animals. Accordingly, an animal model of tinnitus is proposed that permits tests of hypotheses relating to tinnitus generation, allowing the accommodation of interventional strategies for the treatment of this widespread auditory disorder.

  17. Validation of the Dutch version of the Swallowing Quality-of-Life Questionnaire (DSWAL-QoL) and the adjusted DSWAL-QoL (aDSWAL-QoL) using item analysis with the Rasch model: a pilot study.

    PubMed

    Simpelaere, Ingeborg S; Van Nuffelen, Gwen; De Bodt, Marc; Vanderwegen, Jan; Hansen, Tina

    2017-04-07

    The Swallowing Quality-of-Life Questionnaire (SWAL-QoL) is considered the gold standard for assessing health-related QoL in oropharyngeal dysphagia. The Dutch translation (DSWAL-QoL) and its adjusted version (aDSWAL-QoL) have been validated using classical test theory (CTT). However, these scales have not been tested against the Rasch measurement model, which is required to establish the structural validity and objectivity of the total scale and subscale scores. Thus, the purpose of this study was to examine the psychometric properties of these scales using item analysis according to the Rasch model. Item analysis with the Rasch model was performed using RUMM2030 software with previously collected data from a validation study of 108 patients. The assessment included evaluations of overall model fit, reliability, unidimensionality, threshold ordering, individual item and person fits, differential item functioning (DIF), local item dependency (LID) and targeting. The analysis could not establish the psychometric properties of either of the scales or their subscales because they did not fit the Rasch model, and multidimensionality, disordered thresholds, DIF, and/or LID were found. The reliability and power of fit were high for the total scales (PSI = 0.93) but low for most of the subscales (PSI < 0.70). The targeting of persons and items was suboptimal. The main source of misfit was disordered thresholds for both the total scales and subscales. Based on the results of the analysis, adjustments to improve the scales were implemented as follows: disordered thresholds were rescaled, misfit items were removed and items were split for DIF. However, the multidimensionality and LID could not be resolved. The reliability and power of fit remained low for most of the subscales. This study represents the first analysis of the DSWAL-QoL and aDSWAL-QoL with the Rasch model. Conclusions regarding dysphagia-related HRQoL that rely on the DSWAL-QoL and aDSWAL-QoL total and subscale scores should be treated with caution until the structural validity and objectivity of both scales have been established. A larger and well-targeted sample is recommended to derive definitive conclusions about the items and scales. Solutions for the psychometric weaknesses suggested by the model and practical implications are discussed.
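
    For readers unfamiliar with the model, the dichotomous Rasch item characteristic curve and the threshold-ordering requirement at issue above can be sketched as follows (a deliberately simplified illustration; the analyses above concern polytomous items and the RUMM2030 implementation):

      import numpy as np

      def rasch_prob(theta, b):
          """Rasch (1PL) probability of endorsing an item: a function only
          of the gap between person ability theta and item difficulty b."""
          return 1.0 / (1.0 + np.exp(-(theta - b)))

      print(np.round(rasch_prob(np.linspace(-3, 3, 7), b=0.0), 2))

      # For a polytomous item, the step (threshold) difficulties should
      # increase with category; a violation is a "disordered threshold".
      thresholds = np.array([-1.5, -0.2, 0.9, 2.1])
      print("ordered:", bool(np.all(np.diff(thresholds) > 0)))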

  18. The Psychometric Properties of the Center for Epidemiologic Studies Depression Scale in Chinese Primary Care Patients: Factor Structure, Construct Validity, Reliability, Sensitivity and Responsiveness.

    PubMed

    Chin, Weng Yee; Choi, Edmond P H; Chan, Kit T Y; Wong, Carlos K H

    2015-01-01

    The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument to measure depressive symptomatology. Despite this, the evidence for its psychometric properties remains poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining factor structure, construct validity, reliability, sensitivity and responsiveness. The psychometric properties were assessed amongst a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original CES-D four-factor model had adequate fit; however, the data were better fitted by a bi-factor model. For internal construct validity, corrected item-total correlations were above 0.4 for most items. The convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS). The CES-D had a strong correlation with the PHQ-9 (coefficient: 0.78) and the SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed by McDonald's omega hierarchical (ωH). The ωH value for the general depression factor was 0.855. The ωH values for "somatic", "depressed affect", "positive affect" and "interpersonal problems" were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups, with AUC >0.7. Internal responsiveness of the CES-D to detect positive and negative changes was satisfactory (with p value <0.01 and all effect size statistics >0.2). The CES-D was externally responsive, with AUC >0.7. The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. In its original four-factor and bi-factor structure, the CES-D is supported for cross-cultural comparisons of depression in multi-center studies.

  19. Flight Validation of the Thermal Propellant Gauging Method used at EADS Astrium

    NASA Astrophysics Data System (ADS)

    Dandaleix, L.; Ounougha, L.; Jallade, S.

    2004-10-01

    EADS Astrium recently met a major milestone in the field of propellant gauging with the first reorbitation of a satellite equipped with Eurostar tanks. It proved successful in determining the remaining available propellant mass for spacecraft displacement beyond the customer-specified graveyard orbit, thus demonstrating its expertise in propellant gauging in correlation with tank residual mass minimization. A critical parameter in satellite operational planning is indeed the accurate knowledge of the on-board remaining propellant mass, particularly for commercial telecommunication missions, where it is the major criterion for lifetime maximization. To provide an accurate and reliable process for measurement of this propellant mass throughout lifetime, EADS Astrium uses a combination of two independent techniques: the Dead Reckoning Method (maximum accuracy at BOL), based on thruster flow rate prediction, and the Thermal Propellant Gauging Technique, deriving the propellant mass from the tank thermal capacity (an absolute gauging method with increasing accuracy along lifetime). The present article shows the recent flight validation of the gauging method obtained for Eurostar E2000 propellant tanks, including the validation of the different thermodynamic models. ABBREVIATIONS & ACRONYMS: BOL, MOL, EOL: Beginning, Middle & End of Life; Cempty: Empty tank thermal inertia [J/K]; Chelium: Helium thermal inertia [J/K]; Cpropellant: Propellant thermal inertia [J/K]; Ct = C1 + C2: Total tank thermal inertia (subscripts 1 and 2 for upper and lower nodes) [J/K]; CPS: Combined Propulsion System; DR: Dead Reckoning; FM: Flight Model; LAE: Liquid Apogee Engine; lsb: Least significant byte; M0: TPGS uncertainty component linked to Cempty; mox, mfuel: Propellant mass of oxidiser & fuel [kg]; Pox, Pfuel: Pressure of oxidiser & fuel [bar]; PTA: Propellant Tank Assembly; Q: Heater power [W]; Qox, Qfuel: Mass flow rate of oxidiser & fuel [kg/s]; RCT: Reaction Control Thrusters; T0: Spacecraft platform equilibrium temperature; TPGS: Thermal Propellant Gauging Software; TPGT: Thermal Propellant Gauging Technique; T1i: Internal thermal gradients [K]; T2i: External thermal gradients [K]; τ1: Internal thermal characteristic time [s]; τ2: External thermal characteristic time [s].
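
    The core of the thermal gauging technique is a thermal-inertia balance: subtracting the fixed contributions from the measured total tank thermal inertia leaves the propellant term. A minimal sketch with illustrative numbers (not flight values):

      # Ct = Cempty + Chelium + m_prop * cp_prop  =>  solve for m_prop.
      C_t = 5200.0        # total tank thermal inertia from a heater test, J/K
      C_empty = 1500.0    # empty tank thermal inertia, J/K
      C_helium = 200.0    # pressurant helium thermal inertia, J/K
      cp_prop = 1720.0    # propellant specific heat, J/(kg*K)

      m_prop = (C_t - C_empty - C_helium) / cp_prop
      print(f"remaining propellant: {m_prop:.1f} kg")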

  20. Application of a transonic similarity rule to correct the effects of sidewall boundary layers in two-dimensional transonic wind tunnels. M.S. Thesis - George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Sewall, W. G.

    1982-01-01

    A transonic similarity rule which accounts for the effects of attached sidewall boundary layers is presented and evaluated by comparison with the characteristics of airfoils tested in a two dimensional transonic tunnel with different sidewall boundary layer thicknesses. The rule appears valid provided the sidewall boundary layer both remains attached in the vicinity of the model and occupies a small enough fraction of the tunnel width to preserve sufficient two dimensionality in the tunnel.

  1. An Investigation of a Hybrid Mixing Model for PDF Simulations of Turbulent Premixed Flames

    NASA Astrophysics Data System (ADS)

    Zhou, Hua; Li, Shan; Wang, Hu; Ren, Zhuyin

    2015-11-01

    Predictive simulations of turbulent premixed flames over a wide range of Damköhler numbers in the framework of Probability Density Function (PDF) method still remain challenging due to the deficiency in current micro-mixing models. In this work, a hybrid micro-mixing model, valid in both the flamelet regime and broken reaction zone regime, is proposed. A priori testing of this model is first performed by examining the conditional scalar dissipation rate and conditional scalar diffusion in a 3-D direct numerical simulation dataset of a temporally evolving turbulent slot jet flame of lean premixed H2-air in the thin reaction zone regime. Then, this new model is applied to PDF simulations of the Piloted Premixed Jet Burner (PPJB) flames, which are a set of highly shear turbulent premixed flames and feature strong turbulence-chemistry interaction at high Reynolds and Karlovitz numbers. Supported by NSFC 51476087 and NSFC 91441202.

  2. Quantitative structure-property relationship (QSPR) modeling of drug-loaded polymeric micelles via genetic function approximation.

    PubMed

    Wu, Wensheng; Zhang, Canyang; Lin, Wenjing; Chen, Quan; Guo, Xindong; Qian, Yu; Zhang, Lijuan

    2015-01-01

    Self-assembled nano-micelles of amphiphilic polymers represent a novel anticancer drug delivery system. However, their full clinical utilization remains challenging because the quantitative structure-property relationship (QSPR) between the polymer structure and the efficacy of micelles as a drug carrier is poorly understood. Here, we developed a series of QSPR models to account for the drug loading capacity of polymeric micelles using the genetic function approximation (GFA) algorithm. These models were further evaluated by internal and external validation and a Y-randomization test in terms of stability and generalization, yielding an optimization model that is applicable to an expanded materials regime. As confirmed by experimental data, the relationship between microstructure and drug loading capacity can be well-simulated, suggesting that our models are readily applicable to the quantitative evaluation of the drug-loading capacity of polymeric micelles. Our work may offer a pathway to the design of formulation experiments.

  3. Quantitative Structure-Property Relationship (QSPR) Modeling of Drug-Loaded Polymeric Micelles via Genetic Function Approximation

    PubMed Central

    Lin, Wenjing; Chen, Quan; Guo, Xindong; Qian, Yu; Zhang, Lijuan

    2015-01-01

    Self-assembled nano-micelles of amphiphilic polymers represent a novel anticancer drug delivery system. However, their full clinical utilization remains challenging because the quantitative structure-property relationship (QSPR) between the polymer structure and the efficacy of micelles as a drug carrier is poorly understood. Here, we developed a series of QSPR models to account for the drug loading capacity of polymeric micelles using the genetic function approximation (GFA) algorithm. These models were further evaluated by internal and external validation and a Y-randomization test in terms of stability and generalization, yielding an optimization model that is applicable to an expanded materials regime. As confirmed by experimental data, the relationship between microstructure and drug loading capacity can be well-simulated, suggesting that our models are readily applicable to the quantitative evaluation of the drug-loading capacity of polymeric micelles. Our work may offer a pathway to the design of formulation experiments. PMID:25780923

  4. Performance analysis of the lineal model for estimating the maximum power of a HCPV module in different climate conditions

    NASA Astrophysics Data System (ADS)

    Fernández, Eduardo F.; Almonacid, Florencia; Sarmah, Nabin; Mallick, Tapas; Sanchez, Iñigo; Cuadra, Juan M.; Soria-Moya, Alberto; Pérez-Higueras, Pedro

    2014-09-01

    A model based on easily obtained atmospheric parameters and on a simple linear mathematical expression has been developed at the Centre of Advanced Studies in Energy and Environment in southern Spain. The model predicts the maximum power of a HCPV module as a function of direct normal irradiance, air temperature and air mass. Presently, the proposed model has only been validated in southern Spain, and its performance in locations with different atmospheric conditions remains unknown. In order to address this issue, several HCPV modules have been measured in two locations with climate conditions that differ from those of southern Spain: the Environment and Sustainability Institute in southern UK and the National Renewable Energy Center in northern Spain. Results show that the model provides an adequate match between actual and estimated data, with an RMSE lower than 3.9% at locations with different climate conditions.
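
    A minimal sketch of fitting and scoring a linear maximum-power model of this kind by least squares (coefficients and data are placeholders, not the published fit):

      import numpy as np

      rng = np.random.default_rng(2)
      dni = rng.uniform(400, 1000, 300)    # direct normal irradiance, W/m^2
      t_air = rng.uniform(5, 35, 300)      # air temperature, deg C
      am = rng.uniform(1.0, 3.0, 300)      # air mass
      p_max = (0.30 * dni - 0.8 * t_air - 6.0 * am + 25
               + rng.normal(0, 3, 300))    # synthetic module power, W

      A = np.column_stack([dni, t_air, am, np.ones_like(dni)])
      coef, *_ = np.linalg.lstsq(A, p_max, rcond=None)
      pred = A @ coef
      rmse_pct = np.sqrt(np.mean((p_max - pred) ** 2)) / p_max.mean() * 100
      print(f"RMSE = {rmse_pct:.1f}% of mean power")   # paper reports <3.9%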

  5. Development and validation of a new knowledge, attitude, belief and practice questionnaire on leptospirosis in Malaysia.

    PubMed

    Zahiruddin, Wan Mohd; Arifin, Wan Nor; Mohd-Nazri, Shafei; Sukeri, Surianti; Zawaha, Idris; Bakar, Rahman Abu; Hamat, Rukman Awang; Malina, Osman; Jamaludin, Tengku Zetty Maztura Tengku; Pathman, Arumugam; Mas-Harithulfadhli-Agus, Ab Rahman; Norazlin, Idris; Suhailah, Binti Samsudin; Saudi, Siti Nor Sakinah; Abdullah, Nurul Munirah; Nozmi, Noramira; Zainuddin, Abdul Wahab; Aziah, Daud

    2018-03-07

    In Malaysia, leptospirosis is considered an endemic disease, with sporadic outbreaks following rainy or flood seasons. The objective of this study was to develop and validate a new knowledge, attitude, belief and practice (KABP) questionnaire on leptospirosis for use in urban and rural populations in Malaysia. The study comprised development and validation stages. The development phase encompassed a literature review, expert panel review, focus-group testing, and evaluation. The validation phase consisted of exploratory and confirmatory parts to verify the psychometric properties of the questionnaire. A total of 214 and 759 participants were recruited from two Malaysian states, Kelantan and Selangor respectively, for the validation phase. The participants comprised urban and rural communities with a high reported incidence of leptospirosis. The knowledge section of the validation phase utilized item response theory (IRT) analysis. The attitude and belief sections utilized exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). The development phase resulted in a questionnaire that included four main sections: knowledge, attitude, belief, and practice. In the exploratory phase, as shown by the IRT analysis of knowledge about leptospirosis, the difficulty and discrimination values of the items were acceptable, with the exception of two items. Based on the EFA, the psychometric properties of the attitude, belief, and practice sections were poor. Thus, these sections were revised, and no further factor analysis of the practice section was conducted. In the confirmatory stage, the difficulty and discrimination values of the items in the knowledge section remained within the acceptable range. The CFA of the attitude section resulted in a good-fitting two-factor model. The CFA of the belief section retained a low number of items, although the analysis resulted in a good fit for the final three-factor model. Based on the IRT analysis and factor analytic evidence, the knowledge and attitude sections of the KABP questionnaire on leptospirosis were psychometrically valid. However, the psychometric properties of the belief section were unsatisfactory, despite being revised after the initial validation study. Further development of this section is warranted in future studies.
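
    The IRT screening described above judges knowledge items by their difficulty and discrimination values; a minimal sketch of the two-parameter logistic item curve (the study's exact IRT variant is an assumption here):

      import numpy as np

      def irt_2pl(theta, a, b):
          """Two-parameter logistic item response curve:
          a = discrimination (slope), b = difficulty (location)."""
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      theta = np.linspace(-3, 3, 7)        # latent knowledge levels
      print(np.round(irt_2pl(theta, a=1.2, b=0.0), 2))   # steeper a = sharper item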

  6. A model to evaluate quality and effectiveness of disease management.

    PubMed

    Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R

    2008-12-01

    Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.

  7. Validation of the Glaucoma Filtration Surgical Mouse Model for Antifibrotic Drug Evaluation

    PubMed Central

    Seet, Li-Fong; Lee, Wing Sum; Su, Roseline; Finger, Sharon N; Crowston, Jonathan G; Wong, Tina T

    2011-01-01

    Glaucoma is a progressive optic neuropathy, which, if left untreated, leads to blindness. The most common and most modifiable risk factor in glaucoma is elevated intraocular pressure (IOP), which can be managed surgically by filtration surgery. The postoperative subconjunctival scarring response, however, remains the major obstacle to achieving long-term surgical success. Antiproliferatives such as mitomycin C are commonly used to prevent postoperative scarring. Efficacy of these agents has been tested extensively on monkey and rabbit models of glaucoma filtration surgery. As these models have inherent limitations, we have developed a model of glaucoma filtration surgery in the mouse. We show, for the first time, that the mouse model typically scarred within 14 d, but when augmented with mitomycin C, more animals maintained lower intraocular pressures for a longer period of time concomitant with prolonged bleb survival to beyond 28 d. The morphology of the blebs following mitomycin C treatment also resembled well-documented clinical observations, thus confirming the validity and clinical relevance of this model. We demonstrate that the antiscarring response to mitomycin C is likely to be due to its effects on conjunctival fibroblast proliferation, apoptosis and collagen deposition and the suppression of inflammation. Indeed, we verified some of these properties on mouse conjunctival fibroblasts cultured in vitro. These data support the suitability of this mouse model for studying the wound healing response in glaucoma filtration surgery, and as a potentially useful tool for the in vivo evaluation of antifibrotic therapeutics in the eye. PMID:21229189

  8. Validity and usefulness of the Line Drill test for adolescent basketball players: a Bayesian multilevel analysis.

    PubMed

    Carvalho, Humberto M; Gonçalves, Carlos E; Grosgeorge, Bernard; Paes, Roberto R

    2017-01-01

    The study examined the validity of the Line Drill test (LD) in male adolescent basketball players (10-15 years). Sensitivity of the LD to changes in performance across a training and competition season (4 months) was also considered. Age, maturation, body size and LD were measured (n = 57). Sensitivity of the LD was examined pre- and post-competitive season in a sub-sample (n = 44). The time at each of the four shuttle sprints of the LD (i.e. four stages) was modelled with Bayesian multilevel models. We observed a very large correlation of performance at stage 4 (full LD protocol) with stage 3, but lower correlations with the early LD stages. Players' performance by somatic maturity differed substantially only when considering full LD protocol performance. Substantial improvements in all stages of the protocol were observed across the 4-month competitive season. The LD protocol could therefore be shortened by omitting the last full-court shuttle sprint while remaining sensitive to training exposure and independent of maturity status and body size.

  9. 3D in vitro modeling of the central nervous system

    PubMed Central

    Hopkins, Amy M.; DeSimone, Elise; Chwalek, Karolina; Kaplan, David L.

    2015-01-01

    There are currently more than 600 diseases characterized as affecting the central nervous system (CNS) which inflict neural damage. Unfortunately, few of these conditions have effective treatments available. Although significant efforts have been put into developing new therapeutics, drugs which were promising in the developmental phase have high attrition rates in late stage clinical trials. These failures could be circumvented if current 2D in vitro and in vivo models were improved. 3D, tissue-engineered in vitro systems can address this need and enhance clinical translation through two approaches: (1) bottom-up, and (2) top-down (developmental/regenerative) strategies to reproduce the structure and function of human tissues. Critical challenges remain including biomaterials capable of matching the mechanical properties and extracellular matrix (ECM) composition of neural tissues, compartmentalized scaffolds that support heterogeneous tissue architectures reflective of brain organization and structure, and robust functional assays for in vitro tissue validation. The unique design parameters defined by the complex physiology of the CNS for construction and validation of 3D in vitro neural systems are reviewed here. PMID:25461688

  10. Bioprosthetic heart valve heterograft biomaterials: structure, mechanical behavior and computational simulation.

    PubMed

    Sacks, Michael S; Mirnajafi, Ali; Sun, Wei; Schmidt, Paul

    2006-11-01

    The present review surveys significant developments in the biomechanical characterization and computational simulation of biologically derived, chemically cross-linked soft tissues, or 'heterograft' biomaterials, used in replacement bioprosthetic heart valves (BHV). A survey of mechanical characterization techniques, relevant mechanical properties and computational simulation approaches is presented for both the source tissues and cross-linked biomaterials. Since durability remains the critical problem with current bioprostheses, changes in mechanical behavior with fatigue are also presented. Moreover, given the complex nature of the mechanical properties of heterograft biomaterials, it is not surprising that most constitutive (stress-strain) models historically used to characterize their behavior were oversimplified. Simulations of BHV function utilizing these models have inevitably been inaccurate. Thus, more recent finite element simulations utilizing nonlinear constitutive models, which achieve greater model fidelity, are reviewed. An important conclusion of this review is the need for accurate constitutive models, rigorously validated with appropriate experimental data, in order that the design benefits of computational models can be realized. Finally, for at least the coming 20 years, BHVs fabricated from heterograft biomaterials will continue to be extensively used and will probably remain the dominant valve design. We should thus recognize that rational, scientifically based approaches to BHV biomaterial development and design can lead to significantly improved BHVs over the coming decades, potentially impacting millions of patients worldwide with heart valve disease.

  11. Accuracy of dengue clinical diagnosis with and without NS1 antigen rapid test: Comparison between human and Bayesian network model decision.

    PubMed

    Sa-Ngamuang, Chaitawat; Haddawy, Peter; Luvira, Viravarn; Piyaphanee, Watcharapong; Iamsirithaworn, Sopon; Lawpoolsri, Saranath

    2018-06-18

    Differentiating dengue patients from other acute febrile illness patients is a great challenge for physicians. Several dengue diagnosis methods are recommended by the WHO. The application of specific laboratory tests is still limited due to high cost, lack of equipment, and uncertain validity. Therefore, clinical diagnosis remains a common practice, especially in resource-limited settings. Bayesian networks have been shown to be a useful tool for diagnostic decision support. This study aimed to construct Bayesian network models using basic demographic, clinical, and laboratory profiles of acute febrile illness patients to diagnose dengue. Data of 397 acute undifferentiated febrile illness patients who visited the fever clinic of the Bangkok Hospital for Tropical Diseases, Thailand, were used for model construction and validation. The two best final models were selected: one with and one without the NS1 rapid test result. The diagnostic accuracy of the models was compared with that of physicians on the same set of patients. The Bayesian network models provided good diagnostic accuracy for dengue infection, with ROC AUC values of 0.80 and 0.75 for the models with and without the NS1 rapid test result, respectively. The models had approximately 80% specificity and 70% sensitivity, similar to the diagnostic accuracy of the hospital's fellows in infectious disease. Including the NS1 rapid test result improved the specificity but reduced the sensitivity, in both model and physician diagnoses. The Bayesian network model developed in this study could be useful to assist physicians in diagnosing dengue, particularly in regions where experienced physicians and laboratory confirmation tests are limited.
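
    A toy illustration of the diagnostic updating a Bayesian network performs, collapsed here to naive Bayes over two findings (all probabilities are invented placeholders, not values from the study):

      # Posterior odds = prior odds x product of likelihood ratios.
      p_dengue = 0.30                      # assumed prior among febrile patients
      likelihoods = {                      # (P(finding | dengue), P(finding | other))
          "ns1_positive": (0.70, 0.02),
          "leukopenia":   (0.80, 0.35),
      }

      odds = p_dengue / (1 - p_dengue)
      for finding, (p_d, p_o) in likelihoods.items():
          odds *= p_d / p_o
      print(f"P(dengue | findings) = {odds / (1 + odds):.2f}")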

  12. Development of Risk Score for Predicting 3-Year Incidence of Type 2 Diabetes: Japan Epidemiology Collaboration on Occupational Health Study

    PubMed Central

    Nanri, Akiko; Nakagawa, Tohru; Kuwahara, Keisuke; Yamamoto, Shuichiro; Honda, Toru; Okazaki, Hiroko; Uehara, Akihiko; Yamamoto, Makoto; Miyamoto, Toshiaki; Kochi, Takeshi; Eguchi, Masafumi; Murakami, Taizo; Shimizu, Chii; Shimizu, Makiko; Tomita, Kentaro; Nagahama, Satsue; Imai, Teppei; Nishihara, Akiko; Sasaki, Naoko; Hori, Ai; Sakamoto, Nobuaki; Nishiura, Chihiro; Totsuzaki, Takafumi; Kato, Noritada; Fukasawa, Kenji; Huanhuan, Hu; Akter, Shamima; Kurotani, Kayo; Kabe, Isamu; Mizoue, Tetsuya; Sone, Tomofumi; Dohi, Seitaro

    2015-01-01

    Objective: Risk models and scores have been developed to predict incidence of type 2 diabetes in Western populations, but their performance may differ when applied to non-Western populations. We developed and validated a risk score for predicting 3-year incidence of type 2 diabetes in a Japanese population. Methods: Participants were 37,416 men and women, aged 30 or older, who received periodic health checkup in 2008–2009 in eight companies. Diabetes was defined as fasting plasma glucose (FPG) ≥126 mg/dl, random plasma glucose ≥200 mg/dl, glycated hemoglobin (HbA1c) ≥6.5%, or receiving medical treatment for diabetes. Risk scores on non-invasive and invasive models including FPG and HbA1c were developed using logistic regression in a derivation cohort and validated in the remaining cohort. Results: The area under the curve (AUC) for the non-invasive model including age, sex, body mass index, waist circumference, hypertension, and smoking status was 0.717 (95% CI, 0.703–0.731). In the invasive model in which both FPG and HbA1c were added to the non-invasive model, AUC was increased to 0.893 (95% CI, 0.883–0.902). When the risk scores were applied to the validation cohort, AUCs (95% CI) for the non-invasive and invasive model were 0.734 (0.715–0.753) and 0.882 (0.868–0.895), respectively. Participants with a non-invasive score of ≥15 and invasive score of ≥19 were projected to have >20% and >50% risk, respectively, of developing type 2 diabetes within 3 years. Conclusions: The simple risk score of the non-invasive model might be useful for predicting incident type 2 diabetes, and its predictive performance may be markedly improved by incorporating FPG and HbA1c. PMID:26558900

  13. Estimating body fat in NCAA Division I female athletes: a five-compartment model validation of laboratory methods.

    PubMed

    Moon, Jordan R; Eckerson, Joan M; Tobkin, Sarah E; Smith, Abbie E; Lockwood, Christopher M; Walter, Ashley A; Cramer, Joel T; Beck, Travis W; Stout, Jeffrey R

    2009-01-01

    The purpose of the present study was to determine the validity of various laboratory methods for estimating percent body fat (%fat) in NCAA Division I college female athletes (n = 29; 20 ± 1 years). Body composition was assessed via hydrostatic weighing (HW), air displacement plethysmography (ADP), and dual-energy X-ray absorptiometry (DXA), and estimates of %fat derived using 4-compartment (4C), 3C, and 2C models were compared to a criterion 5C model that included bone mineral content, body volume (BV), total body water, and soft tissue mineral. The Wang-4C and the Siri-3C models produced nearly identical values compared to the 5C model (r > 0.99, total error (TE) < 0.40%fat). For the remaining laboratory methods, constant error values (CE) ranged from -0.04%fat (HW-Siri) to -3.71%fat (DXA); r values ranged from 0.89 (ADP-Siri, ADP-Brozek) to 0.93 (DXA); standard error of estimate values ranged from 1.78%fat (DXA) to 2.19%fat (ADP-Siri, ADP-Brozek); and TE values ranged from 2.22%fat (HW-Brozek) to 4.90%fat (DXA). The limits of agreement for DXA (-10.10 to 2.68%fat) were the largest, with a significant trend of -0.43 (P < 0.05). With the exception of DXA, all of the equations resulted in acceptable TE values (<3.08%fat). However, the results for individual estimates of %fat using the Brozek equation indicated that the 2C models that derived BV from ADP and HW overestimated (5.38, 3.65%) and underestimated (5.19, 4.88%) %fat, respectively. The acceptable TE values for both HW and ADP suggest that these methods are valid for estimating %fat in college female athletes; however, the Wang-4C and Siri-3C models should be used to identify individual estimates of %fat in this population.
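
    The two-compartment equations referenced above convert body density into %fat under fixed assumptions about fat and fat-free-mass densities; a short sketch of the standard Siri (1961) and Brozek (1963) forms:

      def siri_2c(db):
          """Siri two-compartment equation: %fat from body density (g/cm^3)."""
          return (4.95 / db - 4.50) * 100.0

      def brozek_2c(db):
          """Brozek two-compartment equation."""
          return (4.57 / db - 4.142) * 100.0

      db = 1.055   # example body density from HW or ADP
      print(f"Siri: {siri_2c(db):.1f} %fat, Brozek: {brozek_2c(db):.1f} %fat")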

  14. Promoting motivation through mode of instruction: The relationship between use of affective teaching techniques and motivation to learn science

    NASA Astrophysics Data System (ADS)

    Sanchez Rivera, Yamil

    The purpose of this study is to add to what we know about the affective domain and to create a valid instrument for future studies. The Motivation to Learn Science (MLS) Inventory is based on Krathwohl's Taxonomy of Affective Behaviors (Krathwohl et al., 1964). The results of the Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) demonstrated that the MLS Inventory is a valid and reliable instrument. The MLS Inventory is a uni-dimensional instrument composed of 9 items with convergent validity (no divergence). The instrument had a high Cronbach alpha value of .898 in the EFA analysis and .919 in the CFA analysis. Factor loadings on the 9 items ranged from .617 to .800. Standardized regression weights ranged from .639 to .835 in the CFA analysis. Various indices (RMSEA = .033; NFI = .987; GFI = .985; CFI = 1.000) demonstrated a good fit of the proposed model. Hierarchical linear modeling was used to statistically analyze the data, with students' motivation to learn science scores (level-1) nested within teachers (level-2). The analysis was geared toward identifying whether teachers' use of affective behavior (a level-2 classroom variable) was significantly related to students' MLS scores (the level-1 criterion variable). Model testing proceeded in three phases: an intercept-only model, a means-as-outcome model, and a random-regression coefficient model. The intercept-only model revealed an intra-class correlation coefficient of .224 with an estimated reliability of .726. The data therefore suggested that only 22.4% of the variance in MLS scores is between classes, with the remaining 77.6% at the student level. Due to the significant variance in MLS scores, χ²(62.756, p < .0001), teachers' TAB scores were added as a level-2 predictor. The regression coefficient was non-significant (p > .05). Therefore, the teachers' self-reported use of affective behaviors was not a significant predictor of students' motivation to learn science.
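
    The intra-class correlation quoted above follows directly from the null model's variance components; a two-line sketch (the component values are illustrative, scaled to reproduce ICC = .224):

      # ICC = tau00 / (tau00 + sigma^2): share of variance between teachers.
      tau00 = 0.289    # between-teacher intercept variance (illustrative)
      sigma2 = 1.000   # within-class, student-level residual variance
      print(f"ICC = {tau00 / (tau00 + sigma2):.3f}")   # ~0.224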

  15. Development of Risk Score for Predicting 3-Year Incidence of Type 2 Diabetes: Japan Epidemiology Collaboration on Occupational Health Study.

    PubMed

    Nanri, Akiko; Nakagawa, Tohru; Kuwahara, Keisuke; Yamamoto, Shuichiro; Honda, Toru; Okazaki, Hiroko; Uehara, Akihiko; Yamamoto, Makoto; Miyamoto, Toshiaki; Kochi, Takeshi; Eguchi, Masafumi; Murakami, Taizo; Shimizu, Chii; Shimizu, Makiko; Tomita, Kentaro; Nagahama, Satsue; Imai, Teppei; Nishihara, Akiko; Sasaki, Naoko; Hori, Ai; Sakamoto, Nobuaki; Nishiura, Chihiro; Totsuzaki, Takafumi; Kato, Noritada; Fukasawa, Kenji; Huanhuan, Hu; Akter, Shamima; Kurotani, Kayo; Kabe, Isamu; Mizoue, Tetsuya; Sone, Tomofumi; Dohi, Seitaro

    2015-01-01

    Risk models and scores have been developed to predict incidence of type 2 diabetes in Western populations, but their performance may differ when applied to non-Western populations. We developed and validated a risk score for predicting 3-year incidence of type 2 diabetes in a Japanese population. Participants were 37,416 men and women, aged 30 or older, who received periodic health checkup in 2008-2009 in eight companies. Diabetes was defined as fasting plasma glucose (FPG) ≥ 126 mg/dl, random plasma glucose ≥ 200 mg/dl, glycated hemoglobin (HbA1c) ≥ 6.5%, or receiving medical treatment for diabetes. Risk scores on non-invasive and invasive models including FPG and HbA1c were developed using logistic regression in a derivation cohort and validated in the remaining cohort. The area under the curve (AUC) for the non-invasive model including age, sex, body mass index, waist circumference, hypertension, and smoking status was 0.717 (95% CI, 0.703-0.731). In the invasive model in which both FPG and HbA1c were added to the non-invasive model, AUC was increased to 0.893 (95% CI, 0.883-0.902). When the risk scores were applied to the validation cohort, AUCs (95% CI) for the non-invasive and invasive model were 0.734 (0.715-0.753) and 0.882 (0.868-0.895), respectively. Participants with a non-invasive score of ≥ 15 and invasive score of ≥ 19 were projected to have >20% and >50% risk, respectively, of developing type 2 diabetes within 3 years. The simple risk score of the non-invasive model might be useful for predicting incident type 2 diabetes, and its predictive performance may be markedly improved by incorporating FPG and HbA1c.

  16. Implementation of remaining service interval concept.

    DOT National Transportation Integrated Search

    2016-10-01

    This document is a technical summary of the Federal Highway Administration (FHWA) report, "Application and Validation of RSI Framework to Pavements" (FHWA-HRT-16-053). The goal of this project was to demonstrate and to validate the application of the...

  17. Animal models for posttraumatic stress disorder: An overview of what is used in research

    PubMed Central

    Borghans, Bart; Homberg, Judith R

    2015-01-01

    Posttraumatic stress disorder (PTSD) is a common anxiety disorder characterised by its persistence of symptoms after a traumatic experience. Although some patients can be cured, many do not benefit enough from the psychological therapies or medication strategies used. Many researchers use animal models to learn more about the disorder and several models are available. The most-used physical stressor models are single-prolonged stress, restraint stress, foot shock, stress-enhanced fear learning, and underwater trauma. Common social stressors are housing instability, social instability, early-life stress, and social defeat. Psychological models are not as diverse and rely on controlled exposure to the test animal’s natural predator. While validation of these models has been resolved with replicated symptoms using analogous stressors, translating new findings to human patients remains essential for their impact on the field. Choosing a model to experiment with can be challenging; this overview of what is possible with individual models may aid in making a decision. PMID:26740930

  18. Unconventional Liquid Flow in Low-Permeability Media: Theory and Revisiting Darcy's Law

    NASA Astrophysics Data System (ADS)

    Liu, H. H.; Chen, J.

    2017-12-01

    About 80% of fracturing fluid remains in shale formations after hydraulic fracturing and the flowback process. It is critical to understand and accurately model the flow of fracturing fluids in a shale formation, because this flow has many practical implications for shale gas recovery. Owing to the strong solid-liquid interaction in low-permeability media, Darcy's law is not always adequate for describing liquid flow in a shale formation. This non-Darcy flow behavior (characterized by nonlinearity of the relationship between liquid flux and hydraulic gradient), however, has not been given enough attention in the shale gas community. The current study develops a systematic methodology to address this important issue. We developed a phenomenological model for liquid flow in shale (in which liquid flux is a power function of the pressure gradient), an extension of conventional Darcy's law, and also a methodology to estimate parameters for the phenomenological model from spontaneous imbibition tests. The validity of our new developments is verified by satisfactory comparisons of theoretical results with observations from our own and other research groups. The relative importance of this non-Darcy liquid flow for hydrocarbon production in unconventional reservoirs remains an issue that needs to be further investigated.
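
    A minimal sketch of a power-law extension of Darcy's law of the kind described above (the paper's exact functional form and parameter values are not reproduced here; n = 1 recovers Darcy's law):

      import numpy as np

      def flux(grad_p, k=1e-18, mu=1e-3, n=1.5, g0=1e6):
          """q = -(k/mu) * g0 * (|grad P|/g0)**n, signed against the gradient.
          n > 1 suppresses flow at small gradients, as expected for strong
          solid-liquid interaction; g0 is a reference gradient (Pa/m)."""
          return -(k / mu) * g0 * np.sign(grad_p) * (np.abs(grad_p) / g0) ** n

      for g in (1e5, 1e6, 1e7):   # pressure gradients, Pa/m
          print(f"grad P = {g:.0e} Pa/m -> q = {flux(g):.3e} m/s")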

  19. Temperament and problem solving in a population of adolescent guide dogs.

    PubMed

    Bray, Emily E; Sammel, Mary D; Seyfarth, Robert M; Serpell, James A; Cheney, Dorothy L

    2017-09-01

    It is often assumed that measures of temperament within individuals are more correlated to one another than to measures of problem solving. However, the exact relationship between temperament and problem-solving tasks remains unclear because large-scale studies have typically focused on each independently. To explore this relationship, we tested 119 prospective adolescent guide dogs on a battery of 11 temperament and problem-solving tasks. We then summarized the data using both confirmatory factor analysis and exploratory principal components analysis. Results of confirmatory analysis revealed that a priori separation of tests as measuring either temperament or problem solving led to weak results, poor model fit, some construct validity, and no predictive validity. In contrast, results of exploratory analysis were best summarized by principal components that mixed temperament and problem-solving traits. These components had both construct and predictive validity (i.e., association with success in the guide dog training program). We conclude that there is complex interplay between tasks of "temperament" and "problem solving" and that the study of both together will be more informative than approaches that consider either in isolation.

  20. Enhanced migration of polychlorodibenzo-p-dioxins and furans in the presence of pentachlorophenol-treated oil in soil around utility poles: screening model validation.

    PubMed

    Bulle, Cécile; Samson, Réjean; Deschênes, Louise

    2010-03-01

    Field samples were collected around six pentachlorophenol (PCP)-treated wooden poles (in clay, organic soil, and sand) to evaluate the vertical migration of polychlorodibenzo-p-dioxins and furans (PCDD/Fs). Soils were characterized, and PCDD/Fs, C10-C50, and PCP were analyzed in seven composite samples taken at depths from 0 to 100 cm and at distances from 0 to 50 cm from each pole. Concentrations of PCDD/Fs were highest in organic soils (maximum 1.2E+05 pg toxic equivalent (TEQ)/g soil), followed by clay (maximum 3.8E+04 pg TEQ/g soil) and sand (maximum 1.8E+04 pg TEQ/g soil). Model predictions, including the influence of wood treatment oil, were validated against measured concentrations in soils around the poles. The model predicts a migration of PCDD/Fs driven by the migration of oil that differs by soil type: in clay, 90% of PCDD/Fs are predicted to remain in the first 29 cm, whereas in sand, 80 to 90% of the emitted PCDD/Fs are predicted to migrate deeper than 185 cm. For the organic soil, the predicted migration depth varies from 90 to 155 cm. This screening model allows evaluation of the risk posed by microcontaminated sites around PCP-treated wooden poles: from a risk assessment perspective, in the case of organic soil and clay, no PCDD/F contamination is to be expected below the pole, but high levels of PCDD/Fs can be found in the first 2 m below the surface. For sand, however, significantly lower levels of PCDD/Fs were predicted in the surface soil, while the migration depth remains large, posing an inherent danger of aquifer contamination under the pole.

  1. Last stand of single small field inflation

    NASA Astrophysics Data System (ADS)

    Bramante, Joseph; Lehman, Landon; Martin, Adam; Downes, Sean

    2014-07-01

    By incorporating both the tensor-to-scalar ratio and the measured value of the spectral index, we set a bound on solo small field inflation of Δϕ/m_Pl ≥ 1.00 √(r/0.1). Unlike previous bounds, which require monotonic ε_V, |η_V| < 1, and 60 e-folds of inflation, this bound remains valid for nonmonotonic ε_V, |η_V| ≳ 1, and for inflation which occurs only over the eight e-folds that have been observed on the cosmic microwave background. The negative tilt of the spectral index over the observed eight e-folds is what makes the bound strong; we illustrate this by surveying single field models and finding that for r ≳ 0.1 and eight e-folds of inflation, there is no simple potential which reproduces the observed cosmic microwave background perturbations and remains sub-Planckian. Models that are sub-Planckian after eight e-folds must be patched together with a second epoch of inflation that fills out the remaining ~50 e-folds. This second, post-cosmic-microwave-background epoch is characterized by extremely small ε_V and therefore an increasing scalar power spectrum. Using the fact that large power can overabundantly produce primordial black holes, we bound the maximum energy scale of the second phase of inflation.
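
    The scaling of the bound can be recovered from the standard slow-roll relation between the field excursion and the tensor-to-scalar ratio; the following is a schematic Lyth-type estimate, not the paper's exact derivation:

        \frac{d\phi}{dN} \simeq \frac{m_{\rm Pl}}{\sqrt{8}}\,\sqrt{r(N)}
        \quad\Longrightarrow\quad
        \frac{\Delta\phi}{m_{\rm Pl}} \gtrsim \frac{N_{\rm obs}}{\sqrt{8}}\,\sqrt{r_{\rm min}}
        \approx \sqrt{\frac{r}{0.1}} \quad \text{for } N_{\rm obs} \approx 8,

    which shows why even eight observed e-folds with r near 0.1 force a Planckian field excursion.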

  2. Development of a scale to measure patients' trust in health insurers.

    PubMed

    Zheng, Beiyao; Hall, Mark A; Dugan, Elizabeth; Kidd, Kristin E; Levine, Douglas

    2002-02-01

    To develop a scale measuring patients' trust in health insurers, including public and private insurers and both indemnity and managed care plans. A scale was developed based on our conceptual model of insurer trust and analyzed for its factor structure, internal consistency, construct validity, and other psychometric properties. The scale was developed and validated on a random national sample (n = 410) of subjects with any type of insurance and further validated and used in a regional random sample of members of an HMO in North Carolina (n = 1152). Factor analysis was used to uncover the underlying dimensions of the scale. Internal consistency was assessed by Cronbach's alpha. Construct validity was established by Pearson or Spearman correlations and t tests. Data were collected via telephone interviews. The 11-item scale has good internal consistency (alpha = 0.92/0.89) and response variability (range = 11-55, M = 36.5/37.0, SD = 7.8/7.0). Insurer trust is a unidimensional construct and is related to trust in physicians, satisfaction with care and with the insurer, having enough choice in selecting a health insurer, no prior disputes with the health insurer, type of insurer, and desire to remain with the insurer. Trust in health insurers can be validly and reliably measured. Additional studies are required to learn more about what factors affect insurer trust and whether differences and changes in insurer trust affect actual behaviors and other outcomes of interest.
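
    For reference, the internal-consistency statistic quoted above (Cronbach's alpha) is computed from the item and total-score variances; a minimal sketch on synthetic Likert-style responses (random placeholders, not the survey data):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_var / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(410, 1))                     # shared trust factor
        responses = latent + 0.8 * rng.normal(size=(410, 11))  # 11 items
        print(f"alpha = {cronbach_alpha(responses):.2f}")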

  3. Value-based decision making via sequential sampling with hierarchical competition and attentional modulation

    PubMed Central

    2017-01-01

    In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes “winner-take-all” processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans’ value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light. PMID:29077746

  4. Value-based decision making via sequential sampling with hierarchical competition and attentional modulation.

    PubMed

    Colas, Jaron T

    2017-01-01

    In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes "winner-take-all" processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans' value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light.
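
    To make the sequential-sampling idea concrete, here is a minimal race between two noisy accumulators with mutual inhibition; the parameters and inhibition scheme are illustrative stand-ins, not the six-parameter model fitted in the paper:

        import numpy as np

        def race_trial(v_left, v_right, inhibition=0.2, noise=0.1,
                       threshold=1.0, dt=0.001, max_t=5.0, rng=None):
            """Two accumulators race to threshold; each is driven by its option's
            value and suppressed by the other (mutual inhibition)."""
            rng = rng or np.random.default_rng()
            a = np.zeros(2)
            drift = np.array([v_left, v_right])
            t = 0.0
            while t < max_t:
                inhib = inhibition * a[::-1]   # each unit inhibits the other
                a += (drift - inhib) * dt + noise * np.sqrt(dt) * rng.normal(size=2)
                a = np.maximum(a, 0.0)         # activations stay non-negative
                t += dt
                if a.max() >= threshold:
                    return int(a.argmax()), t  # choice (0 = left) and reaction time
            return None, max_t                 # no decision within the deadline

        rng = np.random.default_rng(1)
        trials = [race_trial(0.6, 0.4, rng=rng) for _ in range(200)]
        p_left = sum(1 for c, _ in trials if c == 0) / len(trials)
        print(f"P(choose higher-valued option) = {p_left:.2f}")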

  5. A Critical Evaluation of the Validity and the Reliability of Global Competency Constructs for Supervisor Assessment of Junior Medical Trainees

    ERIC Educational Resources Information Center

    McGill, D. A.; van der Vleuten, C. P. M.; Clarke, M. J.

    2013-01-01

    Supervisor assessments are critical for both formative and summative assessment in the workplace. Supervisor ratings remain an important source of such assessment in many educational jurisdictions even though there is ambiguity about their validity and reliability. The aims of this evaluation are to explore the: (1) construct validity of ward-based…

  6. Variability and validity of intimate partner violence reporting by couples in Tanzania.

    PubMed

    Halim, Nafisa; Steven, Ester; Reich, Naomi; Badi, Lilian; Messersmith, Lisa

    2018-01-01

    In recent years, major global institutions have amplified their efforts to address intimate partner violence (IPV) against women-a global health and human rights violation affecting 15-71% of reproductive-aged women over their lifetimes. Still, some scholars remain concerned about the validity of instruments used for IPV assessment in population-based studies. In this paper, we conducted two validation analyses using novel data from 450 women-men dyads across nine villages in Northern Tanzania. First, we examined the level of inter-partner agreement in reporting of men's physical, sexual, emotional and economic IPV against women in the last three and twelve months prior to the survey, ever in the relationship, and during pregnancy. Second, we conducted a convergent validity analysis to compare the relative efficacy of men's self-reports of perpetration and women's of victimization as a valid indicator of IPV against Tanzanian women, using logistic regression models with village-level clustered errors. We found that, for every violence type across the recall periods of the last three months, the last twelve months and ever in the relationship, at least one in three couples disagreed about IPV occurrences in the relationship. Couples' agreement about physical, sexual and economic IPV during pregnancy was high, with 86-93% of couples reporting concordantly. Also, men's self-reported perpetration had statistically significant associations with at least as many validated risk factors as had women's self-reported victimization. This finding suggests that men's self-reports are at least as valid as women's as an indicator of IPV against women in Northern Tanzania. We recommend that more validation studies be conducted in low-income countries and that data on relationship factors affecting IPV reports and reporting be made available along with data on IPV occurrences.
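
    Inter-partner agreement of the kind reported here is commonly summarized by percent agreement and Cohen's kappa; a small sketch on simulated dyad reports (the concordance rate is invented for illustration):

        import numpy as np

        def cohens_kappa(a, b):
            """Chance-corrected agreement between two binary report vectors."""
            a, b = np.asarray(a), np.asarray(b)
            p_obs = np.mean(a == b)
            p_exp = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
            return (p_obs - p_exp) / (1 - p_exp)

        rng = np.random.default_rng(2)
        women = rng.integers(0, 2, size=450)                     # victimization reports
        men = np.where(rng.random(450) < 0.7, women, 1 - women)  # ~70% concordant
        print(f"agreement = {np.mean(women == men):.2f}, kappa = {cohens_kappa(women, men):.2f}")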

  7. Technical skills assessment toolbox: a review using the unitary framework of validity.

    PubMed

    Ghaderi, Iman; Manji, Farouq; Park, Yoon Soo; Juul, Dorthea; Ott, Michael; Harris, Ilene; Farrell, Timothy M

    2015-02-01

    The purpose of this study was to create a technical skills assessment toolbox for the 35 basic and advanced skills/procedures that comprise the American College of Surgeons (ACS)/Association of Program Directors in Surgery (APDS) surgical skills curriculum and to provide a critical appraisal of the included tools using the contemporary framework of validity. Competency-based training has become the predominant model in surgical education, and assessment of performance is an essential component. Assessment methods must produce valid results to accurately determine the level of competency. A search was performed, using PubMed and Google Scholar, to identify tools that have been developed for assessment of the targeted technical skills. A total of 23 assessment tools for the 35 ACS/APDS skills modules were identified. Some tools, such as the Objective Structured Assessment of Technical Skill (OSATS) and the Operative Performance Rating System (OPRS), have been tested in more than one procedure. Therefore, 30 modules had at least one assessment tool, with some common surgical procedures being addressed by several tools. Five modules had none. Only three studies used Messick's framework to design their validity studies. The remaining studies used an outdated framework based on "types of validity." When analyzed using the contemporary framework, few of these studies demonstrated validity evidence for content, internal structure, and relationships to other variables. This study provides an assessment toolbox for common surgical skills/procedures. Our review shows that few authors have used the contemporary unitary concept of validity in developing their assessment tools. As we progress toward competency-based training, future studies should provide evidence for various sources of validity using the contemporary framework.

  8. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the discretion of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we constructed an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality on a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods on the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting model is multiply validated, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.

  9. Modeling Storm-Induced Inundation on the Yukon-Kuskokwim Delta for Present and Future Climates

    NASA Astrophysics Data System (ADS)

    Ravens, T. M.; Allen, J.

    2012-12-01

    The Yukon-Kuskokwim (YK) Delta is a large delta on the west coast of Alaska and one of the few remaining deltas that is largely free of anthropogenic impacts. The delta hosts a wide range of nesting birds including the endangered Spectacled Eider. The delta plain, with an elevation of about 2 m (m.s.l.) and an average tidal range of 2.7 m, is subject to frequent inundation by storm surges originating from the adjacent Bering Sea. Here, we report on our efforts to validate a storm-surge modeling system consisting of a coarse-grid ADCIRC model covering the Bering and Chukchi Seas and a fine-grid Delft3D model of the southern YK Delta. The storm surge models are validated against measured water levels from 2007-2010 and satellite observations of inundation from large storms in 2005 and 2006. About 10 storms over the past 30 years were modeled. Based on model output, we computed a spatially distributed inundation index, a time-integral of water level throughout the fine-grid model domain, for individual storms and for the 30-year period. To examine the change in inundation in future climates, the models of the 30-year period were re-run assuming 1 m and 2 m of sea level rise. The impact of climate change on inundation frequency and intensity, as measured by the inundation index, is reported. Future work will relate the present and projected inundation index to ecological parameters such as bird-nest concentration and vegetation type.
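
    The inundation index described above is a time integral of water level; a minimal version for a single grid cell, integrating water depth above the ground surface with the trapezoid rule (the surge hydrograph below is synthetic):

        import numpy as np

        def inundation_index(t_hr, eta_m, ground_m=2.0):
            """Time-integral of water depth above ground elevation (m*hr)."""
            t = np.asarray(t_hr, dtype=float)
            depth = np.maximum(np.asarray(eta_m, dtype=float) - ground_m, 0.0)
            return float(np.sum(0.5 * (depth[1:] + depth[:-1]) * np.diff(t)))

        t = np.linspace(0.0, 48.0, 97)                       # 48-hour storm, 30-min steps
        eta = 1.0 + 2.5 * np.exp(-((t - 24.0) / 8.0) ** 2)   # synthetic surge (m, m.s.l.)
        print(f"inundation index = {inundation_index(t, eta):.2f} m*hr")

    Summing this quantity over storms, cell by cell, gives the spatially distributed index used in the study.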

  10. Standard International prognostic index remains a valid predictor of outcome for patients with aggressive CD20+ B-cell lymphoma in the rituximab era.

    PubMed

    Ziepert, Marita; Hasenclever, Dirk; Kuhnt, Evelyn; Glass, Bertram; Schmitz, Norbert; Pfreundschuh, Michael; Loeffler, Markus

    2010-05-10

    The International Prognostic Index (IPI) is widely used for risk stratification of patients with aggressive B-cell lymphoma. The introduction of rituximab has markedly improved outcomes, and R-CHOP (rituximab plus cyclophosphamide, doxorubicin, vincristine, prednisone) has become the standard treatment for CD20+ diffuse large B-cell lymphoma. To investigate whether the IPI has maintained its power for risk stratification when rituximab is combined with CHOP, we analyzed the prognostic relevance of the IPI in three prospective clinical trials. In total, 1,062 patients treated with rituximab were included (MabThera International Trial [MInT], 380 patients; dose-escalated cyclophosphamide, doxorubicin, vincristine, etoposide, and prednisone [MegaCHOEP] trial, 72 patients; CHOP plus rituximab for patients older than age 60 years [RICOVER-60] trial, 610 patients). Multivariate proportional hazards modeling was performed for single IPI factors under rituximab on event-free, progression-free, and overall survival. IPI score was significant for all three end points. Rituximab significantly improved treatment outcome within each IPI group, resulting in a quenching of the Kaplan-Meier estimators. However, the IPI remained a significant prognostic factor for all three end points, and the ordering of the IPI groups remained valid. The relative risk estimates of single IPI factors and their order in patients treated with R-CHOP were similar to those found with CHOP. The effects of rituximab were superimposed on the effects of CHOP, with no interactions between chemotherapy and antibody therapy. These results demonstrate that the IPI is still valid in the R-CHOP era.
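
    The proportional hazards analysis described here can be sketched with the lifelines package; the data frame below is synthetic, with hazard rates chosen only to mimic the qualitative pattern (risk rising with IPI score, falling with rituximab):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 500
        df = pd.DataFrame({
            "ipi_score": rng.integers(0, 6, n),   # number of IPI risk factors
            "rituximab": rng.integers(0, 2, n),   # 0 = CHOP, 1 = R-CHOP
        })
        # Synthetic event times: hazard rises with IPI, falls with rituximab.
        hazard = 0.05 * np.exp(0.4 * df["ipi_score"] - 0.6 * df["rituximab"])
        df["time"] = rng.exponential(1.0 / hazard)
        df["event"] = (df["time"] < 5.0).astype(int)  # administrative censoring at 5 years
        df["time"] = df["time"].clip(upper=5.0)

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        cph.print_summary()   # hazard ratios for IPI and rituximab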

  11. Estimating energy expenditure from heart rate in older adults: a case for calibration.

    PubMed

    Schrack, Jennifer A; Zipunnikov, Vadim; Goldsmith, Jeff; Bandeen-Roche, Karen; Crainiceanu, Ciprian M; Ferrucci, Luigi

    2014-01-01

    Accurate measurement of free-living energy expenditure is vital to understanding changes in energy metabolism with aging. The efficacy of heart rate as a surrogate for energy expenditure rests on the assumption of a linear function between heart rate and energy expenditure, but the validity and reliability of this assumption in older adults remain unclear. We assessed the validity and reliability of the linear function between heart rate and energy expenditure in older adults using different levels of calibration. Heart rate and energy expenditure were assessed across five levels of exertion in 290 adults participating in the Baltimore Longitudinal Study of Aging. Correlation and random effects regression analyses assessed the linearity of the relationship between heart rate and energy expenditure, and cross-validation models assessed predictive performance. Heart rate and energy expenditure were highly correlated (r = 0.98) and linear regardless of age or sex. Intra-person variability was low but inter-person variability was high, with substantial heterogeneity of the random intercept (s.d. = 0.372) despite similar slopes. Cross-validation models indicated that individual calibration data substantially improve the accuracy of energy expenditure predicted from heart rate, reducing the potential for considerable measurement bias. Although using five calibration measures provided the greatest reduction in the standard deviation of prediction errors (1.08 kcal/min), substantial improvement was also noted with two (0.75 kcal/min). These findings indicate that standard regression equations may be used to make population-level inferences when estimating energy expenditure from heart rate in older adults, but caution should be exercised when making inferences at the individual level without proper calibration.
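
    A minimal illustration of why individual calibration matters: fit one pooled regression line for everyone, then re-estimate only the intercept per person, and compare prediction errors (all numbers are synthetic placeholders):

        import numpy as np

        rng = np.random.default_rng(4)
        n_people, n_obs = 50, 5
        hr = rng.uniform(60, 140, size=(n_people, n_obs))        # heart rate (bpm)
        intercepts = rng.normal(-4.0, 1.5, size=(n_people, 1))   # person-specific offsets
        ee = intercepts + 0.08 * hr + rng.normal(0.0, 0.3, (n_people, n_obs))  # kcal/min

        # Pooled model: one line for everyone.
        slope, intercept = np.polyfit(hr.ravel(), ee.ravel(), 1)
        pooled_sd = (ee - (slope * hr + intercept)).std()

        # Calibrated model: shared slope, intercept re-fit per person.
        person_icpt = (ee - slope * hr).mean(axis=1, keepdims=True)
        calibrated_sd = (ee - (slope * hr + person_icpt)).std()

        print(f"pooled SD of errors     = {pooled_sd:.2f} kcal/min")
        print(f"calibrated SD of errors = {calibrated_sd:.2f} kcal/min")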

  12. Incorporating Water Boiling in the Numerical Modelling of Thermal Remediation by Electrical Resistance Heating

    NASA Astrophysics Data System (ADS)

    Molnar, I. L.; Krol, M.; Mumford, K. G.

    2017-12-01

    Developing numerical models for subsurface thermal remediation techniques such as Electrical Resistance Heating (ERH) that include multiphase processes such as in-situ water boiling, gas production and gas recovery has remained a significant challenge. These subsurface gas generation and recovery processes are driven by physical phenomena such as discrete and unstable gas (bubble) flow, as well as water-gas phase mass transfer during bubble flow. Traditional approaches to modeling multiphase flow in soil remain unable to accurately describe these phenomena. However, it has been demonstrated that Macroscopic Invasion Percolation (MIP) can successfully simulate discrete and unstable gas transport [1]. This has led to the development of a coupled Electro-Thermal MIP model (ET-MIP) [2] capable of simulating multiple key processes in thermal remediation and gas recovery, including: electrical heating of soil and groundwater, water flow, geological heterogeneity, heating-induced buoyant flow, water boiling, gas bubble generation and mobilization, contaminant mass transport and removal, and additional mechanisms such as bubble collapse in cooler regions. This study presents the first rigorous validation of a coupled ET-MIP model against two-dimensional water boiling and water/NAPL co-boiling experiments [3]. Once validated, the model was used to explore the impact of boiling and co-boiling events, and the subsequent gas generation and mobilization, on ERH's ability to 1) generate, expand and mobilize gas at boiling and NAPL co-boiling temperatures, and 2) efficiently strip contaminants from soil during both boiling and co-boiling. In addition, the energy losses arising from steam generation during subsurface water boiling were quantified with respect to their impact on the efficacy of thermal remediation. While this study specifically targets ERH, its focus on the fundamental mechanisms driving thermal remediation (e.g., water boiling) renders the results applicable to a wide range of thermal and gas-based remediation techniques. [1] Mumford, K. G., et al., Adv. Water Resour. 2010, 33(4), 504-513. [2] Krol, M. M., et al., Adv. Water Resour. 2011, 34(4), 537-549. [3] Hegele, P. R., and Mumford, K. G., J. Contam. Hydrol. 2014, 165, 24-36.
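
    The invasion rule at the core of MIP can be illustrated in a few lines: gas repeatedly invades the neighboring cell with the lowest capillary entry pressure. This toy version omits the macroscopic thresholds, buoyancy, and thermal coupling of the actual ET-MIP model:

        import heapq
        import numpy as np

        def invasion_percolation(entry_pressure, start, target_fraction=0.5):
            """Grow a connected gas cluster by always invading the frontier cell
            with the lowest entry pressure (invasion percolation)."""
            nx, ny = entry_pressure.shape
            invaded = np.zeros((nx, ny), dtype=bool)
            frontier = [(entry_pressure[start], start)]
            n_target = int(target_fraction * nx * ny)
            while frontier and invaded.sum() < n_target:
                p, (i, j) = heapq.heappop(frontier)
                if invaded[i, j]:
                    continue
                invaded[i, j] = True
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < nx and 0 <= nj < ny and not invaded[ni, nj]:
                        heapq.heappush(frontier, (entry_pressure[ni, nj], (ni, nj)))
            return invaded

        rng = np.random.default_rng(5)
        field = rng.random((40, 40))     # random capillary entry pressures
        cluster = invasion_percolation(field, (20, 20))
        print(f"invaded fraction = {cluster.mean():.2f}")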

  13. Validating modelled variable surface saturation in the riparian zone with thermal infrared images

    NASA Astrophysics Data System (ADS)

    Glaser, Barbara; Klaus, Julian; Frei, Sven; Frentress, Jay; Pfister, Laurent; Hopp, Luisa

    2015-04-01

    Variable contributing areas and hydrological connectivity have become prominent concepts for hydrologic process understanding in recent years. The dynamic connectivity within the hillslope-riparian-stream (HRS) system is known to exert first-order control on discharge generation, and the riparian zone in particular functions as a runoff-buffering or runoff-producing zone. However, despite their importance, the highly dynamic processes of contraction and extension of saturation within the riparian zone, and their impact on runoff generation, are still not fully understood. In this study, we analysed the potential of a distributed, fully coupled and physically based model (HydroGeoSphere) to represent the spatial and temporal water flux dynamics of a forested headwater HRS system (6 ha) in western Luxembourg. The model was set up and parameterised using experimentally derived knowledge of catchment structure and was run for a period of four years (October 2010 to August 2014). For model evaluation, we focused in particular on the temporally varying spatial patterns of surface saturation. We used ground-based thermal infrared (TIR) imagery to map surface saturation with high spatial and temporal resolution and collected 20 panoramic snapshots of the riparian zone (ca. 10 by 20 m) under different hydrologic conditions. These TIR panoramas were used, in addition to several classical discharge and soil moisture time series, for a spatially distributed model validation. In a manual calibration process we optimised model parameters (e.g. porosity, saturated hydraulic conductivity, evaporation depth) to achieve better agreement between observed and modelled discharge and soil moisture. The subsequent validation of surface saturation patterns, by visual comparison of processed TIR panoramas and corresponding model output panoramas, revealed overall good agreement for all but one region, which was always too dry in the model. Moreover, quantitative comparisons of modelled and observed saturated pixel percentages, and of their relationships to concurrent discharges, revealed remarkable similarities. During the calibration process we observed that surface saturation patterns were mostly affected by changing the soil properties of the topsoil in the riparian zone, whereas the discharge behaviour did not change substantially at the same time. This effect, of varying spatial patterns occurring alongside a nearly unchanged integrated response, demonstrates the importance of spatially distributed validation data. Our study clearly benefited from using different kinds of data - spatially integrated and distributed, temporally continuous and discrete - for the model evaluation procedure.

  14. 24 CFR 597.402 - Validation of designation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... URBAN DEVELOPMENT COMMUNITY FACILITIES URBAN EMPOWERMENT ZONES AND ENTERPRISE COMMUNITIES: ROUND ONE... eligibility for and the validity of the designation of any Empowerment Zone or Enterprise Community. Determinations of whether any designated Empowerment Zone or Enterprise Community remains in good standing shall...

  15. Asymptotically optimum multialternative sequential procedures for discernment of processes minimizing average length of observations

    NASA Astrophysics Data System (ADS)

    Fishman, M. M.

    1985-01-01

    The problem of multialternative sequential discernment of processes is formulated in terms of conditionally optimum procedures minimizing the average length of observations, without any probabilistic assumptions about which process is occurring, rather than in terms of Bayes procedures minimizing the average risk. The problem is to find the procedure that turns the defining inequalities into equalities. The problem is formulated for various models of signal observation and data processing: (1) discernment of signals from background interference by a multichannel system; (2) discernment of pulse sequences with unknown time delay; (3) discernment of harmonic signals with unknown frequency. An asymptotically optimum sequential procedure is constructed which compares the statistics of the likelihood ratio with the mean-weighted likelihood ratio and estimates the upper bound for the conditional average lengths of observations. This procedure is shown to remain valid as the upper bound for the probability of erroneous partial solutions decreases toward zero and the number of hypotheses increases toward infinity. It also remains valid under certain special constraints on the probability, such as a threshold. A comparison with a fixed-length procedure reveals that this sequential procedure reduces the average length of observations to one quarter when the probability of erroneous partial solutions is low.
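
    The binary ancestor of such procedures is Wald's sequential probability ratio test, which keeps observing until an accumulated log-likelihood ratio crosses a boundary; the multialternative, mean-weighted procedure in the paper generalizes this idea. A minimal sketch with the standard Wald boundary approximations:

        import math
        import random

        def sprt(sample, pdf0, pdf1, alpha=0.01, beta=0.01):
            """Wald SPRT: accumulate log-likelihood ratios until a boundary is hit.
            Returns the decision and the number of observations used."""
            upper = math.log((1 - beta) / alpha)   # decide H1 above this
            lower = math.log(beta / (1 - alpha))   # decide H0 below this
            llr, n = 0.0, 0
            while lower < llr < upper:
                x = sample()
                llr += math.log(pdf1(x) / pdf0(x))
                n += 1
            return ("H1" if llr >= upper else "H0"), n

        def gauss_pdf(x, mu):
            return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

        random.seed(6)   # discern N(0,1) from N(1,1) when H1 is true
        decision, n = sprt(lambda: random.gauss(1.0, 1.0),
                           lambda x: gauss_pdf(x, 0.0),
                           lambda x: gauss_pdf(x, 1.0))
        print(f"decision = {decision} after {n} observations")

    On average the sequential test stops far sooner than a fixed-length test with the same error probabilities, which is the economy the abstract quantifies.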

  16. Unremarked or Unperformed? Systematic Review on Reporting of Validation Efforts of Health Economic Decision Models in Seasonal Influenza and Early Breast Cancer.

    PubMed

    de Boer, Pieter T; Frederix, Geert W J; Feenstra, Talitha L; Vemer, Pepijn

    2016-09-01

    Transparent reporting of validation efforts for health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer to gain insight into the reporting of model validation efforts in the health economic literature at large. A literature search was performed in PubMed and Embase to retrieve health economic modelling studies published between 2008 and 2014. Reporting on model validation was evaluated by checking for the word 'validation' and by using AdViSHE (Assessment of the Validation Status of Health Economic decision models), a tool containing a structured list of relevant items for validation. Additionally, we contacted corresponding authors to ask whether validation efforts had been performed beyond those reported in the manuscripts. A total of 53 studies on seasonal influenza and 41 studies on early breast cancer were included in our review. The word 'validation' was used in 16 studies (30%) on seasonal influenza and 23 studies (56%) on early breast cancer; however, in a minority of studies this referred to a model validation technique. Fifty-seven percent of seasonal influenza studies and 71% of early breast cancer studies reported one or more validation techniques. Cross-validation of study outcomes was found most often. A limited number of studies reported on model validation efforts, although good examples were identified. Author comments indicated that more validation techniques were performed than were reported in the manuscripts. Although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers' confidence in health economic models and their outcomes.

  17. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling

    PubMed Central

    Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W. F.; Jeelani, Owase; Dunaway, David J.; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction underestimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face. PMID:29742139

  18. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    PubMed

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction underestimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.

  19. Developing and validating a measure of community capacity: Why volunteers make the best neighbours.

    PubMed

    Lovell, Sarah A; Gray, Andrew R; Boucher, Sara E

    2015-05-01

    Social support and community connectedness are key determinants of both mental and physical wellbeing. While social capital has been used to indicate the instrumental value of these social relationships, its broad and often competing definitions have hindered practical applications of the concept. Within the health promotion field, the related concept of community capacity, the ability of a group to identify and act on problems, has gained prominence (Labonte and Laverack, 2001). The goal of this study was to develop and validate a scale measuring community capacity, including exploring its associations with socio-demographic and civic behaviour variables, among the residents of four small (populations 1500-2000) high-deprivation towns in southern New Zealand. The full (41-item) scale was found to have strong internal consistency (Cronbach's alpha = 0.89), but a process of reducing the scale resulted in a shorter 26-item instrument with similar internal consistency (alpha = 0.88). Subscales of the reduced instrument displayed at least marginally acceptable levels of internal consistency (0.62-0.77). Using linear regression models, differences in community capacity scores were found for selected criteria, namely time spent living in the location, local voting, and volunteering behaviour, although the first of these was no longer statistically significant in an adjusted model with potential confounders including age, sex, ethnicity, education, marital status, employment, household income, and religious beliefs. This provides support for the scale's concurrent validity. Differences were present between the four towns in unadjusted models and remained statistically significant in adjusted models (including the variables mentioned above), suggesting, crucially, that even when such factors are accounted for, perceptions of one's community may still depend on place. Copyright © 2014. Published by Elsevier Ltd.

  20. [Maslach Burnout Inventory - Student Survey: Portugal-Brazil cross-cultural adaptation].

    PubMed

    Campos, Juliana Alvares Duarte Bonini; Maroco, João

    2012-10-01

    To perform a cross-cultural adaptation of the Portuguese version of the Maslach Burnout Inventory for students (MBI-SS) and investigate its reliability, validity and cross-cultural invariance. The face validity assessment involved the participation of a multidisciplinary team. Content validity was assessed. The Portuguese version was completed in 2009, on the internet, by 958 Brazilian and 556 Portuguese university students from urban areas. Confirmatory factor analysis was carried out using as fit indices the χ²/df ratio, the Comparative Fit Index (CFI), the Goodness of Fit Index (GFI) and the Root Mean Square Error of Approximation (RMSEA). To verify the stability of the factor solution according to the original English version, cross-validation was performed in 2/3 of the total sample and replicated in the remaining 1/3. Convergent validity was estimated by the average variance extracted and composite reliability. Discriminant validity was assessed, and internal consistency was estimated by Cronbach's alpha coefficient. Concurrent validity was estimated by correlational analysis of the mean scores of the Portuguese version and the Copenhagen Burnout Inventory, and divergent validity was assessed against the Beck Depression Inventory. The invariance of the model between the Brazilian and the Portuguese samples was assessed. The three-factor model of Exhaustion, Disengagement and Efficacy showed good fit (χ²/df = 8.498, CFI = 0.916, GFI = 0.902, RMSEA = 0.086). The factor structure was stable (λ: Δχ² = 11.383, p = 0.50; Cov: Δχ² = 6.479, p = 0.372; Residuals: Δχ² = 21.514, p = 0.121). Adequate convergent validity (AVE = 0.45-0.64, CR = 0.82-0.88), discriminant validity (ρ² = 0.06-0.33) and internal consistency (α = 0.83-0.88) were observed. The concurrent validity of the Portuguese version with the Copenhagen Inventory was adequate (r = 0.21-0.74). The assessment of divergent validity was impaired by the conceptual proximity between the Exhaustion and Disengagement dimensions of the Portuguese version and the Beck Depression Inventory. Invariance of the instrument between the Brazilian and Portuguese samples was not observed (λ: Δχ² = 84.768, p < 0.001; Cov: Δχ² = 129.206, p < 0.001; Residuals: Δχ² = 518.760, p < 0.001). The Portuguese version of the Maslach Burnout Inventory for students showed adequate reliability and validity, but its factor structure was not invariant between the countries, indicating the absence of cross-cultural stability.

  1. A multilevel analysis of gatekeeper characteristics and consistent condom use among establishment-based female sex workers in Guangxi, China.

    PubMed

    Li, Qing; Li, Xiaoming; Stanton, Bonita; Fang, Xiaoyi; Zhao, Ran

    2010-11-01

    Multilevel analytical techniques are being applied in condom use research to ensure the validity of investigations of environmental/structural influences and of clustered data from venue-based sampling. The literature contains reports of consistent associations between perceived gatekeeper support and condom use among entertainment establishment-based female sex workers (FSWs) in Guangxi, China. However, the clustering inherent in the data (FSWs being clustered within establishments) has not been accounted for in most of the analyses. We used multilevel analyses to examine perceived features of gatekeepers and individual correlates of consistent condom use among FSWs and to validate the findings in the existing literature. We analyzed cross-sectional data from 318 FSWs from 29 entertainment establishments in Guangxi, China in 2004, with a minimum of 5 FSWs per establishment. The Hierarchical Linear Models program with Laplace estimation was used to estimate the parameters in models containing random effects and binary outcomes. About 11.6% of women reported consistent condom use with clients. The intraclass correlation coefficient indicated that 18.5% of the variance in condom use could be attributed to the similarity between FSWs within the same establishments. Women's perceived gatekeeper support and education remained positively associated with condom use (P < 0.05) after controlling for other individual characteristics and clustering. After adjusting for data clustering, perceived gatekeeper support remains associated with consistent condom use with clients among FSWs in China. The results imply that combined interventions targeting both gatekeepers and individual FSWs may effectively promote consistent condom use.
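
    For a random-intercept logistic model of this kind, the intraclass correlation is usually computed on the latent scale with the level-1 variance fixed at π²/3; the between-establishment variance below is back-solved for illustration so that the formula roughly reproduces the reported 18.5%:

        import math

        def latent_icc(var_between):
            """ICC for a random-intercept logistic model on the latent scale."""
            var_within = math.pi ** 2 / 3.0   # fixed logistic level-1 variance
            return var_between / (var_between + var_within)

        print(f"ICC = {latent_icc(0.75):.3f}")  # ~0.186, close to the reported 18.5%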

  2. A Multilevel Analysis of Gatekeeper Characteristics and Consistent Condom Use Among Establishment-Based Female Sex Workers in Guangxi, China

    PubMed Central

    Li, Qing; Li, Xiaoming; Stanton, Bonita; Fang, Xiaoyi; Zhao, Ran

    2010-01-01

    Background Multilevel analytical techniques are being applied in condom use research to ensure the validity of investigations of environmental/structural influences and of clustered data from venue-based sampling. The literature contains reports of consistent associations between perceived gatekeeper support and condom use among entertainment establishment-based female sex workers (FSWs) in Guangxi, China. However, the clustering inherent in the data (FSWs being clustered within establishments) has not been accounted for in most of the analyses. We used multilevel analyses to examine perceived features of gatekeepers and individual correlates of consistent condom use among FSWs and to validate the findings in the existing literature. Methods We analyzed cross-sectional data from 318 FSWs from 29 entertainment establishments in Guangxi, China in 2004, with a minimum of 5 FSWs per establishment. The Hierarchical Linear Models program with Laplace estimation was used to estimate the parameters in models containing random effects and binary outcomes. Results About 11.6% of women reported consistent condom use with clients. The intraclass correlation coefficient indicated that 18.5% of the variance in condom use could be attributed to the similarity between FSWs within the same establishments. Women's perceived gatekeeper support and education remained positively associated with condom use (P < 0.05) after controlling for other individual characteristics and clustering. Conclusions After adjusting for data clustering, perceived gatekeeper support remains associated with consistent condom use with clients among FSWs in China. The results imply that combined interventions targeting both gatekeepers and individual FSWs may effectively promote consistent condom use. PMID:20539262

  3. Hormone replacement therapy is associated with gastro-oesophageal reflux disease: a retrospective cohort study.

    PubMed

    Close, Helen; Mason, James M; Wilson, Douglas; Hungin, A Pali S

    2012-05-29

    Oestrogen and progestogen have the potential to influence gastro-intestinal motility; both are key components of hormone replacement therapy (HRT). Results of observational studies in women taking HRT rely on self-reporting of gastro-oesophageal symptoms, and the aetiology of gastro-oesophageal reflux disease (GORD) remains unclear. This study investigated the association between HRT and GORD in menopausal women using validated general practice records. 51,182 menopausal women were identified using the UK General Practice Research Database between 1995 and 2004. Of these, 8,831 were matched with and without hormone use. Odds ratios (ORs) were calculated for GORD and proton-pump inhibitor (PPI) use in hormone and non-hormone users, adjusting for age, co-morbidities, and co-pharmacy. In unadjusted analysis, all forms of hormone use (oestrogen-only, tibolone, combined HRT and progestogen) were statistically significantly associated with GORD. In adjusted models, this association remained statistically significant for oestrogen-only treatment (OR 1.49; 1.18-1.89). Unadjusted analysis showed a statistically significant association between PPI use and oestrogen-only and combined HRT treatment. When adjusted for covariates, oestrogen-only treatment remained significant (OR 1.34; 95% CI 1.03-1.74). The adjusted model also demonstrated greater PPI use among progestogen users (OR 1.50; 1.01-2.22). This first large cohort study of the association between GORD and HRT found statistically significant associations of oestrogen-only hormone therapy with both GORD and PPI use. These findings should be further investigated using prospective follow-up to validate the strength of the associations and describe their clinical significance.

  4. POVERTY, INFANT MORTALITY, AND HOMICIDE RATES IN CROSS-NATIONAL PERSPECTIVE: ASSESSMENTS OF CRITERION AND CONSTRUCT VALIDITY*

    PubMed Central

    Messner, Steven F.; Raffalovich, Lawrence E.; Sutton, Gretchen M.

    2011-01-01

    This paper assesses the extent to which the infant mortality rate might be treated as a “proxy” for poverty in research on cross-national variation in homicide rates. We have assembled a pooled, cross-sectional time-series dataset for 16 advanced nations over the 1993–2000 period that includes standard measures of infant mortality and homicide and also contains information on two commonly used “income-based” poverty measures: a measure intended to reflect “absolute” deprivation and a measure intended to reflect “relative” deprivation. With these data, we are able to assess the criterion validity of the infant mortality rate with reference to the two income-based poverty measures. We are also able to estimate the effects of the various indicators of disadvantage on homicide rates in regression models, thereby assessing construct validity. The results reveal that the infant mortality rate is more strongly correlated with “relative poverty” than with “absolute poverty,” although much unexplained variance remains. In the regression models, the measure of infant mortality and the relative poverty measure yield significant positive effects on homicide rates, while the absolute poverty measure does not exhibit any significant effects. Our analyses suggest that it would be premature to dismiss relative deprivation in cross-national research on homicide, and that disadvantage is best conceptualized and measured as a multidimensional construct. PMID:21643432

  5. Validation of an integrative mathematical model of dehydration and rehydration in virtual humans.

    PubMed

    Pruett, W Andrew; Clemmer, John S; Hester, Robert L

    2016-11-01

    Water homeostasis is one of the body's most critical tasks. Physical challenges to the body, including exercise and surgery, almost always coordinate with some change in water handling reflecting the changing needs of the body. Vasopressin is the most important hormone that contributes to short-term water homeostasis. By manipulating vascular tone and regulating water reabsorption in the collecting duct of the kidneys, vasopressin can mediate the retention or loss of fluids quickly. In this study, we validated HumMod, an integrative mathematical model of human physiology, against six different challenges to water homeostasis with special attention to the secretion of vasopressin and maintenance of electrolyte balance. The studies chosen were performed in normal men and women, and represent a broad spectrum of perturbations. HumMod successfully replicated the experimental results, remaining within 1 standard deviation of the experimental means in 138 of 161 measurements. Only three measurements lay outside of the second standard deviation. Observations were made on serum osmolarity, serum vasopressin concentration, serum sodium concentration, urine osmolarity, serum protein concentration, hematocrit, and cumulative water intake following dehydration. This validation suggests that HumMod can be used to understand water homeostasis under a variety of conditions. © 2016 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adamek, Julian; Daverio, David; Durrer, Ruth

    We present a new N-body code, gevolution, for the evolution of large scale structure in the Universe. Our code is based on a weak field expansion of General Relativity and calculates all six metric degrees of freedom in Poisson gauge. N-body particles are evolved by solving the geodesic equation, which we write in terms of a canonical momentum such that it remains valid also for relativistic particles. We validate the code by considering the Schwarzschild solution and, in the Newtonian limit, by comparing with the Newtonian N-body codes Gadget-2 and RAMSES. We then proceed with a simulation of large scale structure in a Universe with massive neutrinos, where we study the gravitational slip induced by the neutrino shear stress. The code can be extended to include different kinds of dark energy or modified gravity models, going beyond the usually adopted quasi-static approximation. Our code is publicly available.

  7. The Development and Validation of the Social Networking Experiences Questionnaire: A Measure of Adolescent Cyberbullying and Its Impact.

    PubMed

    Dredge, Rebecca; Gleeson, John; Garcia, Xochitl de la Piedad

    2015-01-01

    The measurement of cyberbullying has been marked by several inconsistencies that lead to difficulties in cross-study comparisons of the frequency of occurrence and the impact of cyberbullying. Consequently, the first aim of this study was to develop a measure of experience with, and impact of, cyberbullying victimization in social networking sites among adolescents. The second aim was to investigate the psychometric properties of the purpose-built measure (Social Networking Experiences Questionnaire [SNEQ]). Exploratory factor analysis on 253 adolescent social networking site users produced a six-factor model of impact; however, one factor was removed because of low internal consistency. Cronbach's alpha was higher than .76 for the victimization and remaining five impact subscales. Furthermore, correlation coefficients for the Victimization scale and related dimensions showed good construct validity. The utility of the SNEQ for victim support personnel, research, and cyberbullying education/prevention programs is discussed.

  8. CXCL4 Contributes to the Pathogenesis of Chronic Liver Allograft Dysfunction

    PubMed Central

    Li, Jing; Shi, Yuan; Xie, Ke-Liang; Yin, Hai-Fang; Yan, Lu-nan; Lau, Wan-yee; Wang, Guo-Lin

    2016-01-01

    Chronic liver allograft dysfunction (CLAD) remains the most common cause of patient morbidity and allograft loss in liver transplant patients. However, the pathogenesis of CLAD has not been completely elucidated. By establishing rat CLAD models, in this study, we identified the informative CLAD-associated genes using isobaric tags for relative and absolute quantification (iTRAQ) proteomics analysis and validated these results in recipient rat liver allografts. CXCL4, CXCR3, EGFR, JAK2, STAT3, and Collagen IV were associated with CLAD pathogenesis. We validated that CXCL4 is upstream of these informative genes in the isolated hepatic stellate cells (HSC). Blocking CXCL4 protects against CLAD by reducing liver fibrosis. Therefore, our results indicated that therapeutic approaches that neutralize CXCL4, a newly identified target of fibrosis, may represent a novel strategy for preventing and treating CLAD after liver transplantation. PMID:28053995

  9. CXCL4 Contributes to the Pathogenesis of Chronic Liver Allograft Dysfunction.

    PubMed

    Li, Jing; Liu, Bin; Shi, Yuan; Xie, Ke-Liang; Yin, Hai-Fang; Yan, Lu-Nan; Lau, Wan-Yee; Wang, Guo-Lin

    2016-01-01

    Chronic liver allograft dysfunction (CLAD) remains the most common cause of patient morbidity and allograft loss in liver transplant patients. However, the pathogenesis of CLAD has not been completely elucidated. By establishing rat CLAD models, in this study, we identified the informative CLAD-associated genes using isobaric tags for relative and absolute quantification (iTRAQ) proteomics analysis and validated these results in recipient rat liver allografts. CXCL4, CXCR3, EGFR, JAK2, STAT3, and Collagen IV were associated with CLAD pathogenesis. We validated that CXCL4 is upstream of these informative genes in the isolated hepatic stellate cells (HSC). Blocking CXCL4 protects against CLAD by reducing liver fibrosis. Therefore, our results indicated that therapeutic approaches that neutralize CXCL4, a newly identified target of fibrosis, may represent a novel strategy for preventing and treating CLAD after liver transplantation.

  10. Stable Isotope Ratios as Biomarkers of Diet for Health Research

    PubMed Central

    O’Brien, Diane M.

    2016-01-01

    Diet is a leading modifiable risk factor for chronic disease, but it remains difficult to measure accurately due to the error and bias inherent in self-reported methods of diet assessment. Consequently there is a pressing need for more objective biomarkers of diet for use in health research. The stable isotope ratios of light elements are a promising set of candidate biomarkers because they vary naturally and reproducibly among foods, and those variations are captured in molecules and tissues with high fidelity. Recent studies have identified valid isotopic measures of short- and long-term sugar intake, meat intake, and fish intake in specific populations. These studies provide a strong foundation for validating stable isotopic biomarkers in the general United States population. Approaches to improve specificity for particular foods are needed, for example, by modeling intake using multiple stable isotope ratios, or by isolating and measuring specific molecules linked to foods of interest. PMID:26048703

  11. Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.

    PubMed

    Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel

    2006-02-01

    Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.

  12. The bottom-up approach to integrative validity: a new perspective for program evaluation.

    PubMed

    Chen, Huey T

    2010-08-01

    The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model comprises the concepts of internal and external validity and, within it, grants preeminence to internal validity, as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that over-emphasis on internal validity reduces an evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and is therefore a preferable alternative for evaluating health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate advances in external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  13. Landscape Analysis of Adult Florida Panther Habitat.

    PubMed

    Frakes, Robert A; Belden, Robert C; Wood, Barry E; James, Frederick E

    2015-01-01

    Historically occurring throughout the southeastern United States, the Florida panther is now restricted to less than 5% of its historic range in one breeding population located in southern Florida. Using radio-telemetry data from 87 prime-aged (≥3 years old) adult panthers (35 males and 52 females) during the period 2004 through 2013 (28,720 radio-locations), we analyzed the characteristics of the occupied area and used those attributes in a random forest model to develop a predictive distribution map for resident breeding panthers in southern Florida. Using 10-fold cross validation, the model was 87.5% accurate in predicting presence or absence of panthers in the 16,678 km² study area. Analysis of variable importance indicated that the amount of forests and forest edge, hydrology, and human population density were the most important factors determining presence or absence of panthers. Sensitivity analysis showed that the presence of human populations, roads, and agriculture (other than pasture) had strong negative effects on the probability of panther presence. Forest cover and forest edge had strong positive effects. The median model-predicted probability of presence for panther home ranges was 0.81 (0.82 for females and 0.74 for males). The model identified 5579 km² of suitable breeding habitat remaining in southern Florida; 1399 km² (25%) of this habitat is in non-protected private ownership. Because there is less panther habitat remaining than previously thought, we recommend that all remaining breeding habitat in south Florida should be maintained, and the current panther range should be expanded into south-central Florida. This model should be useful for evaluating the impacts of future development projects, in prioritizing areas for panther conservation, and in evaluating the potential impacts of sea-level rise and changes in hydrology.
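
    A minimal sketch of the workflow this abstract describes: a random forest presence/absence classifier scored by 10-fold cross-validation and inspected for variable importance. The covariates and data below are synthetic stand-ins, not the study's telemetry-derived layers:

    ```python
    # Sketch: random-forest habitat model with 10-fold cross-validation,
    # in the spirit of the panther study (feature names are illustrative).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 1000
    # Hypothetical landscape covariates per grid cell
    X = np.column_stack([
        rng.uniform(0, 1, n),      # forest cover fraction
        rng.uniform(0, 5, n),      # forest edge density (km/km^2)
        rng.uniform(0, 500, n),    # human population density
        rng.uniform(0, 1, n),      # agriculture fraction
    ])
    # Synthetic presence/absence: more forest, fewer people -> presence
    p = 1 / (1 + np.exp(-(3*X[:, 0] + 0.5*X[:, 1] - 0.01*X[:, 2] - 2*X[:, 3])))
    y = rng.binomial(1, p)

    model = RandomForestClassifier(n_estimators=500, random_state=0)
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"10-fold CV accuracy: {acc.mean():.3f}")
    model.fit(X, y)
    print("variable importance:", model.feature_importances_)
    ```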

  14. Content Validity and Psychometric Characteristics of the "Knowledge about Older Patients Quiz" for Nurses Using Item Response Theory.

    PubMed

    Dikken, Jeroen; Hoogerduijn, Jita G; Kruitwagen, Cas; Schuurmans, Marieke J

    2016-11-01

    To assess the content validity and psychometric characteristics of the Knowledge about Older Patients Quiz (KOP-Q), which measures nurses' knowledge regarding older hospitalized adults and their certainty regarding this knowledge. Cross-sectional. Content validity: general hospitals. Psychometric characteristics: nursing school and general hospitals in the Netherlands. Content validity: 12 nurse specialists in geriatrics. Psychometric characteristics: 107 first-year and 78 final-year bachelor of nursing students, 148 registered nurses, and 20 nurse specialists in geriatrics. Content validity: The nurse specialists rated each item of the initial KOP-Q (52 items) on relevance. Ratings were used to calculate Item-Content Validity Index and average Scale-Content Validity Index (S-CVI/ave) scores. Items with insufficient content validity were removed. Psychometric characteristics: Ratings of students, nurses, and nurse specialists were used to test for differential item functioning (DIF) and unidimensionality before item characteristics (discrimination and difficulty) were examined using Item Response Theory. Finally, norm references were calculated and nomological validity was assessed. Content validity: Forty-three items remained after assessing content validity (S-CVI/ave = 0.90). Psychometric characteristics: Of the 43 items, two demonstrating ceiling effects and 11 distorting ability estimates (DIF) were subsequently excluded. Item characteristics were assessed for the remaining 30 items, all of which demonstrated good discrimination and difficulty parameters. Knowledge was positively correlated with certainty about this knowledge. The final 30-item KOP-Q is a valid, psychometrically sound, comprehensive instrument that can be used to assess the knowledge of nursing students, hospital nurses, and nurse specialists in geriatrics regarding older hospitalized adults. It can identify knowledge and certainty deficits for research purposes or serve as a tool in educational or quality improvement programs. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
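
    The content validity indices used here have standard definitions (Polit and Beck): I-CVI is the share of experts rating an item 3 or 4 on a 4-point relevance scale, and S-CVI/ave is the mean of the I-CVIs. A minimal sketch with made-up ratings:

    ```python
    # Sketch of content validity indices as commonly defined:
    # I-CVI = share of experts rating an item 3 or 4 on a 4-point relevance scale;
    # S-CVI/ave = mean of the I-CVIs. Ratings below are invented for illustration.
    import numpy as np

    ratings = np.array([  # rows = items, columns = 12 experts, scale 1-4
        [4, 4, 3, 4, 4, 3, 4, 4, 4, 3, 4, 4],
        [2, 3, 2, 3, 2, 2, 3, 2, 2, 3, 2, 2],
        [4, 3, 4, 4, 3, 4, 4, 4, 3, 4, 4, 3],
    ])
    i_cvi = (ratings >= 3).mean(axis=1)
    s_cvi_ave = i_cvi.mean()
    print("I-CVI per item:", i_cvi)        # low-scoring items are removal candidates
    print(f"S-CVI/ave: {s_cvi_ave:.2f}")   # 0.90 is a commonly cited target
    ```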

  15. Beware the black box: investigating the sensitivity of FEA simulations to modelling factors in comparative biomechanics.

    PubMed

    Walmsley, Christopher W; McCurry, Matthew R; Clausen, Phillip D; McHenry, Colin R

    2013-01-01

    Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be 'reasonable' are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis where high resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical for comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results.

  16. Personalized Estimate of Chemotherapy-Induced Nausea and Vomiting: Development and External Validation of a Nomogram in Cancer Patients Receiving Highly/Moderately Emetogenic Chemotherapy

    PubMed Central

    Hu, Zhihuang; Liang, Wenhua; Yang, Yunpeng; Keefe, Dorothy; Ma, Yuxiang; Zhao, Yuanyuan; Xue, Cong; Huang, Yan; Zhao, Hongyun; Chen, Likun; Chan, Alexandre; Zhang, Li

    2016-01-01

    Chemotherapy-induced nausea and vomiting (CINV) occurs in over 30% of cancer patients receiving highly/moderately emetogenic chemotherapy (HEC/MEC). The currently recommended antiemetic therapy is merely based on the emetogenic level of chemotherapy, regardless of the patient's individual risk factors. It is, therefore, critical to develop an approach for personalized management of CINV in the era of precision medicine. A number of variables were involved in the development of CINV. In the present study, we pooled the data from 2 multi-institutional investigations of CINV due to HEC/MEC treatment in Asian countries. Demographic and clinical variables of 881 patients were prospectively collected as defined previously, and 862 of them had full documentation of variables of interest. The data of 548 patients from Chinese institutions were used to identify variables associated with CINV using a multivariate logistic regression model and then to construct a personalized prediction nomogram, while the remaining 314 patients from outside China (Singapore, South Korea, and Taiwan) entered the external validation set. The C-index was used to measure the discrimination ability of the model. The predictors in the final model included sex, age, alcohol consumption, history of vomiting during pregnancy, history of motion sickness, body surface area, emetogenicity of chemotherapy, and antiemetic regimens. The C-index was 0.67 (95% CI, 0.62–0.72) for the training set and 0.65 (95% CI, 0.58–0.72) for the validation set. The C-index was higher than that of any single predictor, including the emetogenic level of chemotherapy according to current antiemetic guidelines. Calibration curves showed good agreement between prediction and actual occurrence of CINV. This easy-to-use prediction model was based on chemotherapeutic regimens as well as patient's individual risk factors. The prediction accuracy of CINV occurrence in this nomogram was well validated by an independent data set. It could facilitate the assessment of individual risk, and thus improve the personalized management of CINV. PMID:26765450
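
    A hedged sketch of the modeling pattern described here: a multivariable logistic regression whose discrimination is summarized by the C-index, which equals the AUROC for a binary outcome. Predictors, coefficients, and data below are synthetic, not the study's:

    ```python
    # Sketch: logistic prediction model with C-index on training and external
    # validation sets, echoing the nomogram workflow (all values invented).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)

    def simulate(n):
        X = np.column_stack([
            rng.integers(0, 2, n),    # sex (1 = female)
            rng.uniform(20, 80, n),   # age
            rng.integers(0, 2, n),    # history of motion sickness
            rng.integers(0, 2, n),    # highly (vs moderately) emetogenic chemo
        ])
        logit = -1.5 + 0.8*X[:, 0] - 0.02*X[:, 1] + 0.6*X[:, 2] + 0.9*X[:, 3]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
        return X, y

    X_tr, y_tr = simulate(548)   # training set size mirrors the abstract
    X_ex, y_ex = simulate(314)   # external validation set size
    model = LogisticRegression().fit(X_tr, y_tr)
    print("training C-index:", roc_auc_score(y_tr, model.predict_proba(X_tr)[:, 1]))
    print("external C-index:", roc_auc_score(y_ex, model.predict_proba(X_ex)[:, 1]))
    ```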

  17. Beware the black box: investigating the sensitivity of FEA simulations to modelling factors in comparative biomechanics

    PubMed Central

    McCurry, Matthew R.; Clausen, Phillip D.; McHenry, Colin R.

    2013-01-01

    Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny in regards to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be ‘reasonable’ are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis where high resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical for comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used. Logically, different comparative datasets would also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should focus on taking biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results. PMID:24255817

  18. Realistic molecular model of kerogen's nanostructure

    NASA Astrophysics Data System (ADS)

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E.; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J.-M.; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp2/sp3 hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  19. Generalized Galileons: instabilities of bouncing and Genesis cosmologies and modified Genesis

    NASA Astrophysics Data System (ADS)

    Libanov, M.; Mironov, S.; Rubakov, V.

    2016-08-01

    We study spatially flat bouncing cosmologies and models with the early-time Genesis epoch in a popular class of generalized Galileon theories. We ask whether there exist solutions of these types that are free of gradient and ghost instabilities. We find that, irrespective of the forms of the Lagrangian functions, the bouncing models either are plagued with these instabilities or have singularities. The same result holds for the original Genesis model and its variants in which the scale factor tends to a constant as t → -∞. The result remains valid in theories with additional matter that obeys the Null Energy Condition and interacts with the Galileon only gravitationally. We propose a modified Genesis model which evades our no-go argument and give an explicit example of healthy cosmology that connects the modified Genesis epoch with kination (the epoch still driven by the Galileon field, which is a conventional massless scalar field at that stage).

  20. Realistic molecular model of kerogen's nanostructure.

    PubMed

    Bousige, Colin; Ghimbeu, Camélia Matei; Vix-Guterl, Cathie; Pomerantz, Andrew E; Suleimenova, Assiya; Vaughan, Gavin; Garbarino, Gaston; Feygenson, Mikhail; Wildgruber, Christoph; Ulm, Franz-Josef; Pellenq, Roland J-M; Coasne, Benoit

    2016-05-01

    Despite kerogen's importance as the organic backbone for hydrocarbon production from source rocks such as gas shale, the interplay between kerogen's chemistry, morphology and mechanics remains unexplored. As the environmental impact of shale gas rises, identifying functional relations between its geochemical, transport, elastic and fracture properties from realistic molecular models of kerogens becomes all the more important. Here, by using a hybrid experimental-simulation method, we propose a panel of realistic molecular models of mature and immature kerogens that provide a detailed picture of kerogen's nanostructure without considering the presence of clays and other minerals in shales. We probe the models' strengths and limitations, and show that they predict essential features amenable to experimental validation, including pore distribution, vibrational density of states and stiffness. We also show that kerogen's maturation, which manifests itself as an increase in the sp2/sp3 hybridization ratio, entails a crossover from plastic-to-brittle rupture mechanisms.

  1. Study on Fatigue Life of Aluminum Alloy Considering Fretting

    NASA Astrophysics Data System (ADS)

    Yang, Maosheng; Zhao, Hongqiang; Wang, Yunxiang; Chen, Xiaofei; Fan, Jiali

    2018-01-01

    To study the influence of fretting on aluminum alloy, a global finite element model accounting for fretting was built in the commercial code ABAQUS, and from it a new model for predicting fretting fatigue life was developed based on friction work. The rationality and effectiveness of the model were validated by comparing experimental lives with predicted lives. Finally, the factors influencing the fretting fatigue life of aerial aluminum alloy were investigated with the model. The results revealed that fretting fatigue life decreased monotonically with increasing normal load and then became constant at higher pressures. At low normal load, fretting fatigue life was found to increase with increasing pad radius. At high normal load, however, the fretting fatigue life remained almost unchanged with changes in the fretting pad radius. The bulk stress amplitude had the dominant effect on fretting fatigue life, which diminished as the bulk stress amplitude increased.

  2. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic model modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
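
    The two-stage idea of wrapping a neural network around an analytic model can be illustrated generically with residual learning; the sketch below is a simplified stand-in under that assumption, not the patented SQNA training or qualification procedure:

    ```python
    # Sketch: augment an analytic process model with a neural network that
    # absorbs unmodeled behavior (generic residual learning, for illustration).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    u = rng.uniform(0, 10, (500, 1))            # process input (e.g. pump speed)

    def analytic_model(u):
        return 2.0 * u                          # known first-principles part

    # "Plant" output with an unknown nonlinearity plus measurement noise
    true_output = 2.0*u + 0.5*np.sin(u) + rng.normal(0, 0.05, u.shape)
    residual = true_output - analytic_model(u)  # unknown process characteristics

    nn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
    nn.fit(u, residual.ravel())

    def neuro_analytic(u):
        return analytic_model(u).ravel() + nn.predict(u)

    err = true_output.ravel() - neuro_analytic(u)
    print("residual std before NN: %.3f, after: %.3f" % (residual.std(), err.std()))
    ```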

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rueegsegger, Michael B.; Bach Cuadra, Meritxell; Pica, Alessia

    Purpose: Ocular anatomy and radiation-associated toxicities provide unique challenges for external beam radiation therapy. For treatment planning, precise modeling of organs at risk and tumor volume are crucial. Development of a precise eye model and automatic adaptation of this model to patients' anatomy remain problematic because of organ shape variability. This work introduces the application of a 3-dimensional (3D) statistical shape model as a novel method for precise eye modeling for external beam radiation therapy of intraocular tumors. Methods and Materials: Manual and automatic segmentations were compared for 17 patients, based on head computed tomography (CT) volume scans. A 3D statistical shape model of the cornea, lens, and sclera as well as of the optic disc position was developed. Furthermore, an active shape model was built to enable automatic fitting of the eye model to CT slice stacks. Cross-validation was performed based on leave-one-out tests for all training shapes by measuring Dice coefficients and mean segmentation errors between automatic segmentation and manual segmentation by an expert. Results: Cross-validation revealed a Dice similarity of 95% ± 2% for the sclera and cornea and 91% ± 2% for the lens. Overall, mean segmentation error was found to be 0.3 ± 0.1 mm. Average segmentation time was 14 ± 2 s on a standard personal computer. Conclusions: Our results show that the solution presented outperforms state-of-the-art methods in terms of accuracy, reliability, and robustness. Moreover, the eye model shape as well as its variability is learned from a training set rather than by making shape assumptions (eg, as with the spherical or elliptical model). Therefore, the model appears to be capable of modeling nonspherically and nonelliptically shaped eyes.
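
    The overlap metric quoted above is the Dice coefficient; a minimal self-contained sketch:

    ```python
    # Sketch: Dice similarity between an automatic and a manual binary
    # segmentation, the overlap metric reported in the abstract.
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice = 2|A ∩ B| / (|A| + |B|) for boolean masks."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    auto = np.zeros((64, 64), bool); auto[10:40, 10:40] = True    # toy masks
    manual = np.zeros((64, 64), bool); manual[12:42, 12:42] = True
    print(f"Dice: {dice(auto, manual):.3f}")
    ```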

  4. Carbon fluxes in tropical forest ecosystems: the value of Eddy-covariance data for individual-based dynamic forest gap models

    NASA Astrophysics Data System (ADS)

    Roedig, Edna; Cuntz, Matthias; Huth, Andreas

    2015-04-01

    The effects of climatic inter-annual fluctuations and human activities on the global carbon cycle are uncertain and currently a major issue in global vegetation models. Individual-based forest gap models, on the other hand, model vegetation structure and dynamics on a small spatial (<100 ha) and large temporal scale (>1000 years). They are well-established tools to reproduce successions of highly-diverse forest ecosystems and investigate disturbances such as logging or fire events. However, the parameterizations of the relationships between short-term climate variability and forest model processes are often uncertain in these models (e.g. daily variable temperature and gross primary production (GPP)) and cannot be constrained from forest inventories. We addressed this uncertainty and linked high-resolution Eddy-covariance (EC) data with an individual-based forest gap model. The forest model FORMIND was applied to three diverse tropical forest sites in the Amazonian rainforest. Species diversity was categorized into three plant functional types. The parameterizations for the steady-state of biomass and forest structure were calibrated and validated with different forest inventories. The parameterizations of relationships between short-term climate variability and forest model processes were evaluated with EC data on a daily time step. The validations of the steady-state showed that the forest model could reproduce biomass and forest structures from forest inventories. The daily estimates of carbon fluxes showed that the forest model reproduces GPP as observed by the EC method. Daily fluctuations of GPP were clearly reflected as a response to daily climate variability. Ecosystem respiration remains a challenge on a daily time step due to a simplified soil respiration approach. In the long-term, however, the dynamic forest model is expected to estimate carbon budgets for highly-diverse tropical forests where EC-measurements are rare.

  5. Ranking and validation of the spallation models for description of intermediate mass fragment emission from p + Ag collisions at 480 MeV incident proton beam energy

    NASA Astrophysics Data System (ADS)

    Sharma, Sushil K.; Kamys, Bogusław; Goldenbaum, Frank; Filges, Detlef

    2016-06-01

    Double-differential cross-sections d²σ/dΩdE for isotopically identified intermediate mass fragments (⁶Li up to ²⁷Mg) from nuclear reactions induced by 480 MeV protons impinging on a silver target were analyzed in the framework of a two-step model. The first step of the reaction was described by the intranuclear cascade model INCL4.6 and the second one by four different models (ABLA07, GEM2, GEMINI++, and SMM). The experimental spectra reveal the presence of low-energy, isotropic as well as high-energy, forward-peaked contributions. The INCL4.6 model offers a possibility to describe the latter contribution for light intermediate mass fragments by coalescence of the emitted nucleons. Qualitative agreement of the model predictions with the data was observed, but the high-energy tails of the spectra were significantly overestimated. The shape of the isotropic part of the spectra was reproduced by all four models. The GEM2 model strongly underestimated the value of the cross-sections for heavier IMFs, whereas the SMM and ABLA07 models generally overestimated the data. The best quantitative description of the data was offered by GEMINI++; however, a discrepancy between the data and the model cross-sections still remained for almost all reaction products, especially at forward angles. This indicates that non-equilibrium processes are present that cannot be reproduced by the applied models. The goodness of the data description was judged quantitatively using two statistical deviation factors, the H-factor and the M-factor, as a tool for ranking and validation of the theoretical models.

  6. The Stigma Resistance Scale: A multi-sample validation of a new instrument to assess mental illness stigma resistance.

    PubMed

    Firmin, Ruth L; Lysaker, Paul H; McGrew, John H; Minor, Kyle S; Luther, Lauren; Salyers, Michelle P

    2017-12-01

    Although associated with key recovery outcomes, stigma resistance remains under-studied largely due to limitations of existing measures. This study developed and validated a new measure of stigma resistance. Preliminary items, derived from qualitative interviews of people with lived experience, were pilot tested online with people self-reporting a mental illness diagnosis (n = 489). Best performing items were selected, and the refined measure was administered to an independent sample of people with mental illness at two state mental health consumer recovery conferences (n = 202). Confirmatory factor analyses (CFA) guided by theory were used to test item fit, correlations between the refined stigma resistance measure and theoretically relevant measures were examined for validity, and test-retest correlations of a subsample were examined for stability. CFA demonstrated strong fit for a 5-factor model. The final 20-item measure demonstrated good internal consistency for each of the 5 subscales, adequate test-retest reliability at 3 weeks, and strong construct validity (i.e., positive associations with quality of life, recovery, and self-efficacy, and negative associations with overall symptoms, defeatist beliefs, and self-stigma). The new measure offers a more reliable and nuanced assessment of stigma resistance. It may afford greater personalization of interventions targeting stigma resistance. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Global estimates of evapotranspiration and gross primary production based on MODIS and global meteorology data

    USGS Publications Warehouse

    Yuan, W.; Liu, S.; Yu, G.; Bonnefond, J.-M.; Chen, J.; Davis, K.; Desai, A.R.; Goldstein, Allen H.; Gianelle, D.; Rossi, F.; Suyker, A.E.; Verma, S.B.

    2010-01-01

    The simulation of gross primary production (GPP) at various spatial and temporal scales remains a major challenge for quantifying the global carbon cycle. We developed a light use efficiency model, called EC-LUE, driven by only four variables: normalized difference vegetation index (NDVI), photosynthetically active radiation (PAR), air temperature, and the Bowen ratio of sensible to latent heat flux. The EC-LUE model may have the most potential to adequately address the spatial and temporal dynamics of GPP because its parameters (i.e., the potential light use efficiency and optimal plant growth temperature) are invariant across the various land cover types. However, the application of the previous EC-LUE model was hampered by poor prediction of the Bowen ratio at the large spatial scale. In this study, we substituted the Bowen ratio with the ratio of evapotranspiration (ET) to net radiation, and revised the RS-PM (Remote Sensing-Penman Monteith) model for quantifying ET. Fifty-four eddy covariance towers, including various ecosystem types, were selected to calibrate and validate the revised RS-PM and EC-LUE models. The revised RS-PM model explained 82% and 68% of the observed variations of ET for all the calibration and validation sites, respectively. Using estimated ET as input, the EC-LUE model performed well in calibration and validation sites, explaining 75% and 61% of the observed GPP variation for calibration and validation sites, respectively. Global patterns of ET and GPP at a spatial resolution of 0.5° latitude by 0.6° longitude during the years 2000–2003 were determined using the global MERRA dataset (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate Resolution Imaging Spectroradiometer). The global estimates of ET and GPP agreed well with the other global models from the literature, with the highest ET and GPP over tropical forests and the lowest values in dry and high latitude areas. However, comparisons with observed GPP at eddy flux towers showed significant underestimation of ET and GPP due to the lower net radiation of the MERRA dataset. Applying a procedure to correct the systematic errors of global meteorological data would improve global estimates of GPP and ET. The revised RS-PM and EC-LUE models provide alternative approaches that make it possible to map ET and GPP over large areas because (1) the model parameters are invariant across various land cover types and (2) all driving forces of the models may be derived from remote sensing data or existing climate observation networks.
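
    For orientation, a sketch of a light-use-efficiency GPP calculation in the general form the EC-LUE literature describes, GPP = PAR × fPAR × ε_max × min(Ts, Ws), with the moisture scalar Ws taken as ET/Rn per the revised model; all parameter values below are illustrative, not the paper's calibrated ones:

    ```python
    # Sketch of a light-use-efficiency GPP calculation (EC-LUE-style form;
    # parameter values invented for illustration).
    import numpy as np

    def ec_lue_gpp(par, ndvi, t_air, et, rn, eps_max=2.0, t_opt=21.0):
        fpar = 1.24 * ndvi - 0.168          # a common linear fPAR-NDVI form
        # Temperature scalar of the usual quadratic form, 0 at 0 and 40 degC
        ts = ((t_air - 0.0) * (t_air - 40.0)) / (
            (t_air - 0.0) * (t_air - 40.0) - (t_air - t_opt) ** 2)
        ws = et / rn                        # moisture scalar (revised EC-LUE)
        return par * np.clip(fpar, 0, 1) * eps_max * np.minimum(
            np.clip(ts, 0, 1), np.clip(ws, 0, 1))

    # One day's toy inputs: PAR (MJ m-2 d-1), NDVI, air temp (degC), ET and Rn
    print(ec_lue_gpp(par=10.0, ndvi=0.7, t_air=25.0, et=80.0, rn=120.0))
    ```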

  8. Prolonged striatal disinhibition as a chronic animal model of tic disorders.

    PubMed

    Vinner, Esther; Israelashvili, Michal; Bar-Gad, Izhar

    2017-12-01

    Experimental findings and theoretical models have associated Tourette syndrome with abnormal striatal inhibition. The expression of tics, the hallmark symptom of this disorder, has been transiently induced in non-human primates and rodents by the injection of GABA-A antagonists into the striatum, leading to temporary disinhibition. The novel chronic model of tic expression utilizes mini-osmotic pumps implanted subcutaneously in the rat's back for prolonged infusion of bicuculline into the dorsolateral striatum. Tics were expressed on the contralateral side to the infusion over a period of multiple days. Tic expression was stable, and maintained similar properties throughout the infusion period. Electrophysiological recordings revealed the existence of tic-related local field potential spikes and individual neuron activity changes that remained stable throughout the infusion period. The striatal disinhibition model provides a unique combination of face validity (tic expression) and construct validity (abnormal striatal inhibition) but is limited to sub-hour periods. The new chronic model extends the period of tic expression to multiple days and thus enables the study of tic dynamics and the effects of behavior and pharmacological agents on tic expression. The chronic model provides similar behavioral and neuronal correlates of tics as the acute striatal disinhibition model but over prolonged periods of time, thus providing a unique, basal ganglia initiated model of tic expression. Chronic expression of symptoms is the key to studying the time varying properties of Tourette syndrome and the effects of multiple internal and external factors on this disorder. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Finite element modelling of the foot for clinical application: A systematic review.

    PubMed

    Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan

    2017-01-01

    Over the last two decades finite element modelling has been widely used to give new insight on foot and footwear biomechanics. However its actual contribution for the improvement of the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling can open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation which would significantly enhance clinical practice. In this context this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic and considerable progress is still required, current literature includes a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information that is needed for geometry design, the assignment of material properties and loading on a patient-specific basis and in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person that belongs in the population for which it was developed. Consequently this highlights the need for thorough and extensive validation of each individual step of the modelling process as well as for the overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  10. Modeling the role of environmental variables on the population dynamics of the malaria vector Anopheles gambiae sensu stricto

    PubMed Central

    2012-01-01

    Background: The impact of weather and climate on malaria transmission has attracted considerable attention in recent years, yet uncertainties around future disease trends under climate change remain. Mathematical models provide powerful tools for addressing such questions and understanding the implications for interventions and eradication strategies, but these require realistic modeling of the vector population dynamics and its response to environmental variables. Methods: Published and unpublished field and experimental data are used to develop new formulations for modeling the relationships between key aspects of vector ecology and environmental variables. These relationships are integrated within a validated deterministic model of Anopheles gambiae s.s. population dynamics to provide a valuable tool for understanding vector response to biotic and abiotic variables. Results: A novel, parsimonious framework for assessing the effects of rainfall, cloudiness, wind speed, desiccation, temperature, relative humidity and density-dependence on vector abundance is developed, allowing ease of construction, analysis, and integration into malaria transmission models. Model validation shows good agreement with longitudinal vector abundance data from Tanzania, suggesting that recent malaria reductions in certain areas of Africa could be due to changing environmental conditions affecting vector populations. Conclusions: Mathematical models provide a powerful, explanatory means of understanding the role of environmental variables on mosquito populations and hence for predicting future malaria transmission under global change. The framework developed provides a valuable advance in this respect, but also highlights key research gaps that need to be resolved if we are to better understand future malaria risk in vulnerable communities. PMID:22877154

  11. Markov modeling and discrete event simulation in health care: a systematic comparison.

    PubMed

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
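
    The trade-off the review describes can be made concrete with a cohort-level Markov model, the simpler of the two techniques. The sketch below is a generic three-state example with invented transition probabilities, costs, and utilities, not a model from any of the reviewed studies:

    ```python
    # Minimal three-state Markov cohort model (Well -> Sick -> Dead) of the
    # kind contrasted with DES in the review (all numbers invented).
    import numpy as np

    P = np.array([[0.90, 0.08, 0.02],    # transitions from Well
                  [0.00, 0.85, 0.15],    # transitions from Sick
                  [0.00, 0.00, 1.00]])   # Dead is absorbing
    cost = np.array([100.0, 1500.0, 0.0])    # cost per cycle in each state
    utility = np.array([0.95, 0.60, 0.0])    # QALY weight per cycle

    state = np.array([1.0, 0.0, 0.0])        # whole cohort starts Well
    total_cost = total_qaly = 0.0
    for cycle in range(1, 41):               # 40 yearly cycles
        state = state @ P
        disc = 1.035 ** -cycle               # 3.5% annual discounting
        total_cost += disc * state @ cost
        total_qaly += disc * state @ utility

    print(f"expected cost: {total_cost:.0f}, expected QALYs: {total_qaly:.2f}")
    ```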

  12. Recurrent urinary tract infections in healthy and nonpregnant women

    PubMed Central

    Glover, Matthew; Moreira, Cristiano G.; Sperandio, Vanessa; Zimmern, Philippe

    2016-01-01

    Recurrent urinary tract infections (RUTI) are prevalent and pose significant clinical challenges. Although the term RUTI has long been vaguely defined, a consensus definition has emerged in recent years. The exact etiology behind RUTI remains under debate, with valid arguments for both ascending reinfections as well as persistent infection inside the bladder. These persistent infections exist in the form of quiescent intracellular reservoirs in the mouse model and may represent a novel concept to explain UTI recurrence in humans. Manageable risk factors such as behavioral patterns alongside nonmanageable risk factors including genetic susceptibility are growing fields of investigation. Acute UTI have been studied through two model bacterial strains: Escherichia coli UTI89 and CFT073. However, the clinical relevance to RUTI of these two strains has not been firmly established. Current treatment strategies for RUTI are limited and remain dominated by antibiotic usage despite variable efficacy. The majority of studies in humans have focused on younger groups of women with little information available about the postmenopausal population despite a heightened risk of RUTI in this age group. PMID:27499825

  13. Mechanism of West Nile Virus Neuroinvasion: A Critical Appraisal

    PubMed Central

    Suen, Willy W.; Prow, Natalie A.; Hall, Roy A.; Bielefeldt-Ohmann, Helle

    2014-01-01

    West Nile virus (WNV) is an important emerging neurotropic virus, responsible for increasingly severe encephalitis outbreaks in humans and horses worldwide. However, the mechanism by which the virus gains entry to the brain (neuroinvasion) remains poorly understood. Hypotheses of hematogenous and transneural entry have been proposed for WNV neuroinvasion, which revolve mainly around the concepts of blood-brain barrier (BBB) disruption and retrograde axonal transport, respectively. However, an over‑representation of in vitro studies without adequate in vivo validation continues to obscure our understanding of the mechanism(s). Furthermore, WNV infection in the current rodent models does not generate a similar viremia and character of CNS infection, as seen in the common target hosts, humans and horses. These differences ultimately question the applicability of rodent models for pathogenesis investigations. Finally, the role of several barriers against CNS insults, such as the blood-cerebrospinal fluid (CSF), the CSF-brain and the blood-spinal cord barriers, remain largely unexplored, highlighting the infancy of this field. In this review, a systematic and critical appraisal of the current evidence relevant to the possible mechanism(s) of WNV neuroinvasion is conducted. PMID:25046180

  14. Validation of High Frequency (HF) Propagation Prediction Models in the Arctic region

    NASA Astrophysics Data System (ADS)

    Athieno, R.; Jayachandran, P. T.

    2014-12-01

    Despite the emergence of modern techniques for long distance communication, ionospheric communication in the high frequency (HF) band (3-30 MHz) remains significant to both civilian and military users. However, the efficient use of the ever-varying ionosphere as a propagation medium is dependent on the reliability of ionospheric and HF propagation prediction models. Most available models are empirical, implying that data collection has to be sufficiently large to provide good intended results. The models we present were developed with little data from the high latitudes, which necessitates their validation. This paper presents the validation of three long-term HF propagation prediction models over a path within the Arctic region. Measurements of the Maximum Usable Frequency for a 3000 km range (MUF(3000)F2) for Resolute, Canada (74.75° N, 265.00° E), are obtained from hand-scaled ionograms generated by the Canadian Advanced Digital Ionosonde (CADI). The observations have been compared with predictions obtained from the Ionospheric Communication Enhanced Profile Analysis Program (ICEPAC), Voice of America Coverage Analysis Program (VOACAP) and International Telecommunication Union Recommendation 533 (ITU-REC533) for 2009, 2011, 2012 and 2013. A statistical analysis shows that the monthly predictions seem to reproduce the general features of the observations throughout the year, though this is more evident in the winter and equinox months. Both predictions and observations show a diurnal and seasonal variation. The analysed models did not show large differences in their performances. However, there are noticeable differences across seasons for the entire period analysed: REC533 gives a better performance in winter months, while VOACAP has a better performance for both equinox and summer months. VOACAP gives a better performance in the daily predictions compared to ICEPAC, though, in general, the monthly predictions seem to agree more with the observations than the daily predictions do.
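
    The skill comparison described here boils down to standard deviation statistics between predicted and observed MUF(3000)F2; a minimal sketch with invented values:

    ```python
    # Sketch: skill statistics for predicted vs observed MUF(3000)F2
    # (values invented for illustration).
    import numpy as np

    obs = np.array([8.2, 9.1, 10.4, 11.0, 9.8, 7.5])     # observed MUF (MHz)
    pred = np.array([7.9, 9.6, 10.0, 11.8, 9.1, 8.0])    # model prediction (MHz)

    rmse = np.sqrt(np.mean((pred - obs) ** 2))
    bias = np.mean(pred - obs)
    r = np.corrcoef(obs, pred)[0, 1]
    print(f"RMSE = {rmse:.2f} MHz, bias = {bias:+.2f} MHz, r = {r:.2f}")
    ```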

  15. Functional genomics unique to week 20 post wounding in the deep cone/fat dome of the Duroc/Yorkshire porcine model of fibroproliferative scarring.

    PubMed

    Engrav, Loren H; Tuggle, Christopher K; Kerr, Kathleen F; Zhu, Kathy Q; Numhom, Surawej; Couture, Oliver P; Beyer, Richard P; Hocking, Anne M; Carrougher, Gretchen J; Ramos, Maria Luiza C; Klein, Matthew B; Gibran, Nicole S

    2011-04-20

    Hypertrophic scar was first described over 100 years ago; PubMed has more than 1,000 references on the topic. Nevertheless, prevention and treatment remain poor, because 1) there has been no validated animal model; 2) human scar tissue, which is impossible to obtain in a controlled manner, has been the only source for study; 3) tissues typically have been homogenized, mixing cell populations; and 4) gene-by-gene studies are incomplete. We have assembled a system that overcomes these barriers and permits the study of genome-wide gene expression in microanatomical locations, in shallow and deep partial-thickness wounds, and pigmented and non-pigmented skin, using the Duroc (pigmented, fibroproliferative)/Yorkshire (non-pigmented, non-fibroproliferative) porcine model. We used this system to obtain the differential transcriptome at 1, 2, 3, 12 and 20 weeks post wounding. It is not clear when fibroproliferation begins, but it is fully developed in humans and the Duroc breed at 20 weeks. Therefore, we obtained the derivative functional genomics unique to 20 weeks post wounding. We also obtained long-term, forty-six-week follow-up with the model. 1) The scars are still thick at forty-six weeks post wounding, further validating the model. 2) The differential transcriptome provides new insights into the fibroproliferative process as several genes thought fundamental to fibroproliferation are absent and others differentially expressed are newly implicated. 3) The findings in the derivative functional genomics support old concepts, which further validates the model, and suggests new avenues for reductionist exploration. In the future, these findings will be searched for directed networks likely involved in cutaneous fibroproliferation. These clues may lead to a better understanding of the systems biology of cutaneous fibroproliferation, and ultimately prevention and treatment of hypertrophic scarring.

  16. OCT-based full crystalline lens shape change during accommodation in vivo.

    PubMed

    Martinez-Enriquez, Eduardo; Pérez-Merino, Pablo; Velasco-Ocana, Miriam; Marcos, Susana

    2017-02-01

    The full shape of the accommodating crystalline lens was estimated using custom three-dimensional (3-D) spectral OCT and image processing algorithms. Automatic segmentation and distortion correction were used to construct 3-D models of the lens region visible through the pupil. The lens peripheral region was estimated with a trained and validated parametric model. Nineteen young eyes were measured at 0-6 D accommodative demands in 1.5 D steps. Lens volume, surface area, diameter, and equatorial plane position were automatically quantified. Lens diameter and surface area correlated negatively, and equatorial plane position positively, with the accommodative response. Lens volume remained constant and surface area decreased with accommodation, indicating that the lens material is incompressible and the capsular bag elastic.

  17. Minimal Left-Right Symmetric Dark Matter.

    PubMed

    Heeck, Julian; Patra, Sudhanwa

    2015-09-18

    We show that left-right symmetric models can easily accommodate stable TeV-scale dark matter particles without the need for an ad hoc stabilizing symmetry. The stability of a newly introduced multiplet either arises accidentally as in the minimal dark matter framework or comes courtesy of the remaining unbroken Z₂ subgroup of B-L. Only one new parameter is introduced: the mass of the new multiplet. As minimal examples, we study left-right fermion triplets and quintuplets and show that they can form viable two-component dark matter. This approach is, in particular, valid for SU(2)×SU(2)×U(1) models that explain the recent diboson excess at ATLAS in terms of a new charged gauge boson of mass 2 TeV.

  18. A Gaussian framework for modeling effects of frequency-dependent attenuation, frequency-dependent scattering, and gating.

    PubMed

    Wear, Keith A

    2002-11-01

    For a wide range of applications in medical ultrasound, power spectra of received signals are approximately Gaussian. It has been established previously that an ultrasound beam with a Gaussian spectrum propagating through a medium with linear attenuation remains Gaussian. In this paper, Gaussian transformations are derived to model the effects of scattering (according to a power law, as is commonly applicable in soft tissues, especially over limited frequency ranges) and gating (with a Hamming window, a commonly used gate function). These approximations are shown to be quite accurate even for relatively broad band systems with fractional bandwidths approaching 100%. The theory is validated by experiments in phantoms consisting of glass particles suspended in agar.
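
    The invariance this abstract relies on follows from completing the square: multiplying a Gaussian spectrum by a linear-attenuation factor e^(-βf), with β the attenuation slope accumulated along the path in nepers, returns a Gaussian of the same width with a downshifted center. A brief restatement:

    ```latex
    e^{-(f-f_0)^2/(2\sigma^2)}\, e^{-\beta f}
      = C\, e^{-\left(f-(f_0-\beta\sigma^2)\right)^2/(2\sigma^2)},
    \qquad
    C = \exp\!\left[\frac{(f_0-\beta\sigma^2)^2 - f_0^2}{2\sigma^2}\right],
    ```

    so the bandwidth σ is preserved while the center frequency shifts from f0 to f0 - βσ².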

  19. OCT-based full crystalline lens shape change during accommodation in vivo

    PubMed Central

    Martinez-Enriquez, Eduardo; Pérez-Merino, Pablo; Velasco-Ocana, Miriam; Marcos, Susana

    2017-01-01

    The full shape of the accommodating crystalline lens was estimated using custom three-dimensional (3-D) spectral OCT and image processing algorithms. Automatic segmentation and distortion correction were used to construct 3-D models of the lens region visible through the pupil. The lens peripheral region was estimated with a trained and validated parametric model. Nineteen young eyes were measured at 0-6 D accommodative demands in 1.5 D steps. Lens volume, surface area, diameter, and equatorial plane position were automatically quantified. Lens diameter and surface area correlated negatively, and equatorial plane position positively, with the accommodative response. Lens volume remained constant and surface area decreased with accommodation, indicating that the lens material is incompressible and the capsular bag elastic. PMID:28270993

  20. Prediction models of donor arrest and graft utilization in liver transplantation from maastricht-3 donors after circulatory death.

    PubMed

    Davila, D; Ciria, R; Jassem, W; Briceño, J; Littlejohn, W; Vilca-Meléndez, H; Srinivasan, P; Prachalias, A; O'Grady, J; Rela, M; Heaton, N

    2012-12-01

    Shortage of organs for transplantation has led to renewed interest in donation after circulatory determination of death (DCDD). We conducted a retrospective analysis (2001-2009) and a subsequent prospective validation (2010) of liver Maastricht-Category-3-DCDD and donation-after-brain-death (DBD) offers to our program. Accepted and declined offers were compared. Accepted DCDD offers were divided into donors who went on to cardiac arrest and those who did not. Donors who arrested were divided into those producing grafts that were transplanted or remained unused. Descriptive comparisons and regression analyses were performed to assess predictor models of donor cardiac arrest and graft utilization. Variables from the multivariate analysis were prospectively validated. Of 1579 DCDD offers, 621 were accepted, and of these, 400 experienced cardiac arrest after withdrawal of support. Of these, 173 livers were transplanted. In the DCDD group, donor age <40 years, use of inotropes, and absence of gag/cough reflexes were predictors of cardiac arrest. Donor age >50 years, BMI >30, warm ischemia time >25 minutes, ITU stay >7 days and ALT ≥4× normal rates were risk factors for not using the graft. These variables had excellent sensitivity and specificity for the prediction of cardiac arrest (AUROC = 0.835) and graft use (AUROC = 0.748) in the 2010 prospective validation. These models can feasibly predict cardiac arrest in potential DCDDs and graft usability, helping to avoid unnecessary recoveries and healthcare expenditure. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.

  1. The effect of inhibition of PP1 and TNFα signaling on pathogenesis of SARS coronavirus.

    PubMed

    McDermott, Jason E; Mitchell, Hugh D; Gralinski, Lisa E; Eisfeld, Amie J; Josset, Laurence; Bankhead, Armand; Neumann, Gabriele; Tilton, Susan C; Schäfer, Alexandra; Li, Chengjun; Fan, Shufang; McWeeney, Shannon; Baric, Ralph S; Katze, Michael G; Waters, Katrina M

    2016-09-23

    The complex interplay between viral replication and host immune response during infection remains poorly understood. While many viruses are known to employ anti-immune strategies to facilitate their replication, highly pathogenic virus infections can also cause an excessive immune response that exacerbates, rather than reduces pathogenicity. To investigate this dichotomy in severe acute respiratory syndrome coronavirus (SARS-CoV), we developed a transcriptional network model of SARS-CoV infection in mice and used the model to prioritize candidate regulatory targets for further investigation. We validated our predictions in 18 different knockout (KO) mouse strains, showing that network topology provides significant predictive power to identify genes that are important for viral infection. We identified a novel player in the immune response to virus infection, Kepi, an inhibitory subunit of the protein phosphatase 1 (PP1) complex, which protects against SARS-CoV pathogenesis. We also found that receptors for the proinflammatory cytokine tumor necrosis factor alpha (TNFα) promote pathogenesis, presumably through excessive inflammation. The current study provides validation of network modeling approaches for identifying important players in virus infection pathogenesis, and a step forward in understanding the host response to an important infectious disease. The results presented here suggest the role of Kepi in the host response to SARS-CoV, as well as inflammatory activity driving pathogenesis through TNFα signaling in SARS-CoV infections. Though we have reported the utility of this approach in bacterial and cell culture studies previously, this is the first comprehensive study to confirm that network topology can be used to predict phenotypes in mice with experimental validation.
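
    A hedged sketch of topology-based prioritization in the spirit of the study: rank the nodes of an interaction graph by a centrality measure and treat the top-ranked genes as knockout candidates. The toy edge list below is illustrative only, not the paper's network, and betweenness centrality stands in for whatever topological score the authors used:

    ```python
    # Sketch: rank genes by network topology on a toy interaction graph
    # (gene names and edges are placeholders, not the study's network).
    import networkx as nx

    edges = [("Kepi", "Ppp1ca"), ("Ppp1ca", "Eif2a"), ("Tnfrsf1a", "Nfkb1"),
             ("Nfkb1", "Il6"), ("Nfkb1", "Tnf"), ("Tnf", "Tnfrsf1a"),
             ("Eif2a", "Nfkb1"), ("Il6", "Stat3")]
    G = nx.Graph(edges)

    centrality = nx.betweenness_centrality(G)
    ranked = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)
    for gene, score in ranked[:5]:
        print(f"{gene}: {score:.3f}")   # top-ranked genes -> candidate KO targets
    ```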

  2. Evaluation of force-velocity and power-velocity relationship of arm muscles.

    PubMed

    Sreckovic, Sreten; Cuk, Ivan; Djuric, Sasa; Nedeljkovic, Aleksandar; Mirkov, Dragan; Jaric, Slobodan

    2015-08-01

    A number of recent studies have revealed an approximately linear force-velocity (F-V) and, consequently, a parabolic power-velocity (P-V) relationship of multi-joint tasks. However, the measurement characteristics of their parameters have been neglected, particularly those regarding arm muscles, which could be a problem for using the linear F-V model in both research and routine testing. Therefore, the aims of the present study were to evaluate the strength, shape, reliability, and concurrent validity of the F-V relationship of arm muscles. Twelve healthy participants performed maximum bench press throws against loads ranging from 20 to 70 % of their maximum strength, and linear regression model was applied on the obtained range of F and V data. One-repetition maximum bench press and medicine ball throw tests were also conducted. The observed individual F-V relationships were exceptionally strong (r = 0.96-0.99; all P < 0.05) and fairly linear, although it remains unresolved whether a polynomial fit could provide even stronger relationships. The reliability of parameters obtained from the linear F-V regressions proved to be mainly high (ICC > 0.80), while their concurrent validity regarding directly measured F, P, and V ranged from high (for maximum F) to medium-to-low (for maximum P and V). The findings add to the evidence that the linear F-V and, consequently, parabolic P-V models could be used to study the mechanical properties of muscular systems, as well as to design a relatively simple, reliable, and ecologically valid routine test of the muscle ability of force, power, and velocity production.
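
    Because the F-V relationship is modeled as linear, F(V) = F0(1 - V/V0), the derived power curve P(V) = F·V is a parabola that peaks at P = F0·V0/4 when V = V0/2. A minimal sketch with invented bench-press-throw data:

    ```python
    # Sketch: fit the linear force-velocity model and derive the parabolic
    # power-velocity parameters (data values invented for illustration).
    import numpy as np

    V = np.array([2.4, 2.0, 1.7, 1.4, 1.1, 0.8])    # throw velocity (m/s)
    F = np.array([180, 260, 330, 410, 480, 560])    # mean force (N)

    slope, intercept = np.polyfit(V, F, 1)          # F = intercept + slope * V
    F0 = intercept                                  # maximum isometric force (V = 0)
    V0 = -intercept / slope                         # maximum velocity (F = 0)
    P_max = F0 * V0 / 4.0                           # apex of the P-V parabola
    r = np.corrcoef(V, F)[0, 1]
    print(f"F0 = {F0:.0f} N, V0 = {V0:.2f} m/s, Pmax = {P_max:.0f} W, r = {r:.3f}")
    ```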

  3. Moderator's view: Predictive models: a prelude to precision nephrology.

    PubMed

    Zoccali, Carmine

    2017-05-01

    Appropriate diagnosis is fundamental in medicine because it sets the basis for the prediction of disease outcome at the single patient level (prognosis) and decisions regarding the most appropriate therapy. However, given the large series of social, clinical and biological factors that determine the likelihood of an individual's future outcome, prognosis only partly depends on diagnosis and aetiology, and treatment is not decided solely on the basis of the underlying diagnosis. This issue is crucial in multifactorial diseases like atherosclerosis, where the use of statins has now shifted from 'treating hypercholesterolaemia' to 'treating the risk of adverse cardiovascular events'. Approaches that take due account of prognosis limit the lingering risk of over-diagnosis and maximize the value of prognostic information in the clinical decision process. In the nephrology realm, the application of a well-validated risk equation for kidney failure in Canada led to a 35% reduction in new referrals. Prognostic models based on simple clinical data extractable from clinical files have recently been developed to predict all-cause and cardiovascular mortality in end-stage kidney disease patients. However, research on predictive models in renal diseases remains suboptimal; failure to account for competing events and measurement errors, together with a lack of calibration analyses and external validation, are common fallacies in currently available studies. More focus on this blossoming research area is desirable. The nephrology community may now start to apply the best validated risk scores and further test their potential usefulness in chronic kidney disease patients in diverse clinical situations and geographical areas. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  4. Accounting for the influence of vegetation and landscape improves model transferability in a tropical savannah region

    NASA Astrophysics Data System (ADS)

    Gao, Hongkai; Hrachowitz, Markus; Sriwongsitanon, Nutchanart; Fenicia, Fabrizio; Gharari, Shervan; Savenije, Hubert H. G.

    2016-10-01

    Understanding which catchment characteristics dominate hydrologic response and how to take them into account remains a challenge in hydrological modeling, particularly in ungauged basins. This is even more so in nontemperate and nonhumid catchments, where, due to the combination of seasonality and the occurrence of dry spells, threshold processes are more prominent in rainfall-runoff behavior. An example is the tropical savannah, the second largest climatic zone, characterized by pronounced dry and wet seasons and high evaporative demand. In this study, we investigated the importance of landscape variability for the spatial variability of streamflow in tropical savannah basins. We applied a stepwise modeling approach to 23 subcatchments of the Upper Ping River in Thailand, in which gradually more information on the landscape was incorporated. The benchmark is represented by a classical lumped model (FLEXL), which does not account for spatial variability. We then tested the effect of accounting for vegetation information within the lumped model (FLEXLM), and subsequently two semidistributed models: one accounting for the spatial variability of topography-based landscape features alone (FLEXT), and another accounting for both topographic features and vegetation (FLEXTM). In cross-validation, each model was calibrated on one catchment and then transferred with its fitted parameters to the remaining catchments. We found that when transferring model parameters in space, the semidistributed models accounting for vegetation and topographic heterogeneity clearly outperformed the lumped model. This suggests that landscape controls a considerable part of the hydrological function, and that explicit consideration of its heterogeneity can be highly beneficial for prediction in ungauged basins in tropical savannah.
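
    The cross-validation transfer scheme described above can be sketched generically as below. The calibrate, simulate, and nash_sutcliffe callables stand in for the study's actual calibration routine, model runner, and performance metric; all names here are hypothetical.

        def transfer_experiment(catchments, model, calibrate, simulate, nash_sutcliffe):
            """Calibrate on each donor catchment, then score the fitted
            parameters on every other (receiver) catchment."""
            scores = {}
            for donor in catchments:
                params = calibrate(model, donor)            # fit on one catchment only
                for receiver in catchments:
                    if receiver is donor:
                        continue
                    q_sim = simulate(model, params, receiver.forcing)
                    scores[(donor.name, receiver.name)] = nash_sutcliffe(
                        q_sim, receiver.observed_flow)      # skill on the ungauged proxy
            return scores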

  5. Validity of Factors of the Psychopathy Checklist-Revised in Female Prisoners: Discriminant Relations with Antisocial Behavior, Substance Abuse, and Personality

    ERIC Educational Resources Information Center

    Kennealy, Patrick J.; Hicks, Brian M.; Patrick, Christopher J.

    2007-01-01

    The validity of the Psychopathy Checklist-Revised (PCL-R) has been examined extensively in men, but its validity for women remains understudied. Specifically, the correlates of the general construct of psychopathy and its components as assessed by PCL-R total, factor, and facet scores have yet to be examined in depth. Based on previous research…

  6. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

    It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
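
    One simple way to quantify case-mix relatedness between a development and a validation sample, in the spirit of the framework above, is a "membership" model that tries to tell the two samples apart from the case-mix variables: an area under the curve near 0.5 suggests a reproducibility setting, while clearly higher values suggest transportability. The sketch below is illustrative and not necessarily the authors' exact procedure.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        def case_mix_relatedness(X_dev, X_val):
            """AUC of a model distinguishing development from validation records."""
            X = np.vstack([X_dev, X_val])
            y = np.concatenate([np.zeros(len(X_dev)), np.ones(len(X_val))])
            membership = LogisticRegression(max_iter=1000).fit(X, y)
            return roc_auc_score(y, membership.predict_proba(X)[:, 1])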

  7. Shelf-life prediction models for ready-to-eat fresh cut salads: Testing in real cold chain.

    PubMed

    Tsironi, Theofania; Dermesonlouoglou, Efimia; Giannoglou, Marianna; Gogou, Eleni; Katsaros, George; Taoukis, Petros

    2017-01-02

    The aim of the study was to develop and test the applicability of predictive models for shelf-life estimation of ready-to-eat (RTE) fresh cut salads in realistic distribution temperature conditions in the food supply chain. A systematic kinetic study of quality loss of RTE mixed salad (lollo rosso lettuce 40%, lollo verde lettuce 45%, rocket 15%) packed under modified atmospheres (3% O2, 10% CO2, 87% N2) was conducted. Microbial population (total viable count, Pseudomonas spp., lactic acid bacteria), vitamin C, colour and texture were the measured quality parameters. Kinetic models for these indices were developed to determine the quality loss and calculate product remaining shelf-life (SLR). Storage experiments were conducted at isothermal (2.5-15°C) and non-isothermal temperature conditions (Teff = 7.8°C, defined as the constant temperature that results in the same quality value as the variable temperature distribution) for validation purposes. Pseudomonas dominated spoilage, followed by browning and chemical changes. The end of shelf-life correlated with a Pseudomonas spp. level of 8 log(cfu/g) and 20% loss of the initial vitamin C content. The effect of temperature on these quality parameters was expressed by the Arrhenius equation; activation energy (Ea) values were 69.1 and 122.6 kJ/mol for Pseudomonas spp. growth and vitamin C loss rates, respectively. Shelf-life prediction models were also validated in real cold chain conditions (including the stages of transport to and storage at retail distribution center, transport to and display at 7 retail stores, transport to and storage in domestic refrigerators). The quality level and SLR estimated after 2-3 days of domestic storage (time of consumption) ranged between 1 and 8 days at 4°C and was predicted within satisfactory statistical error by the kinetic models. Teff in the cold chain ranged between 3.7 and 8.3°C. Using the validated models, SLR of RTE fresh cut salad can be estimated at any point of the cold chain if the temperature history is known. Shelf-life models of validated applicability can serve as an effective tool for shelf-life assessment and the development of new products in the fresh produce food sector. Copyright © 2016. Published by Elsevier B.V.
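
    The Arrhenius dependence reported above can be turned into an effective-temperature calculation once a temperature history is known. The sketch below uses the study's activation energy for Pseudomonas spp. growth; the reference rate constant and the temperature log are hypothetical placeholders.

        import numpy as np

        R = 8.314          # gas constant, J/(mol*K)
        EA = 69.1e3        # J/mol, Pseudomonas spp. growth (value from the study)
        K_REF = 0.045      # 1/h at the reference temperature (hypothetical)
        T_REF = 277.15     # reference temperature: 4 degC in K

        def rate(T_kelvin):
            """Arrhenius: k(T) = k_ref * exp(-(Ea/R) * (1/T - 1/T_ref))."""
            return K_REF * np.exp(-EA / R * (1.0 / T_kelvin - 1.0 / T_REF))

        def effective_temperature(temps_c):
            """Constant temperature giving the same mean rate as the variable profile."""
            temps_k = np.asarray(temps_c) + 273.15
            mean_rate = rate(temps_k).mean()
            inv_teff = 1.0 / T_REF - (R / EA) * np.log(mean_rate / K_REF)
            return 1.0 / inv_teff - 273.15

        # hypothetical hourly cold-chain temperature log (degC)
        print(f"Teff = {effective_temperature([3.7, 5.0, 8.3, 6.2]):.1f} degC")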

  8. Estimating the fates of organic contaminants in an aquifer using QSAR.

    PubMed

    Lim, Seung Joo; Fox, Peter

    2013-01-01

    The quantitative structure activity relationship (QSAR) model, BIOWIN, was modified to more accurately estimate the fates of organic contaminants in an aquifer. The predictions from BIOWIN were modified to include oxidation and sorption effects, so that the predictive model covered sorption, biodegradation, and oxidation. A total of 35 organic compounds were used to validate the predictive model. The majority of the ratios of predicted half-life to measured half-life were within a factor of 2, and no ratio exceeded a factor of 5. In addition, the accuracy of estimating the persistence of organic compounds in the subsurface was superior when rates were modified by 1/Rf, where Rf is the retardation factor arising from adsorption to the solid phase, than when modified by 1 - fs, the remaining fraction of a given compound not adsorbed to the solid.
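
    A minimal sketch of this kind of sorption adjustment, assuming the standard linear-isotherm retardation factor Rf = 1 + (rho_b/theta)*Kd and first-order biodegradation acting only on the dissolved fraction; all parameter values are illustrative, not the study's.

        import math

        def retardation_factor(bulk_density, porosity, kd):
            """Rf = 1 + (rho_b / theta) * Kd for a linear sorption isotherm."""
            return 1.0 + (bulk_density / porosity) * kd

        def effective_half_life(biodeg_half_life_days, rf):
            """Scaling the biodegradation rate by 1/Rf stretches the half-life by Rf."""
            k_bio = math.log(2.0) / biodeg_half_life_days
            return math.log(2.0) / (k_bio / rf)

        rf = retardation_factor(bulk_density=1.6, porosity=0.35, kd=0.8)  # Kd in L/kg
        print(f"effective half-life: {effective_half_life(30.0, rf):.0f} days")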

  9. Axial geometrical aberration correction up to 5th order with N-SYLC.

    PubMed

    Hoque, Shahedul; Ito, Hiroyuki; Takaoka, Akio; Nishi, Ryuji

    2017-11-01

    We present N-SYLC (N-fold symmetric line currents) models to correct 5th-order axial geometrical aberrations in electron microscopes. In our previous paper, we showed that 3rd-order spherical aberration can be corrected by a 3-SYLC doublet; after that correction, mainly the 5th-order aberrations remain to limit the resolution. In this paper, we extend the doublet to quadruplet models that also include octupole and dodecapole fields to correct these higher-order aberrations without introducing any new unwanted ones. We prove the validity of our models by analytical calculations. By computer simulations, we also show that for a beam energy of 5 keV and an initial angle of 10 mrad at the corrector object plane, a beam size of less than 0.5 nm is achieved at the corrector image plane. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Combinatorial Histone Acetylation Patterns Are Generated by Motif-Specific Reactions.

    PubMed

    Blasi, Thomas; Feller, Christian; Feigelman, Justin; Hasenauer, Jan; Imhof, Axel; Theis, Fabian J; Becker, Peter B; Marr, Carsten

    2016-01-27

    Post-translational modifications (PTMs) are pivotal to cellular information processing, but how combinatorial PTM patterns ("motifs") are set remains elusive. We develop a computational framework, which we provide as open source code, to investigate the design principles generating the combinatorial acetylation patterns on histone H4 in Drosophila melanogaster. We find that models assuming purely unspecific or lysine site-specific acetylation rates were insufficient to explain the experimentally determined motif abundances. Rather, these abundances were best described by an ensemble of models with acetylation rates that were specific to motifs. The model ensemble converged upon four acetylation pathways; we validated three of these using independent data from a systematic enzyme depletion study. Our findings suggest that histone acetylation patterns originate through specific pathways involving motif-specific acetylation activity. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Solidification Sequence of Spray-Formed Steels

    NASA Astrophysics Data System (ADS)

    Zepon, Guilherme; Ellendt, Nils; Uhlenwinkel, Volker; Bolfarini, Claudemiro

    2016-02-01

    Solidification in spray-forming is still an open discussion in the atomization and deposition area. This paper proposes a solidification model based on the equilibrium solidification path of alloys. The main assumptions of the model are that the deposition zone temperature must be above the alloy's solidus temperature and that the equilibrium liquid fraction at this temperature is reached, which involves partial remelting and/or redissolution of completely solidified droplets. When the deposition zone is cooled, solidification of the remaining liquid takes place under near equilibrium conditions. Scanning electron microscopy (SEM) and optical microscopy (OM) were used to analyze the microstructures of two different spray-formed steel grades: (1) boron modified supermartensitic stainless steel (SMSS) and (2) D2 tool steel. The microstructures were analyzed to determine the sequence of phase formation during solidification. In both cases, the solidification model proposed was validated.

  12. PREDICT: A next generation platform for near real-time prediction of cholera

    NASA Astrophysics Data System (ADS)

    Jutla, A.; Aziz, S.; Akanda, A. S.; Alam, M.; Ahsan, G. U.; Huq, A.; Colwell, R. R.

    2017-12-01

    Data on disease prevalence and infectious pathogens are only sparsely collected and available in regions where climatic variability and extreme natural events intersect with population vulnerability (such as lack of access to water and sanitation infrastructure). The traditional time-series approach of calibrating and validating a model is therefore inadequate, and prediction of diarrheal infections (such as cholera and Shigella) remains a challenge even though the disease-causing pathogens are strongly associated with modalities of the regional climate and weather system. Here we present an algorithm that integrates satellite-derived data on several hydroclimatic and ecological processes into a framework that can determine high-resolution cholera risk on global scales. Cholera outbreaks can be classified in three forms: epidemic (sudden or seasonal outbreaks), endemic (recurrence and persistence of the disease for several consecutive years) and mixed-mode endemic (a combination of certain epidemic and endemic conditions), with significant spatial and temporal heterogeneity. Using data from multiple satellites (AVHRR, TRMM, GPM, MODIS, VIIRS, GRACE), we will show examples from Haiti, Yemen, Nepal and several other regions where our algorithm has been successful in capturing the risk of outbreaks of infection in human populations. A spatial model validation algorithm will also be presented that can self-calibrate as new hydroclimatic and disease data become available.

  13. Individual and culture-level components of survey response styles: A multi-level analysis using cultural models of selfhood.

    PubMed

    Smith, Peter B; Vignoles, Vivian L; Becker, Maja; Owe, Ellinor; Easterbrook, Matthew J; Brown, Rupert; Bourguignon, David; Garðarsdóttir, Ragna B; Kreuzbauer, Robert; Cendales Ayala, Boris; Yuki, Masaki; Zhang, Jianxin; Lv, Shaobo; Chobthamkit, Phatthanakit; Jaafar, Jas Laile; Fischer, Ronald; Milfont, Taciano L; Gavreliuc, Alin; Baguma, Peter; Bond, Michael Harris; Martin, Mariana; Gausel, Nicolay; Schwartz, Seth J; Des Rosiers, Sabrina E; Tatarko, Alexander; González, Roberto; Didier, Nicolas; Carrasco, Diego; Lay, Siugmin; Nizharadze, George; Torres, Ana; Camino, Leoncio; Abuhamdeh, Sami; Macapagal, Ma Elizabeth J; Koller, Silvia H; Herman, Ginette; Courtois, Marie; Fritsche, Immo; Espinosa, Agustín; Villamar, Juan A; Regalia, Camillo; Manzi, Claudia; Brambilla, Maria; Zinkeng, Martina; Jalal, Baland; Kusdil, Ersin; Amponsah, Benjamin; Çağlar, Selinay; Mekonnen, Kassahun Habtamu; Möller, Bettina; Zhang, Xiao; Schweiger Gallo, Inge; Prieto Gil, Paula; Lorente Clemares, Raquel; Campara, Gabriella; Aldhafri, Said; Fülöp, Márta; Pyszczynski, Tom; Kesebir, Pelin; Harb, Charles

    2016-12-01

    Variations in acquiescence and extremity pose substantial threats to the validity of cross-cultural research that relies on survey methods. Individual and cultural correlates of response styles when using 2 contrasting types of response mode were investigated, drawing on data from 55 cultural groups across 33 nations. Using 7 dimensions of self-other relatedness that have often been confounded within the broader distinction between independence and interdependence, our analysis yields more specific understandings of both individual- and culture-level variations in response style. When using a Likert-scale response format, acquiescence is strongest among individuals seeing themselves as similar to others, and where cultural models of selfhood favour harmony, similarity with others and receptiveness to influence. However, when using Schwartz's (2007) portrait-comparison response procedure, acquiescence is strongest among individuals seeing themselves as self-reliant but also connected to others, and where cultural models of selfhood favour self-reliance and self-consistency. Extreme responding varies less between the two types of response modes, and is most prevalent among individuals seeing themselves as self-reliant, and in cultures favouring self-reliance. As both types of response mode elicit distinctive styles of response, it remains important to estimate and control for style effects to ensure valid comparisons. © 2016 International Union of Psychological Science.

  14. Uncertainty propagation for statistical impact prediction of space debris

    NASA Astrophysics Data System (ADS)

    Hoogendoorn, R.; Mooij, E.; Geul, J.

    2018-01-01

    Predictions of the impact time and location of space debris in a decaying trajectory are highly influenced by uncertainties. The traditional Monte Carlo (MC) method can be used to perform accurate statistical impact predictions, but requires a large computational effort. A method is investigated that directly propagates a Probability Density Function (PDF) in time, which has the potential to obtain more accurate results with less computational effort. The decaying trajectory of Delta-K rocket stages was used to test the methods using a six degrees-of-freedom state model. The PDF of the state of the body was propagated in time to obtain impact-time distributions. This Direct PDF Propagation (DPP) method results in a multi-dimensional scattered dataset of the PDF of the state, which is highly challenging to process. No accurate results could be obtained, because of the structure of the DPP data and the high dimensionality. Therefore, the DPP method is less suitable for practical uncontrolled entry problems and the traditional MC method remains superior. Additionally, the MC method was used with two improved uncertainty models to obtain impact-time distributions, which were validated using observations of true impacts. For one of the two uncertainty models, statistically more valid impact-time distributions were obtained than in previous research.
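
    A toy version of the traditional MC approach discussed above: sample the uncertain entry-state and model parameters, propagate each sample to ground impact, and collect the impact-time distribution. The two-line "dynamics" here is only a placeholder for the six-degrees-of-freedom model used in the study, and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(42)

        def time_to_impact(ballistic_coeff, density_scale):
            """Placeholder decay model; stands in for full 6-DoF propagation."""
            return 5400.0 * ballistic_coeff / density_scale  # seconds (illustrative)

        n = 10_000
        bc = rng.normal(1.0, 0.05, n)        # ballistic-coefficient uncertainty
        rho = rng.lognormal(0.0, 0.1, n)     # atmospheric-density uncertainty
        impact_times = time_to_impact(bc, rho)

        lo, hi = np.percentile(impact_times, [2.5, 97.5])
        print(f"95% impact-time window: {lo:.0f}-{hi:.0f} s")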

  15. Mesostructural investigation of micron-sized glass particles during shear deformation - An experimental approach vs. DEM simulation

    NASA Astrophysics Data System (ADS)

    Torbahn, Lutz; Weuster, Alexander; Handl, Lisa; Schmidt, Volker; Kwade, Arno; Wolf, Dietrich E.

    2017-06-01

    The interdependency of structure and mechanical features of a cohesive powder packing is a current scientific focus and far from well understood. Although the Discrete Element Method provides a well-applicable and widely used tool to model powder behavior, the non-trivial contact mechanics of micron-sized particles demand a sophisticated contact model. Here, a direct comparison between experiment and simulation at the particle level offers a proper approach for model validation. However, simulating a full-scale shear-tester experiment with micron-sized particles, and hence validating such a simulation, remains a challenge. We address this task by downscaling the experimental setup: a fully functional micro shear-tester was developed and implemented into an X-ray tomography device in order to visualize the sample at the bulk and particle level within small bulk volumes of the order of a few microliters under well-defined consolidation. Using spherical micron-sized particles (30 μm), shear tests with a particle number accessible to simulations can be performed. Moreover, particle-level analysis allows for a direct comparison of experimental and numerical results, e.g., regarding structural evolution. In this talk, we focus on density inhomogeneity and shear-induced heterogeneity during compaction and shear deformation.

  16. Study of the antimicrobial activity of cyclic cation-based ionic liquids via experimental and group contribution QSAR model.

    PubMed

    Ghanem, Ouahid Ben; Shah, Syed Nasir; Lévêque, Jean-Marc; Mutalib, M I Abdul; El-Harbawi, Mohanad; Khan, Amir Sada; Alnarabiji, Mohamad Sahban; Al-Absi, Hamada R H; Ullah, Zahoor

    2018-03-01

    Over the past decades, ionic liquids (ILs) have gained considerable attention from the scientific community because of their versatility and performance in many fields. However, they remain confined mainly to laboratory-scale use; the main barrier hampering their use at larger scale is their questionable ecological toxicity. This study investigated the activity of hydrophobic and hydrophilic cyclic cation-based ILs against four pathogenic bacteria that infect humans. Cations of either aromatic character (imidazolium or pyridinium) or non-aromatic nature (pyrrolidinium or piperidinium) were selected with different alkyl chain lengths and combined with both hydrophilic and hydrophobic anionic moieties. The results clearly demonstrated that introducing the hydrophobic anion bis((trifluoromethyl)sulfonyl)amide, [NTf2], and elongating the cation substituents dramatically affect IL toxicity. The established toxicity data [50% effective concentration (EC50)], along with similar endpoints collected from previous work against Aeromonas hydrophila, were combined to develop a quantitative structure-activity relationship (QSAR) model for toxicity prediction. The model was developed and validated in light of the Organization for Economic Co-operation and Development (OECD) guidelines, producing a good correlation coefficient (R² = 0.904) and a small mean square error (MSE = 0.095). The reliability of the QSAR model was further assessed using k-fold cross-validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
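
    The k-fold check mentioned above can be reproduced generically as below; the descriptor matrix and the log-EC50 surrogate are synthetic stand-ins, not the study's group-contribution descriptors or data.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import KFold, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 5))   # hypothetical group-contribution descriptors
        y = X @ np.array([0.8, -0.4, 0.3, 0.1, -0.2]) + rng.normal(0.0, 0.3, 60)

        cv = KFold(n_splits=5, shuffle=True, random_state=0)
        r2 = cross_val_score(LinearRegression(), X, y, cv=cv, scoring="r2")
        mse = -cross_val_score(LinearRegression(), X, y, cv=cv,
                               scoring="neg_mean_squared_error")
        print(f"mean R2 = {r2.mean():.3f}, mean MSE = {mse.mean():.3f}")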

  17. The WRAIR projectile concussive impact model of mild traumatic brain injury: re-design, testing and preclinical validation.

    PubMed

    Leung, Lai Yee; Larimore, Zachary; Holmes, Larry; Cartagena, Casandra; Mountney, Andrea; Deng-Bryant, Ying; Schmid, Kara; Shear, Deborah; Tortella, Frank

    2014-08-01

    The WRAIR projectile concussive impact (PCI) model was developed for preclinical study of concussion. It represents a truly non-invasive closed-head injury caused by a blunt impact. The original design, however, has several drawbacks that limit the manipulation of injury parameters. The present study describes engineering advancements made to the PCI injury model, including helmet material testing, projectile impact energy/head kinematics and impact location. Material testing indicated that, among the tested materials, 'fiber-glass/carbon' had the lowest elastic modulus and yield stress, providing a relatively high percentage of load transfer from the projectile impact and resulting in significant hippocampal astrocyte activation. Impact energy testing of small projectiles, ranging in shape and size, showed that the steel sphere produced the highest impact energy and the most consistent impact characteristics. Additional tests confirmed the steel sphere produced linear and rotational motions of the rat's head while remaining within a range that meets the criteria for mTBI. Finally, impact location testing showed that PCI targeted at the temporoparietal surface of the rat head produced the most prominent gait abnormalities. Using the parameters defined above, pilot studies were conducted to provide initial validation of the PCI model, demonstrating quantifiable and significant increases in righting reflex recovery time, axonal damage and astrocyte activation following single and multiple concussions.

  18. Stochastic Time Models of Syllable Structure

    PubMed Central

    Shaw, Jason A.; Gafos, Adamantios I.

    2015-01-01

    Drawing on phonology research within the generative linguistics tradition, stochastic methods, and notions from complex systems, we develop a modelling paradigm linking phonological structure, expressed in terms of syllables, to speech movement data acquired with 3D electromagnetic articulography and X-ray microbeam methods. The essential variable in the models is syllable structure. When mapped to discrete coordination topologies, syllabic organization imposes systematic patterns of variability on the temporal dynamics of speech articulation. We simulated these dynamics under different syllabic parses and evaluated simulations against experimental data from Arabic and English, two languages claimed to parse similar strings of segments into different syllabic structures. Model simulations replicated several key experimental results, including the fallibility of past phonetic heuristics for syllable structure, and exposed the range of conditions under which such heuristics remain valid. More importantly, the modelling approach consistently diagnosed syllable structure proving resilient to multiple sources of variability in experimental data including measurement variability, speaker variability, and contextual variability. Prospects for extensions of our modelling paradigm to acoustic data are also discussed. PMID:25996153

  19. Automated verbal credibility assessment of intentions: The model statement technique and predictive modeling

    PubMed Central

    van der Toolen, Yaloe; Vrij, Aldert; Arntz, Arnoud; Verschuere, Bruno

    2018-01-01

    Summary Recently, verbal credibility assessment has been extended to the detection of deceptive intentions, the use of a model statement, and predictive modeling. The current investigation combines these 3 elements to detect deceptive intentions on a large scale. Participants read a model statement and wrote a truthful or deceptive statement about their planned weekend activities (Experiment 1). With the use of linguistic features for machine learning, more than 80% of the participants were classified correctly. Exploratory analyses suggested that liars included more person and location references than truth‐tellers. Experiment 2 examined whether these findings replicated on independent‐sample data. The classification accuracies remained well above chance level but dropped to 63%. Experiment 2 corroborated the finding that liars' statements are richer in location and person references than truth‐tellers' statements. Together, these findings suggest that liars may over‐prepare their statements. Predictive modeling shows promise as an automated veracity assessment approach but needs validation on independent data. PMID:29861544
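
    A generic version of the pipeline described above: extract simple linguistic features from statements and cross-validate a classifier. The toy statements, labels and feature set below are hypothetical; the study's exact features and model are not reproduced here.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        statements = ["We will drive to the coast and meet my sister at noon.",
                      "I plan to stay home and read."] * 30   # toy corpus
        labels = [1, 0] * 30                                  # 1 = deceptive (hypothetical)

        # unigrams and bigrams capture person/location references crudely
        pipeline = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                                 LogisticRegression(max_iter=1000))
        acc = cross_val_score(pipeline, statements, labels, cv=5, scoring="accuracy")
        print(f"cross-validated accuracy: {acc.mean():.2f}")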

  20. Unmanned Aircraft Systems Minimum Operations Performance Standards End-to-End Verification and Validation (E2-V2) Simulation

    NASA Technical Reports Server (NTRS)

    Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.

    2017-01-01

    As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot model similar to one from the Massachusetts Institute of Technology's Lincoln Laboratory (MIT/LL). The resulting simulation provides the following key parameters, among others, to evaluate the effectiveness of the MOPS DAA system: severity of loss of well clear (SLoWC), alert scoring, and number of increasing alerts (alert jitter). The technique, results, and lessons learned from a detailed examination of DAA system performance over specific test vectors and encounter cases during the simulation experiment will be presented in this paper.

  1. 0D/1D modelling of soot particle emissions in aeronautical gas turbines [Modelisation 0D/1D des emissions de particules de suie dans les turbines a gaz aeronautiques]

    NASA Astrophysics Data System (ADS)

    Bisson, Jeremie

    Because of more stringent regulations of aircraft particle emissions as well as strong uncertainties about their formation and their effects on the atmosphere, a better understanding of particle microphysical mechanisms and their interactions with the engine components is required. This thesis focuses on the development of a 0D/1D combustion model with soot production in an aeronautical gas turbine. A major objective of this study is to assess the quality of soot particle emission predictions for different flight configurations. The model should eventually allow performing parametric studies on current or future engines with a minimal computation time. The model represents the combustor as well as the turbines and nozzle with a chemical reactor network (CRN) that is coupled with detailed combustion chemistry for kerosene (Jet A-1) and a soot particle dynamics model using the method of moments. The CRN was applied to the CFM56-2C1 engine for the flight configurations of the LTO cycle (Landing-Take-Off), as in the APEX-1 study on aircraft particle emissions. The model was mainly validated against gas turbine thermodynamic data and pollutant concentrations (H2O, COx, NOx, SOx) measured in the same study. Once this first validation was completed, the model was used to compute the mass- and number-based emission indices of the soot particle population and the average diameter. Overall, the model is representative of the thermodynamic conditions and succeeds in predicting the emissions of major pollutants, particularly at high power. Concerning soot particulate emissions, the model's ability to simultaneously predict the emission indices and the mean diameter was only partially validated: the predicted mass emission indices remained higher than the experimental results, particularly at high power. These differences in the particulate emission index may result from uncertainties in the thermodynamic parameters of the CRN and in the mass air flow distribution in the combustion chamber. The analysis of the number-based emission index profile along the CRN also highlights the need to revise the nucleation model used and to consider, in the future, the implementation of a particle aggregation mechanism.

  2. Revealing the distribution of transmembrane currents along the dendritic tree of a neuron from extracellular recordings

    PubMed Central

    Cserpán, Dorottya; Meszéna, Domokos; Wittner, Lucia; Tóth, Kinga; Ulbert, István; Somogyvári, Zoltán

    2017-01-01

    Revealing the current source distribution along the neuronal membrane is a key step on the way to understanding neural computations; however, the experimental and theoretical tools to achieve sufficient spatiotemporal resolution for the estimation remain to be established. Here, we address this problem using extracellularly recorded potentials with arbitrarily distributed electrodes for a neuron of known morphology. We use simulations of models with varying complexity to validate the proposed method and to give recommendations for experimental applications. The method is applied to in vitro data from rat hippocampus. PMID:29148974

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Amy B.; Boukhalfa, Hakim; Caporuscio, Florie Andre

    To gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that parameters and processes are correctly simulated. The laboratory investigations presented herein aim to address knowledge gaps for heat-generating nuclear waste (HGNW) disposal in bedded salt that remain after examination of prior field and laboratory test data. Primarily, we are interested in better constraining the thermal, hydrological, and physicochemical behavior of brine, water vapor, and salt when moist salt is heated. The target of this work is to use run-of-mine (RoM) salt; however, during FY2015, progress was made using high-purity, granular sodium chloride.

  4. Nutrient chemotaxis suppression of a diffusive instability in bacterial colony dynamics

    NASA Astrophysics Data System (ADS)

    Arouh, Scott; Levine, Herbert

    2000-07-01

    Bacteria grown on a semisolid agar surface have been observed to form branching patterns as the colony envelope propagates outward. The fundamental cause of this instability relates to the need for limited nutrient to diffuse towards the colony. Here, we investigate the effect on this instability of allowing the bacteria to move chemotactically in response to the nutrient gradient. Our results show that this additional effect has a tendency to suppress the instability. Our calculations are done within the context of a simple "cutoff" model of colony dynamics, but presumably remain valid for more complex and hence more realistic approaches.

  5. A combined experimental and finite element analysis method for the estimation of eddy-current loss in NdFeB magnets.

    PubMed

    Fratila, Radu; Benabou, Abdelkader; Tounzi, Abdelmounaïm; Mipo, Jean-Claude

    2014-05-14

    NdFeB permanent magnets (PMs) are widely used in high performance electrical machines, but their relatively high conductivity subjects them to eddy current losses that can lead to magnetization loss. The Finite Element (FE) method is generally used to quantify the eddy current loss of PMs, but it remains quite difficult to validate the accuracy of the results with complex devices. In this paper, an experimental test device is used in order to extract the eddy current losses that are then compared with those of a 3D FE model.

  6. KARHUNEN-LOÈVE Basis Functions of Kolmogorov Turbulence in the Sphere

    NASA Astrophysics Data System (ADS)

    Mathar, Richard J.

    In support of modeling atmospheric turbulence, the statistically independent Karhunen-Loève modes of refractive indices with isotropic Kolmogorov spectrum of the covariance are calculated inside a sphere of fixed radius, rendered as series of 3D Zernike functions. Many of the symmetry arguments of the well-known associated 2D problem for the circular input pupil remain valid. The technique of efficient diagonalization of the eigenvalue problem in wavenumber space is founded on the Fourier representation of the 3D Zernike basis, and extensible to the von Kármán power spectrum.
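
    For orientation, and stated as the standard definitions rather than as the paper's exact derivation: the Karhunen-Loève modes φ_j are the eigenfunctions of the refractive-index covariance B_n over the sphere of radius R, and for Kolmogorov turbulence that covariance is fixed by the familiar power spectral density with structure constant C_n²:

        \int_{|\mathbf{r}'| \le R} B_n(\mathbf{r}, \mathbf{r}')\, \phi_j(\mathbf{r}')\, \mathrm{d}^3 r'
            = \lambda_j\, \phi_j(\mathbf{r}),
        \qquad
        \Phi_n(\kappa) = 0.033\, C_n^2\, \kappa^{-11/3}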

  7. Modelling obesity trends in Australia: unravelling the past and predicting the future.

    PubMed

    Hayes, A J; Lung, T W C; Bauman, A; Howard, K

    2017-01-01

    Modelling is increasingly being used to predict the epidemiology of obesity progression and its consequences. The aims of this study were: (a) to present and validate a model for prediction of obesity among Australian adults and (b) to use the model to project the prevalence of obesity and severe obesity by 2025. Individual-level simulation was combined with survey estimation techniques to model the changing population body mass index (BMI) distribution over time. The model input population was derived from a nationally representative survey in 1995, representing over 12 million adults. Simulations were run for 30 years. The model was validated retrospectively and then used to predict obesity and severe obesity by 2025 among different aged cohorts and at a whole-population level. The changing BMI distribution over time was well predicted by the model, and the projected prevalences of the weight status groups agreed with population-level data in 2008, 2012 and 2014. The model predicts more growth in obesity among younger than older adult cohorts. Projections at a whole-population level were that healthy weight will decline, overweight will remain steady, but obesity and severe obesity prevalence will continue to increase beyond 2016. Adult obesity prevalence was projected to increase from 19% in 1995 to 35% by 2025. Severe obesity (BMI > 35), which was only around 5% in 1995, was projected to reach 13% by 2025, two to three times the 1995 level. The projected rise in obesity and severe obesity will have more substantial cost and healthcare system implications than in previous decades. Having a robust epidemiological model is key to predicting these long-term costs and health outcomes into the future.
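
    The individual-level simulation described above can be caricatured in a few lines: carry a cohort of BMI values forward with an annual drift plus noise, then read off prevalences. The baseline distribution, drift and spread below are invented placeholders, not the study's survey-derived estimates.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000
        bmi = rng.normal(26.0, 4.5, n)     # hypothetical 1995 baseline BMI distribution

        drift, spread = 0.12, 0.25         # hypothetical annual BMI change parameters
        for year in range(1995, 2025):
            bmi += drift + rng.normal(0.0, spread, n)

        print(f"2025: obesity {(bmi >= 30).mean():.1%}, "
              f"severe obesity {(bmi >= 35).mean():.1%}")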

  8. A model for phosphorus transformation and runoff loss for surface-applied manures.

    PubMed

    Vadas, P A; Gburek, W J; Sharpley, A N; Kleinman, P J A; Moore, P A; Cabrera, M L; Harmel, R D

    2007-01-01

    Agricultural P transport in runoff is an environmental concern. An important source of P runoff is surface-applied, unincorporated manures, but computer models used to assess P transport do not adequately simulate P release and transport from surface manures. We developed a model to address this limitation. The model operates on a daily basis and simulates manure application to the soil surface, letting 60% of manure P infiltrate into soil if manure slurry with less than 15% solids is applied. The model divides manure P into four pools: water-extractable inorganic and organic P, and stable inorganic and organic P. The model simulates manure dry matter decomposition, and manure stable P transformation to water-extractable P. Manure dry matter and P are assimilated into soil to simulate bioturbation. Water-extractable P is leached from manure when it rains, and a portion of leached P can be transferred to surface runoff. Eighty percent of manure P leached into soil by rain remains in the top 2 cm, while 20% leaches deeper. This 2-cm soil layer contributes P to runoff via desorption. We used data from field studies in Texas, Pennsylvania, Georgia, and Arkansas to build and validate the model. Validation results show the model accurately predicted cumulative P loads in runoff, reflecting successful simulation of the dynamics of manure dry matter, manure and soil P pools, and storm-event runoff P concentrations. Predicted runoff P concentrations were significantly related to measured concentrations (r² = 0.57), though slightly lower. Our model thus represents an important modification for field or watershed scale models that assess P loss from manured soils.
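
    A minimal daily bookkeeping sketch of the pool structure described above, using the stated 60% infiltration rule, the four P pools, and the 80/20 split of rain-leached P. The pool split on application, rate constants and extraction coefficient are placeholders, not the calibrated values from the study.

        def apply_manure(total_p, solids_pct, pools):
            """Split applied manure P; slurries under 15% solids infiltrate 60%."""
            surface_p = total_p * (0.4 if solids_pct < 15 else 1.0)
            pools["wep_inorganic"]    += 0.5 * surface_p   # illustrative split
            pools["wep_organic"]      += 0.2 * surface_p
            pools["stable_inorganic"] += 0.2 * surface_p
            pools["stable_organic"]   += 0.1 * surface_p

        def daily_step(pools, rain_mm, k_stable=0.01, k_extract=0.05):
            """Transform stable P to water-extractable P, then leach with rain."""
            for src, dst in (("stable_inorganic", "wep_inorganic"),
                             ("stable_organic", "wep_organic")):
                moved = k_stable * pools[src]
                pools[src] -= moved
                pools[dst] += moved
            leached = 0.0
            if rain_mm > 0:
                for pool in ("wep_inorganic", "wep_organic"):
                    release = k_extract * pools[pool]
                    pools[pool] -= release
                    leached += release
            # 80% of leached P stays in the top 2 cm of soil; 20% moves deeper
            return {"topsoil_2cm": 0.8 * leached, "deeper": 0.2 * leached}

        pools = dict.fromkeys(("wep_inorganic", "wep_organic",
                               "stable_inorganic", "stable_organic"), 0.0)
        apply_manure(total_p=40.0, solids_pct=10, pools=pools)   # kg P/ha, hypothetical
        print(daily_step(pools, rain_mm=12.0))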

  9. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    PubMed

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  10. Localization of (photo)respiration and CO2 re-assimilation in tomato leaves investigated with a reaction-diffusion model

    PubMed Central

    Berghuijs, Herman N. C.; Yin, Xinyou; Ho, Q. Tri; Verboven, Pieter; Nicolaï, Bart M.

    2017-01-01

    The rate of photosynthesis depends on the CO2 partial pressure near Rubisco, Cc, which is commonly calculated by models using the overall mesophyll resistance. Such models do not explain the difference between the CO2 level in the intercellular air space and Cc mechanistically. This problem can be overcome by reaction-diffusion models for CO2 transport, production and fixation in leaves. However, most reaction-diffusion models are complex and unattractive for procedures that require a large number of runs, like parameter optimisation. This study provides a simpler reaction-diffusion model. It is parameterized by both leaf physiological and leaf anatomical data. The anatomical data consisted of the thickness of the cell wall, cytosol and stroma, and the area ratios of mesophyll exposed to the intercellular air space to leaf surfaces and exposed chloroplast to exposed mesophyll surfaces. The model was used directly to estimate photosynthetic parameters from a subset of the measured light and CO2 response curves; the remaining data were used for validation. The model predicted light and CO2 response curves reasonably well for 15 days old tomato (cv. Admiro) leaves, if (photo)respiratory CO2 release was assumed to take place in the inner cytosol or in the gaps between the chloroplasts. The model was also used to calculate the fraction of CO2 produced by (photo)respiration that is re-assimilated in the stroma, and this fraction ranged from 56 to 76%. In future research, the model should be further validated to better understand how the re-assimilation of (photo)respired CO2 is affected by environmental conditions and physiological parameters. PMID:28880924

  11. Localization of (photo)respiration and CO2 re-assimilation in tomato leaves investigated with a reaction-diffusion model.

    PubMed

    Berghuijs, Herman N C; Yin, Xinyou; Ho, Q Tri; Retta, Moges A; Verboven, Pieter; Nicolaï, Bart M; Struik, Paul C

    2017-01-01

    The rate of photosynthesis depends on the CO2 partial pressure near Rubisco, Cc, which is commonly calculated by models using the overall mesophyll resistance. Such models do not explain the difference between the CO2 level in the intercellular air space and Cc mechanistically. This problem can be overcome by reaction-diffusion models for CO2 transport, production and fixation in leaves. However, most reaction-diffusion models are complex and unattractive for procedures that require a large number of runs, like parameter optimisation. This study provides a simpler reaction-diffusion model. It is parameterized by both leaf physiological and leaf anatomical data. The anatomical data consisted of the thickness of the cell wall, cytosol and stroma, and the area ratios of mesophyll exposed to the intercellular air space to leaf surfaces and exposed chloroplast to exposed mesophyll surfaces. The model was used directly to estimate photosynthetic parameters from a subset of the measured light and CO2 response curves; the remaining data were used for validation. The model predicted light and CO2 response curves reasonably well for 15 days old tomato (cv. Admiro) leaves, if (photo)respiratory CO2 release was assumed to take place in the inner cytosol or in the gaps between the chloroplasts. The model was also used to calculate the fraction of CO2 produced by (photo)respiration that is re-assimilated in the stroma, and this fraction ranged from 56 to 76%. In future research, the model should be further validated to better understand how the re-assimilation of (photo)respired CO2 is affected by environmental conditions and physiological parameters.
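
    To make the reaction-diffusion idea of the two records above concrete, here is a minimal steady-state 1D sketch: CO2 diffuses along a liquid path toward the stroma while a uniform (photo)respiratory source releases CO2 along the way, and a first-order fixation sink acts at the stroma boundary. The geometry, diffusivity, source and sink values are placeholders, not the parameters fitted in the study.

        import numpy as np

        n, L = 50, 1.0e-6      # grid nodes; liquid path length (m), placeholder
        D = 1.5e-9             # CO2 diffusivity in liquid (m^2/s), approximate
        r_src = 5.0e-3         # (photo)respiratory source (mol m^-3 s^-1), placeholder
        k_fix = 1.0e-4         # fixation velocity at the stroma (m/s), placeholder
        c_i = 0.012            # CO2 at the intercellular boundary (mol/m^3), placeholder

        dx = L / (n - 1)
        A = np.zeros((n, n))
        b = np.zeros(n)
        A[0, 0], b[0] = 1.0, c_i                 # Dirichlet: intercellular air space
        for i in range(1, n - 1):                # interior nodes: D*c'' + r = 0
            A[i, i - 1] = D / dx**2
            A[i, i]     = -2.0 * D / dx**2
            A[i, i + 1] = D / dx**2
            b[i] = -r_src
        A[n - 1, n - 2] = D / dx                 # flux balance at the stroma:
        A[n - 1, n - 1] = -(D / dx + k_fix)      # D*(c[n-2]-c[n-1])/dx = k_fix*c[n-1]

        c = np.linalg.solve(A, b)
        print(f"Cc at the stroma: {c[-1]:.4f} mol/m^3")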

  12. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  13. The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance.

    PubMed

    Kepes, Sven; McDaniel, Michael A

    2015-01-01

    Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance. To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses. Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations. The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation.

  14. The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance

    PubMed Central

    2015-01-01

    Introduction Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance. Methods To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses. Results Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations. Conclusion The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation. PMID:26517553

  15. Towards a viscoelastic model for the unfused midpalatal suture: development and validation using the midsagittal suture in New Zealand white rabbits.

    PubMed

    Romanyk, D L; Liu, S S; Lipsett, M G; Toogood, R W; Lagravère, M O; Major, P W; Carey, J P

    2013-06-21

    Maxillary expansion treatment is a commonly used procedure by orthodontists to widen a patient's upper jaw. As this is typically performed in adolescent patients, the midpalatal suture, the connective tissue adjoining the two maxilla halves, remains unfused. Studies that have investigated patient response to expansion treatment, generally through finite element analysis, have considered this suture to behave in a linear elastic manner or have left it vacant. The purpose of the study presented here was to develop a model that could represent the midpalatal suture's viscoelastic behavior. Quasilinear viscoelastic, modified superposition, Schapery's, and Burgers modeling approaches were all considered. Raw data from a previously published study using New Zealand White Rabbits was utilized for model parameter estimation and validation. In this study, Sentalloy(®) coil springs at load levels of 0.49N (50g), 0.98N (100g), and 1.96N (200g) were used to widen the midsagittal suture of live rabbits over a period of 6 weeks. Evaluation was based on a model's ability to represent experimental data well over all three load sets. Ideally, a single set of model constants could be used to represent data over all loads tested. Upon completion of the analysis it was found that the modified superposition method was able to replicate experimental data within one standard deviation of the means using a single set of constants for all loads. Future work should focus on model improvement as well as prediction of treatment outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.
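
    For reference, one of the candidate forms considered, the four-parameter Burgers model, has the standard closed-form creep response below (a textbook result, not transcribed from the paper), where E1 and η1 are the Maxwell spring and dashpot and E2 and η2 the Kelvin-Voigt element:

        \varepsilon(t) = \sigma_0 \left[ \frac{1}{E_1} + \frac{t}{\eta_1}
            + \frac{1}{E_2} \left( 1 - e^{-E_2 t / \eta_2} \right) \right]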

  16. An assessment of the trophic structure of the Bay of Biscay continental shelf food web: Comparing estimates derived from an ecosystem model and isotopic data

    NASA Astrophysics Data System (ADS)

    Lassalle, G.; Chouvelon, T.; Bustamante, P.; Niquil, N.

    2014-01-01

    Comparing outputs of ecosystem models with estimates derived from experimental and observational approaches is important in creating valuable feedback for model construction, analyses and validation. Stable isotopes and mass-balanced trophic models are well-known and widely used as approximations to describe the structure of food webs, but their consistency has not been properly established as attempts to compare these methods remain scarce. Model construction is a data-consuming step, meaning independent sets for validation are rare. Trophic linkages in the French continental shelf of the Bay of Biscay food webs were recently investigated using both methodologies. Trophic levels for mono-specific compartments representing small pelagic fish and marine mammals and multi-species functional groups corresponding to demersal fish and cephalopods, derived from modelling, were compared with trophic levels calculated from independent carbon and nitrogen isotope ratios. Estimates of the trophic niche width of those species, or groups of species, were compared between these two approaches as well. A significant and close-to-one positive (Spearman r² = 0.72, n = 16, p < 0.0001) correlation was found between trophic levels estimated by Ecopath modelling and those derived from isotopic signatures. Differences between estimates were particularly low for mono-specific compartments. No clear relationship existed between indices of trophic niche width derived from both methods. Given the wide recognition of trophic levels as a useful concept in ecosystem-based fisheries management, propositions were made to further combine these two approaches.
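
    The isotopic trophic levels referred to above are conventionally computed from nitrogen isotope ratios with the standard formula below, where λ is the trophic level of the baseline organism; the usual literature value of roughly 3.4 per mil enrichment per trophic level is shown, which is not necessarily the exact value used in this study:

        \mathrm{TL} = \lambda
            + \frac{\delta^{15}\mathrm{N}_{\text{consumer}} - \delta^{15}\mathrm{N}_{\text{baseline}}}{\Delta^{15}\mathrm{N}},
        \qquad \Delta^{15}\mathrm{N} \approx 3.4\ \text{per mil per trophic level}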

  17. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers.

    PubMed

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-10-31

    To evaluate the performance of a broad-scope, model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer, a set of 70 patients previously treated in two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients, from the same institutions and from another clinic that did not provide patients for the training phase. Comparison of the automated plans was done against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3%) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5%) did the reference plans pass the criteria while the model-based plans failed. In 5.3% of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad-scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application in clinical practice.

  18. Using a web-based, iterative education model to enhance clinical clerkships.

    PubMed

    Alexander, Erik K; Bloom, Nurit; Falchuk, Kenneth H; Parker, Michael

    2006-10-01

    Although most clinical clerkship curricula are designed to provide all students consistent exposure to defined course objectives, it is clear that individual students are diverse in their backgrounds and baseline knowledge. Ideally, the learning process should be individualized towards the strengths and weakness of each student, but, until recently, this has proved prohibitively time-consuming. The authors describe a program to develop and evaluate an iterative, Web-based educational model assessing medical students' knowledge deficits and allowing targeted teaching shortly after their identification. Beginning in 2002, a new educational model was created, validated, and applied in a prospective fashion to medical students during an internal medicine clerkship at Harvard Medical School. Using a Web-based platform, five validated questions were delivered weekly and a specific knowledge deficiency identified. Teaching targeted to the deficiency was provided to an intervention cohort of five to seven students in each clerkship, though not to controls (the remaining 7-10 students). Effectiveness of this model was assessed by performance on the following week's posttest question. Specific deficiencies were readily identified weekly using this model. Throughout the year, however, deficiencies varied unpredictably. Teaching targeted to deficiencies resulted in significantly better performance on follow-up questioning compared to the performance of those who did not receive this intervention. This model was easily applied in an additive fashion to the current curriculum, and student acceptance was high. The authors conclude that a Web-based, iterative assessment model can effectively target specific curricular needs unique to each group; focus teaching in a rapid, formative, and highly efficient manner; and may improve the efficiency of traditional clerkship teaching.

  19. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods are weak in completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on the signed directed graph (SDG) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference, and the multiple-scale validation is carried out by comparing these testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
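
    A minimal sketch in Python of the positive-inference step, assuming an SDG in which nodes are process variables and signed edges propagate a qualitative deviation (+1/-1); the node names, edges, and simulated outputs are invented for illustration:

      # Minimal signed-directed-graph (SDG) sketch: propagate a qualitative
      # deviation through signed edges to generate an expected trend pattern,
      # then compare it against the signs of the simulation output.
      edges = {  # source -> [(target, sign)]
          "feed_rate":    [("temperature", +1)],
          "temperature":  [("pressure", +1), ("conversion", +1)],
          "cooling_flow": [("temperature", -1)],
      }

      def propagate(root, deviation):
          """Positive inference: breadth-first propagation of a +1/-1 deviation."""
          trends, frontier = {root: deviation}, [root]
          while frontier:
              node = frontier.pop(0)
              for target, sign in edges.get(node, []):
                  if target not in trends:          # keep the first derived trend
                      trends[target] = trends[node] * sign
                      frontier.append(target)
          return trends

      scenario = propagate("feed_rate", +1)   # one complete testing scenario
      simulated = {"temperature": +1, "pressure": +1, "conversion": +1}
      consistent = all(scenario.get(k) == v for k, v in simulated.items())
      print(scenario, "consistent with simulation:", consistent)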

  20. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach that provides "instant experience" in reading a graphical model validation plot, making it easier to judge whether any of the underlying assumptions are violated.
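
    One plausible reading of such "instant experience" training is simulation-based: show the observed residual plot next to residual plots generated from data simulated under the fitted model, so the viewer learns what a plot looks like when the assumptions hold. A hedged Python sketch under that assumption:

      # Place the real residual plot beside residual plots from data simulated
      # under the fitted model; the data and model here are invented.
      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(1)
      x = rng.uniform(0, 10, 80)
      y = 2 + 0.5 * x + rng.normal(scale=np.exp(x / 10), size=80)  # heteroscedastic

      beta = np.polyfit(x, y, 1)
      fitted = np.polyval(beta, x)
      resid = y - fitted
      sigma = resid.std(ddof=2)

      fig, axes = plt.subplots(1, 4, figsize=(12, 3), sharey=True)
      axes[0].scatter(fitted, resid, s=10)
      axes[0].set_title("observed data")
      for ax in axes[1:]:                      # residuals when the model is true
          sim = fitted + rng.normal(scale=sigma, size=len(x))
          b = np.polyfit(x, sim, 1)
          ax.scatter(np.polyval(b, x), sim - np.polyval(b, x), s=10)
          ax.set_title("simulated under model")
      plt.show()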

  1. Atomistic kinetic Monte Carlo study of atomic layer deposition derived from density functional theory.

    PubMed

    Shirazi, Mahdi; Elliott, Simon D

    2014-01-30

    To describe the atomic layer deposition (ALD) reactions of HfO2 from Hf(N(CH3)2)4 and H2O, a three-dimensional on-lattice kinetic Monte Carlo model is developed. In this model, all atomistic reaction pathways from density functional theory (DFT) are implemented as reaction events on the lattice. This covers all steps: the early-stage adsorption of each ALD precursor, the kinetics of surface protons, interactions between remaining precursors (steric effect), the influence of remaining fragments on adsorption sites (blocking), densification of each ALD precursor, migration of each ALD precursor, and cooperation between remaining precursors to adsorb H2O (cooperative effect). The essential chemistry of the ALD reactions depends on the local environment at the surface; the coordination number and a neighbor list are used to implement these dependencies. The validity and necessity of the proposed reaction pathways are statistically established at the mesoscale. The formation of one monolayer of precursor fragments is shown at the end of the metal pulse. Adsorption and dissociation of the H2O precursor onto that layer is described, leading to the delivery of oxygen and protons to the surface during the H2O pulse. Through these processes, the remaining precursor fragments desorb from the surface, leaving the surface with bulk-like, OH-terminated HfO2, ready for the next cycle. The migration of low-coordinated remaining precursor fragments is also proposed; this process introduces a slow reordering motion (crawling) at the mesoscale, leading to the smooth, conformal thin film that is characteristic of ALD. Copyright © 2013 Wiley Periodicals, Inc.
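
    The algorithmic core of such an on-lattice model is a rejection-free kinetic Monte Carlo step: pick the next event with probability proportional to its rate and advance the clock by an exponential waiting time. A minimal Python sketch with placeholder events and rates (the paper's rates come from DFT):

      # Minimal rejection-free kinetic Monte Carlo step (Gillespie/BKL style).
      # The event list and rates are placeholders, not the DFT-derived values.
      import math, random

      def kmc_step(events, t):
          """events: list of (name, rate); returns (chosen event, new time)."""
          total = sum(rate for _, rate in events)
          r = random.random() * total
          acc = 0.0
          for name, rate in events:
              acc += rate
              if r < acc:
                  chosen = name
                  break
          t += -math.log(random.random()) / total   # exponential waiting time
          return chosen, t

      events = [("adsorb_Hf_precursor", 5.0), ("proton_transfer", 20.0),
                ("densification", 1.0), ("migration", 0.5)]
      t = 0.0
      for _ in range(5):
          ev, t = kmc_step(events, t)
          print(f"t = {t:.3f}: {ev}")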

  2. Differential Decomposition Among Pig, Rabbit, and Human Remains.

    PubMed

    Dautartas, Angela; Kenyhercz, Michael W; Vidoli, Giovanna M; Meadows Jantz, Lee; Mundorff, Amy; Steadman, Dawnie Wolfe

    2018-03-30

    While nonhuman animal remains are often utilized in forensic research to develop methods to estimate the postmortem interval, systematic studies that directly validate animals as proxies for human decomposition are lacking. The current project compared decomposition rates among pigs, rabbits, and humans at the University of Tennessee's Anthropology Research Facility across three seasonal trials that spanned nearly 2 years. The Total Body Score (TBS) method was applied to quantify decomposition changes and calculate the postmortem interval (PMI) in accumulated degree days (ADD). Decomposition trajectories were analyzed by comparing the estimated and actual ADD for each seasonal trial and by fuzzy cluster analysis. The cluster analysis demonstrated that the rabbits formed one group while pigs and humans, although more similar to each other than either to rabbits, still showed important differences in decomposition patterns. The decomposition trends show that neither nonhuman model captured the pattern, rate, and variability of human decomposition. © 2018 American Academy of Forensic Sciences.
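
    Accumulated degree days are conventionally computed as the running sum of daily mean temperatures above a base temperature (0 °C is the usual choice in TBS-based work, assumed here). A minimal Python sketch with invented temperatures:

      # ADD: running sum of daily mean temperatures above a 0 °C base.
      # The base temperature and sample data are assumptions for illustration.
      def accumulated_degree_days(daily_mean_temps_c, base=0.0):
          add, total = [], 0.0
          for t in daily_mean_temps_c:
              total += max(t - base, 0.0)
              add.append(total)
          return add

      print(accumulated_degree_days([12.5, 15.0, -2.0, 8.0]))  # [12.5, 27.5, 27.5, 35.5]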

  3. Universal Non-Debye Scaling in the Density of States of Amorphous Solids.

    PubMed

    Charbonneau, Patrick; Corwin, Eric I; Parisi, Giorgio; Poncet, Alexis; Zamponi, Francesco

    2016-07-22

    At the jamming transition, amorphous packings are known to display anomalous vibrational modes with a density of states (DOS) that remains constant at low frequency. The scaling of the DOS at higher packing fractions remains, however, unclear. One might expect to find a simple Debye scaling, but recent results from effective medium theory and the exact solution of mean-field models both predict an anomalous, non-Debye scaling. Being mean-field in nature, however, these solutions are only strictly valid in the limit of infinite spatial dimension, and it is unclear what value they have for finite-dimensional systems. Here, we study packings of soft spheres in dimensions 3 through 7 and find, away from jamming, a universal non-Debye scaling of the DOS that is consistent with the mean-field predictions. We also consider how the soft mode participation ratio evolves as dimension increases.

  4. Actor groups, related needs, and challenges at the climate downscaling interface

    NASA Astrophysics Data System (ADS)

    Rössler, Ole; Benestad, Rasmus; Diamando, Vlachogannis; Heike, Hübener; Kanamaru, Hideki; Pagé, Christian; Margarida Cardoso, Rita; Soares, Pedro; Maraun, Douglas; Kreienkamp, Frank; Christodoulides, Paul; Fischer, Andreas; Szabo, Peter

    2016-04-01

    At the climate downscaling interface, numerous downscaling techniques and philosophies compete to be the best method, each on its own terms. It thus remains unclear to what extent, and for which purposes, these downscaling techniques are valid or even the most appropriate choice. A common validation framework that compares all the available methods has been missing so far; the VALUE initiative closes this gap. An essential part of a validation framework for downscaling techniques is the definition of appropriate validation measures, and their selection should consider the needs of the stakeholders: some might need a temporal or spatial average of a certain variable, others temporal or spatial distributions of some variables, still others extremes of the variables of interest or even inter-variable dependencies. A close interaction between climate data providers and climate data users is therefore necessary, and the challenge of formulating a common validation framework mirrors the challenges between the climate data providers and the impact-assessment community. This poster elaborates the issues and challenges at the downscaling interface as seen within the VALUE community. It distinguishes three actor groups: one consisting of the climate data providers, the other two being climate data users (impact modellers and societal users); the downscaling interface thus faces classical transdisciplinary challenges. We depict a graphical illustration of the actors involved and their interactions. In addition, we identify four types of issues that need to be considered: data-based, knowledge-based, communication-based, and structural. All of these may, individually or jointly, hinder an optimal exchange of data and information between the actor groups at the downscaling interface. Finally, some possible ways to tackle these issues are discussed.

  5. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments.

    PubMed

    Munkácsy, Gyöngyi; Sztupinszki, Zsófia; Herman, Péter; Bán, Bence; Pénzváltó, Zsófia; Szarvas, Nóra; Győrffy, Balázs

    2016-09-27

    No independent cross-validation of the success rate of studies utilizing small interfering RNA (siRNA) for gene silencing has previously been completed. Assessing the influence of experimental parameters such as cell line, transfection technique, validation method, and type of control requires validation across a large set of studies. We utilized gene chip data published for siRNA experiments to assess success rates and to compare the methods used in these experiments. We searched NCBI GEO for samples with whole-transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether, 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene remained above 0.7 in 18.5% of experiments and above 0.5 in 38.7%. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E-06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E-04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and the validation method had the highest influence on silencing efficiency.
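
    A minimal Python sketch of the evaluation logic, with invented expression values: the per-experiment fold change of the target gene and a Wilcoxon signed-rank test on paired before/after measurements:

      # Fold change of the target gene after knockdown plus a paired,
      # nonparametric test; values are invented, not from GEO.
      import numpy as np
      from scipy.stats import wilcoxon

      before = np.array([210.0, 180.0, 95.0, 300.0, 150.0, 220.0])
      after  = np.array([ 70.0,  95.0, 60.0, 110.0, 120.0,  80.0])

      fold_change = after / before            # FC < 0.5 ~ efficient silencing
      stat, p = wilcoxon(after, before)       # Wilcoxon signed-rank test
      print("median FC:", np.median(fold_change).round(2), " p =", p)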

  6. The Chinese version of the Severe Respiratory Insufficiency questionnaire for patients with chronic hypercapnic chronic obstructive pulmonary disease receiving non-invasive positive pressure ventilation.

    PubMed

    Chen, Rongchang; Guan, Lili; Wu, Weiliang; Yang, Zhicong; Li, Xiaoying; Luo, Qun; Liang, Zhenyu; Wang, Fengyan; Guo, Bingpeng; Huo, Yating; Yang, Yuqiong; Zhou, Luqian

    2017-08-28

    The Severe Respiratory Insufficiency (SRI) questionnaire is the best assessment tool for health-related quality of life in patients with chronic obstructive pulmonary disease (COPD) receiving non-invasive positive pressure ventilation (NIPPV). This study aimed to translate the SRI Questionnaire into Chinese and to validate it. Prospective validation study. A total of 149 participants with chronic hypercapnic COPD receiving NIPPV completed the study. The SRI questionnaire was translated into Chinese using translation and back-translation. Reliability was gauged using Cronbach's α coefficient. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were used to assess construct validity. Content validity was confirmed by evaluating the relationship between the score of each item and the total score of the relevant subscale. Cronbach's α coefficients for each subscale and summary scale were above 0.7. Using EFA, one factor was extracted from the anxiety and summary scales and two factors were extracted from the remaining six subscales. Based on the EFA results, subsequent CFA revealed a good model fit for each subscale, but the extracted factors of each subscale were correlated. Content validity was confirmed by the good relationship between the score of each item and the total score of the relevant subscale. The Chinese version of the SRI questionnaire is valid and reliable for patients with chronic hypercapnic COPD receiving NIPPV in China. NCT02499718. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
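
    Cronbach's α for a subscale follows directly from the item variances and the variance of the summed score. A minimal Python sketch with an invented items-by-respondents matrix:

      # Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)).
      # The 5-item, 6-respondent data below are invented.
      import numpy as np

      def cronbach_alpha(items):              # rows = respondents, cols = items
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      scores = np.array([[3, 4, 3, 4, 3],
                         [2, 2, 3, 2, 2],
                         [5, 4, 5, 5, 4],
                         [1, 2, 1, 2, 1],
                         [4, 4, 4, 3, 4],
                         [3, 3, 2, 3, 3]])
      print(f"alpha = {cronbach_alpha(scores):.2f}")   # > 0.7 is the usual bar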

  7. Longitudinal evaluation of Patient Reported Outcomes Measurement Information Systems (PROMIS) measures in pediatric chronic pain

    PubMed Central

    Kashikar-Zuck, Susmita; Carle, Adam; Barnett, Kimberly; Goldschneider, Kenneth R.; Sherry, David D.; Mara, Constance A.; Cunningham, Natoshia; Farrell, Jennifer; Tress, Jenna; DeWitt, Esi Morgan

    2015-01-01

    The Patient Reported Outcomes Measurement Information System (PROMIS) initiative is a comprehensive strategy by the National Institutes of Health to support the development and validation of precise instruments to assess self-reported health domains across healthy and disease-specific populations. Much progress has been made in instrument development but there remains a gap in the validation of PROMIS measures for pediatric chronic pain. The purpose of this study was to investigate the construct validity and responsiveness to change of seven PROMIS domains for the assessment of children (ages 8-18) with chronic pain – Pain Interference, Fatigue, Anxiety, Depression, Mobility, Upper Extremity Function and Peer Relationships. PROMIS measures were administered at the initial visit and two follow-up visits at an outpatient chronic pain clinic (CPC; N=82) and at an intensive amplified pain day-treatment program (AMP; N= 63). Aim 1 examined construct validity of PROMIS measures by comparing them with corresponding “legacy” measures administered as part of usual care in the CPC sample. Aim 2 examined sensitivity to change in both CPC and AMP samples. Longitudinal growth models showed that PROMIS Pain Interference, Anxiety, Depression, Mobility, Upper Extremity and Peer Relationship measures and legacy instruments generally performed similarly with slightly steeper slopes of improvement in legacy measures. All seven PROMIS domains showed responsiveness to change. Results offered initial support for the validity of PROMIS measures in pediatric chronic pain. Further validation with larger and more diverse pediatric pain samples and additional legacy measures would broaden the scope of use of PROMIS in clinical research. PMID:26447704

  8. Aspiration of human neutrophils: effects of shear thinning and cortical dissipation.

    PubMed

    Drury, J L; Dembo, M

    2001-12-01

    It is generally accepted that the human neutrophil can be mechanically represented as a droplet of polymeric fluid enclosed by a thin, slippery, viscoelastic cortex. Many questions remain, however, about the detailed rheology and chemistry of the interior fluid and the cortex. To address these quantitative issues, we have used a finite element method to simulate the dynamics of neutrophils during micropipet aspiration under various plausible assumptions. The results were then systematically compared with aspiration experiments conducted at eight different combinations of pipet size and pressure. Models in which the cytoplasm was represented by a simple Newtonian fluid (i.e., models without shear thinning) were grossly incapable of accounting for the effects of pressure on the general time scale of neutrophil aspiration. Likewise, models in which the cortex was purely elastic (i.e., models without surface viscosity) were unable to explain the effects of pipet size on the general aspiration rate. Such models also failed to explain the rapid acceleration of the aspiration rate during the final phase of aspiration, nor could they account for the geometry of the neutrophil during the various phases of aspiration. Thus, our results indicate that a minimal mechanical model of the neutrophil needs to incorporate both shear thinning and surface viscosity to remain valid over a reasonable range of conditions. At low shear rates, the surface dilatation viscosity of the neutrophil was found to be on the order of 100 poise-cm, whereas the viscosity of the interior cytoplasm was on the order of 1000 poise. Both the surface viscosity and the interior viscosity seem to decrease in a similar fashion when the shear rate exceeds approximately 0.05 s(-1). Unfortunately, even the models studied that include both surface viscosity and shear thinning are not sufficient to fully explain all the features of neutrophil aspiration; in particular, the very high rate of aspiration during the initial moments after ramping of pressure remains mysterious.
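
    The reported behaviour, a constant viscosity at low shear rates that falls off above roughly 0.05 s(-1), is the signature of a shear-thinning constitutive law. A Python sketch of one such law (a Cross-type form, which is our assumption; the paper's exact law may differ):

      # Cross-type shear-thinning law: constant at low shear rates, thinning
      # above a critical rate. Parameter values echo the abstract's orders of
      # magnitude but the functional form is an assumption.
      def apparent_viscosity(shear_rate, mu0=1000.0, rate_c=0.05, n=0.5):
          """mu0: zero-shear viscosity (poise); thins for shear_rate >> rate_c."""
          return mu0 / (1.0 + (shear_rate / rate_c) ** n)

      for rate in (0.001, 0.05, 0.5, 5.0):
          print(f"shear rate {rate:6.3f} 1/s -> viscosity {apparent_viscosity(rate):7.1f} P")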

  9. Aspiration of human neutrophils: effects of shear thinning and cortical dissipation.

    PubMed Central

    Drury, J L; Dembo, M

    2001-01-01

    It is generally accepted that the human neutrophil can be mechanically represented as a droplet of polymeric fluid enclosed by a thin, slippery, viscoelastic cortex. Many questions remain, however, about the detailed rheology and chemistry of the interior fluid and the cortex. To address these quantitative issues, we have used a finite element method to simulate the dynamics of neutrophils during micropipet aspiration under various plausible assumptions. The results were then systematically compared with aspiration experiments conducted at eight different combinations of pipet size and pressure. Models in which the cytoplasm was represented by a simple Newtonian fluid (i.e., models without shear thinning) were grossly incapable of accounting for the effects of pressure on the general time scale of neutrophil aspiration. Likewise, models in which the cortex was purely elastic (i.e., models without surface viscosity) were unable to explain the effects of pipet size on the general aspiration rate. Such models also failed to explain the rapid acceleration of the aspiration rate during the final phase of aspiration, nor could they account for the geometry of the neutrophil during the various phases of aspiration. Thus, our results indicate that a minimal mechanical model of the neutrophil needs to incorporate both shear thinning and surface viscosity to remain valid over a reasonable range of conditions. At low shear rates, the surface dilatation viscosity of the neutrophil was found to be on the order of 100 poise-cm, whereas the viscosity of the interior cytoplasm was on the order of 1000 poise. Both the surface viscosity and the interior viscosity seem to decrease in a similar fashion when the shear rate exceeds approximately 0.05 s(-1). Unfortunately, even the models studied that include both surface viscosity and shear thinning are not sufficient to fully explain all the features of neutrophil aspiration; in particular, the very high rate of aspiration during the initial moments after ramping of pressure remains mysterious. PMID:11720983

  10. The Complexity of Biomechanics Causing Primary Blast-Induced Traumatic Brain Injury: A Review of Potential Mechanisms

    PubMed Central

    Courtney, Amy; Courtney, Michael

    2015-01-01

    Primary blast-induced traumatic brain injury (bTBI) is a prevalent battlefield injury in recent conflicts, yet the biomechanical mechanisms of bTBI remain unclear. Elucidating specific biomechanical mechanisms is essential to developing animal models for testing candidate therapies and for improving protective equipment. Three hypothetical mechanisms of primary bTBI have received the most attention. Because translational and rotational head accelerations are primary contributors to TBI from non-penetrating blunt force head trauma, the acceleration hypothesis suggests that blast-induced head accelerations may cause bTBI. The hypothesis of direct cranial transmission suggests that a pressure transient traverses the skull into the brain and directly injures brain tissue. The thoracic hypothesis of bTBI suggests that some combination of a pressure transient reaching the brain via the thorax and a vagally mediated reflex results in bTBI. These three mechanisms may not be mutually exclusive, and quantifying exposure thresholds (for blasts of a given duration) is essential for determining which mechanisms may contribute at a given level of blast exposure. Progress has been hindered by experimental designs that do not effectively expose animal models to a single mechanism and by over-reliance on poorly validated computational models. The path forward should be predictive validation of computational models, quantitatively confirmed by blast experiments in animal models, human cadavers, and biofidelic human surrogates over a range of relevant blast magnitudes and durations, coupled with experimental designs that isolate a single injury mechanism. PMID:26539158

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doherty, K.E.; Naugle, D.E.; Walker, B.L.

    Recent energy development has resulted in rapid and large-scale changes to western shrub-steppe ecosystems without a complete understanding of its potential impacts on wildlife populations. We modeled winter habitat use by female greater sage-grouse (Centrocercus urophasianus) in the Powder River Basin (PRB) of Wyoming and Montana, USA, to 1) identify landscape features that influenced sage-grouse habitat selection, 2) assess the scale at which selection occurred, 3) spatially depict winter habitat quality in a Geographic Information System, and 4) assess the effect of coal-bed natural gas (CBNG) development on winter habitat selection. We developed a model of winter habitat selection based on 435 aerial relocations of 200 radiomarked female sage-grouse obtained during the winters of 2005 and 2006. Percent sagebrush (Artemisia spp.) cover on the landscape was an important predictor of use by sage-grouse in winter. Sage-grouse were 1.3 times more likely to occupy sagebrush habitats that lacked CBNG wells within a 4-km² area, compared to those that had the maximum density of 12.3 wells per 4 km² allowed on federal lands. We validated the model with 74 locations from 74 radiomarked individuals obtained during the winters of 2004 and 2007. This winter habitat model based on vegetation, topography, and CBNG avoidance was highly predictive (validation R² = 0.984). Our spatially explicit model can be used to identify areas that provide the best remaining habitat for wintering sage-grouse in the PRB to mitigate impacts of energy development.
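
    Winter habitat selection models of this kind are typically logistic regressions of used versus available locations on landscape covariates. A hedged Python sketch with simulated covariates; the study's actual model form and coefficients are not reproduced here:

      # Resource-selection-style logistic regression on simulated covariates.
      # Covariate names and data are hypothetical placeholders.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 500
      sagebrush_pct = rng.uniform(0, 60, n)
      wells_per_4km2 = rng.integers(0, 13, n)
      logit = -2.0 + 0.08 * sagebrush_pct - 0.15 * wells_per_4km2
      used = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = used, 0 = available

      X = np.column_stack([sagebrush_pct, wells_per_4km2])
      model = LogisticRegression().fit(X, used)
      print(dict(zip(["sagebrush", "well_density"], model.coef_[0].round(3))))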

  12. Do time-invariant confounders explain away the association between job stress and workers' mental health? Evidence from Japanese occupational panel data.

    PubMed

    Oshio, Takashi; Tsutsumi, Akizumi; Inoue, Akiomi

    2015-02-01

    It is well known that job stress is negatively related to workers' mental health, but most recent studies have not controlled for unobserved time-invariant confounders. In the current study, we attempted to validate previous observations on the association between job stress and workers' mental health by removing the effects of unobserved time-invariant confounders. We used data from three to four waves of a Japanese occupational cohort survey, focusing on 31,382 observations of 9,741 individuals who participated in at least two consecutive waves. We estimated mean-centered fixed effects models to explain psychological distress, measured by Kessler 6 (K6) scores (range: 0-24), in terms of eight job stress indicators related to the job demands-control, effort-reward imbalance, and organizational injustice models. Mean-centered fixed effects models reduced the magnitude of the association between job stress and K6 scores to 44.8-54.2% of that observed with pooled ordinary least squares. However, the association remained highly significant for all job stress indicators even after controlling for unobserved time-invariant confounders, and alternatively specified models showed the robustness of the results. In all, we concluded that the validity of major job stress models, which link job stress to workers' mental health, is robust, although unobserved time-invariant confounders led to an overestimation of the association. Copyright © 2014 Elsevier Ltd. All rights reserved.
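
    Mean-centering removes anything constant within a worker: subtract each worker's own means from both outcome and exposure and regress the deviations. A minimal Python sketch with invented panel data:

      # Within (mean-centered fixed effects) estimator: demean per worker,
      # then regress deviations on deviations. Variable names are illustrative.
      import pandas as pd

      df = pd.DataFrame({
          "worker": [1, 1, 1, 2, 2, 2, 3, 3, 3],
          "job_demand": [3.0, 4.0, 5.0, 2.0, 2.5, 3.0, 4.0, 4.5, 5.0],
          "k6":         [6.0, 8.0, 9.0, 3.0, 3.5, 4.0, 10.0, 11.0, 13.0],
      })
      within = df.groupby("worker")[["job_demand", "k6"]].transform(lambda s: s - s.mean())
      beta = (within["job_demand"] * within["k6"]).sum() / (within["job_demand"] ** 2).sum()
      print(f"within-worker slope: {beta:.2f}")   # association net of fixed traits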

  13. Rapid identification and validation of novel targeted approaches for Glioblastoma: A combined ex vivo-in vivo pharmaco-omic model.

    PubMed

    Daher, Ahmad; de Groot, John

    2018-01-01

    Tumor heterogeneity is a major factor in glioblastoma's poor response to therapy and seemingly inevitable recurrence. Only two glioblastoma drugs have received Food and Drug Administration approval since 1998, highlighting the urgent need for new therapies. "Omics" profiling analyses have helped characterize glioblastoma molecularly and have thus identified multiple molecular targets for precision medicine. These molecular targets have influenced clinical trial design; many "actionable" mutation-focused trials are underway, but because they have not yet led to therapeutic breakthroughs, new strategies for treating glioblastoma, especially those with a pharmacological functional component, remain in high demand. In that regard, high-throughput screening that allows for expedited preclinical drug testing, together with the use of glioblastoma models that represent tumor heterogeneity more accurately than traditional cancer cell lines, is necessary to maximize the successful translation of agents into the clinic. High-throughput screening has been successfully used in the testing, discovery, and validation of potential therapeutics in various cancer models, but it has not been extensively utilized in glioblastoma models. In this report, we describe the basic aspects of high-throughput screening and propose a modified high-throughput screening model in which ex vivo and in vivo drug testing is complemented by post-screening pharmacological, pan-omic analysis to expedite the preclinical testing of anti-glioma drugs and to develop predictive biomarker datasets that can aid in personalizing glioblastoma therapy and inform clinical trial design. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Scaling Laws of Discrete-Fracture-Network Models

    NASA Astrophysics Data System (ADS)

    Philippe, D.; Olivier, B.; Caroline, D.; Jean-Raynald, D.

    2006-12-01

    The statistical description of fracture networks across scales remains a concern for geologists, given the complexity of these networks. A challenging task of the last 20 years of studies has been to find a solid and verifiable rationale for the trivial observation that fractures exist everywhere and at all sizes. The emergence of fractal models and power-law distributions quantifies this fact and postulates, in some sense, that small-scale fractures are genetically linked to their larger-scale relatives. But validating these scaling concepts remains an issue, considering the unreachable amount of information that would be required given the complexity of natural fracture networks. Beyond the theoretical interest, a scaling law is a basic and necessary ingredient of Discrete-Fracture-Network (DFN) models used for many environmental and industrial applications (groundwater resources, the mining industry, safety assessment of deep waste disposal sites, ...). Indeed, such a function is necessary to assemble scattered data, taken at different scales, into a unified scaling model and to interpolate fracture densities between observations. In this study, we discuss several important issues related to the scaling laws of DFN models: - We first describe a complete theoretical and mathematical framework that accounts for both the fracture-size distribution and the fracture clustering through scales (fractal dimension). - We review the scaling laws that have been obtained and discuss the ability of fracture datasets to truly constrain the parameters of the DFN model. - Finally, we discuss the limits of scaling models.
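
    The basic scaling ingredient of a DFN model is a (truncated) power-law density of fracture lengths, n(l) proportional to l^(-a). A minimal Python sketch that samples lengths from such a law by inverse-transform sampling; the exponent and cutoffs are illustrative, not values from the study:

      # Inverse-transform sampling from a truncated power law n(l) ~ l**(-a).
      import numpy as np

      def sample_lengths(n, a=2.7, l_min=1.0, l_max=1000.0, rng=None):
          rng = rng or np.random.default_rng()
          u = rng.random(n)
          e = 1.0 - a                      # exponent after integrating the pdf
          return (l_min**e + u * (l_max**e - l_min**e)) ** (1.0 / e)

      lengths = sample_lengths(10000)
      print(f"min {lengths.min():.2f}, median {np.median(lengths):.2f}, "
            f"max {lengths.max():.2f}")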

  15. A Validation Study of the "School Leader Dispositions Inventory"[C

    ERIC Educational Resources Information Center

    Melton, Teri Denlea; Tysinger, Dawn; Mallory, Barbara; Green, James

    2011-01-01

    Although university-based school administrator preparation programs are required by accreditation agencies to assess the dispositions of candidates, valid and reliable methods for doing so remain scarce. "The School Leaders Disposition Inventory"[C] (SDLI) is proposed as an instrument that has promise for identifying leadership…

  16. Anxiety measures validated in perinatal populations: a systematic review.

    PubMed

    Meades, Rose; Ayers, Susan

    2011-09-01

    Research and screening of anxiety in the perinatal period is hampered by a lack of psychometric data on self-report anxiety measures used in perinatal populations. This paper aimed to review self-report measures that have been validated with perinatal women. A systematic search was carried out of four electronic databases, and additional papers were obtained through searching identified articles. Thirty studies were identified that reported validation of an anxiety measure with perinatal women. The most commonly validated self-report measures were the General Health Questionnaire (GHQ), State-Trait Anxiety Inventory (STAI), and Hospital Anxiety and Depression Scale (HADS). Of the 30 studies included, 11 used a clinical interview to provide criterion validity; the remaining studies reported one or more other forms of validity (factorial, discriminant, concurrent, and predictive) or reliability. The STAI shows criterion, discriminant, and predictive validity and may be most useful for research purposes as a specific measure of anxiety. The Kessler 10 (K-10) may be the best short screening measure due to its ability to differentiate anxiety disorders. The Depression Anxiety Stress Scales 21 (DASS-21) measures multiple types of distress and shows appropriate content, but remains to be validated against clinical interview in perinatal populations. Nineteen studies did not report sensitivity or specificity data. The early stage of research into perinatal anxiety, the multitude of measures in use, and methodological differences restrict comparison of measures across studies. There is a need for further validation of self-report measures of anxiety in the perinatal period to enable accurate screening and detection of anxiety symptoms and disorders. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and are widely used for flood hazard simulation and flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river-reach scale: we validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit computed against insurance claims is slightly lower than the fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics based on insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
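
    Conventional validation metrics for binary inundation maps include the hit rate, the false alarm ratio, and the critical success index, computed over grid cells (or, analogously, over buildings with and without claims). A minimal Python sketch with invented arrays:

      # Contingency-table metrics for predicted vs. observed flooding.
      import numpy as np

      predicted = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0], dtype=bool)
      observed  = np.array([1, 1, 0, 0, 1, 1, 0, 0, 1, 0], dtype=bool)

      hits = np.sum(predicted & observed)
      misses = np.sum(~predicted & observed)
      false_alarms = np.sum(predicted & ~observed)

      print("hit rate          :", hits / (hits + misses))
      print("false alarm ratio :", false_alarms / (hits + false_alarms))
      print("critical success  :", hits / (hits + misses + false_alarms))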

  18. Development and validation of an environmental fragility index (EFI) for the neotropical savannah biome.

    PubMed

    Macedo, Diego R; Hughes, Robert M; Kaufmann, Philip R; Callisto, Marcos

    2018-04-23

    Augmented production and transport of fine sediments resulting from increased human activities are major threats to freshwater ecosystems, including reservoirs and their ecosystem services. To support large-scale assessment of the likelihood of soil erosion and reservoir sedimentation, we developed and validated an environmental fragility index (EFI) for the Brazilian neotropical savannah. The EFI was derived from measured geoclimatic controls on sediment production (rainfall, variation of elevation and slope, geology) and anthropogenic pressures (natural cover, road density, distance from roads and urban centers) in 111 catchments upstream of four large hydroelectric reservoirs. We evaluated the effectiveness of the EFI by regressing it against a relative bed stability index (LRBS) that assesses the degree to which stream sites draining into the reservoirs are affected by excess fine sediments. We developed the EFI on 111 of these sites and validated our model on the remaining 37 independent sites. We also compared the effectiveness of the EFI in predicting LRBS with that of a multiple linear regression model (via a best-subset procedure) using 7 independent variables. The EFI was significantly correlated with the LRBS, with regression R² values of 0.32 and 0.40 at the development and validation sites, respectively. Although the EFI and multiple regression explained similar amounts of variability (R² = 0.32 vs. 0.36), the EFI had a higher F-ratio (51.6 vs. 8.5) and a better AICc value (333 vs. 338). Because the sites were randomly selected and well-distributed across geoclimatic controlling factors, we were able to calculate spatially explicit EFI values for all hydrologic units within the study area (~38,500 km²). This model-based inference showed that over 65% of those units had high or extreme fragility. This methodology has great potential for application in the management, recovery, and preservation of hydroelectric reservoirs and streams in tropical river basins. Copyright © 2018 Elsevier B.V. All rights reserved.
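
    The AICc comparison reported above penalizes the extra parameters of the multiple regression. A minimal Python sketch of the small-sample correction for least-squares models; the residual sums of squares below are placeholders, not the study's values:

      # AICc for Gaussian least-squares models: AIC plus the small-sample
      # correction 2k(k+1)/(n-k-1). RSS values and n are invented.
      import numpy as np

      def aicc(rss, n, k):
          """k = number of fitted parameters, including intercept and sigma."""
          aic = n * np.log(rss / n) + 2 * k
          return aic + 2 * k * (k + 1) / (n - k - 1)

      n = 111
      print("single-index model   :", round(aicc(rss=40.0, n=n, k=3), 1))
      print("multiple regression  :", round(aicc(rss=37.0, n=n, k=9), 1))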

  19. Improving accuracy of genomic prediction in Brangus cattle by adding animals with imputed low-density SNP genotypes.

    PubMed

    Lopes, F B; Wu, X-L; Li, H; Xu, J; Perkins, T; Genho, J; Ferretti, R; Tait, R G; Bauck, S; Rosa, G J M

    2018-02-01

    Reliable genomic prediction of breeding values for quantitative traits requires a sufficient number of animals with genotypes and phenotypes in the training set. As of 31 October 2016, there were 3,797 Brangus animals with genotypes and phenotypes. These Brangus animals were genotyped using different commercial SNP chips; the largest group consisted of 1,535 animals genotyped by the GGP-LDV4 SNP chip. The remaining 2,262 genotypes were imputed to the SNP content of the GGP-LDV4 chip, so that the number of animals available for training the genomic prediction models was more than doubled. The present study showed that pooling animals with either original or imputed 40K SNP genotypes substantially increased genomic prediction accuracies for the ten traits. By supplementing imputed genotypes, the relative gains in genomic prediction accuracy on estimated breeding values (EBV) were from 12.60% to 31.27%, and the relative gains on de-regressed EBV were slightly smaller (0.87%-18.75%). The present study also compared the performance of five genomic prediction models and two cross-validation methods. The five genomic models predicted EBV and de-regressed EBV of the ten traits similarly well. Of the two cross-validation methods, leave-one-out cross-validation maximized the number of animals available at the training stage of genomic prediction. Genomic prediction accuracy (GPA) on the ten quantitative traits was validated in 1,106 newly genotyped Brangus animals based on the SNP effects estimated in the previous set of 3,797 Brangus animals, and the accuracies were slightly lower than the GPA in the original data. The present study was the first to leverage currently available genotype and phenotype resources in order to harness genomic prediction in Brangus beef cattle. © 2018 Blackwell Verlag GmbH.
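
    Leave-one-out cross-validation trains on all animals but one at each fold, which is why it maximizes the training set. A minimal Python sketch with simulated genotypes, using ridge regression as a stand-in for the genomic prediction model:

      # LOOCV over simulated animals-by-SNPs data; ridge regression is our
      # stand-in, not the models compared in the study.
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import LeaveOneOut

      rng = np.random.default_rng(42)
      X = rng.standard_normal((60, 200))           # animals x SNP genotypes
      y = X[:, :10] @ rng.standard_normal(10) + rng.standard_normal(60)

      preds = np.empty_like(y)
      for train, test in LeaveOneOut().split(X):
          preds[test] = Ridge(alpha=10.0).fit(X[train], y[train]).predict(X[test])

      print("LOOCV accuracy (r):", np.corrcoef(preds, y)[0, 1].round(2))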

  20. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events and so cannot be described by detailed and accurate in situ measurements; as a result, volcanic explosive phenomenology is inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate the computer codes developed to model these geophysical events as a whole. Code verification and validation nevertheless remains a necessary step, particularly as volcanologists increasingly use numerical data for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, Verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, Validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The Verification step is rather simple to achieve formally, whereas in the context of 'real world' explosive volcanism the Validation step is nearly impossible. Hence, instead of validating computer codes against the whole, largely unconstrained volcanic phenomenology, we suggest focusing on the key physics that control volcanic clouds: momentum-driven supersonic jets and multiphase turbulence. We propose to compare numerical results against a set of simple but well-constrained analog experiments that uniquely and unambiguously represent these two key phenomena separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase-CFD FORTRAN codes recently redeveloped to meet the strict quality assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves the Navier-Stokes and energy partial differential equations for each phase, with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the Mach disk; the higher Phi, the further upstream the disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in such jets, including expansion fans, incident and reflected shocks, and subsequent downstream Mach disks, which makes the code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments; the velocity profiles agree with the analog ones, as do those of the production of turbulent quantities. Overall, the verification and validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase supersonic flows and jets.
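
    For reference, a widely used empirical correlation (attributed to Crist and co-workers; an assumption here, not something taken from this abstract) places the first Mach disk of an underexpanded jet at x/D of roughly 0.67 times the square root of K. The particle-loading correction in the Python sketch below is a made-up placeholder that merely illustrates the upstream shift with Phi:

      # Empirical Mach-disk trend: downstream with pressure ratio K, upstream
      # with particle loading phi. The 0.67*sqrt(K) form follows the Crist
      # correlation; the (1 - c_phi*phi) factor is a hypothetical illustration.
      import math

      def mach_disk_location(K, phi=0.0, c_phi=0.3):
          """Normalized distance x/D of the first Mach disk from the nozzle."""
          return 0.67 * math.sqrt(K) * (1.0 - c_phi * phi)

      for K in (5, 20, 80):
          print(f"K = {K:3d}: x/D = {mach_disk_location(K):.2f}"
                f" (dusty, phi=0.5: {mach_disk_location(K, 0.5):.2f})")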
