Sample records for effective model based

  1. Control Theory Perspective of Effects-Based Thinking and Operations: Modelling Operations as a Feedback Control System

    DTIC Science & Technology

    2007-11-01

    Control Theory Perspective of Effects-Based Thinking and Operations: Modelling “Operations” as a Feedback Control System. Philip S. E. Farrell. Abstract: This paper explores operations that involve effects-based thinking (EBT) using Control Theory techniques in order to highlight the concept’s…
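The feedback view described in the abstract can be illustrated with a minimal discrete-time control loop (a hedged sketch; the gain, step count, and scalar "effect" state are hypothetical, not from the paper): the desired effect is the setpoint, assessment measures the gap, and the next action is sized in proportion to it.

```python
def run_operation(desired_effect, gain=0.5, steps=30):
    """One effects-based operation as a proportional feedback loop:
    assess the gap between desired and achieved effect, act in
    proportion, observe the new state, repeat."""
    achieved = 0.0
    history = []
    for _ in range(steps):
        error = desired_effect - achieved   # assessment: intent vs. outcome
        action = gain * error               # planning: size the next action
        achieved += action                  # execution: the "plant" responds
        history.append(achieved)
    return history

h = run_operation(desired_effect=1.0)
```

With any gain between 0 and 2 the loop converges on the desired effect, which is the control-theoretic point of modelling operations this way.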

  2. Propagation Effects in Space-Based Surveillance Systems

    DTIC Science & Technology

    1982-02-01

    This report describes the first year’s effort to investigate propagation effects in space-based radars. A model was developed for analyzing the…deleterious systems effects by first developing a generalized aperture distribution that ultimately can be applied to any space-based radar configuration…The propagation effects are characterized in terms of the SATCOM model striation parameters. The form of a generalized channel model for space-based radars…

  3. Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers

    PubMed Central

    Jiang, Yong; Schmidt, Renate H.; Reif, Jochen C.

    2018-01-01

    Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, due to the tight linkage a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles the one of additive effects. PMID:29549092
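As a rough sketch of the haplotype-based idea (block size, shrinkage parameter, and data below are illustrative assumptions, not the LEGBLUP/HGBLUP implementation), markers of fully homozygous lines can be recoded block-by-block into haplotype dummy variables before GBLUP-like ridge shrinkage:

```python
import numpy as np

def haplotype_design(M, block_size):
    """Recode a (lines x SNPs) 0/1 marker matrix of fully homozygous lines:
    within each block of adjacent markers, every distinct marker combination
    becomes one haplotype dummy column."""
    cols = []
    for start in range(0, M.shape[1], block_size):
        block = [tuple(row) for row in M[:, start:start + block_size]]
        for hap in sorted(set(block)):
            cols.append([1.0 if b == hap else 0.0 for b in block])
    return np.array(cols).T

def ridge_predict(X, y, lam=1.0):
    """GBLUP-like ridge solution b = (X'X + lam*I)^(-1) X'y."""
    b = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return X @ b

rng = np.random.default_rng(0)
M = rng.integers(0, 2, size=(40, 12)).astype(float)  # 40 lines, 12 SNPs
y = M @ rng.normal(size=12)          # purely additive toy phenotype
H = haplotype_design(M, block_size=3)
pred = ridge_predict(H, y)
```

Because each within-block marker indicator is a sum of haplotype dummies, the haplotype design spans the additive marker effects and can additionally carry local (within-block) epistatic deviations, which is the equivalence the paper derives formally.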

  4. Haplotype-Based Genome-Wide Prediction Models Exploit Local Epistatic Interactions Among Markers.

    PubMed

    Jiang, Yong; Schmidt, Renate H; Reif, Jochen C

    2018-05-04

    Genome-wide prediction approaches represent versatile tools for the analysis and prediction of complex traits. Mostly they rely on marker-based information, but scenarios have been reported in which models capitalizing on closely-linked markers that were combined into haplotypes outperformed marker-based models. Detailed comparisons were undertaken to reveal under which circumstances haplotype-based genome-wide prediction models are superior to marker-based models. Specifically, it was of interest to analyze whether and how haplotype-based models may take local epistatic effects between markers into account. Assuming that populations consisted of fully homozygous individuals, a marker-based model in which local epistatic effects inside haplotype blocks were exploited (LEGBLUP) was linearly transformable into a haplotype-based model (HGBLUP). This theoretical derivation formally revealed that haplotype-based genome-wide prediction models capitalize on local epistatic effects among markers. Simulation studies corroborated this finding. Due to its computational efficiency the HGBLUP model promises to be an interesting tool for studies in which ultra-high-density SNP data sets are studied. Applying the HGBLUP model to empirical data sets revealed higher prediction accuracies than for marker-based models for both traits studied using a mouse panel. In contrast, only a small subset of the traits analyzed in crop populations showed such a benefit. Cases in which higher prediction accuracies are observed for HGBLUP than for marker-based models are expected to be of immediate relevance for breeders, due to the tight linkage a beneficial haplotype will be preserved for many generations. In this respect the inheritance of local epistatic effects very much resembles the one of additive effects. Copyright © 2018 Jiang et al.

  5. DAMS: A Model to Assess Domino Effects by Using Agent-Based Modeling and Simulation.

    PubMed

    Zhang, Laobing; Landucci, Gabriele; Reniers, Genserik; Khakzad, Nima; Zhou, Jianfeng

    2017-12-19

    Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases. © 2017 Society for Risk Analysis.
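A minimal agent-based sketch of the bottom-up propagation idea (illustrative thresholds and heat loads, not the DAMS model itself): each installation is an agent that fails once the heat radiation received from already-failed neighbours exceeds its damage threshold, and failures propagate in discrete steps.

```python
def simulate_domino(heat, threshold, initial_failures, max_steps=10):
    """Each installation is an agent; it fails once the heat radiation
    received from already-failed units exceeds its damage threshold."""
    failed = set(initial_failures)
    for _ in range(max_steps):
        new = set()
        for agent in range(len(threshold)):
            if agent in failed:
                continue
            received = sum(heat[src][agent] for src in failed)
            if received >= threshold[agent]:
                new.add(agent)
        if not new:          # steady state reached
            break
        failed |= new
    return failed

# Three tanks in a row: tank 0 ignites; its radiation fails tank 1, whose
# added contribution then fails the more distant tank 2 (a second-level,
# synergistic domino effect).
heat = [[0, 30, 10],
        [30, 0, 25],
        [10, 25, 0]]         # heat[i][j]: load a failed unit i imposes on j
result = simulate_domino(heat, threshold=[50, 25, 30], initial_failures=[0])
```

Tank 2 fails only through the synergy of tanks 0 and 1 together, which is the kind of higher-level effect the agent rules capture without enumerating escalation paths up front.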

  6. Provision of hearing aids to children in Bangladesh: costs and cost-effectiveness of a community-based and a centre-based approach.

    PubMed

    Ekman, Björn; Borg, Johan

    2017-08-01

    The aim of this study is to provide evidence on the costs and health effects of two alternative hearing aid delivery models, a community-based and a centre-based approach. The study is set in Bangladesh and the study population is children between 12 and 18 years old. Data on resource use by participants and their caregivers were collected by a household survey. Follow-up data were collected after two months. Data on the costs to providers of the two approaches were collected by means of key informant interviews. The total cost per participant in the community-based model was BDT 6,333 (USD 79) compared with BDT 13,718 (USD 172) for the centre-based model. Both delivery models are found to be cost-effective with an estimated cost per DALY averted of BDT 17,611 (USD 220) for the community-based model and BDT 36,775 (USD 460) for the centre-based model. Using a community-based approach to deliver hearing aids to children in a resource constrained environment is a cost-effective alternative to the traditional centre-based approach. Further evidence is needed to draw conclusions for scale-up of approaches; rigorous analysis is possible using well-prepared data collection tools and working closely with sector professionals. Implications for Rehabilitation: Delivery models vary by resources needed for their implementation. Community-based delivery models of hearing aids to children in low-income countries are a cost-effective alternative. The assessment of costs and effects of hearing aids delivery models in low-income countries is possible through planned collaboration between researchers and sector professionals.
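The headline ratios reduce to simple arithmetic, sketched below. The per-participant costs come from the abstract; the DALYs averted per participant is a hypothetical stand-in (the study's own effect estimates are not reproduced here).

```python
def cost_per_daly_averted(total_cost, dalys_averted):
    """Cost-effectiveness ratio: cost divided by health effect."""
    return total_cost / dalys_averted

community_cost, centre_cost = 6333.0, 13718.0   # BDT per participant (from the study)
dalys_averted = 0.36                            # assumed effect per participant
community_cer = cost_per_daly_averted(community_cost, dalys_averted)
centre_cer = cost_per_daly_averted(centre_cost, dalys_averted)
```

With equal health effects per participant, the model with the lower delivery cost necessarily has the lower cost per DALY averted, which is the study's core comparison.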

  7. A web-based portfolio model as the students' final assignment: Dealing with the development of higher education trend

    NASA Astrophysics Data System (ADS)

    Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi

    2017-03-01

    This study aims to develop a web-based portfolio model and to examine its effectiveness in experiments conducted with respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the research objectives are: (1) to describe the process of implementing a portfolio in a web-based model; (2) to assess the effectiveness of the web-based portfolio model for the final assignment, especially in Web-Based Learning courses. This is development research; Borg and Gall (2008: 589) state that "educational research and development (R & D) is a process used to develop and validate educational production". The research and development series began with exploration and conceptual studies, followed by testing and evaluation, and finally implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before a t-test. Based on the data analysis, it was concluded that: (1) the web-based portfolio model can be applied to the learning process in higher education; (2) the model is effective, since in the large-group (field) trial 24 respondents (92.3%) reached mastery learning (a score of 60 and above). The implication of this development research is that future researchers can use the resulting development guidelines, grounded in the research already conducted, to develop the model for other subjects.

  8. Serotonergic hallucinogens as translational models relevant to schizophrenia.

    PubMed

    Halberstadt, Adam L; Geyer, Mark A

    2013-11-01

    One of the oldest models of schizophrenia is based on the effects of serotonergic hallucinogens such as mescaline, psilocybin, and (+)-lysergic acid diethylamide (LSD), which act through the serotonin 5-HT(2A) receptor. These compounds produce a 'model psychosis' in normal individuals that resembles at least some of the positive symptoms of schizophrenia. Based on these similarities, and because evidence has emerged that the serotonergic system plays a role in the pathogenesis of schizophrenia in some patients, animal models relevant to schizophrenia have been developed based on hallucinogen effects. Here we review the behavioural effects of hallucinogens in four of those models, the receptor and neurochemical mechanisms for the effects and their translational relevance. Despite the difficulty of modelling hallucinogen effects in nonverbal species, animal models of schizophrenia based on hallucinogens have yielded important insights into the linkage between 5-HT and schizophrenia and have helped to identify receptor targets and interactions that could be exploited in the development of new therapeutic agents.

  9. Serotonergic Hallucinogens as Translational Models Relevant to Schizophrenia

    PubMed Central

    Halberstadt, Adam L.; Geyer, Mark A.

    2014-01-01

    One of the oldest models of schizophrenia is based on the effects of serotonergic hallucinogens such as mescaline, psilocybin, and (+)-lysergic acid diethylamide (LSD), which act through the serotonin 5-HT2A receptor. These compounds produce a “model psychosis” in normal individuals that resembles at least some of the positive symptoms of schizophrenia. Based on these similarities, and because evidence has emerged that the serotonergic system plays a role in the pathogenesis of schizophrenia in some patients, animal models relevant to schizophrenia have been developed based on hallucinogen effects. Here we review the behavioral effects of hallucinogens in four of those models, the receptor and neurochemical mechanisms for the effects, and their translational relevance. Despite the difficulty of modeling hallucinogen effects in nonverbal species, animal models of schizophrenia based on hallucinogens have yielded important insights into the linkage between 5-HT and schizophrenia and have helped to identify receptor targets and interactions that could be exploited in the development of new therapeutic agents. PMID:23942028

  10. A new surface-potential-based compact model for the MoS2 field effect transistors in active matrix display applications

    NASA Astrophysics Data System (ADS)

    Cao, Jingchen; Peng, Songang; Liu, Wei; Wu, Quantan; Li, Ling; Geng, Di; Yang, Guanhua; Ji, Zhouyu; Lu, Nianduan; Liu, Ming

    2018-02-01

    We present a continuous surface-potential-based compact model for molybdenum disulfide (MoS2) field effect transistors based on the multiple trapping release theory and the variable-range hopping theory. We also built contact resistance and velocity saturation models based on the analytical surface potential. This model is verified with experimental data and is able to accurately predict the temperature dependent behavior of the MoS2 field effect transistor. Our compact model is coded in Verilog-A, which can be implemented in a computer-aided design environment. Finally, we carried out an active matrix display simulation, which suggested that the proposed model can be successfully applied to circuit design.
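For orientation, a generic drain-current sketch with a crude velocity-saturation factor is shown below. This is a toy square-law illustration, not the paper's surface-potential formulation or its Verilog-A code; all parameter names and values are hypothetical.

```python
def drain_current(vgs, vds, vt=1.0, k=1e-4, esat_l=2.0):
    """Toy square-law drain current with a crude velocity-saturation factor
    1/(1 + Vds/(Esat*L)); all parameter values are illustrative."""
    vov = vgs - vt
    if vov <= 0:
        return 0.0               # sub-threshold: treated as off in this sketch
    v = min(vds, vov)            # clamp at a crude saturation voltage
    return k * (vov * v - v * v / 2.0) / (1.0 + v / esat_l)

id_low = drain_current(2.0, 2.0)
id_high = drain_current(3.0, 2.0)
```

A surface-potential-based compact model replaces the piecewise clamp above with a single continuous expression valid across all operating regions, which is what makes it well suited to circuit simulators.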

  11. Likelihood-Based Random-Effect Meta-Analysis of Binary Events.

    PubMed

    Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D

    2015-01-01

    Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.
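The contrast between the two estimator families can be sketched on toy data (the studies below are invented for illustration): (a) the moment-based inverse-variance pooled log odds ratio versus (b) a likelihood-based estimate that maximises the exact binomial likelihood of a common effect, here by grid search with each study's observed control log-odds treated as a fixed nuisance (a simplification of profile likelihood).

```python
import math

studies = [  # (events_treat, n_treat, events_ctrl, n_ctrl) -- illustrative
    (2, 50, 6, 50),
    (1, 40, 4, 40),
    (3, 60, 7, 60),
]

def inverse_variance_pooled(studies):
    """Moment-based fixed-effect pooling of per-study log odds ratios."""
    num = den = 0.0
    for a, n1, c, n0 in studies:
        b, d = n1 - a, n0 - c
        lor = math.log((a * d) / (b * c))
        w = 1.0 / (1 / a + 1 / b + 1 / c + 1 / d)   # inverse variance weight
        num += w * lor
        den += w
    return num / den

def likelihood_pooled(studies, grid=None):
    """Maximise the exact binomial likelihood of a common log-OR theta."""
    grid = grid or [i / 100.0 for i in range(-300, 1)]
    def loglik(theta):
        total = 0.0
        for a, n1, c, n0 in studies:
            base = math.log(c / (n0 - c))            # observed control log-odds
            p1 = 1 / (1 + math.exp(-(base + theta)))
            total += a * math.log(p1) + (n1 - a) * math.log(1 - p1)
        return total
    return max(grid, key=loglik)

moment_est = inverse_variance_pooled(studies)
lik_est = likelihood_pooled(studies)
```

With moderate event counts the two agree closely; the article's point is that with rare events and heterogeneity the moment-based weights become unreliable while likelihood-based mixed-effects models remain well behaved.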

  12. The effects of geometric uncertainties on computational modelling of knee biomechanics

    NASA Astrophysics Data System (ADS)

    Meng, Qingen; Fisher, John; Wilcox, Ruth

    2017-08-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation caused considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be very close to the imaged-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the imaged-based models.

  13. Strained layer relaxation effect on current crowding and efficiency improvement of GaN based LED

    NASA Astrophysics Data System (ADS)

    Aurongzeb, Deeder

    2012-02-01

    The efficiency droop effect of GaN-based LEDs at high power and high temperature has been addressed by several groups based on carrier delocalization and the photon recycling effect (radiative recombination). We extend the previous droop models to optical loss parameters. We correlate strained layer relaxation at high temperature and high current density to carrier delocalization. We propose a third-order model and show that Shockley-Read-Hall and Auger recombination effects are not enough to account for the efficiency loss. Several strained layer modification schemes are proposed based on the model.

  14. Instantiating the art of war for effects-based operations

    NASA Astrophysics Data System (ADS)

    Burns, Carla L.

    2002-07-01

    Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.

  15. Cost-effectiveness of a National Telemedicine Diabetic Retinopathy Screening Program in Singapore.

    PubMed

    Nguyen, Hai V; Tan, Gavin Siew Wei; Tapp, Robyn Jennifer; Mital, Shweta; Ting, Daniel Shu Wei; Wong, Hon Tym; Tan, Colin S; Laude, Augustinus; Tai, E Shyong; Tan, Ngiap Chuan; Finkelstein, Eric A; Wong, Tien Yin; Lamoureux, Ecosse L

    2016-12-01

    To determine the incremental cost-effectiveness of a new telemedicine technician-based assessment relative to an existing model of family physician (FP)-based assessment of diabetic retinopathy (DR) in Singapore from the health system and societal perspectives. Model-based, cost-effectiveness analysis of the Singapore Integrated Diabetic Retinopathy Program (SiDRP). A hypothetical cohort of patients aged 55 years with type 2 diabetes previously not screened for DR. The SiDRP is a new telemedicine-based DR screening program using trained technicians to assess retinal photographs. We compared the cost-effectiveness of SiDRP with the existing model in which FPs assess photographs. We developed a hybrid decision tree/Markov model to simulate the costs, effectiveness, and incremental cost-effectiveness ratio (ICER) of SiDRP relative to FP-based DR screening over a lifetime horizon. We estimated the costs from the health system and societal perspectives. Effectiveness was measured in terms of quality-adjusted life-years (QALYs). Result robustness was calculated using deterministic and probabilistic sensitivity analyses. The ICER. From the societal perspective that takes into account all costs and effects, the telemedicine-based DR screening model had significantly lower costs (total cost savings of S$173 per person) while generating similar QALYs compared with the physician-based model (i.e., 13.1 QALYs). From the health system perspective that includes only direct medical costs, the cost savings are S$144 per person. By extrapolating these data to approximately 170 000 patients with diabetes currently being screened yearly for DR in Singapore's primary care polyclinics, the present value of future cost savings associated with the telemedicine-based model is estimated to be S$29.4 million over a lifetime horizon. While generating similar health outcomes, the telemedicine-based DR screening using technicians in the primary care setting saves costs for Singapore compared with the FP model. Our data provide a strong economic rationale to expand the telemedicine-based DR screening program in Singapore and elsewhere. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  16. Final Report for The Creation of a Physics-based Ground-effect Model, Phase 2 - Inclusion of the Effects of Wind, Stratification, and Shear into the New Ground Effect Model

    NASA Technical Reports Server (NTRS)

    Sarpkaya, Turgut

    2006-01-01

    Reducing the separation between leading and following aircraft is desirable to enhance airport capacity, provided that there is a physics-based operational model applicable to all regions of the flight domain (out of ground effect, OGE; near ground effect, NGE; and in ground effect, IGE), and that the quality of the quantitative input from measurements of the prevailing atmospheric conditions, together with the quality of the total airport operations regarding safety and the sound interpretation of those conditions, matches the quality of the analysis and numerical simulations. In the absence of an analytical solution, the physics of the flow is best expressed by a mathematical model based on numerical simulations, field and laboratory experiments, and heuristic reasoning. This report deals with the creation of a sound physics-based real-time IGE model of the aircraft wake vortices subjected to crosswind, stratification, and shear.

  17. Consequences of Base Time for Redundant Signals Experiments

    PubMed Central

    Townsend, James T.; Honey, Christopher

    2007-01-01

    We report analytical and computational investigations into the effects of base time on the diagnosticity of two popular theoretical tools in the redundant signals literature: (1) the race model inequality and (2) the capacity coefficient. We show analytically and without distributional assumptions that the presence of base time decreases the sensitivity of both of these measures to model violations. We further use simulations to investigate the statistical power of model-selection tools based on the race model inequality, both with and without base time. Base time decreases statistical power, and biases the race model test toward conservatism. The magnitude of this biasing effect increases as we increase the proportion of total reaction time variance contributed by base time. We marshal empirical evidence to suggest that the proportion of reaction time variance contributed by base time is relatively small, and that the effects of base time on the diagnosticity of our model-selection tools are therefore likely to be minor. However, uncertainty remains concerning the magnitude and even the definition of base time. Experimentalists should continue to be alert to situations in which base time may contribute a large proportion of the total reaction time variance. PMID:18670591
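The race model inequality test can be sketched on empirical CDFs (the reaction times below are invented for illustration): Miller's bound says the redundant-condition CDF may never exceed the sum of the two single-signal CDFs under any race model.

```python
def ecdf(sample, t):
    """Empirical cumulative distribution function at time t."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violated(rt_a, rt_b, rt_redundant, times):
    """True if F_AB(t) > F_A(t) + F_B(t) at any probed t: redundant-signal
    responses too fast for any race (probability-summation) model."""
    return any(
        ecdf(rt_redundant, t) > ecdf(rt_a, t) + ecdf(rt_b, t)
        for t in times
    )

rt_a = [300, 310, 320, 330]       # single-signal A reaction times (ms)
rt_b = [320, 330, 340, 350]       # single-signal B
rt_ab = [250, 255, 260, 300]      # redundant-signal condition
probe = range(240, 360, 5)
```

A shared base-time stage added to every condition shifts and smears all three CDFs together, which, as the paper shows, makes this test more conservative without changing the underlying decision process.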

  18. Effectiveness of Training Model Capacity Building for Entrepreneurship Women Based Empowerment Community

    ERIC Educational Resources Information Center

    Idawati; Mahmud, Alimuddin; Dirawan, Gufran Darma

    2016-01-01

    The purpose of this research was to determine the effectiveness of a community-based training model for capacity building of women's entrepreneurship. The study used a Research and Development approach, referring to the development-research model of Romiszowki (1996) combined with the development model of Sugiono (2011). It was…

  19. Agent-Based Modeling of Chronic Diseases: A Narrative Review and Future Research Directions

    PubMed Central

    Li, Yan; Lawley, Mark A.; Siscovick, David S.; Zhang, Donglan; Pagán, José A.

    2016-01-01

    The United States is experiencing an epidemic of chronic disease. As the US population ages, health care providers and policy makers urgently need decision models that provide systematic, credible prediction regarding the prevention and treatment of chronic diseases to improve population health management and medical decision-making. Agent-based modeling is a promising systems science approach that can model complex interactions and processes related to chronic health conditions, such as adaptive behaviors, feedback loops, and contextual effects. This article introduces agent-based modeling by providing a narrative review of agent-based models of chronic disease and identifying the characteristics of various chronic health conditions that must be taken into account to build effective clinical- and policy-relevant models. We also identify barriers to adopting agent-based models to study chronic diseases. Finally, we discuss future research directions of agent-based modeling applied to problems related to specific chronic health conditions. PMID:27236380

  20. Agent-Based Modeling of Chronic Diseases: A Narrative Review and Future Research Directions.

    PubMed

    Li, Yan; Lawley, Mark A; Siscovick, David S; Zhang, Donglan; Pagán, José A

    2016-05-26

    The United States is experiencing an epidemic of chronic disease. As the US population ages, health care providers and policy makers urgently need decision models that provide systematic, credible prediction regarding the prevention and treatment of chronic diseases to improve population health management and medical decision-making. Agent-based modeling is a promising systems science approach that can model complex interactions and processes related to chronic health conditions, such as adaptive behaviors, feedback loops, and contextual effects. This article introduces agent-based modeling by providing a narrative review of agent-based models of chronic disease and identifying the characteristics of various chronic health conditions that must be taken into account to build effective clinical- and policy-relevant models. We also identify barriers to adopting agent-based models to study chronic diseases. Finally, we discuss future research directions of agent-based modeling applied to problems related to specific chronic health conditions.

  1. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    ERIC Educational Resources Information Center

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  2. Wayside Bearing Fault Diagnosis Based on a Data-Driven Doppler Effect Eliminator and Transient Model Analysis

    PubMed Central

    Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang

    2014-01-01

    A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is invented to eliminate the Doppler effect embedded in the acoustic signal of the recorded bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified based on the signal itself. Then, the signal resampler is applied to eliminate the Doppler effect using the identified parameters. With the ability to detect early bearing faults, the transient model analysis method is employed to detect localized bearing faults after the embedded Doppler effect is eliminated. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to diagnose locomotive roller bearing defects. PMID:24803197
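The core resampling step can be sketched with a toy tone (geometry and signal parameters below are illustrative; the paper identifies its five kinematic parameters from the signal itself via correlation filtering, which is not reproduced here). A wayside microphone samples uniformly in reception time, so the moving source's tone is frequency-warped; resampling the recording back onto the emission clock removes the Doppler effect.

```python
import numpy as np

c, v, d = 343.0, 30.0, 5.0        # sound speed, source speed (m/s), closest distance (m)
f0, fs, T = 100.0, 8000.0, 0.5    # source tone (Hz), sample rate (Hz), duration (s)

t_e = np.arange(-T / 2, T / 2, 1 / fs)     # emission-time grid
r = np.sqrt(d ** 2 + (v * t_e) ** 2)       # source-to-microphone range
t_r_of_e = t_e + r / c                     # reception time of each emission (monotonic)

# Simulate the wayside recording: samples uniform in reception time, so the
# tone appears Doppler-warped.
t_r = np.arange(t_r_of_e[0], t_r_of_e[-1], 1 / fs)
t_e_of_r = np.interp(t_r, t_r_of_e, t_e)   # invert the time warp
recorded = np.sin(2 * np.pi * f0 * t_e_of_r)

# Doppler elimination: resample the recording back onto the emission clock,
# restoring a constant-frequency tone ~ sin(2*pi*f0*t_e).
restored = np.interp(t_r_of_e, t_r, recorded)
```

Once the Doppler warp is removed, the bearing's periodic fault transients are restored to a fixed repetition rate, and transient model analysis can be applied as in the stationary case.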

  3. Effects of Learning Support in Simulation-Based Physics Learning

    ERIC Educational Resources Information Center

    Chang, Kuo-En; Chen, Yu-Lung; Lin, He-Yan; Sung, Yao-Ting

    2008-01-01

    This paper describes the effects of learning support on simulation-based learning in three learning models: experiment prompting, a hypothesis menu, and step guidance. A simulation learning system was implemented based on these three models, and the differences between simulation-based learning and traditional laboratory learning were explored in…

  4. Bayesian quantile regression-based partially linear mixed-effects joint models for longitudinal data with multiple features.

    PubMed

    Zhang, Hanze; Huang, Yangxin; Wang, Wei; Chen, Henian; Langland-Orban, Barbara

    2017-01-01

    In longitudinal AIDS studies, it is of interest to investigate the relationship between HIV viral load and CD4 cell counts, as well as the complicated time effect. Most common models used to analyze such complex longitudinal data are based on mean-regression, which fails to provide efficient estimates due to outliers and/or heavy tails. Quantile regression-based partially linear mixed-effects models, a special case of semiparametric models enjoying benefits of both parametric and nonparametric models, have the flexibility to monitor the viral dynamics nonparametrically and detect the varying CD4 effects parametrically at different quantiles of viral load. Meanwhile, it is critical to consider various data features of repeated measurements, including left-censoring due to a limit of detection, covariate measurement error, and asymmetric distribution. In this research, we first establish Bayesian joint models that account for all these data features simultaneously in the framework of quantile regression-based partially linear mixed-effects models. The proposed models are applied to analyze the Multicenter AIDS Cohort Study (MACS) data. Simulation studies are also conducted to assess the performance of the proposed methods under different scenarios.
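The quantile-regression building block can be sketched in isolation (toy linear model and data; the paper's Bayesian partially linear mixed-effects machinery with censoring and measurement error is far richer): the check (pinball) loss targets a chosen quantile rather than the mean, and is fitted here by plain subgradient descent.

```python
import numpy as np

def pinball_loss(residual, tau):
    """Check (pinball) loss: asymmetric absolute loss targeted at quantile tau."""
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

def fit_quantile(X, y, tau, lr=0.02, steps=20000):
    """Linear quantile regression by plain subgradient descent."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        resid = y - X @ beta
        grad = -X.T @ np.where(resid >= 0, tau, tau - 1) / len(y)
        beta -= lr * grad
    return beta

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 1, n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(0, 0.1, n)    # true median line: 1 + 2x
beta_med = fit_quantile(X, y, tau=0.5)
```

Fitting at several values of tau (e.g. 0.25, 0.5, 0.9) gives quantile-specific covariate effects, which is what lets the paper's models detect varying CD4 effects at different quantiles of viral load.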

  5. The Use of Theory in School Effectiveness Research Revisited

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    2013-01-01

    From an international review of 109 school effectiveness research studies, only 6 could be seen as theory driven. As the border between substantive conceptual models of educational effectiveness and theory-based models is not always very sharp, this number might be increased to 11 by including those studies that are based on models that make…

  6. Charge carrier transport in polycrystalline organic thin film based field effect transistors

    NASA Astrophysics Data System (ADS)

    Rani, Varsha; Sharma, Akanksha; Ghosh, Subhasis

    2016-05-01

    The charge carrier transport mechanism in polycrystalline thin film based organic field effect transistors (OFETs) has been explained using two competing models: the multiple trapping and release (MTR) model and the percolation model. It has been shown that the MTR model is the most suitable for explaining charge carrier transport in grainy polycrystalline organic thin films. The energetic distribution of traps determined independently using the Meyer-Neldel rule (MNR) is in excellent agreement with the values obtained from the MTR model for copper phthalocyanine and pentacene based OFETs.
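
    The Meyer-Neldel rule referenced above states that the Arrhenius prefactor of a thermally activated mobility itself grows exponentially with the activation energy. A toy sketch with assumed constants (not the paper's fitted values):

```python
import numpy as np

kB = 8.617e-5  # Boltzmann constant, eV/K

def mobility(T, Ea, mu00=1e-3, E_MN=0.040):
    """Thermally activated mobility obeying the Meyer-Neldel rule:
    the Arrhenius prefactor grows as exp(Ea / E_MN), so
    mu = mu00 * exp(Ea / E_MN) * exp(-Ea / (kB * T)).
    mu00 and the Meyer-Neldel energy E_MN are assumed toy values."""
    return mu00 * np.exp(Ea / E_MN) * np.exp(-Ea / (kB * T))

# At the isokinetic temperature T_MN = E_MN / kB the two exponentials
# cancel, so mobilities for all activation energies coincide.
T_MN = 0.040 / kB
m1 = mobility(T_MN, Ea=0.10)
m2 = mobility(T_MN, Ea=0.30)
```

    Extracting E_MN from a family of Arrhenius fits is one way the trap distribution can be probed independently of the MTR analysis.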

  7. The Dynamics of the Law of Effect: A Comparison of Models

    ERIC Educational Resources Information Center

    Navakatikyan, Michael A.; Davison, Michael

    2010-01-01

    Dynamical models based on three steady-state equations for the law of effect were constructed under the assumption that behavior changes in proportion to the difference between current behavior and the equilibrium implied by current reinforcer rates. A comparison of dynamical models showed that a model based on Navakatikyan's (2007) two-component…

  8. The effects of geometric uncertainties on computational modelling of knee biomechanics

    PubMed Central

    Fisher, John; Wilcox, Ruth

    2017-01-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models. PMID:28879008

  9. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer from high computation costs when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimates of building occupancy from sensor data.
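
    The Sequential Monte Carlo assimilation step can be caricatured with a bootstrap particle filter on a one-dimensional occupancy count (an illustrative toy with assumed motion and sensor models, not the authors' framework):

```python
import numpy as np

def particle_filter(observations, n_particles=1000, obs_noise=2.0, seed=0):
    """Bootstrap particle filter for a toy occupancy count: the count
    follows a random walk and a sensor observes it with Gaussian noise."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(50.0, 10.0, n_particles)  # initial belief
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the motion model.
        particles = particles + rng.normal(0.0, 1.0, n_particles)
        # Update: weight particles by the sensor likelihood.
        w = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)
        w /= w.sum()
        # Resample: redraw the particle set proportionally to the weights.
        particles = rng.choice(particles, size=n_particles, p=w)
        estimates.append(particles.mean())
    return np.array(estimates)

true_counts = np.array([50.0, 52.0, 55.0, 54.0, 57.0])
observations = true_counts + np.array([1.0, -1.5, 0.5, 2.0, -1.0])
estimates = particle_filter(observations)
```

    In the paper's setting the state would be the occupant distribution over a building graph rather than a scalar, but the predict-weight-resample loop is the same.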

  10. Examining the effect of down regulation under high [CO2] on the growth of soybean assimilating a semi process-based model and FACE data

    NASA Astrophysics Data System (ADS)

    Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2011-12-01

    The actual impact of elevated [CO2], in interaction with other climatic factors, on crop growth is still debated. In many process-based crop models, the response of single-leaf photosynthesis to environmental factors is described using the biochemical model of Farquhar et al. (1980). However, the decline in photosynthetic enhancement known as down regulation has not been taken into account, and because the mechanisms causing it are still unknown, it is difficult to include the effect of down regulation in process-based crop models. Free-air CO2 enrichment (FACE) experiments have reported the effect of down regulation under actual field environments. One effective approach to incorporating these results into future crop yield prediction is to develop a semi process-based crop growth model that includes the effect of photosynthetic down regulation as a statistical submodel, and to assimilate the data obtained in FACE experiments. In this study, we statistically estimated the parameters of a semi process-based model for soybean growth ('SPM-soybean') using a hierarchical Bayesian method with the FACE data on soybeans (Morgan et al. 2005). We also evaluated the effect of down regulation on soybean yield under future climatic conditions. The model selection analysis showed that the effective correction to the overestimation of Farquhar's biochemical C3 model was to reduce the maximum rate of carboxylation (Vcmax) under elevated [CO2]. Interestingly, however, the difference in the estimated final crop yields between the corrected and non-corrected models was very slight (Fig. 1a) for future projections under a climate change scenario (MIROC-ESM). This was because the reduction in Vcmax also brought about a reduction in the base dark respiration rate of leaves. Because the dark respiration rate increases exponentially with temperature, a slight difference in base respiration rate becomes a large difference at the high temperatures of the future climate scenarios. In other words, if the temperature rise under elevated [CO2] is very small or zero, the effect of down regulation appears clearly (Fig. 1b). This result suggests that further field experiments combining high-CO2 and high-temperature treatments are important for refining model projections of future crop yield through data assimilation.

  11. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background: Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods: We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results: The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion: The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
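
    The incremental cost-effectiveness ratios the model reports follow the standard definition: extra cost divided by extra effect. A minimal sketch with hypothetical numbers (values invented purely for illustration):

```python
def icer(cost_new, effect_new, cost_std, effect_std):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of effect (e.g. dollars per quality-adjusted
    life-year, QALY) of a new program versus standard care."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Hypothetical cohort means: the program costs $12,000 more per
# patient and adds 0.3 QALYs, i.e. roughly $40,000 per QALY.
ratio = icer(cost_new=52_000, effect_new=4.1,
             cost_std=40_000, effect_std=3.8)
```

    In the TEAM-HF tool this calculation is repeated over the 10,000 simulated cohort pairs, yielding a distribution of ICERs rather than a single point estimate.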

  12. Simulated Students and Classroom Use of Model-Based Intelligent Tutoring

    NASA Technical Reports Server (NTRS)

    Koedinger, Kenneth R.

    2008-01-01

    Two educational uses of models and simulations: 1) students create models and use simulations; and 2) researchers create models of learners to guide the development of reliably effective materials. Cognitive tutors simulate and support tutoring; data are crucial to creating an effective model. Pittsburgh Science of Learning Center: resources for modeling, authoring, and experimentation; a repository of data and theory. Examples of advanced modeling efforts: SimStudent learns a rule-based model; a help-seeking model tutors metacognition; Scooter uses machine-learning detectors of student engagement.

  13. A Nonlinear Model for Gene-Based Gene-Environment Interaction.

    PubMed

    Sa, Jian; Liu, Xu; He, Tao; Liu, Guifen; Cui, Yuehua

    2016-06-04

    A vast amount of literature has confirmed the role of gene-environment (G×E) interaction in the etiology of complex human diseases. Traditional methods are predominantly focused on the analysis of interaction between a single nucleotide polymorphism (SNP) and an environmental variable. Given that genes are the functional units, it is crucial to understand how gene effects (rather than single SNP effects) are influenced by an environmental variable to affect disease risk. Motivated by the increasing awareness of the power of gene-based association analysis over single-variant-based approaches, in this work we proposed a sparse principal component regression (sPCR) model to understand the gene-based G×E interaction effect on complex disease. We first extracted the sparse principal components for the SNPs in a gene; the effect of each principal component was then modeled by a varying-coefficient (VC) model. The model jointly models the variants in a gene whose effects are nonlinearly influenced by an environmental variable. In addition, the varying-coefficient sPCR (VC-sPCR) model has a nice interpretability property, since the sparsity of the principal component loadings indicates the relative importance of the corresponding SNPs in each component. We applied our method to a human birth weight dataset in a Thai population. We analyzed 12,005 genes across 22 chromosomes and found one significant interaction effect using the Bonferroni correction method and one suggestive interaction. The model performance was further evaluated through simulation studies. Our model provides a systematic approach to evaluating gene-based G×E interaction.
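
    The appeal of sparse loadings can be caricatured by thresholding ordinary PCA loadings (a crude stand-in on simulated genotype-like data, not the authors' sPCR estimator):

```python
import numpy as np

def thresholded_pc_loadings(X, n_components=1, threshold=0.2):
    """Crude stand-in for sparse PCA: compute ordinary PCA loadings
    via SVD, zero out small entries, and re-normalize, so that a
    component's nonzero loadings name its dominant variables."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_components].copy()
    loadings[np.abs(loadings) < threshold] = 0.0
    norms = np.linalg.norm(loadings, axis=1, keepdims=True)
    return loadings / np.where(norms == 0.0, 1.0, norms)

rng = np.random.default_rng(1)
# 500 individuals x 6 SNP-like columns; the first three columns
# share a common factor, the last three are independent noise.
factor = rng.normal(size=(500, 1))
X = np.hstack([factor + 0.1 * rng.normal(size=(500, 3)),
               rng.normal(size=(500, 3))])
L = thresholded_pc_loadings(X)
```

    The first component loads only on the three correlated columns, which is the interpretability property the abstract highlights: a sparse component directly names the SNPs that drive it.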

  14. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the three methods, they are applied to a set of mixed-effects models. As the development of structural identifiability techniques for mixed-effects models has received very little attention, despite mixed-effects models being widely used, the three methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Variable-intercept panel model for deformation zoning of a super-high arch dam.

    PubMed

    Shi, Zhongwen; Gu, Chongshi; Qin, Dong

    2016-01-01

    This study determines dam deformation similarity indexes based on an analysis of deformation zoning features and panel data clustering theory, with comprehensive consideration of the actual deformation behaviour of super-high arch dams and the spatial-temporal features of dam deformation. Methods for measuring these indexes are studied. Based on the established deformation similarity criteria, the principle used to determine the number of dam deformation zones is constructed through the entropy weight method. This study proposes a deformation zoning method for super-high arch dams and its implementation steps, analyzes the effect of special influencing factors in different dam zones on the deformation, introduces dummy variables that represent the special effects on dam deformation, and establishes a variable-intercept panel model for deformation zoning of super-high arch dams. Based on different patterns of the special effects in the variable-intercept panel model, two panel analysis models, with fixed and random effects respectively, were established for dam deformation. The Hausman test for model selection and a method for assessing model effectiveness are discussed. Finally, the effectiveness of the established models is verified through a case study.
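
    The entropy weight method used here can be sketched generically: indexes whose values vary more across zones receive more weight (illustrative data, not the paper's measurements):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are alternatives (e.g. zones),
    columns are indexes. Indexes whose values are more dispersed
    across rows carry more information and hence more weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)               # column-wise proportions
    n = X.shape[0]
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)  # entropy scaled to [0, 1]
    d = 1.0 - e                              # degree of diversification
    return d / d.sum()

# Three zones x two indexes: the first index is nearly constant
# across zones, the second varies strongly.
X = np.array([[10.0, 1.0],
              [10.1, 5.0],
              [9.9, 9.0]])
w = entropy_weights(X)
```

    The near-constant first index ends up with almost no weight, so the zone count is driven by the indexes that actually discriminate between zones.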

  17. SU-E-T-378: Evaluation of An Analytical Model for the Inter-Seed Attenuation Effect in 103-Pd Multi-Seed Implant Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safigholi, H; Soliman, A; Song, W

    Purpose: Brachytherapy treatment planning systems based on the TG-43 protocol calculate the dose in water and neglect the heterogeneity effect of seeds in multi-seed implant brachytherapy. In this research, the accuracy of a novel analytical model that we propose for the inter-seed attenuation (ISA) effect for the 103-Pd seed model is evaluated. Methods: In the analytical model, the dose perturbation due to the ISA effect for each seed in an LDR multi-seed 103-Pd implant is calculated by assuming that the seed of interest is active and the other surrounding seeds are inactive. The cumulative dosimetric effect of all seeds is then summed using the superposition principle. The model is based on Monte Carlo (MC) pre-simulated 3D kernels of the dose perturbations caused by the ISA effect. The cumulative ISA effect due to multiple surrounding seeds is obtained by a simple multiplication of the individual ISA effects of the surrounding seeds, each determined by the distance from the seed of interest. This novel algorithm is then compared with full MC water-based simulations (FMCW). Results: The results show that the dose perturbation model we propose is in excellent agreement with the FMCW values for a case with three seeds separated by 1 cm. The average difference between the model and the FMCW simulations was less than 8%±2%. Conclusion: Using the proposed novel analytical ISA effect model, one could expedite the corrections for ISA dose perturbation effects during permanent-seed 103-Pd brachytherapy planning with minimal increase in time, since the model is based on multiplications and superposition. This model can be applied, in principle, to any other brachytherapy seeds. Further work is necessary to validate this model on more complicated geometries.
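
    The multiplication-and-superposition step can be sketched generically; the distance-dependent perturbation kernel below is an invented toy, not the MC-derived kernels of the paper:

```python
import numpy as np

def cumulative_isa_factor(active_seed, other_seeds, perturbation):
    """Cumulative inter-seed attenuation around an active seed:
    multiply the individual perturbation factors contributed by each
    inactive seed, each looked up by its distance to the active seed."""
    factor = 1.0
    for s in other_seeds:
        d = np.linalg.norm(np.asarray(s) - np.asarray(active_seed))
        factor *= perturbation(d)
    return factor

# Toy perturbation kernel: attenuation fades with distance (assumed
# functional form and magnitude, for illustration only).
perturb = lambda d_cm: 1.0 - 0.05 * np.exp(-d_cm)

# Three collinear seeds at 1 cm spacing, as in the paper's test case.
seeds = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
f = cumulative_isa_factor(seeds[0], seeds[1:], perturb)
```

    Because the correction reduces to lookups and multiplications, it adds almost no planning time compared with a full MC recomputation.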

  18. A Resource-Based Modelling Framework to Assess Habitat Suitability for Steppe Birds in Semiarid Mediterranean Agricultural Systems

    PubMed Central

    Cardador, Laura; De Cáceres, Miquel; Bota, Gerard; Giralt, David; Casas, Fabián; Arroyo, Beatriz; Mougeot, François; Cantero-Martínez, Carlos; Moncunill, Judit; Butler, Simon J.; Brotons, Lluís

    2014-01-01

    European agriculture is undergoing widespread changes that are likely to have profound impacts on farmland biodiversity. The development of tools that allow an assessment of the potential biodiversity effects of different land-use alternatives before changes occur is fundamental to guiding management decisions. In this study, we develop a resource-based model framework to estimate habitat suitability for target species, according to simple information on species’ key resource requirements (diet, foraging habitat and nesting site), and examine whether it can be used to link land-use and local species’ distribution. We take as a study case four steppe bird species in a lowland area of the north-eastern Iberian Peninsula. We also compare the performance of our resource-based approach to that obtained through habitat-based models relating species’ occurrence and land-cover variables. Further, we use our resource-based approach to predict the effects that changes in farming systems can have on farmland bird habitat suitability and compare these predictions with those obtained using the habitat-based models. Habitat suitability estimates generated by our resource-based models performed similarly to (and, for one study species, better than) habitat-based models when predicting current species distribution. Moderate prediction success was achieved for three out of four species considered by resource-based models and for two of four by habitat-based models. Although there is potential for improving the performance of resource-based models, they provide a structure for using available knowledge of the functional links between agricultural practices, the provision of key resources and the response of organisms to predict the potential effects of changing land-uses in a variety of contexts, or the impacts of changes such as altered management practices that are not easily incorporated into habitat-based models. PMID:24667825

  19. Effects of field plot size on prediction accuracy of aboveground biomass in airborne laser scanning-assisted inventories in tropical rain forests of Tanzania.

    PubMed

    Mauya, Ernest William; Hansen, Endre Hofstad; Gobakken, Terje; Bollandsås, Ole Martin; Malimbwi, Rogers Ernest; Næsset, Erik

    2015-12-01

    Airborne laser scanning (ALS) has recently emerged as a promising tool to acquire auxiliary information for improving aboveground biomass (AGB) estimation in sample-based forest inventories. Under design-based and model-assisted inferential frameworks, the estimation relies on a model that relates the auxiliary ALS metrics to AGB estimated on ground plots. The size of the field plots has been identified as one source of model uncertainty because of the so-called boundary effects, which increase with decreasing plot size. Recent research in tropical forests has aimed to quantify the boundary effects on model prediction accuracy, but evidence of the consequences for the final AGB estimates is lacking. In this study we analyzed the effect of field plot size on model prediction accuracy and its implications when used in a model-assisted inferential framework. The results showed that the prediction accuracy of the model improved as the plot size increased. The adjusted R² increased from 0.35 to 0.74, while the relative root mean square error decreased from 63.6% to 29.2%. Indicators of boundary effects were identified and confirmed to have significant effects on the model residuals. Variance estimates of model-assisted mean AGB, relative to corresponding variance estimates of pure field-based AGB, decreased with increasing plot size in the range from 200 to 3000 m². The variance ratio of field-based estimates relative to model-assisted variance ranged from 1.7 to 7.7. This study showed that the relative improvement in precision of AGB estimation with increasing field-plot size was greater for an ALS-assisted inventory than for a pure field-based inventory.

  20. Examining the Impact of the Walking School Bus With an Agent-Based Model

    PubMed Central

    Diez-Roux, Ana; Evenson, Kelly R.; Colabianchi, Natalie

    2014-01-01

    We used an agent-based model to examine the impact of the walking school bus (WSB) on children’s active travel to school. We identified a synergistic effect of the WSB with other intervention components such as an educational campaign designed to improve attitudes toward active travel to school. Results suggest that to maximize active travel to school, children should arrive on time at “bus stops” to allow faster WSB walking speeds. We also illustrate how an agent-based model can be used to identify the location of routes maximizing the effects of the WSB on active travel. Agent-based models can be used to examine plausible effects of the WSB on active travel to school under various conditions and to identify ways of implementing the WSB that maximize its effectiveness. PMID:24832410

  1. Piezoresistivity, mechanisms and model of cement-based materials with CNT/NCB composite fillers

    NASA Astrophysics Data System (ADS)

    Zhang, Liqing; Ding, Siqi; Dong, Sufen; Li, Zhen; Ouyang, Jian; Yu, Xun; Han, Baoguo

    2017-12-01

    The use of conductive cement-based materials as sensors has attracted intense interest over the past decades. In this paper, carbon nanotube (CNT)/nano carbon black (NCB) composite fillers made by electrostatic self-assembly are used to fabricate conductive cement-based materials. The electrical and piezoresistive properties of the fabricated cement-based materials are investigated. The effects of filler content, load amplitude and load rate on the piezoresistive properties within the elastic regime, as well as the piezoresistive behavior during compressive loading to destruction, are explored. Finally, a model describing the piezoresistive properties of cement-based materials with CNT/NCB composite fillers is established based on effective conductive path and tunneling effect theory. The results demonstrate that filler content and load amplitude have a marked effect on the piezoresistive properties of the composite materials, while load rate has little influence. During compressive loading to destruction, the composites also show sensitive piezoresistive behavior. Therefore, the cement-based composites can be used to monitor the health state of structures throughout their service life. The model describes the piezoresistive behavior of the composites during compressive loading to destruction well. The good match between the model and the experimental data indicates that the tunneling effect contributes substantially to the piezoresistive phenomenon.
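
    Tunneling-based piezoresistivity models typically have the junction resistance grow exponentially with the inter-filler gap, so compressive strain that narrows the gap lowers the resistance. A toy form with assumed constants (not the paper's fitted model):

```python
import numpy as np

def tunneling_resistance_ratio(strain, s0=1.0, k=5.0):
    """Fractional resistance change of a tunneling junction whose
    gap scales as s = s0 * (1 + strain). With R ~ exp(k * s), the
    change is dR/R0 = exp(k * s0 * strain) - 1.
    s0 and k are assumed toy constants, not measured values."""
    return np.exp(k * s0 * np.asarray(strain)) - 1.0

# Compression (negative strain) narrows the gaps between CNT/NCB
# fillers and so reduces the resistance monotonically.
strains = np.array([0.0, -0.001, -0.002])
dR = tunneling_resistance_ratio(strains)
```

    The exponential gap dependence is what makes tunneling-dominated composites such sensitive strain sensors compared with a purely ohmic conductive-path picture.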

  2. Video-Based Modeling: Differential Effects due to Treatment Protocol

    ERIC Educational Resources Information Center

    Mason, Rose A.; Ganz, Jennifer B.; Parker, Richard I.; Boles, Margot B.; Davis, Heather S.; Rispoli, Mandy J.

    2013-01-01

    Identifying evidence-based practices for individuals with disabilities requires specification of procedural implementation. Video-based modeling (VBM), consisting of both video self-modeling and video modeling with others as model (VMO), is one class of interventions that has frequently been explored in the literature. However, current information…

  3. The Effectiveness of Project Based Learning in Trigonometry

    NASA Astrophysics Data System (ADS)

    Gerhana, M. T. C.; Mardiyana, M.; Pramudya, I.

    2017-09-01

    This research aimed to explore the effectiveness of Project-Based Learning (PjBL) with a scientific approach, viewed from interpersonal intelligence, on students’ achievement in learning mathematics. This research employed a quasi-experimental design. The subjects were grade X MIPA students in Sleman, Yogyakarta. The results showed that the project-based learning model is more effective at improving students’ mathematics learning achievement than the classical model with a scientific approach. This is because in the PjBL model students are better able to think actively and creatively: students are faced with a pleasant atmosphere in which to solve a problem drawn from everyday life. The project-based learning model is therefore expected to be a choice for teachers seeking to improve mathematics education.

  4. Don't Think, Just Feel the Music: Individuals with Strong Pavlovian-to-Instrumental Transfer Effects Rely Less on Model-based Reinforcement Learning.

    PubMed

    Sebold, Miriam; Schad, Daniel J; Nebe, Stephan; Garbusow, Maria; Jünger, Elisabeth; Kroemer, Nils B; Kathmann, Norbert; Zimmermann, Ulrich S; Smolka, Michael N; Rapp, Michael A; Heinz, Andreas; Huys, Quentin J M

    2016-07-01

    Behavioral choice can be characterized along two axes. One axis distinguishes reflexive, model-free systems that slowly accumulate values through experience and a model-based system that uses knowledge to reason prospectively. The second axis distinguishes Pavlovian valuation of stimuli from instrumental valuation of actions or stimulus-action pairs. This results in four values and many possible interactions between them, with important consequences for accounts of individual variation. We here explored whether individual variation along one axis was related to individual variation along the other. Specifically, we asked whether individuals' balance between model-based and model-free learning was related to their tendency to show Pavlovian interferences with instrumental decisions. In two independent samples with a total of 243 participants, Pavlovian-instrumental transfer effects were negatively correlated with the strength of model-based reasoning in a two-step task. This suggests a potential common underlying substrate predisposing individuals to both have strong Pavlovian interference and be less model-based and provides a framework within which to interpret the observation of both effects in addiction.

  5. Wind Plant Power Optimization through Yaw Control using a Parametric Model for Wake Effects -- A CFD Simulation Study

    DOE PAGES

    Gebraad, P. M. O.; Teeuwisse, F. W.; van Wingerden, J. W.; ...

    2016-01-01

    This article presents a wind plant control strategy that optimizes the yaw settings of wind turbines for improved energy production of the whole wind plant by taking into account wake effects. The optimization controller is based on a novel internal parametric model for wake effects, called the FLOw Redirection and Induction in Steady-state (FLORIS) model. The FLORIS model predicts the steady-state wake locations and the effective flow velocities at each turbine, and the resulting turbine electrical energy production levels, as a function of the axial induction and the yaw angle of the different rotors. The FLORIS model has a limited number of parameters that are estimated based on turbine electrical power production data. In high-fidelity computational fluid dynamics simulations of a small wind plant, we demonstrate that the optimization control based on the FLORIS model increases the energy production of the wind plant, with a reduction of loads on the turbines as an additional effect.
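
    FLORIS itself is parametric and data-fitted, but the axial-induction half of the trade-off it optimizes can be caricatured with a Jensen (top-hat) wake model and actuator-disc theory (assumed constants and geometry, not the FLORIS model):

```python
import numpy as np

def total_power(a_up, u_inf=8.0, spacing=7.0, k=0.05):
    """Two aligned turbines: the upstream rotor runs at axial
    induction a_up. Actuator-disc theory gives Cp = 4a(1-a)^2 and
    Ct = 4a(1-a); a Jensen top-hat wake with decay k sets the
    downstream inflow at `spacing` rotor diameters. Power is in
    arbitrary units (P ~ Cp * u^3). All constants are toy values."""
    cp_up = 4 * a_up * (1 - a_up) ** 2
    ct_up = 4 * a_up * (1 - a_up)
    deficit = (1 - np.sqrt(1 - ct_up)) / (1 + 2 * k * spacing) ** 2
    u_down = u_inf * (1 - deficit)
    a_dn = 1 / 3                     # downstream runs at Betz optimum
    cp_dn = 4 * a_dn * (1 - a_dn) ** 2
    return cp_up * u_inf ** 3 + cp_dn * u_down ** 3

# Sweep the upstream induction: derating below the Betz point can
# raise the plant total because the downstream inflow recovers.
best_a = max(np.linspace(0.05, 1 / 3, 200), key=total_power)
```

    In this toy setup the plant-optimal upstream induction falls below the single-turbine Betz optimum of 1/3, which is the basic reason plant-level control can beat greedy per-turbine control.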

  6. 75 FR 2523 - Office of Innovation and Improvement; Overview Information; Arts in Education Model Development...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... that is based on rigorous scientifically based research methods to assess the effectiveness of a...) Relies on measurements or observational methods that provide reliable and valid data across evaluators... of innovative, cohesive models that are based on research and have demonstrated that they effectively...

  7. Model Selection with the Linear Mixed Model for Longitudinal Data

    ERIC Educational Resources Information Center

    Ryoo, Ji Hoon

    2011-01-01

    Model building or model selection with linear mixed models (LMMs) is complicated by the presence of both fixed effects and random effects. The fixed effects structure and random effects structure are codependent, so selection of one influences the other. Most presentations of LMM in psychology and education are based on a multilevel or…

  8. Assessment of spatial discordance of primary and effective seed dispersal of European beech (Fagus sylvatica L.) by ecological and genetic methods.

    PubMed

    Millerón, M; López de Heredia, U; Lorenzo, Z; Alonso, J; Dounavi, A; Gil, L; Nanos, N

    2013-03-01

    Spatial discordance between primary and effective dispersal in plant populations indicates that postdispersal processes erase the seed rain signal in recruitment patterns. Five different models were used to test the spatial concordance of the primary and effective dispersal patterns in a European beech (Fagus sylvatica) population from central Spain. An ecological method was based on classical inverse modelling (SSS), using the number of seed/seedlings as input data. Genetic models were based on direct kernel fitting of mother-to-offspring distances estimated by a parentage analysis or were spatially explicit models based on the genotype frequencies of offspring (competing sources model and Moran-Clark's Model). A fully integrated mixed model was based on inverse modelling, but used the number of genotypes as input data (gene shadow model). The potential sources of error and limitations of each seed dispersal estimation method are discussed. The mean dispersal distances for seeds and saplings estimated with these five methods were higher than those obtained by previous estimations for European beech forests. All the methods show strong discordance between primary and effective dispersal kernel parameters, and for dispersal directionality. While seed rain was released mostly under the canopy, saplings were established far from mother trees. This discordant pattern may be the result of the action of secondary dispersal by animals or density-dependent effects; that is, the Janzen-Connell effect. © 2013 Blackwell Publishing Ltd.
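
    The "direct kernel fitting" step can be sketched as follows, assuming a 2-D exponential dispersal kernel fitted by maximum likelihood to mother-to-offspring distances from a parentage analysis; the exponential form is a common choice rather than necessarily the one used in the study, and the distances below are invented.

```python
def fit_exp_kernel_scale(distances):
    """MLE of the scale a of a 2-D exponential dispersal kernel.

    Observed mother-offspring distances r then have density
    f(r) = r * exp(-r / a) / a**2, whose maximum-likelihood
    estimate is a = mean(r) / 2.
    """
    return sum(distances) / (2 * len(distances))

dists = [12.0, 35.0, 8.0, 60.0, 22.0, 41.0]   # hypothetical distances in metres
a = fit_exp_kernel_scale(dists)
mean_dispersal = 2 * a   # under this kernel, the model mean equals the sample mean
```

    Comparing the scale fitted to seed-rain distances with the scale fitted to sapling distances is one way to quantify the primary-versus-effective discordance the abstract describes.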

  9. Formalizing the Role of Agent-Based Modeling in Causal Inference and Epidemiology

    PubMed Central

    Marshall, Brandon D. L.; Galea, Sandro

    2015-01-01

    Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry. PMID:25480821

  10. Impact of different policies on unhealthy dietary behaviors in an urban adult population: an agent-based simulation model.

    PubMed

    Zhang, Donglan; Giabbanelli, Philippe J; Arah, Onyebuchi A; Zimmerman, Frederick J

    2014-07-01

    Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems.
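
    A minimal sketch of the kind of agent-based policy experiment described, with invented parameters rather than the calibrated Pasadena model: each agent's fast-food probability responds to a price effect from the tax and to the visibility of peer norms, with idiosyncratic noise.

```python
import random

def simulate(tax=0.0, norm_boost=0.0, n_agents=5000, steps=20, seed=1):
    """Mean fast-food probability after `steps` rounds of dietary decisions.

    All coefficients (base probability, price response, norm response)
    are assumptions for illustration, not calibrated survey values.
    """
    rng = random.Random(seed)
    base_p = 0.40                        # assumed baseline fast-food probability
    agents = [base_p] * n_agents
    for _ in range(steps):
        mean_p = sum(agents) / n_agents  # peer norm visible this round
        for i in range(n_agents):
            p = base_p - 0.15 * tax                  # price response to the tax
            p -= 0.5 * norm_boost * (1.0 - mean_p)   # healthy-norm visibility
            agents[i] = min(max(p + rng.gauss(0.0, 0.02), 0.0), 1.0)
    return sum(agents) / n_agents

baseline = simulate()
taxed = simulate(tax=0.20)       # 20% fast-food tax
norms = simulate(norm_boost=0.10)  # 10% better norm visibility
assert taxed < baseline and norms < baseline
```

    Policies are compared by contrasting such simulated population means against the no-policy baseline, which is the experimental logic the abstract reports.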

  11. Effect of Grain Boundaries on the Performance of Thin-Film-Based Polycrystalline Silicon Solar Cells: A Numerical Modeling

    NASA Astrophysics Data System (ADS)

    Chhetri, Nikita; Chatterjee, Somenath

    2018-01-01

    Solar cells (photovoltaics), a renewable energy source, are considered the most effective alternative to conventional electrical energy generation. A cost-effective alternative to crystalline wafer-based solar cells is the thin-film polycrystalline-based solar cell. This paper reports a numerical analysis of the dependence of the solar cell parameters (i.e., efficiency, fill factor, open-circuit voltage and short-circuit current density) on grain size for thin-film-based polycrystalline silicon (Si) solar cells. A minority carrier lifetime model is proposed to correlate the grains, grain boundaries and lifetime for thin-film-based polycrystalline Si solar cells in a MATLAB environment. As observed, an increase in grain size results in an increase in minority carrier lifetime in polycrystalline Si thin film. A non-equivalent series resistance double-diode model is used to find the dark as well as light (AM1.5) current-voltage (I-V) characteristics for thin-film-based polycrystalline Si solar cells. To optimize the effectiveness of the proposed model, a successive approximation method is used and the corresponding fitting parameters are obtained. The model is validated against experimentally obtained results reported elsewhere, and the experimentally reported solar cell parameters can be reproduced using the proposed model.

  12. Modified polarized geometrical attenuation model for bidirectional reflection distribution function based on random surface microfacet theory.

    PubMed

    Liu, Hong; Zhu, Jingping; Wang, Kai

    2015-08-24

    The geometrical attenuation model given by Blinn has been widely used in geometrical-optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model, based on a symmetrical V-groove assumption and scalar ray theory, causes obvious inaccuracies in BRDF curves and neglects the effects of polarization. To address these issues, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented by combining masking and shadowing effects with polarization effects. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given as separate expressions and are validated against experimental data from two samples. The results show that the modified polarized geometrical attenuation function achieves better physical rationality, improves the precision of the BRDF model, and widens its applicability to different polarization states.
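
    For reference, the classical unpolarized Blinn attenuation term that the paper modifies can be written directly from the standard symmetric V-groove formula; the vectors below are illustrative.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    m = math.sqrt(dot(v, v))
    return tuple(x / m for x in v)

def blinn_G(n, l, v):
    """Blinn's masking/shadowing attenuation G for unit normal n, light l, view v.

    G = min(1, 2(N.H)(N.V)/(V.H), 2(N.H)(N.L)/(V.H)), with H the half vector.
    """
    h = normalize(tuple(a + b for a, b in zip(l, v)))   # half vector
    nh, nv, nl, vh = dot(n, h), dot(n, v), dot(n, l), dot(v, h)
    return min(1.0, 2.0 * nh * nv / vh, 2.0 * nh * nl / vh)

n = (0.0, 0.0, 1.0)
g = blinn_G(n, normalize((0.3, 0.0, 1.0)), normalize((-0.2, 0.1, 1.0)))
assert 0.0 < g <= 1.0
```

    The paper's contribution replaces this scalar, symmetric-groove term with separate p-polarized, s-polarized and unpolarized attenuation functions over random microfacets.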

  13. Meta-analyses of Theory use in Medication Adherence Intervention Research

    PubMed Central

    Conn, Vicki S.; Enriquez, Maithe; Ruppar, Todd M.; Chan, Keith C.

    2016-01-01

    Objective This systematic review applied meta-analytic procedures to integrate primary research that examined theory- or model-linked medication adherence interventions. Methods Extensive literature searching strategies were used to locate trials testing interventions with medication adherence behavior outcomes measured by electronic event monitoring, pharmacy refills, pill counts, and self-reports. Random-effects model analysis was used to calculate standardized mean difference effect sizes for medication adherence outcomes. Results Codable data were extracted from 146 comparisons with 19,348 participants. The most common theories and models were social cognitive theory and motivational interviewing. The overall weighted effect size for all interventions comparing treatment and control participants was 0.294. The effect size for interventions based on a single theory was 0.323 and for multiple-theory interventions was 0.214. Effect sizes for individual theories and models ranged from 0.041 to 0.447. The largest effect sizes were for interventions based on the health belief model (0.477) and adult learning theory (0.443). The smallest effect sizes were for interventions based on PRECEDE (0.041) and self-regulation (0.118). Conclusion These findings suggest that theory- and model-linked interventions have a significant but modest effect on medication adherence outcomes. PMID:26931748
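
    The random-effects pooling described can be sketched with the DerSimonian-Laird estimator, a standard choice for this kind of analysis (whether the review used exactly this estimator is not stated, and the effect sizes and variances below are made up).

```python
def random_effects_pool(effects, variances):
    """Pool standardized mean differences under a random-effects model
    using the DerSimonian-Laird between-study variance estimate."""
    w = [1.0 / v for v in variances]
    k = len(effects)
    fixed = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic around the fixed-effect mean
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance, floored at 0
    w_star = [1.0 / (v + tau2) for v in variances]
    return sum(wi * d for wi, d in zip(w_star, effects)) / sum(w_star)

# Hypothetical per-comparison effect sizes and sampling variances
pooled = random_effects_pool([0.48, 0.12, 0.30, 0.44], [0.02, 0.03, 0.015, 0.05])
assert 0.12 < pooled < 0.48   # pooled estimate lies within the study range
```

    With many comparisons, as in the 146-comparison data set here, the same computation yields the overall weighted effect size reported in the abstract.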

  14. Neural-Based Compensation of Nonlinearities in an Airplane Longitudinal Model with Dynamic-Inversion Control

    PubMed Central

    Li, YuHui; Jin, FeiTeng

    2017-01-01

    The inversion design approach is a very useful tool for complex multiple-input multiple-output nonlinear systems, such as airplane and spacecraft models, to achieve the decoupling control goal. In this work, a flight control law is proposed using the neural-based inversion design method combined with nonlinear compensation for a general longitudinal model of the airplane. First, the nonlinear mathematical model is converted to an equivalent linear model based on feedback linearization theory. Then, the flight control law integrated with this inversion model is developed to stabilize the nonlinear system and relieve the coupling effect. Afterwards, the inversion control combined with the neural network and the nonlinear portion is presented to improve the transient performance and attenuate the uncertain effects of both external disturbances and model errors. Finally, the simulation results demonstrate the effectiveness of this controller. PMID:29410680

  15. Team Modelling: Survey of Experimental Platforms (Modelisation d’equipes : Examen de plate-formes experimentales)

    DTIC Science & Technology

    2006-09-01

    Control, Force Agility, Shared Situational Awareness, Attentional Demand, Interoperability, Network Based Operations, Effect Based Operations, Speed of Command, Self Synchronization, Reach Back, Reach Forward, Information Superiority, Increased Mission Effectiveness ... communication effectiveness and Distributed Mission Training (DMT) effectiveness. The NASA Ames Centre - Distributed Research Facilities platform could...

  16. Evidence-Based Reform: Enhancing Language and Literacy in Early Childhood Education

    ERIC Educational Resources Information Center

    Slavin, Robert E.; Chambers, Bette

    2017-01-01

    Evidence-based reform is transforming education at all levels, both in providing effective models for use in schools and in linking policy to effective practice on a broad scale. As early education moves from a concern with effects of preschool versus no preschool to focus on creating and evaluating effective preschool models capable of improving…

  17. Sequential optimization of a terrestrial biosphere model constrained by multiple satellite based products

    NASA Astrophysics Data System (ADS)

    Ichii, K.; Kondo, M.; Wang, W.; Hashimoto, H.; Nemani, R. R.

    2012-12-01

    Various satellite-based spatial products such as evapotranspiration (ET) and gross primary productivity (GPP) are now produced by integration of ground and satellite observations. Effective use of these multiple satellite-based products in terrestrial biosphere models is an important step toward better understanding of terrestrial carbon and water cycles. However, due to the complexity of terrestrial biosphere models and their large number of parameters, applying these spatial data sets to terrestrial biosphere models is difficult. In this study, we established an effective but simple framework to refine a terrestrial biosphere model, Biome-BGC, using multiple satellite-based products as constraints. We tested the framework in the monsoon Asia region covered by AsiaFlux observations. The framework is based on hierarchical analysis (Wang et al. 2009) with model parameter optimization constrained by satellite-based spatial data. The Biome-BGC model is separated into several tiers to minimize the freedom of model parameter selection and maximize independence from the whole model. For example, the snow sub-model is first optimized using the MODIS snow cover product, followed by the soil water sub-model optimized with satellite-based ET (estimated by an empirical upscaling method, Support Vector Regression (SVR); Yang et al. 2007), the photosynthesis model optimized with satellite-based GPP (based on the SVR method), and the respiration and residual carbon cycle models optimized with biomass data. In an initial assessment, we found that most of the default sub-models (e.g. snow, water cycle and carbon cycle) showed large deviations from remote sensing observations. However, these biases were removed by applying the proposed framework. For example, gross primary productivity was initially underestimated in boreal and temperate forests and overestimated in tropical forests, but the parameter optimization scheme successfully reduced these biases. Our analysis shows that terrestrial carbon and water cycle simulations in monsoon Asia were greatly improved, and that using multiple satellite observations with this framework is an effective way to improve terrestrial biosphere models.

  18. Modeling vs. Coaching of Argumentation in a Case-Based Learning Environment.

    ERIC Educational Resources Information Center

    Li, Tiancheng; And Others

    The major purposes of this study are: (1) to investigate and compare the effectiveness of two instructional strategies, modeling and coaching on helping students to articulate and support their decisions in a case-based learning environment; (2) to compare the effectiveness of modeling and coaching on helping students address essential criteria in…

  19. Mixed effects versus fixed effects modelling of binary data with inter-subject variability.

    PubMed

    Murphy, Valda; Dunne, Adrian

    2005-04-01

    The question of whether or not a mixed effects model is required when modelling binary data with inter-subject variability and within-subject correlation was addressed in this journal by Yano et al. (J. Pharmacokin. Pharmacodyn. 28:389-412 [2001]). That report used simulation experiments to demonstrate that, under certain circumstances, the use of a fixed effects model produced more accurate estimates of the fixed effect parameters than those produced by a mixed effects model. The Laplace approximation to the likelihood was used when fitting the mixed effects model. This paper repeats one of those simulation experiments, with two binary observations recorded for every subject, and uses both the Laplace and the adaptive Gaussian quadrature approximations to the likelihood when fitting the mixed effects model. The results show that the estimates produced using the Laplace approximation include a small number of extreme outliers. This was not the case when using the adaptive Gaussian quadrature approximation. Further examination of these outliers shows that they arise in situations in which the Laplace approximation seriously overestimates the likelihood in an extreme region of the parameter space. It is also demonstrated that when the number of observations per subject is increased from two to three, the estimates based on the Laplace approximation no longer include any extreme outliers. The root mean squared error is a combination of the bias and the variability of the estimates. Increasing the sample size is known to reduce the variability of an estimator, with a consequent reduction in its root mean squared error. The estimates based on the fixed effects model are inherently biased, and this bias acts as a lower bound for the root mean squared error of these estimates. Consequently, it might be expected that for data sets with a greater number of subjects the estimates based on the mixed effects model would be more accurate than those based on the fixed effects model. This is borne out by the results of a further simulation experiment with an increased number of subjects in each set of data. The difference in the interpretation of the parameters of the fixed and mixed effects models is discussed. It is demonstrated that the mixed effects model and parameter estimates can be used to estimate the parameters of the fixed effects model, but not vice versa.
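
    The quadrature approximation at issue can be sketched for a random-intercept logistic model. For brevity this uses plain (non-adaptive) Gauss-Hermite quadrature, the non-adaptive cousin of the scheme discussed above; the data and parameters are illustrative.

```python
import numpy as np

def marginal_loglik(beta, sigma, y_per_subject, n_nodes=20):
    """Log-likelihood of a random-intercept logistic model with the
    subject-level intercept b ~ N(0, sigma**2) integrated out by
    Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    b = np.sqrt(2.0) * sigma * nodes            # change of variables to N(0, sigma^2)
    total = 0.0
    for y in y_per_subject:                     # y: tuple of 0/1 responses per subject
        eta = beta + b                          # linear predictor at each quadrature node
        p = 1.0 / (1.0 + np.exp(-eta))
        lik = np.ones_like(b)
        for obs in y:                           # conditional likelihood given b
            lik *= np.where(obs == 1, p, 1.0 - p)
        total += np.log(np.dot(weights, lik) / np.sqrt(np.pi))
    return total

data = [(1, 1), (0, 0), (1, 0), (0, 1), (1, 1)]  # two binary observations per subject
ll = marginal_loglik(beta=0.2, sigma=1.0, y_per_subject=data)
assert ll < 0.0   # a log-likelihood of binary data is negative
```

    The Laplace approximation replaces this sum over nodes with a single Gaussian approximation around the mode of the integrand, which is exactly where the overestimation described above can occur; adaptive quadrature recenters and rescales the nodes per subject to avoid it.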

  20. A quantum wave based compact modeling approach for the current in ultra-short DG MOSFETs suitable for rapid multi-scale simulations

    NASA Astrophysics Data System (ADS)

    Hosenfeld, Fabian; Horst, Fabian; Iñíguez, Benjamín; Lime, François; Kloes, Alexander

    2017-11-01

    Source-to-drain (SD) tunneling decreases device performance in MOSFETs below the 10 nm channel length. Modeling quantum mechanical effects, including SD tunneling, has therefore gained importance, especially for compact model developers. The non-equilibrium Green's function (NEGF) method has become state of the art for nano-scaled device simulation in recent years. In the sense of a multi-scale simulation approach, it is necessary to bridge the gap between compact models, with their fast and efficient calculation of the device current, and numerical device models, which consider the quantum effects of nano-scaled devices. In this work, an NEGF-based analytical model for nano-scaled double-gate (DG) MOSFETs is introduced. The model consists of a closed-form potential solution from a classical compact model and a 1D NEGF formalism for calculating the device current, taking into account quantum mechanical effects. The potential calculation omits the iterative coupling and allows straightforward current calculation. The model is based on a ballistic NEGF approach whereby backscattering effects are considered as a second-order effect in closed form. The accuracy and scalability of the non-iterative DG MOSFET model is inspected in comparison with numerical NanoMOS TCAD data for various channel lengths. With the help of this model, investigations of short-channel and temperature effects are performed.

  1. Effect of Modeling-Based Activities Developed Using Virtual Environments and Concrete Objects on Spatial Thinking and Mental Rotation Skills

    ERIC Educational Resources Information Center

    Yurt, Eyup; Sunbul, Ali Murat

    2012-01-01

    In this study, the effect of modeling based activities using virtual environments and concrete objects on spatial thinking and mental rotation skills was investigated. The study was designed as a pretest-posttest model with a control group, which is one of the experimental research models. The study was carried out on sixth grade students…

  2. Improving Learning for All Students through Equity-Based Inclusive Reform Practices: Effectiveness of a Fully Integrated Schoolwide Model on Student Reading and Math Achievement

    ERIC Educational Resources Information Center

    Choi, Jeong Hoon; Meisenheimer, Jessica M.; McCart, Amy B.; Sailor, Wayne

    2017-01-01

    The present investigation examines the schoolwide applications model (SAM) as a potentially effective school reform model for increasing equity-based inclusive education practices while enhancing student reading and math achievement for all students. A 3-year quasi-experimental comparison group analysis using latent growth modeling (LGM) was used…

  3. A Multi-layer Dynamic Model for Coordination Based Group Decision Making in Water Resource Allocation and Scheduling

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Zhang, Xingnan; Li, Chenming; Wang, Jianying

    Management of group decision making is an important issue in water resource management. To overcome the lack of effective communication and cooperation in existing decision-making models, this paper proposes a multi-layer dynamic model for coordination in group decision making for water resource allocation and scheduling. By introducing a scheme-recognized cooperative satisfaction index and a scheme-adjusted rationality index, the proposed model solves the problem of poor convergence of the multi-round decision-making process in water resource allocation and scheduling. Furthermore, the coordination problem in group decision making under limited resources can be addressed through distance-based conflict resolution. The simulation results show that the proposed model converges better than existing models.

  4. Universal core model for multiple-gate field-effect transistors with short channel and quantum mechanical effects

    NASA Astrophysics Data System (ADS)

    Shin, Yong Hyeon; Bae, Min Soo; Park, Chuntaek; Park, Joung Won; Park, Hyunwoo; Lee, Yong Ju; Yun, Ilgu

    2018-06-01

    A universal core model for multiple-gate (MG) field-effect transistors (FETs) with short channel effects (SCEs) and quantum mechanical effects (QMEs) is proposed. By using a Young's-approximation-based solution of the one-dimensional Poisson equation, the total inversion charge density (Q inv ) in the channel is modeled for double-gate (DG) and surrounding-gate (SG) FETs, following which a universal charge model is derived based on the similarity of the solutions, including for quadruple-gate (QG) FETs. For triple-gate (TG) FETs, the average of the DG and QG FET solutions is used. An SCE model is also proposed that considers the potential difference between the channel's surface and center. Finally, a QME model for MG FETs is developed using the quantum correction compact model. The proposed universal core model is validated against commercially available three-dimensional ATLAS numerical simulations.

  5. The effectiveness of clinical problem-based learning model of medico-jurisprudence education on general law knowledge for Obstetrics/Gynecological interns.

    PubMed

    Chang, Hui-Chin; Wang, Ning-Yen; Ko, Wen-Ru; Yu, You-Tsz; Lin, Long-Yau; Tsai, Hui-Fang

    2017-06-01

    The most effective method of teaching medico-jurisprudence to medical students is unclear. This study was designed to evaluate the effectiveness of a problem-based learning (PBL) model for teaching medico-jurisprudence in a clinical setting, as measured by the General Law Knowledge (GLK) of medical students. Senior medical students attending either a campus-based law curriculum or Obstetrics/Gynecology (Ob/Gyn) clinical-setting morning meetings from February to July 2015 were enrolled. A validated questionnaire comprising 45 questions was completed before and after the law education. The interns attending the clinical-setting small-group improvisation medico-jurisprudence PBL sessions had significantly better GLK scores than the students attending the campus-based medical law course over the period studied. The PBL teaching model of medico-jurisprudence is an ideal alternative pedagogical model for the medical law education curriculum. Copyright © 2017. Published by Elsevier B.V.

  6. A game theory-based trust measurement model for social networks.

    PubMed

    Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong

    2016-01-01

    In online social networks, trust is a complex social relationship: participants want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent; it needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks. The trust degree is calculated from three aspects (service reliability, feedback effectiveness and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective, and that the free-riding problem can be resolved effectively by adding the proposed punishment mechanism.
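
    A minimal sketch of how the three aspects might combine into a single trust degree, with the aspect names taken from the abstract; the fixed weights and the punishment rule are assumptions, since the paper derives its combination from game-theoretic analysis rather than from constants like these.

```python
def trust_degree(service_reliability, feedback_effectiveness,
                 recommendation_credibility, weights=(0.5, 0.3, 0.2)):
    """Weighted aggregate of the three trust aspects, each scored in [0, 1].

    The weights here are illustrative assumptions, not the paper's values.
    """
    aspects = (service_reliability, feedback_effectiveness,
               recommendation_credibility)
    return sum(w * a for w, a in zip(weights, aspects))

def punish(trust, defected, penalty=0.3):
    """Toy punishment rule: a detected free-rider loses a fraction of trust."""
    return trust * (1.0 - penalty) if defected else trust

t = trust_degree(0.9, 0.7, 0.6)
assert 0.0 <= punish(t, defected=True) < t <= 1.0
```

    Free-riding is discouraged because a defecting participant's trust degree, and hence its future access to reliable partners, drops after each detected defection.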

  7. Example-Based Learning: Effects of Model Expertise in Relation to Student Expertise

    ERIC Educational Resources Information Center

    Boekhout, Paul; van Gog, Tamara; van de Wiel, Margje W. J.; Gerards-Last, Dorien; Geraets, Jacques

    2010-01-01

    Background: Worked examples are very effective for novice learners. They typically present a written-out ideal (didactical) solution for learners to study. Aims: This study used worked examples of patient history taking in physiotherapy that presented a "non"-didactical solution (i.e., based on actual performance). The effects of model expertise…

  8. School Processes Mediate School Compositional Effects: Model Specification and Estimation

    ERIC Educational Resources Information Center

    Liu, Hongqiang; Van Damme, Jan; Gielen, Sarah; Van Den Noortgate, Wim

    2015-01-01

    School composition effects have been consistently verified, but few studies ever attempted to study how school composition affects school achievement. Based on prior research findings, we employed multilevel mediation modeling to examine whether school processes mediate the effect of school composition upon school outcomes based on the data of 28…

  9. Using Design-Based Latent Growth Curve Modeling with Cluster-Level Predictor to Address Dependency

    ERIC Educational Resources Information Center

    Wu, Jiun-Yu; Kwok, Oi-Man; Willson, Victor L.

    2014-01-01

    The authors compared the effects of using the true Multilevel Latent Growth Curve Model (MLGCM) with single-level regular and design-based Latent Growth Curve Models (LGCM) with or without the higher-level predictor on various criterion variables for multilevel longitudinal data. They found that random effect estimates were biased when the…

  10. A Model for Measuring Effectiveness of an Online Course

    ERIC Educational Resources Information Center

    Mashaw, Bijan

    2012-01-01

    As a result of this research, a quantitative model and a procedure have been developed to create an online mentoring effectiveness index (EI). To develop the model, mentoring and teaching effectiveness are defined, and then the constructs and factors of effectiveness are identified. The model's construction is based on the theory that…

  11. Study of helicopter roll control effectiveness criteria

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Bourne, Simon M.; Curtiss, Howard C., Jr.; Hindson, William S.; Hess, Ronald A.

    1986-01-01

    A study of helicopter roll control effectiveness based on closed-loop task performance measurement and modeling is presented. Roll control criteria are based on task margin, the excess of vehicle task performance capability over the pilot's task performance demand. Appropriate helicopter roll axis dynamic models are defined for use with analytic models for task performance. Both near-earth and up-and-away large-amplitude maneuvering phases are considered. The results of in-flight and moving-base simulation measurements are presented to support the roll control effectiveness criteria offered. This Volume contains the theoretical analysis, simulation results and criteria development.

  12. Forgetting in immediate serial recall: decay, temporal distinctiveness, or interference?

    PubMed

    Oberauer, Klaus; Lewandowsky, Stephan

    2008-07-01

    Three hypotheses of forgetting from immediate memory were tested: time-based decay, decreasing temporal distinctiveness, and interference. The hypotheses were represented by 3 models of serial recall: the primacy model, the SIMPLE (scale-independent memory, perception, and learning) model, and the SOB (serial order in a box) model, respectively. The models were fit to 2 experiments investigating the effect of filled delays between items at encoding or at recall. Short delays between items, filled with articulatory suppression, led to massive impairment of memory relative to a no-delay baseline. Extending the delays had little additional effect, suggesting that the passage of time alone does not cause forgetting. Adding a choice reaction task in the delay periods to block attention-based rehearsal did not change these results. The interference-based SOB fit the data best; the primacy model overpredicted the effect of lengthening delays, and SIMPLE was unable to explain the effect of delays at encoding. The authors conclude that purely temporal views of forgetting are inadequate. Copyright (c) 2008 APA, all rights reserved.

  13. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414

  14. An improved null model for assessing the net effects of multiple stressors on communities.

    PubMed

    Thompson, Patrick L; MacLennan, Megan M; Vinebrooke, Rolf D

    2018-01-01

    Ecological stressors (i.e., environmental factors outside their normal range of variation) can mediate each other through their interactions, leading to unexpected combined effects on communities. Determining whether the net effect of stressors is ecologically surprising requires comparing their cumulative impact to a null model that represents the linear combination of their individual effects (i.e., an additive expectation). However, we show that standard additive and multiplicative null models that base their predictions on the effects of single stressors on community properties (e.g., species richness or biomass) do not provide this linear expectation, leading to incorrect interpretations of antagonistic and synergistic responses by communities. We present an alternative, the compositional null model, which instead bases its predictions on the effects of stressors on individual species, and then aggregates them to the community level. Simulations demonstrate the improved ability of the compositional null model to accurately provide a linear expectation of the net effect of stressors. We simulate the response of communities to paired stressors that affect species in a purely additive fashion and compare the relative abilities of the compositional null model and two standard community property null models (additive and multiplicative) to predict these linear changes in species richness and community biomass across different combinations (both positive, both negative, or opposite in sign) and intensities of stressors. The compositional model predicts the linear effects of multiple stressors under almost all scenarios, allowing for proper classification of net effects, whereas the standard null models do not. Our findings suggest that current estimates of the prevalence of ecological surprises in communities based on community property null models are unreliable and should be improved by integrating the responses of individual species to the community level, as our compositional null model does. © 2017 John Wiley & Sons Ltd.
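
    The difference between the community-property null models and the compositional null model can be sketched in a few lines. The toy example below is illustrative only (the species, biomass values, and effect sizes are invented, not taken from the paper): once a stressor drives a species extinct, summing single-stressor changes in total biomass no longer matches aggregating additive species-level effects.

```python
# Toy comparison (invented values, not the paper's data): an additive
# community-property null model vs. the compositional null model.

control = {"sp1": 10.0, "sp2": 5.0, "sp3": 2.0}      # biomass per species
effect_a = {"sp1": -4.0, "sp2": 1.0, "sp3": -3.0}    # stressor A alone
effect_b = {"sp1": -4.0, "sp2": -3.0, "sp3": 0.5}    # stressor B alone

def biomass(community):
    # total biomass counts only extant (positive-biomass) species
    return sum(max(b, 0.0) for b in community.values())

def richness(community):
    return sum(1 for b in community.values() if b > 0.0)

def shifted(effect):
    return {s: control[s] + effect[s] for s in control}

# community-property additive null: sum the single-stressor changes in the
# community property (here, total biomass)
prop_null_biomass = (biomass(control)
                     + (biomass(shifted(effect_a)) - biomass(control))
                     + (biomass(shifted(effect_b)) - biomass(control)))

# compositional null: add the effects species by species, then aggregate
both = {s: control[s] + effect_a[s] + effect_b[s] for s in control}
comp_null_biomass = biomass(both)

print(prop_null_biomass, comp_null_biomass)   # 5.5 vs. 5.0
```

    The two nulls disagree (5.5 vs. 5.0) because extinction truncates biomass at zero, and only the compositional null applies that truncation at the species level before aggregating.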

  15. Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J. 2011. Component-based development and sensitivity analyses of an air pollutant dry deposition model. Environmental Modelling & Software. 26(6): 804-816.

    Treesearch

    Satoshi Hirabayashi; Chuck Kroll; David Nowak

    2011-01-01

    The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...

  16. Nonlinear Fluid Model Of 3-D Field Effects In Tokamak Plasmas

    NASA Astrophysics Data System (ADS)

    Callen, J. D.; Hegna, C. C.; Beidler, M. T.

    2017-10-01

    Extended MHD codes (e.g., NIMROD, M3D-C1) are beginning to explore nonlinear effects of small 3-D magnetic fields on tokamak plasmas. To facilitate development of analogous physically understandable reduced models, a fluid-based dynamic nonlinear model of these added 3-D field effects in the base axisymmetric tokamak magnetic field geometry is being developed. The model incorporates kinetic-based closures within an extended MHD framework. Key 3-D field effects models that have been developed include: 1) a comprehensive modified Rutherford equation for the growth of a magnetic island that includes the classical tearing and NTM perturbed bootstrap current drives, externally applied magnetic field and current drives, and classical and neoclassical polarization current effects, and 2) dynamic nonlinear evolution of the plasma toroidal flow (radial electric field) in response to the 3-D fields. An application of this model to RMP ELM suppression precipitated by an ELM crash will be discussed. Supported by Office of Fusion Energy Sciences, Office of Science, Dept. of Energy Grants DE-FG02-86ER53218 and DE-FG02-92ER54139.

  17. Suicide in the Media: A Quantitative Review of Studies Based on Nonfictional Stories

    ERIC Educational Resources Information Center

    Stack, Steven

    2005-01-01

    Research on the effect of suicide stories in the media on suicide in the real world has been marked by much debate and inconsistent findings. Recent narrative reviews have suggested that research based on nonfictional models is more apt to uncover imitative effects than research based on fictional models. There is, however, substantial variation…

  18. Time-dependent pharmacokinetics of dexamethasone and its efficacy in human breast cancer xenograft mice: a semi-mechanism-based pharmacokinetic/pharmacodynamic model.

    PubMed

    Li, Jian; Chen, Rong; Yao, Qing-Yu; Liu, Sheng-Jun; Tian, Xiu-Yun; Hao, Chun-Yi; Lu, Wei; Zhou, Tian-Yan

    2018-03-01

    Dexamethasone (DEX) is a substrate of CYP3A. However, the activity of CYP3A can be induced by DEX when DEX is persistently administered, resulting in auto-induction and time-dependent pharmacokinetics (pharmacokinetics with time-dependent clearance) of DEX. In this study we investigated the pharmacokinetic profiles of DEX after single or multiple doses in human breast cancer xenograft nude mice and established a semi-mechanism-based pharmacokinetic/pharmacodynamic (PK/PD) model for characterizing the time-dependent PK of DEX as well as its anti-cancer effect. The mice were orally given a single or multiple doses (8 mg/kg) of DEX, and the plasma concentrations of DEX were assessed using LC-MS/MS. Tumor volumes were recorded daily. Based on the experimental data, a two-compartment model with first-order absorption and time-dependent clearance was established, and the time-dependence of clearance was modeled by a sigmoid Emax equation. Moreover, a semi-mechanism-based PK/PD model was developed, in which the auto-induction effect of DEX on its metabolizing enzyme CYP3A was integrated and drug potency was described using an Emax equation. The PK/PD model was further used to predict drug efficacy when the auto-induction effect was or was not considered, which further revealed the necessity of adding the auto-induction effect to the final PK/PD model. This study established a semi-mechanism-based PK/PD model for characterizing the time-dependent pharmacokinetics of DEX and its anti-cancer effect in breast cancer xenograft mice. The model may serve as a reference for DEX dose adjustment or optimization in future preclinical or clinical studies.
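
    The shape of such a time-dependent clearance is easy to sketch. The following is a deliberately simplified, hypothetical illustration (one compartment rather than the paper's two, and all parameter values invented): clearance rises over time via a sigmoid Emax function, mimicking CYP3A auto-induction, and plasma concentration is integrated with explicit Euler steps.

```python
# Hypothetical sketch, NOT the paper's fitted model: one-compartment PK with
# first-order absorption and a clearance that rises over time through a
# sigmoid Emax function, mimicking auto-induction.

ka, V, CL0 = 1.0, 2.0, 0.5           # 1/h, L, L/h (illustrative values)
Emax, T50, gamma = 2.0, 24.0, 3.0    # max fold increase, h, Hill coefficient

def clearance(t):
    # CL(t) grows from CL0 toward CL0 * (1 + Emax) as induction develops
    return CL0 * (1.0 + Emax * t**gamma / (T50**gamma + t**gamma))

def simulate(dose=8.0, t_end=48.0, dt=0.01):
    """Explicit Euler integration of gut and plasma amounts;
    returns (peak concentration, final concentration)."""
    gut, plasma, t, cmax = dose, 0.0, 0.0, 0.0
    while t < t_end:
        absorbed = ka * gut * dt
        eliminated = (clearance(t) / V) * plasma * dt
        gut -= absorbed
        plasma += absorbed - eliminated
        cmax = max(cmax, plasma / V)
        t += dt
    return cmax, plasma / V

cmax, c_end = simulate()
print(cmax, c_end)
```

    With induction switched on, repeated doses are cleared progressively faster, which is why ignoring auto-induction would overpredict exposure.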

  19. Testing the Model-Observer Similarity Hypothesis with Text-Based Worked Examples

    ERIC Educational Resources Information Center

    Hoogerheide, Vincent; Loyens, Sofie M. M.; Jadi, Fedora; Vrins, Anna; van Gog, Tamara

    2017-01-01

    Example-based learning is a very effective and efficient instructional strategy for novices. It can be implemented using text-based worked examples that provide a written demonstration of how to perform a task, or (video) modelling examples in which an instructor (the "model") provides a demonstration. The model-observer similarity (MOS)…

  20. An MPI-based MoSST core dynamics model

    NASA Astrophysics Data System (ADS)

    Jiang, Weiyuan; Kuang, Weijia

    2008-09-01

    Distributed systems are among the most cost-effective and expandable platforms for high-end scientific computing. Scalable numerical models are therefore important for effective use of such systems. In this paper, we present an MPI-based numerical core dynamics model for simulation of geodynamo and planetary dynamos, and for simulation of core-mantle interactions. The model is developed based on MPI libraries. Two algorithms are used for node-node communication: a "master-slave" architecture and a "divide-and-conquer" architecture. The former is easy to implement but not scalable in communication. The latter is scalable in both computation and communication. The model scalability is tested on Linux PC clusters with up to 128 nodes. This model is also benchmarked with a published numerical dynamo model solution.

  1. Effects of linking a soil-water-balance model with a groundwater-flow model

    USGS Publications Warehouse

    Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.

    2013-01-01

    A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze effects to groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improvement of simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and estimated or measured target values for the previously published model and linked models were relatively similar and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than the previously published model provided. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.

  2. An Ounce of Prevention, a Pound of Uncertainty: The Cost-Effectiveness of School-Based Drug Prevention Programs.

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.; Rydell, C. Peter; Everingham, Susan S.; Chiesa, James; Bushway, Shawn

    This book describes an analysis of the cost-effectiveness of model school-based drug prevention programs at reducing cocaine consumption. It compares prevention's cost-effectiveness with that of several enforcement programs and with that of treating heavy cocaine users. It also assesses the cost of nationwide implementation of model prevention…

  3. Evaluation and linking of effective parameters in particle-based models and continuum models for mixing-limited bimolecular reactions

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Papelis, Charalambos; Sun, Pengtao; Yu, Zhongbo

    2013-08-01

    Particle-based models and continuum models have been developed over decades to quantify mixing-limited bimolecular reactions. Effective model parameters control reaction kinetics, but the relationship between the particle-based model parameter (such as the interaction radius R) and the continuum model parameter (i.e., the effective rate coefficient Kf) remains obscure. This study attempts to evaluate and link R and Kf for the second-order bimolecular reaction in both the bulk and the sharp-concentration-gradient (SCG) systems. First, in the bulk system, the agent-based method reveals that R remains constant for irreversible reactions and decreases nonlinearly in time for a reversible reaction, while mathematical analysis shows that Kf transitions from an exponential to a power-law function. A qualitative link between R and Kf can then be built for the irreversible reaction with equal initial reactant concentrations. Second, in the SCG system with a reaction interface, numerical experiments show that when R and Kf decline as t^(-1/2) (for example, to account for the reactant front expansion), the two models capture the transient power-law growth of product mass, and their effective parameters have the same functional form. Finally, revisiting laboratory experiments further shows that the best-fit factor in R and Kf is of the same order, and both models can efficiently describe the chemical kinetics observed in the SCG system. Effective model parameters used to describe reaction kinetics may therefore be linked directly, where the exact linkage may depend on the chemical and physical properties of the system.
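
    On the continuum side, the second-order kinetics discussed above reduce, for an irreversible reaction with equal initial reactant concentrations, to dA/dt = -Kf*A^2 with the closed form A(t) = A0/(1 + Kf*A0*t), i.e., power-law decay. A minimal sketch (illustrative values, not the study's data) checks a numerical integration against that closed form.

```python
# Illustrative continuum-side check (invented values): irreversible
# A + B -> C with equal initial concentrations obeys dA/dt = -Kf * A**2,
# whose closed form is A(t) = A0 / (1 + Kf * A0 * t) -- power-law decay.

A0, Kf, dt, steps = 1.0, 0.1, 1e-4, 100_000

A = A0
for _ in range(steps):
    A -= Kf * A * A * dt          # explicit Euler step

t = steps * dt                    # t = 10
A_exact = A0 / (1.0 + Kf * A0 * t)
print(A, A_exact)                 # both close to 0.5
```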

  4. MDR-TB patients in KwaZulu-Natal, South Africa: Cost-effectiveness of 5 models of care

    PubMed Central

    Wallengren, Kristina; Reddy, Tarylee; Besada, Donela; Brust, James C. M.; Voce, Anna; Desai, Harsha; Ngozo, Jacqueline; Radebe, Zanele; Master, Iqbal; Padayatchi, Nesri; Daviaud, Emmanuelle

    2018-01-01

    Background South Africa has a high burden of MDR-TB, and to provide accessible treatment the government has introduced different models of care. We report the most cost-effective model after comparing cost per patient successfully treated across 5 models of care: centralized hospital, district hospitals (2), and community-based care through clinics or mobile injection teams. Methods In an observational study five cohorts were followed prospectively. The cost analysis adopted a provider perspective and economic cost per patient successfully treated was calculated based on country protocols and length of treatment per patient per model of care. Logistic regression was used to calculate propensity score weights, to compare pairs of treatment groups, whilst adjusting for baseline imbalances between groups. Propensity score weighted costs and treatment success rates were used in the ICER analysis. Sensitivity analysis focused on varying treatment success and length of hospitalization within each model. Results In 1,038 MDR-TB patients 75% were HIV-infected and 56% were successfully treated. The cost per successfully treated patient was 3 to 4.5 times lower in the community-based models with no hospitalization. Overall, the Mobile model was the most cost-effective. Conclusion Reducing the length of hospitalization and following community-based models of care improves the affordability of MDR-TB treatment without compromising its effectiveness. PMID:29668748

  5. Comparing effects of fire modeling methods on simulated fire patterns and succession: a case study in the Missouri Ozarks

    Treesearch

    Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson

    2008-01-01

    We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...

  6. Dynamic models for estimating the effect of HAART on CD4 in observational studies: Application to the Aquitaine Cohort and the Swiss HIV Cohort Study.

    PubMed

    Prague, Mélanie; Commenges, Daniel; Gran, Jon Michael; Ledergerber, Bruno; Young, Jim; Furrer, Hansjakob; Thiébaut, Rodolphe

    2017-03-01

    Highly active antiretroviral therapy (HAART) has proved efficient in increasing CD4 counts in many randomized clinical trials. Because randomized trials have some limitations (e.g., short duration, highly selected subjects), it is interesting to assess the effect of treatments using observational studies. This is challenging because treatment is started preferentially in subjects with severe conditions. This general problem has been addressed using Marginal Structural Models (MSM) relying on the counterfactual formulation. Another approach to causality is based on dynamical models. We present three discrete-time dynamic models based on linear increments models (LIM): the first based on one difference equation for CD4 counts, the second with an equilibrium point, and the third based on a system of two difference equations, which allows jointly modeling CD4 counts and viral load. We also consider continuous-time models based on ordinary differential equations with non-linear mixed effects (ODE-NLME). These mechanistic models allow incorporating biological knowledge when available, which leads to increased statistical evidence for detecting treatment effects. Because inference in ODE-NLME is numerically challenging and requires specific methods and software, LIM are a valuable intermediary option in terms of consistency, precision, and complexity. We compare the different approaches in simulation and in an illustration on the ANRS CO3 Aquitaine Cohort and the Swiss HIV Cohort Study. © 2016, The International Biometric Society.

  7. A decision model for cost effective design of biomass based green energy supply chains.

    PubMed

    Yılmaz Balaman, Şebnem; Selim, Hasan

    2015-09-01

    The core aim of this study is the cost-effective design of anaerobic digestion-based biomass-to-energy supply chains. To this end, a decision model is developed. The model is based on fuzzy multi-objective decision making in order to simultaneously optimize multiple economic objectives and tackle the inherent uncertainties in the parameters and in decision makers' aspiration levels for the goals. The viability of the decision model is explored with computational experiments on a real-world biomass-to-energy supply chain, and further analyses are performed to observe the effects of different conditions. To this aim, scenario analyses are conducted to investigate the effects of energy crop utilization and operational costs on supply chain structure and performance measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Working memory differences in long-distance dependency resolution

    PubMed Central

    Nicenboim, Bruno; Vasishth, Shravan; Gattei, Carolina; Sigman, Mariano; Kliegl, Reinhold

    2015-01-01

    There is a wealth of evidence showing that increasing the distance between an argument and its head leads to more processing effort, namely, locality effects; these are usually associated with constraints in working memory (DLT: Gibson, 2000; activation-based model: Lewis and Vasishth, 2005). In SOV languages, however, the opposite effect has been found: antilocality (see discussion in Levy et al., 2013). Antilocality effects can be explained by the expectation-based approach as proposed by Levy (2008) or by the activation-based model of sentence processing as proposed by Lewis and Vasishth (2005). We report an eye-tracking and a self-paced reading study with sentences in Spanish together with measures of individual differences to examine the distinction between expectation- and memory-based accounts, and within memory-based accounts the further distinction between DLT and the activation-based model. The experiments show that (i) antilocality effects as predicted by the expectation account appear only for high-capacity readers; (ii) increasing dependency length by interposing material that modifies the head of the dependency (the verb) produces stronger facilitation than increasing dependency length with material that does not modify the head; this is in agreement with the activation-based model but not with the expectation account; and (iii) a possible outcome of memory load on low-capacity readers is the increase in regressive saccades (locality effects as predicted by memory-based accounts) or, surprisingly, a speedup in the self-paced reading task; the latter consistent with good-enough parsing (Ferreira et al., 2002). In sum, the study suggests that individual differences in working memory capacity play a role in dependency resolution, and that some of the aspects of dependency resolution can be best explained with the activation-based model together with a prediction component. PMID:25852623

  10. The effect of row structure on soil moisture retrieval accuracy from passive microwave data.

    PubMed

    Xingming, Zheng; Kai, Zhao; Yangyang, Li; Jianhua, Ren; Yanling, Ding

    2014-01-01

    Row structure causes anisotropy in the microwave brightness temperature (TB) of the soil surface, and it can also affect soil moisture retrieval accuracy when its influence is ignored in the inversion model. To study the effect of typical row structure on retrieved soil moisture and to evaluate whether this effect needs to be introduced into the inversion model, two ground-based experiments were carried out in 2011. Based on the observed C-band TB and field soil and vegetation parameters, a row-structured rough surface assumption (Qp model and discrete model), including the effect of row structure, and a flat rough surface assumption (Qp model), ignoring the effect of row structure, are used to model the microwave TB of the soil surface. Soil moisture can then be retrieved, respectively, by minimizing the difference between the measured and modeled TB. The results show that soil moisture retrieval accuracy based on the row-structured rough surface assumption is approximately 0.02 cm³/cm³ better than with the flat rough surface assumption for vegetated soil, and 0.015 cm³/cm³ better for bare and wet soil. This result indicates that the effect of row structure cannot be ignored when C-band data are used to accurately retrieve the soil moisture of farmland surfaces.

  11. [Primary branch size of Pinus koraiensis plantation: a prediction based on linear mixed effect model].

    PubMed

    Dong, Ling-Bo; Liu, Zhao-Gang; Li, Feng-Ri; Jiang, Li-Chun

    2013-09-01

    By using branch analysis data of 955 standard branches from 60 sampled trees in 12 sampling plots of a Pinus koraiensis plantation at Mengjiagang Forest Farm in Heilongjiang Province, Northeast China, and based on linear mixed-effect model theory and methods, models for predicting branch variables, including primary branch diameter, length, and angle, were developed. To account for tree effects, the MIXED module of SAS software was used to fit the prediction models. The results indicated that the fitting precision of the models could be improved by choosing appropriate random-effect parameters and variance-covariance structures. Correlation structures, including the compound symmetry structure (CS), first-order autoregressive structure [AR(1)], and first-order autoregressive and moving average structure [ARMA(1,1)], were then added to the optimal branch size mixed-effect model. The AR(1) structure significantly improved the fitting precision of the branch diameter and length mixed-effect models, but none of the three structures improved the precision of the branch angle mixed-effect model. To describe heteroscedasticity while building the mixed-effect model, the CF1 and CF2 functions were added to the branch mixed-effect model. The CF1 function significantly improved the fit of the branch angle mixed model, whereas the CF2 function significantly improved the fit of the branch diameter and length mixed models. Model validation confirmed that the mixed-effect model improves prediction precision compared with the traditional regression model for branch size prediction in Pinus koraiensis plantations.
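
    For reference, the AR(1) structure mentioned above assigns within-tree residuals the correlation rho^|i-j| between the i-th and j-th branch measurements, so correlation decays geometrically with separation. A minimal sketch (rho = 0.6 is an arbitrary illustrative value, not an estimate from the paper):

```python
# The AR(1) within-subject correlation structure: corr(e_i, e_j) = rho**|i - j|.
# rho = 0.6 is illustrative only.

def ar1_corr(n, rho):
    return [[rho ** abs(i - j) for j in range(n)] for i in range(n)]

R = ar1_corr(4, 0.6)
for row in R:
    print(row)
```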

  12. Effects of large vessel on temperature distribution based on photothermal coupling interaction model

    NASA Astrophysics Data System (ADS)

    Li, Zhifang; Zhang, Xiyang; Li, Zuoran; Li, Hui

    2016-10-01

    This paper uses the finite element analysis method to study the effects of a large blood vessel on temperature distribution, based on a photothermal coupling interaction model that couples the physical field of optical transmission with the physical field of heat transfer in biological tissue using COMSOL Multiphysics 4.4 software. The results demonstrate the cooling effect of a large blood vessel, which has potential application in the treatment of liver tumors.

  13. Scale effect challenges in urban hydrology highlighted with a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Ichiba, Abdellah; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel; Bompard, Philippe; Ten Veldhuis, Marie-Claire

    2018-01-01

    Hydrological models are extensively used in urban water management, development and evaluation of future scenarios and research activities. There is a growing interest in the development of fully distributed and grid-based models. However, some complex questions related to scale effects are not yet fully understood and still remain open issues in urban hydrology. In this paper we propose a two-step investigation framework to illustrate the extent of scale effects in urban hydrology. First, fractal tools are used to highlight the scale dependence observed within distributed data input into urban hydrological models. Then an intensive multi-scale modelling work is carried out to understand scale effects on hydrological model performance. Investigations are conducted using a fully distributed and physically based model, Multi-Hydro, developed at Ecole des Ponts ParisTech. The model is implemented at 17 spatial resolutions ranging from 100 to 5 m. Results clearly exhibit scale effect challenges in urban hydrology modelling. The applicability of fractal concepts highlights the scale dependence observed within distributed data. Patterns of geophysical data change when the size of the observation pixel changes. The multi-scale modelling investigation confirms scale effects on hydrological model performance. Results are analysed over three ranges of scales identified in the fractal analysis and confirmed through modelling. This work also discusses some remaining issues in urban hydrology modelling related to the availability of high-quality data at high resolutions, and model numerical instabilities as well as the computation time requirements. The main findings of this paper enable a replacement of traditional methods of model calibration by innovative methods of model resolution alteration based on the spatial data variability and scaling of flows in urban hydrology.
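
    The fractal analysis referred to above can be illustrated with a box-counting sketch: cover the data with boxes of side s and estimate the dimension D from N(s) ~ s^(-D). The toy example below (a densely sampled diagonal line, whose expected dimension is 1) is illustrative only, not the Multi-Hydro analysis.

```python
import math

# Illustrative box-counting sketch: count the boxes of side s needed to
# cover a point set, then estimate the fractal dimension D from N(s) ~ s**(-D).

def box_count(points, s):
    return len({(math.floor(x / s), math.floor(y / s)) for x, y in points})

# dense sample of the unit diagonal -- a curve whose dimension should be 1
pts = [(i / 10000, i / 10000) for i in range(10001)]

s1, s2 = 1 / 64, 1 / 128
n1, n2 = box_count(pts, s1), box_count(pts, s2)
D = math.log(n2 / n1) / math.log(s1 / s2)
print(n1, n2, round(D, 2))
```

    For genuinely scale-dependent geophysical data, D estimated over different ranges of s differs, which is the scale dependence the paper highlights.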

  14. Analytical investigation of the faster-is-slower effect with a simplified phenomenological model

    NASA Astrophysics Data System (ADS)

    Suzuno, K.; Tomoeda, A.; Ueyama, D.

    2013-11-01

    We analytically investigate the mechanism of the phenomenon called the "faster-is-slower" effect in pedestrian flow studies with a simplified phenomenological model. It is well known that, in simulations of the discharge of self-driven particles through a bottleneck using the social force model, the flow rate is maximized at a certain strength of the driving force. In this study, we propose a phenomenological and analytical model, based on mechanics-based modeling, to reveal the mechanism of the phenomenon. We show that our reduced system, with only a few degrees of freedom, still has properties similar to those of the original many-particle system and that the effect comes from the competition between the driving force and the nonlinear friction in the model. Moreover, we qualitatively predict the parameter dependences of the effect from our model, and they are confirmed numerically using the social force model.

  15. Multi-objective group scheduling optimization integrated with preventive maintenance

    NASA Astrophysics Data System (ADS)

    Liao, Wenzhu; Zhang, Xiufang; Jiang, Min

    2017-11-01

    This article proposes a single-machine-based integration model to meet the requirements of production scheduling and preventive maintenance in group production. To describe the production for identical/similar and different jobs, this integrated model considers the learning and forgetting effects. Based on machine degradation, the deterioration effect is also considered. Moreover, perfect maintenance and minimal repair are adopted in this integrated model. The multi-objective of minimizing total completion time and maintenance cost is taken to meet the dual requirements of delivery date and cost. Finally, a genetic algorithm is developed to solve this optimization model, and the computation results demonstrate that this integrated model is effective and reliable.

  16. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    NASA Astrophysics Data System (ADS)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by a sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; and (v) as expected, the rule-based model holds more inventory than the optimization model.
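
    The sample average approximation step can be illustrated on a deliberately simplified, newsvendor-style version of the first-stage decision (the costs, defect rate, and single-stage recourse below are invented for illustration; the paper's model is richer): the expectation over paint-defect scenarios is replaced by an average over sampled scenarios, and the spare-vehicle count minimizing the sampled cost is selected.

```python
import random

# Newsvendor-style SAA toy (all numbers invented): first stage picks the
# spare-vehicle count x; second stage pays a penalty for every defective
# vehicle not covered by a spare.  SAA replaces the expectation with an
# average over sampled defect-count scenarios.

random.seed(42)

HOLD_COST, PENALTY = 1.0, 5.0        # per spare held, per uncovered defect
N_VEHICLES, DEFECT_RATE = 20, 0.1
N_SCENARIOS = 5000

scenarios = [sum(random.random() < DEFECT_RATE for _ in range(N_VEHICLES))
             for _ in range(N_SCENARIOS)]

def avg_cost(x):
    recourse = sum(max(0, d - x) for d in scenarios) / N_SCENARIOS
    return HOLD_COST * x + PENALTY * recourse

best_x = min(range(N_VEHICLES + 1), key=avg_cost)
print(best_x, round(avg_cost(best_x), 3))
```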

  17. Theoretical study of mode evolution in active long tapered multimode fiber.

    PubMed

    Shi, Chen; Wang, Xiaolin; Zhou, Pu; Xu, Xiaojun; Lu, Qisheng

    2016-08-22

    A concise and effective model based on coupled mode theory to describe mode evolution in long tapered active fiber is presented in this manuscript. The mode coupling due to variation of the core radius and slight perturbations has been analyzed, and local gain with the transverse spatial hole burning (TSHB) effect, loss, and curvature have been taken into consideration in our model. On the basis of this model, the mode evolution behaviors under different factors have been numerically investigated. Our model and results can provide instructive suggestions for designing long tapered fiber-based lasers and amplifiers.

  18. Genetic programming-based mathematical modeling of influence of weather parameters in BOD5 removal by Lemna minor.

    PubMed

    Chandrasekaran, Sivapragasam; Sankararajan, Vanitha; Neelakandhan, Nampoothiri; Ram Kumar, Mahalakshmi

    2017-11-04

    This study, through extensive experiments and mathematical modeling, reveals that, in addition to retention time and wastewater temperature (Tw), atmospheric parameters also play an important role in the effective functioning of an aquatic macrophyte-based treatment system. The duckweed species Lemna minor is considered in this study. It is observed that the combined effect of atmospheric temperature (Tatm), wind speed (Uw), and relative humidity (RH) can be reflected through one parameter, namely the "apparent temperature" (Ta). A total of eight different models are considered based on combinations of input parameters, and the best mathematical model is arrived at, which is validated through a new experimental set-up outside the modeling period. The validation results are highly encouraging. Genetic programming (GP)-based models are found to reveal deeper understanding of the wetland process.
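
    As one concrete way of combining Tatm, RH, and Uw into a single index, Steadman's non-radiation apparent temperature (the formulation used by the Australian Bureau of Meteorology) is sketched below; the paper's exact definition of Ta may differ.

```python
import math

# Steadman's non-radiation apparent temperature (Australian Bureau of
# Meteorology formulation); the paper's Ta may be defined differently.
#   e  = water vapour pressure (hPa), from temperature and relative humidity
#   AT = Tatm + 0.33*e - 0.70*Uw - 4.00

def apparent_temperature(t_atm, rh, wind):
    """t_atm in deg C, rh in percent, wind speed in m/s."""
    e = (rh / 100.0) * 6.105 * math.exp(17.27 * t_atm / (237.7 + t_atm))
    return t_atm + 0.33 * e - 0.70 * wind - 4.00

print(round(apparent_temperature(30.0, 60.0, 2.0), 1))
```

    Higher humidity raises the apparent temperature through the vapour-pressure term, while stronger wind lowers it, which is how three atmospheric inputs collapse into one predictor.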

  19. New Geophysical Technique for Mineral Exploration and Mineral Discrimination Based on Electromagnetic Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael S. Zhdanov

    2005-03-09

    The research during the first year of the project was focused on developing the foundations of a new geophysical technique for mineral exploration and mineral discrimination, based on electromagnetic (EM) methods. The proposed new technique is based on examining the spectral induced polarization (IP) effects in electromagnetic data using modern distributed acquisition systems and advanced methods of 3-D inversion. The analysis of IP phenomena is usually based on models with a frequency-dependent complex conductivity distribution. One of the most popular is the Cole-Cole relaxation model. In this progress report we have constructed and analyzed a different physical and mathematical model of the IP effect based on effective-medium theory. We have developed a rigorous mathematical model of multi-phase conductive media, which can provide a quantitative tool for evaluating the type of mineralization using the conductivity relaxation model parameters. The parameters of the new conductivity relaxation model can be used for discrimination of the different types of rock formations, which is an important goal in mineral exploration. The solution of this problem requires development of an effective numerical method for EM forward modeling in 3-D inhomogeneous media. During the first year of the project we have developed a prototype 3-D IP modeling algorithm using the integral equation (IE) method. Our IE forward modeling code INTEM3DIP is based on the contraction IE method, which improves the convergence rate of the iterative solvers. This code can handle various types of sources and receivers to compute the effect of a complex resistivity model. We have tested the working version of the INTEM3DIP code for computer simulation of the IP data for several models, including a southwest US porphyry model and a Kambalda-style nickel sulfide deposit.
The numerical modeling study clearly demonstrates how the various complex resistivity models manifest differently in the observed EM data. These modeling studies lay a background for future development of the IP inversion method, directed at determining the electrical conductivity and the intrinsic chargeability distributions, as well as the other parameters of the relaxation model, simultaneously. The new technology envisioned in this proposal will be used for the discrimination of different rocks, and in this way will provide the ability to distinguish between uneconomic mineral deposits and zones of economic mineralization and geothermal resources.

  20. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  1. Cost-Effectiveness Evaluation of Quadrivalent Human Papilloma Virus Vaccine for HPV-Related Disease in Iran

    PubMed Central

    Khatibi, Mohsen; Rasekh, Hamid Reza; Shahverdi, Zohreh; Jamshidi, Hamid Reza

    2014-01-01

    Human Papilloma Virus (HPV) vaccine has recently been added to the Iran Drug List, so decision makers need information beyond that available from RCTs to recommend funding for this vaccination program and its addition to the National Immunization Program in Iran. Modeling and economic studies have addressed some of those information needs in other countries. In order to determine the long-term benefit of this vaccine and the impact of a vaccination program on the future rate of cervical cancer in Iran, we describe a model, based on the available economic and health effects of human papilloma virus (HPV), to estimate the cost-effectiveness of HPV vaccination of 15-year-old girls in Iran. Our objective is to estimate the cost-effectiveness of HPV vaccination in Iran against cervical cancer based on available data; incremental cost-effectiveness ratio (ICER) calculations were based on a model comparing a cohort of 15-year-old girls with and without vaccination. We developed a static model based on available data in Iran on the epidemiology of HPV-related health outcomes. The model compared the cohort of all 15-year-old girls alive in the year 2013 with and without vaccination. The cost per QALY of vaccinating 15-year-old girls, compared with the current situation, was estimated under our assumptions at 439,000,000 Iranian Rials (IRR). When the key parameters were varied in our sensitivity analysis, this value ranged from 251,000,000 IRR to 842,000,000 IRR. In conclusion, the quadrivalent HPV vaccine (Gardasil) is not cost-effective in Iran under the base-case parameter values. PMID:24711850

  2. Moderating factors of video-modeling with other as model: a meta-analysis of single-case studies.

    PubMed

    Mason, Rose A; Ganz, Jennifer B; Parker, Richard I; Burke, Mack D; Camargo, Siglia P

    2012-01-01

    Video modeling with other as model (VMO) is a more practical method for implementing video-based modeling techniques than, for example, video self-modeling, which requires significantly more editing. Despite this, identification of contextual factors, such as participant characteristics and targeted outcomes, that moderate the effectiveness of VMO has not previously been explored. The purpose of this study was to meta-analytically evaluate the evidence base of VMO with individuals with disabilities to determine whether participant characteristics and targeted outcomes moderate the effectiveness of the intervention. Findings indicate that VMO is highly effective for participants with autism spectrum disorder (IRD=.83) and moderately effective for participants with developmental disabilities (IRD=.68). However, differential effects are indicated across levels of moderators for diagnoses and targeted outcomes. Implications for practice and future research are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Courses of action for effects based operations using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Haider, Sajjad; Levis, Alexander H.

    2006-05-01

    This paper presents an Evolutionary Algorithms (EAs) based approach to identify effective courses of action (COAs) in Effects Based Operations. The approach uses Timed Influence Nets (TINs) as the underlying mathematical model to capture a dynamic uncertain situation. TINs provide a concise graph-theoretic probabilistic approach to specify the cause and effect relationships that exist among the variables of interest (actions, desired effects, and other uncertain events) in a problem domain. The purpose of building these TIN models is to identify and analyze several alternative courses of action. The current practice is to use trial-and-error based techniques, which are not only labor intensive but also produce sub-optimal results and are not capable of modeling constraints among actionable events. The EA-based approach presented in this paper aims to overcome these limitations. The approach generates multiple COAs that are comparably effective in achieving the desired effect. The purpose of generating multiple COAs is to give several alternatives to a decision maker. Moreover, the alternate COAs can be generalized based on the relationships that exist among the actions and their execution timings. The approach also allows a system analyst to capture certain types of constraints among actionable events.

  4. A novel convolution-based approach to address ionization chamber volume averaging effect in model-based treatment planning systems

    NASA Astrophysics Data System (ADS)

    Barraclough, Brendan; Li, Jonathan G.; Lebron, Sharon; Fan, Qiyong; Liu, Chihray; Yan, Guanghua

    2015-08-01

    The ionization chamber volume averaging effect is a well-known issue without an elegant solution. The purpose of this study is to propose a novel convolution-based approach to address the volume averaging effect in model-based treatment planning systems (TPSs). Ionization chamber-measured beam profiles can be regarded as the convolution between the detector response function and the implicit real profiles. Existing approaches address the issue by trying to remove the volume averaging effect from the measurement. In contrast, our proposed method imports the measured profiles directly into the TPS and addresses the problem by reoptimizing pertinent parameters of the TPS beam model. In the iterative beam modeling process, the TPS-calculated beam profiles are convolved with the same detector response function. Beam model parameters responsible for the penumbra are optimized to drive the convolved profiles to match the measured profiles. Since the convolved and the measured profiles are subject to identical volume averaging effect, the calculated profiles match the real profiles when the optimization converges. The method was applied to reoptimize a CC13 beam model commissioned with profiles measured with a standard ionization chamber (Scanditronix Wellhofer, Bartlett, TN). The reoptimized beam model was validated by comparing the TPS-calculated profiles with diode-measured profiles. Its performance in intensity-modulated radiation therapy (IMRT) quality assurance (QA) for ten head-and-neck patients was compared with the CC13 beam model and a clinical beam model (manually optimized, clinically proven) using standard Gamma comparisons. The beam profiles calculated with the reoptimized beam model showed excellent agreement with diode measurement at all measured geometries. Performance of the reoptimized beam model was comparable with that of the clinical beam model in IMRT QA. 
The average passing rates using the reoptimized beam model increased substantially from 92.1% to 99.3% with 3%/3 mm and from 79.2% to 95.2% with 2%/2 mm when compared with the CC13 beam model. These results show the effectiveness of the proposed method. Less inter-user variability can be expected of the final beam model. It is also found that the method can be easily integrated into model-based TPS.
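    The central idea above, convolving the calculated profile with the detector response before comparing it with measurement, can be sketched as follows. The uniform top-hat response and the 6 mm default width are illustrative assumptions, not CC13 specifications.

```python
def volume_average(profile, dx_mm, chamber_diam_mm=6.0):
    """Convolve a calculated beam profile with a uniform (top-hat)
    detector response to mimic ionization chamber volume averaging.

    profile: list of dose values on a grid with spacing dx_mm.
    The top-hat response and 6 mm width are illustrative assumptions.
    """
    n = max(1, int(round(chamber_diam_mm / dx_mm)))  # response width in samples
    half = n // 2
    out = []
    for i in range(len(profile)):
        window = profile[max(0, i - half): i - half + n]
        out.append(sum(window) / n)  # moving average = top-hat convolution
    return out
```

Applied to a step-like field edge, the plateau is preserved while the penumbra is broadened, which is exactly the effect the beam-model optimization compensates for by matching convolved calculations to measured profiles.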

  5. Improving mathematical problem solving ability through problem-based learning and authentic assessment for the students of Bali State Polytechnic

    NASA Astrophysics Data System (ADS)

    Darma, I. K.

    2018-01-01

    This research is aimed at determining: 1) the differences in mathematical problem solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) the differences in mathematical problem solving ability between students facilitated with the authentic and the conventional assessment model, and 3) the interaction effect between the learning and assessment models on mathematical problem solving. The research was conducted in Bali State Polytechnic, using a 2x2 factorial experimental design. The sample of this research comprised 110 students. The data were collected using a theoretically and empirically validated test. Instruments were validated using Aiken’s approach to content validity and item analysis, and the data were then analyzed using analysis of variance (ANOVA). The result of the analysis shows that the students facilitated with the problem-based learning and authentic assessment models obtain the highest average scores compared to the other students, both in concept understanding and mathematical problem solving. The result of the hypothesis test shows that, significantly: 1) there is a difference in mathematical problem solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) there is a difference in mathematical problem solving ability between students facilitated with the authentic assessment model and the conventional assessment model, and 3) there is an interaction effect between the learning model and the assessment model on mathematical problem solving. In order to improve the effectiveness of mathematics learning, the combination of the problem-based learning model and the authentic assessment model can be considered as one of the learning models in class.

  6. A neural network model of causative actions.

    PubMed

    Lee-Hand, Jeremy; Knott, Alistair

    2015-01-01

    A common idea in models of action representation is that actions are represented in terms of their perceptual effects (see e.g., Prinz, 1997; Hommel et al., 2001; Sahin et al., 2007; Umiltà et al., 2008; Hommel, 2013). In this paper we extend existing models of effect-based action representations to account for a novel distinction. Some actions bring about effects that are independent events in their own right: for instance, if John smashes a cup, he brings about the event of the cup smashing. Other actions do not bring about such effects. For instance, if John grabs a cup, this action does not cause the cup to "do" anything: a grab action has well-defined perceptual effects, but these are not registered by the perceptual system that detects independent events involving external objects in the world. In our model, effect-based actions are implemented in several distinct neural circuits, which are organized into a hierarchy based on the complexity of their associated perceptual effects. The circuit at the top of this hierarchy is responsible for actions that bring about independently perceivable events. This circuit receives input from the perceptual module that recognizes arbitrary events taking place in the world, and learns movements that reliably cause such events. We assess our model against existing experimental observations about effect-based motor representations, and make some novel experimental predictions. We also consider the possibility that the "causative actions" circuit in our model can be identified with a motor pathway reported in other work, specializing in "functional" actions on manipulable tools (Bub et al., 2008; Binkofski and Buxbaum, 2013).

  7. The Effect of Modeling Based Science Education on Critical Thinking

    ERIC Educational Resources Information Center

    Bati, Kaan; Kaptan, Fitnat

    2015-01-01

    In this study, the degree to which modeling-based science education can influence the development of students' critical thinking skills was investigated. The research was based on a pre-test/post-test quasi-experimental design with a control group. The Modeling Based Science Education Program, which was prepared with the purpose of exploring…

  8. Effects of Inquiry-Based Science Instruction on Science Achievement and Interest in Science: Evidence from Qatar

    ERIC Educational Resources Information Center

    Areepattamannil, Shaljan

    2012-01-01

    The author sought to investigate the effects of inquiry-based science instruction on the science achievement and interest in science of 5,120 adolescents from 85 schools in Qatar. Results of hierarchical linear modeling analyses revealed substantial positive effects of science teaching and learning with a focus on models or applications and…

  9. Of goals and habits: age-related and individual differences in goal-directed decision-making.

    PubMed

    Eppinger, Ben; Walter, Maik; Heekeren, Hauke R; Li, Shu-Chen

    2013-01-01

    In this study we investigated age-related and individual differences in habitual (model-free) and goal-directed (model-based) decision-making. Specifically, we were interested in three questions. First, does age affect the balance between model-based and model-free decision mechanisms? Second, are these age-related changes due to age differences in working memory (WM) capacity? Third, can model-based behavior be affected by manipulating the distinctiveness of the reward value of choice options? To answer these questions we used a two-stage Markov decision task in combination with computational modeling to dissociate model-based and model-free decision mechanisms. To affect model-based behavior in this task we manipulated the distinctiveness of reward probabilities of choice options. The results show age-related deficits in model-based decision-making, which are particularly pronounced if unexpected reward indicates the need for a shift in decision strategy. In this situation younger adults explore the task structure, whereas older adults show perseverative behavior. Consistent with previous findings, these results indicate that older adults have deficits in the representation and updating of expected reward value. We also observed substantial individual differences in model-based behavior. In younger adults high WM capacity is associated with greater model-based behavior, and this effect is further elevated when reward probabilities are more distinct. However, in older adults we found no effect of WM capacity. Moreover, age differences in model-based behavior remained statistically significant even after controlling for WM capacity. Thus, factors other than decline in WM, such as deficits in the integration of expected reward value into strategic decisions, may contribute to the observed impairments in model-based behavior in older adults.
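    The balance between the two mechanisms in two-stage tasks is commonly formalized as a weighted mixture of model-free and model-based action values. A minimal sketch follows; the function and parameter names are illustrative, not the authors' fitted model.

```python
def td_update(q, reward, alpha=0.1):
    """One model-free temporal-difference update toward the received reward."""
    return q + alpha * (reward - q)

def model_based_value(p_transition, q_second_stage):
    """Value of a first-stage action by planning over (learned) transition
    probabilities to second-stage states."""
    return sum(p * q for p, q in zip(p_transition, q_second_stage))

def hybrid_value(q_mf, q_mb, w):
    """Weighted mixture of the two estimates: w = 1 is purely model-based
    (goal-directed), w = 0 purely model-free (habitual)."""
    return w * q_mb + (1.0 - w) * q_mf
```

In this framing, the reported age-related deficit corresponds to fitting a lower mixture weight w for older adults.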

  10. Of goals and habits: age-related and individual differences in goal-directed decision-making

    PubMed Central

    Eppinger, Ben; Walter, Maik; Heekeren, Hauke R.; Li, Shu-Chen

    2013-01-01

    In this study we investigated age-related and individual differences in habitual (model-free) and goal-directed (model-based) decision-making. Specifically, we were interested in three questions. First, does age affect the balance between model-based and model-free decision mechanisms? Second, are these age-related changes due to age differences in working memory (WM) capacity? Third, can model-based behavior be affected by manipulating the distinctiveness of the reward value of choice options? To answer these questions we used a two-stage Markov decision task in combination with computational modeling to dissociate model-based and model-free decision mechanisms. To affect model-based behavior in this task we manipulated the distinctiveness of reward probabilities of choice options. The results show age-related deficits in model-based decision-making, which are particularly pronounced if unexpected reward indicates the need for a shift in decision strategy. In this situation younger adults explore the task structure, whereas older adults show perseverative behavior. Consistent with previous findings, these results indicate that older adults have deficits in the representation and updating of expected reward value. We also observed substantial individual differences in model-based behavior. In younger adults high WM capacity is associated with greater model-based behavior, and this effect is further elevated when reward probabilities are more distinct. However, in older adults we found no effect of WM capacity. Moreover, age differences in model-based behavior remained statistically significant even after controlling for WM capacity. Thus, factors other than decline in WM, such as deficits in the integration of expected reward value into strategic decisions, may contribute to the observed impairments in model-based behavior in older adults. PMID:24399925

  11. Effects of Model-Based Teaching on Pre-Service Physics Teachers' Conceptions of the Moon, Moon Phases, and Other Lunar Phenomena

    ERIC Educational Resources Information Center

    Ogan-Bekiroglu, Feral

    2007-01-01

    The purpose of this study was twofold. First, it was aimed to identify Turkish pre-service physics teachers' knowledge and understanding of the Moon, Moon phases, and other lunar phenomena. Second, the effects of model-based teaching on pre-service teachers' conceptions were examined. Conceptions were proposed as mental models in this study. Four…

  12. Lapse of time effects on tax evasion in an agent-based econophysics model

    NASA Astrophysics Data System (ADS)

    Seibold, Götz; Pickhardt, Michael

    2013-05-01

    We investigate an inhomogeneous Ising model in the context of tax evasion dynamics where different types of agents are parameterized via local temperatures and magnetic fields. In particular, we analyze the impact of lapse of time effects (i.e. backauditing) and endogenously determined penalty rates on tax compliance. Both features contribute to a microfoundation of agent-based econophysics models of tax evasion.
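    An inhomogeneous Ising dynamics of the kind described, where each agent carries its own temperature (decision noise) and field (audit/penalty bias), can be sketched with a heat-bath sweep. The one-dimensional ring and unit coupling are illustrative simplifications, not the paper's setup.

```python
import math
import random

def glauber_sweep(spins, temps, fields, rng):
    """One heat-bath (Glauber) sweep of an inhomogeneous Ising model on a
    ring. Spin +1 = compliant taxpayer, -1 = evader; each agent i has its
    own temperature temps[i] and field fields[i]. Coupling J = 1.
    """
    n = len(spins)
    for i in range(n):
        neighbors = spins[(i - 1) % n] + spins[(i + 1) % n]
        local = neighbors + fields[i]  # effective local field on agent i
        # Heat-bath probability that the spin ends up +1 (compliant)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * local / temps[i]))
        spins[i] = 1 if rng.random() < p_up else -1
    return spins
```

Backauditing and endogenous penalties, as studied in the paper, would enter through time- and history-dependent fields rather than the constant fields used here.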

  13. SSIC model: A multi-layer model for intervention of online rumors spreading

    NASA Astrophysics Data System (ADS)

    Tian, Ru-Ya; Zhang, Xue-Fu; Liu, Yi-Jun

    2015-06-01

    The SIR model is a classical model for simulating rumor spreading, while the supernetwork is an effective tool for modeling complex systems. Based on the Opinion SuperNetwork involving a Social Sub-network, an Environmental Sub-network, a Psychological Sub-network, and a Viewpoint Sub-network, and drawing on the modeling idea of the SIR model, this paper designs the super SIC model (SSIC model) and its evolution rules, and analyzes the intervention effects on public opinion of four elements of the supernetwork: the opinion agent, the opinion environment, the agent's psychology, and the viewpoint. Studies show that the SSIC model based on the supernetwork has effective intervention effects on rumor spreading. It is worth noting that (i) identifying rumor spreaders in the Social Sub-network and isolating them can achieve the desired intervention results, (ii) improving environmental information transparency, so that the public knows as much information as possible, is a feasible way to reduce rumors, and (iii) persuading wavering neutrals has better intervention effects than clarifying rumors already spread everywhere, so rumors should be addressed promptly through psychological counseling.
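    The SIR dynamics that the SSIC model builds on can be sketched in discrete-time mean-field form, where an intervention such as isolating spreaders maps onto a higher removal rate. Parameter values are illustrative, not taken from the paper.

```python
def sir_rumor(s, i, r, beta, gamma, steps):
    """Discrete-time mean-field SIR rumor dynamics (s + i + r = 1).

    s: fraction ignorant, i: fraction spreading, r: fraction stifling.
    beta: spreading rate; gamma: removal rate. Isolating spreaders, as in
    intervention (i) above, corresponds to raising gamma.
    """
    history = [(s, i, r)]
    for _ in range(steps):
        new_inf = beta * s * i   # ignorants informed by contact with spreaders
        new_rem = gamma * i      # spreaders becoming stiflers
        s, i, r = s - new_inf, i + new_inf - new_rem, r + new_rem
        history.append((s, i, r))
    return history
```

Raising gamma leaves a larger final ignorant fraction, i.e., the rumor reaches fewer agents, which is the mean-field analogue of isolating spreaders in the Social Sub-network.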

  14. The effectiveness of snow cube throwing learning model based on exploration

    NASA Astrophysics Data System (ADS)

    Sari, Nenden Mutiara

    2017-08-01

    This study aimed to determine the effectiveness of the Snow Cube Throwing (SCT) and cooperative models in exploration-based math learning in terms of the time required to complete the teaching materials and student engagement. This quasi-experimental study was conducted at SMPN 5 Cimahi, Indonesia. All students in grade VIII of SMPN 5 Cimahi, consisting of 382 students, were used as the population. The sample consisted of two classes chosen by purposive sampling: the first experimental class consisted of 38 students and the second experimental class consisted of 38 students. An observation sheet was used to record the time required to complete the teaching materials and the number of students involved in each meeting. The data obtained were analyzed with an independent-sample t-test and presented in charts. The results of this study: the SCT learning model based on exploration is more effective than the cooperative learning model based on exploration in terms of the time required to complete exploration-based teaching materials and student engagement.

  15. Analysis of axial-induction-based wind plant control using an engineering and a high-order wind plant model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Annoni, Jennifer; Gebraad, Pieter M. O.; Scholbrock, Andrew K.

    2015-08-14

    Wind turbines are typically operated to maximize their performance without considering the impact of wake effects on nearby turbines. Wind plant control concepts aim to increase overall wind plant performance by coordinating the operation of the turbines. This paper focuses on axial-induction-based wind plant control techniques, in which the generator torque or blade pitch degrees of freedom of the wind turbines are adjusted. The paper addresses discrepancies between a high-order wind plant model and an engineering wind plant model. Changes in the engineering model are proposed to better capture the effects of axial-induction-based control shown in the high-order model.

  16. Force Modelling in Orthogonal Cutting Considering Flank Wear Effect

    NASA Astrophysics Data System (ADS)

    Rathod, Kanti Bhikhubhai; Lalwani, Devdas I.

    2017-05-01

    In the present work, an attempt has been made to provide a predictive cutting force model for orthogonal cutting by combining two different force models: a force model for a perfectly sharp tool augmented with the effect of edge radius, and a force model for a worn tool. The first force model is based on Oxley's predictive machining theory for orthogonal cutting; since Oxley's model assumes a perfectly sharp tool, the effect of the cutting edge radius (hone radius) is added and an improved model is presented. The second force model, for a worn tool (flank wear), is based on the model proposed by Waldorf. Further, the developed combined force model is also used to predict flank wear width using an inverse approach. The performance of the developed combined total force model is compared with previously published results for AISI 1045 and AISI 4142 materials, and reasonably good agreement is found.

  17. Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data.

    PubMed

    Johnson, Brent A

    2009-10-01

    We consider estimation and variable selection in the partial linear model for censored data. The partial linear model for censored data is a direct extension of the accelerated failure time model, the latter of which is a very important alternative to the proportional hazards model. We extend rank-based lasso-type estimators to a model that may contain nonlinear effects. Variable selection in such a partial linear model has direct application to high-dimensional survival analyses that attempt to adjust for clinical predictors. In the microarray setting, previous methods can adjust for other clinical predictors by assuming that clinical and gene expression data enter the model linearly in the same fashion. Here, we select important variables after adjusting for prognostic clinical variables, but the clinical effects are assumed nonlinear. Our estimator is based on stratification and can be extended naturally to account for multiple nonlinear effects. We illustrate the utility of our method through simulation studies and application to the Wisconsin prognostic breast cancer data set.

  18. Cost-effectiveness of human papillomavirus vaccination in the United States.

    PubMed

    Chesson, Harrell W; Ekwueme, Donatus U; Saraiya, Mona; Markowitz, Lauri E

    2008-02-01

    We describe a simplified model, based on the current economic and health effects of human papillomavirus (HPV), to estimate the cost-effectiveness of HPV vaccination of 12-year-old girls in the United States. Under base-case parameter values, the estimated cost per quality-adjusted life year gained by vaccination in the context of current cervical cancer screening practices in the United States ranged from $3,906 to $14,723 (2005 US dollars), depending on factors such as whether herd immunity effects were assumed; the types of HPV targeted by the vaccine; and whether the benefits of preventing anal, vaginal, vulvar, and oropharyngeal cancers were included. The results of our simplified model were consistent with published studies based on more complex models when key assumptions were similar. This consistency is reassuring because models of varying complexity will be essential tools for policy makers in the development of optimal HPV vaccination strategies.
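    The headline figures above are incremental cost-effectiveness ratios, the extra cost per quality-adjusted life year gained by vaccination over screening alone. A minimal sketch of that calculation, with hypothetical inputs:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: additional cost per QALY
    gained by the new strategy (e.g., vaccination plus screening) over
    the comparator (e.g., screening alone)."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_qaly <= 0:
        raise ValueError("new strategy gains no QALYs; ICER is undefined")
    return d_cost / d_qaly
```

Policy makers then compare the resulting ratio against a willingness-to-pay threshold, which is how ranges such as $3,906 to $14,723 per QALY are interpreted.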

  19. Testing homogeneity in Weibull-regression models.

    PubMed

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and, in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.

  20. A 4DCT imaging-based breathing lung model with relative hysteresis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.

    To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce a smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models based on multiple images and a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points due to the differences in airflow distribution and airway geometry. - Highlights: • We developed a breathing human lung CFD model based on 4D-dynamic CT images. • The 4DCT-based breathing lung model is able to capture lung relative hysteresis. • A new boundary condition for the lung model based on one static CT image was proposed. • The difference between lung models based on 4D and static CT images was quantified.

  1. Predictive Performance of Physiologically Based Pharmacokinetic Models for the Effect of Food on Oral Drug Absorption: Current Status

    PubMed Central

    Zhao, Ping; Pan, Yuzhuo; Wagner, Christian

    2017-01-01

    A comprehensive search in literature and published US Food and Drug Administration reviews was conducted to assess whether physiologically based pharmacokinetic (PBPK) modeling could be prospectively used to predict clinical food effect on oral drug absorption. Among the 48 resulted food effect predictions, ∼50% were predicted within 1.25‐fold of observed, and 75% within 2‐fold. Dissolution rate and precipitation time were commonly optimized parameters when PBPK modeling was not able to capture the food effect. The current work presents a knowledgebase for documenting PBPK experience to predict food effect. PMID:29168611
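    The 1.25-fold and 2-fold acceptance windows quoted above correspond to a simple fold-error summary over predicted versus observed food-effect ratios, which can be sketched as follows (an illustrative helper, not the authors' analysis code):

```python
def fold_error_summary(predicted, observed):
    """Return the fractions of predictions within 1.25-fold and 2-fold of
    the observed values. Fold error is max(p/o, o/p), so 1.0 is a perfect
    prediction; inputs are paired positive ratios."""
    fe = [max(p / o, o / p) for p, o in zip(predicted, observed)]
    within = lambda bound: sum(f <= bound for f in fe) / len(fe)
    return within(1.25), within(2.0)
```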

  2. Formalizing the role of agent-based modeling in causal inference and epidemiology.

    PubMed

    Marshall, Brandon D L; Galea, Sandro

    2015-01-15

    Calls for the adoption of complex systems approaches, including agent-based modeling, in the field of epidemiology have largely centered on the potential for such methods to examine complex disease etiologies, which are characterized by feedback behavior, interference, threshold dynamics, and multiple interacting causal effects. However, considerable theoretical and practical issues impede the capacity of agent-based methods to examine and evaluate causal effects and thus illuminate new areas for intervention. We build on this work by describing how agent-based models can be used to simulate counterfactual outcomes in the presence of complexity. We show that these models are of particular utility when the hypothesized causal mechanisms exhibit a high degree of interdependence between multiple causal effects and when interference (i.e., one person's exposure affects the outcome of others) is present and of intrinsic scientific interest. Although not without challenges, agent-based modeling (and complex systems methods broadly) represent a promising novel approach to identify and evaluate complex causal effects, and they are thus well suited to complement other modern epidemiologic methods of etiologic inquiry. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Linear Elastic and Cohesive Fracture Analysis to Model Hydraulic Fracture in Brittle and Ductile Rocks

    NASA Astrophysics Data System (ADS)

    Yao, Yao

    2012-05-01

    Hydraulic fracturing technology is being widely used within the oil and gas industry for both waste injection and unconventional gas production wells. It is essential to predict the behavior of hydraulic fractures accurately based on an understanding of the fundamental mechanism(s). The prevailing approach for hydraulic fracture modeling continues to rely on computational methods based on Linear Elastic Fracture Mechanics (LEFM). Generally, these methods give reasonable predictions for hard rock hydraulic fracture processes, but they have inherent limitations, especially when fluid injection is performed in soft rock/sand or other non-conventional formations, where they typically give very conservative predictions of fracture geometry and inaccurate estimates of the required fracture pressure. One reason the LEFM-based methods fail for these materials is that the fracture process zone ahead of the crack tip and the material softening effect cannot be neglected in ductile rock fracture analysis. A 3D pore pressure cohesive zone model has been developed and applied to predict hydraulic fracturing under fluid injection. The cohesive zone method is a numerical tool developed to model crack initiation and growth in quasi-brittle materials while accounting for material softening. The pore pressure cohesive zone model has been applied to investigate hydraulic fractures for different rock properties. For a three-layer water injection case, hydraulic fracture predictions from the pore pressure cohesive zone model with revised parameters were compared with those of an LEFM-based pseudo-3D model, a Perkins-Kern-Nordgren (PKN) model, and an analytical solution. Based on the size of the fracture process zone and its effect on crack extension in ductile rock, the fundamental mechanical difference between LEFM and cohesive fracture mechanics-based methods is discussed. An effective fracture toughness method has been proposed to account for the fracture process zone effect on ductile rock fracture.
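
    The cohesive zone approach replaces the singular LEFM crack tip with a traction-separation law that softens as the crack faces open. A minimal sketch of a generic bilinear law (illustrative parameters, not the paper's pore-pressure formulation):

```python
def bilinear_traction(delta, delta0=0.01, delta_f=0.1, t_max=5.0):
    """Bilinear cohesive traction-separation law: traction rises linearly
    to the cohesive strength t_max at separation delta0, then softens
    linearly to zero at delta_f (complete decohesion). Units and numbers
    are illustrative; the area under the curve is the fracture energy."""
    if delta <= 0.0 or delta >= delta_f:
        return 0.0
    if delta < delta0:
        return t_max * delta / delta0                      # elastic branch
    return t_max * (delta_f - delta) / (delta_f - delta0)  # softening branch

# Fracture energy = area of the triangle under the law: 0.5 * t_max * delta_f
fracture_energy = 0.5 * 5.0 * 0.1
```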

  4. Development and application of Model of Resource Utilization, Costs, and Outcomes for Stroke (MORUCOS): an Australian economic model for stroke.

    PubMed

    Mihalopoulos, Catherine; Cadilhac, Dominique A; Moodie, Marjory L; Dewey, Helen M; Thrift, Amanda G; Donnan, Geoffrey A; Carter, Robert C

    2005-01-01

    To outline the development, structure, data assumptions, and application of an Australian economic model for stroke (Model of Resource Utilization, Costs, and Outcomes for Stroke [MORUCOS]). The model has a linked spreadsheet format with four modules to describe the disease burden and treatment pathways, estimate prevalence-based and incidence-based costs, and derive life expectancy and quality of life consequences. The model uses patient-level, community-based, stroke cohort data and macro-level simulations. An interventions module allows options for change to be consistently evaluated by modifying aspects of the other modules. To date, model validation has included sensitivity testing, face validity, and peer review. Further validation of technical and predictive accuracy is needed. The generic pathway model was assessed by comparison with a stroke subtypes (ischemic, hemorrhagic, or undetermined) approach and used to determine the relative cost-effectiveness of four interventions. The generic pathway model produced lower costs compared with a subtypes version (total average first-year costs/case AUD$ 15,117 versus AUD$ 17,786, respectively). Optimal evidence-based uptake of anticoagulation therapy for primary and secondary stroke prevention and intravenous thrombolytic therapy within 3 hours of stroke were more cost-effective than current practice (base year, 1997). MORUCOS is transparent and flexible in describing Australian stroke care and can effectively be used to systematically evaluate a range of different interventions. Adjusting results to account for stroke subtypes, as they influence cost estimates, could enhance the generic model.

  5. Teacher Evaluation Models: Compliance or Growth Oriented?

    ERIC Educational Resources Information Center

    Clenchy, Kelly R.

    2017-01-01

    This research study reviewed literature specific to the evolution of teacher evaluation models and explored the potential of standards-based evaluation models to facilitate professional growth. The researcher employed descriptive phenomenology to conduct a study of teachers' perceptions of a standards-based evaluation model's…

  6. A new hysteresis model based on force-displacement characteristics of magnetorheological fluid actuators subjected to squeeze mode operation

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Bai, Xian-Xu; Qian, Li-Jun; Choi, Seung-Bok

    2017-06-01

    This paper presents a new hysteresis model based on the force-displacement characteristics of magnetorheological (MR) fluid actuators (or devices) operating in squeeze mode. The idea of the proposed model originates from experimental observation of the field-dependent hysteretic behavior of MR fluids: viewed from the rate-independence of hysteresis, a gap-width-dependent hysteresis occurs in the force-displacement relationship rather than in the typical force-velocity relationship. To portray the hysteresis behavior effectively and accurately, gap-width-dependent hysteresis elements, the nonlinear viscous effect and the inertial effect are considered in the formulation of the model. A model-based feedforward force tracking control scheme is then established through an observer that estimates the virtual displacement. The effectiveness of the proposed hysteresis model is validated through identification and prediction of the damping force of MR fluids in squeeze mode. In addition, superior force tracking performance of the feedforward control associated with the proposed hysteresis model is demonstrated on several tracking trajectories.

  7. FOG Random Drift Signal Denoising Based on the Improved AR Model and Modified Sage-Husa Adaptive Kalman Filter.

    PubMed

    Sun, Jin; Xu, Xiaosu; Liu, Yiting; Zhang, Tao; Li, Yao

    2016-07-12

    In order to reduce the influence of fiber optic gyroscope (FOG) random drift error on inertial navigation systems, an improved auto-regressive (AR) model is put forward in this paper. First, based on real-time observations at each restart of the gyroscope, the model of FOG random drift can be established online. In the improved AR model, the measured FOG signal is employed instead of a zero-mean signal. Then, the modified Sage-Husa adaptive Kalman filter (SHAKF) is introduced, which can directly carry out real-time filtering of the FOG signals. Finally, static and dynamic experiments are conducted to verify the effectiveness, and the filtering results are analyzed with Allan variance. The analysis shows that the improved AR model has high fitting accuracy and strong adaptability, with a minimum single-noise fitting accuracy of 93.2%. Based on the improved AR(3) model, SHAKF denoising is more effective than traditional methods, improving the denoising effect by more than 30%. The random drift error of the FOG is reduced effectively, and the precision of the FOG is improved.
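
    A plain, non-adaptive version of this pipeline can be sketched as follows: fit AR coefficients by least squares, then use the AR model in companion form as the state transition of a Kalman filter. The paper's SHAKF additionally estimates the noise statistics online and identifies the AR model on the raw, non-zero-mean signal at each restart; those refinements are omitted here.

```python
import numpy as np

def fit_ar(x, order=3):
    """Least-squares AR(order) fit: x[t] ~ a1*x[t-1] + ... + a_order*x[t-order]."""
    X = np.column_stack([x[order - 1 - k:len(x) - 1 - k] for k in range(order)])
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

def kalman_denoise(z, a, q=1e-4, r=1e-2):
    """Kalman filter using the AR model in companion form as the state
    transition, with a scalar observation. Process/measurement noise
    covariances q and r are fixed here; SHAKF adapts them online."""
    n = len(a)
    F = np.vstack([a, np.eye(n)[:-1]])       # companion matrix
    H = np.zeros(n); H[0] = 1.0
    x = np.full(n, z[0]); P = np.eye(n)
    out = []
    for zk in z:
        x = F @ x                            # predict
        P = F @ P @ F.T + q * np.eye(n)
        S = H @ P @ H + r                    # innovation variance
        K = P @ H / S                        # Kalman gain
        x = x + K * (zk - H @ x)             # update
        P = (np.eye(n) - np.outer(K, H)) @ P
        out.append(x[0])
    return np.array(out)
```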

  8. Variability in the Effectiveness of a Video Modeling Intervention Package for Children with Autism

    ERIC Educational Resources Information Center

    Plavnick, Joshua B.; MacFarland, Mari C.; Ferreri, Summer J.

    2015-01-01

    Video modeling is an evidence-based instructional strategy for teaching a variety of skills to individuals with autism. Despite the effectiveness of this strategy, there is some uncertainty regarding the conditions under which video modeling is likely to be effective. The present investigation examined the differential effectiveness of video…

  9. Multi-model comparison on the effects of climate change on tree species in the eastern U.S.: results from an enhanced niche model and process-based ecosystem and landscape models

    Treesearch

    Louis R. Iverson; Frank R. Thompson; Stephen Matthews; Matthew Peters; Anantha Prasad; William D. Dijak; Jacob Fraser; Wen J. Wang; Brice Hanberry; Hong He; Maria Janowiak; Patricia Butler; Leslie Brandt; Chris Swanston

    2016-01-01

    Context. Species distribution models (SDM) establish statistical relationships between the current distribution of species and key attributes whereas process-based models simulate ecosystem and tree species dynamics based on representations of physical and biological processes. TreeAtlas, which uses DISTRIB SDM, and Linkages and LANDIS PRO, process...

  10. Working-memory capacity protects model-based learning from stress.

    PubMed

    Otto, A Ross; Raio, Candace M; Chiang, Alice; Phelps, Elizabeth A; Daw, Nathaniel D

    2013-12-24

    Accounts of decision-making have long posited the operation of separate, competing valuation systems in the control of choice behavior. Recent theoretical and experimental advances suggest that this classic distinction between habitual and goal-directed (or more generally, automatic and controlled) choice may arise from two computational strategies for reinforcement learning, called model-free and model-based learning. Popular neurocomputational accounts of reward processing emphasize the involvement of the dopaminergic system in model-free learning and prefrontal, central executive-dependent control systems in model-based choice. Here we hypothesized that the hypothalamic-pituitary-adrenal (HPA) axis stress response--believed to have detrimental effects on prefrontal cortex function--should selectively attenuate model-based contributions to behavior. To test this, we paired an acute stressor with a sequential decision-making task that affords distinguishing the relative contributions of the two learning strategies. We assessed baseline working-memory (WM) capacity and used salivary cortisol levels to measure HPA axis stress response. We found that stress response attenuates the contribution of model-based, but not model-free, contributions to behavior. Moreover, stress-induced behavioral changes were modulated by individual WM capacity, such that low-WM-capacity individuals were more susceptible to detrimental stress effects than high-WM-capacity individuals. These results enrich existing accounts of the interplay between acute stress, working memory, and prefrontal function and suggest that executive function may be protective against the deleterious effects of acute stress.
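
    The hybrid-valuation idea behind this kind of analysis can be caricatured in a few lines: blend model-based and model-free action values with a weight w, and let stress reduce w. The numbers below are invented for illustration, not the study's fitted parameters:

```python
import math

def hybrid_values(q_mb, q_mf, w):
    """Blend model-based (q_mb) and model-free (q_mf) action values;
    w is the model-based weight."""
    return [w * mb + (1 - w) * mf for mb, mf in zip(q_mb, q_mf)]

def softmax(values, beta=3.0):
    """Softmax choice rule with inverse temperature beta."""
    exps = [math.exp(beta * v) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

q_mb = [1.0, 0.0]   # the model-based system prefers action 0
q_mf = [0.0, 1.0]   # the habitual, model-free system prefers action 1

p_calm = softmax(hybrid_values(q_mb, q_mf, w=0.8))      # high MB weight
p_stressed = softmax(hybrid_values(q_mb, q_mf, w=0.3))  # stress lowers w
```

    Under this caricature, stress shifts choice probabilities toward the model-free system's preferred action, which is the qualitative pattern the study reports.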

  11. Working-memory capacity protects model-based learning from stress

    PubMed Central

    Otto, A. Ross; Raio, Candace M.; Chiang, Alice; Phelps, Elizabeth A.; Daw, Nathaniel D.

    2013-01-01

    Accounts of decision-making have long posited the operation of separate, competing valuation systems in the control of choice behavior. Recent theoretical and experimental advances suggest that this classic distinction between habitual and goal-directed (or more generally, automatic and controlled) choice may arise from two computational strategies for reinforcement learning, called model-free and model-based learning. Popular neurocomputational accounts of reward processing emphasize the involvement of the dopaminergic system in model-free learning and prefrontal, central executive–dependent control systems in model-based choice. Here we hypothesized that the hypothalamic-pituitary-adrenal (HPA) axis stress response—believed to have detrimental effects on prefrontal cortex function—should selectively attenuate model-based contributions to behavior. To test this, we paired an acute stressor with a sequential decision-making task that affords distinguishing the relative contributions of the two learning strategies. We assessed baseline working-memory (WM) capacity and used salivary cortisol levels to measure HPA axis stress response. We found that stress response attenuates the contribution of model-based, but not model-free, contributions to behavior. Moreover, stress-induced behavioral changes were modulated by individual WM capacity, such that low-WM-capacity individuals were more susceptible to detrimental stress effects than high-WM-capacity individuals. These results enrich existing accounts of the interplay between acute stress, working memory, and prefrontal function and suggest that executive function may be protective against the deleterious effects of acute stress. PMID:24324166

  12. Steel Alloy Hot Roll Simulations and Through-Thickness Variation Using Dislocation Density-Based Modeling

    NASA Astrophysics Data System (ADS)

    Jansen Van Rensburg, G. J.; Kok, S.; Wilke, D. N.

    2017-10-01

    Different roll pass reduction schedules have different effects on the through-thickness properties of hot-rolled metal slabs. In order to assess or improve a reduction schedule using the finite element method, a material model is required that captures the relevant deformation mechanisms and physics. The model should also report relevant field quantities to assess variations in material state through the thickness of a simulated rolled metal slab. In this paper, a dislocation density-based material model with recrystallization is presented and calibrated on the material response of a high-strength low-alloy steel. The model has the ability to replicate and predict material response to a fair degree thanks to the physically motivated mechanisms it is built on. An example study is also presented to illustrate the possible effect different reduction schedules could have on the through-thickness material state and the ability to assess these effects based on finite element simulations.
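
    A standard ingredient of dislocation density-based models is a Kocks-Mecking-type evolution law, in which the density saturates as storage and recovery balance. A generic sketch with illustrative constants (the paper's model, which includes recrystallization, is considerably richer):

```python
def kocks_mecking(rho0, strain_steps, k1=1e8, k2=10.0, d_eps=1e-3):
    """Kocks-Mecking dislocation density evolution,
    d(rho)/d(eps) = k1*sqrt(rho) - k2*rho, integrated with forward Euler.
    The density saturates at (k1/k2)^2 where storage balances recovery;
    constants are illustrative, not calibrated to any alloy."""
    rho = rho0
    for _ in range(strain_steps):
        rho += d_eps * (k1 * rho ** 0.5 - k2 * rho)
    return rho

# Starting from a low density, the density climbs toward saturation (1e14 /m^2
# with these constants) as accumulated strain increases.
rho_final = kocks_mecking(1e10, 5000)
```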

  13. A non-classical Mindlin plate model incorporating microstructure, surface energy and foundation effects.

    PubMed

    Gao, X-L; Zhang, G Y

    2016-07-01

    A non-classical model for a Mindlin plate resting on an elastic foundation is developed in a general form using a modified couple stress theory, a surface elasticity theory and a two-parameter Winkler-Pasternak foundation model. It includes all five kinematic variables possible for a Mindlin plate. The equations of motion and the complete boundary conditions are obtained simultaneously through a variational formulation based on Hamilton's principle, and the microstructure, surface energy and foundation effects are treated in a unified manner. The newly developed model contains one material length-scale parameter to describe the microstructure effect, three surface elastic constants to account for the surface energy effect, and two foundation parameters to capture the foundation effect. The current non-classical plate model reduces to its classical elasticity-based counterpart when the microstructure, surface energy and foundation effects are all suppressed. In addition, the new model includes the Mindlin plate models considering the microstructure dependence or the surface energy effect or the foundation influence alone as special cases, recovers the Kirchhoff plate model incorporating the microstructure, surface energy and foundation effects, and degenerates to the Timoshenko beam model including the microstructure effect. To illustrate the new Mindlin plate model, the static bending and free vibration problems of a simply supported rectangular plate are analytically solved by directly applying the general formulae derived.

  14. A non-classical Mindlin plate model incorporating microstructure, surface energy and foundation effects

    PubMed Central

    Zhang, G. Y.

    2016-01-01

    A non-classical model for a Mindlin plate resting on an elastic foundation is developed in a general form using a modified couple stress theory, a surface elasticity theory and a two-parameter Winkler–Pasternak foundation model. It includes all five kinematic variables possible for a Mindlin plate. The equations of motion and the complete boundary conditions are obtained simultaneously through a variational formulation based on Hamilton's principle, and the microstructure, surface energy and foundation effects are treated in a unified manner. The newly developed model contains one material length-scale parameter to describe the microstructure effect, three surface elastic constants to account for the surface energy effect, and two foundation parameters to capture the foundation effect. The current non-classical plate model reduces to its classical elasticity-based counterpart when the microstructure, surface energy and foundation effects are all suppressed. In addition, the new model includes the Mindlin plate models considering the microstructure dependence or the surface energy effect or the foundation influence alone as special cases, recovers the Kirchhoff plate model incorporating the microstructure, surface energy and foundation effects, and degenerates to the Timoshenko beam model including the microstructure effect. To illustrate the new Mindlin plate model, the static bending and free vibration problems of a simply supported rectangular plate are analytically solved by directly applying the general formulae derived. PMID:27493578

  15. Firm performance model in small and medium enterprises (SMEs) based on learning orientation and innovation

    NASA Astrophysics Data System (ADS)

    Lestari, E. R.; Ardianti, F. L.; Rachmawati, L.

    2018-03-01

    This study investigated the relationships among learning orientation, innovation, and firm performance. A conceptual model and hypotheses were empirically examined using structural equation modelling. The study involved a questionnaire-based survey of owners of small and medium enterprises (SMEs) operating in Batu City, Indonesia. The results showed that both learning orientation and innovation have positive effects on firm performance, and that learning orientation also has a positive effect on innovation. This study has implications for SMEs aiming to increase their firm performance through learning orientation and innovation capability.

  16. [Effect of stock abundance and environmental factors on the recruitment success of small yellow croaker in the East China Sea].

    PubMed

    Liu, Zun-lei; Yuan, Xing-wei; Yang, Lin-lin; Yan, Li-ping; Zhang, Hui; Cheng, Jia-hua

    2015-02-01

    Multiple hypotheses are available to explain recruitment rate, and model selection methods can be used to identify the model that best supports a particular hypothesis. However, relying on a single model to estimate recruitment success is often inadequate for an overexploited population because of high model uncertainty. In this study, stock-recruitment data of small yellow croaker in the East China Sea, collected from fishery-dependent and fishery-independent surveys between 1992 and 2012, were used to examine density-dependent effects on recruitment success. Model selection methods based on frequentist criteria (AIC, maximum adjusted R² and P-values) and Bayesian model averaging (BMA) were applied to identify the relationship between recruitment and environmental conditions. Interannual variability of the East China Sea environment was represented by sea surface temperature (SST), meridional wind stress (MWS), zonal wind stress (ZWS), sea surface pressure (SPP) and runoff of the Changjiang River (RCR). Mean absolute error, mean squared predictive error and the continuous ranked probability score were calculated to evaluate the predictive performance for recruitment success. The results showed that model structures were not consistent across the three selection methods: the predictive variables were spawning abundance and MWS under AIC, spawning abundance alone under P-values, and spawning abundance, MWS and RCR under maximum adjusted R². Recruitment success decreased linearly with stock abundance (P < 0.01), suggesting that overcompensation in recruitment success might be due to cannibalism or food competition. Meridional wind intensity showed a marginally significant positive effect on recruitment success (P = 0.06), while runoff of the Changjiang River showed a marginally negative effect (P = 0.07). Based on mean absolute error and the continuous ranked probability score, the predictive error of the models obtained from BMA was the smallest among the approaches, while that of the models selected by the P-values of the independent variables was the largest; by mean squared predictive error, however, the models selected by maximum adjusted R² performed worst. We found that the BMA method could improve the prediction of recruitment success, derive more accurate prediction intervals and quantitatively evaluate model uncertainty.
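
    The model-averaging step can be approximated compactly from information criteria: with equal prior model probabilities, exp(-ΔBIC/2), normalised across models, approximates the posterior model probabilities used to weight per-model predictions. A sketch with invented BIC values, standing in for a full BMA:

```python
import math

def bma_weights(bics):
    """Posterior model probabilities approximated from BIC values
    (equal prior model probabilities assumed): w_i proportional to
    exp(-(BIC_i - min BIC) / 2)."""
    best = min(bics)
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def bma_predict(per_model_predictions, weights):
    """Model-averaged point prediction: weighted sum over candidate models."""
    return sum(w * p for w, p in zip(weights, per_model_predictions))

# Three candidate recruitment models with illustrative BIC values and
# illustrative per-model predictions of recruitment success.
weights = bma_weights([100.0, 102.0, 110.0])
avg = bma_predict([0.8, 1.1, 0.5], weights)
```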

  17. Stolen Base Physics

    NASA Astrophysics Data System (ADS)

    Kagan, David

    2013-05-01

    Few plays in baseball are as consistently close and exciting as the stolen base. While there are several studies of sprinting,2-4 the art of base stealing is much more nuanced. This article describes the motion of the base-stealing runner using a very basic kinematic model. The model will be compared to some data from a Major League game. The predictions of the model show consistency with the skills needed for effective base stealing.
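
    The "very basic kinematic model" lends itself to a short calculation: accelerate at a constant rate up to a top speed, then run at that speed to the bag. All numbers below are illustrative (metres and m/s), not data from the article:

```python
def steal_time(distance=27.4, lead=3.5, accel=5.0, v_max=9.0):
    """Time for a runner to cover the remaining distance to second base,
    assuming constant acceleration up to a top speed, then constant speed.
    distance = base-to-base distance (m), lead = initial lead-off (m);
    all parameter values are illustrative."""
    d = distance - lead
    d_accel = v_max ** 2 / (2.0 * accel)   # distance needed to reach top speed
    if d <= d_accel:
        return (2.0 * d / accel) ** 0.5    # never reaches top speed
    t_accel = v_max / accel
    return t_accel + (d - d_accel) / v_max
```

    With these numbers the model gives a steal time of roughly 3.5 s, in the right range for a Major League stolen-base attempt, and shows why a bigger lead (smaller remaining distance) pays off directly in time saved.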

  18. Combating weight-based cyberbullying on Facebook with the dissenter effect.

    PubMed

    Anderson, Jenn; Bresnahan, Mary; Musatics, Catherine

    2014-05-01

    Weight-based cyberbullying is prevalent among youth and adolescents and can have lasting negative psychological effects on the victims. One way to combat these negative effects is through modeling dissenting behavior. When a bystander challenges the bully or supports the victim, this models dissenting behavior. In this study, 181 participants were exposed to message manipulations posted on a Facebook page aimed at testing the conformity effect, the dissenter effect, and the bystander effect in response to enactment of weight-based bullying. Facebook is a common social media site where cyberbullying is reported. Results indicate that in the dissenting condition, participants' comments were significantly more positive or supporting for the victim, as compared to other conditions. This effect was more pronounced for men than for women. In addition, in the dissenting condition, men were less likely to consider the victim unhealthy than women and men in other conditions. These results support the effectiveness of efforts to model dissenting behavior in the face of bullies and extend them to online contexts. Implications are discussed.

  19. Modeling additive and non-additive effects in a hybrid population using genome-wide genotyping: prediction accuracy implications

    PubMed Central

    Bouvet, J-M; Makouanzi, G; Cros, D; Vigneron, Ph

    2016-01-01

    Hybrids are broadly used in plant breeding and accurate estimation of variance components is crucial for optimizing genetic gain. Genome-wide information may be used to explore models designed to assess the extent of additive and non-additive variance and test their prediction accuracy for genomic selection. Ten linear mixed models, involving pedigree- and marker-based relationship matrices among parents, were developed to estimate additive (A), dominance (D) and epistatic (AA, AD and DD) effects. Five complementary models, involving the gametic phase to estimate marker-based relationships among hybrid progenies, were developed to assess the same effects. The models were compared using tree height and 3303 single-nucleotide polymorphism markers from 1130 cloned individuals obtained via controlled crosses of 13 Eucalyptus urophylla females with 9 Eucalyptus grandis males. Akaike information criterion (AIC), variance ratios, asymptotic correlation matrices of estimates, goodness-of-fit, prediction accuracy and mean square error (MSE) were used for the comparisons. The variance components and variance ratios differed according to the model. Models with a parent marker-based relationship matrix performed better than those that were pedigree-based, that is, an absence of singularities, lower AIC, higher goodness-of-fit and accuracy and smaller MSE. However, AD and DD variances were estimated with high standard errors. Using the same criteria, progeny gametic phase-based models performed better in fitting the observations and predicting genetic values. However, DD variance could not be separated from the dominance variance and null estimates were obtained for AA and AD effects. This study highlighted the advantages of progeny models using genome-wide information. PMID:26328760
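
    The marker-based relationship matrices underlying such models are commonly built with VanRaden's method from a 0/1/2-coded genotype matrix. A generic sketch (a common construction, not necessarily the authors' exact specification):

```python
import numpy as np

def grm(M):
    """VanRaden genomic relationship matrix from an (individuals x markers)
    genotype matrix coded 0/1/2: centre each marker by twice its allele
    frequency and scale by 2*sum(p*(1-p)). Used to build marker-based
    relationship matrices like those compared in the study."""
    p = M.mean(axis=0) / 2.0        # estimated allele frequencies
    Z = M - 2.0 * p                 # centred genotypes
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

# Illustrative random genotypes for 10 individuals and 100 markers.
rng = np.random.default_rng(0)
G = grm(rng.integers(0, 3, size=(10, 100)).astype(float))
```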

  20. Numerical simulation of cryogenic cavitating flow by an extended transport-based cavitation model with thermal effects

    NASA Astrophysics Data System (ADS)

    Zhang, Shaofeng; Li, Xiaojun; Zhu, Zuchao

    2018-06-01

    Thermodynamic effects on cryogenic cavitating flow are important to the accuracy of numerical simulations, mainly because cryogenic fluids are thermo-sensitive and the vapour saturation pressure depends strongly on the local temperature. The present study analyses thermal cavitating flows in liquid nitrogen around a 2D hydrofoil. Thermal effects were considered using the RNG k-ε turbulence model with a modified turbulent eddy viscosity, together with a mass-transfer homogeneous cavitation model coupled with the energy equation. In the cavitation model, the saturated vapour pressure is modified based on the Clausius-Clapeyron equation, and a convection heat transfer approach is used to extend the Zwart-Gerber-Belamri model. The predicted pressure and temperature inside the cavity under cryogenic conditions show that the modified Zwart-Gerber-Belamri model agrees with the experimental data of Hord et al. at NASA, especially in the thermal field. The thermal effect significantly affects the cavitation dynamics during the phase-change process, and can delay or suppress the occurrence and development of cavitation. Based on the modified Zwart-Gerber-Belamri model proposed in this paper, better prediction of cryogenic cavitation is attainable.
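
    The temperature correction of the saturation pressure can be sketched directly from the integrated Clausius-Clapeyron relation, assuming constant latent heat. The default constants below are rough values for nitrogen (boiling point 77.36 K at 1 atm), not parameters from the paper:

```python
import math

def p_sat(T, T_ref=77.36, p_ref=101325.0, latent=1.99e5, r_gas=296.8):
    """Saturation vapour pressure from the integrated Clausius-Clapeyron
    relation, p = p_ref * exp(-(L/R) * (1/T - 1/T_ref)), assuming constant
    latent heat L (J/kg) and specific gas constant R (J/(kg*K)).
    Defaults are rough liquid-nitrogen values, for illustration only."""
    return p_ref * math.exp(-(latent / r_gas) * (1.0 / T - 1.0 / T_ref))
```

    In a thermal cavitation model, evaluating p_sat at the local (slightly depressed) cavity temperature rather than the free-stream temperature lowers the vapour pressure and thereby suppresses phase change, which is the delaying effect the abstract describes.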

  1. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    A viewgraph presentation on developing models, with systems engineers, that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps

  2. Postural effects on intracranial pressure: modeling and clinical evaluation.

    PubMed

    Qvarlander, Sara; Sundström, Nina; Malm, Jan; Eklund, Anders

    2013-11-01

    The physiological effect of posture on intracranial pressure (ICP) is not well described. This study defined and evaluated three mathematical models describing the postural effects on ICP, designed to predict ICP at different head-up tilt angles from the supine ICP value. Model I was based on a hydrostatic indifference point for the cerebrospinal fluid (CSF) system, i.e., the existence of a point in the system where pressure is independent of body position. Models II and III were based on Davson's equation for CSF absorption, which relates ICP to venous pressure, and postulated that gravitational effects within the venous system are transferred to the CSF system. Model II assumed a fully communicating venous system, and model III assumed that collapse of the jugular veins at higher tilt angles creates two separate hydrostatic compartments. Evaluation of the models was based on ICP measurements at seven tilt angles (0-71°) in 27 normal pressure hydrocephalus patients. ICP decreased with tilt angle (ANOVA: P < 0.01). The reduction was well predicted by model III (ANOVA lack-of-fit: P = 0.65), which showed excellent fit against measured ICP. Neither model I nor II adequately described the reduction in ICP (ANOVA lack-of-fit: P < 0.01). Postural changes in ICP could not be predicted based on the currently accepted theory of a hydrostatic indifference point for the CSF system, but a new model combining Davson's equation for CSF absorption and hydrostatic gradients in a collapsible venous system performed well and can be useful in future research on gravity and CSF physiology.
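
    Model I, the simplest of the three, is easy to write down: ICP falls by the hydrostatic column between the cranial measurement site and a fixed hydrostatic indifference point as tilt increases. The sketch below uses an illustrative indifference-point distance; note that the study found this simple model did NOT fit the data, whereas Model III (with a collapsible jugular venous compartment) did.

```python
import math

RHO_CSF = 1007.0        # kg/m^3, approximate CSF density
G = 9.81                # m/s^2
PA_PER_MMHG = 133.322   # pascals per mmHg

def icp_model_one(icp_supine_mmhg, tilt_deg, hip_dist_m=0.1):
    """Model-I-style prediction: ICP at a head-up tilt angle equals the
    supine ICP minus the hydrostatic column between the measurement site
    and a hydrostatic indifference point a fixed distance below it.
    The 0.1 m distance is illustrative, not a value from the paper."""
    dp_pa = RHO_CSF * G * hip_dist_m * math.sin(math.radians(tilt_deg))
    return icp_supine_mmhg - dp_pa / PA_PER_MMHG
```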

  3. The comparison of the effect of Transactional Model-based Teaching and Ordinary Education Curriculum-based Teaching programs on stress management among teachers.

    PubMed

    Mazloomy Mahmoodabad, Seyed Saeed; Mohammadi, Maryam; Zadeh, Davood Shojaei; Barkhordari, Abolfazl; Hosaini, Fatemeh; Kaveh, Mohammad Hossain; Malehi, Amal Saki; Rahiminegad, Mohammadkazem

    2014-04-09

    Regarding the effect of teachers' stress on teaching and learning processes, the researchers developed a stress management program based on the Transactional Model to address this problem among teachers. This study therefore investigated the effect of Transactional Model-based Teaching and Ordinary Education Curriculum-based Teaching programs on teachers in Yazd. The study was quasi-experimental. The sample (200 people) was selected using a categorized sampling method. Data were collected via the PSS Questionnaire and a questionnaire whose validity and reliability had been established. Eight teaching sessions of 60-90 min were held, and evaluation was performed in three steps. The data were described and analyzed using SPSS software version 15, with P<0.05 considered significant. The participants were 200 primary school teachers in Yazd; the mean ages of groups 1 and 2 were 42.05±5.69 and 41.25±5.89 years, respectively. Independent t-tests indicated significant differences in mean perceived stress scores between the intervention groups at the post-intervention and follow-up steps (p=0.000). Both programs reduced stress, but the Transactional Model-based intervention decreased stress more than the other.

  4. Models for Microbial Fuel Cells: A critical review

    NASA Astrophysics Data System (ADS)

    Xia, Chengshuo; Zhang, Daxing; Pedrycz, Witold; Zhu, Yingmin; Guo, Yongxian

    2018-01-01

    Microbial fuel cells (MFCs) have been widely viewed as one of the most promising alternative sources of renewable energy. Recognizing the need for efficient development methods based on multidisciplinary research is crucial for the optimization of MFCs. Modeling is an effective way not only to gain a thorough understanding of how operating conditions affect power generation performance, but is also essential to the successful implementation of MFCs. MFC models capture the underlying reaction processes and limiting factors of the MFC, and come in various forms, such as mathematical equations or equivalent circuits. Different modeling focuses and approaches have emerged. In this study, we present the state of the art of MFC modeling and review past modeling methods. Models and modeling methods are elaborated on under a classification into mechanism-based and application-based models, and the mechanisms, advantages, drawbacks and application fields of the different models are illustrated. We provide a complete and comprehensive exposition of the different models for MFCs and offer guidance for promoting the performance of MFCs.

  5. Illumination modelling of a mobile device environment for effective use in driving mobile apps

    NASA Astrophysics Data System (ADS)

    Marhoubi, Asmaa H.; Saravi, Sara; Edirisinghe, Eran A.; Bez, Helmut E.

    2015-05-01

    The present generation of ambient light sensors (ALS) in mobile handheld devices suffers from two practical shortcomings: the sensors respond effectively only within a narrow angle of operation, and they exhibit latency. As a result, mobile applications that operate based on ALS readings can perform sub-optimally, especially in environments with non-uniform illumination; such applications will either adapt with unacceptable latency and/or demonstrate a discrete nature of operation. In this paper we propose a framework to predict the ambient illumination of the environment in which a mobile device is present. The predictions are based on an illumination model developed from a small number of readings taken during an application calibration stage. We use a machine learning based approach to develop the models. Five regression models were developed, implemented, and compared, based on polynomial, Gaussian, sum-of-sine, Fourier, and smoothing spline functions. Noisy data, missing values, and outliers were removed prior to the modelling stage to avoid their negative effects on modelling. The prediction accuracy of all models was found to be above 0.99 when measured using the R-squared test, with the best performance coming from the smoothing spline. We also discuss the mathematical complexity of each model and investigate the compromises involved in finding the best model.
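
    The five candidate models were ranked by R-squared. A minimal sketch of that model-comparison step for the polynomial family, using hypothetical calibration readings (the angles, lux values, and degrees below are illustrative assumptions, not the paper's data):

```python
import numpy as np

# Hypothetical calibration readings: angle of incidence (degrees) vs. lux.
angles = np.array([0, 15, 30, 45, 60, 75, 90], dtype=float)
lux = np.array([950, 900, 760, 560, 340, 150, 40], dtype=float)

def r_squared(y, y_hat):
    """Coefficient of determination, the metric used to compare the models."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Fit polynomial models of increasing degree and score each with R-squared,
# mirroring the paper's comparison step (which also covered Gaussian,
# sum-of-sine, Fourier and smoothing-spline fits).
r2 = {}
for degree in (1, 2, 3):
    coeffs = np.polyfit(angles, lux, degree)
    r2[degree] = r_squared(lux, np.polyval(coeffs, angles))
    print(f"degree {degree}: R^2 = {r2[degree]:.4f}")
```

    For nested polynomial fits, R-squared can only increase with degree, which is one reason the paper also weighs each model's mathematical complexity when choosing among them.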

  6. Model for large magnetoresistance effect in p–n junctions

    NASA Astrophysics Data System (ADS)

    Cao, Yang; Yang, Dezheng; Si, Mingsu; Shi, Huigang; Xue, Desheng

    2018-06-01

    We present a simple model based on the classic Shockley model to explain magnetotransport in nonmagnetic p–n junctions. Under a magnetic field, the redistribution of carriers that compensates the Lorentz force establishes the necessary space-charge region distribution. The calculated current–voltage (I–V) characteristics under various magnetic fields demonstrate that a conventional nonmagnetic p–n junction can exhibit an extremely large magnetoresistance effect, even larger than that in magnetic materials. Because the large magnetoresistance effect discussed here is based on a conventional p–n junction device, our model provides new insight into the development of semiconductor magnetoelectronics.
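
    The starting point, the classic Shockley diode law I = I_s*(exp(V/V_T) - 1), already shows how a field-induced change in junction parameters translates into a large magnetoresistance ratio. In the sketch below the field dependence is reduced to a purely illustrative suppression of the saturation current; the actual space-charge-region mechanism is what the paper derives:

```python
import numpy as np

V_T = 0.02585                      # thermal voltage kT/q at 300 K, volts

def diode_current(v, i_s):
    """Classic Shockley diode law."""
    return i_s * (np.exp(v / V_T) - 1.0)

v = 0.5                            # forward bias, volts
i_s0 = 1e-12                       # zero-field saturation current, amperes
i_sB = 1e-13                       # illustrative value under a magnetic field

# Magnetoresistance at fixed bias, MR = (R_B - R_0)/R_0 with R = V/I:
r0 = v / diode_current(v, i_s0)
rB = v / diode_current(v, i_sB)
mr = (rB - r0) / r0
print(f"MR = {mr:.0%}")            # a 10x drop in I_s gives MR = 900%
```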

  7. Modeling the Player: Predictability of the Models of Bartle and Kolb Based on NEO-FFI (Big5) and the Implications for Game Based Learning

    ERIC Educational Resources Information Center

    Konert, Johannes; Gutjahr, Michael; Göbel, Stefan; Steinmetz, Ralf

    2014-01-01

    For adaptation and personalization of game play, sophisticated player models and learner models are used in game-based learning environments. Thus, the game flow can be optimized to increase the efficiency and effectiveness of gaming and learning in parallel. In the field of gaming, the Bartle model is still commonly used due to its simplicity and good…

  8. The Development and Evaluation of Speaking Learning Model by Cooperative Approach

    ERIC Educational Resources Information Center

    Darmuki, Agus; Andayani; Nurkamto, Joko; Saddhono, Kundharu

    2018-01-01

    A cooperative approach-based Speaking Learning Model (SLM) has been developed to improve the speaking skill of higher education students. This research aimed to evaluate the effectiveness of the cooperative-based SLM in terms of the development of students' speaking ability and its effect on speaking activity. This mixed-methods study combined…

  9. Detecting treatment-subgroup interactions in clustered data with generalized linear mixed-effects model trees.

    PubMed

    Fokkema, M; Smits, N; Zeileis, A; Hothorn, T; Kelderman, H

    2017-10-25

    Identification of subgroups of patients for whom treatment A is more effective than treatment B, and vice versa, is of key importance to the development of personalized medicine. Tree-based algorithms are helpful tools for the detection of such interactions, but none of the available algorithms allow for taking into account clustered or nested dataset structures, which are particularly common in psychological research. Therefore, we propose the generalized linear mixed-effects model tree (GLMM tree) algorithm, which allows for the detection of treatment-subgroup interactions, while accounting for the clustered structure of a dataset. The algorithm uses model-based recursive partitioning to detect treatment-subgroup interactions, and a GLMM to estimate the random-effects parameters. In a simulation study, GLMM trees show higher accuracy in recovering treatment-subgroup interactions, higher predictive accuracy, and lower type II error rates than linear-model-based recursive partitioning and mixed-effects regression trees. Also, GLMM trees show somewhat higher predictive accuracy than linear mixed-effects models with pre-specified interaction effects, on average. We illustrate the application of GLMM trees on an individual patient-level data meta-analysis on treatments for depression. We conclude that GLMM trees are a promising exploratory tool for the detection of treatment-subgroup interactions in clustered datasets.
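
    The partitioning idea behind GLMM trees can be illustrated with a toy search for the moderator value at which the treatment effect changes. This is a deliberate simplification: the data, split search, and effect estimates below are illustrative, whereas the actual algorithm uses score-based parameter-stability tests and fits the random effects jointly with the tree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy clustered dataset with a treatment-subgroup interaction: treatment A
# outperforms B when the moderator x > 0, and vice versa.
n = 400
x = rng.uniform(-1, 1, n)              # candidate partitioning variable
treat = rng.integers(0, 2, n)          # 0 = treatment B, 1 = treatment A
cluster = rng.integers(0, 20, n)       # nesting structure, e.g. study site
u = rng.normal(0.0, 0.5, 20)[cluster]  # random intercept per cluster
y = np.where(x > 0, treat, 1 - treat) + u + rng.normal(0.0, 0.5, n)

def effect(mask):
    """Estimated treatment effect (A minus B) within a subgroup."""
    return y[mask & (treat == 1)].mean() - y[mask & (treat == 0)].mean()

# Search for the split of x where the subgroup-specific treatment effects
# differ most; the true change point is x = 0 by construction.
splits = np.linspace(-0.8, 0.8, 33)
best = max(splits, key=lambda s: abs(effect(x <= s) - effect(x > s)))
print(f"split at x = {best:+.2f}: "
      f"effect {effect(x <= best):+.2f} vs {effect(x > best):+.2f}")
```

    The recovered split sits near the true change point, with treatment effects of opposite sign on either side, which is exactly the kind of structure a GLMM tree reports as a treatment-subgroup interaction.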

  10. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  11. Adequacy Model for School Funding

    ERIC Educational Resources Information Center

    Banicki, Guy; Murphy, Gregg

    2014-01-01

    This study considers the effectiveness of the Evidence-Based Adequacy model of school funding. One researcher has been centrally associated with the development and study of this model: Allen Odden, currently a professor in the Department of Educational Leadership and Policy…

  12. [Advance in researches on the effect of forest on hydrological process].

    PubMed

    Zhang, Zhiqiang; Yu, Xinxiao; Zhao, Yutao; Qin, Yongsheng

    2003-01-01

    According to the effects of forest on hydrological processes, forest hydrology covers three related aspects: experimental research on the effects of forest change on water quantity and water quality; mechanistic study of the effects of forest change on the hydrological cycle; and the development and application of physically based, distributed forest hydrological models for resource management and engineering construction. Long-term field experiments not only supply first-hand data for forest hydrological models but also clarify precipitation-runoff mechanisms. Research on runoff mechanisms is in turn valuable for developing and improving physically based hydrological models, and the models can likewise guide experimental and runoff-mechanism research. These three aspects are reviewed in this paper.

  13. Analysis of radiative and phase-change phenomena with application to space-based thermal energy storage

    NASA Technical Reports Server (NTRS)

    Lund, Kurt O.

    1991-01-01

    The simplified geometry for the analysis is an infinite, axisymmetric annulus with a specified solar flux at the outer radius. The inner radius is either adiabatic (modeling Flight Experiment conditions) or convective (modeling Solar Dynamic conditions). Liquid LiF either contacts the outer wall (modeling ground-based testing) or faces a void gap at the outer wall (modeling possible space-based conditions). The analysis is presented in three parts: part 3 considers an adiabatic inner wall and linearized radiation equations; part 2 adds the effects of convection at the inner wall; and part 1 includes the effect of the void gap, as well as the previous effects, and develops the radiation model further. The main results concern the differences in melting behavior that can occur between ground-based 1 g experiments and microgravity flight experiments. Under 1 g, melted PCM always contacts the outer wall carrying the heat flux source, providing conductance from this source to the phase-change front. In space-based tests, where a void gap may well form during solidification, the situation is reversed: radiation is the only mode of heat transfer, and the majority of melting takes place from the inner wall.

  14. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  15. A PHYSIOLOGICALLY BASED TOXICOKINETIC MODEL FOR LAKE TROUT (SALVELINUS NAMAYCUSH)

    EPA Science Inventory

    A physiologically based toxicokinetic (PB-TK) model for fish, incorporating chemical exchange at the gill and accumulation in five tissue compartments, was used to examine the effect of natural variability in physiological, morphological, and physico-chemical parameters on model ...

  16. A 4DCT imaging-based breathing lung model with relative hysteresis

    PubMed Central

    Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.; Lin, Ching-Long

    2016-01-01

    To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce a smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models, one based on multiple images and one based on a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points, due to the differences in airflow distribution and airway geometry. PMID:28260811

  17. A 4DCT imaging-based breathing lung model with relative hysteresis

    NASA Astrophysics Data System (ADS)

    Miyawaki, Shinjiro; Choi, Sanghun; Hoffman, Eric A.; Lin, Ching-Long

    2016-12-01

    To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce a smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models, one based on multiple images and one based on a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points, due to the differences in airflow distribution and airway geometry.

  18. Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation

    NASA Astrophysics Data System (ADS)

    Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong

    2017-05-01

    Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained, a novel Hammerstein-Wiener representation of engine models is introduced, and, based on the Hammerstein-Wiener model, a nonlinear generalized-minimum-variance type of optimal control law is derived. A key feature of the proposed approach is that it does not require the inversion operation that often hampers nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
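
    A Hammerstein-Wiener model sandwiches a linear dynamic block between two static nonlinearities. A minimal simulation sketch, where the nonlinearities, pole, and step input are illustrative placeholders rather than the identified turbofan model:

```python
import numpy as np

# Static input nonlinearity (Hammerstein block), e.g. actuator saturation.
f = lambda u: np.tanh(u)
# Static output nonlinearity (Wiener block), e.g. a sensor map.
g = lambda v: v + 0.1 * v ** 2
a = 0.9  # pole of the linear block: v[k] = a*v[k-1] + (1-a)*w[k]

def hammerstein_wiener(u_seq):
    v, y = 0.0, []
    for u in u_seq:
        w = f(u)                  # input nonlinearity
        v = a * v + (1 - a) * w   # linear first-order dynamics
        y.append(g(v))            # output nonlinearity
    return np.array(y)

# Step response: the output settles toward g(tanh(1)) ~= 0.820.
y = hammerstein_wiener(np.ones(100))
print(f"{y[-1]:.3f}")
```

    Identification then reduces to estimating the two static maps and the linear block; the control law in the paper exploits this structure to avoid inverting the nonlinearities.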

  19. Coupling of the simultaneous heat and water model with a distributed hydrological model and evaluation of the combined model in a cold region watershed

    USDA-ARS?s Scientific Manuscript database

    To represent the effects of frozen soil on hydrology in cold regions, a new physically based distributed hydrological model has been developed by coupling the simultaneous heat and water model (SHAW) with the geomorphology based distributed hydrological model (GBHM), under the framework of the water...

  20. A Review of Element-Based Galerkin Methods for Numerical Weather Prediction

    DTIC Science & Technology

    2015-04-01

    with body forces to model the effects of gravity and the Earth's rotation (i.e. Coriolis force). Although the gravitational force varies with both...more phenomena (e.g. resolving non-hydrostatic effects, incorporating more complex moisture parameterizations), their appetite for High Performance...operation effectively). For instance, the ST-based model NOGAPS, used by the U.S. Navy, could not scale beyond 150 processes at typical resolutions [119

  1. Examining the Big-Fish-Little-Pond Effect on Students' Self-Concept of Learning Science in Taiwan Based on the TIMSS Databases

    ERIC Educational Resources Information Center

    Liou, Pey-Yan

    2014-01-01

    The purpose of this study is to examine the relationship between student self-concept and achievement in science in Taiwan based on the big-fish-little-pond effect (BFLPE) model using the Trends in International Mathematics and Science Study (TIMSS) 2003 and 2007 databases. Hierarchical linear modeling was used to examine the effects of the…

  2. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is built around "cultural engagements", discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is itself the objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that does not empower learners to understand the full consequences of their actions. We propose the use of agent-based modeling (ABM) to collect and store engagement effects and, by simulating social networks, propagate those effects over time, distance, and consequence. The ABM development allows rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  3. Cost-effectiveness of Population Screening for BRCA Mutations in Ashkenazi Jewish Women Compared With Family History–Based Testing

    PubMed Central

    Manchanda, Ranjit; Legood, Rosa; Burnell, Matthew; McGuire, Alistair; Raikou, Maria; Loggenberg, Kelly; Wardle, Jane; Sanderson, Saskia; Gessler, Sue; Side, Lucy; Balogun, Nyala; Desai, Rakshit; Kumar, Ajith; Dorkins, Huw; Wallis, Yvonne; Chapman, Cyril; Taylor, Rohan; Jacobs, Chris; Tomlinson, Ian; Beller, Uziel; Menon, Usha

    2015-01-01

    Background: Population-based testing for BRCA1/2 mutations detects the high proportion of carriers not identified by cancer family history (FH)–based testing. We compared the cost-effectiveness of population-based BRCA testing with the standard FH-based approach in Ashkenazi Jewish (AJ) women. Methods: A decision-analytic model was developed to compare lifetime costs and effects of BRCA founder-mutation testing amongst AJ women in the UK for: 1) all women in the population age 30 years or older and 2) only those with a strong FH (≥10% mutation risk). The model assumes that BRCA carriers are offered risk-reducing salpingo-oophorectomy and annual MRI/mammography screening or risk-reducing mastectomy. Model probabilities utilize the Genetic Cancer Prediction through Population Screening trial and published literature to estimate total costs, effects in terms of quality-adjusted life-years (QALYs), cancer incidence, the incremental cost-effectiveness ratio (ICER), and population impact. Costs are reported at 2010 prices; costs and outcomes were discounted at 3.5%. Deterministic and probabilistic sensitivity analyses (PSA) were used to evaluate model uncertainty. Results: Compared with FH-based testing, population screening saved 0.090 more life-years and 0.101 more QALYs, a 33-day gain in life expectancy. Population screening was found to be cost saving, with a baseline-discounted ICER of -£2079/QALY. Population-based screening lowered ovarian and breast cancer incidence by 0.34% and 0.62%; assuming 71% testing uptake, this leads to 276 fewer ovarian and 508 fewer breast cancer cases. Overall, the reduction in treatment costs led to a discounted cost saving of £3.7 million. Deterministic sensitivity analysis and 94% of simulations in the PSA (£20000 threshold) indicated that population screening is cost-effective compared with current NHS policy.
Conclusion: Population-based screening for BRCA mutations is highly cost-effective compared with an FH-based approach in AJ women age 30 years and older. PMID:25435542
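
    The headline figures follow from the standard definitions ICER = ΔCost/ΔQALY and NMB = ΔQALY × λ − ΔCost at willingness-to-pay λ. In the sketch below, the incremental cost of -£210 is back-calculated from the reported ICER and QALY gain rather than taken from the paper's cost tables:

```python
# Worked cost-effectiveness arithmetic for population vs. FH-based testing.
wtp_threshold = 20000   # willingness-to-pay, GBP per QALY (NICE-style threshold)
delta_qalys = 0.101     # incremental QALYs reported in the abstract
delta_cost = -210.0     # GBP; implied by the reported ICER, negative = cost saving

icer = delta_cost / delta_qalys                 # GBP per QALY gained
nmb = delta_qalys * wtp_threshold - delta_cost  # net monetary benefit, GBP

print(f"ICER: {icer:.0f} GBP/QALY")  # negative ICER with a QALY gain: dominant strategy
print(f"NMB:  {nmb:.0f} GBP")        # positive NMB: cost-effective at the threshold
```

    A negative ICER paired with a QALY gain means the strategy both saves money and improves outcomes, which is why the abstract can call population screening cost saving rather than merely cost-effective.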

  4. Quantifying the Global Nitrous Oxide Emissions Using a Trait-based Biogeochemistry Model

    NASA Astrophysics Data System (ADS)

    Zhuang, Q.; Yu, T.

    2017-12-01

    Nitrogen is an essential element in the global biogeochemical cycle. It is a key nutrient for organisms, and N compounds including nitrous oxide significantly influence the global climate. The activities of bacteria and archaea drive nitrification and denitrification in a wide variety of environments, so microbes play an important role in the nitrogen cycle in soils. To date, most process-based models have treated nitrification and denitrification as chemical reactions driven by soil physical variables such as soil temperature and moisture; in general, the effect of microbes on N cycling has not been modeled in sufficient detail. Soil organic carbon also affects the N cycle because it supplies energy to microbes. In this study, a trait-based biogeochemistry model quantifying N2O emissions from terrestrial ecosystems is developed from an extant process-based model, TEM (Terrestrial Ecosystem Model). Specifically, the improvements to TEM include: 1) incorporating the N fixation process to account for the inflow of N from the atmosphere to the biosphere; 2) implementing the effects of microbial dynamics on the nitrification process; and 3) fully considering the effects of carbon cycling on nitrogen cycling, following the principles of carbon and nitrogen stoichiometry in soils, plants, and microbes. The difference between simulations with and without the consideration of bacterial activity lies between 5% and 25%, depending on climate conditions and vegetation types. The trait-based module allows a more detailed estimation of global N2O emissions.

  5. e-CBT (myCompass), Antidepressant Medication, and Face-to-Face Psychological Treatment for Depression in Australia: A Cost-Effectiveness Comparison

    PubMed Central

    2015-01-01

    Background The economic cost of depression is becoming an ever more important determinant for health policy and decision makers. Internet-based interventions with and without therapist support have been found to be effective options for the treatment of mild to moderate depression. With increasing demands on health resources and shortages of mental health care professionals, the integration of cost-effective treatment options such as Internet-based programs into primary health care could increase efficiency in terms of resource use and costs. Objective Our aim was to evaluate the cost-effectiveness of an Internet-based intervention (myCompass) for the treatment of mild-to-moderate depression compared to treatment as usual and cognitive behavior therapy in a stepped care model. Methods A decision model was constructed using a cost utility framework to show both costs and health outcomes. In accordance with current treatment guidelines, a stepped care model included myCompass as the first low-intervention step in care for a proportion of the model cohort, with participants progressing from a low-intensity intervention to increasing levels of treatment. Model parameters were based on data from the recent randomized controlled trial of myCompass, which showed that the intervention reduced symptoms of depression, anxiety, and stress and improved work and social functioning for people with symptoms in the mild-to-moderate range. Results The average net monetary benefit (NMB) was calculated, identifying myCompass as the strategy with the highest net benefit. The mean incremental NMB per individual for the myCompass group was AUD 1165.88 compared to treatment as usual and AUD 522.58 for the cognitive behavioral therapy model. Conclusions Internet-based interventions can provide cost-effective access to treatment when provided as part of a stepped care model. 
Widespread dissemination of Internet-based programs can potentially reduce demands on primary and tertiary services and reduce unmet need. PMID:26561555

  6. e-CBT (myCompass), Antidepressant Medication, and Face-to-Face Psychological Treatment for Depression in Australia: A Cost-Effectiveness Comparison.

    PubMed

    Solomon, Daniela; Proudfoot, Judith; Clarke, Janine; Christensen, Helen

    2015-11-11

    The economic cost of depression is becoming an ever more important determinant for health policy and decision makers. Internet-based interventions with and without therapist support have been found to be effective options for the treatment of mild to moderate depression. With increasing demands on health resources and shortages of mental health care professionals, the integration of cost-effective treatment options such as Internet-based programs into primary health care could increase efficiency in terms of resource use and costs. Our aim was to evaluate the cost-effectiveness of an Internet-based intervention (myCompass) for the treatment of mild-to-moderate depression compared to treatment as usual and cognitive behavior therapy in a stepped care model. A decision model was constructed using a cost utility framework to show both costs and health outcomes. In accordance with current treatment guidelines, a stepped care model included myCompass as the first low-intervention step in care for a proportion of the model cohort, with participants progressing from a low-intensity intervention to increasing levels of treatment. Model parameters were based on data from the recent randomized controlled trial of myCompass, which showed that the intervention reduced symptoms of depression, anxiety, and stress and improved work and social functioning for people with symptoms in the mild-to-moderate range. The average net monetary benefit (NMB) was calculated, identifying myCompass as the strategy with the highest net benefit. The mean incremental NMB per individual for the myCompass group was AUD 1165.88 compared to treatment as usual and AUD 522.58 for the cognitive behavioral therapy model. Internet-based interventions can provide cost-effective access to treatment when provided as part of a stepped care model. Widespread dissemination of Internet-based programs can potentially reduce demands on primary and tertiary services and reduce unmet need.

  7. The Effect of Environmental Regulation on Employment in Resource-Based Areas of China-An Empirical Research Based on the Mediating Effect Model.

    PubMed

    Cao, Wenbin; Wang, Hui; Ying, Huihui

    2017-12-19

    As environmental pollution becomes more serious, many countries are adopting policies to control it. At the same time, environmental regulation inevitably affects economic and social development, especially employment growth. Environmental regulation affects the scale of employment not only directly but also indirectly, by stimulating upgrades in the industrial structure and in technological innovation. This paper examines the impact of environmental regulation on employment using a mediating-effect model based on data from five typical resource-based provinces in China from 2000 to 2015. Estimation is performed with the system GMM (Generalized Method of Moments) estimator. The results show that the implementation of environmental regulation in resource-based areas has both a direct effect and a mediating effect on employment. These findings provide policy implications for these resource-based areas to promote coordinated development of the environment and employment.
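
    The mediating-effect logic can be sketched with a Baron-Kenny style decomposition: the total effect of regulation on employment splits exactly into a direct path and an indirect path through the mediator. The simulated data and plain OLS below are illustrative stand-ins for the paper's provincial panel and system GMM estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated cross-section: regulation -> industrial upgrading -> employment.
n = 1000
reg = rng.normal(size=n)                            # regulation intensity
med = 0.6 * reg + rng.normal(size=n)                # mediator: upgrading/innovation
emp = -0.3 * reg + 0.5 * med + rng.normal(size=n)   # employment outcome

def ols_slope(y, *xs):
    """OLS coefficient on the first regressor, intercept included."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

a = ols_slope(med, reg)            # path a: regulation -> mediator
b = ols_slope(emp, med, reg)       # path b: mediator -> employment, regulation fixed
direct = ols_slope(emp, reg, med)  # direct effect of regulation
total = ols_slope(emp, reg)        # total effect

print(f"indirect (a*b): {a*b:+.2f}, direct: {direct:+.2f}, total: {total:+.2f}")
```

    In this linear setting the identity total = direct + a*b holds exactly, which is the decomposition a mediating-effect model tests; the paper's contribution is estimating it on dynamic panel data where OLS would be biased.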

  8. The Effect of Environmental Regulation on Employment in Resource-Based Areas of China—An Empirical Research Based on the Mediating Effect Model

    PubMed Central

    Cao, Wenbin; Wang, Hui; Ying, Huihui

    2017-01-01

    As environmental pollution becomes more serious, many countries are adopting policies to control it. At the same time, environmental regulation inevitably affects economic and social development, especially employment growth. Environmental regulation affects the scale of employment not only directly but also indirectly, by stimulating upgrades in the industrial structure and in technological innovation. This paper examines the impact of environmental regulation on employment using a mediating-effect model based on data from five typical resource-based provinces in China from 2000 to 2015. Estimation is performed with the system GMM (Generalized Method of Moments) estimator. The results show that the implementation of environmental regulation in resource-based areas has both a direct effect and a mediating effect on employment. These findings provide policy implications for these resource-based areas to promote coordinated development of the environment and employment. PMID:29257068

  9. A Bifactor Multidimensional Item Response Theory Model for Differential Item Functioning Analysis on Testlet-Based Items

    ERIC Educational Resources Information Center

    Fukuhara, Hirotaka; Kamata, Akihito

    2011-01-01

    A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…

  10. Diagnosing Students' Mental Models via the Web-Based Mental Models Diagnosis System

    ERIC Educational Resources Information Center

    Wang, Tzu-Hua; Chiu, Mei-Hung; Lin, Jing-Wen; Chou, Chin-Cheng

    2013-01-01

    Mental models play an important role in science education research. To extend the effectiveness of conceptual change research and to improve mental model identification and diagnosis, the authors developed and tested the Web-Based Mental Models Diagnosis (WMMD) system. In this article, they describe their WMMD system, which goes beyond the…

  11. Diffraction peak profiles of surface relaxed spherical nanocrystals

    NASA Astrophysics Data System (ADS)

    Perez-Demydenko, C.; Scardi, P.

    2017-09-01

    A model is proposed for surface relaxation of spherical nanocrystals. Besides reproducing the primary effect of changing the average unit cell parameter, the model accounts for the inhomogeneous atomic displacement caused by surface relaxation and its effect on the diffraction line profiles. Based on three parameters with clear physical meanings - extension of the sub-coordination effect, maximum radial displacement due to sub-coordination, and effective hydrostatic pressure - the model also considers elastic anisotropy and provides parametric expressions of the diffraction line profiles directly applicable in data analysis. The model was tested on spherical nanocrystals of several fcc metals, matching atomic positions with those provided by Molecular Dynamics (MD) simulations based on embedded atom potentials. Agreement was also verified between powder diffraction patterns generated by the Debye scattering equation, using atomic positions from MD and the proposed model.

  12. Designing Effective Online Instruction: A Handbook for Web-Based Courses

    ERIC Educational Resources Information Center

    Koontz, Franklin R.; Li, Hongqin; Compora, Daniel P.

    2006-01-01

    The designing of online courses requires a radical change in the way the instruction is designed and presented to the student. To date, however, there are no research-based models, using a systems approach, that are available to design Web-based instruction. This book introduces the ASSIST-ME Model, an instructional design model for Web-based…

  13. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up.

    PubMed

    Massah, Omid; Sohrabi, Faramarz; A'azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-03-01

    Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. The present study aimed to determine the effectiveness of Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. The study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment, who were then randomly assigned to the experimental and control groups. The experimental group received its training in eight two-hour sessions. Data were analyzed using analysis of co-variance and paired t-tests. There was a significant reduction in anger symptoms of drug-dependent individuals after Gross model-based emotion regulation training (ERT) (P < 0.001). Moreover, the effectiveness of the training on anger persisted in the follow-up period. Symptoms of anger in the drug-dependent individuals of this study were reduced by Gross model-based emotion regulation strategies training. Based on the results of this study, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation.

  14. Efficient Vaccine Distribution Based on a Hybrid Compartmental Model.

    PubMed

    Yu, Zhiwen; Liu, Jiming; Wang, Xiaowei; Zhu, Xianjun; Wang, Daxing; Han, Guoqiang

    2016-01-01

    To effectively and efficiently reduce the morbidity and mortality that may be caused by outbreaks of emerging infectious diseases, it is very important for public health agencies to make informed decisions for controlling the spread of the disease. Such decisions must incorporate various kinds of intervention strategies, such as vaccinations, school closures and border restrictions. Recently, researchers have paid increased attention to searching for effective vaccine distribution strategies for reducing the effects of pandemic outbreaks when resources are limited. Most of the existing research work has been focused on how to design an effective age-structured epidemic model and to select a suitable vaccine distribution strategy to prevent the propagation of an infectious virus. Models that evaluate age structure effects are common, but models that additionally evaluate geographical effects are less common. In this paper, we propose a new SEIR (susceptible-exposed-infectious-recovered) model, named the hybrid SEIR-V model (HSEIR-V), which considers not only the dynamics of infection prevalence in several age-specific host populations, but also seeks to characterize the dynamics by which a virus spreads in various geographic districts. Several vaccination strategies such as different kinds of vaccine coverage, different vaccine releasing times and different vaccine deployment methods are incorporated into the HSEIR-V compartmental model. We also design four hybrid vaccination distribution strategies (based on population size, contact pattern matrix, infection rate and infectious risk) for controlling the spread of viral infections. Based on data from the 2009-2010 H1N1 influenza epidemic, we evaluate the effectiveness of our proposed HSEIR-V model and study the effects of different types of human behaviour in responding to epidemics.
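As a rough illustration of the compartmental core underlying such models (not the authors' HSEIR-V, which additionally stratifies by age group, geographic district, and vaccination), a minimal single-population SEIR step with illustrative parameters can be sketched as:

```python
# Minimal SEIR sketch with forward-Euler integration.
# beta, sigma, gamma and the population figures below are illustrative,
# not the HSEIR-V parameters fitted to the 2009-2010 H1N1 data.

def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    """One Euler step of dS/dt=-beta*S*I/N, dE/dt=beta*S*I/N-sigma*E,
    dI/dt=sigma*E-gamma*I, dR/dt=gamma*I."""
    n = s + e + i + r
    new_exposed = beta * s * i / n * dt
    new_infectious = sigma * e * dt
    new_recovered = gamma * i * dt
    return (s - new_exposed,
            e + new_exposed - new_infectious,
            i + new_infectious - new_recovered,
            r + new_recovered)

def simulate(days=160, dt=0.1, beta=0.3, sigma=1 / 5.2, gamma=1 / 7,
             n=1_000_000, i0=10):
    s, e, i, r = float(n - i0), 0.0, float(i0), 0.0
    for _ in range(int(days / dt)):
        s, e, i, r = seir_step(s, e, i, r, beta, sigma, gamma, dt)
    return s, e, i, r
```

Vaccination strategies of the kind compared in the paper would enter as an additional flow moving individuals from S directly to R.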

  16. Cost-effectiveness of the community-based management of severe acute malnutrition by community health workers in southern Bangladesh.

    PubMed

    Puett, Chloe; Sadler, Kate; Alderman, Harold; Coates, Jennifer; Fiedler, John L; Myatt, Mark

    2013-07-01

    This study assessed the cost-effectiveness of adding the community-based management of severe acute malnutrition (CMAM) to a community-based health and nutrition programme delivered by community health workers (CHWs) in southern Bangladesh. The cost-effectiveness of this model of treatment for severe acute malnutrition (SAM) was compared with the cost-effectiveness of the 'standard of care' for SAM (i.e. inpatient treatment), augmented with community surveillance by CHWs to detect cases, in a neighbouring area. An activity-based cost model was used, and a societal perspective taken, to include all costs incurred in the programme by providers and participants for the management of SAM in both areas. Cost data were coupled with programme effectiveness data. The community-based strategy cost US$26 per disability-adjusted life year (DALY) averted, compared with US$1344 per DALY averted for inpatient treatment. The average cost to participant households for their child to recover from SAM in community treatment was one-sixth that of inpatient treatment. These results suggest that this model of treatment for SAM is highly cost-effective and that CHWs, given adequate supervision and training, can be employed effectively to expand access to treatment for SAM in Bangladesh.
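The headline comparison is a simple cost-effectiveness ratio. A sketch of the arithmetic, using the reported US$26 and US$1344 per DALY figures (the total-cost and DALY denominators below are hypothetical, chosen only to reproduce those ratios):

```python
def cost_per_daly_averted(total_cost_usd, dalys_averted):
    """Cost-effectiveness ratio: programme cost divided by DALYs averted."""
    return total_cost_usd / dalys_averted

# Hypothetical totals that reproduce the reported per-DALY figures:
community = cost_per_daly_averted(26_000, 1_000)      # US$26 per DALY averted
inpatient = cost_per_daly_averted(1_344_000, 1_000)   # US$1344 per DALY averted
advantage = inpatient / community                     # roughly 50-fold difference
```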

  17. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  18. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil in achieving device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and wafer SEM images, and they can extract whole-chip CD variation information. Based on these results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed over the full chip area using a well-calibrated model; its object is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a huge amount of wafer-result data was classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that uses the combination of design-based metrology and model-based verification tools.

  19. The fractional volatility model: An agent-based interpretation

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.

  20. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    NASA Astrophysics Data System (ADS)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement while controlling for students' intelligence. This is an experimental study. The sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulations. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, lectures should use the environmental-utilization thermodynamics learning model with project assessment techniques.

  1. Micromechanics based phenomenological damage modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muju, S.; Anderson, P.M.; Popelar, C.H.

    A model is developed for the study of process zone effects on dominant cracks. The model proposed here is intended to bridge the gap between micromechanics-based and phenomenological models for the class of problems involving microcracking, transforming inclusions, etc. It is based on the representation of localized eigenstrains using dislocation dipoles. The eigenstrain (fitting strain) is represented as the strength (Burgers vector) of the dipole, which obeys a certain phenomenological constitutive relation.

  2. Lithologic Effects on Landscape Response to Base Level Changes: A Modeling Study in the Context of the Eastern Jura Mountains, Switzerland

    NASA Astrophysics Data System (ADS)

    Yanites, Brian J.; Becker, Jens K.; Madritsch, Herfried; Schnellmann, Michael; Ehlers, Todd A.

    2017-11-01

    Landscape evolution is a product of the forces that drive geomorphic processes (e.g., tectonics and climate) and the resistance to those processes. The underlying lithology and structural setting in many landscapes set the resistance to erosion. This study uses a modified version of the Channel-Hillslope Integrated Landscape Development (CHILD) landscape evolution model to determine the effect of a spatially and temporally changing erodibility in a terrain with a complex base level history. Specifically, our focus is to quantify how the effects of variable lithology influence transient base level signals. We set up a series of numerical landscape evolution models with increasing levels of complexity based on the lithologic variability and base level history of the Jura Mountains of northern Switzerland. The models are consistent with lithology (and therewith erodibility) playing an important role in the transient evolution of the landscape. The results show that the erosion rate history at a location depends on the rock uplift and base level history, the range of erodibilities of the different lithologies, and the history of the surface geology downstream from the analyzed location. Near the model boundary, the history of erosion is dominated by the base level history. The transient wave of incision, however, is quite variable in the different model runs and depends on the geometric structure of lithology used. It is thus important to constrain the spatiotemporal erodibility patterns downstream of any given point of interest to understand the evolution of a landscape subject to variable base level in a quantitative framework.
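The lithologic control enters such models through the erodibility coefficient of the detachment-limited stream power law used by CHILD-style codes. A minimal sketch of that relation, E = K A^m S^n, with illustrative exponents and erodibility values (not the study's calibrated Jura parameters):

```python
def stream_power_erosion(k_erod, drainage_area, slope, m=0.5, n=1.0):
    """Detachment-limited stream power erosion rate E = K * A**m * S**n.

    k_erod encodes lithologic resistance (higher K = weaker rock);
    m=0.5, n=1.0 are commonly used exponents, illustrative here.
    """
    return k_erod * drainage_area ** m * slope ** n

# Same channel geometry, two hypothetical lithologies:
e_weak = stream_power_erosion(1e-5, drainage_area=1e6, slope=0.01)      # e.g. marl
e_resistant = stream_power_erosion(1e-6, drainage_area=1e6, slope=0.01)  # e.g. limestone
```

In a transient run, a knickpoint generated at base level migrates quickly through high-K reaches and stalls in low-K reaches, which is the mechanism behind the variable incision wave the abstract describes.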

  3. Application of physiologically based absorption modeling to formulation development of a low solubility, low permeability weak base: mechanistic investigation of food effect.

    PubMed

    Zhang, Hefei; Xia, Binfeng; Sheng, Jennifer; Heimbach, Tycho; Lin, Tsu-Han; He, Handan; Wang, Yanfeng; Novick, Steven; Comfort, Ann

    2014-04-01

    Physiologically based pharmacokinetic (PBPK) modeling has been broadly used to facilitate drug development, hereby we developed a PBPK model to systematically investigate the underlying mechanisms of the observed positive food effect of compound X (cpd X) and to strategically explore the feasible approaches to mitigate the food effect. Cpd X is a weak base with pH-dependent solubility; the compound displays significant and dose-dependent food effect in humans, leading to a nonadherence of drug administration. A GastroPlus Opt logD Model was selected for pharmacokinetic simulation under both fasted and fed conditions, where the biopharmaceutic parameters (e.g., solubility and permeability) for cpd X were determined in vitro, and human pharmacokinetic disposition properties were predicted from preclinical data and then optimized with clinical pharmacokinetic data. A parameter sensitivity analysis was performed to evaluate the effect of particle size on the cpd X absorption. A PBPK model was successfully developed for cpd X; its pharmacokinetic parameters (e.g., C max, AUCinf, and t max) predicted at different oral doses were within ±25% of the observed mean values. The in vivo solubility (in duodenum) and mean precipitation time under fed conditions were estimated to be 7.4- and 3.4-fold higher than those under fasted conditions, respectively. The PBPK modeling analysis provided a reasonable explanation for the underlying mechanism for the observed positive food effect of the cpd X in humans. Oral absorption of the cpd X can be increased by reducing the particle size (<100 nm) of an active pharmaceutical ingredient under fasted conditions and therefore, reduce the cpd X food effect correspondingly.

  4. Modeling the population-level effects of hypoxia on a coastal fish: implications of a spatially-explicit individual-based model

    NASA Astrophysics Data System (ADS)

    Rose, K.; Creekmore, S.; Thomas, P.; Craig, K.; Neilan, R.; Rahman, S.; Wang, L.; Justic, D.

    2016-02-01

    The northwestern Gulf of Mexico (USA) currently experiences a large hypoxic area ("dead zone") during the summer. The population-level effects of hypoxia on coastal fish are largely unknown. We developed a spatially-explicit, individual-based model to analyze how hypoxia effects on reproduction, growth, and mortality of individual Atlantic croaker could lead to population-level responses. The model follows the hourly growth, mortality, reproduction, and movement of individuals on a 300 x 800 spatial grid of 1 km2 cells for 140 years. Chlorophyll-a concentration and water temperature were specified daily for each grid cell. Dissolved oxygen (DO) was obtained from a 3-D water quality model for four years that differed in their severity of hypoxia. A bioenergetics model was used to represent growth, mortality was assumed stage- and age-dependent, and movement behavior was based on temperature preferences and avoidance of low DO. Hypoxia effects were imposed using exposure-effects sub-models that converted time-varying exposure to DO to reductions in growth and fecundity, and increases in mortality. Using sequences of mild, intermediate, and severe hypoxia years, the model predicted a 20% decrease in population abundance. Additional simulations were performed under the assumption that river-based nutrient loadings that lead to more hypoxia also lead to higher primary production and more food for croaker. Twenty-five percent and 50% nutrient reduction scenarios were simulated by adjusting the chlorophyll-a concentrations used as a food proxy for the croaker. We then incrementally increased the DO concentrations to determine how much hypoxia would need to be reduced to offset the lower food production resulting from reduced nutrients. We discuss the generality of our results, the hidden effects of hypoxia on fish, and our overall strategy of combining laboratory and field studies with modeling to produce robust predictions of population responses to stressors under dynamic and multi-stressor conditions.
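The exposure-effects sub-models map time-varying DO exposure onto vital rates. A hedged sketch of one such mapping, a smooth dose-response that inflates baseline mortality as DO falls below a threshold (functional form and all parameter values are hypothetical, not the croaker model's fitted curves):

```python
import math

def hypoxia_mortality_multiplier(do_mg_l, threshold=2.0, steepness=2.5, max_extra=1.0):
    """Multiplier applied to baseline mortality rate.

    Approximately 1.0 in well-oxygenated water; rises smoothly toward
    (1 + max_extra) as dissolved oxygen drops below the threshold.
    All parameters are illustrative placeholders.
    """
    return 1.0 + max_extra / (1.0 + math.exp(steepness * (do_mg_l - threshold)))
```

In the full model, analogous curves would scale growth and fecundity downward rather than mortality upward.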

  5. A Micromechanics-Based Damage Model for [+/- Theta/90n]s Composite Laminates

    NASA Technical Reports Server (NTRS)

    Mayugo, Joan-Andreu; Camanho, Pedro P.; Maimi, Pere; Davila, Carlos G.

    2006-01-01

    A new damage model based on a micromechanical analysis of cracked [+/- Theta/90n]s laminates subjected to multiaxial loads is proposed. The model predicts the onset and accumulation of transverse matrix cracks in uniformly stressed laminates, the effect of matrix cracks on the stiffness of the laminate, as well as the ultimate failure of the laminate. The model also accounts for the effect of the ply thickness on the ply strength. Predictions relating the elastic properties of several laminates and multiaxial loads are presented.

  6. The Effectiveness of Learning Model of Basic Education with Character-Based at Universitas Muslim Indonesia

    ERIC Educational Resources Information Center

    Rosmiati, Rosmiati; Mahmud, Alimuddin; Talib, Syamsul B.

    2016-01-01

    The purpose of this study was to determine the effectiveness of the basic education learning model with character-based through learning in the Universitas Muslim Indonesia. In addition, the research specifically examines the character of discipline, curiosity and responsibility. The specific target is to produce a basic education learning model…

  7. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using experimental method and background survey. The…

  8. A Modeling-Based College Algebra Course and Its Effect on Student Achievement

    ERIC Educational Resources Information Center

    Ellington, Aimee J.

    2005-01-01

    In Fall 2004, Virginia Commonwealth University (VCU) piloted a modeling-based approach to college algebra. This paper describes the course and an assessment that was conducted to determine the effect of this approach on student achievement in comparison to a traditional approach to college algebra. The results show that compared with their…

  9. Devil is in the details: Using logic models to investigate program process.

    PubMed

    Peyton, David J; Scicchitano, Michael

    2017-12-01

    Theory-based logic models are commonly developed as part of requirements for grant funding. As tools for communicating complex social programs, theory-based logic models are effective visual aids. After initial development, however, they are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities, so as to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management.

  10. A New Ductility Exhaustion Model for High Temperature Low Cycle Fatigue Life Prediction of Turbine Disk Alloys

    NASA Astrophysics Data System (ADS)

    Zhu, Shun-Peng; Huang, Hong-Zhong; Li, Haiqing; Sun, Rui; Zuo, Ming J.

    2011-06-01

    Based on ductility exhaustion theory and the generalized energy-based damage parameter, a new viscosity-based life prediction model is introduced to account for mean strain/stress effects in the low cycle fatigue regime. The loading waveform parameters and cyclic hardening effects are also incorporated within this model. It is assumed that damage accrues by means of viscous flow and that ductility consumption is related only to plastic strain and creep strain under high temperature low cycle fatigue conditions. In the developed model, dynamic viscosity is used to describe the flow behavior. This model provides a better prediction of Superalloy GH4133's fatigue behavior when compared to Goswami's ductility model and the generalized damage parameter. Under non-zero mean strain conditions, moreover, the proposed model predicts Superalloy GH4133's fatigue behavior more accurately than the comparison models do.
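The core ductility-exhaustion idea can be sketched in its simplest generic form: life is exhausted when the ductility consumed per cycle by plastic and creep strain sums to the available ductility. This is a hedged illustration of the underlying concept only, not the authors' viscosity-based formulation (which additionally involves dynamic viscosity, waveform, and cyclic hardening terms); all names and values are hypothetical:

```python
def cycles_to_failure(available_ductility, plastic_strain_per_cycle,
                      creep_strain_per_cycle=0.0):
    """Generic ductility-exhaustion estimate: N_f = D / (d_p + d_c),

    where D is the available ductility and d_p, d_c are the plastic and
    creep strain consumed per cycle. Illustrative sketch only.
    """
    consumed = plastic_strain_per_cycle + creep_strain_per_cycle
    return available_ductility / consumed

# Hypothetical values: creep at high temperature shortens predicted life.
life_no_creep = cycles_to_failure(0.3, 0.001)
life_with_creep = cycles_to_failure(0.3, 0.001, 0.0005)
```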

  11. Minimizing Concentration Effects in Water-Based, Laminar-Flow Condensation Particle Counters

    PubMed Central

    Lewis, Gregory S.; Hering, Susanne V.

    2013-01-01

    Concentration effects in water condensation systems, such as used in the water-based condensation particle counter, are explored through numeric modeling and direct measurements. Modeling shows that the condensation heat release and vapor depletion associated with particle activation and growth lowers the peak supersaturation. At higher number concentrations, the diameter of the droplets formed is smaller, and the threshold particle size for activation is higher. This occurs in both cylindrical and parallel plate geometries. For water-based systems we find that condensational heat release is more important than is vapor depletion. We also find that concentration effects can be minimized through use of smaller tube diameters, or more closely spaced parallel plates. Experimental measurements of droplet diameter confirm modeling results. PMID:24436507

  12. A Physically Based Analytical Model to Describe Effective Excess Charge for Streaming Potential Generation in Water Saturated Porous Media

    NASA Astrophysics Data System (ADS)

    Guarracino, L.; Jougnot, D.

    2018-01-01

    Among the different contributions generating self-potential, the streaming potential is of particular interest in hydrogeology for its sensitivity to water flow. Estimating water flux in porous media using streaming potential data relies on our capacity to understand, model, and upscale the electrokinetic coupling at the mineral-solution interface. Different approaches have been proposed to predict streaming potential generation in porous media. One of these approaches is flux averaging, which is based on determining the excess charge that is effectively dragged through the medium by water flow. In this study, we develop a physically based analytical model to predict the effective excess charge in saturated porous media using a flux-averaging approach in a bundle of capillary tubes with a fractal pore size distribution. The proposed model allows the determination of the effective excess charge as a function of pore water ionic concentration and hydrogeological parameters like porosity, permeability, and tortuosity. The new model has been successfully tested against different sets of experimental data from the literature. One of the main findings of this study is a mechanistic explanation for the empirical dependence between the effective excess charge and the permeability that has been found by several researchers. The proposed model also highlights the link to other lithological properties, and it is able to reproduce the evolution of effective excess charge with electrolyte concentration.
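The empirical dependence mentioned at the end is usually expressed as a log-log regression of effective excess charge against permeability. A sketch of that empirical form (the coefficients below are illustrative placeholders in the spirit of published fits, not the values derived by this paper's analytical model):

```python
import math

def effective_excess_charge(k_m2, a=-9.2, b=-0.82):
    """Empirical log-log relation log10(Qv) = a + b * log10(k).

    k_m2: permeability in m^2; returns effective excess charge (C/m^3).
    Coefficients a, b are illustrative, not fitted values from this study.
    """
    return 10.0 ** (a + b * math.log10(k_m2))
```

The negative slope b captures the key observation: less permeable media drag proportionally more excess charge per unit volume of flowing water.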

  13. Modeling the Atmospheric Phase Effects of a Digital Antenna Array Communications System

    NASA Technical Reports Server (NTRS)

    Tkacenko, A.

    2006-01-01

    In an antenna array system such as that used in the Deep Space Network (DSN) for satellite communication, it is often necessary to account for the effects due to the atmosphere. Typically, the atmosphere induces amplitude and phase fluctuations on the transmitted downlink signal that invalidate the assumed stationarity of the signal model. The degree to which these perturbations affect the stationarity of the model depends both on parameters of the atmosphere, including wind speed and turbulence strength, and on parameters of the communication system, such as the sampling rate used. In this article, we focus on modeling the atmospheric phase fluctuations in a digital antenna array communications system. Based on a continuous-time statistical model for the atmospheric phase effects, we show how to obtain a related discrete-time model based on sampling the continuous-time process. The effects of the nonstationarity of the resulting signal model are investigated using the sample matrix inversion (SMI) algorithm for minimum mean-squared error (MMSE) equalization of the received signal.
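The SMI algorithm itself is standard: the MMSE weights w = R⁻¹p are computed from a sample estimate of the received covariance R and the cross-correlation p with the desired signal. A minimal 2-tap real-valued sketch on a hypothetical signal (not the article's DSN array configuration, which uses complex baseband samples and larger weight vectors):

```python
import random

def smi_mmse_weights(x, d):
    """2-tap MMSE equalizer weights via sample matrix inversion: w = R^-1 p.

    x: received samples, d: desired (training) samples, both real-valued.
    """
    r00 = r01 = r11 = p0 = p1 = 0.0
    n = len(x) - 1
    for k in range(1, len(x)):
        x0, x1 = x[k], x[k - 1]                  # current and previous samples
        r00 += x0 * x0; r01 += x0 * x1; r11 += x1 * x1
        p0 += x0 * d[k]; p1 += x1 * d[k]
    r00 /= n; r01 /= n; r11 /= n; p0 /= n; p1 /= n
    det = r00 * r11 - r01 * r01                  # invert the 2x2 sample covariance
    return ((r11 * p0 - r01 * p1) / det,
            (r00 * p1 - r01 * p0) / det)
```

Under nonstationary atmospheric phase, R and p drift between snapshot blocks, which is exactly the degradation the article quantifies.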

  14. Individual-based modelling and control of bovine brucellosis

    NASA Astrophysics Data System (ADS)

    Nepomuceno, Erivelton G.; Barbosa, Alípio M.; Silva, Marcos X.; Perc, Matjaž

    2018-05-01

    We present a theoretical approach to control bovine brucellosis. We have used individual-based modelling, which is a network-type alternative to compartmental models. Our model thus considers heterogeneous populations, and spatial aspects such as migration among herds and control actions described as pulse interventions are also easily implemented. We show that individual-based modelling reproduces the mean field behaviour of an equivalent compartmental model. Details of this process, as well as flowcharts, are provided to facilitate the reproduction of the presented results. We further investigate three numerical examples using real parameters of herds in the São Paulo state of Brazil, in scenarios which explore eradication, continuous and pulsed vaccination and meta-population effects. The obtained results are in good agreement with the expected behaviour of this disease, which ultimately showcases the effectiveness of our theory.
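To illustrate the network-type alternative the abstract describes, here is a hedged individual-based SIR sketch whose mean field corresponds to the usual compartmental equations (simple SIR rather than a brucellosis-specific model; all parameters are hypothetical, not the São Paulo herd data):

```python
import random

def ibm_sir(n=500, beta=0.3, gamma=0.1, i0=5, days=200, seed=1):
    """Individual-based SIR: each individual carries its own state.

    The per-susceptible infection probability 1-(1-beta/n)**I matches the
    compartmental force of infection beta*I/N for small beta/n.
    Parameters are illustrative, not fitted to bovine brucellosis data.
    """
    rng = random.Random(seed)
    state = ['I'] * i0 + ['S'] * (n - i0)
    infected_curve = []
    for _ in range(days):
        n_inf = state.count('I')
        p_inf = 1.0 - (1.0 - beta / n) ** n_inf
        for j, s in enumerate(state):
            if s == 'S' and rng.random() < p_inf:
                state[j] = 'I'
            elif s == 'I' and rng.random() < gamma:
                state[j] = 'R'
        infected_curve.append(n_inf)
    return state, infected_curve
```

Herd structure, migration, and pulsed vaccination of the kind studied in the paper would be layered on by partitioning individuals into herds and moving or immunizing them at intervention times.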

  15. Comparing species distribution models constructed with different subsets of environmental predictors

    USGS Publications Warehouse

    Bucklin, David N.; Basille, Mathieu; Benscoter, Allison M.; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.; Speroterra, Carolina; Watling, James I.

    2014-01-01

    Our results indicate that additional predictors have relatively minor effects on the accuracy of climate-based species distribution models and minor to moderate effects on spatial predictions. We suggest that implementing species distribution models with only climate predictors may provide an effective and efficient approach for initial assessments of environmental suitability.

  16. Validating and Optimizing the Effects of Model Progression in Simulation-Based Inquiry Learning

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton; Anjewierden, Anjo; Bollen, Lars

    2012-01-01

    Model progression denotes the organization of the inquiry learning process in successive phases of increasing complexity. This study investigated the effectiveness of model progression in general, and explored the added value of either broadening or narrowing students' possibilities to change model progression phases. Results showed that…

  17. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  18. Linking population viability, habitat suitability, and landscape simulation models for conservation planning

    Treesearch

    Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley

    2004-01-01

    Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...

  19. Investigating the state of physiologically based kinetic modelling practices and challenges associated with gaining regulatory acceptance of model applications

    EPA Science Inventory

    Physiologically based kinetic (PBK) models are used widely throughout a number of working sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. Use of these models has increased over...

  20. Cost-effectiveness in Clostridium difficile treatment decision-making

    PubMed Central

    Nuijten, Mark JC; Keller, Josbert J; Visser, Caroline E; Redekop, Ken; Claassen, Eric; Speelman, Peter; Pronk, Marja H

    2015-01-01

    AIM: To develop a framework for the clinical and health economic assessment of the management of Clostridium difficile infection (CDI). METHODS: CDI has vast economic consequences, emphasizing the need for innovative and cost-effective solutions, which were the aim of this study. A guidance model was developed for coverage decisions and guideline development in CDI. The model included pharmacotherapy with oral metronidazole or oral vancomycin, which is the mainstay of pharmacological treatment for CDI and is recommended by most treatment guidelines. RESULTS: A design for a patient-based cost-effectiveness model was developed, which can be used to estimate the cost-effectiveness of current and future treatment strategies in CDI. Patient-based outcomes were extrapolated to the population by including factors such as person-to-person transmission, isolation precautions, and the closing and cleaning of hospital wards. CONCLUSION: The proposed framework for a population-based CDI model may be used for clinical and health economic assessments of CDI guidelines and coverage decisions for emerging treatments for CDI. PMID:26601096

  2. Multi-allelic haplotype model based on genetic partition for genomic prediction and variance component estimation using SNP markers.

    PubMed

    Da, Yang

    2015-12-18

    The amount of functional genomic information has been growing rapidly but remains largely unused in genomic selection. Genomic prediction and estimation using haplotypes in genome regions with functional elements such as all genes of the genome can be an approach to integrate functional and structural genomic information for genomic selection. Towards this goal, this article develops a new haplotype approach for genomic prediction and estimation. A multi-allelic haplotype model treating each haplotype as an 'allele' was developed for genomic prediction and estimation based on the partition of a multi-allelic genotypic value into additive and dominance values. Each additive value is expressed as a function of h - 1 additive effects, where h = number of alleles or haplotypes, and each dominance value is expressed as a function of h(h - 1)/2 dominance effects. For a sample of q individuals, the limit number of effects is 2q - 1 for additive effects and is the number of heterozygous genotypes for dominance effects. Additive values are factorized as a product between the additive model matrix and the h - 1 additive effects, and dominance values are factorized as a product between the dominance model matrix and the h(h - 1)/2 dominance effects. The genomic additive relationship matrix is defined as a function of the haplotype model matrix for additive effects, and the genomic dominance relationship matrix is defined as a function of the haplotype model matrix for dominance effects. Based on these results, a mixed model implementation for genomic prediction and variance component estimation that jointly uses haplotypes and single markers is established, including two computing strategies for genomic prediction and variance component estimation with identical results.
The multi-allelic genetic partition fills a theoretical gap in genetic partition by providing general formulations for partitioning multi-allelic genotypic values and provides a haplotype method based on the quantitative genetics model towards the utilization of functional and structural genomic information for genomic prediction and estimation.
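    The additive part of this partition can be sketched numerically: treating each distinct haplotype as an "allele", build a haplotype incidence matrix and form a GBLUP-style relationship matrix from it. The haplotypes below are an invented toy example, and a full model of the kind described would also carry the h(h - 1)/2 dominance effects, which are omitted here:

```python
import numpy as np

# Four fully specified diploid individuals; each carries two haplotypes.
haps = [("AB", "AB"), ("AB", "ab"), ("ab", "ab"), ("Ab", "ab")]
alleles = sorted({h for pair in haps for h in pair})   # distinct haplotype "alleles"

# Incidence matrix Z: copy count of each haplotype allele per individual.
Z = np.array([[pair.count(a) for a in alleles] for pair in haps], float)

p = Z.mean(axis=0) / 2                       # haplotype allele frequencies
Zc = Z - 2 * p                               # centred incidence matrix
# GBLUP-style additive relationship matrix built from haplotype counts.
G = Zc @ Zc.T / (2 * np.sum(p * (1 - p)))
```

    Swapping single-marker counts for haplotype counts is exactly what lets such a matrix absorb local epistatic effects among the markers inside each block.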

  3. Physically based model for extracting dual permeability parameters using non-Newtonian fluids

    NASA Astrophysics Data System (ADS)

    Abou Najm, M. R.; Basset, C.; Stewart, R. D.; Hauswirth, S.

    2017-12-01

    Dual permeability models are effective for the assessment of flow and transport in structured soils with two dominant structures. The major challenge to those models remains the ability to determine appropriate and unique parameters through affordable, simple, and non-destructive methods. This study investigates the use of water and a non-Newtonian fluid in saturated flow experiments to derive physically based parameters required for improved flow predictions using dual permeability models. We assess the ability of these two fluids to accurately estimate the representative pore sizes in dual-domain soils, by determining the effective pore sizes of macropores and micropores. We developed two sub-models that solve for the effective macropore size assuming either cylindrical (e.g., biological pores) or planar (e.g., shrinkage cracks and fissures) pore geometries, with the micropores assumed to be represented by a single effective radius. Furthermore, the model solves for the percent contribution to flow (wi) corresponding to the representative macropores and micropores. A user-friendly solver was developed to numerically solve the system of equations, given that relevant non-Newtonian viscosity models lack forms conducive to analytical integration. The proposed dual-permeability model is a unique attempt to derive physically based parameters capable of measuring dual hydraulic conductivities, and therefore may be useful in reducing parameter uncertainty and improving hydrologic model predictions.
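    The core inversion idea can be illustrated for the cylindrical-pore case with a Newtonian fluid, where Hagen-Poiseuille flow gives a closed form; the numbers below are illustrative assumptions, not the study's parameters (the non-Newtonian case needs the numerical solver the abstract describes, and planar cracks follow a cubic law in aperture instead):

```python
import math

def effective_radius(Q, dP_dx, mu=1.0e-3):
    """Invert Hagen-Poiseuille, Q = (pi * r**4 / (8 * mu)) * dP/dx, for r.

    Q in m^3/s, dP/dx in Pa/m, mu in Pa.s (default: water at ~20 C).
    """
    return (8 * mu * Q / (math.pi * dP_dx)) ** 0.25

r_true = 5e-4                                       # assumed 0.5 mm macropore radius
dP_dx = 100.0                                       # assumed pressure gradient, Pa/m
Q = math.pi * r_true ** 4 / (8 * 1.0e-3) * dP_dx    # forward model: laminar flow rate
r_est = effective_radius(Q, dP_dx)                  # recover the radius from the flow
```

    Running the same experiment with a second fluid of different rheology yields an independent equation, which is what makes the macropore/micropore pair identifiable.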

  4. The creation and evaluation of a model predicting the probability of conception in seasonal-calving, pasture-based dairy cows.

    PubMed

    Fenlon, Caroline; O'Grady, Luke; Doherty, Michael L; Dunnion, John; Shalloo, Laurence; Butler, Stephen T

    2017-07-01

    Reproductive performance in pasture-based production systems has a fundamentally important effect on economic efficiency. The individual factors affecting the probability of submission and conception are multifaceted and have been extensively researched. The present study analyzed some of these factors in relation to service-level probability of conception in seasonal-calving pasture-based dairy cows to develop a predictive model of conception. Data relating to 2,966 services from 737 cows on 2 research farms were used for model development and data from 9 commercial dairy farms were used for model testing, comprising 4,212 services from 1,471 cows. The data spanned a 15-yr period and originated from seasonal-calving pasture-based dairy herds in Ireland. The calving season for the study herds extended from January to June, with peak calving in February and March. A base mixed-effects logistic regression model was created using a stepwise model-building strategy and incorporated parity, days in milk, interservice interval, calving difficulty, and predicted transmitting abilities for calving interval and milk production traits. To attempt to further improve the predictive capability of the model, the addition of effects that were not statistically significant was considered, resulting in a final model composed of the base model with the inclusion of BCS at service. The models' predictions were evaluated using discrimination to measure their ability to correctly classify positive and negative cases. Precision, recall, F-score, and area under the receiver operating characteristic curve (AUC) were calculated. Calibration tests measured the accuracy of the predicted probabilities. These included tests of overall goodness-of-fit, bias, and calibration error. Both models performed better than using the population average probability of conception. 
Neither of the models showed high levels of discrimination (base model AUC 0.61, final model AUC 0.62), possibly because of the narrow central range of conception rates in the study herds. The final model was found to reliably predict the probability of conception without bias when evaluated against the full external data set, with a mean absolute calibration error of 2.4%. The chosen model could be used to support a farmer's decision-making and in stochastic simulation of fertility in seasonal-calving pasture-based dairy cows. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
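    The two evaluation steps described above, discrimination via AUC and calibration via mean absolute calibration error, can be sketched on synthetic data; the probabilities and outcomes below are simulated, not the study's herd records:

```python
import numpy as np

def auc(y, p):
    """Probability that a random positive outranks a random negative."""
    pos, neg = p[y == 1], p[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

def mean_abs_calibration_error(y, p, bins=10):
    """Average |observed event rate - mean predicted probability| over occupied bins."""
    edges = np.linspace(0, 1, bins + 1)
    errs = [abs(y[(p >= lo) & (p < hi)].mean() - p[(p >= lo) & (p < hi)].mean())
            for lo, hi in zip(edges[:-1], edges[1:]) if ((p >= lo) & (p < hi)).any()]
    return float(np.mean(errs))

rng = np.random.default_rng(0)
p = rng.uniform(0.3, 0.7, 2000)               # predicted conception probabilities
y = (rng.uniform(size=2000) < p).astype(int)  # outcomes consistent with the predictions
a = auc(y, p)
cal = mean_abs_calibration_error(y, p)
```

    Note that even this perfectly calibrated simulator yields only a modest AUC, because the predicted probabilities span a narrow range; this mirrors the paper's point that a narrow central range of conception rates limits discrimination.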

  5. The Compass Rose Effectiveness Model

    ERIC Educational Resources Information Center

    Spiers, Cynthia E.; Kiel, Dorothy; Hohenrink, Brad

    2008-01-01

    The effectiveness model focuses the institution on mission achievement through assessment and improvement planning. Eleven mission criteria, measured by key performance indicators, are aligned with the accountability interest of internal and external stakeholders. A Web-based performance assessment application supports the model, documenting the…

  6. Analysis of thermal effects in endoscopic nanocarriers-based photodynamic therapy applied to esophageal diseases

    NASA Astrophysics Data System (ADS)

    Salas-García, I.; Fanjul-Vélez, F.; Ortega-Quijano, N.; Wilfert, O.; Hudcova, L.; Poliak, J.; Barcik, P.; Arce-Diego, J. L.

    2014-02-01

    In this work we propose a predictive model that allows the study of thermal effects produced when the optical radiation interacts with an esophageal or stomach disease with gold nanoparticles embedded. The model takes into account light distribution in the tumor tissue by means of a Monte Carlo method. Mie theory is used to obtain the gold nanoparticles optical properties and the thermal model employed is based on the bio-heat equation. The complete model was applied to two types of tumoral tissue (squamous cell carcinoma located in the esophagus and adenocarcinoma in the stomach) in order to study the thermal effects induced by the inclusion of gold nanoparticles.

  7. Two levels ARIMAX and regression models for forecasting time series data with calendar variation effects

    NASA Astrophysics Data System (ADS)

    Suhartono; Lee, Muhammad Hisyam; Prastyo, Dedy Dwi

    2015-12-01

    The aim of this research is to develop a calendar variation model for forecasting retail sales data with the Eid ul-Fitr effect. The proposed model is based on two methods, namely two-level ARIMAX and regression methods, with ARIMAX used for the first level and regression for the second level. Monthly men's jeans and women's trousers sales in a retail company for the period January 2002 to September 2009 are used as a case study. In general, the two-level calendar variation model yields two component models: the first reconstructs the sales pattern that has already occurred, and the second forecasts the increase in sales due to Eid ul-Fitr, which affects sales in the same and the previous months. The results show that the proposed two-level calendar variation model based on ARIMAX and regression methods yields better forecasts than the seasonal ARIMA model and neural networks.

  8. OPC modeling by genetic algorithm

    NASA Astrophysics Data System (ADS)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Tsay, C. S.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.; Lin, B. J.

    2005-05-01

    Optical proximity correction (OPC) is usually used to pre-distort mask layouts to make the printed patterns as close to the desired shapes as possible. For model-based OPC, a lithographic model is needed to predict critical dimensions after lithographic processing. The model is usually obtained via a regression of parameters based on experimental data containing optical proximity effects. When the parameters involve a mix of continuous (optical and resist model) and discrete (kernel number) sets, traditional numerical optimization methods may have difficulty handling the model fitting. In this study, an artificial-intelligence optimization method was used to regress the parameters of the lithographic models for OPC. The implemented phenomenological models were constant-threshold models that combine diffused aerial image models with loading effects. Optical kernels decomposed from Hopkins' equation were used to calculate aerial images on the wafer. Similarly, the numbers of optical kernels were treated as regression parameters. In this way, good regression results were obtained with different sets of optical proximity effect data.
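    The mixed continuous/discrete regression can be sketched with a toy genetic algorithm; the "model" below is a deliberately simple linear stand-in (one continuous threshold, one discrete kernel count), not a lithographic model, and every parameter is invented:

```python
import random

def fitness(ind, data):
    """Negative sum of squared errors of the toy model y = thresh + 0.5*nker*x."""
    thresh, nker = ind
    return -sum((y - (thresh + 0.5 * nker * x)) ** 2 for x, y in data)

def evolve(data, pop_size=30, gens=60, seed=2):
    rng = random.Random(seed)
    # Each individual mixes a continuous gene (thresh) and a discrete gene (nker).
    pop = [(rng.uniform(0, 1), rng.randint(1, 8)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: fitness(ind, data), reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)       # uniform crossover
            child = (a[0] if rng.random() < 0.5 else b[0],
                     a[1] if rng.random() < 0.5 else b[1])
            if rng.random() < 0.3:              # mutate continuous gene
                child = (min(1.0, max(0.0, child[0] + rng.gauss(0, 0.05))), child[1])
            if rng.random() < 0.3:              # mutate discrete gene
                child = (child[0], rng.randint(1, 8))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(ind, data))

true_thresh, true_nker = 0.4, 3
data = [(x / 10, true_thresh + 0.5 * true_nker * (x / 10)) for x in range(11)]
best = evolve(data)
```

    The point of the GA here is exactly the one in the abstract: crossover and mutation handle the integer gene and the real-valued gene uniformly, where a gradient-based fit could not.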

  9. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm that is also described in this paper. PMID:19796403

  10. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Challenges associated with this modeling method include the significant computational requirements of tracking thousands to millions of agents and the lack of established methods for model validation and for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing, and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics, and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications.
    A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in a thermodynamic framework as a set of reactions that roll up the integrated effect that diverse biological communities exert on a geological system. This approach may work well for predicting the effect of certain biological communities in specific environments for which experimental data are available. However, it does not further our knowledge of how the geobiological system actually functions on a micro scale. Agent-based techniques may provide a framework to explore the fundamental interactions required to explain the system-wide behavior. This presentation surveys several promising applications of agent-based modeling approaches to problems in the geosciences and describes specific contributions to some of the inherent challenges facing this approach.

  11. Evaluation of some random effects methodology applicable to bird ringing data

    USGS Publications Warehouse

    Burnham, K.P.; White, Gary C.

    2002-01-01

    Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S1, ..., Sk; random effects can then be a useful model: Si = E(S) + εi. Here, the temporal variation in survival probability is treated as random with variance E(ε²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional component for var(Ŝ | S). Furthermore, the random effects model leads to shrinkage estimates, S̃i, as improved (in mean square error) estimators of Si compared to the MLE, Ŝi, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about Si based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: Si ≡ S (no effects), Si = E(S) + εi (random effects), and S1, ..., Sk (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed effects MLE for the Si.
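    The shrinkage construction can be sketched as follows; the method-of-moments variance estimate and the simulated survival values are illustrative stand-ins for MARK's likelihood-based machinery:

```python
import numpy as np

def shrink(S_hat, v):
    """Shrink MLEs S_hat (with sampling variances v) toward the overall mean.

    Returns the shrinkage estimates and the estimated process variance sigma^2.
    """
    mean = np.average(S_hat, weights=1 / v)
    # Method-of-moments process variance, truncated at zero:
    # total spread of the MLEs minus their average sampling variance.
    sigma2 = max(float(np.mean((S_hat - mean) ** 2) - np.mean(v)), 0.0)
    B = sigma2 / (sigma2 + v)                 # per-year shrinkage weights
    return mean + B * (S_hat - mean), sigma2

rng = np.random.default_rng(3)
true_S = 0.6 + rng.normal(0, 0.05, 15)        # annual survival, process sd 0.05
v = np.full(15, 0.03 ** 2)                    # sampling variances of the MLEs
S_hat = true_S + rng.normal(0, 0.03, 15)      # noisy "MLEs"
S_tilde, sigma2 = shrink(S_hat, v)
```

    When the sampling noise dominates the process variation, B is small and the estimates are pulled strongly toward the mean, which is where the mean-square-error gain over the raw MLEs comes from.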

  12. Modelling strategies to predict the multi-scale effects of rural land management change

    NASA Astrophysics Data System (ADS)

    Bulygina, N.; Ballard, C. E.; Jackson, B. M.; McIntyre, N.; Marshall, M.; Reynolds, B.; Wheater, H. S.

    2011-12-01

    Changes to the rural landscape due to agricultural land management are ubiquitous, yet predicting the multi-scale effects of land management change on hydrological response remains an important scientific challenge. Much empirical research has been of little generic value due to inadequate design and funding of monitoring programmes, while the modelling issues challenge the capability of data-based, conceptual and physics-based modelling approaches. In this paper we report on a major UK research programme, motivated by a national need to quantify effects of agricultural intensification on flood risk. Working with a consortium of farmers in upland Wales, a multi-scale experimental programme (from experimental plots to 2nd order catchments) was developed to address issues of upland agricultural intensification. This provided data support for a multi-scale modelling programme, in which highly detailed physics-based models were conditioned on the experimental data and used to explore effects of potential field-scale interventions. A meta-modelling strategy was developed to represent detailed modelling in a computationally-efficient manner for catchment-scale simulation; this allowed catchment-scale quantification of potential management options. For more general application to data-sparse areas, alternative approaches were needed. Physics-based models were developed for a range of upland management problems, including the restoration of drained peatlands, afforestation, and changing grazing practices. Their performance was explored using literature and surrogate data; although subject to high levels of uncertainty, important insights were obtained, of practical relevance to management decisions. In parallel, regionalised conceptual modelling was used to explore the potential of indices of catchment response, conditioned on readily-available catchment characteristics, to represent ungauged catchments subject to land management change. 
    Although based in part on speculative relationships, significant predictive power was derived from this approach. Finally, using a formal Bayesian procedure, these different sources of information were combined with local flow data in a catchment-scale conceptual model application, i.e. using small-scale physical properties, regionalised signatures of flow, and available flow measurements.

  13. Modeling fuels and fire effects in 3D: Model description and applications

    Treesearch

    Francois Pimont; Russell Parsons; Eric Rigolot; Francois de Coligny; Jean-Luc Dupuy; Philippe Dreyfus; Rodman R. Linn

    2016-01-01

    Scientists and managers critically need ways to assess how fuel treatments alter fire behavior, yet few tools currently exist for this purpose. We present a spatially explicit fuel-modeling system, FuelManager, which models fuels, vegetation growth, fire behavior (using a physics-based model, FIRETEC), and fire effects. FuelManager's flexible approach facilitates...

  14. Explicit continuous charge-based compact model for long channel heavily doped surrounding-gate MOSFETs incorporating interface traps and quantum effects

    NASA Astrophysics Data System (ADS)

    Hamzah, Afiq; Hamid, Fatimah A.; Ismail, Razali

    2016-12-01

    An explicit solution for long-channel surrounding-gate (SRG) MOSFETs is presented from intrinsic to heavily doped body including the effects of interface traps and fixed oxide charges. The solution is based on the core SRGMOSFETs model of the Unified Charge Control Model (UCCM) for heavily doped conditions. The UCCM model of highly doped SRGMOSFETs is derived to obtain the exact equivalent expression as in the undoped case. Taking advantage of the undoped explicit charge-based expression, the asymptotic limits for below threshold and above threshold have been redefined to include the effect of trap states for heavily doped cases. After solving the asymptotic limits, an explicit mobile charge expression is obtained which includes the trap state effects. The explicit mobile charge model shows very good agreement with respect to numerical simulation over practical terminal voltages, doping concentration, geometry effects, and trap state effects due to the fixed oxide charges and interface traps. Then, the drain current is obtained using the Pao-Sah dual integral, which is expressed as a function of inversion charge densities at the source/drain ends. The drain current agreed well with the implicit solution and numerical simulation for all regions of operation without employing any empirical parameters. A comparison with previous explicit models has been conducted to verify the competency of the proposed model with the doping concentration of 1 × 10^19 cm^-3, as the proposed model has better advantages in terms of its simplicity and accuracy at a higher doping concentration.

  15. Capturing ecology in modeling approaches applied to environmental risk assessment of endocrine active chemicals in fish.

    PubMed

    Mintram, Kate S; Brown, A Ross; Maynard, Samuel K; Thorbek, Pernille; Tyler, Charles R

    2018-02-01

    Endocrine active chemicals (EACs) are widespread in freshwater environments and both laboratory- and field-based studies have shown reproductive effects in fish at environmentally relevant exposures. Environmental risk assessment (ERA) seeks to protect wildlife populations and prospective assessments rely on extrapolation from individual-level effects established for laboratory fish species to populations of wild fish using arbitrary safety factors. Population susceptibility to chemical effects, however, depends on exposure risk, physiological susceptibility, and population resilience, each of which can differ widely between fish species. Population models have significant potential to address these shortfalls and to include individual variability relating to life-history traits, demographic and density-dependent vital rates, and behaviors which arise from inter-organism and organism-environment interactions. Confidence in population models has recently resulted in the EU Commission stating that results derived from reliable models may be considered when assessing the relevance of adverse effects of EACs at the population level. This review critically assesses the potential risks posed by EACs for fish populations, considers the ecological factors influencing these risks and explores the benefits and challenges of applying population modeling (including individual-based modeling) in ERA for EACs in fish. We conclude that population modeling offers a way forward for incorporating greater environmental relevance in assessing the risks of EACs for fishes and for identifying key risk factors through sensitivity analysis. Individual-based models (IBMs) allow for the incorporation of physiological and behavioral endpoints relevant to EAC exposure effects, thus capturing both direct and indirect population-level effects.

  16. Development and validation of a physiology-based model for the prediction of pharmacokinetics/toxicokinetics in rabbits

    PubMed Central

    Hermes, Helen E.; Teutonico, Donato; Preuss, Thomas G.; Schneckener, Sebastian

    2018-01-01

    The environmental fates of pharmaceuticals and the effects of crop protection products on non-target species are subjects that are undergoing intense review. Since measuring the concentrations and effects of xenobiotics on all affected species under all conceivable scenarios is not feasible, standard laboratory animals such as rabbits are tested, and the observed adverse effects are translated to focal species for environmental risk assessments. In that respect, mathematical modelling is becoming increasingly important for evaluating the consequences of pesticides in untested scenarios. In particular, physiologically based pharmacokinetic/toxicokinetic (PBPK/TK) modelling is a well-established methodology used to predict tissue concentrations based on the absorption, distribution, metabolism and excretion of drugs and toxicants. In the present work, a rabbit PBPK/TK model is developed and evaluated with data available from the literature. The model predictions include scenarios of both intravenous (i.v.) and oral (p.o.) administration of small and large compounds. The presented rabbit PBPK/TK model predicts the pharmacokinetics (Cmax, AUC) of the tested compounds with an average 1.7-fold error. This result indicates a good predictive capacity of the model, which enables its use for risk assessment modelling and simulations. PMID:29561908

  17. Frequency Response Function Expansion for Unmeasured Translation and Rotation Dofs for Impedance Modelling Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, P.; O'Callahan, J.

    2003-07-01

    Inclusion of rotational effects is critical for the accuracy of the predicted system characteristics in almost all system modelling studies. However, experimentally derived information for the description of one or more of the components of the system will generally not have any rotational effects included in the description of the component. The lack of rotational effects has long affected the results from any system model development, whether using a modal-based approach or an impedance-based approach. Several new expansion processes are described herein for the development of FRFs needed for impedance-based system models. These techniques expand experimentally derived mode shapes, residual modes from the modal parameter estimation process, and FRFs directly, to allow for the inclusion of the necessary rotational dofs. The FRFs involving translational-to-rotational dofs are developed, as well as those involving rotational-to-rotational dofs. Examples are provided to show the use of these techniques.

  18. Particle swarm optimization algorithm based parameters estimation and control of epileptiform spikes in a neural mass model

    NASA Astrophysics Data System (ADS)

    Shan, Bonan; Wang, Jiang; Deng, Bin; Wei, Xile; Yu, Haitao; Zhang, Zhen; Li, Huiyan

    2016-07-01

    This paper proposes an epilepsy detection and closed-loop control strategy based on the Particle Swarm Optimization (PSO) algorithm. The proposed strategy can effectively suppress the epileptic spikes in neural mass models, where the epileptiform spikes are recognized as the biomarkers of transitions from the normal (interictal) activity to the seizure (ictal) activity. In addition, the PSO algorithm shows capabilities of accurate estimation for the time evolution of key model parameters and practical detection for all the epileptic spikes. The estimation effects of unmeasurable parameters are improved significantly compared with the unscented Kalman filter. When the estimated excitatory-inhibitory ratio exceeds a threshold value, the epileptiform spikes can be inhibited immediately by adopting a proportional-integral controller. Besides, numerical simulations are carried out to illustrate the effectiveness of the proposed method as well as the potential value for model-based early seizure detection and closed-loop control treatment design.
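    A minimal PSO parameter-estimation sketch is shown below; the swarm recovers two parameters of a toy quadratic loss, and the hyperparameters and target model are generic assumptions, not the paper's neural mass setup:

```python
import random

def pso(loss, dim=2, n=20, iters=100, lo=0.0, hi=10.0, seed=4):
    """Global-best particle swarm minimisation of `loss` over [lo, hi]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    pbest = [list(x) for x in X]                 # personal bests
    pbest_f = [loss(x) for x in X]
    g = list(pbest[pbest_f.index(min(pbest_f))]) # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull (standard weights)
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * r1 * (pbest[i][d] - X[i][d])
                           + 1.5 * r2 * (g[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            f = loss(X[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = list(X[i]), f
                if f < loss(g):
                    g = list(X[i])
    return g

target = [3.2, 7.5]                              # invented "true" parameters
best = pso(lambda x: sum((a - b) ** 2 for a, b in zip(x, target)))
```

    In the paper's setting the loss would instead compare simulated neural mass output against the recorded signal, so each fitness evaluation runs the model forward.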

  19. Pharmacokinetic/Pharmacodynamic Relationship of Gabapentin in a CFA-induced Inflammatory Hyperalgesia Rat Model.

    PubMed

    Larsen, Malte Selch; Keizer, Ron; Munro, Gordon; Mørk, Arne; Holm, René; Savic, Rada; Kreilgaard, Mads

    2016-05-01

    Gabapentin displays non-linear drug disposition, which complicates dosing for optimal therapeutic effect. Thus, the current study was performed to elucidate the pharmacokinetic/pharmacodynamic (PKPD) relationship of gabapentin's effect on mechanical hypersensitivity in a rat model of CFA-induced inflammatory hyperalgesia. A semi-mechanistic population-based PKPD model was developed using nonlinear mixed-effects modelling, based on gabapentin plasma and brain extracellular fluid (ECF) concentration-time data and measurements of CFA-evoked mechanical hyperalgesia following administration of a range of gabapentin doses (oral and intravenous). The plasma/brain ECF concentration-time profiles of gabapentin were adequately described by a two-compartment plasma model with a saturable intestinal absorption rate (Km = 44.1 mg/kg, Vmax = 41.9 mg/h∙kg) and dose-dependent oral bioavailability, linked to brain ECF concentration through a transit compartment. Brain ECF concentration was directly linked to a sigmoid Emax function describing reversal of hyperalgesia (EC50,plasma = 16.7 μg/mL, EC50,brain = 3.3 μg/mL). The proposed semi-mechanistic population-based PKPD model provides further insight into gabapentin's non-linear pharmacokinetics and the link between plasma/brain disposition and anti-hyperalgesic effects. The model suggests that intestinal absorption is the primary source of non-linearity and that the investigated rat model provides reasonable predictions of clinically effective plasma concentrations for gabapentin.
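
    The sigmoid Emax relationship named in the abstract has the standard form E = Emax·C^h / (EC50^h + C^h). A minimal sketch, in which only the brain-ECF EC50 of 3.3 μg/mL comes from the abstract while the Emax value and Hill coefficient are illustrative assumptions:

```python
import numpy as np

def sigmoid_emax(conc, emax, ec50, hill=1.0):
    """Standard sigmoid Emax model: effect as a function of drug concentration."""
    conc = np.asarray(conc, dtype=float)
    return emax * conc**hill / (ec50**hill + conc**hill)

# At a concentration equal to EC50, the effect is half-maximal by construction:
effect = sigmoid_emax(3.3, emax=100.0, ec50=3.3)  # -> 50.0
```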

  20. Efficacy of a surfactant-based wound dressing on biofilm control.

    PubMed

    Percival, Steven L; Mayer, Dieter; Salisbury, Anne-Marie

    2017-09-01

    The aim of this study was to evaluate the efficacy of both a nonantimicrobial and an antimicrobial (1% silver sulfadiazine, SSD) surfactant-based wound dressing in the control of Pseudomonas aeruginosa, Enterococcus sp, Staphylococcus epidermidis, Staphylococcus aureus, and methicillin-resistant S. aureus (MRSA) biofilms. Anti-biofilm efficacy was evaluated in several adapted American Society for Testing and Materials (ASTM) standard biofilm models and other bespoke biofilm models. The ASTM standard models employed included the minimum biofilm eradication concentration (MBEC) biofilm model (ASTM E2799) and the Centers for Disease Control (CDC) biofilm reactor model (ASTM 2871). The bespoke biofilm models included the filter biofilm model and the chamberslide biofilm model. Results showed complete kill of microorganisms within a biofilm using the antimicrobial surfactant-based wound dressing. Interestingly, the nonantimicrobial surfactant-based dressing could disrupt existing biofilms by causing biofilm detachment. Prior to biofilm detachment, we demonstrated, using confocal laser scanning microscopy (CLSM), the dispersive effect of the nonantimicrobial surfactant-based wound dressing on the biofilm within 10 minutes of treatment. Furthermore, the nonantimicrobial surfactant-based wound dressing caused an increase in microbial flocculation/aggregation, important for microbial concentration. In conclusion, this nonantimicrobial surfactant-based wound dressing leads to the effective detachment and dispersion of in vitro biofilms. The use of surfactant-based wound dressings in a clinical setting may help to disrupt existing biofilm in wound tissue and may increase the action of antimicrobial treatment. © 2017 by the Wound Healing Society.

  1. Dynamic updating atlas for heart segmentation with a nonlinear field-based model.

    PubMed

    Cai, Ken; Yang, Rongqian; Yue, Hongwei; Li, Lihua; Ou, Shanxing; Liu, Feng

    2017-09-01

    Segmentation of cardiac computed tomography (CT) images is an effective method for assessing the dynamic function of the heart and lungs. In the atlas-based heart segmentation approach, the quality of segmentation usually relies upon the atlas images, and the selection of those reference images is a key step; the goal of this selection is to have the reference images as close to the target image as possible. This study proposes an atlas dynamic update algorithm using a nonlinear deformation field scheme. The proposed method is based on the features among double-source CT (DSCT) slices. The extraction of these features forms a basis for constructing an average model, and the created reference atlas image is updated during the registration process. A nonlinear field-based model was used to effectively implement 4D cardiac segmentation. The proposed segmentation framework was validated with 14 4D cardiac CT sequences, and the algorithm achieved an acceptable accuracy (1.0-2.8 mm). Our proposed method, which combines a nonlinear field-based model with dynamic atlas-updating strategies, provides an effective and accurate way to perform whole heart segmentation. The success of the proposed method largely relies on the effective use of the prior knowledge of the atlas and the similarity explored among the to-be-segmented DSCT sequences. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Understanding the Effects of Users' Behaviors on Effectiveness of Different Exogenous Regulatory Common Pool Resource Management Institutions

    NASA Astrophysics Data System (ADS)

    Madani, K.; Dinar, A.

    2013-12-01

    The tragedy of the commons is generally recognized as one of the possible destinies of common pool resources (CPRs). To avoid the tragedy of the commons and prolong the life of CPRs, users may show different behavioral characteristics and use different rationales for CPR planning and management. Furthermore, regulators may adopt different strategies for sustainable management of CPRs. The effectiveness of different regulatory exogenous management institutions cannot be evaluated through conventional CPR models, since these assume either that users base their behavior on individual rationality and adopt a selfish (Nash) behavior, or that users seek the system's optimal solution without giving priority to their own interests. Therefore, conventional models fail to reliably predict the outcome of CPR problems in which parties may have a range of behavioral characteristics, putting them somewhere between the two types of behavior traditionally considered. This work examines the effectiveness of different regulatory exogenous CPR management institutions through a user-based model (as opposed to a system-based model). The new modeling framework allows for consideration of the sensitivity of the results to different behavioral characteristics of interacting CPR users. The suggested modeling approach is applied to a benchmark groundwater management problem. Results indicate that some well-known exogenous management institutions (e.g. taxing) are ineffective for sustainable management of CPRs in most cases. Bankruptcy-based management can be helpful, but determining a fair level of cutbacks remains challenging under this type of institution. Furthermore, some bankruptcy rules, such as the Constrained Equal Award (CEA) method, are more beneficial to wealthier users, failing to establish social justice. Quota-based and CPR status-based management perform as the most promising and robust regulatory exogenous institutions in prolonging the CPR's life and increasing the long-term benefits to its users.

  3. Calculating permittivity of semi-conductor fillers in composites based on simplified effective medium approximation models

    NASA Astrophysics Data System (ADS)

    Feng, Yefeng; Wu, Qin; Hu, Jianbing; Xu, Zhichao; Peng, Cheng; Xia, Zexu

    2018-03-01

    Interface-induced polarization has a significant impact on the permittivity of 0–3 type polymer composites with Si-based semi-conducting fillers. The polarity of the Si-based filler, the polarity of the polymer matrix, and the grain size of the filler are closely connected with the induced polarization and permittivity of the composites. However, unlike in 2–2 type composites, the real permittivity of Si-based fillers in 0–3 type composites cannot be directly measured. Therefore, obtaining the theoretical permittivity of fillers in 0–3 composites through effective medium approximation (EMA) models is necessary. In this work, the real permittivity of Si-based semi-conducting fillers in ten different 0–3 polymer composite systems was calculated by linear fitting of simplified EMA models, based on the particular features of the reported parameters of those composites. The results further confirmed the proposed interface-induced polarization, and likewise verified the significant influences of filler polarity, polymer polarity, and filler size on the induced polarization and permittivity of the composites. High self-consistency was obtained between the present modelling and prior measurements. This work may offer a facile and effective route to obtaining the difficult-to-measure dielectric properties of the discrete filler phase in some special polymer-based composite systems.
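
    The abstract does not specify which simplified EMA models were fitted. As one illustration of the linear-fitting idea, the logarithmic (Lichtenecker) mixing rule ln εc = f·ln εf + (1 − f)·ln εm is linear in the filler volume fraction f, so the filler permittivity εf can be recovered by a least-squares fit of composite data (the function name and synthetic data below are assumptions, not the paper's model):

```python
import numpy as np

def filler_permittivity_lichtenecker(f, eps_composite, eps_matrix):
    """Recover filler permittivity from composite data via a linear fit of the
    Lichtenecker rule: ln(eps_c) = f*ln(eps_f) + (1-f)*ln(eps_m)."""
    f = np.asarray(f, dtype=float)
    # Move the known matrix term to the left side; the slope in f is ln(eps_f).
    y = np.log(eps_composite) - (1 - f) * np.log(eps_matrix)
    slope = np.sum(f * y) / np.sum(f * f)   # least squares through the origin
    return np.exp(slope)

# Synthetic check: composite data generated with eps_f = 80 in a matrix of eps_m = 3.
f = np.array([0.05, 0.10, 0.20, 0.30])
eps_m, eps_f_true = 3.0, 80.0
eps_c = np.exp(f * np.log(eps_f_true) + (1 - f) * np.log(eps_m))
eps_f_est = filler_permittivity_lichtenecker(f, eps_c, eps_m)  # -> 80.0
```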

  4. Variable selection models for genomic selection using whole-genome sequence data and singular value decomposition.

    PubMed

    Meuwissen, Theo H E; Indahl, Ulf G; Ødegård, Jørgen

    2017-12-27

    Non-linear Bayesian genomic prediction models such as BayesA/B/C/R involve iterative, mostly Markov chain Monte Carlo (MCMC), algorithms, which are computationally expensive, especially when whole-genome sequence (WGS) data are analyzed. Singular value decomposition (SVD) of the genotype matrix can facilitate genomic prediction in large datasets, and can be used to estimate marker effects and their prediction error variances (PEV) in a computationally efficient manner. Here, we developed, implemented, and evaluated a direct, non-iterative method for the estimation of marker effects for the BayesC genomic prediction model. The BayesC model assumes a priori that markers have normally distributed effects with probability π and no effect with probability (1 − π). Marker effects and their PEV are estimated by using SVD, and the posterior probability of each marker having a non-zero effect is calculated. These posterior probabilities are used to obtain marker-specific effect variances, which are subsequently used to approximate BayesC estimates of marker effects in a linear model. A computer simulation study was conducted to compare alternative genomic prediction methods, in which a single reference generation was used to estimate marker effects that were subsequently used for 10 generations of forward prediction, for which accuracies were evaluated. SVD-based posterior probabilities of markers having non-zero effects were generally lower than MCMC-based posterior probabilities, but for some regions the opposite occurred, resulting in clear signals for QTL-rich regions. The accuracies of breeding values estimated using SVD- and MCMC-based BayesC analyses were similar across the 10 generations of forward prediction. For an intermediate number of generations (2 to 5) of forward prediction, accuracies obtained with the BayesC model tended to be slightly higher than those obtained using best linear unbiased prediction of SNP effects (the SNP-BLUP model). When marker density was reduced from WGS data to 30 K, SNP-BLUP tended to yield the highest accuracies, at least in the short term. Based on SVD of the genotype matrix, we developed a direct method for the calculation of BayesC estimates of marker effects. Although SVD- and MCMC-based marker effects differed slightly, their prediction accuracies were similar. Assuming that the SVD of the marker genotype matrix is already performed for other reasons (e.g. for SNP-BLUP), computation times for the BayesC predictions were comparable to those of SNP-BLUP.
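
    The BayesC-specific algebra is not reproduced in the abstract. As a sketch of the underlying idea that an SVD of the genotype matrix yields ridge-type (SNP-BLUP) marker-effect estimates without iteration, one might write (function name and toy data are assumptions, not the authors' code):

```python
import numpy as np

def snp_blup_svd(Z, y, lam):
    """Ridge (SNP-BLUP-type) marker effects via SVD of the genotype matrix Z.
    Solves (Z'Z + lam*I) b = Z'y non-iteratively: with Z = U S V',
    b = V diag(s / (s^2 + lam)) U'y."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Vt.T @ ((s / (s**2 + lam)) * (U.T @ y))

# Toy check against the direct normal-equation solve.
rng = np.random.default_rng(1)
Z = rng.standard_normal((50, 200))            # 50 individuals, 200 markers
y = Z[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(50)
b = snp_blup_svd(Z, y, lam=10.0)
b_direct = np.linalg.solve(Z.T @ Z + 10.0 * np.eye(200), Z.T @ y)
```

    Once U, s, and Vt are stored, re-solving for new shrinkage values (as in the marker-specific variance step the abstract describes) reuses the decomposition rather than refactoring the normal equations.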

  5. Modeling the effect of land use change on hydrology of a forested watershed in coastal South Carolina.

    Treesearch

    Zhaohua Dai; Devendra M. Amatya; Ge Sun; Changsheng Li; Carl C. Trettin; Harbin Li

    2009-01-01

    Since hydrology is one of the main factors controlling wetland functions, hydrologic models are useful for evaluating the effects of land use change on wetland ecosystems. We evaluated two process-based hydrologic models with...

  6. Dose-dependent EEG effects of zolpidem provide evidence for GABA(A) receptor subtype selectivity in vivo.

    PubMed

    Visser, S A G; Wolters, F L C; van der Graaf, P H; Peletier, L A; Danhof, M

    2003-03-01

    Zolpidem is a nonbenzodiazepine GABA(A) receptor modulator that binds in vitro with high affinity to GABA(A) receptors expressing alpha(1) subunits but with relatively low affinity to receptors expressing alpha(2), alpha(3), and alpha(5) subunits. In the present study, it was investigated whether this subtype selectivity could be detected and quantified in vivo. Three doses (1.25, 5, and 25 mg) of zolpidem were administered to rats as an intravenous infusion over 5 min. The time course of the plasma concentrations was determined in conjunction with the change in the beta-frequency range of the EEG as the pharmacodynamic endpoint. The concentration-effect relationship of the three doses showed a dose-dependent maximum effect and a dose-dependent potency. The data were analyzed for one- or two-site binding using two pharmacodynamic models: 1) a descriptive model and 2) a novel mechanism-based pharmacokinetic/pharmacodynamic (PK/PD) model for GABA(A) receptor modulators that aims to separate drug- and system-specific properties, thereby allowing the estimation of in vivo affinity and efficacy. The application of two-site models significantly improved the fits compared with one-site models. Furthermore, in contrast to the descriptive model, the mechanism-based PK/PD model yielded dose-independent estimates for affinity (97 +/- 40 and 33,100 +/- 14,800 ng x ml(-1)). In conclusion, the mechanism-based PK/PD model is able to describe and explain the observed dose-dependent EEG effects of zolpidem and supports the subtype selectivity of zolpidem in vivo.

  7. Systems pharmacology - Towards the modeling of network interactions.

    PubMed

    Danhof, Meindert

    2016-10-30

    Mechanism-based pharmacokinetic-pharmacodynamic (PKPD) and disease system (DS) models have been introduced in drug discovery and development research to predict, in a quantitative manner, the effect of drug treatment in vivo in health and disease. This requires consideration of several fundamental properties of biological systems behavior, including hysteresis, non-linearity, variability, interdependency, convergence, resilience, and multi-stationarity. Classical physiology-based PKPD models consider linear transduction pathways, connecting processes on the causal path between drug administration and effect, as the basis of drug action. Depending on the drug and its biological target, such models may contain expressions to characterize i) the disposition and the target site distribution kinetics of the drug under investigation, ii) the kinetics of target binding and activation, and iii) the kinetics of transduction. When connected to physiology-based DS models, PKPD models can characterize the effect on disease progression in a mechanistic manner. These models have been found useful to characterize hysteresis and non-linearity, yet they fail to explain the effects of the other fundamental properties of biological systems behavior. Recently, systems pharmacology (SP) has been introduced as a novel approach to predict in vivo drug effects, in which biological networks, rather than single transduction pathways, are considered the basis of drug action and disease progression. These models contain expressions to characterize the functional interactions within a biological network. Such interactions are relevant when drugs act at multiple targets in the network or when homeostatic feedback mechanisms are operative. As a result, SP models are particularly useful for describing complex patterns of drug action (i.e. synergy, oscillatory behavior) and disease progression (i.e. episodic disorders). In this contribution, it is shown how physiology-based PKPD and disease models can be extended to account for internal systems interactions. It is demonstrated how SP models can be used to predict the effects of multi-target interactions and of homeostatic feedback on the pharmacological response. In addition, it is shown how DS models may be used to distinguish symptomatic from disease-modifying effects and to predict long-term effects on disease progression from short-term biomarker responses. It is concluded that incorporating expressions to describe the interactions in biological networks opens new avenues to understanding the effects of drug treatment on the fundamental aspects of biological systems behavior. Copyright © 2016 The Author. Published by Elsevier B.V. All rights reserved.

  8. An equivalent dissipation rate model for capturing history effects in non-premixed flames

    DOE PAGES

    Kundu, Prithwish; Echekki, Tarek; Pei, Yuanjiang; ...

    2016-11-11

    The effects of strain rate history on turbulent flames have been studied in past decades with 1D counterflow diffusion flame (CFDF) configurations subjected to oscillating strain rates. In this work, these unsteady effects are studied for complex hydrocarbon fuel surrogates at engine-relevant conditions with the unsteady strain rates experienced by flamelets in a typical spray flame. Tabulated combustion models are based on a steady scalar dissipation rate (SDR) assumption and hence cannot capture these unsteady strain effects, even though they can capture the unsteady chemistry. In this work, 1D CFDFs with varying strain rates are simulated using two different modeling approaches: the steady SDR assumption and the unsteady flamelet model. Comparative studies show that the history effects due to unsteady SDR are directly proportional to the temporal gradient of the SDR. A new equivalent SDR model based on the history of a flamelet is proposed. An averaging procedure is constructed such that the most recent histories are given higher weights. This equivalent SDR is then used with the steady SDR assumption in 1D flamelets. Results show a good agreement between the tabulated flamelet solution and the unsteady flamelet results. This equivalent SDR concept is further implemented and compared against 3D spray flames (Engine Combustion Network Spray A). Tabulated models based on the steady SDR assumption under-predict autoignition and flame lift-off when compared with an unsteady Representative Interactive Flamelet (RIF) model. However, the equivalent SDR model coupled with the tabulated model predicted autoignition and flame lift-off very close to those reported by the RIF model. This model is further validated over a range of injection pressures for Spray A flames. As a result, the new modeling framework enables tabulated models with significantly lower computational cost to account for unsteady history effects.
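
    The paper's exact weighting scheme is not given in the abstract. One common way to give the most recent history the highest weight is an exponentially decaying kernel, sketched here (the function name, decay constant tau, and toy history are assumptions):

```python
import numpy as np

def equivalent_sdr(chi_history, dt, tau):
    """Exponentially weighted average of a scalar-dissipation-rate history,
    giving the most recent samples the largest weights (decay time tau)."""
    chi = np.asarray(chi_history, dtype=float)
    n = len(chi)
    age = dt * np.arange(n - 1, -1, -1)   # age of each sample; newest has age 0
    w = np.exp(-age / tau)
    return np.sum(w * chi) / np.sum(w)

# A step change in SDR: the equivalent value relaxes toward the new level
# instead of jumping instantly, retaining memory of the earlier strain history.
hist = np.array([100.0] * 50 + [20.0] * 10)   # 1/s, arbitrary illustration
chi_eq = equivalent_sdr(hist, dt=1e-4, tau=2e-3)
```

    The equivalent value lies between the old and new SDR levels; shrinking tau makes the average forget the old level faster, mimicking a weaker history effect.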

  10. The Effects of the Activities of Current Textbook and 5 E Model on the Attitude of the Students: Sample of "The Global Effects of Natural Resources Unit"

    ERIC Educational Resources Information Center

    Uzunoz, Abdulkadir

    2011-01-01

    This study aimed to determine the effects of the activities of the current textbook and the 5E model on the attitudes of students. The study is designed as experimental research. To test the effects on student attitudes of geography education supported by the 5E model versus geography education based on the activities of the current textbook, controlled…

  11. Explicating an Evidence-Based, Theoretically Informed, Mobile Technology-Based System to Improve Outcomes for People in Recovery for Alcohol Dependence

    PubMed Central

    Gustafson, David H.; Isham, Andrew; Baker, Timothy; Boyle, Michael G.; Levy, Michael

    2011-01-01

    Post treatment relapse to uncontrolled alcohol use is common. More cost-effective approaches are needed. We believe currently available communication technology can use existing models for relapse prevention to cost-effectively improve long-term relapse prevention. This paper describes: 1) research-based elements of alcohol related relapse prevention and how they can be encompassed in Self Determination Theory (SDT) and Marlatt’s Cognitive Behavioral Relapse Prevention Model, 2) how technology could help address the needs of people seeking recovery, 3) a technology-based prototype, organized around Self Determination Theory and Marlatt’s model and 4) how we are testing a system based on the ideas in this article and related ethical and operational considerations. PMID:21190410

  12. Evaluating the effect of human activity patterns on air pollution exposure using an integrated field-based and agent-based modelling framework

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Beelen, Rob M. J.; de Bakker, Merijn P.; Karssenberg, Derek

    2015-04-01

    Constructing spatio-temporal numerical models to support risk assessment, such as assessing human exposure to air pollution, often requires the integration of field-based and agent-based modelling approaches. Continuous environmental variables such as air pollution are best represented using the field-based approach, which considers phenomena as continuous fields with attribute values at all locations. When calculating human exposure to such pollutants, however, it is preferable to consider the population as a set of individuals, each with a particular activity pattern. This makes it possible to account for the spatio-temporal variation in a pollutant along the space-time paths travelled by individuals, determined, for example, by home and work locations, the road network, and travel times. Modelling this activity pattern requires an agent-based or individual-based modelling approach. In general, field- and agent-based models are constructed with separate software tools, whereas the two approaches should interact and preferably be combined into one modelling framework, allowing domain specialists to implement models efficiently and effectively. To overcome this lack of integrated modelling frameworks, we aim at developing concepts and software for an integrated field-based and agent-based modelling framework. Concepts merging field- and agent-based modelling were implemented by extending PCRaster (http://www.pcraster.eu), a field-based modelling library implemented in C++, with components for 1) representation of discrete, mobile agents, 2) spatial networks and algorithms, by integrating the NetworkX library (http://networkx.github.io), making it possible to calculate e.g. shortest routes or total transport costs between locations, and 3) functions for field-network interactions, allowing field-based attribute values, such as aggregated or averaged concentration values, to be assigned to networks (i.e. as edge weights). We demonstrate the approach using six land use regression (LUR) models developed in the ESCAPE (European Study of Cohorts for Air Pollution Effects) project. These models calculate several air pollutants (e.g. NO2, NOx, PM2.5) for the entire Netherlands at a high (5 m) resolution. Using these air pollution maps, we compare the exposure of individuals calculated at the x, y locations of their home and work place, and aggregated over the close surroundings of these locations. In addition, total exposure is accumulated over daily activity patterns, summing exposure at home, at the work place, and while travelling between home and workplace, by routing individuals over the Dutch road network using the shortest route. Finally, we illustrate how routes can be calculated with the minimum total exposure (instead of shortest distance).
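
    The framework's own API is not shown in the abstract, but the idea of assigning field-derived values to network edges and routing on them can be sketched with NetworkX directly (the toy network, node names, and concentration values are assumptions):

```python
import networkx as nx

# Toy road network: nodes are junctions; 'length' is edge length in metres and
# 'conc' a pollutant concentration sampled from a field at the edge location.
G = nx.Graph()
edges = [("home", "a", 400, 30.0), ("a", "work", 400, 35.0),
         ("home", "b", 500, 10.0), ("b", "work", 500, 12.0)]
for u, v, length, conc in edges:
    # Field-network interaction: the field value becomes an edge weight,
    # here as concentration integrated over the edge length.
    G.add_edge(u, v, length=length, exposure=length * conc)

shortest = nx.shortest_path(G, "home", "work", weight="length")
cleanest = nx.shortest_path(G, "home", "work", weight="exposure")
# The shortest route (via "a") differs from the minimum-exposure route (via "b").
```

    Routing on the `length` attribute reproduces the shortest-distance commute, while routing on `exposure` gives the minimum-total-exposure route mentioned at the end of the abstract.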

  13. Modelling nonlinearity in superconducting split ring resonator and its effects on metamaterial structures

    NASA Astrophysics Data System (ADS)

    Mazdouri, Behnam; Mohammad Hassan Javadzadeh, S.

    2017-09-01

    Superconducting materials are intrinsically nonlinear because of the nonlinear Meissner effect (NLME). Considering nonlinear behaviors, such as harmonic generation and intermodulation distortion (IMD), in superconducting structures is very important. In this paper, we propose a distributed nonlinear circuit model for superconducting split ring resonators (SSRRs). This model can be analyzed using the Harmonic Balance (HB) method as a nonlinear solver. We then considered a superconducting metamaterial filter based on split ring resonators and calculated the fundamental and third-order IMD signals. There is good agreement between the nonlinear results from the proposed model and measured ones. Additionally, based on the proposed nonlinear model and using a novel method, we considered nonlinear effects on the main parameters of superconducting metamaterial structures, such as the phase constant (β) and the attenuation factor (α).

  14. A multi-scale model of dislocation plasticity in α-Fe: Incorporating temperature, strain rate and non-Schmid effects

    DOE PAGES

    Lim, H.; Hale, L. M.; Zimmerman, J. A.; ...

    2015-01-05

    In this study, we develop an atomistically informed crystal plasticity finite element (CP-FE) model for body-centered-cubic (BCC) α-Fe that incorporates non-Schmid stress-dependent slip with temperature and strain rate effects. Based on recent insights obtained from atomistic simulations, we propose a new constitutive model that combines a generalized non-Schmid yield law with aspects of a line tension (LT) model describing the activation enthalpy required for the motion of dislocation kinks. Atomistic calculations are conducted to quantify the non-Schmid effects, while both experimental data and atomistic simulations are used to assess the temperature and strain rate effects. The parameterized constitutive equation is implemented into a BCC CP-FE model to simulate plastic deformation of single and polycrystalline Fe, which is compared with experimental data from the literature. This direct comparison demonstrates that the atomistically informed model accurately captures the effects of crystal orientation, temperature, and strain rate on the flow behavior of single crystal Fe. Furthermore, our proposed CP-FE model exhibits temperature- and strain rate-dependent flow and yield surfaces in polycrystalline Fe that deviate from those of conventional CP-FE models based on Schmid's law.

  15. Effect of Inquiry-Based Computer Simulation Modeling on Pre-Service Teachers' Understanding of Homeostasis and Their Perceptions of Design Features

    ERIC Educational Resources Information Center

    Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn

    2016-01-01

    This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…

  16. Comparison of "E-Rater"[R] Automated Essay Scoring Model Calibration Methods Based on Distributional Targets

    ERIC Educational Resources Information Center

    Zhang, Mo; Williamson, David M.; Breyer, F. Jay; Trapani, Catherine

    2012-01-01

    This article describes two separate, related studies that provide insight into the effectiveness of "e-rater" score calibration methods based on different distributional targets. In the first study, we developed and evaluated a new type of "e-rater" scoring model that was cost-effective and applicable under conditions of absent human rating and…

  17. A Comparison of Video Modeling, Text-Based Instruction, and No Instruction for Creating Multiple Baseline Graphs in Microsoft Excel

    ERIC Educational Resources Information Center

    Tyner, Bryan C.; Fienup, Daniel M.

    2015-01-01

    Graphing is socially significant for behavior analysts; however, graphing can be difficult to learn. Video modeling (VM) may be a useful instructional method but lacks evidence for effective teaching of computer skills. A between-groups design compared the effects of VM, text-based instruction, and no instruction on graphing performance.…

  18. Effects of a Health Behavior Change Model-Based HIV/STI Prevention Intervention on Condom Use among Heterosexual Couples: A Randomized Trial

    ERIC Educational Resources Information Center

    Harvey, S. Marie; Kraft, Joan Marie; West, Stephen G.; Taylor, Aaron B.; Pappas-DeLuca, Katina A.; Beckman, Linda J.

    2009-01-01

    This study examines an intervention for heterosexual couples to prevent human immunodeficiency virus/sexually transmitted infections. It also evaluates the effect of the intervention, which is based on current models of health behavior change, on intermediate outcomes (individual and relationship factors) and consistency of condom use. Eligible…

  19. A hierarchical fire frequency model to simulate temporal patterns of fire regimes in LANDIS

    Treesearch

    Jian Yang; Hong S. He; Eric J. Gustafson

    2004-01-01

    Fire disturbance has important ecological effects in many forest landscapes. Existing statistically based approaches can be used to examine the effects of a fire regime on forest landscape dynamics. Most examples of statistically based fire models divide a fire occurrence into two stages--fire ignition and fire initiation. However, the exponential and Weibull fire-...

  20. Estimation of Standard Error of Regression Effects in Latent Regression Models Using Binder's Linearization. Research Report. ETS RR-07-09

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas

    2007-01-01

    Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…

  1. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project-based learning approach on 8th grade students' attitudes towards statistics. To this end, an attitude scale towards statistics was developed. A quasi-experimental research model was used in this study. Following this model, in the control group the traditional method was applied to teach statistics…

  2. Market-oriented Programming Using Small-world Networks for Controlling Building Environments

    NASA Astrophysics Data System (ADS)

    Shigei, Noritaka; Miyajima, Hiromi; Osako, Tsukasa

    The market model, one of the models of economic activity, has been formulated as an agent system, and its application to resource allocation problems has been studied. For the air-conditioning control of buildings, which is one such resource allocation problem, an effective auction-based agent method has been proposed as an alternative to the traditional PID controller. This method has been regarded as a form of decentralized control; however, its decentralization is not complete, and its performance is insufficient. In this paper, we first propose a perfectly decentralized agent model and show its performance. Second, in order to improve the model, we propose an agent model based on the small-world model. The effectiveness of the proposed model is shown by simulation.

  3. Robustness of Value-Added Analysis of School Effectiveness. Research Report. ETS RR-08-22

    ERIC Educational Resources Information Center

    Braun, Henry; Qu, Yanxuan

    2008-01-01

    This paper reports on a study conducted to investigate the consistency of the results between 2 approaches to estimating school effectiveness through value-added modeling. Estimates of school effects from the layered model employing item response theory (IRT) scaled data are compared to estimates derived from a discrete growth model based on the…

  4. Developing, Testing, and Using Theoretical Models for Promoting Quality in Education

    ERIC Educational Resources Information Center

    Creemers, Bert; Kyriakides, Leonidas

    2015-01-01

    This paper argues that the dynamic model of educational effectiveness can be used to establish stronger links between educational effectiveness research (EER) and school improvement. It provides research evidence to support the validity of the model. Thus, the importance of using the dynamic model to establish an evidence-based and theory-driven…

  5. An Evaluation of the Preceptor Model versus the Formal Teaching Model.

    ERIC Educational Resources Information Center

    Shamian, Judith; Lemieux, Suzanne

    1984-01-01

    This study evaluated two teaching methods to determine which is more effective in enhancing the knowledge base of participating nurses: the preceptor model, which embodies decentralized instruction by a member of the nursing staff, and the formal teaching model, which uses centralized teaching by the inservice education department. (JOW)

  6. Metaheuristic and Machine Learning Models for TFE-731-2, PW4056, and JT8D-9 Cruise Thrust

    NASA Astrophysics Data System (ADS)

    Baklacioglu, Tolga

    2017-08-01

    An accurate engine thrust model is of major importance in airline fuel-saving programs, assessment of the environmental effects of fuel consumption, emissions-reduction studies, and air traffic management applications. In this study, utilizing engine manufacturers' real data, a metaheuristic model based on genetic algorithms (GAs) and a machine learning model based on neural networks (NNs) trained with the Levenberg-Marquardt (LM), delta-bar-delta (DBD), and conjugate gradient (CG) algorithms were developed to incorporate the effect of both flight altitude and Mach number in the estimation of thrust. For the GA model, the impact of population size on the model's accuracy and the effect of the number of data points on the model coefficients were also analyzed. For the NN model, the optimum topology was searched for one- and two-hidden-layer networks. Predicted thrust values presented close agreement with real thrust data for both models, among which the LM-trained NNs gave the best accuracies.

  7. The Source of the Symbolic Numerical Distance and Size Effects

    PubMed Central

    Krajcsi, Attila; Lengyel, Gábor; Kojouharova, Petia

    2016-01-01

    Human number understanding is thought to rely on the analog number system (ANS), working according to Weber's law. We propose an alternative account, suggesting that symbolic mathematical knowledge is based on a discrete semantic system (DSS), a representation that stores values in a semantic network, similar to the mental lexicon or to a conceptual network. Here, focusing on the phenomena of numerical distance and size effects in comparison tasks, we first discuss how a DSS model could explain these numerical effects. Second, we demonstrate that the DSS model describes the effects quantitatively as well as the ANS model does. Finally, we show that the symbolic numerical size effect is mainly influenced by the frequency of the symbols, and not by the ratios of their values. This last result suggests that numerical distance and size effects cannot be caused by the ANS, and that the DSS model may be the alternative approach that can explain the frequency-based size effect. PMID:27917139

  8. Hyper-Book: A Formal Model for Electronic Books.

    ERIC Educational Resources Information Center

    Catenazzi, Nadia; Sommaruga, Lorenzo

    1994-01-01

    Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…

  9. Development and testing of a physically based model of streambank erosion for coupling with a basin-scale hydrologic model SWAT

    USDA-ARS?s Scientific Manuscript database

    A comprehensive stream bank erosion model based on excess shear stress has been developed and incorporated in the hydrological model Soil and Water Assessment Tool (SWAT). It takes into account processes such as weathering, vegetative cover, and channel meanders to adjust critical and effective str...

  10. Effect of Bayesian Student Modeling on Academic Achievement in Foreign Language Teaching (University Level English Preparatory School Example)

    ERIC Educational Resources Information Center

    Aslan, Burak Galip; Öztürk, Özlem; Inceoglu, Mustafa Murat

    2014-01-01

    Considering the increasing importance of adaptive approaches in CALL systems, this study implemented a machine learning based student modeling middleware with Bayesian networks. The profiling approach of the student modeling system is based on Felder and Silverman's Learning Styles Model and Felder and Soloman's Index of Learning Styles…

  11. Improving Instruction through Schoolwide Professional Development: Effects of the Data-on-Enacted-Curriculum Model

    ERIC Educational Resources Information Center

    Blank, Rolf K.; Smithson, John; Porter, Andrew; Nunnaley, Diana; Osthoff, Eric

    2006-01-01

    The instructional improvement model Data on Enacted Curriculum was tested with an experimental design using randomized place-based trials. The improvement model is based on using data on instructional practices and achievement to guide professional development and decisions to refocus on instruction. The model was tested in 50 U.S. middle schools…

  12. A context-based theory of recency and contiguity in free recall

    PubMed Central

    Sederberg, Per B.; Howard, Marc W.; Kahana, Michael J.

    2008-01-01

    We present a new model of free recall based on Howard and Kahana's (2002) temporal context model and Usher and McClelland's (2001) leaky-accumulator decision model. In this model, contextual drift gives rise to both short-term and long-term recency effects, and contextual retrieval gives rise to short-term and long-term contiguity effects. Recall decisions are controlled by a race between competitive leaky accumulators. The model captures the dynamics of immediate, delayed, and continual distractor free recall, demonstrating that dissociations between short- and long-term recency can naturally arise from a model that uses an internal contextual state as the sole cue for retrieval across time scales. PMID:18954208
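
    The recency mechanism described above can be sketched in a few lines. This is my toy simplification of temporal-context-model-style drift, not the paper's full model; the drift rate `rho`, encoding strength `beta`, and list length are hypothetical values chosen for illustration.

```python
# Toy sketch (my simplification, not the paper's model): context drifts toward
# each studied item, so the end-of-list test context resembles recently studied
# items' contexts more than earlier ones -- a recency gradient.
rho, beta = 0.9, 0.4   # hypothetical drift and encoding parameters
n_items = 10

# cue-to-context similarity decays geometrically with study lag
similarities = [beta * rho ** (n_items - 1 - i) for i in range(n_items)]

assert similarities == sorted(similarities)  # later items are more strongly cued
print(similarities)
```

    A distractor-filled delay would drift the context further, compressing these similarities toward each other and attenuating short-term recency, which is the dissociation the abstract refers to.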

  13. Community Modeling Program for Space Weather: A CCMC Perspective

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2009-01-01

    A community modeling program, which provides a forum for exchange and integration between modelers, has excellent potential for furthering our Space Weather modeling and forecasting capabilities. The design of such a program is of great importance to its success. In this presentation, we will argue that the most effective community modeling program should be focused on Space Weather-related objectives, and that it should be open and inclusive. The tremendous successes of prior community research activities further suggest that the most effective implementation of a new community modeling program should be based on community leadership, rather than on domination by individual institutions or centers. This presentation will provide an experience-based justification for these conclusions.

  14. Interpretable inference on the mixed effect model with the Box-Cox transformation.

    PubMed

    Maruo, K; Yamaguchi, Y; Noma, H; Gosho, M

    2017-07-10

    We derived results for inference on parameters of the marginal model of the mixed effect model with the Box-Cox transformation based on the asymptotic theory approach. We also provided a robust variance estimator of the maximum likelihood estimator of the parameters of this model in consideration of the model misspecifications. Using these results, we developed an inference procedure for the difference of the model median between treatment groups at the specified occasion in the context of mixed effects models for repeated measures analysis for randomized clinical trials, which provided interpretable estimates of the treatment effect. From simulation studies, it was shown that our proposed method controlled type I error of the statistical test for the model median difference in almost all the situations and had moderate or high performance for power compared with the existing methods. We illustrated our method with cluster of differentiation 4 (CD4) data in an AIDS clinical trial, where the interpretability of the analysis results based on our proposed method is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Surface-Potential-Based Metal-Oxide-Silicon-Varactor Model for RF Applications

    NASA Astrophysics Data System (ADS)

    Miyake, Masataka; Sadachika, Norio; Navarro, Dondee; Mizukane, Yoshio; Matsumoto, Kenji; Ezaki, Tatsuya; Miura-Mattausch, Mitiko; Mattausch, Hans Juergen; Ohguro, Tatsuya; Iizuka, Takahiro; Taguchi, Masahiko; Kumashiro, Shigetaka; Miyamoto, Shunsuke

    2007-04-01

    We have developed a surface-potential-based metal-oxide-silicon (MOS)-varactor model valid for RF applications up to 200 GHz. The model enables the calculation of the MOS-varactor capacitance seamlessly from the depletion region to the accumulation region and explicitly considers the carrier-response delay causing a non-quasi-static (NQS) effect. It has been observed that capacitance reduction due to this non-quasi-static effect limits the MOS-varactor application to an RF regime.

  16. A social marketing approach to implementing evidence-based practice in VHA QUERI: the TIDES depression collaborative care model.

    PubMed

    Luck, Jeff; Hagigi, Fred; Parker, Louise E; Yano, Elizabeth M; Rubenstein, Lisa V; Kirchner, JoAnn E

    2009-09-28

    Collaborative care models for depression in primary care are effective and cost-effective, but difficult to spread to new sites. Translating Initiatives for Depression into Effective Solutions (TIDES) is an initiative to promote evidence-based collaborative care in the U.S. Veterans Health Administration (VHA). Social marketing applies marketing techniques to promote positive behavior change. Described in this paper, TIDES used a social marketing approach to foster national spread of collaborative care models. The approach relied on a sequential model of behavior change and explicit attention to audience segmentation. Segments included VHA national leadership, Veterans Integrated Service Network (VISN) regional leadership, facility managers, frontline providers, and veterans. TIDES communications, materials and messages targeted each segment, guided by an overall marketing plan. Depression collaborative care based on the TIDES model was adopted by VHA as part of the new Primary Care Mental Health Initiative and associated policies. It is currently in use in more than 50 primary care practices across the United States, and continues to spread, suggesting success for its social marketing-based dissemination strategy. Development, execution and evaluation of the TIDES marketing effort shows that social marketing is a promising approach for promoting implementation of evidence-based interventions in integrated healthcare systems.

  17. Quantitative analysis of factors that affect oil pipeline network accident based on Bayesian networks: A case study in China

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan

    2018-06-01

    Some factors affect the consequences of oil pipeline accidents, and their effects should be analyzed to improve emergency preparation and emergency response. Although there are some qualitative models of risk factors' effects, quantitative models still require research. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. The incident evolution diagram is built to identify the risk factors, and the BN model is built based on the deployment rule for factor nodes in the BN and on expert knowledge combined by Dempster-Shafer evidence theory. The probabilities of incident consequences and risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for taking optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
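
    The probability calculations the abstract mentions reduce to marginalization and Bayes' rule over the network's conditional probability tables. A minimal sketch follows; the two-node structure (a leak causing a fire) and all probabilities are hypothetical, not the case study's.

```python
# Minimal Bayesian-network sketch (hypothetical structure and numbers):
# Leak -> Fire, queried by enumeration over the parent node.
p_leak = 0.3
p_fire_given = {True: 0.6, False: 0.05}  # P(Fire | Leak), P(Fire | no Leak)

def p_fire():
    # marginal: P(Fire) = sum over Leak of P(Leak) * P(Fire | Leak)
    return sum((p_leak if leak else 1 - p_leak) * p_fire_given[leak]
               for leak in (True, False))

def p_leak_given_fire():
    # diagnostic query by Bayes' rule: P(Leak | Fire)
    return p_fire_given[True] * p_leak / p_fire()

print(p_fire(), p_leak_given_fire())
```

    A real accident model would have many factor nodes and would use exact or approximate inference over the joint distribution, but every query follows this same enumerate-and-normalize pattern.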

  18. Separation of very hydrophobic analytes by micellar electrokinetic chromatography IV. Modeling of the effective electrophoretic mobility from carbon number equivalents and octanol-water partition coefficients.

    PubMed

    Huhn, Carolin; Pyell, Ute

    2008-07-11

    We investigate whether relationships derived within a previously developed optimization scheme for separations in micellar electrokinetic chromatography can be used to model the effective electrophoretic mobilities of analytes that differ strongly in their properties (polarity and type of interaction with the pseudostationary phase). The modeling is based on two parameter sets: (i) carbon number equivalents or octanol-water partition coefficients as analyte descriptors and (ii) four coefficients describing properties of the separation electrolyte (based on retention data for a homologous series of alkyl phenyl ketones used as reference analytes). The applicability of the proposed model is validated by comparing experimental and calculated effective electrophoretic mobilities. The results demonstrate that the model can effectively be used to predict the effective electrophoretic mobilities of neutral analytes from the determined carbon number equivalents or from octanol-water partition coefficients, provided that the solvation parameters of the analytes of interest are similar to those of the reference analytes.

  19. Energy harvesting from vibration of Timoshenko nanobeam under base excitation considering flexoelectric and elastic strain gradient effects

    NASA Astrophysics Data System (ADS)

    Managheb, S. A. M.; Ziaei-Rad, S.; Tikani, R.

    2018-05-01

    The coupling between polarization and strain gradients is called flexoelectricity. This phenomenon exists in all dielectrics of any symmetry. In this paper, energy harvesting from a Timoshenko beam is studied by considering the flexoelectric and strain gradient effects. General governing equations and the related boundary conditions are derived using Hamilton's principle. The flexoelectric effects are defined by gradients of normal and shear strains, which leads to a more general model. The developed model also covers the classical Timoshenko beam theory when the flexoelectric effect is ignored. Based on the developed model, the flexoelectricity effect on dielectric beams and energy harvesting from a cantilever beam under harmonic base excitation are investigated. A parametric study was conducted to evaluate the effects of the flexoelectric coefficients, strain gradient constants, base acceleration and the attached tip mass on the energy harvested from a cantilever Timoshenko beam. Results show that flexoelectricity has a significant effect on the energy harvester performance, especially at submicron and nano scales. In addition, this effect makes the beam behave more softly than before and changes the harvester's first resonance frequency. The present study provides guidance for flexoelectric nano-beam analysis and a method to evaluate the performance of energy harvesters in nano-dielectric devices.

  20. Modelling & Simulation Support to the Effects Based Approach to Operations - Observations from Using GAMMA in MNE 4

    DTIC Science & Technology

    2006-09-01

    The aim of the two parts of the experiment was identical: To explore concepts and supporting tools for Effects Based Approach to Operations (EBAO...feedback on the PMESII factors over time and the degree of achievement of the Operational Endstate. Modelling & Simulation Support to the Effects ...specific situation depends also on his interests. GAMMA provides two different methods: 1. The importance for different PMESII factors (ie potential

  1. Investigation on the Yarn Squeezing Effect of Three Dimensional Full Five Directional Braided Composites

    NASA Astrophysics Data System (ADS)

    Hu, Long; Tao, Guoquan; Liu, Zhenguo; Wang, Yibo; Ya, Jixuan

    2018-04-01

    The influence of the yarn squeezing effect on the geometric morphology and mechanical properties of three dimensional full five directional (3DF5D) braided composites is explored. The spatial path and cross-section shape of the yarns in the braided structure are characterized based on micro computed tomography (micro CT) scanning images. The yarn distortion due to the squeezing effect is discussed, and a mathematical morphology of the yarn geometry is established. A new repeated unit cell (RUC) model of 3DF5D braided composites considering the yarn squeezing effect is developed. Based on this model, the mechanical properties of 3DF5D braided composites are analyzed. Good agreement is obtained between the predicted and experimental results. Moreover, the stress distribution of the new RUC model is compared with that of the original RUC model, showing that the squeezing effect significantly increases the stress concentration level of the axial yarns.

  2. [Prediction of schistosomiasis infection rates of population based on ARIMA-NARNN model].

    PubMed

    Ke-Wei, Wang; Yu, Wu; Jin-Ping, Li; Yu-Yu, Jiang

    2016-07-12

    To explore the effectiveness of the autoregressive integrated moving average model-nonlinear autoregressive neural network (ARIMA-NARNN) model in predicting schistosomiasis infection rates of a population, the ARIMA model, the NARNN model and the ARIMA-NARNN model were established based on monthly schistosomiasis infection rates from January 2005 to February 2015 in Jiangsu Province, China. The fitting and prediction performances of the three models were compared. Compared to the ARIMA model and the NARNN model, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were the smallest, with values of 0.0111, 0.0900 and 0.2824, respectively. The ARIMA-NARNN model could effectively fit and predict schistosomiasis infection rates of a population, which might have great application value for the prevention and control of schistosomiasis.
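
    The three fit metrics used to rank the models are standard and easy to state in code. A short sketch follows; the sample series and fitted values are hypothetical, not the study's Jiangsu data.

```python
# Illustrative definitions of the comparison metrics (MSE, MAE, MAPE).
def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mae(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    # undefined when an actual value is zero
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

rates = [0.8, 0.6, 0.5, 0.4]       # hypothetical monthly infection rates (%)
fitted = [0.7, 0.65, 0.45, 0.42]   # hypothetical model output
print(mse(rates, fitted), mae(rates, fitted), mape(rates, fitted))
```

    Lower values on all three metrics indicate a better fit, which is how the hybrid ARIMA-NARNN model was judged superior to its two components.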

  3. The modelled cost-effectiveness of cognitive dissonance for the prevention of anorexia nervosa and bulimia nervosa in adolescent girls in Australia.

    PubMed

    Le, Long Khanh-Dao; Barendregt, Jan J; Hay, Phillipa; Sawyer, Susan M; Paxton, Susan J; Mihalopoulos, Cathrine

    2017-07-01

    Eating disorders (EDs), including anorexia nervosa (AN) and bulimia nervosa (BN), are prevalent disorders that carry substantial economic and social burden. The aim of the current study was to evaluate the modelled population cost-effectiveness of cognitive dissonance (CD), a school-based preventive intervention for EDs, in the Australian health care context. A population-based Markov model was developed to estimate the cost per disability adjusted life-year (DALY) averted by CD relative to no intervention. We modelled the cases of AN and BN that could be prevented over a 10-year time horizon in each study arm and the subsequent reduction in DALYs associated with this. The target population was 15-18 year old secondary school girls with high body-image concerns. This study only considered costs of the health sector providing services and not costs to individuals. Multivariate probabilistic and one-way sensitivity analyses were conducted to test model assumptions. Findings showed that the mean incremental cost-effectiveness ratio at base-case for the intervention was $103,980 per DALY averted with none of the uncertainty iterations falling below the threshold of AUD$50,000 per DALY averted. The evaluation was most sensitive to estimates of participant rates with higher rates associated with more favourable results. The intervention would become cost-effective (84% chance) if the effect of the intervention lasted up to 5 years. As modelled, school-based CD intervention is not a cost-effective preventive intervention for AN and BN. Given the burden of EDs, understanding how to improve participation rates is an important opportunity for future research. © 2017 Wiley Periodicals, Inc.
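
    The decision rule behind the abstract's conclusion is the incremental cost-effectiveness ratio compared against a willingness-to-pay threshold. A minimal sketch, with illustrative numbers rather than the study's model outputs:

```python
# Hypothetical sketch of the cost-effectiveness decision rule: extra cost per
# DALY averted, compared against a willingness-to-pay threshold.
def icer(cost_intervention, cost_comparator, dalys_averted):
    """Incremental cost-effectiveness ratio (cost per DALY averted)."""
    return (cost_intervention - cost_comparator) / dalys_averted

threshold = 50_000  # AUD per DALY averted (the threshold used in the study)

# illustrative inputs only -- not the study's modeled costs or DALYs
ratio = icer(cost_intervention=1_200_000, cost_comparator=0, dalys_averted=11.0)
print(ratio, ratio <= threshold)
```

    An ICER above the threshold, as in the study's base case of $103,980 per DALY averted, means the intervention is not considered cost-effective at that willingness to pay.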

  4. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint with which to support Model Based Mission Assurance (MBMA).

  5. Development of a recursion RNG-based turbulence model

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Vahala, George; Thangam, S.

    1993-01-01

    Reynolds stress closure models based on recursion renormalization group (RNG) theory are developed for the prediction of turbulent separated flows. The proposed model uses a finite-wavenumber truncation scheme to account for the spectral distribution of energy. In particular, the model incorporates the effects of both local and nonlocal interactions. The nonlocal interactions are shown to yield a contribution identical to that from the epsilon-RNG, while the local interactions introduce higher-order dispersive effects. A formal analysis of the model is presented, and its ability to accurately predict separated flows is analyzed from a combined theoretical and computational standpoint. Turbulent flow past a backward-facing step is chosen as a test case, and the results obtained from detailed computations demonstrate that the proposed recursion-RNG model with a finite cut-off wavenumber can yield very good predictions for the backstep problem.

  6. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up

    PubMed Central

    Massah, Omid; Sohrabi, Faramarz; A’azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-01-01

    Background Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. Objectives The present study aimed to determine the effectiveness of Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. Patients and Methods The present study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment, who were then randomly assigned to the experimental and control groups. The experimental group received its training in eight two-hour sessions. Data were analyzed using analysis of covariance and paired t-tests. Results There was a significant reduction in anger symptoms of drug-dependent individuals after Gross model-based emotion regulation training (ERT) (P < 0.001). Moreover, the effectiveness of the training on anger persisted through the follow-up period. Conclusions Symptoms of anger in the drug-dependent individuals of this study were reduced by Gross model-based emotion regulation strategies training. Based on the results of this study, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation. PMID:27162759

  7. A ‘frozen volume’ transition model and working mechanism for the shape memory effect in amorphous polymers

    NASA Astrophysics Data System (ADS)

    Lu, Haibao; Wang, Xiaodong; Yao, Yongtao; Qing Fu, Yong

    2018-06-01

    Phenomenological models based on frozen volume parameters can predict the shape recovery behavior of shape memory polymers (SMPs) well, but the physical meaning of using frozen volume parameters to describe thermomechanical properties has not been well established. In this study, the fundamental working mechanisms of the shape memory effect (SME) in amorphous SMPs, whose temperature-dependent viscoelastic behavior follows the Eyring equation, are established by considering both internal stress and the resulting frozen volume. The stress-strain constitutive relation was first modeled to quantitatively describe the effects of internal stresses at the macromolecular scale based on the transient network theory. A phenomenological 'frozen volume' model was then established to characterize the macromolecular structure and SME of amorphous SMPs based on a two-site stress-relaxation model. The effects of internal stress, frozen volume and strain rate on the shape memory behavior and thermomechanical properties of the SMP were investigated. Finally, the simulation results were compared with experimental results reported in the literature, and good agreement between the theoretical and experimental results was achieved. The novelty and key differences of our newly proposed model with respect to previous reports are that (1) the 'frozen volume' in our study is caused by the internal stress and governed by the two-site model theory, and thus has a clear physical meaning, and (2) the model can be applied to characterize and predict both the thermal and thermomechanical behaviors of SMPs based on the constitutive relationship with internal stress parameters. It is expected to provide a powerful tool for investigating the thermomechanical behavior of SMPs, of which both the macromolecular structure characteristics and the SME can be predicted using this 'frozen volume' model.

  8. Constructing Self-Modeling Videos: Procedures and Technology

    ERIC Educational Resources Information Center

    Collier-Meek, Melissa A.; Fallon, Lindsay M.; Johnson, Austin H.; Sanetti, Lisa M. H.; Delcampo, Marisa A.

    2012-01-01

    Although widely recommended, evidence-based interventions are not regularly utilized by school practitioners. Video self-modeling is an effective and efficient evidence-based intervention for a variety of student problem behaviors. However, like many other evidence-based interventions, it is not frequently used in schools. As video creation…

  9. On the usage of ultrasound computational models for decision making under ambiguity

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

    Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.

  10. Operation Brain Trauma Therapy

    DTIC Science & Technology

    2016-12-01

    either clinical trials in TBI if shown to be highly effective across OBTT, or tested in a precision medicine TBI phenotype (such as contusion) based...clinical trial if shown to be potently effective in one of the models in OBTT (i.e., a model that mimicked a specific clinical TBI phenotype). In... effective drug seen thus far in primary screening albeit with benefit highly model dependent, largely restricted to the CCI model. This suggests

  11. Modeling abundance effects in distance sampling

    USGS Publications Warehouse

    Royle, J. Andrew; Dawson, D.K.; Bates, S.

    2004-01-01

Distance-sampling methods are commonly used in studies of animal populations to estimate population density. A common objective of such studies is to evaluate the relationship between abundance or density and covariates that describe animal habitat or other environmental influences. However, little attention has been focused on methods of modeling abundance covariate effects in conventional distance-sampling models. In this paper we propose a distance-sampling model that accommodates covariate effects on abundance. The model is based on specification of the distance-sampling likelihood at the level of the sample unit in terms of local abundance (for each sampling unit). This model is augmented with a Poisson regression model for local abundance that is parameterized in terms of available covariates. Maximum-likelihood estimation of detection and density parameters is based on the integrated likelihood, wherein local abundance is removed from the likelihood by integration. We provide an example using avian point-transect data of Ovenbirds (Seiurus aurocapillus) collected using a distance-sampling protocol and two measures of habitat structure (understory cover and basal area of overstory trees). The model yields a sensible description (positive effect of understory cover, negative effect of basal area) of the relationship between habitat and Ovenbird density that can be used to evaluate the effects of habitat management on Ovenbird populations.

  12. The effect of wave current interactions on the storm surge and inundation in Charleston Harbor during Hurricane Hugo 1989

    NASA Astrophysics Data System (ADS)

    Xie, Lian; Liu, Huiqing; Peng, Machuan

    The effects of wave-current interactions on the storm surge and inundation induced by Hurricane Hugo in and around the Charleston Harbor and its adjacent coastal regions are examined by using a three-dimensional (3-D) wave-current coupled modeling system. The 3-D storm surge and inundation modeling component of the coupled system is based on the Princeton ocean model (POM), whereas the wave modeling component is based on the third-generation wave model, simulating waves nearshore (SWAN). The results indicate that the effects of wave-induced surface, bottom, and radiation stresses can separately or in combination produce significant changes in storm surge and inundation. The effects of waves vary spatially. In some areas, the contribution of waves to peak storm surge during Hurricane Hugo reached as high as 0.76 m which led to substantial changes in the inundation and drying areas simulated by the storm surge model.

  13. Image-based 3D reconstruction and virtual environmental walk-through

    NASA Astrophysics Data System (ADS)

    Sun, Jifeng; Fang, Lixiong; Luo, Ying

    2001-09-01

We present a 3D reconstruction method which combines geometry-based modeling, image-based modeling, and rendering techniques. The first component is an interactive geometry modeling method that recovers the basic geometry of the photographed scene. The second component is a model-based stereo algorithm. We discuss the image processing problems and algorithms for walking through virtual space, then design and implement a high-performance multi-thread wandering algorithm. The applications range from architectural planning and archaeological reconstruction to virtual environments and cinematic special effects.

  14. When and where does preferential flow matter - from observation to large scale modelling

    NASA Astrophysics Data System (ADS)

    Weiler, Markus; Leistert, Hannes; Steinbrich, Andreas

    2017-04-01

Preferential flow can be of relevance in a wide range of soils, and the interaction of the different processes and factors involved is still difficult to assess. As most studies (including our own) focusing on the effect of preferential flow are based on relatively high precipitation rates, the question remains how relevant preferential flow is under natural conditions, considering the site-specific precipitation characteristics, the effect of drying and wetting cycles on the initial soil water condition and shrinkage cracks, the site-specific soil properties, soil structure and rock fragments, and the effect of plant roots and soil fauna (e.g. earthworm channels). In order to address this question, we developed the distributed, process-based model RoGeR (Runoff Generation Research) to include a large number of relevant features and processes of preferential flow in soils. The model was developed from a large body of process-based research and experiments and includes preferential flow in root channels, earthworm channels, along rock fragments, and in shrinkage cracks. We parameterized the uncalibrated model at a high spatial resolution of 5x5 m for the whole state of Baden-Württemberg in Germany using LiDAR data, degree of sealing, land use, soil properties, and geology. As the model is event based, we derived typical event-based precipitation characteristics from rainfall duration, mean intensity, and amount. Using the site-specific variability of initial soil moisture derived from a water balance model based on the same dataset, we simulated the infiltration and recharge amounts of all event classes derived from the event precipitation characteristics and initial soil moisture conditions. The analysis of the simulation results allowed us to extract the relevance of preferential flow for infiltration and recharge considering all the factors above.
We could clearly see a strong effect of soil properties and land use but also, particularly for clay-rich soils, a strong effect of the initial conditions due to the development of soil cracks. Not too surprisingly, the relevance of preferential flow was much lower when considering the whole range of precipitation events than when considering only events with high rainfall intensity. The influences on infiltration and recharge also differed. Although the model can still be improved, in particular with more realistic information about the spatial and temporal variability of preferential flow caused by soil fauna and plants, it already shows in which situations we need to be very careful when predicting infiltration and recharge with models that consider only longer time steps (daily) or only matrix flow.

  15. Estimating Pressure Reactivity Using Noninvasive Doppler-Based Systolic Flow Index.

    PubMed

    Zeiler, Frederick A; Smielewski, Peter; Donnelly, Joseph; Czosnyka, Marek; Menon, David K; Ercole, Ari

    2018-04-05

The study objective was to derive models that estimate the pressure reactivity index (PRx) using the noninvasive transcranial Doppler (TCD) based systolic flow index (Sx_a) and mean flow index (Mx_a), both based on mean arterial pressure, in traumatic brain injury (TBI). Using a retrospective database of 347 patients with TBI with intracranial pressure and TCD time series recordings, we derived PRx, Sx_a, and Mx_a. We first derived the autocorrelative structure of PRx based on: (A) autoregressive integrated moving average (ARIMA) modeling in representative patients, and (B) sequential linear mixed-effects (LME) models with various embedded ARIMA error structures for PRx for the entire population. Finally, we performed sequential LME modeling with embedded PRx ARIMA structures to find the best model for estimating PRx using Sx_a and Mx_a. Model adequacy was assessed via normally distributed residual density. Model superiority was assessed via Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), log likelihood (LL), and analysis of variance testing between models. The most appropriate ARIMA structure for PRx in this population was (2,0,2). This was applied in sequential LME modeling. Two models were superior (employing random effects in the independent variables and intercept): (A) PRx ∼ Sx_a, and (B) PRx ∼ Sx_a + Mx_a. Correlation between observed and estimated PRx with these two models was: (A) 0.794 (p < 0.0001, 95% confidence interval (CI) = 0.788-0.799), and (B) 0.814 (p < 0.0001, 95% CI = 0.809-0.819), with acceptable agreement on Bland-Altman analysis. Through linear mixed-effects modeling that accounts for the ARIMA structure of PRx, one can estimate PRx using noninvasive TCD-based indices. We have described our first attempts at such modeling and PRx estimation, establishing the strong link between two aspects of cerebral autoregulation: measures of cerebral blood flow and those of pulsatile cerebral blood volume. Further work is required to validate these estimates.

  16. Research and Implementation of Tibetan Word Segmentation Based on Syllable Methods

    NASA Astrophysics Data System (ADS)

    Jiang, Jing; Li, Yachao; Jiang, Tao; Yu, Hongzhi

    2018-03-01

Tibetan word segmentation (TWS) is an important problem in Tibetan information processing, and abbreviated word recognition is one of the key and most difficult problems in TWS. Most existing methods of Tibetan abbreviated word recognition are rule-based approaches, which need vocabulary support. In this paper, we propose a method based on a sequence tagging model for abbreviated word recognition, and then implement it in TWS systems with sequence labeling models. The experimental results show that our abbreviated word recognition method is fast and effective and can be combined easily with the segmentation model. This significantly improves the performance of Tibetan word segmentation.
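The sequence-labeling view of segmentation can be illustrated by the decoding step that turns predicted tags back into words. This sketch assumes a standard B/M/E/S tag scheme and uses hypothetical romanized syllables; the paper's actual tag set and features may differ:

```python
def tags_to_words(syllables, tags):
    """Recombine syllables into words from B/M/E/S sequence labels:
    B = begin, M = middle, E = end of a multi-syllable word, S = single-syllable word."""
    words, current = [], []
    for syl, tag in zip(syllables, tags):
        current.append(syl)
        if tag in ("E", "S"):          # a word boundary has been reached
            words.append("".join(current))
            current = []
    if current:                        # flush a dangling (ill-formed) tail
        words.append("".join(current))
    return words

# Hypothetical romanized syllables with model-predicted tags
syllables = ["bod", "ljongs", "mi", "dmangs"]
tags = ["B", "E", "B", "E"]
print(tags_to_words(syllables, tags))   # -> ['bodljongs', 'midmangs']
```

A tagger (e.g. a CRF) would supply the tag sequence; decoding is the same regardless of the model.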

  17. Waveform model for an eccentric binary black hole based on the effective-one-body-numerical-relativity formalism

    NASA Astrophysics Data System (ADS)

    Cao, Zhoujian; Han, Wen-Biao

    2017-08-01

Binary black hole systems are among the most important sources for gravitational wave detection. They are also important objects for theoretical research in general relativity. A gravitational waveform template is essential for data analysis. The effective-one-body-numerical-relativity (EOBNR) model has played an essential role in the LIGO data analysis. For future space-based gravitational wave detection, many binary systems will retain some orbital eccentricity. At the same time, the eccentric binary is also an interesting topic for theoretical study in general relativity. In this paper, we construct the first eccentric binary waveform model based on an effective-one-body-numerical-relativity framework. Our basic assumption in the model construction is that the involved eccentricity is small. We have compared our eccentric EOBNR model to the circular one used in the LIGO data analysis. We have also tested our eccentric EOBNR model against another recently proposed eccentric binary waveform model; against numerical relativity simulation results; and against perturbation approximation results for extreme-mass-ratio binary systems. Compared to numerical relativity simulations with an eccentricity as large as about 0.2, the overlap factor for our eccentric EOBNR model is better than 0.98 for all tested cases, including spinless and spinning binaries and equal-mass and unequal-mass binaries. Hopefully, our eccentric model can be the starting point for developing a faithful template for future space-based gravitational wave detectors.
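The overlap factor quoted above measures waveform agreement as a normalized inner product. A minimal sketch with toy damped sinusoids standing in for the waveforms (real analyses weight the inner product by the detector noise spectrum and maximize over time and phase shifts, which this sketch omits):

```python
import numpy as np

def overlap(h1, h2):
    """Normalized inner product between two waveforms
    (flat-noise match, no maximization over time or phase shifts)."""
    inner = lambda a, b: np.sum(a * b)
    return inner(h1, h2) / np.sqrt(inner(h1, h1) * inner(h2, h2))

t = np.linspace(0.0, 1.0, 4096)
f = 50.0
# Toy stand-in for a circular-orbit waveform
h_circ = np.sin(2 * np.pi * f * t) * np.exp(-t)
# Toy stand-in for a slightly eccentric waveform: small periodic phase modulation
h_ecc = np.sin(2 * np.pi * f * t + 0.05 * np.sin(2 * np.pi * 5 * t)) * np.exp(-t)

print(round(overlap(h_circ, h_circ), 6))   # -> 1.0
m = overlap(h_circ, h_ecc)                 # high but below 1 for the perturbed waveform
```

An overlap above 0.98, as reported in the abstract, is the conventional threshold for a template being "faithful" enough for detection.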

  18. Genotype-Based Association Mapping of Complex Diseases: Gene-Environment Interactions with Multiple Genetic Markers and Measurement Error in Environmental Exposures

    PubMed Central

Lobach, Iryna; Fan, Ruzong; Carroll, Raymond J.

    2011-01-01

With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method by Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of association between calcium intake and the risk of colorectal adenoma development. PMID:21031455
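The difference between the two proposed codings can be made concrete with design matrices. This is a generic illustration with a hypothetical marker, not the paper's risk model:

```python
import numpy as np

# Hypothetical observed genotypes at one biallelic marker for five subjects
genotypes = ["AA", "Aa", "Aa", "aa", "AA"]

# Additive-effect coding: count of the minor allele 'a' (0, 1, or 2),
# so a single regression coefficient scales linearly with allele dose
additive = np.array([g.count("a") for g in genotypes])

# Genotype-effect coding: one indicator column per genotype class,
# allowing arbitrary (e.g. dominant or recessive) effects per genotype
classes = ["AA", "Aa", "aa"]
genotype_design = np.array([[int(g == c) for c in classes] for g in genotypes])
```

Either matrix can be fed directly into a logistic risk model without inferring haplotype phase, which is the point the abstract emphasizes.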

  19. Telephone-based disease management: why it does not save money.

    PubMed

    Motheral, Brenda R

    2011-01-01

    To understand why the current telephone-based model of disease management (DM) does not provide cost savings and how DM can be retooled based on the best available evidence to deliver better value. Literature review. The published peer-reviewed evaluations of DM and transitional care models from 1990 to 2010 were reviewed. Also examined was the cost-effectiveness literature on the treatment of chronic conditions that are commonly included in DM programs, including heart failure, diabetes mellitus, coronary artery disease, and asthma. First, transitional care models, which have historically been confused with commercial DM programs, can provide credible savings over a short period, rendering them low-hanging fruit for plan sponsors who desire real savings. Second, cost-effectiveness research has shown that the individual activities that constitute contemporary DM programs are not cost saving except for heart failure. Targeting of specific patients and activity combinations based on risk, actionability, treatment and program effectiveness, and costs will be necessary to deliver a cost-saving DM program, combined with an outreach model that brings vendors closer to the patient and physician. Barriers to this evidence-driven approach include resources required, marketability, and business model disruption. After a decade of market experimentation with limited success, new thinking is called for in the design of DM programs. A program design that is based on a cost-effectiveness approach, combined with greater program efficacy, will allow for the development of DM programs that are cost saving.

  20. A Markov chain model for studying suicide dynamics: an illustration of the Rose theorem

    PubMed Central

    2014-01-01

Background High-risk strategies would only have a modest effect on suicide prevention within a population. It is best to incorporate both high-risk and population-based strategies to prevent suicide. This study aims to compare the effectiveness of suicide prevention between high-risk and population-based strategies. Methods A Markov chain illness and death model is proposed to describe suicide dynamics in a population and examine the effectiveness of reducing the number of suicides by modifying certain parameters of the model. Assuming a population with replacement, the suicide risk of the population was estimated by determining the final state of the Markov model. Results The model shows that targeting the whole population for suicide prevention is more effective than reducing risk in the high-risk tail of the distribution of psychological distress (i.e. the mentally ill). Conclusions The results of this model reinforce the essence of the Rose theorem that lowering the suicidal risk in the population at large may be more effective than reducing the high risk in a small population. PMID:24948330
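The "final state" estimation described above amounts to computing a stationary distribution. A minimal sketch with a hypothetical three-state chain; all transition probabilities here are invented, and which strategy wins depends on the parameters (the paper's fitted model favors the population strategy):

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a Markov chain via the leading left eigenvector."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

# Hypothetical illness-death chain: states = (well, distressed, suicide).
# "Replacement" routes the suicide state back to 'well', so a steady state exists.
P = np.array([
    [0.980, 0.020, 0.000],   # well
    [0.300, 0.695, 0.005],   # distressed
    [1.000, 0.000, 0.000],   # suicide -> replaced by a new 'well' individual
])

base = stationary(P)[2]      # long-run suicide rate under the status quo

P_high = P.copy()            # high-risk strategy: halve distressed -> suicide
P_high[1] = [0.300, 0.6975, 0.0025]

P_pop = P.copy()             # population strategy: reduce well -> distressed
P_pop[0] = [0.984, 0.016, 0.000]
```

Both modifications lower the steady-state suicide rate; comparing `stationary(P_high)[2]` with `stationary(P_pop)[2]` for estimated parameters is the model's way of testing the Rose theorem.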

  1. Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.

    PubMed

    Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W

    2017-09-01

    An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.

  2. Adaptive model-based assistive control for pneumatic direct driven soft rehabilitation robots.

    PubMed

    Wilkening, Andre; Ivlev, Oleg

    2013-06-01

Assistive behavior and inherent compliance are assumed to be the essential properties for effective robot-assisted therapy in neurological as well as orthopedic rehabilitation. This paper presents two adaptive model-based assistive controllers for pneumatic direct-driven soft rehabilitation robots that are based on separate models of the soft robot and the patient's extremity, in order to take into account the individual patient's behavior, effort, and ability during control, which is assumed to be essential for relearning lost motor functions in neurological rehabilitation and for facilitating muscle reconstruction in orthopedic rehabilitation. The high inherent compliance of soft actuators allows for general human-robot interaction and provides the basis for effective and dependable assistive control. An inverse model of the soft robot with estimated parameters is used to achieve robot transparency during treatment, and inverse adaptive models of the individual patient's extremity allow the controllers to learn the individual patient's behavior and effort on-line and react in a way that assists the patient only as much as needed. The effectiveness of the controllers is evaluated with unimpaired subjects using a first prototype of a soft robot for elbow training. Advantages and disadvantages of both controllers are analyzed and discussed.

  3. Framework for modelling the cost-effectiveness of systemic interventions aimed to reduce youth delinquency.

    PubMed

    Schawo, Saskia J; van Eeren, Hester; Soeteman, Djira I; van der Veldt, Marie-Christine; Noom, Marc J; Brouwer, Werner; Busschbach, Jan J V; Hakkaart, Leona

    2012-12-01

Many interventions initiated within and financed from the health care sector are not necessarily primarily aimed at improving health. This poses important questions regarding the operationalisation of economic evaluations in such contexts. We investigated whether assessing cost-effectiveness using state-of-the-art methods commonly applied in health care evaluations is feasible and meaningful when evaluating interventions aimed at reducing youth delinquency. A probabilistic Markov model was constructed to create a framework for the assessment of the cost-effectiveness of systemic interventions in delinquent youth. For illustrative purposes, Functional Family Therapy (FFT), a systemic intervention aimed at improving family functioning and, primarily, reducing delinquent activity in youths, was compared to Treatment as Usual (TAU). "Criminal activity free years" (CAFYs) were introduced as the central outcome measure. Criminal activity may be based on, for example, police contacts or committed crimes. In the absence of extensive data, and for illustrative purposes, the current study based criminal activity on the available literature on recidivism. Furthermore, a literature search was performed to deduce the model's structure and parameters. Common cost-effectiveness methodology could be applied to interventions for youth delinquency. Model characteristics and parameters were derived from the literature and ongoing trial data. The model resulted in an estimate of incremental costs/CAFY and included long-term effects. Illustrative model results point towards dominance of FFT compared to TAU. Using a probabilistic model and the CAFY outcome measure to assess the cost-effectiveness of systemic interventions aimed at reducing delinquency is feasible. However, the model structure is limited to three states, and the CAFY measure is defined rather crudely. Moreover, as the model parameters were retrieved from the literature, the results are illustrative pending empirical data.
The current model provides a framework to assess the cost-effectiveness of systemic interventions while taking into account parameter uncertainty and long-term effectiveness. The framework could be used to assess the cost-effectiveness of systemic interventions alongside (clinical) trial data. Consequently, it is suitable to inform reimbursement decisions, since the value for money of systemic interventions can be demonstrated using a decision-analytic model. Future research could focus on testing the current model with extensive empirical data, improving the outcome measure, and finding appropriate values for that outcome.

  4. Effect of Payment Model on Patient Outcomes in Outpatient Physical Therapy.

    PubMed

    Charles, Derek; Boyd, Sylvester; Heckert, Logan; Lake, Austin; Petersen, Kevin

    2018-01-01

Although the effectiveness of physical therapy for treating musculoskeletal injuries is well recognized in the literature, reimbursement in the outpatient setting is evolving towards value-based or alternative payment models and away from procedure-oriented fee-for-service models. Alternative models include case-based clinics, pay-for-performance, out-of-network services, accountable care organizations, and concierge practices. There is the possibility that alternative payment models could produce different and even superior patient outcomes. Physical therapists should be alert to this possibility, and research is warranted in this area to determine whether patient outcomes are related to the method of reimbursement.

  5. Cost-Effectiveness of Orthogeriatric and Fracture Liaison Service Models of Care for Hip Fracture Patients: A Population-Based Study.

    PubMed

    Leal, Jose; Gray, Alastair M; Hawley, Samuel; Prieto-Alhambra, Daniel; Delmestri, Antonella; Arden, Nigel K; Cooper, Cyrus; Javaid, M Kassim; Judge, Andrew

    2017-02-01

Fracture liaison services are recommended as a model of best practice for organizing patient care and secondary fracture prevention for hip fracture patients, although variation exists in how such services are structured. There is considerable uncertainty as to which model is most cost-effective and should therefore be mandated. This study evaluated the cost-effectiveness of orthogeriatric (OG)- and nurse-led fracture liaison service (FLS) models of post-hip fracture care compared with usual care. Analyses were conducted from a health care and personal social services payer perspective, using a Markov model to estimate the lifetime impact of the models of care. The base-case population consisted of men and women aged 83 years with a hip fracture. The risk and costs of hip and non-hip fractures were derived from large primary and hospital care data sets in the UK. Utilities were informed by a meta-regression of 32 studies. In the base-case analysis, the orthogeriatric-led service was the most effective and cost-effective model of care at a threshold of £30,000 per quality-adjusted life-year (QALY) gained. For women aged 83 years, the OG-led service was the most cost-effective at £22,709/QALY. If only health care costs are considered, the OG-led service was cost-effective at £12,860/QALY and £14,525/QALY for women and men aged 83 years, respectively. Irrespective of how patients were stratified in terms of their age, sex, and Charlson comorbidity score at index hip fracture, our results suggest that introducing an orthogeriatrician-led or a nurse-led FLS is cost-effective when compared with usual care. Although considerable uncertainty remains concerning which of the models of care should be preferred, introducing an orthogeriatrician-led service seems to be the most cost-effective service to pursue. © 2016 American Society for Bone and Mineral Research.
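The Markov lifetime model behind such analyses can be sketched as a cohort simulation that accumulates discounted costs and QALYs and forms the incremental cost-effectiveness ratio (ICER). All transition probabilities, costs, and utilities below are invented placeholders, not the study's UK estimates:

```python
import numpy as np

def lifetime_outcomes(P, cost, qaly, horizon=40, start=0):
    """Cohort simulation: expected discounted lifetime costs and QALYs."""
    dist = np.zeros(len(cost))
    dist[start] = 1.0
    disc = 0.035                                  # annual discount rate (UK convention)
    total_cost = total_qaly = 0.0
    for year in range(horizon):
        w = 1.0 / (1 + disc) ** year
        total_cost += w * dist @ cost
        total_qaly += w * dist @ qaly
        dist = dist @ P                           # advance the cohort one cycle
    return total_cost, total_qaly

# Hypothetical 3-state model: (post-fracture, refracture, dead); annual transitions
P_usual = np.array([[0.85, 0.05, 0.10],
                    [0.00, 0.70, 0.30],
                    [0.00, 0.00, 1.00]])
P_og = np.array([[0.88, 0.03, 0.09],              # OG-led care: fewer refractures/deaths
                 [0.00, 0.72, 0.28],
                 [0.00, 0.00, 1.00]])
cost_usual = np.array([1000.0, 8000.0, 0.0])
cost_og = np.array([1400.0, 8000.0, 0.0])         # service adds cost in the post-fracture state
qaly = np.array([0.70, 0.45, 0.0])

c0, q0 = lifetime_outcomes(P_usual, cost_usual, qaly)
c1, q1 = lifetime_outcomes(P_og, cost_og, qaly)
icer = (c1 - c0) / (q1 - q0)   # incremental cost per QALY gained
```

The decision rule is then a threshold comparison: the strategy is cost-effective if the ICER falls below the willingness-to-pay threshold (£30,000/QALY in the abstract).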

  6. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    NASA Astrophysics Data System (ADS)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking into account diffraction effects. Following market trends and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which now include auto-focus and, in the near term, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose Maxwell-equations-based modeling to compute the propagation of light, and a software tool with an FDTD (Finite Difference Time Domain) engine to solve this propagation. We present in this article the complete methodology of this modeling: on one hand, incoherent plane waves are propagated to approximate a product-use diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory and computation time. After presenting the correlation of the model with measurements, we illustrate its use in the optimization of a 1.75 μm pixel.
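The FDTD engine mentioned above advances electric and magnetic fields on a staggered grid. A minimal 1-D Yee-scheme sketch in normalized units with a soft Gaussian source; this is illustrative only, since real pixel simulations are 3-D with material models and absorbing boundaries:

```python
import numpy as np

nx, nt = 400, 500
ez = np.zeros(nx)        # electric field samples
hy = np.zeros(nx - 1)    # magnetic field on the staggered (Yee) half-grid
courant = 0.5            # normalized time step; must be <= 1 for stability in 1-D

for n in range(nt):
    hy += courant * (ez[1:] - ez[:-1])           # update H from the curl of E
    ez[1:-1] += courant * (hy[1:] - hy[:-1])     # update E from the curl of H
    ez[50] += np.exp(-((n - 40) ** 2) / 100.0)   # soft Gaussian source at cell 50
# The fixed ez[0] and ez[-1] act as perfectly conducting (reflecting) walls.
```

The same leapfrog update generalizes to 3-D, where diffraction by sub-wavelength pixel structures emerges naturally from the field solution.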

  7. Strategic directions for agent-based modeling: avoiding the YAAWN syndrome.

    PubMed

    O'Sullivan, David; Evans, Tom; Manson, Steven; Metcalf, Sara; Ligmann-Zielinska, Arika; Bone, Chris

    In this short communication, we examine how agent-based modeling has become common in land change science and is increasingly used to develop case studies for particular times and places. There is a danger that the research community is missing a prime opportunity to learn broader lessons from the use of agent-based modeling (ABM), or at the very least not sharing these lessons more widely. How do we find an appropriate balance between empirically rich, realistic models and simpler theoretically grounded models? What are appropriate and effective approaches to model evaluation in light of uncertainties not only in model parameters but also in model structure? How can we best explore hybrid model structures that enable us to better understand the dynamics of the systems under study, recognizing that no single approach is best suited to this task? Under what circumstances - in terms of model complexity, model evaluation, and model structure - can ABMs be used most effectively to lead to new insight for stakeholders? We explore these questions in the hope of helping the growing community of land change scientists using models in their research to move from 'yet another model' to doing better science with models.

  8. Application of clustering analysis in the prediction of photovoltaic power generation based on neural network

    NASA Astrophysics Data System (ADS)

    Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.

    2017-11-01

In order to select effective samples from many years of PV power generation data and improve the accuracy of the PV power generation forecasting model, this paper studies the application of clustering analysis in this field and establishes a forecasting model based on neural networks. Based on three different weather types (sunny, cloudy, and rainy days), this research screens samples of historical data by the clustering analysis method. After screening, it establishes BP neural network prediction models using the screened data as training data. We then compare the six types of photovoltaic power generation prediction models before and after the data screening. Results show that a prediction model combining clustering analysis with BP neural networks is an effective method for improving the precision of photovoltaic power generation forecasting.
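The screening step can be sketched as a plain k-means clustering of daily weather features; the features, values, and seed-day initialization below are hypothetical:

```python
import numpy as np

def kmeans(X, centers, iters=50):
    """Plain NumPy Lloyd's algorithm; `centers` is the (k, d) initial guess."""
    centers = centers.copy()
    for _ in range(iters):
        # assign each sample to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(axis=2), axis=1)
        for j in range(len(centers)):            # recompute centers, keeping empty ones
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical daily features: (mean irradiance in kWh/m^2, cloud-cover fraction)
rng = np.random.default_rng(1)
sunny = rng.normal([6.0, 0.1], 0.15, size=(40, 2))
cloudy = rng.normal([3.0, 0.6], 0.15, size=(40, 2))
rainy = rng.normal([1.0, 0.9], 0.15, size=(40, 2))
X = np.vstack([sunny, cloudy, rainy])

# One representative day per weather type as the initial centers
labels, centers = kmeans(X, X[[0, 40, 80]])
# Each cluster's days would then train its own BP-network forecaster.
```

Training a separate network per weather cluster is what gives the "six models before and after screening" comparison in the abstract.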

  9. Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.

    PubMed

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei

    2016-02-01

Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.
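The LRT logic is generic: twice the gap between the maximized full-model and null-model log-likelihoods is referred to a chi-square distribution. A toy version with exponential survival times and a single binary grouping, not the paper's functional-regression models:

```python
import math

def exp_loglik(times):
    """Maximized log-likelihood of i.i.d. exponential survival times
    (no censoring): the MLE is lambda_hat = n / sum(t)."""
    n, total = len(times), sum(times)
    lam = n / total
    return n * math.log(lam) - lam * total

# Hypothetical survival times (e.g. years to progression) in two genotype groups
g0 = [1.2, 0.7, 2.5, 3.1, 0.9, 1.8, 2.2, 1.1]
g1 = [0.4, 0.6, 0.3, 1.0, 0.5, 0.8, 0.2, 0.7]

ll_null = exp_loglik(g0 + g1)                  # one common hazard for everyone
ll_full = exp_loglik(g0) + exp_loglik(g1)      # a separate hazard per group
lrt = 2 * (ll_full - ll_null)                  # compare to chi-square, df = 1
p_value = math.erfc(math.sqrt(lrt / 2))        # chi-square(1) upper-tail probability
```

In the paper's setting the full model instead carries the functional-regression coefficients for a whole genetic region, so the reference chi-square has as many degrees of freedom as the basis expansion.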

  10. Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei

    2015-01-01

    Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazards models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models in which the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and the sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have power higher than or similar to the Cox SKAT LRT except when 50%/50% of causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than the Cox BT LRT. The models and related test statistics can be useful in whole-genome and whole-exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979

  11. Estimating thermal performance curves from repeated field observations

    USGS Publications Warehouse

    Childress, Evan; Letcher, Benjamin H.

    2017-01-01

    Estimating thermal performance of organisms is critical for understanding population distributions and dynamics and for predicting responses to climate change. Typically, performance curves are estimated using laboratory studies to isolate temperature effects, but other abiotic and biotic factors influence temperature-performance relationships in nature, reducing these models' predictive ability. We present a model for estimating thermal performance curves from repeated field observations that includes environmental and individual variation. We fit the model in a Bayesian framework using MCMC sampling, which allowed for estimation of unobserved latent growth while propagating uncertainty. Fitting the model to simulated data varying in sampling design and parameter values demonstrated that the parameter estimates were accurate, precise, and unbiased. Fitting the model to individual growth data from wild trout revealed high out-of-sample predictive ability relative to laboratory-derived models, which produced more biased predictions of field performance. The field-based estimates of thermal maxima were lower than those based on laboratory studies. Under warming temperature scenarios, field-derived performance models predicted stronger declines in body size than laboratory-derived models, suggesting that laboratory-based models may underestimate climate change effects. The presented model estimates true, realized field performance, avoiding the assumptions required for applying laboratory-based models to field performance, which should improve estimates of performance under climate change and advance thermal ecology.
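As a minimal sketch of the curve being estimated: a Gaussian-shaped thermal performance curve with peak performance p_max at optimum temperature t_opt and width sigma, fit here to synthetic growth observations by least squares. The functional form and all data are illustrative assumptions; the paper's Bayesian hierarchical model layers environmental and individual random effects on top of a curve like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def tpc(t, p_max, t_opt, sigma):
    """Gaussian-shaped thermal performance curve (illustrative form)."""
    return p_max * np.exp(-0.5 * ((t - t_opt) / sigma) ** 2)

rng = np.random.default_rng(1)
temps = np.linspace(4.0, 24.0, 60)                       # water temperatures (C)
growth = tpc(temps, 1.2, 14.0, 4.0) + rng.normal(0.0, 0.03, temps.size)

popt, _ = curve_fit(tpc, temps, growth, p0=[1.0, 12.0, 3.0])
p_max, t_opt, sigma = popt
```

Fitting in a Bayesian framework, as the paper does, replaces the point estimate popt with a posterior over (p_max, t_opt, sigma) so that uncertainty propagates into predictions of field performance.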

  12. A copula-multifractal volatility hedging model for CSI 300 index futures

    NASA Astrophysics Data System (ADS)

    Wei, Yu; Wang, Yudong; Huang, Dengshi

    2011-11-01

    In this paper, we propose a new hedging model combining the newly introduced multifractal volatility (MFV) model and dynamic copula functions. Using high-frequency intraday quotes of the spot Shanghai Stock Exchange Composite Index (SSEC), the spot China Securities Index 300 (CSI 300), and CSI 300 index futures, we compare the direct and cross hedging effectiveness of the copula-MFV model with several popular copula-GARCH models. The main empirical results show that the proposed copula-MFV model obtains better hedging effectiveness than the copula-GARCH-type models in general. Furthermore, the hedging strategy based on the MFV model involves lower transaction costs than those based on the GARCH-type models. The findings of this paper indicate that multifractal analysis may offer a new way to design quantitative hedging models using financial futures.
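As a point of reference for the hedging-effectiveness comparison, a static minimum-variance hedge can be sketched as below: the hedge ratio h* = cov(spot, futures) / var(futures), and effectiveness measured as the variance reduction of the hedged portfolio. The returns are synthetic; the paper instead computes time-varying ratios from copula-MFV and copula-GARCH volatility forecasts:

```python
import numpy as np

rng = np.random.default_rng(2)
futures = rng.normal(0.0, 0.012, 1000)                 # synthetic futures returns
spot = 0.9 * futures + rng.normal(0.0, 0.004, 1000)    # correlated spot returns

# Minimum-variance hedge ratio and hedging effectiveness
h_star = np.cov(spot, futures)[0, 1] / np.var(futures)
hedged = spot - h_star * futures
effectiveness = 1.0 - np.var(hedged) / np.var(spot)
```

Dynamic models such as copula-MFV re-estimate h_star each period from conditional variances and the copula-implied conditional covariance; effectiveness is then compared across models exactly as above.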

  13. An analysis of USSPACECOM's space surveillance network sensor tasking methodology

    NASA Astrophysics Data System (ADS)

    Berger, Jeff M.; Moles, Joseph B.; Wilsey, David G.

    1992-12-01

    This study provides the basis for the development of a cost/benefit assessment model to determine the effects of alterations to the Space Surveillance Network (SSN) on orbital element (OE) set accuracy. It provides a review of current methods used by NORAD and the SSN to gather and process observations, an alternative to the current Gabbard classification method, and the development of a model to determine the effects of observation rate and correction interval on OE set accuracy. The proposed classification scheme is based on satellite J2 perturbations. Specifically, classes were established based on mean motion, eccentricity, and inclination, since J2 perturbation effects are functions of only these elements. Model development began by creating representative sensor observations using a highly accurate orbital propagation model. These observations were compared to predicted observations generated using the NORAD Simplified General Perturbation (SGP4) model and differentially corrected using a Bayes sequential estimation algorithm. A 10-run Monte Carlo analysis was performed using this model on 12 satellites with 16 different observation rate/correction interval combinations. An ANOVA and confidence interval analysis of the results shows that this model does demonstrate the differences in steady-state position error based on varying observation rate and correction interval.

  14. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users.

    PubMed

    Veksler, Vladislav D; Buchler, Norbou; Hoffman, Blaine E; Cassenti, Daniel N; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences, such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting.

  15. Machine Learning-based discovery of closures for reduced models of dynamical systems

    NASA Astrophysics Data System (ADS)

    Pan, Shaowu; Duraisamy, Karthik

    2017-11-01

    Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made toward employing ML to represent the dynamics of complex physical systems. Previous attempts mostly focus on parameter calibration or data-driven augmentation of existing models. In this work we present a ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting model on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
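The trapezoidal approximation of the convolution (memory) term mentioned above can be sketched as follows; the memory kernel and resolved-state history here are hypothetical stand-ins, since the framework's point is to learn the kernel from data:

```python
import numpy as np

def closure_term(kernel, history, dt):
    """Trapezoidal-rule approximation of integral_0^T K(tau) u(t - tau) dtau."""
    integrand = kernel * history[::-1]   # pair K(tau_i) with u(t - tau_i)
    return dt * (0.5 * integrand[0] + integrand[1:-1].sum() + 0.5 * integrand[-1])

dt = 0.01
tau = np.arange(0.0, 0.5, dt)            # finite memory length T = 0.5 (hyperparameter)
kernel = np.exp(-5.0 * tau)              # hypothetical decaying memory kernel K(tau)
history = np.cos(2.0 * np.pi * tau)      # resolved-state samples u(0) .. u(t)
c = closure_term(kernel, history, dt)
```

The memory length T and the number of sampling points len(tau) are exactly the hyperparameters the abstract lists; the neural-network variant replaces the explicit kernel with learned hidden states so T need not be fixed in advance.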

  16. A Cost-Utility Analysis of Prostate Cancer Screening in Australia.

    PubMed

    Keller, Andrew; Gericke, Christian; Whitty, Jennifer A; Yaxley, John; Kua, Boon; Coughlin, Geoff; Gianduzzo, Troy

    2017-02-01

    The Göteborg randomised population-based prostate cancer screening trial demonstrated that prostate-specific antigen (PSA)-based screening reduces prostate cancer deaths compared with an age-matched control group. Utilising the prostate cancer detection rates from this study, we investigated the clinical and cost effectiveness of a similar PSA-based screening strategy for an Australian population of men aged 50-69 years. A decision model that incorporated Markov processes was developed from a health system perspective. The base-case scenario compared a population-based screening programme with current opportunistic screening practices. Costs, utility values, treatment patterns and background mortality rates were derived from Australian data. All costs were adjusted to reflect July 2015 Australian dollars (A$). An alternative scenario compared systematic with opportunistic screening but with optimisation of active surveillance (AS) uptake in both groups. A discount rate of 5% for costs and benefits was utilised. Univariate and probabilistic sensitivity analyses were performed to assess the effect of variable uncertainty on model outcomes. Our model very closely replicated the number of deaths from both prostate cancer and background mortality in the Göteborg study. The incremental cost per quality-adjusted life-year (QALY) for PSA screening was A$147,528. However, for years of life gained (LYGs), PSA-based screening (A$45,890/LYG) appeared more favourable. Our alternative scenario with optimised AS improved cost utility to A$45,881/QALY, with screening becoming cost effective at a 92% AS uptake rate. Both modelled scenarios were most sensitive to the utility of patients before and after intervention, and the discount rate used. PSA-based screening is not cost effective compared with Australia's assumed willingness-to-pay threshold of A$50,000/QALY. It appears more cost effective if LYGs are used as the relevant outcome, and is more cost effective than the established Australian breast cancer screening programme on this basis. Optimised utilisation of AS increases the cost effectiveness of prostate cancer screening dramatically.
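The core cost-utility arithmetic behind figures like A$147,528/QALY can be sketched as below: discounted costs and QALYs for two strategies, then the incremental cost-effectiveness ratio (ICER). All per-year numbers are illustrative placeholders, not the paper's; the study derives them from a Markov model of screening, diagnosis and treatment states:

```python
def discounted_total(values_per_year, rate=0.05):
    """Discount a stream of yearly values at the paper's 5% rate."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values_per_year))

# Hypothetical per-year costs (A$) and QALYs for screening vs. no screening.
cost_screen = discounted_total([400, 300, 300, 300])
cost_control = discounted_total([100, 150, 150, 150])
qaly_screen = discounted_total([0.95, 0.94, 0.93, 0.92])
qaly_control = discounted_total([0.95, 0.93, 0.91, 0.89])

icer = (cost_screen - cost_control) / (qaly_screen - qaly_control)
```

Comparing the ICER against a willingness-to-pay threshold (A$50,000/QALY in the paper) is what decides "cost effective" vs. not; swapping QALYs for LYGs in the denominator is the alternative outcome the abstract discusses.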

  17. A Model-Driven Architecture Approach for Modeling, Specifying and Deploying Policies in Autonomous and Autonomic Systems

    NASA Technical Reports Server (NTRS)

    Pena, Joaquin; Hinchey, Michael G.; Sterritt, Roy; Ruiz-Cortes, Antonio; Resinas, Manuel

    2006-01-01

    Autonomic Computing (AC), self-management based on high level guidance from humans, is increasingly gaining momentum as the way forward in designing reliable systems that hide complexity and conquer IT management costs. Effectively, AC may be viewed as Policy-Based Self-Management. The Model Driven Architecture (MDA) approach focuses on building models that can be transformed into code in an automatic manner. In this paper, we look at ways to implement Policy-Based Self-Management by means of models that can be converted to code using transformations that follow the MDA philosophy. We propose a set of UML-based models to specify autonomic and autonomous features along with the necessary procedures, based on modification and composition of models, to deploy a policy as an executing system.

  18. Mixed quantum-classical simulation of the hydride transfer reaction catalyzed by dihydrofolate reductase based on a mapped system-harmonic bath model

    NASA Astrophysics Data System (ADS)

    Xu, Yang; Song, Kai; Shi, Qiang

    2018-03-01

    The hydride transfer reaction catalyzed by dihydrofolate reductase is studied using a recently developed mixed quantum-classical method to investigate the nuclear quantum effects on the reaction. Molecular dynamics simulation is first performed based on a two-state empirical valence bond potential to map the atomistic model to an effective double-well potential coupled to a harmonic bath. In the mixed quantum-classical simulation, the hydride degree of freedom is quantized, and the effective harmonic oscillator modes are treated classically. It is shown that the hydride transfer reaction rate using the mapped effective double-well/harmonic-bath model is dominated by the contribution from the ground vibrational state. Further comparison with the adiabatic reaction rate constant based on the Kramers theory confirms that the reaction is primarily vibrationally adiabatic, which agrees well with the high transmission coefficients found in previous theoretical studies. The calculated kinetic isotope effect is also consistent with the experimental and recent theoretical results.

  19. Role of different types of solid models in hydrodynamic modeling and their effects on groundwater protection processes

    NASA Astrophysics Data System (ADS)

    Bódi, Erika; Buday, Tamás; McIntosh, Richard William

    2013-04-01

    Defining extraction-modified flow patterns with hydrodynamic models is a pivotal question in preserving groundwater resources regarding both quality and quantity. Modeling is the first step in groundwater protection, the main result of which is the determination of the protective area depending on the amount of extracted water. Solid models have significant effects on hydrodynamic models, as the latter are based on them. Due to legislative regulations, certain restrictions must be applied within protection areas, which has firm consequences for economic activities. In Hungarian regulations there are no clear instructions for the establishment of either geological or hydrodynamic models; however, modeling itself is an obligation. Choosing the modeling method is a key consideration for further numerical calculations, and it is decisive regarding the shape and size of the groundwater protection area. The geometry of hydrodynamic model layers is derived from the solid model. There are different geological approaches, including lithological and sequence stratigraphic classifications; furthermore, in the case of regional models, formation-based hydrostratigraphic units are also applicable. Lithological classification is based on assigning and mapping lithotypes. When the geometry (e.g. tectonic characteristics) of the research area is not known, horizontal bedding is assumed, the probability of which cannot be assessed based on lithology alone. If the geological correlation is based on sequence stratigraphic studies, the cyclicity of sediment deposition is also considered. This method is more integrated, thus numerous parameters (e.g. electrofacies) are taken into consideration when studying the geological conditions, ensuring more reliable modeling. Layers of sequence stratigraphic models can be either lithologically homogeneous or they may include greater cycles of sediments containing several lithological units. The advantage of this is that the modeling can handle pinching-out lithological units and lenticular bodies more easily, while most hydrodynamic software packages cannot handle flow units related to such model layers. Interpretation of tectonic disturbance is similar. In Hungary groundwater is extracted mainly from Pleistocene and Pannonian aquifers, the sediments of which were deposited in the ancient Pannonian Lake. When the basin lost its open-marine connection, eustasy had no direct effect on facies changes; therefore subsidence and sediment supply became the main factors. Various basin-filling facies developed, including alluvial plain facies, different delta facies types and pelitic deep-basin facies. Creating solid models based on sequence stratigraphic methods requires more raw data, a genetic approach and more working hours; hence this method is seldom used in practice. Lithology-based models can be transformed into sequence stratigraphic models by extending the database (e.g. incorporating additional survey data). In environments where the obtained models differ significantly, notable changes can occur in the supply directions and in the groundwater travel times of the two models, even under equal extraction terms. Our study aims to call attention to the consequences of using different solid models for typical depositional systems of the Great Hungarian Plain and to their effects on groundwater protection.

  20. Numerical Investigation of Flapwise-Torsional Vibration Model of a Smart Section Blade with Microtab

    DOE PAGES

    Li, Nailu; Balas, Mark J.; Yang, Hua; ...

    2015-01-01

    This paper presents a method to develop an aeroelastic model of a smart section blade equipped with a microtab. The model is suitable for potential passive vibration control study of the blade section in classic flutter. Equations of the model are described by the nondimensional flapwise and torsional vibration modes coupled with the aerodynamic model based on the Theodorsen theory and aerodynamic effects of the microtab based on wind tunnel experimental data. The aeroelastic model is validated using numerical data available in the literature and then utilized to analyze the microtab control capability on the flutter instability case and the divergence instability case. The effectiveness of the microtab is investigated with scenarios of different output controllers and actuation deployments for both instability cases. The numerical results show that the microtab can effectively suppress both vibration modes with an appropriate choice of the output feedback controller.

  1. ISS Plasma Interaction: Measurements and Modeling

    NASA Technical Reports Server (NTRS)

    Barsamian, H.; Mikatarian, R.; Alred, J.; Minow, J.; Koontz, S.

    2004-01-01

    Ionospheric plasma interaction effects on the International Space Station are discussed in the following paper. The large structure and high voltage arrays of the ISS represent a complex system interacting with LEO plasma. Discharge current measurements made by the Plasma Contactor Units and potential measurements made by the Floating Potential Probe delineate charging and magnetic induction effects on the ISS. Based on theoretical and physical understanding of the interaction phenomena, a model of ISS plasma interaction has been developed. The model includes magnetic induction effects, interaction of the high voltage solar arrays with ionospheric plasma, and accounts for other conductive areas on the ISS. Based on these phenomena, the Plasma Interaction Model has been developed. Limited verification of the model has been performed by comparison of Floating Potential Probe measurement data to simulations. The ISS plasma interaction model will be further tested and verified as measurements from the Floating Potential Measurement Unit become available, and construction of the ISS continues.

  2. Effective pollutant emission heights for atmospheric transport modelling based on real-world information.

    PubMed

    Pregger, Thomas; Friedrich, Rainer

    2009-02-01

    Emission data needed as input for the operation of atmospheric models should not only be spatially and temporally resolved. Another important feature is the effective emission height, which significantly influences modelled concentration values. Unfortunately this information, which is especially relevant for large point sources, is usually not available, and simple assumptions are often used in atmospheric models. As a contribution to improving knowledge on emission heights, this paper provides typical default values for the driving parameters stack height and flue gas temperature, velocity and flow rate for different industrial sources. The results were derived from an analysis of probably the most comprehensive database of real-world stack information existing in Europe, based on German industrial data. A bottom-up calculation of effective emission heights applying equations used for Gaussian dispersion models shows significant differences depending on source and air pollutant, and compared to approaches currently used for atmospheric transport modelling.
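A bottom-up effective-height calculation of this kind can be sketched with the Briggs buoyant plume-rise formula commonly used in Gaussian dispersion models: effective height = stack height + plume rise driven by the buoyancy flux. The formulation is one common choice rather than necessarily the paper's exact equations, and all stack parameters below are illustrative:

```python
import math

def effective_height(stack_h, exit_velocity, stack_radius,
                     gas_temp, air_temp, wind_speed, downwind_x):
    """Stack height plus Briggs transitional buoyant plume rise (SI units)."""
    g = 9.81
    # Buoyancy flux F (m^4/s^3) from exit velocity, radius and temperature excess
    f = g * exit_velocity * stack_radius ** 2 * (gas_temp - air_temp) / gas_temp
    # Briggs transitional plume rise at downwind distance x
    dh = 1.6 * f ** (1.0 / 3.0) * downwind_x ** (2.0 / 3.0) / wind_speed
    return stack_h + dh

h_eff = effective_height(stack_h=100.0, exit_velocity=15.0, stack_radius=1.5,
                         gas_temp=420.0, air_temp=288.0, wind_speed=5.0,
                         downwind_x=500.0)
```

This is exactly why the paper's default values for stack height, flue gas temperature, velocity and flow rate matter: they feed directly into the buoyancy flux and hence the effective height seen by the transport model.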

  3. Numerical Investigation of Flapwise-Torsional Vibration Model of a Smart Section Blade with Microtab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Nailu; Balas, Mark J.; Yang, Hua

    2015-01-01

    This study presents a method to develop an aeroelastic model of a smart section blade equipped with a microtab. The model is suitable for potential passive vibration control study of the blade section in classic flutter. Equations of the model are described by the nondimensional flapwise and torsional vibration modes coupled with the aerodynamic model based on the Theodorsen theory and aerodynamic effects of the microtab based on wind tunnel experimental data. The aeroelastic model is validated using numerical data available in the literature and then utilized to analyze the microtab control capability on the flutter instability case and the divergence instability case. The effectiveness of the microtab is investigated with scenarios of different output controllers and actuation deployments for both instability cases. The numerical results show that the microtab can effectively suppress both vibration modes with an appropriate choice of the output feedback controller.

  4. Base of Principal Aquifer for the Elkhorn-Loup Model Area, North-Central Nebraska

    USGS Publications Warehouse

    McGuire, V.L.; Peterson, Steven M.

    2008-01-01

    In Nebraska, the water managers in the Natural Resources Districts and the Nebraska Department of Natural Resources are concerned with the effect of ground-water withdrawal on the availability of surface water and the long-term effects of ground-water withdrawal on ground- and surface-water resources. In north-central Nebraska, in the Elkhorn and Loup River Basins, ground water is used for irrigation, domestic supply, and public supply; surface water is used in this area for irrigation, recreation, and hydropower production. In recognition of these sometimes competing ground- and surface-water uses in the Elkhorn and Loup River Basins, the U.S. Geological Survey, the Lewis and Clark Natural Resources District, the Lower Elkhorn Natural Resources District, the Lower Loup Natural Resources District, the Lower Niobrara Natural Resources District, the Lower Platte North Natural Resources District, the Middle Niobrara Natural Resources District, the Upper Elkhorn Natural Resources District, and the Upper Loup Natural Resources District agreed to cooperatively study water resources in the Elkhorn and Loup River Basins. The goals of the overall study were to construct and calibrate a regional ground-water flow model of the area and to use that flow model as a tool to assess current and future effects of ground-water irrigation on stream base flow and to help develop long-term water-resource management strategies for this area, hereafter referred to as the Elkhorn-Loup model area. The Elkhorn-Loup model area covers approximately 30,800 square miles, and extends from the Niobrara River in the north to the Platte River in the south. The western boundary of the Elkhorn-Loup model area coincides with the western boundary of the Middle Niobrara, Twin Platte, and Upper Loup Natural Resources Districts; the eastern boundary coincides with the approximate location of the western extent of glacial till in eastern Nebraska. The principal aquifer in most of the Elkhorn-Loup model area is the High Plains aquifer; the principal aquifer in the remaining part of the Elkhorn-Loup model area is an unnamed alluvial aquifer. The upper surface of the geologic units that directly underlie the aquifer is called the 'base of aquifer' in this report. The geologic unit that forms the base of aquifer in the Elkhorn-Loup model area varies by location. The Tertiary-age Brule Formation generally is the base of aquifer in the west; the Cretaceous-age Pierre Shale generally is the base of aquifer in the east. The purpose of this report is to update the altitude and configuration of the base of the principal aquifer in the Elkhorn-Loup model area and a 2-mile buffer area around the Elkhorn-Loup model area, using base-of-aquifer data from test holes, registered water wells, and oil and gas wells within the Elkhorn-Loup model area and a 20-mile buffer area around the Elkhorn-Loup model area that have become available since the publication of earlier maps of the base of aquifer for this area. The base-of-aquifer map is important for the Elkhorn-Loup ground-water flow model because it defines the model's lower boundary. The accuracy of the Elkhorn-Loup ground-water flow model and the accuracy of the model's predictions about the effects of ground-water irrigation on stream base flow are directly related to the accuracy of the model's lower boundary.

  5. Pharmacokinetic-Pharmacodynamic Modeling of the Anti-Tumor Effect of Sunitinib Combined with Dopamine in the Human Non-Small Cell Lung Cancer Xenograft.

    PubMed

    Hao, Fangran; Wang, Siyuan; Zhu, Xiao; Xue, Junsheng; Li, Jingyun; Wang, Lijie; Li, Jian; Lu, Wei; Zhou, Tianyan

    2017-02-01

    To investigate the anti-tumor effect of sunitinib in combination with dopamine in the treatment of nu/nu nude mice bearing non-small cell lung cancer (NSCLC) A549 cells, and to develop the combination PK/PD model. Further, simulations were conducted to optimize the administration regimens. A PK/PD model was developed based on our preclinical experiment to explore the relationship between plasma concentration and drug effect quantitatively. Further, the model was evaluated and validated. By fixing the parameters obtained from the PK/PD model, simulations were built to predict the tumor suppression under various regimens. A synergistic effect was observed between sunitinib and dopamine in the study, which was confirmed by the effect constant (GAMA, estimated as 2.49). The enhanced potency of dopamine on sunitinib was exerted by an on/off effect in the PK/PD model. The optimal dose regimen was selected as sunitinib (120 mg/kg, q3d) in combination with dopamine (2 mg/kg, q3d) based on the simulation study. The synergistic effect of sunitinib and dopamine was demonstrated by the preclinical experiment and confirmed by the developed PK/PD model. In addition, the regimens were optimized by means of modeling as well as simulation, which may be conducive to clinical studies.
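The kind of simulation described, growth versus drug-driven kill with an "on/off" dopamine amplification of sunitinib's effect, can be sketched as a toy tumor-growth integration. The model structure, rate constants, and initial tumor weight below are hypothetical illustrations, not the paper's fitted PK/PD model; only the interaction constant value (GAMA = 2.49) and the q3d-style schedule are taken from the abstract:

```python
def simulate(dopa_on, days=21.0, dt=0.01, w0=0.2,
             k_grow=0.12, k_kill=0.08, gamma=2.49):
    """Euler-integrate tumor weight w under a net grow-minus-kill rate."""
    w, t = w0, 0.0
    for _ in range(int(round(days / dt))):
        # Sunitinib kill rate, amplified by gamma while dopamine is "on"
        effect = k_kill * (gamma if dopa_on(t) else 1.0)
        w += dt * (k_grow - effect) * w
        t += dt
    return w

q3d = lambda t: (t % 3.0) < 1.0      # dopamine "on" for 1 day in every 3
never = lambda t: False

w_combo = simulate(q3d)              # sunitinib + dopamine
w_mono = simulate(never)             # sunitinib alone
```

The comparison of w_combo and w_mono is the simulation-based regimen screening the abstract describes: the same machinery, run over candidate doses and schedules, is what selects the optimal regimen.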

  6. Estimating wildfire behavior and effects

    Treesearch

    Frank A. Albini

    1976-01-01

    This paper presents a brief survey of the research literature on wildfire behavior and effects and assembles formulae and graphical computation aids based on selected theoretical and empirical models. The uses of mathematical fire behavior models are discussed, and the general capabilities and limitations of currently available models are outlined.

  7. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    PubMed

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

    Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD, spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model achieves a CV result (R² = 0.81) reflecting higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
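A minimal non-Bayesian sketch of the same idea, assuming synthetic data rather than the study's covariates: a Gaussian process regression of PM2.5 on AOD plus spatial coordinates, with an anisotropic covariance over space and AOD. scikit-learn's marginal-likelihood fit stands in for the paper's Bayesian hierarchical estimation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
# Columns: [longitude, latitude, AOD] for 150 hypothetical grid cells in China.
X = rng.uniform([110.0, 30.0, 0.1], [120.0, 40.0, 1.5], size=(150, 3))
pm25 = 20.0 + 40.0 * X[:, 2] + 2.0 * np.sin(X[:, 0]) + rng.normal(0.0, 2.0, 150)

# Anisotropic RBF: separate length scales for the two spatial axes and AOD,
# plus a white-noise term playing the role of the non-spatial random effect.
kernel = 1.0 * RBF(length_scale=[5.0, 5.0, 0.5]) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, pm25)
r2 = gpr.score(X, pm25)
```

The Bayesian hierarchical version places priors on the kernel hyperparameters and random effects instead of maximizing the marginal likelihood, which is where the reported CV gains over GWR and LME come from.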

  8. Development and validation of quasi-steady-state heat pump water heater model having stratified water tank and wrapped-tank condenser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Bo; Nawaz, Kashif; Baxter, Van D.

    Heat pump water heater systems (HPWH) introduce new challenges for design and modeling tools, because they require a vapor compression system balanced with a water storage tank. In addition, a wrapped-tank condenser coil has strong coupling with a stratified water tank, which makes HPWH simulation a transient process. To tackle these challenges and deliver an effective, hardware-based HPWH equipment design tool, a quasi-steady-state HPWH model was developed based on the DOE/ORNL Heat Pump Design Model (HPDM). Two new component models were added via this study. One is a one-dimensional stratified water tank model, an improvement to the open-source EnergyPlus water tank model, by introducing a calibration factor to account for the bulk mixing effect due to water draws, circulations, etc. The other is a wrapped-tank condenser coil model, using a segment-to-segment modeling approach. The HPWH system model was validated against available experimental data. After that, the model was used for parametric simulations to determine the effects of various design factors.
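A hedged sketch of the one-dimensional stratified tank idea: each node's temperature evolves by an explicit energy balance with a per-node wrapped-condenser heat input, and a mixing term mimics the calibration factor described above. Node count, heat inputs, and the mixing coefficient are all illustrative, not the HPDM model's values:

```python
import numpy as np

def step_tank(temps, q_in, dt, m=30.0, cp=4186.0, k_mix=0.05):
    """One explicit step: per-node condenser heat input plus bulk-mixing smoothing."""
    t = temps + q_in * dt / (m * cp)     # wrapped-condenser heat into each node
    lap = np.empty_like(t)
    lap[1:-1] = t[2:] - 2.0 * t[1:-1] + t[:-2]
    lap[0] = t[1] - t[0]                 # insulated top boundary
    lap[-1] = t[-2] - t[-1]              # insulated bottom boundary
    return t + k_mix * lap               # calibration-style mixing factor

# Six nodes, hotter at the top; the condenser wraps the lower half of the tank.
temps = np.array([55.0, 53.0, 51.0, 45.0, 42.0, 40.0])
q_in = np.array([0.0, 0.0, 0.0, 300.0, 300.0, 300.0])   # W delivered per node
for _ in range(60):                      # one hour of 60 s steps
    temps = step_tank(temps, q_in, dt=60.0)
```

The segment-to-segment condenser model in the study would replace the fixed q_in vector with heat rates computed per coil segment from refrigerant-side conditions, which is what couples the tank to the vapor compression cycle.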

  9. Development and validation of quasi-steady-state heat pump water heater model having stratified water tank and wrapped-tank condenser

    DOE PAGES

    Shen, Bo; Nawaz, Kashif; Baxter, Van D.; ...

    2017-10-31

    Heat pump water heater systems (HPWH) introduce new challenges for design and modeling tools, because they require a vapor compression system balanced with a water storage tank. In addition, a wrapped-tank condenser coil has strong coupling with a stratified water tank, which makes HPWH simulation a transient process. To tackle these challenges and deliver an effective, hardware-based HPWH equipment design tool, a quasi-steady-state HPWH model was developed based on the DOE/ORNL Heat Pump Design Model (HPDM). Two new component models were added via this study. One is a one-dimensional stratified water tank model, an improvement to the open-source EnergyPlus water tank model, by introducing a calibration factor to account for the bulk mixing effect due to water draws, circulations, etc. The other is a wrapped-tank condenser coil model, using a segment-to-segment modeling approach. The HPWH system model was validated against available experimental data. After that, the model was used for parametric simulations to determine the effects of various design factors.

  10. Strategic directions for agent-based modeling: avoiding the YAAWN syndrome

    PubMed Central

    O’Sullivan, David; Evans, Tom; Manson, Steven; Metcalf, Sara; Ligmann-Zielinska, Arika; Bone, Chris

    2015-01-01

    In this short communication, we examine how agent-based modeling has become common in land change science and is increasingly used to develop case studies for particular times and places. There is a danger that the research community is missing a prime opportunity to learn broader lessons from the use of agent-based modeling (ABM), or at the very least not sharing these lessons more widely. How do we find an appropriate balance between empirically rich, realistic models and simpler theoretically grounded models? What are appropriate and effective approaches to model evaluation in light of uncertainties not only in model parameters but also in model structure? How can we best explore hybrid model structures that enable us to better understand the dynamics of the systems under study, recognizing that no single approach is best suited to this task? Under what circumstances – in terms of model complexity, model evaluation, and model structure – can ABMs be used most effectively to lead to new insight for stakeholders? We explore these questions in the hope of helping the growing community of land change scientists using models in their research to move from ‘yet another model’ to doing better science with models. PMID:27158257

  11. Anisotropic failure and size effects in periodic honeycomb materials: A gradient-elasticity approach

    NASA Astrophysics Data System (ADS)

    Réthoré, Julien; Dang, Thi Bach Tuyet; Kaltenbrunner, Christine

    2017-02-01

    This paper proposes a fracture mechanics model for the analysis of crack propagation in periodic honeycomb materials. The model is based on gradient-elasticity which enables us to account for the effect of the material structure at the macroscopic scale. For simulating the propagation of cracks along an arbitrary path, the numerical implementation is elaborated based on an extended finite element method with the required level of continuity. The two main features captured by the model are directionality and size effect. The numerical predictions are consistent with experimental results on honeycomb materials but also with results reported in the literature for microstructurally short cracks in metals.

  12. Multiscaling Edge Effects in an Agent-based Money Emergence Model

    NASA Astrophysics Data System (ADS)

    Oświęcimka, P.; Drożdż, S.; Gębarowski, R.; Górski, A. Z.; Kwapień, J.

    An agent-based computational economic toy model for the emergence of money from initial barter trading, inspired by Menger's postulate that money can spontaneously emerge in a commodity-exchange economy, is extensively studied. The model, while manageable, is significantly complex. It is able to reveal phenomena that can be interpreted as the emergence and collapse of money, as well as related competition effects. In particular, it is shown that, as an additional emergent effect, money lifetimes near the critical threshold value develop multiscaling, which allows one to draw parallels to critical phenomena and, thus, to real financial markets.

  13. Model identification methodology for fluid-based inerters

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping, and simplicity of design. To improve understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established that derives the topological arrangement of the mechanical networks by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed that separates the identification of friction, stiffness and the various damping effects. Furthermore, an experimental set-up is introduced in which two pressure gauges accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype, and the sources of the remaining discrepancies are further analysed.
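
    The defining property above can be stated in a few lines. The sketch below is illustrative only (the function name and the numerical values are hypothetical, not taken from the paper): the ideal inerter law F = b(a1 - a2), with inertance b in kilograms.

```python
# Minimal sketch of the ideal inerter law: force across the terminals is
# proportional to their relative acceleration. Names and values are
# hypothetical illustrations, not the paper's model.

def inerter_force(inertance_b, accel_terminal_1, accel_terminal_2):
    """Ideal inerter: F = b * (a1 - a2), with inertance b in kg."""
    return inertance_b * (accel_terminal_1 - accel_terminal_2)

# Example: 100 kg of inertance with a 0.5 m/s^2 relative acceleration
force = inerter_force(100.0, 1.5, 1.0)
print(force)  # 50.0 N across the terminals
```

    The identified fluid-inerter models in the paper add damping and stiffness branches on top of this ideal element.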

  14. Cancer Survivorship Care: Person Centered Care in a Multidisciplinary Shared Care Model.

    PubMed

    Loonen, Jacqueline J; Blijlevens, Nicole Ma; Prins, Judith; Dona, Desiree Js; Den Hartogh, Jaap; Senden, Theo; van Dulmen-Den Broeder, Eline; van der Velden, Koos; Hermens, Rosella Pmg

    2018-01-16

    Survivors of childhood and adult-onset cancer are at lifelong risk of developing late effects of treatment that can lead to serious morbidity and premature mortality. Regular long-term follow-up aimed at the prevention, early detection, and treatment of late effects can preserve or improve health. The heterogeneous and often serious character of late effects emphasizes the need for specialized cancer survivorship care clinics. Multidisciplinary cancer survivorship care requires a coordinated and well-integrated health care environment for risk-based screening and intervention. In addition, survivors' engagement and adherence to the recommendations are important elements. We developed an innovative model for integrated care for cancer survivors, the "Personalized Cancer Survivorship Care Model", that is being used in our clinic. This model comprises (1) personalized follow-up care according to the principles of person-centered care, aiming to empower survivors and to support self-management, and (2) organization according to a multidisciplinary and risk-based approach. The concept of person-centered care is based on three components: initiating, integrating, and safeguarding the partnership with the patient. This model has been developed as a universal model of care intended to work for all cancer survivors in different health care systems. It could be used in studies to improve self-efficacy and the cost-effectiveness of cancer survivorship care.

  15. A Physiologically Based, Multi-Scale Model of Skeletal Muscle Structure and Function

    PubMed Central

    Röhrle, O.; Davidson, J. B.; Pullan, A. J.

    2012-01-01

    Models of skeletal muscle can be classified as phenomenological or biophysical. Phenomenological models predict the muscle’s response to a specified input based on experimental measurements. Prominent phenomenological models are the Hill-type muscle models, which have been incorporated into rigid-body modeling frameworks, and three-dimensional continuum-mechanical models. Biophysically based models attempt to predict the muscle’s response as emerging from the underlying physiology of the system. In this contribution, the conventional biophysically based modeling methodology is extended to include several structural and functional characteristics of skeletal muscle. The result is a physiologically based, multi-scale skeletal muscle finite element model that is capable of representing detailed, geometrical descriptions of skeletal muscle fibers and their grouping. Together with a well-established model of motor-unit recruitment, the electro-physiological behavior of single muscle fibers within motor units is computed and linked to a continuum-mechanical constitutive law. The bridging between the cellular level and the organ level has been achieved via a multi-scale constitutive law and homogenization. The effect of homogenization has been investigated by varying the number of embedded skeletal muscle fibers and/or motor units and computing the resulting exerted muscle forces while applying the same excitatory input. All simulations were conducted using an anatomically realistic finite element model of the tibialis anterior muscle. Given the fact that the underlying electro-physiological cellular muscle model is capable of modeling metabolic fatigue effects such as potassium accumulation in the T-tubular space and inorganic phosphate build-up, the proposed framework provides a novel simulation-based way to investigate muscle behavior ranging from motor-unit recruitment to force generation and fatigue. PMID:22993509

  16. First-order fire effects models for land Management: Overview and issues

    Treesearch

    Elizabeth D. Reinhardt; Matthew B. Dickinson

    2010-01-01

    We give an overview of the science application process at work in supporting fire management. First-order fire effects models, such as those discussed in accompanying papers, are the building blocks of software systems designed for application to landscapes over time scales from days to centuries. Fire effects may be modeled using empirical, rule-based, or process...

  17. Tactical Games Model and Its Effects on Student Physical Activity and Gameplay Performance in Secondary Physical Education

    ERIC Educational Resources Information Center

    Hodges, Michael; Wicke, Jason; Flores-Marti, Ismael

    2018-01-01

    Many have examined game-based instructional models, though few have examined the effects of the Tactical Games Model (TGM) on secondary-aged students. Therefore, this study examined the effects TGM has on secondary students' physical activity (PA) and gameplay performance (GPP) in three secondary schools. Physical education teachers (N = 3) were…

  18. Using agent based modeling to assess the effect of increased Bus Rapid Transit system infrastructure on walking for transportation.

    PubMed

    Lemoine, Pablo D; Cordovez, Juan Manuel; Zambrano, Juan Manuel; Sarmiento, Olga L; Meisel, Jose D; Valdivia, Juan Alejandro; Zarama, Roberto

    2016-07-01

    The effect of transport infrastructure on walking is of interest to researchers because it provides an opportunity, from the public policy point of view, to increase physical activity (PA). We use an agent-based model (ABM) to examine the effect of transport infrastructure on walking, with particular attention to the effect of the growth of the Bus Rapid Transit (BRT) system in Bogotá on walking. In the ABM, agents are assigned a home, a work location, and a socioeconomic status (SES), based on which they are assigned an income for transportation. Individuals must decide among the available modes of transport (i.e., car, taxi, bus, BRT, and walking) as the means of reaching their destination, based on their resources and the travel time needed. We calibrated the model using Bogotá's 2011 mobility survey. The ABM results are consistent with previous empirical findings: increasing BRT access does indeed increase the number of minutes that individuals walk for transportation, although this effect also depends on the availability of other transport modes. The model indicates a saturation process: as more BRT lanes are added, the increment in minutes walked becomes smaller, and eventually the walking time decreases. Our findings on the potential contribution of the expansion of the BRT system to walking for transportation suggest that ABMs may prove helpful in designing policies to continue promoting walking. Copyright © 2016 Elsevier Inc. All rights reserved.
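
    As a rough illustration of the kind of mode-choice step such an ABM performs each trip, consider the sketch below. The decision rule, costs, and travel times are hypothetical stand-ins, not the calibrated rule from the paper.

```python
# Illustrative agent mode-choice step (hypothetical rule and values):
# each agent picks the fastest mode it can afford, falling back to walking.

def choose_mode(budget, options):
    """Pick the affordable mode with the shortest travel time.

    options: dict mapping mode -> (cost, travel_minutes).
    """
    affordable = {m: t for m, (c, t) in options.items() if c <= budget}
    if not affordable:
        return "walk"  # walking is always available
    return min(affordable, key=affordable.get)

options = {
    "car":  (9.0, 25),   # (cost, minutes): hypothetical example values
    "taxi": (7.0, 30),
    "bus":  (1.0, 55),
    "BRT":  (1.2, 40),
    "walk": (0.0, 90),
}
print(choose_mode(1.1, options))   # low budget: bus is the fastest affordable
print(choose_mode(10.0, options))  # high budget: car wins on time
```

    In a full ABM, expanding BRT coverage would shorten the BRT entry in each agent's option set, shifting choices (and the walking minutes attached to transit access) accordingly.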

  19. Motivating Teachers to Enact Free-Choice Project-Based Learning in Science and Technology (PBLSAT): Effects of a Professional Development Model

    ERIC Educational Resources Information Center

    Fallik, Orna; Eylon, Bat-Sheva; Rosenfeld, Sherman

    2008-01-01

    We investigated the effects of a long-term, continuous professional development (CPD) model designed to support teachers in enacting Project-Based Learning (PBLSAT). How do novice PBLSAT teachers view their acquisition of PBLSAT skills, and how do expert PBLSAT teachers, who enacted the program for 5-7 years, perceive the program? Novice teachers…

  20. Direct coupling of a genome-scale microbial in silico model and a groundwater reactive transport model.

    PubMed

    Fang, Yilin; Scheibe, Timothy D; Mahadevan, Radhakrishnan; Garg, Srinath; Long, Philip E; Lovley, Derek R

    2011-03-25

    The activity of microorganisms often plays an important role in dynamic natural attenuation or engineered bioremediation of subsurface contaminants such as chlorinated solvents, metals, and radionuclides. To evaluate and/or design bioremediated systems, quantitative reactive transport models are needed. State-of-the-art reactive transport models often ignore microbial effects or simulate them with a static growth yield and constant reaction rate parameters over the simulated conditions, while in reality microorganisms can dynamically modify their functionality (such as the utilization of alternative respiratory pathways) in response to spatial and temporal variations in environmental conditions. Constraint-based genome-scale microbial in silico models, using genomic data and multiple-pathway reaction networks, have been shown to be able to simulate the transient metabolism of some well-studied microorganisms and to identify the growth rate, substrate uptake rates, and byproduct rates under different growth conditions. These rates can be identified and used to replace specific microbially-mediated reaction rates in a reactive transport model, using local geochemical conditions as constraints. We previously demonstrated the potential utility of integrating a constraint-based microbial metabolism model with a reactive transport simulator as applied to the bioremediation of uranium in groundwater. However, that work relied on an indirect coupling approach that was effective for an initial demonstration but may not be extensible to more complex problems of significant interest (e.g., communities of microbial species and multiple constraining variables). Here, we extend that work by presenting and demonstrating a method of directly integrating a reactive transport model (FORTRAN code) with constraint-based in silico models solved with the IBM ILOG CPLEX linear optimizer base system (C library). The models were integrated with BABEL, a language interoperability tool. The modeling system is designed in such a way that constraint-based models targeting different microorganisms or competing organism communities can easily be plugged into the system. Constraint-based modeling is very costly given the size of a genome-scale reaction network; to save computation time, a binary tree is traversed to examine the concentration and solution pool generated during the simulation in order to decide whether the constraint-based model should be called. We also show preliminary results from the integrated model, including a comparison of the direct and indirect coupling approaches, and evaluate the ability of the approach to simulate a field experiment. Published by Elsevier B.V.

  1. SPH modelling of depth-limited turbulent open channel flows over rough boundaries.

    PubMed

    Kazemi, Ehsan; Nichols, Andrew; Tait, Simon; Shao, Songdong

    2017-01-10

    A numerical model based on the smoothed particle hydrodynamics method is developed to simulate depth-limited turbulent open channel flows over hydraulically rough beds. The 2D Lagrangian form of the Navier-Stokes equations is solved, in which a drag-based formulation is used based on an effective roughness zone near the bed to account for the roughness effect of bed spheres and an improved sub-particle-scale model is applied to account for the effect of turbulence. The sub-particle-scale model is constructed based on the mixing-length assumption rather than the standard Smagorinsky approach to compute the eddy-viscosity. A robust in/out-flow boundary technique is also proposed to achieve stable uniform flow conditions at the inlet and outlet boundaries where the flow characteristics are unknown. The model is applied to simulate uniform open channel flows over a rough bed composed of regular spheres and validated by experimental velocity data. To investigate the influence of the bed roughness on different flow conditions, data from 12 experimental tests with different bed slopes and uniform water depths are simulated, and a good agreement has been observed between the model and experimental results of the streamwise velocity and turbulent shear stress. This shows that both the roughness effect and flow turbulence should be addressed in order to simulate the correct mechanisms of turbulent flow over a rough bed boundary and that the presented smoothed particle hydrodynamics model accomplishes this successfully. © 2016 The Authors International Journal for Numerical Methods in Fluids Published by John Wiley & Sons Ltd.
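
    The mixing-length closure mentioned above can be stated compactly. The sketch below shows the classical Prandtl form, nu_t = l_m^2 |du/dy| with l_m = kappa*y; the paper's sub-particle-scale formulation may differ in detail, and the numerical values are hypothetical.

```python
# Sketch of the standard Prandtl mixing-length eddy-viscosity closure
# (illustrative; not necessarily the paper's exact SPS formulation).

KAPPA = 0.41  # von Karman constant

def eddy_viscosity(y, dudy, kappa=KAPPA):
    """Mixing-length eddy viscosity: nu_t = (kappa*y)^2 * |du/dy|."""
    l_m = kappa * y          # mixing length grows linearly with wall distance
    return l_m * l_m * abs(dudy)

# Example: 0.05 m above the bed with a shear rate of 10 1/s
nu_t = eddy_viscosity(0.05, 10.0)
print(nu_t)
```

    In contrast, the Smagorinsky approach ties the length scale to the particle spacing rather than the distance from the bed, which is one reason a mixing-length form can behave better in depth-limited flows.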

  2. Suicide in the media: a quantitative review of studies based on non-fictional stories.

    PubMed

    Stack, Steven

    2005-04-01

    Research on the effect of suicide stories in the media on suicide in the real world has been marked by much debate and inconsistent findings. Recent narrative reviews have suggested that research based on nonfictional models is more apt to uncover imitative effects than research based on fictional models. There is, however, substantial variation in media effects within the research restricted to nonfictional accounts of suicide. The present analysis provides some explanations of the variation in findings in the work on nonfictional media. Logistic regression techniques applied to 419 findings from 55 studies determined that: (1) studies measuring the presence of either an entertainment or political celebrity were 5.27 times more likely to find a copycat effect, (2) studies focusing on stories that stressed negative definitions of suicide were 99% less likely to report a copycat effect, (3) research based on television stories (which receive less coverage than print stories) was 79% less likely to find a copycat effect, and (4) studies focusing on female suicide were 4.89 times more likely to report a copycat effect than other studies. The full logistic regression model correctly classified 77.3% of the findings from the 55 studies. Methodological differences among studies are associated with discrepancies in their results.
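
    The multipliers reported above (5.27 times more likely, 99% less likely, and so on) are odds ratios, which logistic regression yields as the exponential of a fitted coefficient. A minimal sketch of that conversion follows; the coefficients are back-computed from the reported ratios purely for illustration, not taken from the study's fitted model.

```python
import math

# Sketch: logistic-regression coefficients map to odds ratios via exp(b).

def odds_ratio(coef):
    """Odds ratio for a one-unit change in a predictor: exp(coef)."""
    return math.exp(coef)

# Back-derived coefficients reproducing the review's reported ratios:
print(round(odds_ratio(math.log(5.27)), 2))  # celebrity-story effect
print(round(odds_ratio(math.log(0.01)), 2))  # roughly "99% less likely"
print(odds_ratio(0.0))                       # coefficient 0 -> no effect
```

    An odds ratio below 1 (as for negative-definition or television stories) means the predictor reduces the odds of finding a copycat effect.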

  3. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert systems and which are better left to the procedural software, and work closely with both the hardware and the software teams from the beginning of a project to produce a well-designed and thoroughly integrated application.

  4. The effect of different distance measures in detecting outliers using clustering-based algorithm for circular regression model

    NASA Astrophysics Data System (ADS)

    Di, Nur Faraidah Muhammad; Satari, Siti Zanariah

    2017-05-01

    Outlier detection in linear data sets has been studied vigorously, but only a small amount of work has been done on outlier detection in circular data. In this study, we propose a method for detecting multiple outliers in circular regression models based on a clustering algorithm. Clustering techniques fundamentally rely on a distance measure to define the distance between data points. Here, we introduce a similarity distance based on the Euclidean distance for the circular model and obtain a cluster tree using the single-linkage clustering algorithm. Then, a stopping rule for the cluster tree, based on the mean direction and circular standard deviation of the tree height, is proposed. We classify cluster groups that exceed the stopping rule as potential outliers. Our aim is to demonstrate the effectiveness of the proposed algorithms with the similarity distances in detecting the outliers. It is found that the proposed methods perform well and are applicable to circular regression models.

  5. Toward refined environmental scenarios for ecological risk assessment of down-the-drain chemicals in freshwater environments.

    PubMed

    Franco, Antonio; Price, Oliver R; Marshall, Stuart; Jolliet, Olivier; Van den Brink, Paul J; Rico, Andreu; Focks, Andreas; De Laender, Frederik; Ashauer, Roman

    2017-03-01

    Current regulatory practice for chemical risk assessment suffers from a lack of realism in conventional frameworks. Despite significant advances in exposure and ecological effect modeling, the implementation of novel approaches as high-tier options for prospective regulatory risk assessment remains limited, particularly among general chemicals such as down-the-drain ingredients. While reviewing the current state of the art in environmental exposure and ecological effect modeling, we propose a scenario-based framework that enables a better integration of exposure and effect assessments in a tiered approach. Global- to catchment-scale spatially explicit exposure models can be used to identify areas of higher exposure and to generate ecologically relevant exposure information for input into effect models. Numerous examples of mechanistic ecological effect models demonstrate that it is technically feasible to extrapolate from individual-level effects to effects at higher levels of biological organization and from laboratory to environmental conditions. However, the data required to parameterize effect models that can embrace the complexity of ecosystems are extensive and require a targeted approach. Experimental efforts should, therefore, focus on vulnerable species and/or traits and ecological conditions of relevance. We outline key research needs to address the challenges that currently hinder the practical application of advanced model-based approaches to risk assessment of down-the-drain chemicals. Integr Environ Assess Manag 2017;13:233-248. © 2016 SETAC.

  6. Thermodynamics-based models of transcriptional regulation with gene sequence.

    PubMed

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, which is computed using statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model yields more biological insight.
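
    To illustrate the general shape of such a model (not the paper's actual equations), a minimal thermodynamic scheme couples an equilibrium binding probability to an ODE for the transcript level x: dx/dt = beta * P_bound - gamma * x, with P_bound = [TF] / ([TF] + Kd). All parameter values below are hypothetical.

```python
# Illustrative sketch of a thermodynamics-driven transcription ODE
# (hypothetical parameters; not the model fitted in the paper).

def p_bound(tf_conc, kd):
    """Equilibrium binding probability from TF concentration and Kd."""
    return tf_conc / (tf_conc + kd)

def simulate(tf_conc, kd, beta, gamma, x0=0.0, dt=0.01, steps=5000):
    """Forward-Euler integration of dx/dt = beta * P_bound - gamma * x."""
    x = x0
    occ = p_bound(tf_conc, kd)
    for _ in range(steps):
        x += dt * (beta * occ - gamma * x)
    return x

# The steady state is x* = beta * P_bound / gamma = 3.0 * (2/3) / 0.5 = 4.0
x_final = simulate(tf_conc=2.0, kd=1.0, beta=3.0, gamma=0.5)
print(round(x_final, 3))
```

    Fitting such a model means inferring beta, gamma, and the sequence-dependent Kd from expression data, which is essentially what the paper's parameter identification does at larger scale.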

  7. Measuring Teacher Effectiveness through Hierarchical Linear Models: Exploring Predictors of Student Achievement and Truancy

    ERIC Educational Resources Information Center

    Subedi, Bidya Raj; Reese, Nancy; Powell, Randy

    2015-01-01

    This study explored significant predictors of students' Grade Point Average (GPA) and truancy (days absent), and also determined teacher effectiveness based on the proportion of variance explained at the teacher level of the model. We employed a two-level hierarchical linear model (HLM) with student and teacher data in the level-1 and level-2 models, respectively.…

  8. The Integrated Model of Sustainability Perspective in Spermatophyta Learning Based on Local Wisdom

    NASA Astrophysics Data System (ADS)

    Hartadiyati, E.; Rizqiyah, K.; Wiyanto; Rusilowati, A.; Prasetia, A. P. B.

    2017-09-01

    At present, culture is being diminished: the social order is shifting toward a generation that lacks wisdom and pro-sustainability values, and advances in science and technology are often applied unwisely, eroding local wisdom. It is therefore necessary to incorporate local wisdom into the school curriculum. This study aims to produce an integration model of sustainability perspectives based on local wisdom for spermatophyta material that is feasible and effective. The research follows define, design, and develop stages to build this model. The resulting product integrates socio-cultural, economic, and environmental sustainability perspectives, formulated through preventive, preservation, and building actions, for spermatophyta material consisting of identification and classification, metagenesis, and the role of spermatophytes in human life. The integration model of a sustainability perspective in learning about spermatophyta based on local wisdom proved effective in raising high school students' awareness of sustainability.

  9. Analysis of the contributions of ring current and electric field effects to the chemical shifts of RNA bases.

    PubMed

    Sahakyan, Aleksandr B; Vendruscolo, Michele

    2013-02-21

    Ring current and electric field effects can considerably influence NMR chemical shifts in biomolecules. Understanding such effects is particularly important for the development of accurate mappings between chemical shifts and the structures of nucleic acids. In this work, we first analyzed the Pople and the Haigh-Mallion models in terms of their ability to describe nitrogen base conjugated ring effects. We then created a database (DiBaseRNA) of three-dimensional arrangements of RNA base pairs from X-ray structures, calculated the corresponding chemical shifts via a hybrid density functional theory approach and used the results to parametrize the ring current and electric field effects in RNA bases. Next, we studied the coupling of the electric field and ring current effects for different inter-ring arrangements found in RNA bases using linear model fitting, with joint electric field and ring current, as well as only electric field and only ring current approximations. Taken together, our results provide a characterization of the interdependence of ring current and electric field geometric factors, which is shown to be especially important for the chemical shifts of non-hydrogen atoms in RNA bases.
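
    For orientation, the Pople model mentioned above treats the ring current as a point magnetic dipole at the ring center, so the shift contribution scales with a geometric factor G = (1 - 3 cos^2(theta)) / r^3, where r is the distance from the ring center and theta the angle from the ring normal. The sketch below computes only this standard geometric factor; the intensity and parametrization fitted in the paper are not reproduced here.

```python
import math

# Hedged sketch of the point-dipole (Pople) ring-current geometric factor.
# The overall shift would be an intensity constant times this factor; that
# constant (parametrized in the paper per base type) is omitted here.

def pople_geometric_factor(r, theta):
    """G = (1 - 3*cos^2(theta)) / r^3 for the point-dipole model."""
    return (1.0 - 3.0 * math.cos(theta) ** 2) / r ** 3

# A nucleus directly above the ring center (theta = 0) is shielded,
# while one in the ring plane (theta = pi/2) is deshielded:
above = pople_geometric_factor(3.0, 0.0)             # negative: upfield
in_plane = pople_geometric_factor(3.0, math.pi / 2)  # positive: downfield
print(above < 0 < in_plane)  # True
```

    This sign change with geometry is why the relative arrangement of stacked bases, captured in the DiBaseRNA database, matters so much for the observed chemical shifts.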

  10. A systematic review on the anxiolytic effects of aromatherapy on rodents under experimentally induced anxiety models.

    PubMed

    Tsang, Hector W H; Ho, Timothy Y C

    2010-01-01

    We reviewed studies from 1999 to 2009 on the anxiolytic effects of different essential oils on rodents in anxiety-related behavioral models. Journal papers that evaluated the anxiolytic effects of essential oils in rodents were extracted from the available electronic databases. The results, based on 14 studies, showed that different rodent species were used, including ICR mice and Swiss mice. Most of the studies applied the Elevated Plus Maze (EPM) as the animal behavioral model. Lavender oil was the most commonly studied among the 14 studies, and lavender and rose oils were found to be effective in some of them. Only one study reported the underlying neurophysiological mechanism in terms of the concentrations of emotion-related neurotransmitters, such as dopamine, serotonin, and their derivatives, in various brain regions. Some essential oils are thus found to induce anxiolytic effects in rodents under different animal anxiety models; however, more standardized experimental procedures and outcome measures are needed in future studies. Translational research in human subjects is also recommended.

  11. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics; and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research on the latter frontier, count data regression modeling, has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive, and thus underdeveloped, in the context of software reliability modeling. In this paper, we address these issues by introducing a novel Bayesian regression model for count data based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of the training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and demonstrate its effectiveness using publicly available benchmark data sets.
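
    For background, the classical count-data baseline that such max-margin Bayesian models extend is ordinary Poisson regression, where log E[y] is linear in the features. The self-contained sketch below fits a one-feature Poisson regression by Newton-Raphson on synthetic defect counts; it is the textbook baseline, not the paper's model or data.

```python
import math

# Sketch: maximum-likelihood Poisson regression, log E[y] = b0 + b1*x,
# fitted by Newton-Raphson. Data below are synthetic defect counts.

def fit_poisson(xs, ys, iters=25):
    """Fit a one-feature Poisson regression by Newton's method."""
    b0, b1 = math.log(sum(ys) / len(ys)), 0.0  # moment-matched start
    for _ in range(iters):
        # Gradient and Hessian of the Poisson log-likelihood
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu
            g1 += (y - mu) * x
            h00 += mu
            h01 += mu * x
            h11 += mu * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det  # Newton step: H^{-1} * gradient
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1, 2, 4, 9, 16]              # counts growing roughly exponentially
b0, b1 = fit_poisson(xs, ys)
print(round(b1, 2))                # fitted slope on the log scale
```

    The max-margin Bayesian treatment in the paper replaces this point estimate with full posterior updates, which is where the robustness to limited training data comes from.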

  12. Quantifying the indirect impacts of climate on agriculture: an inter-method comparison

    DOE PAGES

    Calvin, Kate; Fisher-Vanden, Karen

    2017-10-27

    Climate change and increases in CO2 concentration affect the productivity of land, with implications for land use, land cover, and agricultural production. Much of the literature on the effect of climate on agriculture has focused on linking projections of changes in climate to process-based or statistical crop models. However, the changes in productivity have broader economic implications that cannot be quantified in crop models alone. How important are these socio-economic feedbacks to a comprehensive assessment of the impacts of climate change on agriculture? In this paper, we attempt to measure the importance of these interaction effects through an inter-method comparison between process models, statistical models, and integrated assessment models (IAMs). We find the impacts on crop yields vary widely between these three modeling approaches. Yield impacts generated by the IAMs are 20%-40% higher than the yield impacts generated by process-based or statistical crop models, with indirect climate effects adjusting yields by between -12% and +15% (e.g. input substitution and crop switching). The remaining effects are due to technological change.

  15. Mechanistic modelling of the inhibitory effect of pH on microbial growth.

    PubMed

    Akkermans, Simen; Van Impe, Jan F

    2018-06-01

    Modelling and simulation of microbial dynamics as a function of processing, transportation and storage conditions is a useful tool to improve microbial food safety and quality. The goal of this research is to improve an existing methodology for building mechanistic predictive models based on the environmental conditions. The effect of environmental conditions on microbial dynamics is often described by combining the separate effects in a multiplicative way (gamma concept). This idea was extended further in this work by including the effects of the lag and stationary growth phases on the microbial growth rate as independent gamma factors. A mechanistic description of the stationary phase as a function of pH was included, based on a novel class of models that consider product inhibition. Experimental results on Escherichia coli growth dynamics indicated that the parameters of the product inhibition equations can also be modelled with the gamma approach. This work has extended a modelling methodology, resulting in predictive models that are (i) mechanistically inspired, (ii) easily identifiable with a limited work load and (iii) easily extended to additional environmental conditions. Copyright © 2017. Published by Elsevier Ltd.
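
    The gamma-concept structure the abstract extends can be sketched as a product of dimensionless inhibition factors, one per environmental condition, scaling an optimal specific growth rate. The cardinal-parameter forms below are standard in predictive microbiology; all parameter values are illustrative, not values from the paper.

```python
# Gamma concept sketch: specific growth rate = optimal rate times
# independent gamma factors in [0, 1], one per environmental condition.
# Cardinal parameter values below are illustrative assumptions.

def gamma_temperature(T, T_min=5.0, T_opt=37.0, T_max=47.0):
    """Cardinal temperature model with inflection (CTMI), in [0, 1]."""
    if not (T_min < T < T_max):
        return 0.0
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * ((T_opt - T_min) * (T - T_opt)
                             - (T_opt - T_max) * (T_opt + T_min - 2 * T))
    return num / den

def gamma_ph(pH, pH_min=4.0, pH_opt=7.0, pH_max=9.5):
    """Cardinal pH model, in [0, 1]."""
    if not (pH_min < pH < pH_max):
        return 0.0
    return ((pH - pH_min) * (pH - pH_max)
            / ((pH - pH_min) * (pH - pH_max) - (pH - pH_opt) ** 2))

def growth_rate(mu_opt, T, pH):
    """Specific growth rate: optimal rate scaled by the gamma factors."""
    return mu_opt * gamma_temperature(T) * gamma_ph(pH)
```

    The paper's extension adds further multiplicative factors for the lag and stationary phases in the same way.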

  16. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis.

    PubMed

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine

    2017-09-01

    To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSPs including masking approaches (supplemental screening for women with higher breast density) were not cost-effective alternatives, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSPs.
Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. A mixed model for the relationship between climate and human cranial form.

    PubMed

    Katz, David C; Grote, Mark N; Weaver, Timothy D

    2016-08-01

    We expand upon a multivariate mixed model from quantitative genetics in order to estimate the magnitude of climate effects in a global sample of recent human crania. In humans, genetic distances are correlated with distances based on cranial form, suggesting that population structure influences both genetic and quantitative trait variation. Studies controlling for this structure have demonstrated significant underlying associations of cranial distances with ecological distances derived from climate variables. However, to assess the biological importance of an ecological predictor, estimates of effect size and uncertainty in the original units of measurement are clearly preferable to significance claims based on units of distance. Unfortunately, the magnitudes of ecological effects are difficult to obtain with distance-based methods, while models that produce estimates of effect size generally do not scale to high-dimensional data like cranial shape and form. Using recent innovations that extend quantitative genetics mixed models to highly multivariate observations, we estimate morphological effects associated with a climate predictor for a subset of the Howells craniometric dataset. Several measurements, particularly those associated with cranial vault breadth, show a substantial linear association with climate, and the multivariate model incorporating a climate predictor is preferred in model comparison. Previous studies demonstrated the existence of a relationship between climate and cranial form. The mixed model quantifies this relationship concretely. Evolutionary questions that require population structure and phylogeny to be disentangled from potential drivers of selection may be particularly well addressed by mixed models. Am J Phys Anthropol 160:593-603, 2016. © 2015 Wiley Periodicals, Inc.

  18. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to the inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.

  19. Ferroelectric Field Effect Transistor Model Using Partitioned Ferroelectric Layer and Partial Polarization

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat D.

    2004-01-01

    A model of an n-channel ferroelectric field effect transistor has been developed based on both theoretical and empirical data. The model is based on an existing model that incorporates partitioning of the ferroelectric layer to calculate the polarization within the ferroelectric material. The model incorporates several new aspects that are useful to the user. It takes into account the effect of a non-saturating gate voltage only partially polarizing the ferroelectric material based on the existing remnant polarization. The model also incorporates the decay of the remnant polarization based on the time history of the FFET. A gate pulse of a specific voltage will not put the ferroelectric material into a single polarization state for that voltage; instead, the polarization varies with the previous state of the material and the time since the last change to the gate voltage. The model also utilizes data from FFETs made from different types of ferroelectric materials, allowing the user simply to input the material being used rather than recreate the entire model. The model also allows the user to input the quality of the ferroelectric material being used. The ferroelectric material quality can range from a theoretically perfect material with little loss and no decay to a less-than-perfect material with remnant losses and decay. This model is designed to be used by people who need to predict the external characteristics of an FFET before the time and expense of design and fabrication. It also allows the parametric evaluation of the quality of the ferroelectric film on the overall performance of the transistor.

  20. A meta-analytic review of school-based prevention for cannabis use.

    PubMed

    Porath-Waller, Amy J; Beasley, Erin; Beirness, Douglas J

    2010-10-01

    This investigation used meta-analytic techniques to evaluate the effectiveness of school-based prevention programming in reducing cannabis use among youth aged 12 to 19. It summarized the results from 15 studies published in peer-reviewed journals since 1999 and identified features that influenced program effectiveness. The results from the set of 15 studies indicated that these school-based programs had a positive impact on reducing students' cannabis use (d = 0.58, CI: 0.55, 0.62) compared to control conditions. Findings revealed that programs incorporating elements of several prevention models were significantly more effective than were those based on only a social influence model. Programs that were longer in duration (≥15 sessions) and facilitated by individuals other than teachers in an interactive manner also yielded stronger effects. The results also suggested that programs targeting high school students were more effective than were those aimed at middle-school students. Implications for school-based prevention programming are discussed.
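
    A summary effect like d = 0.58 (CI: 0.55, 0.62) is typically an inverse-variance-weighted pooling of study-level effect sizes. A minimal fixed-effect sketch of that pooling follows; the study values in the usage example are invented, not the review's data.

```python
import math

# Fixed-effect meta-analytic pooling: each study's effect size is
# weighted by the inverse of its variance, and the pooled standard
# error is derived from the total weight. Illustrative sketch only.

def pooled_effect(effects, variances, z=1.96):
    """Return (pooled d, CI lower, CI upper) under a fixed-effect model."""
    weights = [1.0 / v for v in variances]
    d = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return d, d - z * se, d + z * se

# Invented example: three studies with effects 0.5, 0.6, 0.7.
d, lo, hi = pooled_effect([0.5, 0.6, 0.7], [0.01, 0.02, 0.015])
```

    A random-effects pooling, often preferred when programs differ in design, would add a between-study variance component to each weight.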

  1. A Consideration of Factors Accounting for Goal Effectiveness: A Longitudinal Study.

    ERIC Educational Resources Information Center

    Stewart, James H.

    This research paper presents a model of organizational effectiveness based on the open system perspective and tests four hypotheses concerning organizational effectiveness factors. Organizational effectiveness can be defined as the extent to which a social system makes progress toward objectives based on the four phases of organizational…

  2. Turbulence Model Comparisons and Reynolds Number Effects Over a High-Speed Aircraft at Transonic Speeds

    NASA Technical Reports Server (NTRS)

    Rivers, Melissa B.; Wahls, Richard A.

    1999-01-01

    This paper gives the results of a grid study, a turbulence model study, and a Reynolds number effect study for transonic flows over a high-speed aircraft using the thin-layer, upwind, Navier-Stokes CFL3D code. The four turbulence models evaluated are the algebraic Baldwin-Lomax model with the Degani-Schiff modifications, the one-equation Baldwin-Barth model, the one-equation Spalart-Allmaras model, and Menter's two-equation Shear-Stress-Transport (SST) model. The flow conditions, which correspond to tests performed in the NASA Langley National Transonic Facility (NTF), are a Mach number of 0.90 and a Reynolds number of 30 million based on chord, over a range of angles of attack (1 to 10 degrees). For the Reynolds number effect study, Reynolds numbers of 10 and 80 million based on chord were also evaluated. Computed forces and surface pressures compare reasonably well with the experimental data for all four of the turbulence models. The Baldwin-Lomax model with the Degani-Schiff modifications and the one-equation Baldwin-Barth model show the best agreement with experiment overall. The Reynolds number effects are evaluated using the Baldwin-Lomax with the Degani-Schiff modifications and the Baldwin-Barth turbulence models. Five angles of attack were evaluated for the Reynolds number effect study at three different Reynolds numbers. More work is needed to determine the ability of CFL3D to accurately predict Reynolds number effects.

  3. The Use of Health Information Technology Within Collaborative and Integrated Models of Child Psychiatry Practice.

    PubMed

    Coffey, Sara; Vanderlip, Erik; Sarvet, Barry

    2017-01-01

    There is a consistent need for more child and adolescent psychiatrists. Despite increased recruitment of child and adolescent psychiatry trainees, traditional models of care will likely not be able to meet the need of youth with mental illness. Integrated care models focusing on population-based, team-based, measurement-based, and evidenced-based care have been effective in addressing accessibility and quality of care. These integrated models have specific needs regarding health information technology (HIT). HIT has been used in a variety of different ways in several integrated care models. HIT can aid in implementation of these models but is not without its challenges. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    PubMed Central

    Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs

    2018-01-01

    Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More specifically, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE) that only uses peak locations, fixed effects, and random effects meta-analysis that take into account both peak location and height] and the amount of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining them using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344

  5. Radiative transfer model for aerosols in infrared wavelengths for passive remote sensing applications.

    PubMed

    Ben-David, Avishai; Embury, Janon F; Davidson, Charles E

    2006-09-10

    A comprehensive analytical radiative transfer model for isothermal aerosols and vapors for passive infrared remote sensing applications (ground-based and airborne sensors) has been developed. The theoretical model illustrates the qualitative difference between an aerosol cloud and a chemical vapor cloud. The model is based on two and two/four stream approximations and includes thermal emission-absorption by the aerosols; scattering of diffused sky radiances incident from all sides on the aerosols (downwelling, upwelling, left, and right); and scattering of aerosol thermal emission. The model uses moderate resolution transmittance ambient atmospheric radiances as boundary conditions and provides analytical expressions for the information on the aerosol cloud that is contained in remote sensing measurements by using thermal contrasts between the aerosols and diffused sky radiances. Simulated measurements of a ground-based sensor viewing Bacillus subtilis var. niger bioaerosols and kaolin aerosols are given and discussed to illustrate the differences between a vapor-only model (i.e., only emission-absorption effects) and a complete model that adds aerosol scattering effects.

  6. A method of real-time fault diagnosis for power transformers based on vibration analysis

    NASA Astrophysics Data System (ADS)

    Hong, Kaixing; Huang, Hai; Zhou, Jianping; Shen, Yimin; Li, Yujie

    2015-11-01

    In this paper, a novel probability-based classification model is proposed for real-time fault detection of power transformers. First, the transformer vibration principle is introduced, and two effective feature extraction techniques are presented. Next, the details of the classification model based on support vector machine (SVM) are shown. The model also includes a binary decision tree (BDT) which divides transformers into different classes according to health state. The trained model produces posterior probabilities of membership to each predefined class for a tested vibration sample. During the experiments, the vibrations of transformers under different conditions are acquired, and the corresponding feature vectors are used to train the SVM classifiers. The effectiveness of this model is illustrated experimentally on typical in-service transformers. The consistency between the results of the proposed model and the actual condition of the test transformers indicates that the model can be used as a reliable method for transformer fault detection.
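
    The classifier arrangement described, binary decisions organized in a tree that yields posterior probabilities per health state, can be sketched independently of the SVM details: each internal node produces a branch probability, and a leaf class's posterior is the product of the probabilities along its path. The node functions, class names, and feature name below are stand-ins, not the paper's; in practice each node would be a trained SVM with probability outputs.

```python
# Binary decision tree of probabilistic binary classifiers. A tree is
# either a leaf (class label string) or (node_fn, left, right), where
# node_fn(sample) returns P(go left | sample). Stand-in sketch only.

def classify(tree, sample):
    """Walk the tree and return a {class: posterior} dict."""
    if isinstance(tree, str):          # leaf: a single class
        return {tree: 1.0}
    node_fn, left, right = tree
    p_left = node_fn(sample)
    posteriors = {}
    for cls, p in classify(left, sample).items():
        posteriors[cls] = posteriors.get(cls, 0.0) + p_left * p
    for cls, p in classify(right, sample).items():
        posteriors[cls] = posteriors.get(cls, 0.0) + (1.0 - p_left) * p
    return posteriors

# Toy tree: first separate normal from faulty, then the fault subtype.
# The lambdas stand in for trained per-node classifiers.
tree = (lambda s: 0.9 if s["vibration_rms"] < 1.0 else 0.2,
        "normal",
        (lambda s: 0.7, "winding_loose", "core_loose"))
```

    Because branch probabilities multiply along each path, the leaf posteriors always sum to one, which is the property the paper's model exploits for probability-based fault reporting.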

  7. Robust estimation of the proportion of treatment effect explained by surrogate marker information.

    PubMed

    Parast, Layla; McDermott, Mary M; Tian, Lu

    2016-05-10

    In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive obtainment procedures, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to make conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
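
    The quantity being estimated here, the proportion of treatment effect (PTE) explained by the surrogate, compares the overall effect on the primary outcome with the effect remaining after accounting for the surrogate. Below is a deliberately crude, model-free sketch using coarse surrogate stratification on toy data; the paper's nonparametric estimator is far more refined, and all data and the binning scheme are illustrative assumptions.

```python
from collections import defaultdict

# PTE sketch: 1 - (residual effect after stratifying on the surrogate)
#               / (overall treatment effect on the primary outcome).

def mean(xs):
    return sum(xs) / len(xs)

def pte(treated, control, n_bins=4):
    """treated/control: lists of (surrogate, outcome) pairs."""
    overall = mean([y for _, y in treated]) - mean([y for _, y in control])
    s_all = [s for s, _ in treated + control]
    lo, hi = min(s_all), max(s_all)
    width = (hi - lo) / n_bins or 1.0
    bin_of = lambda s: min(int((s - lo) / width), n_bins - 1)
    by_bin = defaultdict(lambda: ([], []))
    for s, y in treated:
        by_bin[bin_of(s)][0].append(y)
    for s, y in control:
        by_bin[bin_of(s)][1].append(y)
    residual, total = 0.0, 0
    for t_ys, c_ys in by_bin.values():
        if t_ys and c_ys:               # only bins with both arms
            w = len(t_ys) + len(c_ys)
            residual += w * (mean(t_ys) - mean(c_ys))
            total += w
    if total == 0:
        return float("nan")
    return 1.0 - (residual / total) / overall
```

    A perfect surrogate leaves no within-stratum treatment effect, so PTE approaches 1; a useless surrogate leaves the full effect, so PTE approaches 0.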

  8. Linear mixed model for heritability estimation that explicitly addresses environmental variation.

    PubMed

    Heckerman, David; Gurdasani, Deepti; Kadie, Carl; Pomilla, Cristina; Carstensen, Tommy; Martin, Hilary; Ekoru, Kenneth; Nsubuga, Rebecca N; Ssenyomo, Gerald; Kamali, Anatoli; Kaleebu, Pontiano; Widmer, Christian; Sandhu, Manjinder S

    2016-07-05

    The linear mixed model (LMM) is now routinely used to estimate heritability. Unfortunately, as we demonstrate, LMM estimates of heritability can be inflated when using a standard model. To help reduce this inflation, we used a more general LMM with two random effects: one based on genomic variants and one based on easily measured spatial location as a proxy for environmental effects. We investigated this approach with simulated data and with data from a Uganda cohort of 4,778 individuals for 34 phenotypes including anthropometric indices, blood factors, glycemic control, blood pressure, lipid tests, and liver function tests. For the genomic random effect, we used identity-by-descent estimates from accurately phased genome-wide data. For the environmental random effect, we constructed a covariance matrix based on a Gaussian radial basis function. Across the simulated and Ugandan data, narrow-sense heritability estimates were lower using the more general model. Thus, our approach addresses, in part, the issue of "missing heritability" in the sense that much of the heritability previously thought to be missing was fictional. Software is available at https://github.com/MicrosoftGenomics/FaST-LMM.
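
    The environmental covariance construction described above, a Gaussian radial basis function of pairwise spatial distance, can be sketched directly: nearby individuals share more environmental covariance, and the similarity decays smoothly with distance. The length scale below is an illustrative assumption; in practice it would be chosen during model fitting.

```python
import math

# Gaussian RBF covariance over spatial locations, as used for the
# environmental random effect. Illustrative sketch; the length scale
# is an assumed value, not one from the study.

def rbf_covariance(locations, length_scale=10.0):
    """K[i][j] = exp(-||loc_i - loc_j||^2 / (2 * length_scale^2))."""
    n = len(locations)
    K = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            d2 = sum((a - b) ** 2
                     for a, b in zip(locations[i], locations[j]))
            K[i][j] = math.exp(-d2 / (2.0 * length_scale ** 2))
    return K
```

    The resulting matrix is symmetric with unit diagonal, the standard requirements for a random-effect covariance in an LMM.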

  9. Exploring social structure effect on language evolution based on a computational model

    NASA Astrophysics Data System (ADS)

    Gong, Tao; Minett, James; Wang, William

    2008-06-01

    A compositionality-regularity coevolution model is adopted to explore the effect of social structure on language emergence and maintenance. Based on this model, we explore language evolution in three experiments, and discuss the role of a popular agent in language evolution, the relationship between mutual understanding and social hierarchy, and the effect of inter-community communications and that of simple linguistic features on convergence of communal languages in two communities. This work embodies several important interactions during social learning, and introduces a new approach that manipulates individuals' probabilities to participate in social interactions to study the effect of social structure. We hope it will stimulate further theoretical and empirical explorations on language evolution in a social environment.

  10. A probabilistic maintenance model for diesel engines

    NASA Astrophysics Data System (ADS)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from the inspection and maintenance histories of diesel engines and from experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.

  11. Effects of capillarity and microtopography on wetland specific yield

    USGS Publications Warehouse

    Sumner, D.M.

    2007-01-01

    Hydrologic models aid in describing water flows and levels in wetlands. Frequently, these models use a specific yield conceptualization to relate water flows to water level changes. Traditionally, a simple conceptualization of specific yield is used, composed of two constant values for above- and below-surface water levels and neglecting the effects of soil capillarity and land surface microtopography. The effects of capillarity and microtopography on specific yield were evaluated at three wetland sites in the Florida Everglades. The effect of capillarity on specific yield was incorporated based on the fillable pore space within a soil moisture profile at hydrostatic equilibrium with the water table. The effect of microtopography was based on areal averaging of topographically varying values of specific yield. The results indicate that a more physically-based conceptualization of specific yield incorporating capillary and microtopographic considerations can be substantially different from the traditional two-part conceptualization, and from simpler conceptualizations incorporating only capillarity or only microtopography. For the sites considered, traditional estimates of specific yield could under- or overestimate the more physically based estimates by a factor of two or more. The results suggest that consideration of both capillarity and microtopography is important to the formulation of specific yield in physically based hydrologic models of wetlands. © 2007, The Society of Wetland Scientists.
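
    The areal-averaging part of this formulation can be sketched simply: point specific yield is near 1 where the water table stands above the local land surface (open water) and takes a smaller soil value where it is below ground, and the wetland-scale value averages over the microtopographic elevation distribution, giving a smooth transition instead of the traditional two-value step. Capillary effects, which the study also incorporates, are omitted here for brevity; the soil value and elevations are illustrative assumptions.

```python
# Areal-averaged specific yield over microtopography: each point is
# open water (Sy ~ 1) or soil (Sy = sy_soil) depending on whether the
# water table is above its land-surface elevation. Capillarity omitted.

def areal_specific_yield(water_level, surface_elevations, sy_soil=0.25):
    """Average point specific yield over microtopographic elevations."""
    point_sy = [1.0 if water_level > z else sy_soil
                for z in surface_elevations]
    return sum(point_sy) / len(point_sy)
```

    As the water table rises through the range of surface elevations, the averaged value climbs gradually from the soil value toward 1, which is the smoothing effect the study attributes to microtopography.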

  12. The recovery model and complex health needs: what health psychology can learn from mental health and substance misuse service provision.

    PubMed

    Webb, Lucy

    2012-07-01

    This article reviews key arguments around evidence-based practice and outlines the methodological demands for effective adoption of recovery model principles. The recovery model is outlined and demonstrated as compatible with current needs in substance misuse service provision. However, the concepts of evidence-based practice and the recovery model are currently incompatible unless the current value system of evidence-based practice changes to accommodate the methodologies demanded by the recovery model. It is suggested that critical health psychology has an important role to play in widening the scope of evidence-based practice to better accommodate complex social health needs.

  13. An Evaluation of Alternative Functional Models of Narrative Schemata,

    DTIC Science & Technology

    1980-07-01

    propositions. This model, like models 2L-5L, predicts a levels effect for both recall and recognition due to differential initial encoding of... Table 1. Model 7L. Model 7L differs from 6L in assuming top-down search. This model thus predicts a levels effect in recall based on probabilistic... Bower (1980) failed to find a levels effect in recall of narratives. To compare our results to...

  14. Effects of Correctional-Based Programs for Female Inmates: A Systematic Review

    ERIC Educational Resources Information Center

    Tripodi, Stephen J.; Bledsoe, Sarah E.; Kim, Johnny S.; Bender, Kimberly

    2011-01-01

    Objective: To examine the effectiveness of interventions for incarcerated women. Method: The researchers use a two-model system: the risk-reduction model for studies analyzing interventions to reduce recidivism rates, and the enhancement model for studies that target psychological and physical well-being. Results: Incarcerated women who…

  15. A Quantitative Cost Effectiveness Model for Web-Supported Academic Instruction

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2006-01-01

    This paper describes a quantitative cost effectiveness model for Web-supported academic instruction. The model was designed for Web-supported instruction (rather than distance learning only) characterizing most of the traditional higher education institutions. It is based on empirical data (Web logs) of students' and instructors' usage…

  16. Learning Molecular Behaviour May Improve Student Explanatory Models of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Harris, Sara E.; Gold, Anne U.

    2018-01-01

    We assessed undergraduates' representations of the greenhouse effect, based on student-generated concept sketches, before and after a 30-min constructivist lesson. Principal component analysis of features in student sketches revealed seven distinct and coherent explanatory models including a new "Molecular Details" model. After the…

  17. Steric effects in the design of Co-Schiff base complexes for the catalytic oxidation of lignin models to para-benzoquinones

    Treesearch

    Berenger Biannic; Joseph J. Bozell; Thomas Elder

    2014-01-01

    New Co-Schiff base complexes that incorporate a sterically hindered ligand and an intramolecular bulky piperazine base in close proximity to the Co center are synthesized. Their utility as catalysts for the oxidation of para-substituted lignin model phenols with molecular oxygen is examined. Syringyl and guaiacyl alcohol, as models of S and G units in lignin, are...

  18. Spring hydrograph simulation of karstic aquifers: Impacts of variable recharge area, intermediate storage and memory effects

    NASA Astrophysics Data System (ADS)

    Hosseini, Seiyed Mossa; Ataie-Ashtiani, Behzad; Simmons, Craig T.

    2017-09-01

    A simple conceptual rainfall-runoff model is proposed for the estimation of groundwater balance components in complex karst aquifers. The proposed model investigates the effects of the memory length of the different karst flow systems (base-flow, intermediate-flow, and quick-flow) and of the time variation of recharge area (RA) during a hydrological year. The model consists of three sub-models: soil moisture balance (SMB), epikarst balance (EPB), and groundwater balance (GWB), which together simulate the daily spring discharge. The SMB and EPB sub-models utilize the mass conservation equation to compute the variation of moisture storages in the soil cover and epikarst, respectively. The GWB sub-model computes the spring discharge hydrograph through three parallel linear reservoirs for base-flow, intermediate-flow, and quick-flow. Three antecedent recharge indices are defined and embedded in the model structure to capture the memory effect of the three karst flow systems with respect to antecedent recharge. The Sasan karst aquifer, located in the semi-arid region of south-west Iran and having continuous long-term (21-year) daily meteorological and discharge records, is used to describe the model calibration and validation procedures. The effects of temporal variation of the RA of karst formations during the hydrological year, namely invariant RA, two RA (winter and summer), four RA (seasonal), and twelve RA (monthly), are assessed to determine their impact on model efficiency. Results indicated that the proposed model with monthly-variant RA is able to reproduce acceptable simulation results based on the modified Kling-Gupta efficiency (KGE = -0.83). The results of density-based global sensitivity analysis for the dry (June to September) and wet (October to May) periods reveal the dominant influence of RA (with sensitivity indices equal to 0.89 and 0.93, respectively) on the simulated spring discharge. 
The sensitivity of the simulated spring discharge to the memory effect of the different karst formations is greater during the dry period than during the wet period. In addition, the results reveal the important role of the intermediate-flow system in the hydrological modeling of karst systems during the wet period. Precise estimation of groundwater budgets, for better decision making regarding water supplies from complex karst systems with long memory effects, can be considerably improved by use of the proposed model.
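The three parallel linear reservoirs of the GWB sub-model are a standard hydrological building block and can be sketched in a few lines. In the sketch below, the residence times, the equal split of recharge among reservoirs, and the impulse-recharge scenario are illustrative assumptions, not the calibrated Sasan parameters.

```python
import numpy as np

def linear_reservoirs(recharge, k, dt=1.0):
    """Route a recharge series through parallel linear reservoirs.

    Each reservoir i obeys dS_i/dt = f_i * R(t) - S_i / k_i, with
    outflow Q_i = S_i / k_i; spring discharge is the sum of outflows.
    recharge : recharge volume per time step
    k        : residence times for quick-, intermediate-, base-flow
    """
    k = np.asarray(k, dtype=float)
    f = np.full(len(k), 1.0 / len(k))   # equal recharge split (assumption)
    s = np.zeros(len(k))                # storages start empty
    q_spring = np.empty(len(recharge))
    for t, r in enumerate(recharge):
        q = s / k                        # outflow of each reservoir
        s = s + (f * r - q) * dt         # explicit Euler water balance
        q_spring[t] = q.sum()
    return q_spring, s

# impulse of recharge followed by dry days: discharge recesses over time
rech = np.zeros(200)
rech[0] = 90.0
q, s_end = linear_reservoirs(rech, k=[3.0, 20.0, 120.0])
# mass balance: total recharge = discharge so far + water still in storage
print(rech.sum(), q.sum() + s_end.sum())
```

Superposing a fast and a slow reservoir in this way reproduces the steep rise and long recession tail typical of karst spring hydrographs.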

  19. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    PubMed

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  20. A Marginal Cost Based "Social Cost of Carbon" Provides Inappropriate Guidance in a World That Needs Rapid and Deep Decarbonization

    NASA Astrophysics Data System (ADS)

    Morgan, M. G.; Vaishnav, P.; Azevedo, I. L.; Dowlatabadi, H.

    2016-12-01

    Rising temperatures and changing precipitation patterns due to climate change are projected to alter many sectors of the US economy. A growing body of research has examined these effects in the energy, water, and agricultural sectors. Rising summer temperatures increase the demand for electricity. Changing precipitation patterns affect the availability of water for hydropower generation, thermo-electric cooling, irrigation, and municipal and industrial consumption. A combination of changes to temperature and precipitation alters crop yields and cost-effective farming practices. Although a significant body of research exists analyzing impacts to individual sectors, fewer studies examine the effects using a common set of assumptions (e.g., climatic and socio-economic) within a coupled modeling framework. The present analysis uses a multi-sector, multi-model framework with common input assumptions to assess the projected effects of climate change on energy, water, and land use in the United States. The analysis assesses the climate impacts across five global circulation models (GCMs) for representative concentration pathways (RCPs) of 8.5 and 4.5 W/m2. The energy sector models - Pacific Northwest National Lab's Global Change Assessment Model (GCAM) and the National Renewable Energy Laboratory's Regional Energy Deployment System (ReEDS) - show the effects of rising temperature on energy and electricity demand. Electricity supply in ReEDS is also affected by the availability of water for hydropower and thermo-electric cooling. Water availability is calculated from the GCMs' precipitation using the US Basins model. The effects on agriculture are estimated using both a process-based crop model (EPIC) and an agricultural economic model (FASOM-GHG), which adjusts water supply curves based on information from US Basins. The sectoral models show higher economic costs of climate change under RCP 8.5 than under RCP 4.5, averaged across the country and across GCMs.

  1. Agent-Based Model with Asymmetric Trading and Herding for Complex Financial Systems

    PubMed Central

    Chen, Jun-Jie; Zheng, Bo; Tan, Lei

    2013-01-01

    Background For complex financial systems, the negative and positive return-volatility correlations, i.e., the so-called leverage and anti-leverage effects, are particularly important for the understanding of the price dynamics. However, the microscopic origination of the leverage and anti-leverage effects is still not understood, and how to produce these effects in agent-based modeling remains open. On the other hand, in constructing microscopic models, it is a promising conception to determine model parameters from empirical data rather than from statistical fitting of the results. Methods To study the microscopic origination of the return-volatility correlation in financial systems, we take into account the individual and collective behaviors of investors in real markets, and construct an agent-based model. The agents are linked with each other and trade in groups, and particularly, two novel microscopic mechanisms, i.e., investors’ asymmetric trading and herding in bull and bear markets, are introduced. Further, we propose effective methods to determine the key parameters in our model from historical market data. Results With the model parameters determined for six representative stock-market indices in the world, respectively, we obtain the corresponding leverage or anti-leverage effect from the simulation, and the effect is in agreement with the empirical one on amplitude and duration. At the same time, our model produces other features of the real markets, such as the fat-tail distribution of returns and the long-term correlation of volatilities. Conclusions We reveal that for the leverage and anti-leverage effects, both the investors’ asymmetric trading and herding are essential generation mechanisms. Among the six markets, however, the investors’ trading is approximately symmetric for the five markets which exhibit the leverage effect, thus contributing very little. 
These two microscopic mechanisms and the methods for the determination of the key parameters can be applied to other complex systems with similar asymmetries. PMID:24278146
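The return-volatility correlation that the model reproduces is conventionally measured as L(t) = ⟨r(τ)|r(τ+t)|²⟩/Z, negative for the leverage effect. The sketch below estimates it on synthetic returns; the one-step volatility-feedback rule used to generate the data is invented for illustration and is not the paper's agent-based mechanism.

```python
import numpy as np

def leverage_corr(r, max_lag):
    """L(t) = <r_tau * |r_{tau+t}|^2> / <r^2>^2; the sign is what matters:
    negative values mean down moves predict higher future volatility."""
    z = np.mean(r ** 2) ** 2
    return np.array([np.mean(r[:-t] * np.abs(r[t:]) ** 2) / z
                     for t in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
n = 200_000
r = np.empty(n)
sigma = 1.0
for t in range(n):
    r[t] = sigma * rng.standard_normal()
    # asymmetric feedback: only negative returns raise next-step volatility
    sigma = 1.0 + 2.0 * max(0.0, -r[t])

L = leverage_corr(r, max_lag=5)
print(L[0])   # negative at lag 1, decaying toward zero at longer lags
```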

  3. Modeling Health Care Expenditures and Use.

    PubMed

    Deb, Partha; Norton, Edward C

    2018-04-01

    Health care expenditures and use are challenging to model because these dependent variables typically have distributions that are skewed with a large mass at zero. In this article, we describe estimation and interpretation of the effects of a natural experiment using two classes of nonlinear statistical models: one for health care expenditures and the other for counts of health care use. We extend prior analyses to test the effect of the ACA's young adult expansion on three different outcomes: total health care expenditures, office-based visits, and emergency department visits. Modeling the outcomes with a two-part or hurdle model, instead of a single-equation model, reveals that the ACA policy increased the number of office-based visits but decreased emergency department visits and overall spending.
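The two-part model rests on the decomposition E[Y] = Pr(Y > 0) · E[Y | Y > 0], with the mass at zero handled by the first part. A small numpy illustration on simulated expenditures is below; the participation rate and lognormal parameters are invented, and in practice each part would be a fitted regression (e.g., logit plus a GLM) rather than a sample mean.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000
any_use = rng.random(n) < 0.35                # part 1: any spending at all?
spend = np.where(any_use,
                 rng.lognormal(mean=7.0, sigma=1.2, size=n),  # part 2: amount
                 0.0)                         # mass point at zero

p_hat = (spend > 0).mean()          # modelled by logit/probit in practice
cond_mean = spend[spend > 0].mean() # modelled by GLM / log-OLS on users
two_part = p_hat * cond_mean
print(two_part, spend.mean())       # the decomposition reproduces E[Y]
```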

  4. Generalisability in economic evaluation studies in healthcare: a review and case studies.

    PubMed

    Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A

    2004-12-01

    To review, and to develop further, the methods used to assess and to increase the generalisability of economic evaluation studies. Electronic databases. Methodological studies relating to economic evaluation in healthcare were searched. This included electronic searches of a range of databases, including PREMEDLINE, MEDLINE, EMBASE and EconLit, and manual searches of key journals. The decision-analytic-model case studies highlighted specific features of previously published economic studies related to generalisability and location-related variability. The cost-effectiveness case study was based on the secondary analysis of three economic studies using data from randomised trials. The factor most frequently cited as generating variability in economic results between locations was the unit costs associated with particular resources. In the context of studies based on the analysis of patient-level data, regression analysis has been advocated as a means of looking at variability in economic results across locations. These methods have generally accepted that some components of resource use and outcomes are exchangeable across locations. Recent studies have also explored, in cost-effectiveness analysis, the use of tests of heterogeneity similar to those used in the clinical evaluation of trials. The decision analytic model has been the main means by which cost-effectiveness has been adapted from trial to non-trial locations. Most models have focused on changes to the cost side of the analysis, but it is clear that the effectiveness side may also need to be adapted between locations. There have been weaknesses in some aspects of the reporting in applied cost-effectiveness studies. These may limit decision-makers' ability to judge the relevance of a study to their specific situations. The case study demonstrated the potential value of multilevel modelling (MLM). Where clustering exists by location (e.g. 
centre or country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events and pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations. A large number of factors are mentioned in the literature that might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance or to adjust the results of the study to their location of interest. Very few studies utilised statistical methods formally to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters. 
Methods to assess generalisability and variability in economic evaluation studies have been discussed extensively in the literature relating to both trial-based and modelling studies. Regression-based methods are likely to offer a systematic approach to quantifying variability in patient-level data. In particular, MLM has the potential to facilitate estimates of cost-effectiveness, which both reflect the variation in costs and outcomes between locations and also enable the consistency of cost-effectiveness estimates between locations to be assessed directly. Decision analytic models will retain an important role in adapting the results of cost-effectiveness studies between locations. Recommendations for further research include: the development of methods of evidence synthesis which model the exchangeability of data across locations and allow for the additional uncertainty in this process; assessment of alternative approaches to specifying multilevel models to the analysis of cost-effectiveness data alongside multilocation randomised trials; identification of a range of appropriate covariates relating to locations (e.g. hospitals) in multilevel models; and further assessment of the role of econometric methods (e.g. selection models) for cost-effectiveness analysis alongside observational datasets, and to increase the generalisability of randomised trials.
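The clustering by location that motivates MLM can be summarised by the intraclass correlation (ICC) of costs: the share of total cost variance attributable to differences between centres. A method-of-moments sketch on simulated per-centre costs is below; all variance components and sample sizes are invented for illustration, and a real analysis would fit a multilevel model rather than use this one-way ANOVA estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n_centres, n_pat = 40, 50
centre_eff = rng.normal(0.0, 300.0, n_centres)   # between-centre sd = 300
costs = (2000.0 + centre_eff[:, None]
         + rng.normal(0.0, 400.0, (n_centres, n_pat)))  # within-centre sd = 400

# one-way ANOVA (method-of-moments) estimator of the variance components
grand = costs.mean()
ms_between = n_pat * ((costs.mean(axis=1) - grand) ** 2).sum() / (n_centres - 1)
ms_within = (((costs - costs.mean(axis=1, keepdims=True)) ** 2).sum()
             / (n_centres * (n_pat - 1)))
var_between = (ms_between - ms_within) / n_pat
icc = var_between / (var_between + ms_within)
print(round(icc, 2))   # true ICC here is 300^2/(300^2+400^2) = 0.36
```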

  5. How TK-TD and population models for aquatic macrophytes could support the risk assessment for plant protection products.

    PubMed

    Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie

    2016-01-01

    This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. 
Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also linked to the exposure scenarios, and 2) quantitative protection goals be set to facilitate the interpretation of model results for risk assessment. © 2015 SETAC.
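The TK-TD structure described, one-compartment uptake/elimination driving a concentration-response inhibition of growth, can be sketched in a few lines. All rate constants, the Hill-type inhibition form, and the pulsed exposure scenario below are illustrative assumptions, not the calibrated Lemna or Myriophyllum models.

```python
import numpy as np

def tk_td_growth(c_water, dt=0.1, k_in=0.5, k_out=0.2,
                 ec50=1.0, hill=2.0, r=0.1, b0=1.0):
    """One-compartment TK (dC/dt = k_in*Cw - k_out*C) whose internal
    concentration scales down an exponential biomass growth rate (TD)."""
    c, b = 0.0, b0
    for cw in c_water:
        c += (k_in * cw - k_out * c) * dt        # internal concentration
        inhib = 1.0 / (1.0 + (c / ec50) ** hill)  # Hill-type inhibition
        b *= np.exp(r * inhib * dt)               # inhibited biomass growth
    return b

t = np.arange(0, 30, 0.1)
pulse = np.where((t > 5) & (t < 10), 4.0, 0.0)   # pulsed exposure scenario
control = tk_td_growth(np.zeros_like(t))
exposed = tk_td_growth(pulse)
print(control, exposed)   # exposure reduces end-of-simulation biomass
```

Because the internal concentration decays only at rate k_out, growth stays suppressed for a while after the external pulse ends, which is exactly the dynamic-exposure effect the models are used to extrapolate.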

  6. Ecotoxicological assessment of oil-based paint using three-dimensional multi-species bio-testing model: pre- and post-bioremediation analysis.

    PubMed

    Phulpoto, Anwar Hussain; Qazi, Muneer Ahmed; Haq, Ihsan Ul; Phul, Abdul Rahman; Ahmed, Safia; Kanhar, Nisar Ahmed

    2018-06-01

    The present study validates the oil-based paint bioremediation potential of Bacillus subtilis NAP1 for ecotoxicological assessment using a three-dimensional multi-species bio-testing model. The model included bioassays to determine the phytotoxic, cytotoxic, and antimicrobial effects of oil-based paint. Additionally, the antioxidant activity of pre- and post-bioremediation samples was measured to confirm detoxification. The pre-bioremediation samples of oil-based paint displayed significant toxicity against all the life forms; post-bioremediation, however, the cytotoxic effect against Artemia salina revealed substantial detoxification of oil-based paint, with LD50 of 121 μl ml-1 (without glucose) and > 400 μl ml-1 (with glucose). Similarly, the reduction in toxicity against Raphanus raphanistrum seed germination (%FG = 98 to 100%) was also evidence of successful detoxification under experimental conditions. Moreover, the toxicity against the test bacterial and fungal strains was completely removed after bioremediation. In addition, the post-bioremediation samples showed reduced antioxidant activities (% scavenging = 23.5 ± 0.35 and 28.9 ± 2.7, without and with glucose, respectively). The present multi-species bio-testing model, in addition to antioxidant studies, could be suggested as a validation tool for bioremediation experiments, especially for middle- and low-income countries.

  7. High-level PC-based laser system modeling

    NASA Astrophysics Data System (ADS)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI), there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became increasingly apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast executing, PC-based program that can be used to either calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.

  8. A model-based analysis of extinction ratio effects on phase-OTDR distributed acoustic sensing system performance

    NASA Astrophysics Data System (ADS)

    Aktas, Metin; Maral, Hakan; Akgun, Toygar

    2018-02-01

    Extinction ratio (ER) is an inherent limiting factor that has a direct effect on the detection performance of phase-OTDR based distributed acoustic sensing systems. In this work we present a model-based analysis of Rayleigh scattering to simulate the effects of extinction ratio on the received signal under varying signal acquisition scenarios and system parameters. These signal acquisition scenarios are constructed to represent typically observed cases such as multiple vibration sources cluttered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths, and varying ADC bit resolutions. Results show that an insufficient ER can result in a high optical noise floor and effectively hide the effects of elaborate system improvement efforts.
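One crude way to see why a poor ER raises the optical noise floor: off-state leakage light backscatters from the entire fiber continuously, while the probe pulse backscatters only from a length set by the pulse width. The energy-budget sketch below ignores coherence, attenuation, and detection details entirely; it is only meant to show the scaling, and all parameter values are illustrative.

```python
import numpy as np

def leakage_snr_db(er_db, pulse_ns, fiber_km):
    """Ratio (dB) of pulse-backscatter energy to leakage-backscatter energy.

    Crude energy budget: the pulse illuminates a length ~ v*tau/2, while
    leakage at P_peak/ER illuminates the whole fiber at all times.
    """
    er = 10.0 ** (er_db / 10.0)
    v_fiber = 2.0e8                          # m/s, light speed in silica
    pulse_len = v_fiber * pulse_ns * 1e-9 / 2.0
    leak_len = fiber_km * 1e3                # whole fiber contributes leakage
    return 10.0 * np.log10(er * pulse_len / leak_len)

for er in (10, 20, 30, 40):
    print(er, round(leakage_snr_db(er, pulse_ns=100, fiber_km=10), 1))
```

Under this toy scaling, every 10 dB of extra ER buys 10 dB of margin over the leakage floor, and longer fibers demand proportionally higher ER.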

  9. Mental Models about Seismic Effects: Students' Profile Based Comparative Analysis

    ERIC Educational Resources Information Center

    Moutinho, Sara; Moura, Rui; Vasconcelos, Clara

    2016-01-01

    Nowadays, meaningful learning takes a central role in science education and is based on mental models that allow the representation of the real world by individuals. Thus, it is essential to analyse students' mental models, promoting an easier reconstruction of scientific knowledge by allowing them to become consistent with the curricular…

  10. Lumped Parameter Models for Predicting Nitrogen Transport in Lower Coastal Plain Watersheds

    Treesearch

    Devendra M. Amatya; George M. Chescheir; Glen P. Fernandez; R. Wayne Skaggs; F. Birgand; J.W. Gilliam

    2003-01-01

    In recent years, physically based comprehensive distributed watershed-scale hydrologic/water quality models have been developed and applied to evaluate cumulative effects of land and water management practices on receiving waters. Although these complex physically based models are capable of simulating the impacts of these changes in large watersheds, they are often...

  11. Landscape-based population viability models demonstrate importance of strategic conservation planning for birds

    Treesearch

    Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh; D. Todd Jones-Farland

    2013-01-01

    Efforts to conserve regional biodiversity in the face of global climate change, habitat loss and fragmentation will depend on approaches that consider population processes at multiple scales. By combining habitat and demographic modeling, landscape-based population viability models effectively relate small-scale habitat and landscape patterns to regional population...

  12. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning with traditional resources. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short-answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect size 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views. © 2013 American Association of Anatomists.

  13. Effect of ceramic thickness and composite bases on stress distribution of inlays--a finite element analysis.

    PubMed

    Durand, Letícia Brandão; Guimarães, Jackeline Coutinho; Monteiro Junior, Sylvio; Baratieri, Luiz Narciso

    2015-01-01

    The purpose of this study was to determine the effect of cavity depth, ceramic thickness, and resin bases with different elastic modulus on von Mises stress patterns of ceramic inlays. Tridimensional geometric models were developed with SolidWorks image software. The differences between the models were: depth of pulpal wall, ceramic thickness, and presence of composite bases with different thickness and elastic modulus. The geometric models were constrained at the proximal surfaces and base of maxillary bone. A load of 100 N was applied. The stress distribution pattern was analyzed with von Mises stress diagrams. The maximum von Mises stress values ranged from 176 MPa to 263 MPa and varied among the 3D-models. The highest von Mises stress value was found on models with 1-mm-thick composite resin base and 1-mm-thick ceramic inlay. Intermediate values (249-250 MPa) occurred on models with 2-mm-thick composite resin base and 1-mm-thick ceramic inlay and 1-mm-thick composite resin base and 2-mm-thick ceramic inlay. The lowest values were observed on models restored exclusively with ceramic inlay (176 MPa to 182 MPa). It was found that thicker inlays distribute stress more favorably and bases with low elastic modulus increase stress concentrations on the internal surface of the ceramic inlay. The increase of ceramic thickness tends to present more favorable stress distribution, especially when bonded directly onto the cavity without the use of supporting materials. When the use of a composite base is required, composite resin with high elastic modulus and reduced thickness should be preferred.

  14. Efficient Band-to-Trap Tunneling Model Including Heterojunction Band Offset

    DOE PAGES

    Gao, Xujiao; Huang, Andy; Kerr, Bert

    2017-10-25

    In this paper, we present an efficient band-to-trap tunneling model based on the Schenk approach, in which an analytic density-of-states (DOS) model is developed based on the open boundary scattering method. The new model explicitly includes the effect of heterojunction band offset, in addition to the well-known field effect. Its analytic form enables straightforward implementation into TCAD device simulators. It is applicable to all one-dimensional potentials that can be approximated to a good degree such that the approximated potentials lead to piecewise analytic wave functions with open boundary conditions. The model allows for simulating both the electric-field-enhanced and band-offset-enhanced carrier recombination due to the band-to-trap tunneling near the heterojunction in a heterojunction bipolar transistor (HBT). Simulation results of an InGaP/GaAs/GaAs NPN HBT show that the proposed model predicts significantly increased base currents, due to the hole-to-trap tunneling enhanced by the emitter-base junction band offset. The results compare favorably with experimental observation.

  16. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures]

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.

  17. The importance of explicitly mapping instructional analogies in science education

    NASA Astrophysics Data System (ADS)

    Asay, Loretta Johnson

    Analogies are ubiquitous during instruction in science classrooms, yet research about the effectiveness of using analogies has produced mixed results. An aspect seldom studied is a model of instruction when using analogies. The few existing models for instruction with analogies have not often been examined quantitatively. The Teaching With Analogies (TWA) model (Glynn, 1991) is one of the models frequently cited in the variety of research about analogies. The TWA model outlines steps for instruction, including the step of explicitly mapping the features of the source to the target. An experimental study was conducted to examine the effects of explicitly mapping the features of the source and target in an analogy during computer-based instruction about electrical circuits. Explicit mapping was compared to no mapping and to a control with no analogy. Participants were ninth- and tenth-grade biology students who were each randomly assigned to one of three conditions (no analogy module, analogy module, or explicitly mapped analogy module) for computer-based instruction. Subjects took a pre-test before the instruction, which was used to assign them to a level of previous knowledge about electrical circuits for analysis of any differential effects. After the instruction modules, students took a post-test about electrical circuits. Two weeks later, they took a delayed post-test. No advantage was found for explicitly mapping the analogy. Learning patterns were the same, regardless of the type of instruction. Those who knew the least about electrical circuits, based on the pre-test, made the most gains. After the two-week delay, this group maintained the largest amount of their gain. Implications exist for science education classrooms, as analogy use should be based on research about effective practices. Further studies are suggested to foster the building of research-based models for classroom instruction with analogies.

  18. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effects of several modeling aspects, such as three-dimensional effects, surface tension, and the type of ferrofluid-magnetic field coupling, on the accuracy of the model prediction.

  19. Multiscale Modeling of Angiogenesis and Predictive Capacity

    NASA Astrophysics Data System (ADS)

    Pillay, Samara; Byrne, Helen; Maini, Philip

    Tumors induce the growth of new blood vessels from existing vasculature through angiogenesis. Using an agent-based approach, we model the behavior of individual endothelial cells during angiogenesis. We incorporate crowding effects through volume exclusion, motility of cells through biased random walks, and include birth and death-like processes. We use the transition probabilities associated with the discrete model and a discrete conservation equation for cell occupancy to determine collective cell behavior, in terms of partial differential equations (PDEs). We derive three PDE models incorporating single, multi-species and no volume exclusion. By fitting the parameters in our PDE models and other well-established continuum models to agent-based simulations during a specific time period, and then comparing the outputs from the PDE models and agent-based model at later times, we aim to determine how well the PDE models predict the future behavior of the agent-based model. We also determine whether predictions differ across PDE models and the significance of those differences. This may impact drug development strategies based on PDE models.
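The discrete mechanics described above (biased random walks with volume exclusion, where a move onto an occupied site is aborted) can be sketched in a few lines. This is a minimal one-dimensional illustration; the lattice width, bias value, and cell count are assumptions for the example, not parameters from the study:

```python
import random

def simulate(steps, width=50, n_cells=10, bias=0.7, seed=1):
    """Minimal 1-D agent-based sketch: biased random walk with volume exclusion.

    Each cell occupies one lattice site; a move onto an occupied site is
    aborted (crowding). `bias` is the probability of attempting a rightward
    step, mimicking chemotactic drift toward a tumor. Cells are updated in a
    random sequential sweep over a snapshot of current positions.
    """
    rng = random.Random(seed)
    occupied = set(range(n_cells))          # cells start packed at the left edge
    for _ in range(steps):
        for cell in list(occupied):
            step = 1 if rng.random() < bias else -1
            target = cell + step
            if 0 <= target < width and target not in occupied:
                occupied.remove(cell)
                occupied.add(target)
    return sorted(occupied)

positions = simulate(200)
```

Averaging many such simulations over cell occupancy is what yields the continuum (PDE) description the abstract compares against.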

  20. A performance model for GPUs with caches

    DOE PAGES

    Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...

    2014-06-24

    To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies, we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates the runtime with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
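The core idea of a sampling-based linear model can be sketched as an ordinary least-squares fit of runtime against work size from a few short profiling runs, followed by extrapolation to a full launch. This is an illustration of the idea only, not the authors' implementation; all sample sizes and runtimes below are made up:

```python
import numpy as np

# Hypothetical sampled (work-size, runtime) pairs from short profiling runs;
# the paper derives its actual sampling points from NVIDIA scheduling analysis.
sizes = np.array([1024, 2048, 4096, 8192], dtype=float)
runtimes = np.array([0.51, 1.02, 2.05, 4.11])      # milliseconds (illustrative)

# Least-squares fit of runtime ~ a*size + b.
A = np.vstack([sizes, np.ones_like(sizes)]).T
(a, b), *_ = np.linalg.lstsq(A, runtimes, rcond=None)

# Extrapolate to a full-sized kernel launch.
predicted = a * 16384 + b
```

The ML-based model in the paper augments exactly this kind of predictor with compiler statistics and hardware performance counters to capture caching and coalescing effects the linear fit misses.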

  1. Model-Based Diagnostics for Propellant Loading Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.

    2011-01-01

    The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
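The fault-identification step mentioned above can be illustrated with a minimal bootstrap particle filter that infers a single unknown fault parameter (say, a leak coefficient) from noisy scalar measurements. This is a generic sketch of the technique, not the authors' filter; the noise level, prior range, and jitter are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor data around a hypothetical true fault magnitude.
true_leak = 0.3
obs = true_leak + 0.05 * rng.standard_normal(50)

n = 1000
particles = rng.uniform(0.0, 1.0, n)               # prior over the fault size
weights = np.full(n, 1.0 / n)

for y in obs:
    # Weight particles by the measurement likelihood (Gaussian noise model).
    weights *= np.exp(-0.5 * ((y - particles) / 0.05) ** 2)
    weights /= weights.sum()
    # Resample according to the weights, with jitter to avoid degeneracy.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx] + 0.005 * rng.standard_normal(n)
    weights = np.full(n, 1.0 / n)

estimate = particles.mean()                         # posterior mean fault size
```

The posterior cloud of particles concentrates around the fault magnitude consistent with the measurements, which is the essence of particle-filter-based fault identification.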

  2. Biomechanical effects of hybrid stabilization on the risk of proximal adjacent-segment degeneration following lumbar spinal fusion using an interspinous device or a pedicle screw-based dynamic fixator.

    PubMed

    Lee, Chang-Hyun; Kim, Young Eun; Lee, Hak Joong; Kim, Dong Gyu; Kim, Chi Heon

    2017-12-01

    OBJECTIVE Pedicle screw-rod-based hybrid stabilization (PH) and interspinous device-based hybrid stabilization (IH) have been proposed to prevent adjacent-segment degeneration (ASD) and their effectiveness has been reported. However, a comparative study based on sound biomechanical proof has not yet been reported. The aim of this study was to compare the biomechanical effects of IH and PH on the transition and adjacent segments. METHODS A validated finite element model of the normal lumbosacral spine was used. Based on the normal model, a rigid fusion model was immobilized at the L4-5 level by a rigid fixator. The DIAM or NFlex model was added on the L3-4 segment of the fusion model to construct the IH and PH models, respectively. The developed models simulated 4 different loading directions using the hybrid loading protocol. RESULTS Compared with the intact case, fusion on L4-5 produced 18.8%, 9.3%, 11.7%, and 13.7% increments in motion at L3-4 under flexion, extension, lateral bending, and axial rotation, respectively. Additional instrumentation at L3-4 (transition segment) in hybrid models reduced motion changes at this level. The IH model showed 8.4%, -33.9%, 6.9%, and 2.0% change in motion at the segment, whereas the PH model showed -30.4%, -26.7%, -23.0%, and 12.9%. At L2-3 (adjacent segment), the PH model showed 14.3%, 3.4%, 15.0%, and 0.8% of motion increment compared with the motion in the IH model. Both hybrid models showed decreased intradiscal pressure (IDP) at the transition segment compared with the fusion model, but the pressure at L2-3 (adjacent segment) increased in all loading directions except under extension. CONCLUSIONS Both IH and PH models limited excessive motion and IDP at the transition segment compared with the fusion model. At the segment adjacent to the transition level, PH induced higher stress than the IH model. Such differences may eventually influence the likelihood of ASD.

  3. An investigation of the mentalization-based model of borderline pathology in adolescents.

    PubMed

    Quek, Jeremy; Bennett, Clair; Melvin, Glenn A; Saeedi, Naysun; Gordon, Michael S; Newman, Louise K

    2018-07-01

    According to mentalization-based theory, transgenerational transmission of mentalization from caregiver to offspring is implicated in the pathogenesis of borderline personality disorder (BPD). Recent research has demonstrated an association between hypermentalizing (excessive, inaccurate mental state reasoning) and BPD, indicating the particular relevance of this form of mentalizing dysfunction to the transgenerational mentalization-based model. As yet, no study has empirically assessed a transgenerational mentalization-based model of BPD. The current study sought firstly to test the mentalization-based model, and additionally, to determine the form of mentalizing dysfunction in caregivers (e.g., hypo- or hypermentalizing) most relevant to a hypermentalizing model of BPD. Participants were a mixed sample of adolescents with BPD and a sample of non-clinical adolescents, and their respective primary caregivers (n = 102; 51 dyads). Using an ecologically valid measure of mentalization, mediational analyses were conducted to examine the relationships between caregiver mentalizing, adolescent mentalizing, and adolescent borderline features. Findings demonstrated that adolescent mentalization mediated the effect of caregiver mentalization on adolescent borderline personality pathology. Furthermore, results indicated that hypomentalizing in caregivers was related to adolescent borderline personality pathology via an effect on adolescent hypermentalizing. Results provide empirical support for the mentalization-based model of BPD, and suggest the indirect influence of caregiver mentalization on adolescent borderline psychopathology. Results further indicate the relevance of caregiver hypomentalizing to a hypermentalizing model of BPD. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. A Novel Grid SINS/DVL Integrated Navigation Algorithm for Marine Application

    PubMed Central

    Kang, Yingyao; Zhao, Lin; Cheng, Jianhua; Fan, Xiaoliang

    2018-01-01

    Integrated navigation algorithms under the grid frame have been proposed based on the Kalman filter (KF) to solve the problem of navigation in some special regions. However, in the existing study of grid strapdown inertial navigation system (SINS)/Doppler velocity log (DVL) integrated navigation algorithms, the Earth models of the filter dynamic model and the SINS mechanization are not unified. Besides, traditional integrated systems with the KF-based correction scheme are susceptible to measurement errors, which would decrease the accuracy and robustness of the system. In this paper, an adaptive robust Kalman filter (ARKF)-based hybrid-correction grid SINS/DVL integrated navigation algorithm is designed with the unified reference ellipsoid Earth model to improve the navigation accuracy in middle-high latitude regions for marine application. Firstly, to unify the Earth models, the mechanization of grid SINS is introduced and the error equations are derived based on the same reference ellipsoid Earth model. Then, a more accurate grid SINS/DVL filter model is designed according to the new error equations. Finally, a hybrid-correction scheme based on the ARKF is proposed to resist the effect of measurement errors. Simulation and experiment results show that, compared with the traditional algorithms, the proposed navigation algorithm can effectively improve the navigation performance in middle-high latitude regions by the unified Earth models and the ARKF-based hybrid-correction scheme. PMID:29373549

  5. Quantifying the effect of a community-based injury prevention program in Queensland using a generalized estimating equation approach.

    PubMed

    Yorkston, Emily; Turner, Catherine; Schluter, Philip J; McClure, Rod

    2007-06-01

    To develop a generalized estimating equation (GEE) model of childhood injury rates to quantify the effectiveness of a community-based injury prevention program implemented in 2 communities in Australia, in order to contribute to the discussion of community-based injury prevention program evaluation. An ecological study was conducted comparing injury rates in two intervention communities in rural and remote Queensland, Australia, with those of 16 control regions. A model of childhood injury was built using hospitalization injury rate data from 1 July 1991 to 30 June 2005 and 16 social variables. The model was built using GEE analysis and was used to estimate parameters and to test the effectiveness of the intervention. When social variables were controlled for, the intervention was associated with a decrease of 0.09 injuries/10,000 children aged 0-4 years (95% CI -0.29 to 0.11) in logarithmically transformed injury rates; however, this decrease was not significant (p = 0.36). The evaluation methods proposed in this study provide a way of determining the effectiveness of a community-based injury prevention program while considering the effect of baseline differences and secular changes in social variables.

  6. Cost-effectiveness of scaling up voluntary counselling and testing in West-Java, Indonesia.

    PubMed

    Tromp, Noor; Siregar, Adiatma; Leuwol, Barnabas; Komarudin, Dindin; van der Ven, Andre; van Crevel, Reinout; Baltussen, Rob

    2013-01-01

    To evaluate the cost-effectiveness of scaling up community-based VCT in West-Java. The Asian epidemic model (AEM) and resource needs model (RNM) were used to calculate incremental costs per HIV infection averted and per disability-adjusted life year saved (DALY). Locally monitored demographic, epidemiological, behavior and cost data were used as model input. Scaling up community-based VCT in West-Java will reduce the overall population prevalence by 36% in 2030 and costs US$248 per HIV infection averted and US$9.17 per DALY saved. Cost-effectiveness estimates were most sensitive to the impact of VCT on condom use and to the population size of clients of female sex workers (FSWs), but were overall robust. The total costs for scaling up community-based VCT range between US$1.3 and 3.8 million per year and require the number of VCT-integrated clinics at public community health centers to increase from 73 in 2010 to 594 in 2030. Scaling up community-based VCT seems both an effective and cost-effective intervention. However, in order to prioritize VCT in HIV/AIDS control in West-Java, issues of budget availability and organizational capacity should be addressed.

  7. Comparison of CTT and Rasch-based approaches for the analysis of longitudinal Patient Reported Outcomes.

    PubMed

    Blanchin, Myriam; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Blanchard, Claire; Mirallié, Eric; Sébille, Véronique

    2011-04-15

    Health sciences frequently deal with Patient Reported Outcomes (PRO) data for the evaluation of concepts, in particular health-related quality of life, which cannot be directly measured and are often called latent variables. Two approaches are commonly used for the analysis of such data: Classical Test Theory (CTT) and Item Response Theory (IRT). Longitudinal data are often collected to analyze the evolution of an outcome over time. The most adequate strategy to analyze longitudinal latent variables, which can be either based on CTT or IRT models, remains to be identified. This strategy must take into account the latent characteristic of what PROs are intended to measure as well as the specificity of longitudinal designs. A simple and widely used IRT model is the Rasch model. The purpose of our study was to compare CTT and Rasch-based approaches to analyze longitudinal PRO data regarding type I error, power, and time effect estimation bias. Four methods were compared: the Score and Mixed models (SM) method based on the CTT approach, the Rasch and Mixed models (RM), the Plausible Values (PV), and the Longitudinal Rasch model (LRM) methods all based on the Rasch model. All methods have shown comparable results in terms of type I error, all close to 5 per cent. LRM and SM methods presented comparable power and unbiased time effect estimations, whereas RM and PV methods showed low power and biased time effect estimations. This suggests that RM and PV methods should be avoided to analyze longitudinal latent variables. Copyright © 2010 John Wiley & Sons, Ltd.
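The Rasch model at the heart of the IRT-based methods above has a simple closed form: the probability of endorsing an item depends only on the difference between person ability and item difficulty. A minimal sketch (parameter values are illustrative):

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability that a person with latent ability `theta`
    answers (or endorses) an item of difficulty `b` positively:
    P = 1 / (1 + exp(-(theta - b))).
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty the probability is exactly one half,
# and it rises monotonically with ability.
p_equal = rasch_prob(0.0, 0.0)
p_abler = rasch_prob(1.0, 0.0)
```

Longitudinal Rasch approaches such as the LRM method extend this by letting `theta` evolve over time, which is why they can estimate a time effect on the latent variable directly.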

  8. Simulating Cancer Growth with Multiscale Agent-Based Modeling

    PubMed Central

    Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.

    2014-01-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698

  9. An EKV-based high voltage MOSFET model with improved mobility and drift model

    NASA Astrophysics Data System (ADS)

    Chauhan, Yogesh Singh; Gillon, Renaud; Bakeroot, Benoit; Krummenacher, Francois; Declercq, Michel; Ionescu, Adrian Mihai

    2007-11-01

    An EKV-based high voltage MOSFET model is presented. The intrinsic channel model is derived based on the charge based EKV-formalism. An improved mobility model is used for the modeling of the intrinsic channel to improve the DC characteristics. The model uses second order dependence on the gate bias and an extra parameter for the smoothening of the saturation voltage of the intrinsic drain. An improved drift model [Chauhan YS, Anghel C, Krummenacher F, Ionescu AM, Declercq M, Gillon R, et al. A highly scalable high voltage MOSFET model. In: IEEE European solid-state device research conference (ESSDERC), September 2006. p. 270-3; Chauhan YS, Anghel C, Krummenacher F, Maier C, Gillon R, Bakeroot B, et al. Scalable general high voltage MOSFET model including quasi-saturation and self-heating effect. Solid State Electron 2006;50(11-12):1801-13] is used for the modeling of the drift region, which gives smoother transition on output characteristics and also models well the quasi-saturation region of high voltage MOSFETs. First, the model is validated on the numerical device simulation of the VDMOS transistor and then, on the measured characteristics of the SOI-LDMOS transistor. The accuracy of the model is better than our previous model [Chauhan YS, Anghel C, Krummenacher F, Maier C, Gillon R, Bakeroot B, et al. Scalable general high voltage MOSFET model including quasi-saturation and self-heating effect. Solid State Electron 2006;50(11-12):1801-13] especially in the quasi-saturation region of output characteristics.

  10. Waste management under multiple complexities: Inexact piecewise-linearization-based fuzzy flexible programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun Wei; Huang, Guo H., E-mail: huang@iseis.org; Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2

    2012-06-15

    Highlights: ► Inexact piecewise-linearization-based fuzzy flexible programming is proposed. ► It's the first application to waste management under multiple complexities. ► It tackles nonlinear economies-of-scale effects in interval-parameter constraints. ► It estimates costs more accurately than the linear-regression-based model. ► Uncertainties are decreased and more satisfactory interval solutions are obtained. - Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from objective function to constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate the IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, the IPFP2 may underestimate the net system costs while the IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities.
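Piecewise linearization of a concave economies-of-scale cost curve, as used above to move the nonlinearity into linearizable constraints, amounts to replacing the curve by a small number of chords. A generic sketch (the cost function, coefficients, and segment count are illustrative, not from the paper):

```python
import numpy as np

def piecewise_linearize(f, x_min, x_max, n_seg):
    """Approximate a nonlinear cost curve by n_seg chords over [x_min, x_max]."""
    xs = np.linspace(x_min, x_max, n_seg + 1)   # breakpoints
    ys = f(xs)                                  # cost at each breakpoint
    slopes = np.diff(ys) / np.diff(xs)          # chord slope on each segment
    return xs, ys, slopes

def evaluate(x, xs, ys, slopes):
    """Evaluate the piecewise-linear approximation at a scalar x."""
    i = int(np.clip(np.searchsorted(xs, x) - 1, 0, len(slopes) - 1))
    return ys[i] + slopes[i] * (x - xs[i])

# Concave EOS-type cost: marginal cost falls as the facility scales up.
cost = lambda x: 12.0 * x ** 0.6
xs, ys, slopes = piecewise_linearize(cost, 10.0, 1000.0, 8)
approx = evaluate(500.0, xs, ys, slopes)
exact = cost(500.0)
```

Each segment contributes one linear term, so the approximation slots directly into a mixed-integer linear (or here, quadratically constrained) program.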

  11. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems and studying the underlying network model, their interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Besides, based on percolation theory, we also study the cascading failure effect and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results also show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse. We also determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
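The percolation quantity above, the size of the giant functioning component after random node failures, can be probed with a short simulation. This sketch uses a ring lattice with next-nearest-neighbour links as a stand-in topology (an assumption for illustration, not the paper's grid model):

```python
import random

def giant_component_fraction(n, neighbors, p_fail, seed=0):
    """Remove a random fraction p_fail of nodes and return the size of the
    largest connected component among survivors, relative to n."""
    rng = random.Random(seed)
    alive = {v for v in range(n) if rng.random() >= p_fail}
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        stack, comp = [s], 0          # depth-first search over survivors
        seen.add(s)
        while stack:
            v = stack.pop()
            comp += 1
            for w in neighbors[v]:
                if w in alive and w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, comp)
    return best / n

n = 500
neighbors = {v: [(v - 2) % n, (v - 1) % n, (v + 1) % n, (v + 2) % n]
             for v in range(n)}
frac_light = giant_component_fraction(n, neighbors, p_fail=0.05)
frac_heavy = giant_component_fraction(n, neighbors, p_fail=0.6)
```

Sweeping `p_fail` and watching where the giant-component fraction collapses locates the failure threshold the abstract refers to; modelling interdependence would additionally couple failures across a second network.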

  12. Theory-based pharmacokinetics and pharmacodynamics of S- and R-warfarin and effects on international normalized ratio: influence of body size, composition and genotype in cardiac surgery patients.

    PubMed

    Xue, Ling; Holford, Nick; Ding, Xiao-Liang; Shen, Zhen-Ya; Huang, Chen-Rong; Zhang, Hua; Zhang, Jing-Jing; Guo, Zhe-Ning; Xie, Cheng; Zhou, Ling; Chen, Zhi-Yao; Liu, Lin-Sheng; Miao, Li-Yan

    2017-04-01

    The aims of this study are to apply a theory-based mechanistic model to describe the pharmacokinetics (PK) and pharmacodynamics (PD) of S- and R-warfarin. Clinical data were obtained from 264 patients. Total concentrations for S- and R-warfarin were measured by ultra-high performance liquid tandem mass spectrometry. Genotypes were measured using pyrosequencing. A sequential population PK parameter with data method was used to describe the international normalized ratio (INR) time course. Data were analyzed with NONMEM. Model evaluation was based on parameter plausibility and prediction-corrected visual predictive checks. Warfarin PK was described using a one-compartment model. CYP2C9 *1/*3 genotype had reduced clearance for S-warfarin, but increased clearance for R-warfarin. The in vitro parameters for the relationship between prothrombin complex activity (PCA) and INR were markedly different (A = 0.560, B = 0.386) from the theory-based values (A = 1, B = 0). There was a small difference between healthy subjects and patients. A sigmoid Emax PD model inhibiting PCA synthesis as a function of S-warfarin concentration predicted INR. Small R-warfarin effects were described by competitive antagonism of S-warfarin inhibition. Patients with VKORC1 AA and CYP4F2 CC or CT genotypes had lower C50 for S-warfarin. A theory-based PKPD model describes warfarin concentrations and clinical response. Expected PK and PD genotype effects were confirmed. The role of predicted fat free mass with theory-based allometric scaling of PK parameters was identified. R-warfarin had a minor effect compared with S-warfarin on PCA synthesis. INR is predictable from 1/PCA in vivo. © 2016 The British Pharmacological Society.
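The sigmoid Emax model used above for inhibition of PCA synthesis has the standard Hill form. A minimal sketch; the concentration and C50 values are illustrative only and are not estimates from the study:

```python
def inhibition(conc, c50, hill=1.0, emax=1.0):
    """Sigmoid Emax model: fractional inhibition of synthesis as a function
    of drug concentration, I = Emax * C^h / (C50^h + C^h)."""
    return emax * conc ** hill / (c50 ** hill + conc ** hill)

# At the C50 the drug produces exactly half of its maximal inhibition;
# inhibition rises monotonically toward Emax with concentration.
half = inhibition(conc=1.5, c50=1.5)
strong = inhibition(conc=6.0, c50=1.5)
```

In the paper's turnover framework, this inhibition term drives PCA downward and INR is then predicted from 1/PCA; the competitive R-warfarin effect would enter as a concentration-dependent shift in the effective C50.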

  13. Theory‐based pharmacokinetics and pharmacodynamics of S‐ and R‐warfarin and effects on international normalized ratio: influence of body size, composition and genotype in cardiac surgery patients

    PubMed Central

    Xue, Ling; Holford, Nick; Ding, Xiao‐liang; Shen, Zhen‐ya; Huang, Chen‐rong; Zhang, Hua; Zhang, Jing‐jing; Guo, Zhe‐ning; Xie, Cheng; Zhou, Ling; Chen, Zhi‐yao; Liu, Lin‐sheng

    2016-01-01

    Aims The aims of this study are to apply a theory‐based mechanistic model to describe the pharmacokinetics (PK) and pharmacodynamics (PD) of S‐ and R‐warfarin. Methods Clinical data were obtained from 264 patients. Total concentrations for S‐ and R‐warfarin were measured by ultra‐high performance liquid tandem mass spectrometry. Genotypes were measured using pyrosequencing. A sequential population PK parameter with data method was used to describe the international normalized ratio (INR) time course. Data were analyzed with NONMEM. Model evaluation was based on parameter plausibility and prediction‐corrected visual predictive checks. Results Warfarin PK was described using a one‐compartment model. CYP2C9 *1/*3 genotype had reduced clearance for S‐warfarin, but increased clearance for R‐warfarin. The in vitro parameters for the relationship between prothrombin complex activity (PCA) and INR were markedly different (A = 0.560, B = 0.386) from the theory‐based values (A = 1, B = 0). There was a small difference between healthy subjects and patients. A sigmoid Emax PD model inhibiting PCA synthesis as a function of S‐warfarin concentration predicted INR. Small R‐warfarin effects were described by competitive antagonism of S‐warfarin inhibition. Patients with VKORC1 AA and CYP4F2 CC or CT genotypes had lower C50 for S‐warfarin. Conclusion A theory‐based PKPD model describes warfarin concentrations and clinical response. Expected PK and PD genotype effects were confirmed. The role of predicted fat free mass with theory‐based allometric scaling of PK parameters was identified. R‐warfarin had a minor effect compared with S‐warfarin on PCA synthesis. INR is predictable from 1/PCA in vivo. PMID:27763679

  14. A computational cognitive model of syntactic priming.

    PubMed

    Reitter, David; Keller, Frank; Moore, Johanna D

    2011-01-01

    The psycholinguistic literature has identified two syntactic adaptation effects in language production: rapidly decaying short-term priming and long-lasting adaptation. To explain both effects, we present an ACT-R model of syntactic priming based on a wide-coverage, lexicalized syntactic theory that explains priming as facilitation of lexical access. In this model, two well-established ACT-R mechanisms, base-level learning and spreading activation, account for long-term adaptation and short-term priming, respectively. Our model simulates incremental language production and in a series of modeling studies, we show that it accounts for (a) the inverse frequency interaction; (b) the absence of a decay in long-term priming; and (c) the cumulativity of long-term adaptation. The model also explains the lexical boost effect and the fact that it only applies to short-term priming. We also present corpus data that verify a prediction of the model, that is, that the lexical boost affects all lexical material, rather than just heads. Copyright © 2011 Cognitive Science Society, Inc.
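The base-level learning mechanism that the model uses for long-term adaptation follows the standard ACT-R equation: activation is the log of summed, power-law-decayed traces of past uses. A minimal sketch (the usage times and the conventional decay value d = 0.5 are illustrative):

```python
import math

def base_level_activation(ages, decay=0.5):
    """ACT-R base-level learning: activation of a (syntactic) chunk given the
    ages in seconds of its past uses, B = ln(sum_j t_j^(-d)).
    Recent and frequent use raises activation, producing priming."""
    return math.log(sum(t ** -decay for t in ages))

recent = base_level_activation([2.0, 10.0])      # primed twice, recently
stale = base_level_activation([200.0, 1000.0])   # same count, long ago
```

Because the decay is power-law rather than exponential, frequent old uses never fully vanish, which is how the same mechanism yields both rapidly decaying short-term priming and cumulative long-term adaptation.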

  15. A Framework for Model-Based Inquiry through Agent-Based Programming

    ERIC Educational Resources Information Center

    Xiang, Lin; Passmore, Cynthia

    2015-01-01

    There has been increased recognition in the past decades that model-based inquiry (MBI) is a promising approach for cultivating deep understandings by helping students unite phenomena and underlying mechanisms. Although multiple technology tools have been used to improve the effectiveness of MBI, there are not enough detailed examinations of how…

  16. Guidance for the application of a population modeling framework in coordination with field based monitoring studies for multiple species and sites

    EPA Science Inventory

    A modeling framework was developed that can be applied in conjunction with field based monitoring efforts (e.g., through effects-based monitoring programs) to link chemically-induced alterations in molecular and biochemical endpoints to adverse outcomes in whole organisms and pop...

  17. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.

  18. Modelling remediation scenarios in historical mining catchments.

    PubMed

    Gamarra, Javier G P; Brewer, Paul A; Macklin, Mark G; Martin, Katherine

    2014-01-01

    Local remediation measures, particularly those undertaken in historical mining areas, can often be ineffective or even deleterious because erosion and sedimentation processes operate at spatial scales beyond those typically used in point-source remediation. Based on realistic simulations of a hybrid landscape evolution model combined with stochastic rainfall generation, we demonstrate that similar remediation strategies may result in differing effects across three contrasting European catchments depending on their topographic and hydrologic regimes. Based on these results, we propose a conceptual model of catchment-scale remediation effectiveness based on three basic catchment characteristics: the degree of contaminant source coupling, the ratio of contaminated to non-contaminated sediment delivery, and the frequency of sediment transport events.

  19. Cost-effectiveness of community-based strategies to strengthen the continuum of HIV care in rural South Africa: a health economic modelling analysis.

    PubMed

    Smith, Jennifer A; Sharma, Monisha; Levin, Carol; Baeten, Jared M; van Rooyen, Heidi; Celum, Connie; Hallett, Timothy B; Barnabas, Ruanne V

    2015-04-01

    Home HIV counselling and testing (HTC) achieves high coverage of testing and linkage to care compared with existing facility-based approaches, particularly among asymptomatic individuals. In a modelling analysis we aimed to assess the effect on population-level health and cost-effectiveness of a community-based package of home HTC in KwaZulu-Natal, South Africa. We parameterised an individual-based model with data from home HTC and linkage field studies that achieved high coverage (91%) and linkage to antiretroviral therapy (80%) in rural KwaZulu-Natal, South Africa. Costs were derived from a linked microcosting study. The model simulated 10,000 individuals over 10 years and incremental cost-effectiveness ratios were calculated for the intervention relative to the existing status quo of facility-based testing, with costs discounted at 3% annually. The model predicted that implementing home HTC in addition to current practice would decrease HIV-associated morbidity by 10–22% and HIV infections by 9–48% with increasing CD4 cell count thresholds for antiretroviral therapy initiation. Incremental programme costs were US$2·7 million to $4·4 million higher in the intervention scenarios than at baseline, and costs increased with higher CD4 cell count thresholds for antiretroviral therapy initiation; antiretroviral therapy accounted for 48–87% of total costs. Incremental cost-effectiveness ratios per disability-adjusted life-year averted were $1340 at an antiretroviral therapy threshold of CD4 count lower than 200 cells per μL, $1090 at lower than 350 cells per μL, $1150 at lower than 500 cells per μL, and $1360 at universal access to antiretroviral therapy. Community-based HTC with enhanced linkage to care can result in increased HIV testing coverage and treatment uptake, decreasing the population burden of HIV-associated morbidity and mortality.
The incremental cost-effectiveness ratios are less than 20% of South Africa's gross domestic product per person, and are therefore classed as very cost effective. Home HTC can be a viable means to achieve UNAIDS' ambitious new targets for HIV treatment coverage. National Institutes of Health, Bill & Melinda Gates Foundation, Wellcome Trust.
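
    The cost-effectiveness arithmetic used above (3% annual discounting, incremental cost per DALY averted) can be sketched as follows; the numbers in the test are illustrative, not the study's outputs:

```python
def discounted(yearly_values, rate=0.03):
    """Present value of a yearly stream of costs or effects,
    discounted at `rate` (the study used 3% annually)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(yearly_values))

def icer(cost_new, cost_base, dalys_averted):
    """Incremental cost-effectiveness ratio: extra cost of the
    intervention per disability-adjusted life-year averted."""
    return (cost_new - cost_base) / dalys_averted
```

    An intervention is then labelled "very cost effective" when its ICER falls below a willingness-to-pay threshold, here a fraction of GDP per person.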

  20. A social marketing approach to implementing evidence-based practice in VHA QUERI: the TIDES depression collaborative care model

    PubMed Central

    2009-01-01

    Collaborative care models for depression in primary care are effective and cost-effective, but difficult to spread to new sites. Translating Initiatives for Depression into Effective Solutions (TIDES) is an initiative to promote evidence-based collaborative care in the U.S. Veterans Health Administration (VHA). Social marketing applies marketing techniques to promote positive behavior change. Described in this paper, TIDES used a social marketing approach to foster national spread of collaborative care models. TIDES social marketing approach: The approach relied on a sequential model of behavior change and explicit attention to audience segmentation. Segments included VHA national leadership, Veterans Integrated Service Network (VISN) regional leadership, facility managers, frontline providers, and veterans. TIDES communications, materials and messages targeted each segment, guided by an overall marketing plan. Results: Depression collaborative care based on the TIDES model was adopted by VHA as part of the new Primary Care Mental Health Initiative and associated policies. It is currently in use in more than 50 primary care practices across the United States, and continues to spread, suggesting success for its social marketing-based dissemination strategy. Discussion and conclusion: Development, execution and evaluation of the TIDES marketing effort shows that social marketing is a promising approach for promoting implementation of evidence-based interventions in integrated healthcare systems. PMID:19785754

  1. MANAGING EXPOSURES TO NEUROTOXIC AIR POLLUTANTS.

    EPA Science Inventory

    Researchers at EPA's National Health and Environmental Effects Research Laboratory are developing a biologically-based dose-response model to describe the neurotoxic effects of exposure to volatile organic compounds (VOCs). The model is being developed to improve risk assessment...

  2. The Effect of the Demand Control and Effort Reward Imbalance Models on the Academic Burnout of Korean Adolescents

    ERIC Educational Resources Information Center

    Lee, Jayoung; Puig, Ana; Lee, Sang Min

    2012-01-01

    The purpose of this study was to examine the effects of the Demand Control Model (DCM) and the Effort Reward Imbalance Model (ERIM) on academic burnout for Korean students. Specifically, this study identified the effects of the predictor variables based on DCM and ERIM (i.e., demand, control, effort, reward, Demand Control Ratio, Effort Reward…

  3. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  4. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.

  5. Allee effect in the selection for prime-numbered cycles in periodical cicadas.

    PubMed

    Tanaka, Yumi; Yoshimura, Jin; Simon, Chris; Cooley, John R; Tainaka, Kei-ichi

    2009-06-02

    Periodical cicadas are well known for their prime-numbered life cycles (17 and 13 years) and their mass periodical emergences. The origination and persistence of prime-numbered cycles are explained by the hybridization hypothesis on the basis of their lower likelihood of hybridization with other cycles. Recently, we showed by using an integer-based numerical model that prime-numbered cycles are indeed selected for among 10- to 20-year cycles. Here, we develop a real-number-based model to investigate the factors affecting the selection of prime-numbered cycles. We include an Allee effect in our model, such that a critical population size is set as an extinction threshold. We compare the real-number models with and without the Allee effect. The results show that in the presence of an Allee effect, prime-numbered life cycles are most likely to persist and to be selected under a wide range of extinction thresholds.
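
    The role of the extinction threshold can be illustrated with a toy discrete growth model; this sketch shows only the Allee mechanism with hypothetical parameters and omits the paper's hybridization dynamics between life cycles:

```python
def next_generation(n, r=1.5, capacity=1000.0, threshold=50.0):
    """One generation of logistic-type growth with an Allee extinction
    threshold: a population below `threshold` goes extinct outright
    (illustrative parameters, not the paper's model)."""
    if n < threshold:
        return 0.0
    return n + r * n * (1.0 - n / capacity)
```

    The threshold is what makes the selection among cycles decisive: hybridization that pushes a cohort below the critical size eliminates it entirely rather than merely shrinking it.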

  6. 3D model retrieval method based on mesh segmentation

    NASA Astrophysics Data System (ADS)

    Gan, Yuanchao; Tang, Yan; Zhang, Qingchen

    2012-04-01

    In feature description and extraction, current 3D model retrieval algorithms focus on the global features of 3D models but ignore the combination of global and local features, so they perform poorly on models with similar global shape but different local shape. This paper proposes a novel algorithm for 3D model retrieval based on mesh segmentation. The key idea is to extract the structural feature and the local shape feature of 3D models, and then to compare the similarities of these two characteristics and compute the total similarity between models. A system realizing this approach was built and tested on a database of 200 objects, achieving the expected results. The results show that the proposed algorithm effectively improves precision and recall.

  7. Convex reformulation of biologically-based multi-criteria intensity-modulated radiation therapy optimization including fractionation effects

    NASA Astrophysics Data System (ADS)

    Hoffmann, Aswin L.; den Hertog, Dick; Siem, Alex Y. D.; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2008-11-01

    Finding fluence maps for intensity-modulated radiation therapy (IMRT) can be formulated as a multi-criteria optimization problem for which Pareto optimal treatment plans exist. To account for the dose-per-fraction effect of fractionated IMRT, it is desirable to exploit radiobiological treatment plan evaluation criteria based on the linear-quadratic (LQ) cell survival model as a means to balance the radiation benefits and risks in terms of biologic response. Unfortunately, the LQ-model-based radiobiological criteria are nonconvex functions, which make the optimization problem hard to solve. We apply the framework proposed by Romeijn et al (2004 Phys. Med. Biol. 49 1991-2013) to find transformations of LQ-model-based radiobiological functions and establish conditions under which transformed functions result in equivalent convex criteria that do not change the set of Pareto optimal treatment plans. The functions analysed are: the LQ-Poisson-based model for tumour control probability (TCP) with and without inter-patient heterogeneity in radiation sensitivity, the LQ-Poisson-based relative seriality s-model for normal tissue complication probability (NTCP), the equivalent uniform dose (EUD) under the LQ-Poisson model and the fractionation-corrected Probit-based model for NTCP according to Lyman, Kutcher and Burman. These functions differ from those analysed before in that they cannot be decomposed into elementary EUD or generalized-EUD functions. In addition, we show that applying increasing and concave transformations to the convexified functions is beneficial for the piecewise approximation of the Pareto efficient frontier.
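
    The LQ-Poisson quantities named above follow standard textbook forms, sketched here with illustrative radiosensitivity parameters (not values from the paper):

```python
import math

def lq_survival(total_dose, n_fractions, alpha=0.3, beta=0.03):
    """LQ cell survival after n equal fractions of total dose D (Gy):
    S = exp(-alpha*D - beta*d*D), with dose per fraction d = D/n.
    alpha and beta are generic illustrative values."""
    d = total_dose / n_fractions
    return math.exp(-alpha * total_dose - beta * d * total_dose)

def poisson_tcp(total_dose, n_fractions, clonogens=1e7,
                alpha=0.3, beta=0.03):
    """LQ-Poisson tumour control probability: TCP = exp(-N0 * S),
    the probability that no clonogenic cell survives."""
    return math.exp(-clonogens *
                    lq_survival(total_dose, n_fractions, alpha, beta))
```

    The dose-per-fraction term beta*d*D is what couples fractionation to the criteria: splitting the same total dose into more fractions raises survival, which is the effect the convex reformulation must preserve.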

  8. The Effect of a Case-Based Reasoning Instructional Model on Korean High School Students' Awareness in Climate Change Unit

    ERIC Educational Resources Information Center

    Jeong, Jinwoo; Kim, Hyoungbum; Chae, Dong-hyun; Kim, Eunjeong

    2014-01-01

    The purpose of this study is to investigate the effects of the case-based reasoning instructional model on learning about climate change unit. Results suggest that students showed interest because it allowed them to find the solution to the problem and solve the problem for themselves by analogy from other cases such as crossword puzzles in an…

  9. Examining the Implementation of a Problem-Based Learning and Traditional Hybrid Model of Instruction in Remedial Mathematics Classes Designed for State Testing Preparation of Eleventh Grade Students

    ERIC Educational Resources Information Center

    Rodgers, Lindsay D.

    2011-01-01

    The following paper examined the effects of a new method of teaching for remedial mathematics, named the hybrid model of instruction. Due to increasing importance of high stakes testing, the study sought to determine if this method of instruction, that blends traditional teaching and problem-based learning, had different learning effects on…

  10. A review of the potential effects of climate change on quaking aspen (Populus tremuloides) in the Western United States and a new tool for surveying sudden aspen decline

    Treesearch

    Toni Lyn Morelli; Susan C. Carr

    2011-01-01

    We conducted a literature review of the effects of climate on the distribution and growth of quaking aspen (Populus tremuloides Michx.) in the Western United States. Based on our review, we summarize models of historical climate determinants of contemporary aspen distribution. Most quantitative climate-based models linked aspen presence and growth...

  11. Machine learning and linear regression models to predict catchment-level base cation weathering rates across the southern Appalachian Mountain region, USA

    Treesearch

    Nicholas A. Povak; Paul F. Hessburg; Todd C. McDonnell; Keith M. Reynolds; Timothy J. Sullivan; R. Brion Salter; Bernard J. Crosby

    2014-01-01

    Accurate estimates of soil mineral weathering are required for regional critical load (CL) modeling to identify ecosystems at risk of the deleterious effects from acidification. Within a correlative modeling framework, we used modeled catchment-level base cation weathering (BCw) as the response variable to identify key environmental correlates and predict a continuous...

  12. Development of a Mechanistically Based, Basin-Scale Stream Temperature Model: Applications to Cumulative Effects Modeling

    Treesearch

    Douglas Allen; William Dietrich; Peter Baker; Frank Ligon; Bruce Orr

    2007-01-01

    We describe a mechanistically-based stream model, BasinTemp, which assumes that direct shortwave radiation moderated by riparian and topographic shading, controls stream temperatures during the hottest part of the year. The model was developed to support a temperature TMDL for the South Fork Eel basin in Northern California and couples a GIS and a 1-D energy balance...

  13. Effects of microscale inertia on dynamic ductile crack growth

    NASA Astrophysics Data System (ADS)

    Jacques, N.; Mercier, S.; Molinari, A.

    2012-04-01

    The aim of this paper is to investigate the role of microscale inertia in dynamic ductile crack growth. A constitutive model for porous solids that accounts for dynamic effects due to void growth is proposed. The model has been implemented in a finite element code and simulations of crack growth in a notched bar and in an edge cracked specimen have been performed. Results are compared to predictions obtained via the Gurson-Tvergaard-Needleman (GTN) model where micro-inertia effects are not accounted for. It is found that microscale inertia has a significant influence on the crack growth. In particular, it is shown that micro-inertia plays an important role during the strain localisation process by impeding void growth. Therefore, the resulting damage accumulation occurs in a more progressive manner. For this reason, simulations based on the proposed modelling exhibit much less mesh sensitivity than those based on the viscoplastic GTN model. Microscale inertia is also found to lead to lower crack speeds. Effects of micro-inertia on fracture toughness are evaluated.

  14. Countermeasure effectiveness against an intelligent imaging infrared anti-ship missile

    NASA Astrophysics Data System (ADS)

    Gray, Greer J.; Aouf, Nabil; Richardson, Mark; Butters, Brian; Walmsley, Roy

    2013-02-01

    Ship self defense against heat-seeking anti-ship missiles is of great concern to modern naval forces. One way of protecting ships against these threats is to use infrared (IR) offboard countermeasures. These decoys need precise placement to maximize their effectiveness, and simulation is an invaluable tool used in determining optimum deployment strategies. To perform useful simulations, high-fidelity models of missiles are required. We describe the development of an imaging IR anti-ship missile model for use in countermeasure effectiveness simulations. The missile model's tracking algorithm is based on a target recognition system that uses a neural network to discriminate between ships and decoys. The neural network is trained on shape- and intensity-based features extracted from simulated imagery. The missile model is then used within ship-decoy-missile engagement simulations, to determine how susceptible it is to the well-known walk-off seduction countermeasure technique. Finally, ship survivability is improved by adjusting the decoy model to increase its effectiveness against the tracker.

  15. Health economic potential of early nutrition programming: a model calculation of long-term reduction in blood pressure and related morbidity costs by use of long-chain polyunsaturated fatty acid-supplemented formula.

    PubMed

    Straub, Niels; Grunert, Philipp; von Kries, Rüdiger; Koletzko, Berthold

    2011-12-01

    The reported effect sizes of early nutrition programming on long-term health outcomes are often small, and it has been questioned whether early interventions would be worthwhile in enhancing public health. We explored the possible health economic consequences of early nutrition programming by performing a model calculation, based on the only published study currently available for analysis, to evaluate the effects of supplementing infant formula with long-chain polyunsaturated fatty acids (LC-PUFAs) on lowering blood pressure and lowering the risk of hypertension-related diseases in later life. The costs and health effects of LC-PUFA-enriched and standard infant formulas were compared by using a Markov model, including all relevant direct and indirect costs based on German statistics. We assessed the effect size of blood pressure reduction from LC-PUFA-supplemented formula, the long-term persistence of the effect, and the effect of lowered blood pressure on hypertension-related morbidity. The cost-effectiveness analysis showed an increased life expectancy of 1.2 quality-adjusted life-years and an incremental cost-effectiveness ratio of -630 Euros (discounted to present value) for the LC-PUFA formula in comparison with standard formula. LC-PUFA nutrition was the superior strategy even when the blood pressure-lowering effect was reduced to the lower 95% CI. Breastfeeding is the recommended feeding practice, but infants who are not breastfed should receive an appropriate infant formula. Following this model calculation, LC-PUFA supplementation of infant formula represents an economically worthwhile prevention strategy, based on the costs derived from hypertension-linked diseases in later life. However, because our analysis was based on a single randomized controlled trial, further studies are required to verify the validity of this thesis.

  16. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.
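
    A shared-savings payment rule of the kind the simulation explores can be sketched as follows; the share, benchmark, and quality-gate parameters are hypothetical, not those of the study's payment model design:

```python
def shared_savings_payout(benchmark, actual, quality,
                          share=0.5, quality_floor=0.7):
    """Sketch of a shared-savings rule: providers keep a share of
    spending below the benchmark, contingent on meeting a minimum
    quality score (all parameters hypothetical)."""
    savings = benchmark - actual
    if savings <= 0.0 or quality < quality_floor:
        return 0.0
    return share * savings
```

    The nonlinearity the study reports arises naturally here: provider payoff responds to the share and quality-floor settings only within certain ranges, so small design changes can flip provider incentives.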

  17. Nonlinear structural joint model updating based on instantaneous characteristics of dynamic responses

    NASA Astrophysics Data System (ADS)

    Wang, Zuo-Cai; Xin, Yu; Ren, Wei-Xin

    2016-08-01

    This paper proposes a new nonlinear joint model updating method for shear type structures based on the instantaneous characteristics of the decomposed structural dynamic responses. To obtain an accurate representation of a nonlinear system's dynamics, the nonlinear joint model is described as the nonlinear spring element with bilinear stiffness. The instantaneous frequencies and amplitudes of the decomposed mono-component are first extracted by the analytical mode decomposition (AMD) method. Then, an objective function based on the residuals of the instantaneous frequencies and amplitudes between the experimental structure and the nonlinear model is created for the nonlinear joint model updating. The optimal values of the nonlinear joint model parameters are obtained by minimizing the objective function using the simulated annealing global optimization method. To validate the effectiveness of the proposed method, a single-story shear type structure subjected to earthquake and harmonic excitations is simulated as a numerical example. Then, a beam structure with multiple local nonlinear elements subjected to earthquake excitation is also simulated. The nonlinear beam structure is updated based on the global and local model using the proposed method. The results show that the proposed local nonlinear model updating method is more effective for structures with multiple local nonlinear elements. Finally, the proposed method is verified by the shake table test of a real high voltage switch structure. The accuracy of the proposed method is quantified both in numerical and experimental applications using the defined error indices. Both the numerical and experimental results have shown that the proposed method can effectively update the nonlinear joint model.
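
    The bilinear-stiffness spring element used to represent the nonlinear joint can be sketched as a simple restoring-force function; the stiffness values and transition displacement are illustrative, not identified parameters from the paper:

```python
def bilinear_restoring_force(x, k1=100.0, k2=40.0, x_yield=0.01):
    """Restoring force of a spring with bilinear stiffness: stiffness k1
    up to the transition displacement x_yield, k2 beyond it
    (illustrative values)."""
    if abs(x) <= x_yield:
        return k1 * x
    sign = 1.0 if x > 0.0 else -1.0
    return sign * (k1 * x_yield + k2 * (abs(x) - x_yield))
```

    Model updating then amounts to choosing k1, k2, and x_yield so that the instantaneous frequencies and amplitudes of the simulated response match those extracted from the measured one.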

  18. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

    Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. In the first stage, five dimensions characterize the "who, when, how, and why" of each participatory effort; in the second, models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy. This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is its flexibility to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. The evaluation criteria also provide a structured vocabulary based on clear mechanisms that extends beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  19. Researches of fruit quality prediction model based on near infrared spectrum

    NASA Astrophysics Data System (ADS)

    Shen, Yulin; Li, Lian

    2018-04-01

    With rising standards for food quality and safety, more attention is being paid to the internal quality of fruits, so its measurement is increasingly imperative. Nondestructive analysis of soluble solid content (SSC) and total acid content (TAC) is vital and effective for quality measurement in global fresh produce markets, so this paper aims at establishing a novel fruit internal quality prediction model for SSC and TAC from near infrared spectra. First, prediction models based on PCA + BP neural network, PCA + GRNN network, PCA + BP adaboost strong classifier, PCA + ELM and PCA + LS_SVM classifier are designed and implemented. Then, in the NSCT domain, the median filter and the Savitzky-Golay filter are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select training and test samples. Third, the optimal models are obtained by comparing 15 prediction models under a multi-classifier competition mechanism; nonparametric estimation is introduced to measure model effectiveness, with the reliability and variance of the nonparametric estimate evaluating each prediction result and the estimated value and confidence interval serving as references. The experimental results demonstrate that this approach achieves a sound evaluation of the internal quality of fruit. Finally, cat swarm optimization is employed to optimize the two optimal models obtained from nonparametric estimation; empirical testing indicates that the proposed method provides more accurate and effective results than other forecasting methods.
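
    Of the preprocessing steps listed, the Kennard-Stone sample-selection algorithm is simple enough to sketch; this is a generic implementation of the algorithm, not the paper's code:

```python
def kennard_stone(samples, k):
    """Kennard-Stone selection: start with the two most distant samples,
    then repeatedly add the sample farthest from its nearest
    already-selected neighbour, yielding a training set that spans
    the spectral space."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    n = len(samples)
    # seed with the pair at maximum distance
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda p: dist(samples[p[0]], samples[p[1]]))
    selected = [i0, j0]
    while len(selected) < k:
        remaining = [i for i in range(n) if i not in selected]
        nxt = max(remaining,
                  key=lambda i: min(dist(samples[i], samples[s])
                                    for s in selected))
        selected.append(nxt)
    return selected
```

    Because the selected indices cover the extremes of the data first, the held-out test samples always lie inside the training set's span, which is why the method suits calibration-transfer problems like NIR prediction.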

  20. A New Method for Setting Calculation Sequence of Directional Relay Protection in Multi-Loop Networks

    NASA Astrophysics Data System (ADS)

    Haijun, Xiong; Qi, Zhang

    2016-08-01

    The workload of relay protection setting calculation in multi-loop networks can be reduced effectively by optimizing the setting calculation sequence. A new method for ordering the setting calculations of directional distance relay protection in multi-loop networks, based on the minimum broken-nodes cost vector (MBNCV), is proposed to address the shortcomings of current methods. Existing methods based on the minimum breakpoint set (MBPS) break more edges when untying loops in the dependency relationships among relays, which can increase the iterative workload of setting calculations. A model-driven approach based on behavior trees (BT) is presented to improve adaptability to similar problems. After extending the BT model with real-time system characteristics, a timed BT is derived and the dependency relationships in multi-loop networks are modeled. The model is translated into communicating sequential processes (CSP) models, and an optimized setting calculation sequence for the multi-loop network is finally computed by tools. A 5-node multi-loop network is used as an example to demonstrate the effectiveness of the modeling and calculation method. Several further examples were calculated, with results indicating that the method effectively reduces the number of forcibly broken edges for protection setting calculation in multi-loop networks.

  1. Collaborative Care in Schools: Enhancing Integration and Impact in Youth Mental Health

    PubMed Central

    Lyon, Aaron R.; Whitaker, Kelly; French, William P.; Richardson, Laura P.; Wasse, Jessica Knaster; McCauley, Elizabeth

    2016-01-01

    Collaborative Care is an innovative approach to integrated mental health service delivery that focuses on reducing access barriers, improving service quality, and lowering healthcare expenditures. A large body of evidence supports the effectiveness of Collaborative Care models with adults and, increasingly, for youth. Although existing studies examining these models for youth have focused exclusively on primary care, the education sector is also an appropriate analog for the accessibility that primary care offers to adults. Collaborative Care aligns closely with the practical realities of the education sector and may represent a strategy to achieve some of the objectives of increasingly popular multi-tiered systems of supports frameworks. Unfortunately, no resources exist to guide the application of Collaborative Care models in schools. Based on the existing evidence for Collaborative Care models, the current paper (1) provides a rationale for the adaptation of Collaborative Care models to improve mental health service accessibility and effectiveness in the education sector; (2) presents a preliminary Collaborative Care model for use in schools; and (3) describes avenues for research surrounding school-based Collaborative Care, including the currently funded Accessible, Collaborative Care for Effective School-based Services (ACCESS) project. PMID:28392832

  2. Research on user behavior authentication model based on stochastic Petri nets

    NASA Astrophysics Data System (ADS)

    Zhang, Chengyuan; Xu, Haishui

    2017-08-01

    A behavioural authentication model based on stochastic Petri nets is proposed to capture the randomness, uncertainty and concurrency of user behaviour. The places, transitions, arcs and markings of the stochastic Petri net are used to describe the various authentication steps and game relationships, enabling an effective graphical analysis method for the user behaviour authentication model; corresponding proofs verify that the model is valuable.

  3. The influence of ligament modelling strategies on the predictive capability of finite element models of the human knee joint.

    PubMed

    Naghibi Beidokhti, Hamid; Janssen, Dennis; van de Groes, Sebastiaan; Hazrati, Javad; Van den Boogaard, Ton; Verdonschot, Nico

    2017-12-08

    In finite element (FE) models, knee ligaments can be represented either by a group of one-dimensional springs or by three-dimensional continuum elements based on segmentations. Continuum models more closely approximate the anatomy and facilitate ligament wrapping, while spring models are computationally less expensive. The mechanical properties of ligaments can be based on literature or adjusted specifically for the subject. In the current study we investigated the effect of ligament modelling strategy on the predictive capability of FE models of the human knee joint. The effect of literature-based versus specimen-specific optimized material parameters was evaluated. Experiments were performed on three human cadaver knees, which were modelled in FE models with ligaments represented either by springs or by continuum representations. In the spring representation, collateral ligaments were each modelled with three, and cruciate ligaments with two, single-element bundles. Stiffness parameters and pre-strains were optimized based on laxity tests for both approaches. Validation experiments were conducted to evaluate the outcomes of the FE models. Models (both spring and continuum) with subject-specific properties improved the predicted kinematics and contact outcome parameters. Models incorporating literature-based parameters, and particularly the spring models (with the representations implemented in this study), led to relatively high errors in kinematics and contact pressures. Using a continuum modelling approach resulted in more accurate contact outcome variables than the spring representation with two (cruciate ligaments) and three (collateral ligaments) single-element-bundle representations. However, when the prediction of joint kinematics is of main interest, spring ligament models provide a faster option with acceptable outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A systematic review and qualitative analysis to inform the development of a new emergency department-based geriatric case management model.

    PubMed

    Sinha, Samir K; Bessman, Edward S; Flomenbaum, Neal; Leff, Bruce

    2011-06-01

    We inform the future development of a new geriatric emergency management practice model. We perform a systematic review of the existing evidence for emergency department (ED)-based case management models designed to improve the health, social, and health service utilization outcomes for noninstitutionalized older patients within the context of an index ED visit. This was a systematic review of English-language articles indexed in MEDLINE and CINAHL (1966 to 2010), describing ED-based case management models for older adults. Bibliographies of the retrieved articles were reviewed to identify additional references. A systematic qualitative case study analytic approach was used to identify the core operational components and outcome measures of the described clinical interventions. The authors of the included studies were also invited to verify our interpretations of their work. The determined patterns of component adherence were then used to postulate the relative importance and effect of the presence or absence of a particular component in influencing the overall effectiveness of their respective interventions. Eighteen of 352 studies (reported in 20 articles) met study criteria. Qualitative analyses identified 28 outcome measures and 8 distinct model characteristic components that included having an evidence-based practice model, nursing clinical involvement or leadership, high-risk screening processes, focused geriatric assessments, the initiation of care and disposition planning in the ED, interprofessional and capacity-building work practices, post-ED discharge follow-up with patients, and evaluation and monitoring processes. Of the 15 positive study results, 6 had all 8 characteristic components and 9 were found to be lacking at least 1 component. Two studies with positive results lacked 2 characteristic components and none lacked more than 2 components. 
Of the 3 studies with negative results demonstrating no positive effects based on any outcome tested, one lacked 2, one lacked 3, and one lacked 4 of the 8 model components. Successful models of ED-based case management models for older adults share certain key characteristics. This study builds on the emerging literature in this area and leverages the differences in these models and their associated outcomes to support the development of an evidence-based normative and effective geriatric emergency management practice model designed to address the special care needs and thereby improve the health and health service utilization outcomes of older patients. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  5. Urban stormwater inundation simulation based on SWMM and diffusive overland-flow model.

    PubMed

    Chen, Wenjie; Huang, Guoru; Zhang, Han

    2017-12-01

    With rapid urbanization, inundation-induced property losses have become more and more severe. Urban inundation modeling is an effective way to reduce these losses. This paper introduces a simplified urban stormwater inundation simulation model based on the United States Environmental Protection Agency Storm Water Management Model (SWMM) and a geographic information system (GIS)-based diffusive overland-flow model. SWMM is applied for computation of flows in storm sewer systems and flooding flows at junctions, while the GIS-based diffusive overland-flow model simulates surface runoff and inundation. One observed rainfall scenario on Haidian Island, Hainan Province, China was chosen to calibrate the model and the other two were used for validation. Comparisons of the model results with field-surveyed data and InfoWorks ICM (Integrated Catchment Modeling) modeled results indicated the inundation model in this paper can provide inundation extents and reasonable inundation depths even in a large study area.

  6. Directivity models produced for the Next Generation Attenuation West 2 (NGA-West 2) project

    USGS Publications Warehouse

    Spudich, Paul A.; Watson-Lamprey, Jennie; Somerville, Paul G.; Bayless, Jeff; Shahi, Shrey; Baker, Jack W.; Rowshandel, Badie; Chiou, Brian

    2012-01-01

    Five new directivity models are being developed for the NGA-West 2 project. All are based on the NGA-West 2 database, which is considerably expanded from the original NGA-West database, containing about 3,000 more records from earthquakes having finite-fault rupture models. All of the new directivity models have parameters based on fault dimension in km, not normalized fault dimension. This feature removes a peculiarity of previous models that made them inappropriate for modeling large-magnitude events on long strike-slip faults. Two models are explicitly, and one is implicitly, 'narrowband' models, in which the effect of directivity does not monotonically increase with spectral period but instead peaks at a specific period that is a function of earthquake magnitude. These narrowband functional forms are capable of simulating directivity over a wider range of earthquake magnitude than previous models. The functional forms of the five models are presented.

  7. Perspective: Sloppiness and emergent theories in physics, biology, and beyond.

    PubMed

    Transtrum, Mark K; Machta, Benjamin B; Brown, Kevin S; Daniels, Bryan C; Myers, Christopher R; Sethna, James P

    2015-07-07

    Large scale models of physical phenomena demand the development of new statistical and computational tools in order to be effective. Many such models are "sloppy," i.e., exhibit behavior controlled by a relatively small number of parameter combinations. We review an information theoretic framework for analyzing sloppy models. This formalism is based on the Fisher information matrix, which is interpreted as a Riemannian metric on a parameterized space of models. Distance in this space is a measure of how distinguishable two models are based on their predictions. Sloppy model manifolds are bounded with a hierarchy of widths and extrinsic curvatures. The manifold boundary approximation can extract the simple, hidden theory from complicated sloppy models. We attribute the success of simple effective models in physics as likewise emerging from complicated processes exhibiting a low effective dimensionality. We discuss the ramifications and consequences of sloppy models for biochemistry and science more generally. We suggest that the reason our complex world is understandable is due to the same fundamental reason: simple theories of macroscopic behavior are hidden inside complicated microscopic processes.
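    The sloppiness described here can be demonstrated numerically. The sketch below uses a hypothetical two-exponential fit (a classic toy sloppy model, not an example taken from the paper): it builds the Fisher information matrix J^T J for nearly degenerate decay rates and shows the wide eigenvalue spread, i.e., one stiff and one sloppy parameter direction:

```python
import math

# Toy "sloppy" model: y(t) = exp(-p1*t) + exp(-p2*t), observed on a time grid.
# For unit-variance Gaussian noise the Fisher information is F = J^T J, where
# J[i][j] = d y(t_i) / d p_j.
p1, p2 = 1.0, 1.1          # nearly degenerate decay rates
ts = [0.1 * i for i in range(1, 31)]

J = [[-t * math.exp(-p1 * t), -t * math.exp(-p2 * t)] for t in ts]

# Assemble the 2x2 Fisher information matrix F = J^T J.
a = sum(r[0] * r[0] for r in J)
b = sum(r[0] * r[1] for r in J)
d = sum(r[1] * r[1] for r in J)

# Eigenvalues of a symmetric 2x2 matrix via trace and determinant.
tr, det = a + d, a * d - b * b
disc = math.sqrt(tr * tr - 4 * det)
lam_max, lam_min = (tr + disc) / 2, (tr - disc) / 2

# A large eigenvalue ratio means the model's behavior is controlled by one
# stiff parameter combination while the orthogonal direction is "sloppy".
ratio = lam_max / lam_min
```

    In the interpretation of the abstract, distances on the model manifold along the sloppy eigendirection are tiny, which is why such directions can be removed by the manifold boundary approximation.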

  8. An overall strategy based on regression models to estimate relative survival and model the effects of prognostic factors in cancer survival studies.

    PubMed

    Remontet, L; Bossard, N; Belot, A; Estève, J

    2007-05-10

    Relative survival provides a measure of the proportion of patients dying from the disease under study without requiring the knowledge of the cause of death. We propose an overall strategy based on regression models to estimate the relative survival and model the effects of potential prognostic factors. The baseline hazard was modelled until 10 years follow-up using parametric continuous functions. Six models including cubic regression splines were considered and the Akaike Information Criterion was used to select the final model. This approach yielded smooth and reliable estimates of mortality hazard and allowed us to deal with sparse data taking into account all the available information. Splines were also used to model simultaneously non-linear effects of continuous covariates and time-dependent hazard ratios. This led to a graphical representation of the hazard ratio that can be useful for clinical interpretation. Estimates of these models were obtained by likelihood maximization. We showed that these estimates could be also obtained using standard algorithms for Poisson regression. Copyright 2006 John Wiley & Sons, Ltd.
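    As a minimal illustration of the quantity being modelled (the survival figures below are invented, not the paper's data), relative survival and the excess hazard on yearly intervals, assuming piecewise-constant hazards, can be computed as:

```python
import math

# Relative survival: ratio of observed survival in the patient cohort to the
# expected survival of a comparable disease-free population.
observed = [1.0, 0.80, 0.66, 0.55, 0.47]   # cohort survival at years 0..4
expected = [1.0, 0.97, 0.94, 0.91, 0.88]   # background population survival

relative = [o / e for o, e in zip(observed, expected)]

# Excess (disease-related) hazard on [t, t+1): h_excess = h_obs - h_exp,
# with h = -log(S(t+1)/S(t)) under piecewise-constant hazards.
excess = [(-math.log(observed[i + 1] / observed[i]))
          - (-math.log(expected[i + 1] / expected[i]))
          for i in range(len(observed) - 1)]
```

    The regression models in the abstract replace this piecewise estimate with smooth parametric baseline hazards and spline-based covariate effects, but the estimand is the same excess hazard.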

  9. PHYSIOLOGICALLY BASED PHARMACOKINETIC (PBPK) MODELING AND MODE OF ACTION IN DOSE-RESPONSE ASSESSMENT

    EPA Science Inventory

    PHYSIOLOGICALLY BASED PHARMACOKINETIC (PBPK) MODELING AND MODE OF ACTION IN DOSE-RESPONSE ASSESSMENT. Barton HA. Experimental Toxicology Division, National Health and Environmental Effects Laboratory, ORD, U.S. EPA
    Dose-response analysis requires quantitatively linking infor...

  10. Comparing methods to combine functional loss and mortality in clinical trials for amyotrophic lateral sclerosis

    PubMed Central

    van Eijk, Ruben PA; Eijkemans, Marinus JC; Rizopoulos, Dimitris

    2018-01-01

    Objective Amyotrophic lateral sclerosis (ALS) clinical trials based on single end points only partially capture the full treatment effect when both function and mortality are affected, and may falsely dismiss efficacious drugs as futile. We aimed to investigate the statistical properties of several strategies for the simultaneous analysis of function and mortality in ALS clinical trials. Methods Based on the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database, we simulated longitudinal patterns of functional decline, defined by the revised amyotrophic lateral sclerosis functional rating scale (ALSFRS-R) and conditional survival time. Different treatment scenarios with varying effect sizes were simulated with follow-up ranging from 12 to 18 months. We considered the following analytical strategies: 1) Cox model; 2) linear mixed effects (LME) model; 3) omnibus test based on Cox and LME models; 4) composite time-to-6-point decrease or death; 5) combined assessment of function and survival (CAFS); and 6) test based on joint modeling framework. For each analytical strategy, we calculated the empirical power and sample size. Results Both Cox and LME models have increased false-negative rates when treatment exclusively affects either function or survival. The joint model has superior power compared to other strategies. The composite end point increases false-negative rates among all treatment scenarios. To detect a 15% reduction in ALSFRS-R decline and 34% decline in hazard with 80% power after 18 months, the Cox model requires 524 patients, the LME model 794 patients, the omnibus test 526 patients, the composite end point 1,274 patients, the CAFS 576 patients and the joint model 464 patients. Conclusion Joint models have superior statistical power to analyze simultaneous effects on survival and function and may circumvent pitfalls encountered by other end points. 
Optimizing trial end points is essential, as selecting suboptimal outcomes may disguise important treatment clues. PMID:29593436
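    The empirical power calculations reported here follow the usual Monte Carlo recipe: simulate many trials under an assumed effect and count how often the test rejects. A simplified sketch (a two-sample z-test with known variance, standing in for the far richer Cox/LME/joint-model analyses of the study) is:

```python
import math
import random

random.seed(42)

def empirical_power(n_per_arm, effect, sd, n_sim=2000):
    """Monte Carlo power of a two-sample z-test: the fraction of simulated
    trials in which the group difference is declared significant."""
    z_crit = 1.96          # two-sided alpha = 0.05
    hits = 0
    for _ in range(n_sim):
        a = [random.gauss(0.0, sd) for _ in range(n_per_arm)]
        b = [random.gauss(effect, sd) for _ in range(n_per_arm)]
        diff = sum(b) / n_per_arm - sum(a) / n_per_arm
        se = sd * math.sqrt(2.0 / n_per_arm)   # known-variance z-test
        if abs(diff / se) > z_crit:
            hits += 1
    return hits / n_sim

power = empirical_power(n_per_arm=100, effect=0.4, sd=1.0)
```

    Sample-size figures like those quoted in the abstract come from running such simulations over a grid of n until the empirical power crosses the 80% target.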

  11. Amazon rainforest responses to elevated CO2: Deriving model-based hypotheses for the AmazonFACE experiment

    NASA Astrophysics Data System (ADS)

    Rammig, A.; Fleischer, K.; Lapola, D.; Holm, J.; Hoosbeek, M.

    2017-12-01

    Increasing atmospheric CO2 concentration is assumed to have a stimulating effect ("CO2 fertilization effect") on forest growth and resilience. Empirical evidence, however, for the existence and strength of such a tropical CO2 fertilization effect is scarce and thus a major impediment for constraining the uncertainties in Earth System Model projections. The implications of the tropical CO2 effect are far-reaching, as it strongly influences the global carbon and water cycle, and hence future global climate. In the scope of the Amazon Free Air CO2 Enrichment (FACE) experiment, we addressed these uncertainties by assessing the CO2 fertilization effect at ecosystem scale. AmazonFACE is the first FACE experiment in an old-growth, highly diverse tropical rainforest. Here, we present a priori model-based hypotheses for the experiment derived from a set of 12 ecosystem models. Model simulations identified key uncertainties in our understanding of limiting processes and derived model-based hypotheses of expected ecosystem responses to elevated CO2 that can directly be tested during the experiment. Ambient model simulations compared satisfactorily with in-situ measurements of ecosystem carbon fluxes, as well as carbon, nitrogen, and phosphorus stocks. Models consistently predicted an increase in photosynthesis with elevated CO2, which declined over time due to developing limitations. The conversion of enhanced photosynthesis into biomass, and hence ecosystem carbon sequestration, varied strongly among the models due to different assumptions on nutrient limitation. Models with flexible allocation schemes consistently predicted an increased investment in belowground structures to alleviate nutrient limitation, in turn accelerating turnover rates of soil organic matter. The models diverged on the prediction for carbon accumulation after 10 years of elevated CO2, mainly due to contrasting assumptions in their phosphorus cycle representation. 
These differences define the expected response ratio to elevated CO2 at the AmazonFACE site and identify priorities for experimental work and model development.

  12. Emerging In Vitro Liver Technologies for Drug Metabolism and Inter-Organ Interactions

    PubMed Central

    Bale, Shyam Sundhar; Moore, Laura

    2016-01-01

    In vitro liver models provide essential information for evaluating drug metabolism, metabolite formation, and hepatotoxicity. Interfacing liver models with other organ models could provide insights into the desirable as well as unintended systemic side effects of therapeutic agents and their metabolites. Such information is invaluable for drug screening processes particularly in the context of secondary organ toxicity. While interfacing of liver models with other organ models has been achieved, platforms that effectively provide human-relevant precise information are needed. In this concise review, we discuss the current state-of-the-art of liver-based multiorgan cell culture platforms primarily from a drug and metabolite perspective, and highlight the importance of media-to-cell ratio in interfacing liver models with other organ models. In addition, we briefly discuss issues related to development of optimal liver models that include recent advances in hepatic cell lines, stem cells, and challenges associated with primary hepatocyte-based liver models. Liver-based multiorgan models that achieve physiologically relevant coupling of different organ models can have a broad impact in evaluating drug efficacy and toxicity, as well as mechanistic investigation of human-relevant disease conditions. PMID:27049038

  13. Modelling food-web mediated effects of hydrological variability and environmental flows.

    PubMed

    Robson, Barbara J; Lester, Rebecca E; Baldwin, Darren S; Bond, Nicholas R; Drouart, Romain; Rolls, Robert J; Ryder, Darren S; Thompson, Ross M

    2017-11-01

    Environmental flows are designed to enhance aquatic ecosystems through a variety of mechanisms; however, to date most attention has been paid to the effects on habitat quality and life-history triggers, especially for fish and vegetation. The effects of environmental flows on food webs have so far received little attention, despite food-web thinking being fundamental to understanding of river ecosystems. Understanding environmental flows in a food-web context can help scientists and policy-makers better understand and manage outcomes of flow alteration and restoration. In this paper, we consider mechanisms by which flow variability can influence and alter food webs, and place these within a conceptual and numerical modelling framework. We also review the strengths and weaknesses of various approaches to modelling the effects of hydrological management on food webs. Although classic bioenergetic models such as Ecopath with Ecosim capture many of the key features required, other approaches, such as biogeochemical ecosystem modelling, end-to-end modelling, population dynamic models, individual-based models, graph theory models, and stock assessment models are also relevant. In many cases, a combination of approaches will be useful. We identify current challenges and new directions in modelling food-web responses to hydrological variability and environmental flow management. These include better integration of food-web and hydraulic models, taking physiologically-based approaches to food quality effects, and better representation of variations in space and time that may create ecosystem control points. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  14. Integrating Partial Polarization into a Metal-Ferroelectric-Semiconductor Field Effect Transistor Model

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat Duen

    1999-01-01

    The ferroelectric channel in a Metal-Ferroelectric-Semiconductor Field Effect Transistor (MFSFET) can partially change its polarization when the gate voltage is near the polarization threshold voltage. This causes the MFSFET drain current to change with repeated pulses of the same gate voltage near the polarization threshold voltage. A previously developed model [1], based on the Fermi-Dirac function, assumed that for a given gate voltage and channel polarization, a single drain current value would be generated. A study has been done to characterize the effects of partial polarization on the drain current of a MFSFET. These effects have been described mathematically, and these equations have been incorporated into a more comprehensive mathematical model of the MFSFET. The model takes into account the hysteretic nature of the MFSFET and the time-dependent decay as well as the effects of partial polarization. This model defines the drain current by calculating the degree of polarization from previous gate pulses, the present gate voltage, and the amount of time since the last gate voltage pulse.
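    As a rough illustration of the idea (not the authors' equations; every parameter and function below is hypothetical), a Fermi-Dirac-shaped polarization target combined with a fractional update per gate pulse reproduces the qualitative behaviour described: repeated identical near-threshold pulses keep shifting the polarization, and hence the drain current:

```python
import math

# Hypothetical parameters for the sketch only.
V_TH = 1.0     # polarization threshold voltage (V)
KT = 0.15      # shape parameter of the Fermi-Dirac-like transition (V)

def target_polarization(v_gate):
    """Saturated polarization for a given gate voltage, mapped to [-1, 1]
    via a Fermi-Dirac-shaped sigmoid."""
    return 2.0 / (1.0 + math.exp((V_TH - v_gate) / KT)) - 1.0

def apply_pulse(p, v_gate, fraction=0.4):
    """Partial polarization: one gate pulse closes only part of the gap
    between the current state and the saturated target."""
    return p + fraction * (target_polarization(v_gate) - p)

# Repeated pulses at the same near-threshold gate voltage: the state creeps
# monotonically toward saturation instead of switching in one step.
p = -1.0
history = []
for _ in range(5):
    p = apply_pulse(p, v_gate=1.05)
    history.append(p)
```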

  15. Empowering Effective STEM Role Models to Promote STEM Equity in Local Communities

    NASA Astrophysics Data System (ADS)

    Harte, T.; Taylor, J.

    2017-12-01

    Empowering Effective STEM Role Models, a three-hour training developed and successfully implemented by NASA Langley Research Center's Science Directorate, is an effort to encourage STEM professionals to serve as role models within their community. The training is designed to help participants reflect on their identity as a role model and provide research-based strategies to effectively engage youth, particularly girls, in STEM (science, technology, engineering, and mathematics). Research shows that even though girls and boys do not demonstrate a significant difference in their ability to be successful in mathematics and science, there is a significant difference in their confidence level when participating in STEM subject matter and pursuing STEM careers. The Langley training model prepares professionals to disrupt this pattern and take on the habits and skills of effective role models. The training model is based on other successful models and resources for role modeling in STEM including SciGirls; the National Girls Collaborative; and publications by the American Association of University Women and the National Academies. It includes a significant reflection component, and participants walk through situation-based scenarios to practice a focused suite of research-based strategies. These strategies can be implemented in a variety of situations and adapted to the needs of groups that are underrepresented in STEM fields. Underpinning the training and the discussions is the fostering of a growth mindset and promoting perseverance. "The Power of Yet" becomes a means whereby role models encourage students to believe in themselves, working toward reaching their goals and dreams in the area of STEM. To provide additional support, NASA Langley role model trainers are available to work with a champion at other organizations to facilitate the training. 
This champion helps recruit participants, seeks leadership buy-in, and helps provide valuable insights for needs and interests specific to the organization. After the in-person training experience, participants receive additional follow-up support by working with their local champions and the NASA Langley trainers. The goal is to share the role model training model in an effort to empower STEM role models and assist in promoting STEM Equity in all communities.

  16. Modeling of surface dust concentration in snow cover at industrial area using neural networks and kriging

    NASA Astrophysics Data System (ADS)

    Sergeev, A. P.; Tarasov, D. A.; Buevich, A. G.; Shichkin, A. V.; Tyagunov, A. G.; Medvedev, A. N.

    2017-06-01

    Modeling of the spatial distribution of pollutants in urbanized territories is difficult, especially when there are multiple emission sources. When monitoring such territories, it is often impossible to arrange the necessary detailed sampling. Because of this, the usual methods of analysis and forecasting based on geostatistics are often less effective. Approaches based on artificial neural networks (ANNs) demonstrate the best results under these circumstances. This study compares two ANN-based models, a multilayer perceptron (MLP) and generalized regression neural networks (GRNNs), with the baseline geostatistical method, kriging. Models of the spatial dust distribution in the snow cover around an existing copper quarry and in the area of emissions of a nickel factory were created. To assess the effectiveness of the models, three indices were used: the mean absolute error (MAE), the root-mean-square error (RMSE), and the relative root-mean-square error (RRMSE). Taking all indices into account, the most accurate model proved to be the GRNN that included the coordinates of the sampling points and the distance to the likely emission source as input parameters. Maps of spatial dust distribution in the snow cover were created for the study area. It has been shown that the models based on ANNs were more accurate than kriging, particularly in the context of a limited data set.
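    The three comparison indices are standard. A minimal sketch with toy data follows; note that RRMSE has several conventions, and normalizing the RMSE by the observed mean, as done here, is an assumption:

```python
import math

def error_metrics(observed, predicted):
    """MAE, RMSE and relative RMSE (RRMSE, here RMSE over the observed
    mean) as used to compare spatial interpolation models."""
    n = len(observed)
    resid = [p - o for o, p in zip(observed, predicted)]
    mae = sum(abs(r) for r in resid) / n
    rmse = math.sqrt(sum(r * r for r in resid) / n)
    rrmse = rmse / (sum(observed) / n)
    return mae, rmse, rrmse

# Toy dust-concentration values at four sampling points.
obs = [10.0, 12.0, 9.0, 11.0]
pred = [11.0, 11.0, 10.0, 10.0]
mae, rmse, rrmse = error_metrics(obs, pred)
```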

  17. Biases and power for groups comparison on subjective health measurements.

    PubMed

    Hamel, Jean-François; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Roquelaure, Yves; Sébille, Véronique

    2012-01-01

    Subjective health measurements are increasingly used in clinical research, particularly for comparisons between patient groups. Two main types of analytical strategies can be used for such data: so-called classical test theory (CTT), relying on observed scores, and models from item response theory (IRT), relying on a response model that relates the item responses to a latent parameter, often called the latent trait. Whether IRT or CTT is the more appropriate method for comparing two independent groups of patients on a patient-reported outcome measure remains unknown and was investigated using simulations. For CTT-based analyses, group comparison was performed using a t-test on the scores. For IRT-based analyses, several methods were compared, according to whether the Rasch model was considered with random effects or with fixed effects, and whether the group effect was included as a covariate or not. Individual latent trait values were estimated using either a deterministic method or stochastic approaches. Latent traits were then compared with a t-test. Finally, a two-step method was performed to compare the latent trait distributions, and a Wald test was performed to test the group effect in the Rasch model including group covariates. The only unbiased IRT-based method was the Wald test on the group covariate, performed on the random-effects Rasch model. This model displayed the highest observed power, which was similar to the power of the score t-test. These results need to be extended to the case, frequently encountered in practice, where data are missing and possibly informative.
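    For reference, the Rasch model underlying these IRT analyses gives the probability of endorsing an item as a logistic function of the gap between a person's latent trait and the item's difficulty; a group effect enters as a shift of the latent-trait distribution. A minimal sketch:

```python
import math

def rasch_prob(theta, b):
    """Probability that a person with latent trait theta endorses an item
    of difficulty b, under the Rasch (one-parameter logistic) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Higher latent trait -> higher endorsement probability, for any item;
# when trait equals difficulty the probability is exactly one half.
p_low = rasch_prob(theta=-1.0, b=0.0)
p_mid = rasch_prob(theta=0.0, b=0.0)
p_high = rasch_prob(theta=1.0, b=0.0)
```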

  18. Theory-based interventions for contraception.

    PubMed

    Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen, Mario; Stockton, Laurie L

    2013-08-07

    The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. We reviewed randomized controlled trials (RCTs) that tested a theoretical approach to inform contraceptive choice; encourage contraceptive use; or promote adherence to, or continuation of, a contraceptive regimen. Through June 2013, we searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, ClinicalTrials.gov, and ICTRP). Previous searches also included EMBASE. For the initial review, we wrote to investigators to find other trials. Trials tested a theory-based intervention for improving contraceptive use. We excluded trials focused on high-risk groups and preventing sexually transmitted infections or HIV. Interventions addressed the use of one or more contraceptive methods for contraception. The reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice or use, and contraceptive adherence or continuation. The primary author evaluated abstracts for eligibility. Two authors extracted data from included studies. For the dichotomous outcomes, the Mantel-Haenszel odds ratio (OR) with 95% CI was calculated using a fixed-effect model. Cluster randomized trials used various methods of accounting for the clustering, such as multilevel modeling. Most reports did not provide information to calculate the effective sample size. Therefore, we presented the results as reported by the investigators. No meta-analysis was conducted due to differences in interventions and outcome measures. 
We included three new trials for a total of 17. Ten randomly assigned individuals and seven were cluster-randomized. Eight trials showed some intervention effect. Two of 12 trials with pregnancy or birth data showed some effect. A theory-based group was less likely than the comparison group to have a second birth (OR 0.41; 95% CI 0.17 to 1.00) or to report a pregnancy (OR 0.24 (95% CI 0.10 to 0.56); OR 0.27 (95% CI 0.11 to 0.66)). The theoretical bases were social cognitive theory (SCT) and another social cognition model. Of 12 trials with data on contraceptive use (non-condom), six showed some effect. A theory-based group was more likely to consistently use oral contraceptives (OR 1.41; 95% CI 1.06 to 1.87), hormonal contraceptives (reported relative risk (RR) 1.30; 95% CI 1.06 to 1.58) or dual methods (reported RR 1.36; 95% CI 1.01 to 1.85); to use an effective contraceptive method (reported effect size 1.76; OR 2.04 (95% CI 1.47 to 2.83)) or use more habitual contraception (reported P < 0.05); and were less likely to use ineffective contraception (OR 0.56; 95% CI 0.31 to 0.98). Theories and models included the Health Belief Model (HBM), SCT, SCT plus another theory, other social cognition, and motivational interviewing (MI). For condom use, a theory-based group had favorable results in 5 of 11 trials. The main differences were reporting more consistent condom use (reported RR 1.57; 95% CI 1.28 to 1.94) and more condom use during last sex (reported results: risk ratio 1.47 (95% CI 1.12 to 1.93); effect size 1.68; OR 2.12 (95% CI 1.24 to 3.56); OR 1.45 (95% CI 1.03 to 2.03)). The theories were SCT, SCT plus another theory, and HBM. Nearly all trials provided multiple sessions or contacts. SCT provided the basis for seven trials focused on adolescents, of which five reported some effectiveness. Two others based on other social cognition models had favorable results with adolescents. Of six trials including adult women, five provided individual sessions. 
Some effect was seen in two using MI and one using the HBM. Two based on the Transtheoretical Model did not show any effect. Eight trials provided evidence of high or moderate quality. Family planning researchers and practitioners could adapt the effective interventions, although most provided group sessions for adolescents. Three were conducted outside the USA. Clinics and low-resource settings need high-quality evidence on changing behavior. Thorough use of single theories would help in identifying what works, as would better reporting on research design and intervention implementation.
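
    The Mantel-Haenszel fixed-effect pooling described in the methods can be sketched as follows; the two 2×2 tables are invented for illustration (not data from the review), and the confidence interval uses the Robins-Breslow-Greenland variance estimate:

    ```python
    import math

    # Hypothetical 2x2 tables: (a, b) = events/non-events in the theory-based arm,
    # (c, d) = events/non-events in the comparison arm. Not data from the review.
    tables = [(12, 88, 24, 76), (8, 92, 15, 85)]

    R = [a * d / (a + b + c + d) for a, b, c, d in tables]
    S = [b * c / (a + b + c + d) for a, b, c, d in tables]
    or_mh = sum(R) / sum(S)  # Mantel-Haenszel pooled odds ratio

    # Robins-Breslow-Greenland variance of ln(OR_MH)
    P = [(a + d) / (a + b + c + d) for a, b, c, d in tables]
    Q = [(b + c) / (a + b + c + d) for a, b, c, d in tables]
    var = (sum(p * r for p, r in zip(P, R)) / (2 * sum(R) ** 2)
           + sum(p * s + q * r for p, q, r, s in zip(P, Q, R, S))
             / (2 * sum(R) * sum(S))
           + sum(q * s for q, s in zip(Q, S)) / (2 * sum(S) ** 2))
    se = math.sqrt(var)
    ci_low = math.exp(math.log(or_mh) - 1.96 * se)
    ci_high = math.exp(math.log(or_mh) + 1.96 * se)
    print(f"OR {or_mh:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
    ```

    Each stratum (trial) contributes to the pooled OR in proportion to its size, which is why the method is robust with sparse cells compared with inverse-variance weighting of per-trial log odds ratios.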

  19. A Capability-Based Approach to Analyzing the Effectiveness and Robustness of an Offshore Patrol Vessel in the Search and Rescue Mission

    DTIC Science & Technology

    2012-12-01

    Table-of-contents excerpt: NPSS model development; comparison of the NPSS model with the Italian model; conclusions and recommendations (1. Italian Model Recommendations; 2. NPSS Model …)

  20. Effective Classroom Management and Instruction: An Exploration of Models. Executive Summary of Final Report.

    ERIC Educational Resources Information Center

    Evertson, Carolyn M.; And Others

    A summary is presented of the final report, "Effective Classroom Management and Instruction: An Exploration of Models." The final report presents a set of linked investigations of the effects of training teachers in effective classroom management practices in a series of school-based workshops. Four purposes were addressed by the study: (1) to…

  1. [Comparison of Flu Outbreak Reporting Standards Based on Transmission Dynamics Model].

    PubMed

    Yang, Guo-jing; Yi, Qing-jie; Li, Qin; Zeng, Qing

    2016-05-01

    To compare the current two flu outbreak reporting standards for the purpose of better prevention and control of flu outbreaks. A susceptible-exposed-infectious/asymptomatic-removed (SEIAR) model without interventions was set up first, followed by a model with interventions based on the real situation. Simulated interventions were developed based on the two reporting standards, and evaluated by estimated duration of outbreaks, cumulative new cases, cumulative morbidity rates, decline in percentage of morbidity rates, and cumulative secondary cases. The basic reproductive number of the outbreak was estimated as 8.2. The simulation produced results similar to the real situation. The effect of interventions based on reporting standard one (10 accumulated new cases in a week) was better than that of interventions based on reporting standard two (30 accumulated new cases in a week). Reporting standard one (10 accumulated new cases in a week) is more effective for prevention and control of flu outbreaks.
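
    A minimal forward-Euler sketch of an SEIAR compartment model of the kind described above; all parameter values are illustrative assumptions, not the fitted values behind the paper's estimated basic reproductive number of 8.2:

    ```python
    def seiar(beta=1.2, sigma=1/1.5, p=0.7, gamma=1/3, kappa=0.5,
              n=1000.0, days=90, dt=0.1):
        """Susceptible-Exposed-Infectious/Asymptomatic-Removed model, forward Euler.

        All parameters are illustrative assumptions: beta = transmission rate,
        sigma = 1/latent period, p = symptomatic fraction, gamma = removal rate,
        kappa = relative infectiousness of asymptomatic cases.
        """
        s, e, i, a, r = n - 1.0, 0.0, 1.0, 0.0, 0.0
        for _ in range(int(days / dt)):
            new_inf = beta * s * (i + kappa * a) / n
            s += dt * (-new_inf)
            e += dt * (new_inf - sigma * e)
            i += dt * (p * sigma * e - gamma * i)
            a += dt * ((1 - p) * sigma * e - gamma * a)
            r += dt * (gamma * (i + a))
        return s, e, i, a, r

    s, e, i, a, r = seiar()
    ```

    Because every outflow from one compartment is an inflow to another, the scheme conserves the total population exactly; intervention scenarios can then be compared by reducing beta once a reporting threshold is crossed.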

  2. Characteristics Of Ferroelectric Logic Gates Using a Spice-Based Model

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Phillips, Thomas A.; Ho, Fat D.

    2005-01-01

    A SPICE-based model of an n-channel ferroelectric field effect transistor has been developed based on both theoretical and empirical data. This model was used to generate the I-V characteristics of several logic gates. The use of ferroelectric field effect transistors in memory circuits is being developed by several organizations. The use of FFETs in other circuits, both analog and digital, needs to be better understood. The ability of FFETs to have different characteristics depending on the initial polarization can be used to create logic gates. These gates can have properties not available to standard CMOS logic gates, such as memory and reconfigurability. This paper investigates basic properties of FFET logic gates. It models an FFET inverter, a NAND gate, and a multi-input NAND gate. The I-V characteristics of the gates are presented, as well as transfer characteristics and timing. The model used is a SPICE-based model developed from empirical data from actual ferroelectric transistors. It simulates all major characteristics of the ferroelectric transistor, including polarization, hysteresis, and decay. Contrasts are made between FFET logic gates and CMOS logic gates. FFET parameters are varied to show the effect on the overall gate. A reconfigurable gate, which is not possible with CMOS circuits, is investigated. The paper concludes that FFETs can be used in logic gates and have several advantages over standard CMOS gates.
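
    To illustrate how a polarization-dependent threshold voltage turns a transistor into a logic element, a square-law sketch of a resistor-loaded inverter is given below; this is not the paper's SPICE model, and k, Vdd, and the load resistance are assumed values:

    ```python
    def ids(vgs, vds, vt, k=2e-4):
        """Square-law NMOS drain current with a polarization-shifted threshold vt."""
        if vgs <= vt:
            return 0.0
        vov = vgs - vt
        if vds < vov:                      # triode region
            return k * (vov - vds / 2) * vds
        return 0.5 * k * vov ** 2          # saturation

    def inverter_out(vin, vt, vdd=5.0, r_load=50e3):
        """Solve vdd - vout = r_load * ids(vin, vout, vt) by bisection."""
        lo, hi = 0.0, vdd
        for _ in range(60):
            mid = (lo + hi) / 2
            if vdd - mid - r_load * ids(vin, mid, vt) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    v_high = inverter_out(0.0, vt=1.0)   # input low  -> output near vdd
    v_low = inverter_out(5.0, vt=1.0)    # input high -> output near 0
    ```

    With an intermediate input, the output depends on the stored polarization state: `inverter_out(1.5, vt=1.0)` sits well below `inverter_out(1.5, vt=2.0)`, which is the mechanism a reconfigurable FFET gate exploits.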

  3. A Hybrid Multi-Scale Model of Crystal Plasticity for Handling Stress Concentrations

    DOE PAGES

    Sun, Shang; Ramazani, Ali; Sundararaghavan, Veera

    2017-09-04

    Microstructural effects become important at regions of stress concentrators such as notches, cracks and contact surfaces. A multiscale model is presented that efficiently captures microstructural details at such critical regions. The approach is based on a multiresolution mesh that includes an explicit microstructure representation at critical regions where stresses are localized. At regions farther away from the stress concentration, a reduced order model that statistically captures the effect of the microstructure is employed. The statistical model is based on a finite element representation of the orientation distribution function (ODF). As an illustrative example, we have applied the multiscaling method to compute the stress intensity factor K_I around the crack tip in a wedge-opening load specimen. The approach is verified with an analytical solution within the linear elasticity approximation and is then extended to allow modeling of microstructural effects on crack tip plasticity.
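
    For context on the analytical verification mentioned above, the closed-form Mode-I stress intensity factor for a through crack in an infinite plate (a simpler geometry than the paper's wedge-opening-load specimen) is K_I = σ√(πa):

    ```python
    import math

    def k_i_infinite_plate(sigma, a):
        """Mode-I stress intensity factor for a through crack of half-length a (m)
        in an infinite plate under remote tension sigma (Pa); returns Pa*sqrt(m)."""
        return sigma * math.sqrt(math.pi * a)

    k = k_i_infinite_plate(100e6, 0.01)  # 100 MPa remote stress, 10 mm half-crack
    ```

    Finite-geometry specimens multiply this by a dimensionless correction factor, which is what a numerical model is checked against.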

  5. Improved turbulence models based on large eddy simulation of homogeneous, incompressible turbulent flows

    NASA Technical Reports Server (NTRS)

    Bardina, J.; Ferziger, J. H.; Reynolds, W. C.

    1983-01-01

    The physical bases of large eddy simulation and subgrid modeling are studied. A subgrid scale similarity model is developed that can account for system rotation. Large eddy simulations of homogeneous shear flows with system rotation were carried out. Apparently contradictory experimental results were explained. The main effect of rotation is to increase the transverse length scales in the rotation direction, and thereby decrease the rates of dissipation. Experimental results are shown to be affected by conditions at the turbulence producing grid, which make the initial states a function of the rotation rate. A two equation model is proposed that accounts for effects of rotation and shows good agreement with experimental results. In addition, a Reynolds stress model is developed that represents the turbulence structure of homogeneous shear flows very well and can account also for the effects of system rotation.

  6. Gate voltage dependent 1/f noise variance model based on physical noise generation mechanisms in n-channel metal-oxide-semiconductor field-effect transistors

    NASA Astrophysics Data System (ADS)

    Arai, Yukiko; Aoki, Hitoshi; Abe, Fumitaka; Todoroki, Shunichiro; Khatami, Ramin; Kazumi, Masaki; Totsuka, Takuya; Wang, Taifeng; Kobayashi, Haruo

    2015-04-01

    1/f noise is one of the most important characteristics for designing analog/RF circuits, including operational amplifiers and oscillators. We have analyzed and developed a novel 1/f noise model in the strong inversion, saturation, and sub-threshold regions, based on the SPICE2-type noise model used in the public metal-oxide-semiconductor field-effect transistor (MOSFET) models developed by the University of California, Berkeley. Our model contains two noise generation mechanisms: mobility fluctuation and interface trap number fluctuation. Noise variability dependent on gate voltage is also newly implemented in our model. The proposed model has been implemented in the BSIM4 model of a SPICE3-compatible circuit simulator. Parameters of the proposed model are extracted from 1/f noise measurements for simulation verification. The simulation results show excellent agreement between measurements and simulations.
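
    The SPICE2-style flicker-noise formulation referenced above is commonly written S_id(f) = KF·Id^AF / (Cox·Leff²·f); a sketch with assumed parameter values (KF, AF, Cox, and Leff are illustrative, not the paper's extracted parameters):

    ```python
    def flicker_psd(f, i_d, kf=1e-24, af=1.0, cox=3.45e-3, leff=1e-6):
        """SPICE2-style 1/f drain-current noise PSD (A^2/Hz).

        kf, af: flicker-noise coefficient and exponent (illustrative values);
        cox: gate-oxide capacitance per area (F/m^2); leff: effective length (m).
        """
        return kf * i_d ** af / (cox * leff ** 2 * f)

    s1 = flicker_psd(1e3, 1e-4)   # PSD at 1 kHz for Id = 100 uA
    s2 = flicker_psd(1e4, 1e-4)   # one decade higher in frequency
    ```

    The PSD falls by exactly one decade per decade of frequency, the defining 1/f slope; gate-voltage-dependent variance, as in the paper, requires terms beyond this classic form.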

  7. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for the effect of active illumination, atmospheric attenuation, and turbulence effects relevant to LRG imagers, such as speckle and scintillation, and for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper will describe the NVLRG model in detail, discuss the validation of recent model components, present initial trade study results, and outline plans to validate and calibrate the end-to-end model with field data through human perception testing.
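
    One ingredient such a model must represent is two-way atmospheric attenuation of the laser illumination; a Beer-Lambert sketch (the extinction coefficient and ranges are assumed values, and NVLRG itself additionally models turbulence, speckle, and sensor/display components):

    ```python
    import math

    def two_way_transmission(alpha, r):
        """Beer-Lambert atmospheric transmission for a round trip to range r.

        alpha: extinction coefficient (1/km, assumed); r: range (km).
        """
        return math.exp(-2.0 * alpha * r)

    t_1km = two_way_transmission(0.2, 1.0)
    t_5km = two_way_transmission(0.2, 5.0)
    ```

    The factor of 2 in the exponent reflects that the illumination traverses the path twice, which is why active systems lose signal with range much faster than passive ones.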

  8. A Logical Account of Diagnosis with Multiple Theories

    NASA Technical Reports Server (NTRS)

    Pandurang, P.; Lum, Henry Jr. (Technical Monitor)

    1994-01-01

    Model-based diagnosis is a powerful, first-principles approach to diagnosis. The primary drawback with model-based diagnosis is that it is based on a system model, and this model might be inappropriate. The inappropriateness of models usually stems from the fundamental tradeoff between completeness and efficiency. Recently, Struss has developed an elegant proposal for diagnosis with multiple models. Struss characterizes models as relations and develops a precise notion of abstraction. He defines relations between models and analyzes the effect of a model switch on the space of possible diagnoses. In this paper we extend Struss's proposal in three ways. First, our account of diagnosis with multiple models is based on representing models as more expressive first-order theories, rather than as relations. A key technical contribution is the use of a general notion of abstraction based on interpretations between theories. Second, Struss conflates component modes with models, requiring him to define model relations, such as choices, which result in non-relational models. We avoid this problem by differentiating component modes from models. Third, we present a more general account of simplifications that correctly handles situations where the simplification contradicts the base theory.

  9. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    ERIC Educational Resources Information Center

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes the nonlinear factors into account based on the spatial dynamic panel…

  10. The study of human venous system dynamics using hybrid computer modeling

    NASA Technical Reports Server (NTRS)

    Snyder, M. F.; Rideout, V. C.

    1972-01-01

    A computer-based model of the cardiovascular system was created emphasizing effects on the systemic venous system. Certain physiological aspects were emphasized: effects of heart rate, tilting, changes in respiration, and leg muscular contractions. The results from the model showed close correlation with findings previously reported in the literature.

  11. Evaluation of Lightning Induced Effects in a Graphite Composite Fairing Structure. Parts 1 and 2

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Defining the electromagnetic environment inside a graphite composite fairing due to lightning is of interest to spacecraft developers. This paper is the first in a two-part series and studies the shielding effectiveness of a graphite composite model fairing using derived equivalent properties. A frequency domain Method of Moments (MoM) model is developed and comparisons are made with shielding test results obtained using a vehicle-like composite fairing. The comparison results show that the analytical models can adequately predict the test results. Both measured and model data indicate that graphite composite fairings provide significant attenuation to magnetic fields as frequency increases. Diffusion effects are also discussed. Part 2 examines time-domain effects through the development of loop-based induced-field testing, and a Transmission-Line-Matrix (TLM) model is developed in the time domain to study how the composite fairing affects lightning-induced magnetic fields. Comparisons are made with shielding test results obtained using a vehicle-like composite fairing in the time domain. The comparison results show that the analytical models can adequately predict the test and industry results.

  12. Integrated multiple-model adaptive fault identification and reconfigurable fault-tolerant control for Lead-Wing close formation systems

    NASA Astrophysics Data System (ADS)

    Liu, Chun; Jiang, Bin; Zhang, Ke

    2018-03-01

    This paper investigates the attitude and position tracking control problem for Lead-Wing close formation systems in the presence of loss of effectiveness and lock-in-place or hardover failures. In close formation flight, Wing unmanned aerial vehicle movements are influenced by vortex effects of the neighbouring Lead unmanned aerial vehicle. This allows modelling of the aerodynamic coupling vortex effects and linearisation based on optimal close formation geometry. The linearised Lead-Wing close formation model is transformed into nominal robust H-infinity models with respect to Mach hold, Heading hold, and Altitude hold autopilots; a static feedback H-infinity controller is designed to guarantee effective tracking of attitude and position while manoeuvring the Lead unmanned aerial vehicle. Based on the H-infinity control design, an integrated multiple-model adaptive fault identification and reconfigurable fault-tolerant control scheme is developed to guarantee asymptotic stability of the closed-loop systems, error signal boundedness, and attitude and position tracking properties. Simulation results for Lead-Wing close formation systems validate the efficiency of the proposed integrated multiple-model adaptive control algorithm.

  13. Hamiltonian closures in fluid models for plasmas

    NASA Astrophysics Data System (ADS)

    Tassi, Emanuele

    2017-11-01

    This article reviews recent activity on the Hamiltonian formulation of fluid models for plasmas in the non-dissipative limit, with emphasis on the relations between the fluid closures adopted for the different models and the Hamiltonian structures. The review focuses on results obtained during the last decade, but a few classical results are also described, in order to illustrate connections with the most recent developments. With the hope of making the review accessible not only to specialists in the field, an introduction to the mathematical tools applied in the Hamiltonian formalism for continuum models is provided. Subsequently, we review the Hamiltonian formulation of models based on the magnetohydrodynamics description, including those based on the adiabatic and double adiabatic closure. It is shown how Dirac's theory of constrained Hamiltonian systems can be applied to impose the incompressibility closure on a magnetohydrodynamic model and how an extended version of barotropic magnetohydrodynamics, accounting for two-fluid effects, is amenable to a Hamiltonian formulation. Hamiltonian reduced fluid models, valid in the presence of a strong magnetic field, are also reviewed. In particular, reduced magnetohydrodynamics and models assuming cold ions and different closures for the electron fluid are discussed. Hamiltonian models relaxing the cold-ion assumption are then introduced. These include models where finite Larmor radius effects are added by means of the gyromap technique, and gyrofluid models. Numerical simulations of Hamiltonian reduced fluid models investigating the phenomenon of magnetic reconnection are illustrated. The last part of the review concerns recent results based on the derivation of closures preserving a Hamiltonian structure, based on the Hamiltonian structure of parent kinetic models. 
The identification of such closures for fluid models derived from kinetic systems based on the Vlasov and drift-kinetic equations is presented, and connections with previously discussed fluid models are pointed out.

  14. The Constructionism and Neurocognitive-Based Teaching Model for Promoting Science Learning Outcomes and Creative Thinking

    ERIC Educational Resources Information Center

    Sripongwiwat, Supathida; Bunterm, Tassanee; Srisawat, Niwat; Tang, Keow Ngang

    2016-01-01

    The aim of this study was to examine the effect, after intervention on both experimental and control groups, of constructionism and neurocognitive-based teaching model, and conventional teaching model, on the science learning outcomes and creative thinking of Grade 11 students. The researchers developed a constructionism and neurocognitive-based…

  15. Exploring the Effect of Embedded Scaffolding within Curricular Tasks on Third-Grade Students' Model-Based Explanations about Hydrologic Cycling

    ERIC Educational Resources Information Center

    Zangori, Laura; Forbes, Cory T.; Schwarz, Christina V.

    2015-01-01

    Opportunities to generate model-based explanations are crucial for elementary students, yet are rarely foregrounded in elementary science learning environments despite evidence that early learners can reason from models when provided with scaffolding. We used a quasi-experimental research design to investigate the comparative impact of a scaffold…

  16. A continuum-based structural modeling approach for cellulose nanocrystals (CNCs)

    Treesearch

    Mehdi Shishehbor; Fernando L. Dri; Robert J. Moon; Pablo D. Zavattieri

    2018-01-01

    We present a continuum-based structural model to study the mechanical behavior of cellulose nanocrystals (CNCs), and analyze the effect of bonded and non-bonded interactions on the mechanical properties under various loading conditions. In particular, this model assumes the uncoupling between the bonded and non-bonded interactions and their behavior is obtained...

  17. A Meta-Analysis of Video Modeling Interventions for Children and Adolescents with Emotional/Behavioral Disorders

    ERIC Educational Resources Information Center

    Clinton, Elias

    2016-01-01

    Video modeling is a non-punitive, evidence-based intervention that has been proven effective for teaching functional life skills and social skills to individuals with autism and developmental disabilities. Compared to the literature base on using video modeling for students with autism and developmental disabilities, fewer studies have examined…

  18. An Amorphous Model for Morphological Processing in Visual Comprehension Based on Naive Discriminative Learning

    ERIC Educational Resources Information Center

    Baayen, R. Harald; Milin, Petar; Durdevic, Dusica Filipovic; Hendrix, Peter; Marelli, Marco

    2011-01-01

    A 2-layer symbolic network model based on the equilibrium equations of the Rescorla-Wagner model (Danks, 2003) is proposed. The study first presents 2 experiments in Serbian, which reveal for sentential reading the inflectional paradigmatic effects previously observed by Milin, Filipovic Durdevic, and Moscoso del Prado Martin (2009) for unprimed…

  19. Improved Characters and Student Learning Outcomes through Development of Character Education Based General Physics Learning Model

    ERIC Educational Resources Information Center

    Derlina; Sabani; Mihardi, Satria

    2015-01-01

    Education Research in Indonesia has begun to lead to the development of character education and is no longer fixated on the outcomes of cognitive learning. This study purposed to produce character education based general physics learning model (CEBGP Learning Model) and with valid, effective and practical peripheral devices to improve character…

  20. Including non-additive genetic effects in Bayesian methods for the prediction of genetic values based on genome-wide markers

    PubMed Central

    2011-01-01

    Background Molecular marker information is a common source to draw inferences about the relationship between genetic and phenotypic variation. Genetic effects are often modelled as additively acting marker allele effects. The true mode of biological action can, of course, be different from this plain assumption. One possibility to better understand the genetic architecture of complex traits is to include intra-locus (dominance) and inter-locus (epistasis) interaction of alleles as well as the additive genetic effects when fitting a model to a trait. Several Bayesian MCMC approaches exist for the genome-wide estimation of genetic effects with high accuracy of genetic value prediction. Including pairwise interaction for thousands of loci would probably go beyond the scope of such a sampling algorithm because then millions of effects are to be estimated simultaneously, leading to months of computation time. Alternative solving strategies are required when epistasis is studied. Methods We extended a fast Bayesian method (fBayesB), which was previously proposed for a purely additive model, to include non-additive effects. The fBayesB approach was used to estimate genetic effects on the basis of simulated datasets. Different scenarios were simulated to study the loss of accuracy of prediction if epistatic effects were not simulated but modelled, and vice versa. Results If 23 QTL were simulated to cause additive and dominance effects, both fBayesB and a conventional MCMC sampler, BayesB, yielded similar results in terms of accuracy of genetic value prediction and bias of variance component estimation based on a model including additive and dominance effects. Applying fBayesB to data with epistasis, accuracy could be improved by 5% when all pairwise interactions were modelled as well. The accuracy decreased more than 20% if genetic variation was spread over 230 QTL. 
In this scenario, accuracy based on modelling only additive and dominance effects was generally superior to that of the complex model including epistatic effects. Conclusions This simulation study showed that the fBayesB approach is convenient for genetic value prediction. Jointly estimating additive and non-additive effects (especially dominance) has reasonable impact on the accuracy of prediction and the proportion of genetic variation assigned to the additive genetic source. PMID:21867519
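
    The additive and dominance effects described above are typically given separate design-matrix codings per marker; a single-locus least-squares sketch on simulated data (not the fBayesB sampler; the effect sizes and noise level are invented):

    ```python
    import random

    random.seed(7)
    n = 500
    g = [random.choice([0, 1, 2]) for _ in range(n)]    # allele counts at one locus
    x_a = [gi - 1 for gi in g]                          # additive coding {-1, 0, 1}
    x_d = [1 if gi == 1 else 0 for gi in g]             # dominance coding {0, 1, 0}
    beta_a, beta_d = 0.8, 0.4                           # invented true effects
    y = [beta_a * xa + beta_d * xd + random.gauss(0, 0.5)
         for xa, xd in zip(x_a, x_d)]

    # Two-predictor least squares via the normal equations (the codings are
    # orthogonal here, because x_a is zero exactly where x_d is one)
    saa = sum(xa * xa for xa in x_a)
    sdd = sum(xd * xd for xd in x_d)
    sad = sum(xa * xd for xa, xd in zip(x_a, x_d))
    say = sum(xa * yi for xa, yi in zip(x_a, y))
    sdy = sum(xd * yi for xd, yi in zip(x_d, y))
    det = saa * sdd - sad * sad
    est_a = (sdd * say - sad * sdy) / det
    est_d = (saa * sdy - sad * say) / det
    ```

    Genome-wide methods such as fBayesB estimate effects like these jointly for thousands of markers, with shrinkage priors in place of plain least squares.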

  1. Modeling the Effects of E-cigarettes on Smoking Behavior: Implications for Future Adult Smoking Prevalence.

    PubMed

    Cherng, Sarah T; Tam, Jamie; Christine, Paul J; Meza, Rafael

    2016-11-01

    Electronic cigarette (e-cigarette) use has increased rapidly in recent years. Given the unknown effects of e-cigarette use on cigarette smoking behaviors, e-cigarette regulation has become the subject of considerable controversy. In the absence of longitudinal data documenting the long-term effects of e-cigarette use on smoking behavior and population smoking outcomes, computational models can guide future empirical research and provide insights into the possible effects of e-cigarette use on smoking prevalence over time. Agent-based model examining hypothetical scenarios of e-cigarette use by smoking status and e-cigarette effects on smoking initiation and smoking cessation. If e-cigarettes increase individual-level smoking cessation probabilities by 20%, the model estimates a 6% reduction in smoking prevalence by 2060 compared with baseline model (no effects) outcomes. In contrast, e-cigarette use prevalence among never smokers would have to rise dramatically from current estimates, with e-cigarettes increasing smoking initiation by more than 200% relative to baseline model estimates to achieve a corresponding 6% increase in smoking prevalence by 2060. Based on current knowledge of the patterns of e-cigarette use by smoking status and the heavy concentration of e-cigarette use among current smokers, the simulated effects of e-cigarettes on smoking cessation generate substantially larger changes to smoking prevalence compared with their effects on smoking initiation.
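
    A stripped-down agent-based sketch of the cessation scenario described above; the baseline quit probability, e-cigarette use prevalence, and time horizon are assumed values, and the actual model also tracks initiation, never smokers, and calendar time:

    ```python
    import random

    def still_smoking(n=20000, years=30, p_quit=0.05, boost=1.0,
                      p_ecig=0.3, seed=42):
        """Fraction of an initial smoker cohort still smoking after `years`.

        p_quit: assumed baseline annual cessation probability;
        p_ecig: assumed fraction of smokers who use e-cigarettes;
        boost:  multiplier on the quit probability for e-cigarette users.
        """
        random.seed(seed)
        remaining = 0
        for _ in range(n):
            uses_ecig = random.random() < p_ecig
            p = min(1.0, p_quit * (boost if uses_ecig else 1.0))
            smoking = True
            for _ in range(years):
                if random.random() < p:
                    smoking = False
                    break
            remaining += smoking
        return remaining / n

    base = still_smoking(boost=1.0)       # no e-cigarette effect
    boosted = still_smoking(boost=1.2)    # 20% higher quit probability for users
    ```

    Under these assumptions, raising users' quit probability by 20% lowers the fraction still smoking, the direction of effect reported in the abstract; the magnitude depends strongly on the assumed prevalence of use among smokers.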

  2. Modeling the Effects of E-Cigarettes on Smoking Behavior: Implications for Future Adult Smoking Prevalence

    PubMed Central

    Cherng, Sarah T.; Tam, Jamie; Christine, Paul; Meza, Rafael

    2016-01-01

    Background Electronic cigarette (e-cigarette) use has increased rapidly in recent years. Given the unknown effects of e-cigarette use on cigarette smoking behaviors, e-cigarette regulation has become the subject of considerable controversy. In the absence of longitudinal data documenting the long-term effects of e-cigarette use on smoking behavior and population smoking outcomes, computational models can guide future empirical research and provide insights into the possible effects of e-cigarette use on smoking prevalence over time. Methods Agent-based model examining hypothetical scenarios of e-cigarette use by smoking status and e-cigarette effects on smoking initiation and smoking cessation. Results If e-cigarettes increase individual-level smoking cessation probabilities by 20%, the model estimates a 6% reduction in smoking prevalence by 2060 compared to baseline model (no effects) outcomes. In contrast, e-cigarette use prevalence among never smokers would have to rise dramatically from current estimates, with e-cigarettes increasing smoking initiation by more than 200% relative to baseline model estimates in order to achieve a corresponding 6% increase in smoking prevalence by 2060. Conclusions Based on current knowledge of the patterns of e-cigarette use by smoking status and the heavy concentration of e-cigarette use among current smokers, the simulated effects of e-cigarettes on smoking cessation generate substantially larger changes to smoking prevalence relative to their effects on smoking initiation. PMID:27093020

  3. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful service descriptions are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach and its desirable features are discussed.
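
    The token-game semantics underlying Petri nets (and thus XML-net) can be sketched minimally; the order-handling net below is an invented example, not the paper's supply-chain model:

    ```python
    # Each transition consumes tokens from its input places (pre) and produces
    # tokens in its output places (post); it may fire only when enabled.
    def enabled(marking, pre):
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        assert enabled(marking, pre), "transition not enabled"
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    # Invented order-handling net: order placed -> order shipped -> order closed
    m0 = {"placed": 1, "shipped": 0, "closed": 0}
    ship = ({"placed": 1}, {"shipped": 1})
    close = ({"shipped": 1}, {"closed": 1})

    m1 = fire(m0, *ship)
    m2 = fire(m1, *close)
    ```

    High-level variants such as XML-net attach structured data (here, XML documents) to the tokens, so a firing also transforms the documents flowing through the process.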

  4. 3D-modelling of the thermal circumstances of a lake under artificial aeration

    NASA Astrophysics Data System (ADS)

    Tian, Xiaoqing; Pan, Huachen; Köngäs, Petrina; Horppila, Jukka

    2017-12-01

    A 3D model was developed to study the effects of hypolimnetic aeration on the temperature profile of the thermally stratified Lake Vesijärvi (southern Finland). Aeration was conducted by pumping epilimnetic water through the thermocline to the hypolimnion without breaking the thermal stratification. The model used a time-transient formulation based on the Navier-Stokes equations. The model was fitted to the vertical temperature distribution and environmental parameters (wind, air temperature, and solar radiation) before the onset of aeration, and was used to predict the vertical temperature distribution 3 and 15 days after the onset of aeration (1 August and 22 August). The difference between the modelled and observed temperature was on average 0.6 °C. The average percentage model error was 4.0% on 1 August and 3.7% on 22 August. In the epilimnion, model accuracy depended on the difference between the observed temperature and the boundary conditions. In the hypolimnion, the model residual decreased with increasing depth. On 1 August, the model predicted a homogeneous temperature profile in the hypolimnion, while the observed temperature decreased moderately from the thermocline to the bottom. This was because the effect of the sediment was not included in the model. On 22 August, the modelled and observed temperatures near the bottom were identical, demonstrating that the heat transfer by the aerator masked the effect of the sediment and that excluding sediment heat from the model does not cause considerable error unless very short-term effects of aeration are studied. In all, the model successfully described the effects of the aerator on the lake's temperature profile. The results confirm the validity of the applied computational fluid dynamics for artificial aeration; based on the simulated results, the effect of aeration can be predicted.

  5. Effects of Training Leaders in Needs-Based Methods of Running Meetings

    ERIC Educational Resources Information Center

    Douglass, Emily M.; Malouff, John M.; Rangan, Julie A.

    2015-01-01

    This study evaluated the effects of brief training in how to lead organizational meetings. The training was based on an attendee-needs-based model of running meetings. Twelve mid-level managers completed the training. The study showed a significant pre to post increase in the number of needs-based behaviors displayed by meeting leaders and in…

  6. Evaluation of uncertainty in the adjustment of fundamental constants

    NASA Astrophysics Data System (ADS)

    Bodnar, Olha; Elster, Clemens; Fischer, Joachim; Possolo, Antonio; Toman, Blaza

    2016-02-01

    Combining multiple measurement results for the same quantity is an important task in metrology and in many other areas. Examples include the determination of fundamental constants, the calculation of reference values in interlaboratory comparisons, or the meta-analysis of clinical studies. However, neither the GUM nor its supplements give any guidance for this task. Various approaches are applied such as weighted least-squares in conjunction with the Birge ratio or random effects models. While the former approach, which is based on a location-scale model, is particularly popular in metrology, the latter represents a standard tool used in statistics for meta-analysis. We investigate the reliability and robustness of the location-scale model and the random effects model with particular focus on resulting coverage or credible intervals. The interval estimates are obtained by adopting a Bayesian point of view in conjunction with a non-informative prior that is determined by a currently favored principle for selecting non-informative priors. Both approaches are compared by applying them to simulated data as well as to data for the Planck constant and the Newtonian constant of gravitation. Our results suggest that the proposed Bayesian inference based on the random effects model is more reliable and less sensitive to model misspecifications than the approach based on the location-scale model.
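Both combination approaches compared in the abstract start from the same inverse-variance-weighted mean; a minimal sketch of that estimate and of the Birge ratio used to flag excess scatter (with toy measurement values, not the fundamental-constants data) is:

```python
import math

def weighted_mean_birge(values, uncertainties):
    """Combine independent measurements x_i with standard uncertainties u_i
    by inverse-variance weighting, and compute the Birge ratio."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    # Internal uncertainty of the weighted mean
    u_int = math.sqrt(1.0 / sum(weights))
    # Birge ratio: sqrt of chi-squared per degree of freedom; values much
    # larger than 1 indicate scatter beyond the stated uncertainties
    chi2 = sum(w * (x - mean) ** 2 for w, x in zip(weights, values))
    r_b = math.sqrt(chi2 / (len(values) - 1))
    return mean, u_int, r_b

# Toy data: three nearly consistent measurements with equal uncertainties
mean, u_int, r_b = weighted_mean_birge([9.80, 9.82, 9.81], [0.02, 0.02, 0.02])
```

In the Birge-ratio procedure the internal uncertainty is typically inflated by r_b when r_b exceeds 1, whereas a random effects model instead introduces an explicit between-measurement variance component.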

  7. Implementation effect of productive 4-stage field orientation on the student technopreneur skill in vocational schools

    NASA Astrophysics Data System (ADS)

    Ismail, Edy; Samsudi, Widjanarko, Dwi; Joyce, Peter; Stearns, Roman

    2018-03-01

    This model integrates project base learning by creating a product based on environmental needs. The Produktif Orientasi Lapangan 4 Tahap (POL4T) combines technical skills and entrepreneurial elements together in the learning process. This study is to implement the result of technopreneurship learning model development which is environment-oriented by combining technology and entrepreneurship components on Machining Skill Program. This study applies research and development design by optimizing experimental subject. Data were obtained from questionnaires, learning material validation, interpersonal, intrapersonal observation forms, skills, product, teachers and students' responses, and cognitive tasks. Expert validation and t-test calculation are applied to see how effective POL4T learning model. The result of the study is in the form of 4 steps learning model to enhance interpersonal and intrapersonal attitudes, develop practical products which orient to society and appropriate technology so that the products can have high selling value. The model is effective based on the students' post test result, which is better than the pre-test. The product obtained from POL4T model is proven to be better than the productive learning. POL4T model is recommended to be implemented for XI grade students. This is can develop entrepreneurial attitudes that are environment oriented, community needs and technical competencies students.

  8. Simulations in Cyber-Security: A Review of Cognitive Modeling of Network Attackers, Defenders, and Users

    PubMed Central

    Veksler, Vladislav D.; Buchler, Norbou; Hoffman, Blaine E.; Cassenti, Daniel N.; Sample, Char; Sugrim, Shridat

    2018-01-01

    Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group level based on mean tendencies of each subject's subgroup, based on known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting. PMID:29867661

  9. A Practical Guide to Calibration of a GSSHA Hydrologic Model Using ERDC Automated Model Calibration Software - Effective and Efficient Stochastic Global Optimization

    DTIC Science & Technology

    2012-02-01

    parameter estimation method, but rather to carefully describe how to use the ERDC software implementation of MLSL that accommodates the PEST model...model independent LM method based parameter estimation software PEST (Doherty, 2004, 2007a, 2007b), which quantifies model to measurement misfit...et al. (2011) focused on one drawback associated with LM-based model independent parameter estimation as implemented in PEST; viz., that it requires

  10. Fundamental studies on kinetic isotope effect (KIE) of hydrogen isotope fractionation in natural gas systems

    USGS Publications Warehouse

    Ni, Y.; Ma, Q.; Ellis, G.S.; Dai, J.; Katz, B.; Zhang, S.; Tang, Y.

    2011-01-01

    Based on quantum chemistry calculations for normal octane homolytic cracking, a kinetic hydrogen isotope fractionation model for methane, ethane, and propane formation is proposed. The activation energy differences between D-substituted and non-substituted methane, ethane, and propane are 318.6, 281.7, and 280.2 cal/mol, respectively. In order to determine the effect of the entropy contribution for hydrogen isotopic substitution, a transition state for ethane bond rupture was determined based on density functional theory (DFT) calculations. The kinetic isotope effect (KIE) associated with bond rupture in D- and H-substituted ethane results in a frequency factor ratio of 1.07. Based on the proposed mathematical model of hydrogen isotope fractionation, one can potentially quantify natural gas thermal maturity from measured hydrogen isotope values. Calculated gas maturity values determined by the proposed mathematical model using δD values in ethane from several basins in the world are in close agreement with similar predictions based on the δ13C composition of ethane. However, gas maturity values calculated from field data of methane and propane using both hydrogen and carbon kinetic isotopic models do not agree as closely. It is possible that δD values in methane may be affected by microbial mixing and that propane values might be more susceptible to hydrogen exchange with water or to analytical errors. Although the model used in this study is quite preliminary, the results demonstrate that kinetic isotope fractionation effects in hydrogen may be useful in quantitative models of natural gas generation, and that δD values in ethane might be more suitable for modeling than comparable values in methane and propane. © 2011 Elsevier Ltd.
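An Arrhenius-type sketch of how an activation energy difference and a frequency factor ratio combine into a KIE is below; the 200 °C source-rock temperature is an illustrative assumption, not a value from the abstract:

```python
import math

R_CAL = 1.987204  # gas constant in cal/(mol*K)

def kie(delta_ea_cal, freq_factor_ratio=1.0, temp_k=473.15):
    """Kinetic isotope effect k_H/k_D for a bond-rupture reaction whose
    D-substituted activation energy is higher by delta_ea_cal (cal/mol);
    freq_factor_ratio is the A_H/A_D entropy (frequency factor) term."""
    return freq_factor_ratio * math.exp(delta_ea_cal / (R_CAL * temp_k))

# Using the abstract's ethane values (281.7 cal/mol, A-ratio 1.07) at an
# assumed temperature of 200 °C (473.15 K)
kie_ethane = kie(281.7, freq_factor_ratio=1.07)
```

Because the exponential term decays with temperature, the KIE (and hence the D/H fractionation) shrinks as maturity increases, which is what allows δD values to serve as a maturity proxy in the proposed model.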

  11. Modeling of circulating fluidized beds for post-combustion carbon capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, A.; Shadle, L.; Miller, D.

    2011-01-01

    A compartment-based model for a circulating fluidized bed reactor has been developed based on experimental observations of riser hydrodynamics. The model uses a cluster-based approach to describe the two-phase behavior of circulating fluidized beds. Fundamental mass balance equations have been derived to describe the movement of both gas and solids through the system. Additional work is being performed to develop the correlations required to describe the hydrodynamics of the system. Initial testing of the model against experimental data shows promising results and highlights the importance of including end effects within the model.
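The compartment/cluster idea can be caricatured in a few lines of solids mass balancing; the closure below (a fixed cluster fall-back fraction per compartment, with the bottom bed re-entraining everything) is invented for illustration and is not the authors' correlation:

```python
def solids_flux_profile(n_comp, feed_rate, backflow_frac, n_sweeps=200):
    """Steady-state solids flux through a chain of riser compartments.
    Each compartment above the bottom one passes a fraction
    (1 - backflow_frac) of its inflow upward and lets the rest fall back
    (cluster reflux) into the compartment below; the bottom compartment
    re-entrains all solids it receives. A crude stand-in for the
    two-phase cluster behavior described in the abstract."""
    up = [0.0] * n_comp            # upward flux leaving compartment i
    down = [0.0] * (n_comp + 1)    # fall-back flux produced by compartment i
    for _ in range(n_sweeps):      # iterate the coupled balances to convergence
        for i in range(n_comp):
            inflow = (feed_rate if i == 0 else up[i - 1]) + down[i + 1]
            if i == 0:
                up[i] = inflow                         # bottom bed: all re-entrained
            else:
                up[i] = (1.0 - backflow_frac) * inflow
                down[i] = backflow_frac * inflow
    return up

profile = solids_flux_profile(3, 1.0, 0.3)
```

At steady state the top compartment's upward flux equals the feed rate (overall mass conservation), while lower compartments carry the feed plus internally refluxed solids, mimicking the dense-bottom/dilute-top axial profile of a riser.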

  12. A Biomechanical Model for Lung Fibrosis in Proton Beam Therapy

    NASA Astrophysics Data System (ADS)

    King, David J. S.

    The physics of protons makes them well suited to conformal radiotherapy because of the well-known Bragg peak effect. Despite a proton's sharply defined stopping point, uncertainty effects can cause a small amount of dose to spill over into an organ at risk (OAR). Previous models for calculating normal tissue complication probabilities (NTCPs) relied on the equivalent uniform dose (EUD) model, in which the organ was split into 1/3, 2/3, or whole-organ irradiation. However, for volumes smaller than 1/3 of the total organ volume this EUD-based approach is no longer applicable. In this work the case for an experimental-data-based replacement at low volumes is investigated. Lung fibrosis is investigated as an NTCP effect typically arising from dose spillover during irradiation of tumours at the spinal base. Using a 3D geometrical model of the lungs, irradiations are modelled with variable dose-spillover parameters. To calculate NTCPs without the EUD model, experimental data from the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) study are used. Additional side projects are also introduced and explained at various points. A typical radiotherapy course of 30 fractions of 2 Gy is simulated, over a range of target-volume geometries and irradiation types. For X-rays, the majority of the data-point ratios (the ratio of EUD values found from the calculation-based and data-based methods) lay within 20% of unity, showing relatively close agreement. The ratios did not systematically favour one predictive method, and no Vx metric was found to consistently outperform another; agreement was good in some cases and poor in others, consistent with predictions in the literature. The overall results lead to the conclusion that there is no reason to discount the use of the data-based predictive method, particularly as a low-volume replacement predictive method.

  13. Comparative Cost-Effectiveness Analysis of Three Different Automated Medication Systems Implemented in a Danish Hospital Setting.

    PubMed

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2018-02-01

    Automated medication systems have been found to reduce errors in the medication process, but little is known about the cost-effectiveness of such systems. The objective of this study was to perform a model-based indirect cost-effectiveness comparison of three different, real-world automated medication systems compared with current standard practice. The considered automated medication systems were a patient-specific automated medication system (psAMS), a non-patient-specific automated medication system (npsAMS), and a complex automated medication system (cAMS). The economic evaluation used original effect and cost data from prospective, controlled, before-and-after studies of medication systems implemented at a Danish hematological ward and an acute medical unit. Effectiveness was described as the proportion of clinical and procedural error opportunities that were associated with one or more errors. An error was defined as a deviation from the electronic prescription, from standard hospital policy, or from written procedures. The cost assessment was based on 6-month standardization of observed cost data. The model-based comparative cost-effectiveness analyses were conducted with system-specific assumptions of the effect size and costs in scenarios with consumptions of 15,000, 30,000, and 45,000 doses per 6-month period. With 30,000 doses the cost-effectiveness model showed that the cost-effectiveness ratio expressed as the cost per avoided clinical error was €24 for the psAMS, €26 for the npsAMS, and €386 for the cAMS. Comparison of the cost-effectiveness of the three systems in relation to different valuations of an avoided error showed that the psAMS was the most cost-effective system regardless of error type or valuation. The model-based indirect comparison against the conventional practice showed that psAMS and npsAMS were more cost-effective than the cAMS alternative, and that psAMS was more cost-effective than npsAMS.
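The reported cost-effectiveness ratios have the form "cost per avoided error": the system cost divided by the number of errors avoided relative to conventional practice. A sketch with hypothetical numbers (not the study's observed costs or error proportions):

```python
def cost_per_avoided_error(cost, n_opportunities, err_rate_control, err_rate_system):
    """Incremental cost-effectiveness ratio: cost of running the system
    divided by the number of errors it avoids, where errors avoided is
    the drop in error proportion times the number of error opportunities."""
    avoided = n_opportunities * (err_rate_control - err_rate_system)
    if avoided <= 0:
        raise ValueError("system does not avoid any errors")
    return cost / avoided

# Hypothetical inputs: 30,000 dose opportunities per 6 months, error
# proportion falling from 10% to 5%, and a 6-month system cost of 36,000 EUR
ratio = cost_per_avoided_error(36_000, 30_000, 0.10, 0.05)
```

With these made-up inputs the ratio works out to 24 EUR per avoided error; scaling the dose consumption up or down (the study's 15,000/30,000/45,000 scenarios) changes the denominator and hence the ratio.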

  14. Modeling 3D Facial Shape from DNA

    PubMed Central

    Claes, Peter; Liberton, Denise K.; Daniels, Katleen; Rosana, Kerri Matthes; Quillen, Ellen E.; Pearson, Laurel N.; McEvoy, Brian; Bauchet, Marc; Zaidi, Arslan A.; Yao, Wei; Tang, Hua; Barsh, Gregory S.; Absher, Devin M.; Puts, David A.; Rocha, Jorge; Beleza, Sandra; Pereira, Rinaldo W.; Baynam, Gareth; Suetens, Paul; Vandermeulen, Dirk; Wagner, Jennifer K.; Boster, James S.; Shriver, Mark D.

    2014-01-01

    Human facial diversity is substantial, complex, and largely scientifically unexplained. We used spatially dense quasi-landmarks to measure face shape in population samples with mixed West African and European ancestry from three locations (United States, Brazil, and Cape Verde). Using bootstrapped response-based imputation modeling (BRIM), we uncover the relationships between facial variation and the effects of sex, genomic ancestry, and a subset of craniofacial candidate genes. The facial effects of these variables are summarized as response-based imputed predictor (RIP) variables, which are validated using self-reported sex, genomic ancestry, and observer-based facial ratings (femininity and proportional ancestry) and judgments (sex and population group). By jointly modeling sex, genomic ancestry, and genotype, the independent effects of particular alleles on facial features can be uncovered. Results on a set of 20 genes showing significant effects on facial features provide support for this approach as a novel means to identify genes affecting normal-range facial features and for approximating the appearance of a face from genetic markers. PMID:24651127

  15. Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach

    NASA Astrophysics Data System (ADS)

    Ramezani, Zeinab; Orouji, Ali A.

    2018-01-01

    A 2-D analytical drain current model of a novel Dual Metal Gate Tunnel Field Effect Transistor based on a MOSFET (DMG-TFET) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region and hence convert it into a P+ source region. The electric field is derived and used to obtain an expression for the drain current by analytically integrating the band-to-band tunneling generation rate in the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and the electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. The model is validated against the SILVACO ATLAS device simulator, and the analytical results show good agreement with the simulations.
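The drain-current construction described (integrating a band-to-band tunneling generation rate over the tunneling region) can be sketched with a Kane-type rate; the parameter values and the linear field profile below are illustrative assumptions, not the paper's extracted model:

```python
import math

def kane_btbt_rate(e_field, a_kane=4.0e14, b_kane=1.9e7):
    """Kane-type band-to-band tunneling generation rate
    G = A * E^2 * exp(-B / E), with E in V/cm. A and B are
    illustrative silicon-like Kane parameters."""
    return a_kane * e_field ** 2 * math.exp(-b_kane / e_field)

def tunnel_current_density(e_peak, width_cm, n=1000):
    """Crude tunneling current density: q times G integrated (trapezoid
    rule) over a tunneling region with an assumed linear field profile
    falling from e_peak to half its value across width_cm."""
    q = 1.602e-19  # elementary charge, C
    dx = width_cm / n
    fields = [e_peak * (1.0 - 0.5 * i * dx / width_cm) for i in range(n + 1)]
    gs = [kane_btbt_rate(f) for f in fields]
    return q * sum((gs[i] + gs[i + 1]) * 0.5 * dx for i in range(n))

j_demo = tunnel_current_density(2.0e6, 1.0e-6)
```

The exponential factor makes the rate extremely sensitive to the peak field, which is why the tunnel-barrier width set by the potential profile dominates the modeled drain current.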

  16. Experimental Evaluation of the Effects of a Research-Based Preschool Mathematics Curriculum

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie

    2008-01-01

    A randomized-trials design was used to evaluate the effectiveness of a preschool mathematics program based on a comprehensive model of research-based curricula development. Thirty-six preschool classrooms were assigned to experimental (Building Blocks), comparison (a different preschool mathematics curriculum), or control conditions. Children were…

  17. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based MDP verification acceptable.

  18. Modeling of Mixing Behavior in a Combined Blowing Steelmaking Converter with a Filter-Based Euler-Lagrange Model

    NASA Astrophysics Data System (ADS)

    Li, Mingming; Li, Lin; Li, Qiang; Zou, Zongshu

    2018-05-01

    A filter-based Euler-Lagrange multiphase flow model is used to study the mixing behavior in a combined blowing steelmaking converter. The Euler-based volume-of-fluid approach is employed to simulate the top blowing, while the Lagrange-based discrete phase model, which embeds the local volume change of rising bubbles, is used for the bottom blowing. A filter-based turbulence method based on the local meshing resolution is proposed, aiming to improve the modeling of turbulent eddy viscosities. The model validity is verified through comparison with physical experiments in terms of mixing curves and mixing times. The effects of the bottom gas flow rate on bath flow and mixing behavior are investigated, and the inherent reasons for the mixing result are clarified in terms of the characteristics of bottom-blowing plumes, the interaction between plumes and top-blowing jets, and the change of bath flow structure.

  19. Modeling respiratory mechanics in the MCAT and spline-based MCAT phantoms

    NASA Astrophysics Data System (ADS)

    Segars, W. P.; Lalush, D. S.; Tsui, B. M. W.

    2001-02-01

    Respiratory motion can cause artifacts in myocardial SPECT and computed tomography (CT). The authors incorporate models of respiratory mechanics into the current 4D MCAT and into the next generation spline-based MCAT phantoms. In order to simulate respiratory motion in the current MCAT phantom, the geometric solids for the diaphragm, heart, ribs, and lungs were altered through manipulation of parameters defining them. Affine transformations were applied to the control points defining the same respiratory structures in the spline-based MCAT phantom to simulate respiratory motion. The Non-Uniform Rational B-Spline (NURBS) surfaces for the lungs and body outline were constructed in such a way as to be linked to the surrounding ribs. Expansion and contraction of the thoracic cage then coincided with expansion and contraction of the lungs and body. The changes both phantoms underwent were spline-interpolated over time to create time continuous 4D respiratory models. The authors then used the geometry-based and spline-based MCAT phantoms in an initial simulation study of the effects of respiratory motion on myocardial SPECT. The simulated reconstructed images demonstrated distinct artifacts in the inferior region of the myocardium. It is concluded that both respiratory models can be effective tools for researching effects of respiratory motion.

  20. Free-space optical channel simulator for weak-turbulence conditions.

    PubMed

    Bykhovsky, Dima

    2015-11-01

    Free-space optical (FSO) communication may be severely influenced by the inevitable turbulence effect that results in channel gain fluctuations and fading. The objective of this paper is to provide a simple and effective simulator of the weak-turbulence FSO channel that emulates the influence of the temporal covariance effect. Specifically, the proposed model is based on lognormal distributed samples with a corresponding correlation time. The simulator is based on the solution of the first-order stochastic differential equation (SDE). The results of the provided SDE analysis reveal its efficacy for turbulent channel modeling.
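A first-order SDE whose stationary solution is lognormal with a prescribed correlation time can be discretized exactly as an AR(1) recursion on the log-gain. The following sketch assumes a zero-mean log-gain and illustrative parameters, not the paper's channel values:

```python
import math
import random

def lognormal_channel(n, dt, tau, sigma_x, seed=0):
    """Temporally correlated lognormal channel gains for a weak-turbulence
    FSO link. The log-gain X follows an Ornstein-Uhlenbeck SDE
        dX = -(X / tau) dt + sqrt(2 * sigma_x**2 / tau) dW,
    discretized exactly: X is stationary Gaussian with variance sigma_x^2
    and correlation time tau, so the gain h = exp(X) is lognormal."""
    rng = random.Random(seed)
    rho = math.exp(-dt / tau)            # one-step autocorrelation
    x = rng.gauss(0.0, sigma_x)          # start in the stationary distribution
    gains = []
    for _ in range(n):
        gains.append(math.exp(x))
        x = rho * x + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, sigma_x)
    return gains

gains = lognormal_channel(20000, dt=1e-3, tau=1e-2, sigma_x=0.3, seed=1)
```

The exact discretization avoids the step-size bias of an Euler-Maruyama scheme, so the samples honor both the lognormal marginal and the target correlation time regardless of dt.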

  1. Specific heat capacity of molten salt-based alumina nanofluid.

    PubMed

    Lu, Ming-Chang; Huang, Chien-Hsun

    2013-06-21

    There is no consensus on the effect of nanoparticle (NP) addition on the specific heat capacity (SHC) of fluids. In addition, the predictions from the existing model have a large discrepancy from the measured SHCs in nanofluids. We show that the SHC of the molten salt-based alumina nanofluid decreases with reducing particle size and increasing particle concentration. The NP size-dependent SHC results from an augmentation of the nanolayer effect as particle size is reduced. A model considering the nanolayer effect was proposed and supports the experimental results.

  2. Specific heat capacity of molten salt-based alumina nanofluid

    PubMed Central

    2013-01-01

    There is no consensus on the effect of nanoparticle (NP) addition on the specific heat capacity (SHC) of fluids. In addition, the predictions from the existing model have a large discrepancy from the measured SHCs in nanofluids. We show that the SHC of the molten salt-based alumina nanofluid decreases with reducing particle size and increasing particle concentration. The NP size-dependent SHC results from an augmentation of the nanolayer effect as particle size is reduced. A model considering the nanolayer effect was proposed and supports the experimental results. PMID:23800321

  3. Partial Deconvolution with Inaccurate Blur Kernel.

    PubMed

    Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei

    2017-10-17

    Most non-blind deconvolution methods are developed under the error-free kernel assumption, and are not robust to inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of the estimated blur kernel. Partial deconvolution is then applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternately. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.

  4. Comparison of NGA-West2 directivity models

    USGS Publications Warehouse

    Spudich, Paul A.; Rowshandel, Badie; Shahi, Shrey; Baker, Jack W.; Chiou, Brian S-J

    2014-01-01

    Five directivity models have been developed based on data from the NGA-West2 database and based on numerical simulations of large strike-slip and reverse-slip earthquakes. All models avoid the use of normalized rupture dimension, enabling them to scale up to the largest earthquakes in a physically reasonable way. Four of the five models are explicitly “narrow-band” (in which the effect of directivity is maximum at a specific period that is a function of earthquake magnitude). Several strategies for determining the zero-level for directivity have been developed. We show comparisons of maps of the directivity amplification. This comparison suggests that the predicted geographic distributions of directivity amplification are dominated by effects of the models' assumptions, and more than one model should be used for ruptures dipping less than about 65 degrees.

  5. Surface potential based modeling of charge, current, and capacitances in DGTFET including mobile channel charge and ambipolar behaviour

    NASA Astrophysics Data System (ADS)

    Jain, Prateek; Yadav, Chandan; Agarwal, Amit; Chauhan, Yogesh Singh

    2017-08-01

    We present a surface-potential-based analytical model for the current, terminal charges, and terminal capacitances of a double gate tunnel field effect transistor (DGTFET). The model accounts for the effect of the mobile charge in the channel and captures the device physics in the depletion as well as the strong inversion regime. The narrowing of the tunnel barrier in the presence of mobile charges in the channel is incorporated via modeling of the inverse decay length, which is constant under the channel depletion condition and bias-dependent under the inversion condition. To capture the ambipolar current behavior, tunneling at the drain junction is also included in the model. The proposed model is validated against TCAD simulation data and shows a close match.

  6. Effectiveness of Discovery Learning-Based Transformation Geometry Module

    NASA Astrophysics Data System (ADS)

    Febriana, R.; Haryono, Y.; Yusri, R.

    2017-09-01

    Development of the transformation geometry module was conducted because students had difficulty understanding the existing book. The purpose of the research was to determine the effectiveness of a discovery-learning-based transformation geometry module with respect to student activity. The development followed the Plomp model, consisting of preliminary research, a prototyping phase, and an assessment phase. The research focused on the assessment phase, in which the effectiveness of the designed product was observed. The instrument was an observation sheet. The observed activities were visual activities, oral activities, listening activities, mental activities, emotional activities and motor activities. Based on the results of the research, student activity in the visual, learning, and writing activities met the "very effective" criteria. It can be concluded that use of the discovery-learning-based transformation geometry module can increase positive student activity and decrease negative activity.

  7. Measured and Modeled Toxicokinetics in Cultured Fish Cells and Application to In Vitro - In Vivo Toxicity Extrapolation

    PubMed Central

    Stadnicka-Michalak, Julita; Tanneberger, Katrin; Schirmer, Kristin; Ashauer, Roman

    2014-01-01

    Effect concentrations in the toxicity assessment of chemicals with fish and fish cells are generally based on external exposure concentrations. External concentrations as dose metrics may, however, hamper interpretation and extrapolation of toxicological effects because it is the internal concentration that gives rise to the biologically effective dose. Thus, we need to understand the relationship between the external and internal concentrations of chemicals. The objectives of this study were to: (i) elucidate the time-course of the concentration of chemicals with a wide range of physicochemical properties in the compartments of an in vitro test system, (ii) derive a predictive model for toxicokinetics in the in vitro test system, (iii) test the hypothesis that internal effect concentrations in fish (in vivo) and fish cell lines (in vitro) correlate, and (iv) develop a quantitative in vitro to in vivo toxicity extrapolation method for fish acute toxicity. To achieve these goals, time-dependent amounts of organic chemicals were measured in medium, cells (RTgill-W1), and the plastic of the exposure wells. Then, the relation between the uptake and elimination rate constants and log KOW was investigated for cells in order to develop a toxicokinetic model. This model was used to predict internal effect concentrations in cells, which were compared with internal effect concentrations in fish gills predicted by a physiologically based toxicokinetic model. Our model could predict concentrations of non-volatile organic chemicals with log KOW between 0.5 and 7 in cells. The correlation of the log ratio of internal effect concentrations in fish gills and the fish gill cell line with the log KOW was significant (r>0.85, p = 0.0008, F-test). This ratio can be predicted from the log KOW of the chemical (77% of variance explained), making this a promising model for predicting lethal effects on fish based on in vitro data. PMID:24647349
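A common minimal form of such a toxicokinetic model is a one-compartment uptake/elimination balance, in which the internal concentration approaches an external concentration scaled by the ratio of the rate constants. The rate constants below are placeholders, not the fitted values from the study:

```python
import math

def internal_conc(c_ext, k_u, k_e, t):
    """One-compartment toxicokinetics: internal concentration in cells
    exposed to a constant external concentration c_ext, with uptake rate
    constant k_u and elimination rate constant k_e:
        C_int(t) = c_ext * (k_u / k_e) * (1 - exp(-k_e * t)),
    so the steady-state bioconcentration factor is k_u / k_e."""
    return c_ext * (k_u / k_e) * (1.0 - math.exp(-k_e * t))

# Placeholder constants: k_u = 10 /h, k_e = 0.5 /h, external conc. 1 unit;
# the internal concentration then plateaus at 20 units
c_steady = internal_conc(1.0, 10.0, 0.5, 100.0)
```

In the study's setting, both rate constants were related to log KOW, which is what lets a measured external effect concentration be converted into the internal dose metric used for the in vitro to in vivo comparison.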

  8. Gain-clamped semiconductor optical amplifiers based on compensating light: Theoretical model and performance analysis

    NASA Astrophysics Data System (ADS)

    Jia, Xin-Hong; Wu, Zheng-Mao; Xia, Guang-Qiong

    2006-12-01

    It is well known that gain-clamped semiconductor optical amplifiers (GC-SOAs) based on the lasing effect are subject to transmission-rate restrictions because of relaxation oscillation. A GC-SOA based on the compensating effect between the signal light and amplified spontaneous emission, realized by combining an SOA and a fiber Bragg grating (FBG), can be used to overcome this problem. In this paper, a theoretical model of the GC-SOA based on compensating light is constructed. Numerical simulations demonstrate that good gain and noise-figure characteristics can be realized by appropriate selection of the FBG insertion position, the peak reflectivity of the FBG, and the biasing current of the GC-SOA.

  9. Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry

    EPA Science Inventory

    Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data. Although HCS assays may capture quantitative readouts at ...

  10. [Modeling and analysis of volume conduction based on field-circuit coupling].

    PubMed

    Tang, Zhide; Liu, Hailong; Xie, Xiaohui; Chen, Xiufa; Hou, Deming

    2012-08-01

    Numerical simulations of volume conduction can be used to analyze the process of energy transfer and to explore the effects of physical factors on energy transfer efficiency. We analyzed the 3D quasi-static electric field by the finite element method and developed a 3D coupled field-circuit model of volume conduction based on the coupling between the circuit and the electric field. The model includes a circuit simulation of the volume conduction, providing direct theoretical guidance for optimizing energy transfer. A field-circuit coupling model with circular cylinder electrodes was established using the software FEM3.5. On this basis, the effects of electrode cross-section area, electrode distance and circuit parameters on the performance of the volume conduction system were obtained, which provides a basis for the optimized design of energy transfer efficiency.

  11. The Power of Passion in Entrepreneurship Education: Entrepreneurial Role Models Encourage Passion?

    PubMed Central

    Fellnhofer, Katharina

    2018-01-01

    This study of Entrepreneurship Education (EE) centers on the impact of entrepreneurial role models on entrepreneurial passion, which also is expected to influence entrepreneurial intention. Based on 426 individuals recruited primarily from Austria, Finland, and Greece, Structural Equation Modeling (SEM) reveals the significant direct and indirect effects of entrepreneurial role models on entrepreneurial intention, mediated by entrepreneurial passion. These effects were found to be stronger following multimedia presentation of entrepreneurial stories, confirming the fruitful spillover effects of the innovative educational use of computers on entrepreneurial intentions among nascent entrepreneurs. Drawing on the theory of planned behavior (TPB) and social learning theory, this study confirms both the positive impact of entrepreneurial role models and significant short-term effects of web-based multimedia in the context of EE. This narrative approach is shown to be an effective pedagogical instrument in enhancing individual orientation toward entrepreneurship to facilitate entrepreneurial intention. This study identifies the great potential of these pioneering methods and tools, both for further research in the academic community and for entrepreneurship educators who hope to promote entrepreneurial intention in aspiring entrepreneurs. The findings are also relevant for policy makers designing effective instruments to achieve long-term goals. PMID:29877516

  12. The Power of Passion in Entrepreneurship Education: Entrepreneurial Role Models Encourage Passion?

    PubMed

    Fellnhofer, Katharina

    2017-07-01

    This study of Entrepreneurship Education (EE) centers on the impact of entrepreneurial role models on entrepreneurial passion, which also is expected to influence entrepreneurial intention. Based on 426 individuals recruited primarily from Austria, Finland, and Greece, Structural Equation Modeling (SEM) reveals the significant direct and indirect effects of entrepreneurial role models on entrepreneurial intention, mediated by entrepreneurial passion. These effects were found to be stronger following multimedia presentation of entrepreneurial stories, confirming the fruitful spillover effects of the innovative educational use of computers on entrepreneurial intentions among nascent entrepreneurs. Drawing on the theory of planned behavior (TPB) and social learning theory, this study confirms both the positive impact of entrepreneurial role models and significant short-term effects of web-based multimedia in the context of EE. This narrative approach is shown to be an effective pedagogical instrument in enhancing individual orientation toward entrepreneurship to facilitate entrepreneurial intention. This study identifies the great potential of these pioneering methods and tools, both for further research in the academic community and for entrepreneurship educators who hope to promote entrepreneurial intention in aspiring entrepreneurs. The findings are also relevant for policy makers designing effective instruments to achieve long-term goals.

  13. The intermediate endpoint effect in logistic and probit regression

    PubMed Central

    MacKinnon, DP; Lockwood, CM; Brown, CH; Wang, W; Hoffman, JM

    2010-01-01

    Background An intermediate endpoint is hypothesized to be in the middle of the causal sequence relating an independent variable to a dependent variable. The intermediate variable is also called a surrogate or mediating variable and the corresponding effect is called the mediated, surrogate endpoint, or intermediate endpoint effect. Clinical studies are often designed to change an intermediate or surrogate endpoint and through this intermediate change influence the ultimate endpoint. In many intermediate endpoint clinical studies the dependent variable is binary, and logistic or probit regression is used. Purpose The purpose of this study is to describe a limitation of a widely used approach to assessing intermediate endpoint effects and to propose an alternative method, based on products of coefficients, that yields more accurate results. Methods The intermediate endpoint model for a binary outcome is described for a true binary outcome and for a dichotomization of a latent continuous outcome. Plots of true values and a simulation study are used to evaluate the different methods. Results Distorted estimates of the intermediate endpoint effect and incorrect conclusions can result from the application of widely used methods to assess the intermediate endpoint effect. The same problem occurs for the proportion of an effect explained by an intermediate endpoint, which has been suggested as a useful measure for identifying intermediate endpoints. A solution to this problem is given based on the relationship between latent variable modeling and logistic or probit regression. Limitations More complicated intermediate variable models are not addressed in the study, although the methods described in the article can be extended to these more complicated models. Conclusions Researchers are encouraged to use an intermediate endpoint method based on the product of regression coefficients. 
The common approach, based on the difference-in-coefficients method, can lead to distorted conclusions regarding the intermediate endpoint effect. PMID:17942466
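
    The scale mismatch behind the difference-in-coefficients problem can be illustrated with a small simulation. This is a sketch, not the authors' code: the data-generating coefficients (0.5 for the a-path, 0.7 and 0.3 for the outcome model) are illustrative, and logistic regression is implemented directly with Newton-Raphson to keep the example self-contained. Because the scale of a logistic coefficient changes when a covariate is dropped, the difference c - c' need not match the product a*b even in large samples:

```python
import numpy as np

# Simulated mediation data: X -> M -> Y with a binary ultimate endpoint.
# All data-generating coefficients are illustrative, not from the study.
rng = np.random.default_rng(0)
n = 50_000
x = rng.normal(size=n)                       # independent variable
m = 0.5 * x + rng.normal(size=n)             # intermediate (surrogate) endpoint
p = 1 / (1 + np.exp(-(0.7 * m + 0.3 * x)))   # true logistic model for Y
y = (rng.random(n) < p).astype(float)

def logit_fit(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        q = 1 / (1 + np.exp(-X @ beta))
        W = q * (1 - q)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - q))
    return beta

ones = np.ones(n)
a = np.linalg.lstsq(np.column_stack([ones, x]), m, rcond=None)[0][1]  # X -> M
full = logit_fit(np.column_stack([ones, m, x]), y)
b, c_prime = full[1], full[2]     # M -> Y adjusted for X, and direct effect
c = logit_fit(np.column_stack([ones, x]), y)[1]  # total effect of X alone

product = a * b            # product-of-coefficients estimate (recommended)
difference = c - c_prime   # difference-in-coefficients estimate
```

    On data like this the product estimate recovers the true indirect effect (0.5 x 0.7 = 0.35), while the difference estimate tends to be attenuated because the marginal model's coefficients are on a different scale.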

  14. Boosting structured additive quantile regression for longitudinal childhood obesity data.

    PubMed

    Fenske, Nora; Fahrmeir, Ludwig; Hothorn, Torsten; Rzehak, Peter; Höhle, Michael

    2013-07-25

    Childhood obesity and the investigation of its risk factors have become an important public health issue. Our work is based on and motivated by a German longitudinal study including 2,226 children with up to ten measurements of their body mass index (BMI) and risk factors from birth to the age of 10 years. We introduce boosting of structured additive quantile regression as a novel distribution-free approach for longitudinal quantile regression. The quantile-specific predictors of our model include conventional linear population effects, smooth nonlinear functional effects, varying-coefficient terms, and individual-specific effects, such as intercepts and slopes. Estimation is based on boosting, a computer-intensive inference method for highly complex models. We propose a component-wise functional gradient descent boosting algorithm that allows for penalized estimation of the large variety of different effects, in particular leading to individual-specific effects shrunken toward zero. This concept allows us to flexibly estimate the nonlinear age curves of upper quantiles of the BMI distribution, both at the population level and at the individual-specific level, adjusted for further risk factors, and to detect age-varying effects of categorical risk factors. Our model approach can be regarded as the quantile regression analog of Gaussian additive mixed models (or structured additive mean regression models), and we compare both model classes with respect to our obesity data.
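
    The component-wise functional gradient descent idea can be sketched for a single linear base-learner class. This is a simplified illustration (linear effects only, with no smooth terms, varying coefficients, or individual-specific effects), not the authors' algorithm, and the tuning values are arbitrary:

```python
import numpy as np

def boost_quantile(X, y, tau=0.9, steps=300, nu=0.1):
    """Component-wise functional gradient boosting for the tau-th quantile.

    Each iteration fits every single covariate by least squares to the
    negative gradient of the check (pinball) loss and updates only the
    best-fitting component, shrunk by the step length nu."""
    n, p = X.shape
    beta = np.zeros(p)
    offset = np.quantile(y, tau)               # unconditional quantile offset
    for _ in range(steps):
        fit = offset + X @ beta
        u = np.where(y > fit, tau, tau - 1.0)  # negative gradient of check loss
        best_j, best_err, best_b = 0, np.inf, 0.0
        for j in range(p):
            xj = X[:, j]
            b = (xj @ u) / (xj @ xj)           # linear base learner
            err = np.sum((u - b * xj) ** 2)
            if err < best_err:
                best_j, best_err, best_b = j, err, b
        beta[best_j] += nu * best_b            # update best component only
    return offset, beta

# Toy check: only the first of five covariates carries signal.
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] + rng.normal(size=500)
offset, beta = boost_quantile(X, y)
```

    Because only the best-fitting component is updated per step, the noise covariates receive little or no weight, which is the built-in variable selection that also drives the shrinkage of individual-specific effects in the full model.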

  15. Prioritizing Conservation of Ungulate Calving Resources in Multiple-Use Landscapes

    PubMed Central

    Dzialak, Matthew R.; Harju, Seth M.; Osborn, Robert G.; Wondzell, John J.; Hayden-Wing, Larry D.; Winstead, Jeffrey B.; Webb, Stephen L.

    2011-01-01

    Background Conserving animal populations in places where human activity is increasing is an ongoing challenge in many parts of the world. We investigated how human activity interacted with maternal status and individual variation in behavior to affect reliability of spatially-explicit models intended to guide conservation of critical ungulate calving resources. We studied Rocky Mountain elk (Cervus elaphus) that occupy a region where 2900 natural gas wells have been drilled. Methodology/Principal Findings We present novel applications of generalized additive modeling to predict maternal status based on movement, and of random-effects resource selection models to provide population and individual-based inference on the effects of maternal status and human activity. We used a 2×2 factorial design (treatment vs. control) that included elk that were either parturient or non-parturient and in areas either with or without industrial development. Generalized additive models predicted maternal status (parturiency) correctly 93% of the time based on movement. Human activity played a larger role than maternal status in shaping resource use; elk showed strong spatiotemporal patterns of selection or avoidance and marked individual variation in developed areas, but no such pattern in undeveloped areas. This difference had direct consequences for landscape-level conservation planning. When relative probability of use was calculated across the study area, there was disparity throughout 72–88% of the landscape in terms of where conservation intervention should be prioritized depending on whether models were based on behavior in developed areas or undeveloped areas. Model validation showed that models based on behavior in developed areas had poor predictive accuracy, whereas the model based on behavior in undeveloped areas had high predictive accuracy. 
Conclusions/Significance By directly testing for differences between developed and undeveloped areas, and by modeling resource selection in a random-effects framework that provided individual-based inference, we conclude that: 1) amplified selection or avoidance behavior and individual variation, as responses to increasing human activity, complicate conservation planning in multiple-use landscapes, and 2) resource selection behavior in places where human activity is predictable or less dynamic may provide a more reliable basis from which to prioritize conservation action. PMID:21297866

  16. A Meta-Analysis of Video-Modeling Based Interventions for Reduction of Challenging Behaviors for Students with EBD

    ERIC Educational Resources Information Center

    Losinski, Mickey; Wiseman, Nicole; White, Sherry A.; Balluch, Felicity

    2016-01-01

    The current study examined the use of video modeling (VM)-based interventions to reduce the challenging behaviors of students with emotional or behavioral disorders. Each study was evaluated using Council for Exceptional Children's (CEC's) quality indicators for evidence-based practices. In addition, study effects were calculated along the three…

  17. Avalanches, loading and finite size effects in 2D amorphous plasticity: results from a finite element model

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Budrikis, Zoe; Zapperi, Stefano; Fernandez Castellanos, David

    2015-02-01

    Crystalline plasticity is strongly interlinked with dislocation mechanics and is nowadays relatively well understood. Concepts and physical models of plastic deformation in amorphous materials, on the other hand, where the concept of linear lattice defects is not applicable, still lag behind. We introduce an eigenstrain-based finite element lattice model for simulations of shear band formation and strain avalanches. Our model allows us to study the influence of surfaces and finite size effects on the statistics of avalanches. We find that even with relatively complex loading conditions and open boundary conditions, the critical exponents describing the avalanche statistics are unchanged, which validates the use of simpler scalar lattice-based models to study these phenomena.
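
    The "simpler scalar lattice-based models" that the authors validate can be illustrated with a minimal automaton. The thresholds, redistribution fraction, and periodic boundaries below are illustrative assumptions, not the eigenstrain-based finite element model of the paper:

```python
import numpy as np

def avalanche_sizes(L=32, n_events=200, seed=1):
    """Minimal 2D scalar lattice model of amorphous plasticity.

    A site yields when its local stress exceeds a random threshold; it then
    relaxes, passing a fraction of its stress to its 4 neighbours (periodic
    boundaries, 20% dissipated). Loading is quasi-static: all stresses are
    raised just enough to trigger the weakest site, and avalanche size is
    the number of yield events before the lattice becomes stable again."""
    rng = np.random.default_rng(seed)
    stress = np.zeros((L, L))
    thresh = rng.uniform(0.5, 1.5, size=(L, L))
    sizes = []
    for _ in range(n_events):
        stress += (thresh - stress).min()      # drive to the weakest site
        size = 0
        while True:
            over = stress >= thresh
            if not over.any():
                break
            size += int(over.sum())
            release = np.where(over, stress, 0.0)
            stress[over] = 0.0                 # yielding sites relax
            for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                stress += 0.2 * np.roll(release, shift, axis=(0, 1))
            thresh[over] = rng.uniform(0.5, 1.5, size=int(over.sum()))
        sizes.append(size)
    return np.array(sizes)
```

    Collecting many such avalanche sizes and fitting their distribution is how critical exponents like those discussed in the abstract are typically estimated.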

  18. The alarming problems of confounding equivalence using logistic regression models in the perspective of causal diagrams.

    PubMed

    Yu, Yuanyuan; Li, Hongkai; Sun, Xiaoru; Su, Ping; Wang, Tingting; Liu, Yi; Yuan, Zhongshang; Liu, Yanxun; Xue, Fuzhong

    2017-12-28

    Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders using a logistic regression model is the habitual method, though it has some problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and to seek alternative methods. Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential and then to select the optimum adjusting strategy, comparing the logistic regression model with the inverse probability weighting based marginal structural model (IPW-based-MSM). The "do-calculus" was used to calculate the true causal effect of exposure on outcome; the bias and standard error were then used to evaluate the performance of the different strategies. Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the bias to the same level as adjusting for the set containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of the exposure was not equivalent to them. In addition, all causal effect estimates obtained through logistic regression were biased, although the estimate after adjusting for the parent nodes of the exposure was nearest to the true causal effect. By contrast, conditioning on different confounding equivalence sets had the same bias-reducing potential under IPW-based-MSM.
Compared with logistic regression, IPW-based-MSM could obtain an unbiased causal effect estimate when the adjusted confounders satisfied G-admissibility, and the optimal strategy was to adjust for the parent nodes of the outcome, which yielded the highest precision. All adjustment strategies based on logistic regression were biased for causal effect estimation, whereas IPW-based-MSM always obtained an unbiased estimate when the adjusted set satisfied G-admissibility. Thus, IPW-based-MSM is recommended for adjusting for confounder sets.
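
    The contrast between a naive comparison and an IPW-based marginal structural model can be sketched on simulated data. The data-generating model and its coefficients are illustrative assumptions, and the propensity model is fit with a hand-rolled Newton-Raphson logistic regression to keep the sketch self-contained:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
c = rng.normal(size=n)                                          # confounder
x = (rng.random(n) < 1 / (1 + np.exp(-c))).astype(float)        # exposure
y = (rng.random(n) < 1 / (1 + np.exp(-(x + c)))).astype(float)  # outcome

# Naive (confounded) risk difference ignores C entirely.
naive = y[x == 1].mean() - y[x == 0].mean()

# Propensity score P(X=1|C) via Newton-Raphson logistic regression.
D = np.column_stack([np.ones(n), c])
beta = np.zeros(2)
for _ in range(25):
    q = 1 / (1 + np.exp(-D @ beta))
    beta += np.linalg.solve(D.T @ ((q * (1 - q))[:, None] * D), D.T @ (x - q))
ps = 1 / (1 + np.exp(-D @ beta))

# Stabilized inverse probability weights; the weighted contrast estimates
# the marginal structural model's causal risk difference.
w = np.where(x == 1, x.mean() / ps, (1 - x.mean()) / (1 - ps))
ipw = (np.average(y[x == 1], weights=w[x == 1])
       - np.average(y[x == 0], weights=w[x == 0]))
```

    Under this data-generating model the naive contrast is inflated by the confounder, while the IPW-weighted contrast lies near the true marginal risk difference of roughly 0.2.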

  19. A Comparative Study of the Effects of the Neurocognitive-Based Model and the Conventional Model on Learner Attention, Working Memory and Mood

    ERIC Educational Resources Information Center

    Srikoon, Sanit; Bunterm, Tassanee; Nethanomsak, Teerachai; Ngang, Tang Keow

    2017-01-01

    Purpose: The attention, working memory, and mood of learners are the most important abilities in the learning process. This study was concerned with the comparison of contextualized attention, working memory, and mood through a neurocognitive-based model (5P) and a conventional model (5E). It sought to examine the significant change in attention,…

  20. An efficient current-based logic cell model for crosstalk delay analysis

    NASA Astrophysics Data System (ADS)

    Nazarian, Shahin; Das, Debasish

    2013-04-01

    Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to the nonlinear behaviour of CMOS cells with respect to the voltage signals at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta-delay analysis in CMOS VLSI circuits. Existing current source models are expensive and require a new set of Spice-based characterisations, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from typical cell libraries such as NLDM, with accuracy much higher than that of NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.
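
    The kind of NLDM table lookup that such models build on can be illustrated with a bilinear interpolation sketch. The table values, slew points, and load points below are hypothetical, not from any real cell library:

```python
import numpy as np

def nldm_delay(slews, loads, table, slew, load):
    """Bilinear interpolation in an NLDM-style 2D delay table,
    indexed by input slew (rows) and output load (columns)."""
    i = np.clip(np.searchsorted(slews, slew) - 1, 0, len(slews) - 2)
    j = np.clip(np.searchsorted(loads, load) - 1, 0, len(loads) - 2)
    ts = (slew - slews[i]) / (slews[i + 1] - slews[i])
    tl = (load - loads[j]) / (loads[j + 1] - loads[j])
    return ((1 - ts) * (1 - tl) * table[i, j]
            + ts * (1 - tl) * table[i + 1, j]
            + (1 - ts) * tl * table[i, j + 1]
            + ts * tl * table[i + 1, j + 1])

# Hypothetical 2x2 characterisation: delay (ns) vs slew (ns) and load (pF).
slews = np.array([0.1, 0.5])
loads = np.array([0.01, 0.04])
table = np.array([[0.10, 0.25],
                  [0.18, 0.40]])
```

    At a table grid point the interpolation returns the characterised value exactly; between grid points it blends the four surrounding entries, which is the piecewise-linear behaviour that current-based models like Imodel aim to improve on.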
