Sample records for model base case

  1. Model Documentation of Base Case Data | Regional Energy Deployment System

    Science.gov Websites

Model | Energy Analysis | NREL. Documentation of the base case data of the model. The base case was developed simply as a point of departure for other analyses; it derives many of its inputs from the Energy Information Administration's (EIA's) Annual Energy Outlook.

  2. Model-Based Assurance Case+ (MBAC+): Tutorial on Modeling Radiation Hardness Assurance Activities

    NASA Technical Reports Server (NTRS)

    Austin, Rebekah; Label, Ken A.; Sampson, Mike J.; Evans, John; Witulski, Art; Sierawski, Brian; Karsai, Gabor; Mahadevan, Nag; Schrimpf, Ron; Reed, Robert A.

    2017-01-01

This presentation will cover why modeling is useful for radiation hardness assurance cases, and also provide information on Model-Based Assurance Case+ (MBAC+), NASA's Reliability Maintainability Template, and Fault Propagation Modeling.

  3. A fuzzy case based reasoning tool for model based approach to rocket engine health monitoring

    NASA Technical Reports Server (NTRS)

    Krovvidy, Srinivas; Nolan, Adam; Hu, Yong-Lin; Wee, William G.

    1992-01-01

In this system we develop a fuzzy case-based reasoner that can build a case representation for several past anomalies detected, and we develop case retrieval methods that use fuzzy sets to index a relevant case when a new problem (case) is presented. The choice of fuzzy sets is justified by the uncertainty in the data. The new problem can be solved using knowledge of the model along with the old cases. The system can then generalize the knowledge from previous cases and use this generalization to refine the existing model definition, which in turn can help to detect failures using the model-based algorithms.
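The retrieval idea can be sketched as follows (a hypothetical toy, not the authors' system: the case names, tolerances, and the triangular membership function are illustrative assumptions):

```python
# Hypothetical sketch of fuzzy case retrieval: each past anomaly is a case
# with numeric sensor features; similarity to a new problem is the minimum,
# over features, of a triangular fuzzy membership of the feature difference.

def triangular_membership(diff, tolerance):
    """Degree (0..1) to which a feature difference counts as 'close'."""
    return max(0.0, 1.0 - abs(diff) / tolerance)

def retrieve_case(case_base, new_problem, tolerances):
    """Return the stored case most similar to the new problem."""
    def similarity(case):
        return min(
            triangular_membership(c - p, t)
            for c, p, t in zip(case["features"], new_problem, tolerances)
        )
    return max(case_base, key=similarity)

case_base = [
    {"name": "fuel leak",       "features": [0.9, 0.2, 0.1]},
    {"name": "sensor drift",    "features": [0.1, 0.8, 0.3]},
    {"name": "pump cavitation", "features": [0.4, 0.4, 0.9]},
]
best = retrieve_case(case_base, [0.85, 0.25, 0.15], tolerances=[0.5, 0.5, 0.5])
# → the "fuel leak" case, whose features are uniformly closest
```

The min-over-features combination is one conventional fuzzy conjunction; a real system would tune the membership functions to the sensor noise.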

  4. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. An evidence-based approach to case management model selection for an acute care facility: is there really a preferred model?

    PubMed

    Terra, Sandra M

    2007-01-01

This research seeks to determine whether there is adequate evidence-based justification for selecting one acute care case management model over another. Setting: acute inpatient hospital. This article presents a systematic review of published case management literature, classifying each study by level of evidence, in an effort to select an acute care case management model. Although no single case management model can be identified as preferred, this systematic review demonstrates that adequate evidence-based literature exists to acknowledge key factors driving the acute care model and to form a foundation for the efficacy of hospital case management practice. Distinctive aspects of case management frameworks can be used to guide the development of an acute care case management model. The study illustrates:
* The effectiveness of case management when there is direct patient contact by the case manager, regardless of disease condition: not only does the quality of care increase, but length of stay (LOS) decreases, care is defragmented, and both patient and physician satisfaction can increase.
* The preferred case management models result in measurable outcomes that directly relate to, and demonstrate alignment with, organizational strategy.
* Acute care management programs reduce cost and LOS and improve outcomes.
* An integrated case management program that includes social workers as well as nursing is the most effective acute care management model.
* The successful case management model recognizes physicians, as well as patients, as valued customers with whom partnership can positively affect financial outcomes in terms of reduced LOS, improved quality, and delivery of care.

  6. Base stock system for patient vs impatient customers with varying demand distribution

    NASA Astrophysics Data System (ADS)

    Fathima, Dowlath; Uduman, P. Sheik

    2013-09-01

An optimal base-stock inventory policy for patient and impatient customers is examined using finite-horizon models. The base-stock system for patient and impatient customers is a distinct type of inventory policy. In Model I, the base stock for the patient-customer case is evaluated using the truncated exponential distribution. Model II studies base-stock inventory policies for impatient customers. A study of these systems reveals that customers either wait until the arrival of the next order or leave the system, which leads to lost sales. In both models, demand during the period [0, t] is taken to be a random variable. In this paper, the truncated exponential distribution satisfies the base-stock policy for the patient customer as a continuous model. The base stock for impatient customers has so far led to a discrete case, but here we model this condition as a continuous case. We justify this approach mathematically and numerically.

  7. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    ERIC Educational Resources Information Center

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  8. Case-based explanation of non-case-based learning methods.

    PubMed Central

    Caruana, R.; Kangarloo, H.; Dionisio, J. D.; Sinha, U.; Johnson, D.

    1999-01-01

    We show how to generate case-based explanations for non-case-based learning methods such as artificial neural nets or decision trees. The method uses the trained model (e.g., the neural net or the decision tree) as a distance metric to determine which cases in the training set are most similar to the case that needs to be explained. This approach is well suited to medical domains, where it is important to understand predictions made by complex machine learning models, and where training and clinical practice makes users adept at case interpretation. PMID:10566351
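The abstract's core trick can be sketched for the decision-tree case (an assumption: the paper also applies the idea to neural nets; leaf co-occurrence shown here is one tree-specific way to use the trained model as a distance metric):

```python
# A minimal sketch: training cases that fall into the same decision-tree leaf
# as a new case are returned as its "most similar" cases, so the explanation
# is a set of concrete precedents rather than internal model weights.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

def explain_by_cases(x_new):
    """Indices of training cases in the same leaf as x_new."""
    leaf = tree.apply([x_new])[0]          # leaf id for the case to explain
    train_leaves = tree.apply(X)           # leaf ids for all training cases
    return [i for i, l in enumerate(train_leaves) if l == leaf]

similar = explain_by_cases(X[0])           # precedents for the first case
```

In a clinical setting the returned indices would map back to patient records, letting users judge a prediction by inspecting the precedent cases themselves.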

  9. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed Central

    LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.

    1998-01-01

One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and visualizing the cases and the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism for improving communication between developers and users. PMID:9929346

  10. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed

    LeBozec, C; Jaulent, M C; Zapletal, E; Degoulet, P

    1998-01-01

One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and visualizing the cases and the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism for improving communication between developers and users.

  11. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable to the continental law system, is proposed to describe criminal cases. The model adopts credibility logic for reasoning and performs quantitative evidence-based reasoning from the evidence. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions from the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram, and the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  12. Case Problems for Problem-Based Pedagogical Approaches: A Comparative Analysis

    ERIC Educational Resources Information Center

    Dabbagh, Nada; Dass, Susan

    2013-01-01

    A comparative analysis of 51 case problems used in five problem-based pedagogical models was conducted to examine whether there are differences in their characteristics and the implications of such differences on the selection and generation of ill-structured case problems. The five pedagogical models were: situated learning, goal-based scenario,…

  13. QSPR modeling: graph connectivity indices versus line graph connectivity indices

    PubMed

    Basak; Nikolic; Trinajstic; Amic; Beslo

    2000-07-01

Five QSPR models of alkanes were reinvestigated. Properties considered were molecular surface-dependent properties (boiling points and gas chromatographic retention indices) and molecular volume-dependent properties (molar volumes and molar refractions). The vertex- and edge-connectivity indices were used as structural parameters. In each studied case we computed connectivity indices of alkane trees and alkane line graphs and searched for the optimum exponent. Models based on indices with an optimum exponent were compared with models based on the standard value of the exponent. Thus, for each property we generated six QSPR models (four for alkane trees and two for the corresponding line graphs). In all studied cases, QSPR models based on connectivity indices with optimum exponents have better statistical characteristics than models based on connectivity indices with the standard value of the exponent. The comparison between models based on vertex- and edge-connectivity indices gave better models based on edge-connectivity indices in two cases (molar volumes and molar refractions) and better models based on vertex-connectivity indices in three cases (boiling points for octanes and nonanes, and gas chromatographic retention indices). Thus, it appears that the edge-connectivity index is more appropriate for modeling structure-molecular volume properties, and the vertex-connectivity index for modeling structure-molecular surface properties. The use of line graphs did not improve the predictive power of the connectivity indices; in only one case (boiling points of nonanes) was a better model obtained with line graphs.
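The vertex-connectivity index the study builds on can be computed directly from the hydrogen-suppressed graph; this is a generic sketch of the standard formula (exponent -0.5), not the authors' code:

```python
# Vertex-connectivity (Randic-type) index: the sum over edges (u, v) of
# (deg(u) * deg(v)) ** exponent. The standard exponent is -0.5; the paper
# additionally searches for an optimum exponent per property.
from collections import defaultdict

def connectivity_index(edges, exponent=-0.5):
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return sum((degree[u] * degree[v]) ** exponent for u, v in edges)

# n-butane as a hydrogen-suppressed path graph C1-C2-C3-C4
butane = [(1, 2), (2, 3), (3, 4)]
chi = connectivity_index(butane)   # 2/sqrt(2) + 1/2 ≈ 1.914
```

The edge-connectivity variant is the same sum taken over the line graph of the molecule, i.e., with edges as vertices.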

  14. Documentation of the Retail Price Model

    EPA Pesticide Factsheets

The Retail Price Model (RPM) provides a first‐order estimate of average retail electricity prices using information from the EPA Base Case v.5.13 or other scenarios for each of the 64 Integrated Planning Model (IPM) regions.

  15. [Application of ARIMA model to predict number of malaria cases in China].

    PubMed

    Hui-Yu, H; Hua-Qin, S; Shun-Xian, Z; Lin, A I; Yan, L U; Yu-Chun, C; Shi-Zhu, L I; Xue-Jiao, T; Chun-Li, Y; Wei, H U; Jia-Xu, C

    2017-08-15

Objective To study the application of the autoregressive integrated moving average (ARIMA) model to predicting the monthly reported malaria cases in China, so as to provide a reference for malaria prevention and control. Methods SPSS 24.0 software was used to construct ARIMA models based on the monthly reported malaria cases for the time series 2006-2015 and 2011-2015, respectively. The malaria case data from January to December 2016 were used as validation data to compare the accuracy of the two ARIMA models. Results The models of the monthly reported malaria cases in China were ARIMA(2,1,1)(1,1,0)12 and ARIMA(1,0,0)(1,1,0)12, respectively. Comparison of the two models' predictions against the actual malaria cases showed that the ARIMA model based on the 2011-2015 data forecast more accurately than the model based on the 2006-2015 data. Conclusion The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted continually as data accumulate; in addition, major changes in the epidemic characteristics of infectious diseases must be considered.

  16. A Case-Based Learning Model in Orthodontics.

    ERIC Educational Resources Information Center

    Engel, Francoise E.; Hendricson, William D.

    1994-01-01

    A case-based, student-centered instructional model designed to mimic orthodontic problem solving and decision making in dental general practice is described. Small groups of students analyze case data, then record and discuss their diagnoses and treatments. Students and instructors rated the seminars positively, and students reported improved…

  17. Predicate Argument Structure Analysis for Use Case Description Modeling

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  18. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of a process model, a control logic model, and the MMI for the APR1400, as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions, and lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the PCS model development. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  19. [Comparison of Flu Outbreak Reporting Standards Based on Transmission Dynamics Model].

    PubMed

    Yang, Guo-jing; Yi, Qing-jie; Li, Qin; Zeng, Qing

    2016-05-01

To compare the two current flu outbreak reporting standards for the purpose of better prevention and control of flu outbreaks. A susceptible-exposed-infectious/asymptomatic-removed (SEIAR) model without interventions was set up first, followed by a model with interventions based on the real situation. Simulated interventions were developed based on the two reporting standards and evaluated by the estimated duration of outbreaks, cumulative new cases, cumulative morbidity rates, decline in percentage of morbidity rates, and cumulative secondary cases. The basic reproductive number of the outbreak was estimated as 8.2. The simulation produced results similar to the real situation. The effect of interventions based on reporting standard one (10 accumulated new cases in a week) was better than that of interventions based on reporting standard two (30 accumulated new cases in a week). Reporting standard one (10 accumulated new cases in a week) is more effective for the prevention and control of flu outbreaks.
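The SEIAR structure named in the abstract can be sketched as a simple compartmental simulation (all rates and the Euler integrator are illustrative assumptions, not the paper's fitted values):

```python
# Illustrative SEIAR dynamics: susceptible (S) -> exposed (E) -> either
# symptomatic infectious (I, fraction p) or asymptomatic (A) -> removed (R),
# integrated with a plain Euler step. Parameter values are hypothetical.
def seiar(S, E, I, A, R, beta, kappa, p, gamma, days, dt=0.1):
    N = S + E + I + A + R              # total population (conserved)
    history = []
    for _ in range(int(days / dt)):
        infection = beta * S * (I + A) / N
        dS = -infection
        dE = infection - kappa * E
        dI = p * kappa * E - gamma * I
        dA = (1 - p) * kappa * E - gamma * A
        dR = gamma * (I + A)
        S += dS * dt; E += dE * dt; I += dI * dt; A += dA * dt; R += dR * dt
        history.append((S, E, I, A, R))
    return history

run = seiar(S=999, E=0, I=1, A=0, R=0,
            beta=1.2, kappa=0.5, p=0.7, gamma=0.3, days=60)
final_S = run[-1][0]   # susceptibles remaining after the simulated outbreak
```

An intervention model like the paper's would reduce `beta` from the day a reporting standard is triggered, and the two standards could then be compared on cumulative new cases.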

  20. Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Nutaro, James J

This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu and leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk of choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.

  1. Screening of pollution control and clean-up materials for river chemical spills using the multiple case-based reasoning method with a difference-driven revision strategy.

    PubMed

    Liu, Rentao; Jiang, Jiping; Guo, Liang; Shi, Bin; Liu, Jie; Du, Zhaolin; Wang, Peng

    2016-06-01

    In-depth filtering of emergency disposal technology (EDT) and materials has been required in the process of environmental pollution emergency disposal. However, an urgent problem that must be solved is how to quickly and accurately select the most appropriate materials for treating a pollution event from the existing spill control and clean-up materials (SCCM). To meet this need, the following objectives were addressed in this study. First, the material base and a case base for environment pollution emergency disposal were established to build a foundation and provide material for SCCM screening. Second, the multiple case-based reasoning model method with a difference-driven revision strategy (DDRS-MCBR) was applied to improve the original dual case-based reasoning model method system, and screening and decision-making was performed for SCCM using this model. Third, an actual environmental pollution accident from 2012 was used as a case study to verify the material base, case base, and screening model. The results demonstrated that the DDRS-MCBR method was fast, efficient, and practical. The DDRS-MCBR method changes the passive situation in which the choice of SCCM screening depends only on the subjective experience of the decision maker and offers a new approach to screening SCCM.

  2. Case-Based Modeling for Learning Management and Interpersonal Skills

    ERIC Educational Resources Information Center

    Lyons, Paul

    2008-01-01

    This article offers an introduction to case-based modeling (CBM) and a demonstration of the efficacy of this instructional model. CBM is grounded primarily in the concepts and theory of experiential learning, augmented by concepts of script creation. Although it is labor intensive, the model is one that has value for instruction in various…

  3. Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems

    NASA Astrophysics Data System (ADS)

    Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen

Data base classification suffers from two well-known difficulties, i.e., high dimensionality and non-stationary variations within large historic data. This paper presents a hybrid classification model integrating a case-based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various data base applications. The model is mainly based on the idea that the historic data base can be transformed into a smaller case base together with a group of fuzzy decision rules. As a result, the model can respond more accurately to the current data under classification, through induction by these smaller case-based fuzzy decision trees. Hit rate is applied as a performance measure, and the effectiveness of the proposed model is demonstrated by experimental comparison with other approaches on different data base classification applications. The average hit rate of the proposed model is the highest among those compared.

  4. Everglades Landscape Model: Integrated Assessment of Hydrology, Biogeochemistry, and Biology

    NASA Astrophysics Data System (ADS)

    Fitz, H. C.; Wang, N.; Sklar, F. H.

    2002-05-01

    Water management infrastructure and operations have fragmented the greater Everglades into separate, impounded basins, altering flows and hydropatterns. A significant area of this managed system has experienced anthropogenic eutrophication. This combination of altered hydrology and water quality has interacted to degrade vegetative habitats and other ecological characteristics of the Everglades. One of the modeling tools to be used in developing restoration alternatives is the Everglades Landscape Model (ELM), a process-based, spatially explicit simulation of ecosystem dynamics across a heterogeneous, 10,000 km2 region. The model has been calibrated to capture hydrologic and surface water quality dynamics across most of the Everglades landscape over decadal time scales. We evaluated phosphorus loading throughout the Everglades system under two base scenarios. The 1995 base case assumed current management operations, with phosphorus inflow concentrations fixed at their long term, historical average. The 2050 base case assumed future modifications in water and nutrient management, with all managed inflows to the Everglades having reduced phosphorus concentrations. In an example indicator subregion that currently is highly eutrophic, the 31-yr simulations predicted that desirable periphyton and macrophyte communities were maintained under the 2050 base case, whereas in the 1995 base case, periphyton biomass and production decreased to negligible levels and macrophytes became extremely dense. The negative periphyton response in the 1995 base case was due to high phosphorus loads and rapid macrophyte growth that shaded this algal community. Along an existing 11 km eutrophication gradient, the model indicated that the 2050 base case had ecologically significant reductions in phosphorus accumulation compared to the 1995 base case. 
Indicator regions (in Everglades National Park) distant from phosphorus inflow points also exhibited reductions in phosphorus accumulation under the 2050 base case, albeit to a lesser extent due to its distance from phosphorus inflows. The ELM fills a critical information need in Everglades management, and has become an accepted tool in evaluating scenarios of potential restoration of the natural system.

  5. Is Dysfunctional Use of the Mobile Phone a Behavioural Addiction? Confronting Symptom-Based Versus Process-Based Approaches.

    PubMed

    Billieux, Joël; Philippot, Pierre; Schmid, Cécile; Maurage, Pierre; De Mol, Jan; Van der Linden, Martial

    2015-01-01

    Dysfunctional use of the mobile phone has often been conceptualized as a 'behavioural addiction' that shares most features with drug addictions. In the current article, we challenge the clinical utility of the addiction model as applied to mobile phone overuse. We describe the case of a woman who overuses her mobile phone from two distinct approaches: (1) a symptom-based categorical approach inspired from the addiction model of dysfunctional mobile phone use and (2) a process-based approach resulting from an idiosyncratic clinical case conceptualization. In the case depicted here, the addiction model was shown to lead to standardized and non-relevant treatment, whereas the clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific, empirically based psychological interventions. This finding highlights that conceptualizing excessive behaviours (e.g., gambling and sex) within the addiction model can be a simplification of an individual's psychological functioning, offering only limited clinical relevance. The addiction model, applied to excessive behaviours (e.g., gambling, sex and Internet-related activities) may lead to non-relevant standardized treatments. Clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific empirically based psychological interventions. The biomedical model might lead to the simplification of an individual's psychological functioning with limited clinical relevance. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Integrated Planning Model (IPM) Base Case v.4.10

    EPA Pesticide Factsheets

    Learn about EPA's IPM Base Case v.4.10, including Proposed Transport Rule results, documentation, the National Electric Energy Data System (NEEDS) database and user's guide, and run results using previous base cases.

  7. Influenza forecasting with Google Flu Trends.

    PubMed

    Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E

    2013-01-01

    We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as, Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on the average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trend data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends confirming the predictive capabilities of search query based syndromic surveillance. 
This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.

  8. Moist air state above counterflow wet-cooling tower fill based on Merkel, generalised Merkel and Klimanek & Białecky models

    NASA Astrophysics Data System (ADS)

    Hyhlík, Tomáš

    2017-09-01

The article deals with an evaluation of the moist air state above a counterflow wet-cooling tower fill. The results based on the Klimanek & Białecky model are compared with results of the Merkel model and the generalised Merkel model. The numerical simulation shows that the generalised Merkel model predicts the temperature correctly in the case of saturated or super-saturated air above the fill, but underpredicts it in the case of unsaturated moist air above the fill. The classical Merkel model always underpredicts the temperature above the fill. The density of moist air above the fill calculated using the generalised Merkel model is strongly overpredicted in the case of unsaturated moist air above the fill.

  9. The use of multiple models in case-based diagnosis

    NASA Technical Reports Server (NTRS)

    Karamouzis, Stamos T.; Feyock, Stefan

    1993-01-01

    The work described in this paper has as its goal the integration of a number of reasoning techniques into a unified intelligent information system that will aid flight crews with malfunction diagnosis and prognostication. One of these approaches involves using the extensive archive of information contained in aircraft accident reports, along with various models of the aircraft, as the basis for case-based reasoning about malfunctions. Case-based reasoning draws conclusions on the basis of similarities between the present situation and prior experience. We maintain that the ability of a CBR program to reason about physical systems is significantly enhanced by the addition of various models. This paper describes the diagnostic concepts implemented in a prototypical case-based reasoner that operates in the domain of in-flight fault diagnosis, the various models used in conjunction with the reasoner's CBR component, and results from a preliminary evaluation.

  10. Modeling vs. Coaching of Argumentation in a Case-Based Learning Environment.

    ERIC Educational Resources Information Center

    Li, Tiancheng; And Others

    The major purposes of this study are: (1) to investigate and compare the effectiveness of two instructional strategies, modeling and coaching on helping students to articulate and support their decisions in a case-based learning environment; (2) to compare the effectiveness of modeling and coaching on helping students address essential criteria in…

  11. Properties of inductive reasoning.

    PubMed

    Heit, E

    2000-12-01

    This paper reviews the main psychological phenomena of inductive reasoning, covering 25 years of experimental and model-based research, in particular addressing four questions. First, what makes a case or event generalizable to other cases? Second, what makes a set of cases generalizable? Third, what makes a property or predicate projectable? Fourth, how do psychological models of induction address these results? The key results in inductive reasoning are outlined, and several recent models, including a new Bayesian account, are evaluated with respect to these results. In addition, future directions for experimental and model-based work are proposed.

  12. Base Case v.5.15 Documentation Supplement to Support the Clean Power Plan

    EPA Pesticide Factsheets

    Learn about several modeling assumptions used as part of EPA's analysis of the Clean Power Plan (Carbon Pollution Guidelines for Existing Electric Generating Units) using the EPA v.5.15 Base Case using Integrated Planning Model (IPM).

  13. A public health decision support system model using reasoning methods.

    PubMed

    Mera, Maritza; González, Carolina; Blobel, Bernd

    2015-01-01

    Public health programs must be based on the real health needs of the population. However, the design of efficient and effective public health programs is subject to the availability of information that allows users to identify, at the right time, the health issues that require special attention. The objective of this paper is to propose a case-based reasoning model for the support of decision-making in public health. The model integrates a decision-making process and case-based reasoning, reusing past experiences to promptly identify new population health priorities. A prototype of the model was implemented using the case-based reasoning framework jColibri. The proposed model helps address problems encountered today in designing public health programs in Colombia, where current programs are developed under uncertain conditions, as the underlying analyses are carried out on the basis of outdated and unreliable data.
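    The retrieval step at the heart of such a case-based reasoner can be sketched as a similarity search over past cases. The feature set and cases below are hypothetical, and the similarity measure is a deliberately simple stand-in for what a framework like jColibri provides.

```python
def retrieve(case_base, query, k=2):
    """Return the k past cases most similar to the query, ranked by the
    inverse of the mean absolute difference over shared numeric features."""
    def similarity(features):
        diffs = [abs(features[f] - query[f]) for f in query if f in features]
        return 1.0 / (1.0 + sum(diffs) / len(diffs))
    ranked = sorted(case_base, key=lambda c: similarity(c["features"]), reverse=True)
    return ranked[:k]

# Hypothetical past public health situations:
case_base = [
    {"id": "flu-outbreak-2012", "features": {"incidence": 30, "age_median": 8}},
    {"id": "dengue-2014",       "features": {"incidence": 12, "age_median": 25}},
    {"id": "flu-outbreak-2015", "features": {"incidence": 29, "age_median": 10}},
]
best = retrieve(case_base, {"incidence": 29, "age_median": 9})
```

    The retrieved cases would then be adapted and reused to suggest priorities for the new situation, the reuse/revise steps of the classical CBR cycle.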

  14. RECURSIVE PROTEIN MODELING: A DIVIDE AND CONQUER STRATEGY FOR PROTEIN STRUCTURE PREDICTION AND ITS CASE STUDY IN CASP9

    PubMed Central

    CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN

    2013-01-01

    After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques — template-based modeling and template-free modeling — have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e. certain) and template-free (i.e. uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379
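    The divide-and-conquer idea can be sketched as a recursion: model a region with a template if one covers it, model small uncovered regions template-free, and otherwise split and recurse. All callables below are stubs standing in for real template-search and modeling engines.

```python
def model_region(region, find_template, template_model, free_model, min_len=20):
    """Recursively model a protein region given as a (start, end) residue range."""
    start, end = region
    template = find_template(region)
    if template is not None:                      # certain region: template-based
        return [template_model(region, template)]
    if end - start <= min_len:                    # small uncertain region: template-free
        return [free_model(region)]
    mid = (start + end) // 2                      # otherwise divide and conquer
    return (model_region((start, mid), find_template, template_model, free_model, min_len)
            + model_region((mid, end), find_template, template_model, free_model, min_len))

# Stubs: a template covers only the first 50 residues of a 100-residue query.
find_template  = lambda r: "hypothetical-template" if r[1] <= 50 else None
template_model = lambda r, t: f"template:{r}"
free_model     = lambda r: f"free:{r}"
models = model_region((0, 100), find_template, template_model, free_model)
```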

  15. Developing Emotion-Based Case Formulations: A Research-Informed Method.

    PubMed

    Pascual-Leone, Antonio; Kramer, Ueli

    2017-01-01

    New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change; it provides a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing offers a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message: Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Correlation analysis of air pollutant index levels and dengue cases across five different zones in Selangor, Malaysia.

    PubMed

    Thiruchelvam, Loshini; Dass, Sarat C; Zaki, Rafdzah; Yahya, Abqariyah; Asirvadam, Vijanth S

    2018-05-07

    This study investigated the potential relationship between dengue cases and air quality, as measured by the Air Pollution Index (API), for five zones in the state of Selangor, Malaysia. Dengue case patterns can be learned using prediction models based on feedback (lagged terms). However, the question of whether air quality affects dengue cases has not been thoroughly investigated with such feedback models. This work developed dengue prediction models using the autoregressive integrated moving average (ARIMA) and ARIMA with an exogenous variable (ARIMAX) time series methodologies, with API as the exogenous variable. The Box-Jenkins approach based on maximum likelihood was used for analysis as it gives effective model estimates and predictions. Three stages of model comparison were carried out for each zone: first with ARIMA models without API, then ARIMAX models with API data from the API station for that zone, and finally ARIMAX models with API data from the zone and spatially neighbouring zones. The Bayesian Information Criterion (BIC) gives goodness-of-fit versus parsimony comparisons between all elicited models. Our study found that ARIMA models, with the lowest BIC values, outperformed the rest in all five zones. The BIC values for the zone of Kuala Selangor were -800.66, -796.22, and -790.5229, respectively, for ARIMA only, ARIMAX with a single API component, and ARIMAX with API components from its zone and spatially neighbouring zones. Therefore, we concluded that API levels, either temporally for each zone or spatio-temporally based on neighbouring zones, do not have a significant effect on dengue cases.
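    The BIC comparison used above can be reproduced mechanically from each model's maximized log-likelihood and parameter count. The log-likelihood values below are invented for illustration; only the formula and the "lower BIC wins" rule come from standard practice.

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian Information Criterion: k * ln(n) - 2 * ln(L); lower is better."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Illustrative fits of three candidate models to the same n = 300 observations:
candidates = {
    "ARIMA":                 bic(410.0, n_params=3, n_obs=300),
    "ARIMAX (own-zone API)": bic(410.8, n_params=4, n_obs=300),
    "ARIMAX (neighbours)":   bic(412.1, n_params=6, n_obs=300),
}
best = min(candidates, key=candidates.get)
```

    Note how the extra API terms raise the likelihood slightly but are penalized for the added parameters, mirroring the study's finding that plain ARIMA wins.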

  17. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    EPA Science Inventory

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  18. Safety Case Development as an Information Modelling Problem

    NASA Astrophysics Data System (ADS)

    Lewis, Robert

    This paper considers the benefits from applying information modelling as the basis for creating an electronically-based safety case. It highlights the current difficulties of developing and managing large document-based safety cases for complex systems such as those found in Air Traffic Control systems. After a review of current tools and related literature on this subject, the paper proceeds to examine the many relationships between entities that can exist within a large safety case. The paper considers the benefits to both safety case writers and readers from the future development of an ideal safety case tool that is able to exploit these information models. The paper also introduces the idea that the safety case has formal relationships between entities that directly support the safety case argument using a methodology such as GSN, and informal relationships that provide links to direct and backing evidence and to supporting information.

  19. Integration of Optimal Scheduling with Case-Based Planning.

    DTIC Science & Technology

    1995-08-01

    integrates Case-Based Reasoning (CBR) and Rule-Based Reasoning (RBR) systems. 'Tachyon: A Constraint-Based Temporal Reasoning Model and Its...Implementation' provides an overview of the Tachyon temporal reasoning system and discusses its possible applications. 'Dual-Use Applications of Tachyon: From...Force Structure Modeling to Manufacturing Scheduling' discusses the application of Tachyon to real-world problems, specifically military force deployment and manufacturing scheduling.

  20. Model-based methods for case definitions from administrative health data: application to rheumatoid arthritis

    PubMed Central

    Kroeker, Kristine; Widdifield, Jessica; Muthukumarana, Saman; Jiang, Depeng; Lix, Lisa M

    2017-01-01

    Objective This research proposes a model-based method to facilitate the selection of disease case definitions from validation studies for administrative health data. The method is demonstrated for a rheumatoid arthritis (RA) validation study. Study design and setting Data were from 148 definitions to ascertain cases of RA in hospital, physician and prescription medication administrative data. We considered: (A) separate univariate models for sensitivity and specificity, (B) univariate model for Youden’s summary index and (C) bivariate (ie, joint) mixed-effects model for sensitivity and specificity. Model covariates included the number of diagnoses in physician, hospital and emergency department records, physician diagnosis observation time, duration of time between physician diagnoses and number of RA-related prescription medication records. Results The most common case definition attributes were: 1+ hospital diagnosis (65%), 2+ physician diagnoses (43%), 1+ specialist physician diagnosis (51%) and 2+ years of physician diagnosis observation time (27%). Statistically significant improvements in sensitivity and/or specificity for separate univariate models were associated with (all p values <0.01): 2+ and 3+ physician diagnoses, unlimited physician diagnosis observation time, 1+ specialist physician diagnosis and 1+ RA-related prescription medication records (65+ years only). The bivariate model produced similar results. Youden’s index was associated with these same case definition criteria, except for the length of the physician diagnosis observation time. Conclusion A model-based method provides valuable empirical evidence to aid in selecting a definition(s) for ascertaining diagnosed disease cases from administrative health data. The choice between univariate and bivariate models depends on the goals of the validation study and number of case definitions. PMID:28645978
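    Youden's summary index used in model (B) combines a definition's sensitivity and specificity into a single number that can be compared across candidate definitions. The two RA case definitions and their operating characteristics below are hypothetical.

```python
def youden_index(sensitivity, specificity):
    """Youden's J = sensitivity + specificity - 1 (range -1 to 1; higher is better)."""
    return sensitivity + specificity - 1.0

# Hypothetical performance of two candidate RA case definitions:
definitions = {
    "1+ hospital dx":             youden_index(0.72, 0.95),
    "3+ physician dx in 2 years": youden_index(0.81, 0.93),
}
best = max(definitions, key=definitions.get)
```

    A model-based version, as in the paper, would regress such indices on case definition attributes (diagnosis counts, observation time, medication records) rather than compare raw values.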

  1. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers.

    PubMed

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-10-31

    To evaluate the performance of a broad-scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 previously treated patients from two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients from the same institution and from another clinic that did not provide patients for the training phase. The automated plans were compared against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3 %) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5 %) did the reference plans pass the criteria while the model-based plans failed. In 5.3 % of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad-scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application in clinical practice.

  2. Cases as Shared Inquiry: A Dialogical Model of Teacher Preparation.

    ERIC Educational Resources Information Center

    Harrington, Helen L.; Garrison, James W.

    1992-01-01

    A dialogical model is proposed for connecting theory to practice in teacher education by conceiving of cases from case-based pedagogy as problems that initiate shared inquiry. Cases with genuine cognitive and axiological content can initiate self-directed, student-centered inquiry while building democratic dialogical communities. (SLD)

  3. The Effect of a Case-Based Reasoning Instructional Model on Korean High School Students' Awareness in Climate Change Unit

    ERIC Educational Resources Information Center

    Jeong, Jinwoo; Kim, Hyoungbum; Chae, Dong-hyun; Kim, Eunjeong

    2014-01-01

    The purpose of this study is to investigate the effects of the case-based reasoning instructional model on learning about climate change unit. Results suggest that students showed interest because it allowed them to find the solution to the problem and solve the problem for themselves by analogy from other cases such as crossword puzzles in an…

  4. Estimation of the contribution of private providers in tuberculosis case notification and treatment outcome in Pakistan.

    PubMed

    Chughtai, A A; Qadeer, E; Khan, W; Hadi, H; Memon, I A

    2013-03-01

    To improve involvement of the private sector in the national tuberculosis (TB) programme in Pakistan various public-private mix projects were set up between 2004 and 2009. A retrospective analysis of data was made to study 6 different public-private mix models for TB control in Pakistan and estimate the contribution of the various private providers to TB case notification and treatment outcome. The number of TB cases notified through the private sector increased significantly from 77 cases in 2004 to 37,656 in 2009. Among the models, the nongovernmental organization model made the greatest contribution to case notification (58.3%), followed by the hospital-based model (18.9%). Treatment success was highest for the district-led model (94.1%) and lowest for the hospital-based model (74.2%). The private sector made an important contribution to the national data through the various public-private mix projects. Issues of sustainability and the lack of treatment supporters are discussed as reasons for lack of success of some projects.

  5. Environmental and cost life cycle assessment of disinfection options for municipal wastewater treatment

    EPA Science Inventory

    This document summarizes the data collection, analysis, and results for a base case wastewater treatment (WWT) plant reference model. The base case is modeled after the Metropolitan Sewer District of Greater Cincinnati (MSDGC) Mill Creek Plant. The plant has an activated sludge s...

  6. A Case Study of a School-Based Curriculum Development as a Model for INSET.

    ERIC Educational Resources Information Center

    Keiny, Shoshana; Weiss, Tzila

    1986-01-01

    Using a school-based curriculum development approach, the Israeli Environmental Education Project constructed a conceptual model for environmental education curriculum development. A team of teachers sharing knowledge developed a case study about water regulation and its consequences in a desert environment, which is described. (MT)

  7. Case-Based Policy and Goal Recognition

    DTIC Science & Technology

    2015-09-30

    or noisy. Ontanón et al. [8] use case-based reasoning (CBR) to model human driving vehicle control behaviors and skill level to reduce teen crash...Snodgrass, S., Bonfiglio, D., Winston, F.K., McDonald, C., Gonzalez, A.J.: Case-based prediction of teen driver behavior and skill. In: Proceedings

  8. Improving the FLORIS wind plant model for compatibility with gradient-based optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Jared J.; Gebraad, Pieter MO; Ning, Andrew

    The FLORIS (FLOw Redirection and Induction in Steady-state) model, a parametric wind turbine wake model that predicts steady-state wake characteristics based on wind turbine position and yaw angle, was developed for optimization of control settings and turbine locations. This article provides details on changes made to the FLORIS model to make it more suitable for gradient-based optimization. Changes were made to remove discontinuities and add curvature to regions of non-physical zero gradient. Exact gradients for the FLORIS model were obtained using algorithmic differentiation. A set of three case studies demonstrates that using exact gradients with gradient-based optimization reduces the number of function calls by several orders of magnitude. The case studies also show that adding curvature improves convergence behavior, allowing gradient-based optimization algorithms used with the FLORIS model to more reliably find better solutions to wind farm optimization problems.
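    The motivation for adding curvature can be seen in miniature with a clamped function: where the original is flat, the gradient is exactly zero and gives the optimizer no direction, while a smooth surrogate retains a small informative slope. The softplus surrogate below is a generic smoothing device, not the specific modification used in FLORIS.

```python
import math

def hard(x):
    """max(0, x): gradient is exactly zero for x < 0 (a non-physical flat region)."""
    return max(0.0, x)

def soft(x, beta=10.0):
    """Softplus surrogate log(1 + e^(beta*x)) / beta: smooth, nonzero slope everywhere."""
    return math.log1p(math.exp(beta * x)) / beta

def num_grad(f, x, h=1e-6):
    """Central finite-difference gradient."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

g_hard = num_grad(hard, -0.5)  # zero: no descent direction for the optimizer
g_soft = num_grad(soft, -0.5)  # small positive: points toward the active region
```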

  9. Application of the critical pathway and integrated case teaching method to nursing orientation.

    PubMed

    Goodman, D

    1997-01-01

    Nursing staff development programs must be responsive to current changes in healthcare. New nursing staff must be prepared to manage continuous change and to function competently in clinical practice. The orientation pathway, based on a case management model, is used as a structure for the orientation phase of staff development. The integrated case is incorporated as a teaching strategy in orientation. The integrated case method is based on discussion and analysis of patient situations with emphasis on role modeling and integration of theory and skill. The orientation pathway and integrated case teaching method provide a useful framework for orientation of new staff. Educators, preceptors and orientees find the structure provided by the orientation pathway very useful. Orientation that is developed, implemented and evaluated based on a case management model with the use of an orientation pathway and incorporation of an integrated case teaching method provides a standardized structure for orientation of new staff. This approach is designed for the adult learner, promotes conceptual reasoning, and encourages the social and contextual basis for continued learning.

  10. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  11. Learning to Mean in Spanish Writing: A Case Study of a Genre-Based Pedagogy for Standards-Based Writing Instruction

    ERIC Educational Resources Information Center

    Troyan, Francis J.

    2016-01-01

    This case study reports the results of a genre-based approach, which was used to explicitly teach the touristic landmark description to fourth-grade students of Spanish as a foreign language. The instructional model and unit of instruction were informed by the pedagogies of the Sydney School of Linguistics and an instructional model for…

  12. Assessing the detail needed to capture rainfall-runoff dynamics with physics-based hydrologic response simulation

    USGS Publications Warehouse

    Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.

    2011-01-01

    Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms. Copyright 2011 by the American Geophysical Union.

  13. Augment clinical measurement using a constraint-based esophageal model

    NASA Astrophysics Data System (ADS)

    Kou, Wenjun; Acharya, Shashank; Kahrilas, Peter; Patankar, Neelesh; Pandolfino, John

    2017-11-01

    Quantifying the mechanical properties of the esophageal wall is crucial to understanding the impairments of trans-esophageal flow characteristic of several esophageal diseases. However, these data are unavailable owing to technological limitations of current clinical diagnostic instruments, which instead display esophageal luminal cross-sectional area based on intraluminal impedance change. In this work, we developed an esophageal model to predict bolus flow and wall properties from clinical measurements. The model uses the constraint-based immersed-boundary method developed previously by our group. Specifically, we first approximate the time-dependent wall geometry based on impedance planimetry data on luminal cross-sectional area. We then feed these, along with pressure data, into the model and compute wall tension from the simulated pressure and flow fields, and the material property from the stress-strain relationship. As examples, we applied this model to augment FLIP (Functional Luminal Imaging Probe) measurements in three clinical cases: a normal subject, achalasia, and eosinophilic esophagitis (EoE). Our findings suggest that wall stiffness was greatest in the EoE case, followed by the achalasia case, and then the normal subject. This work is supported by NIH Grants R01 DK56033 and R01 DK079902.
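    As a rough illustration of how wall tension can be recovered from the quantities FLIP actually measures, the sketch below applies thin-walled Laplace's law, T = P·r, with the radius back-calculated from the luminal cross-sectional area. This is a simplified stand-in for the paper's immersed-boundary computation, and the pressure and area values are invented.

```python
import math

def hoop_tension(pressure_mmhg, csa_mm2):
    """Circumferential wall tension per unit length (N/m) for a thin-walled
    cylinder: T = P * r, with r = sqrt(A / pi) from the measured luminal CSA."""
    pressure_pa = pressure_mmhg * 133.322             # mmHg -> Pa
    radius_m = math.sqrt(csa_mm2 / math.pi) / 1000.0  # mm -> m
    return pressure_pa * radius_m

tension = hoop_tension(pressure_mmhg=30.0, csa_mm2=80.0)
```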

  14. Applying cost accounting to operating room staffing in otolaryngology: time-driven activity-based costing and outpatient adenotonsillectomy.

    PubMed

    Balakrishnan, Karthik; Goico, Brian; Arjmand, Ellis M

    2015-04-01

    (1) To describe the application of a detailed cost-accounting method (time-driven activity-based costing) to operating room personnel costs, avoiding the proxy use of hospital and provider charges. (2) To model potential cost efficiencies using different staffing models with the case study of outpatient adenotonsillectomy. Prospective cost analysis case study. Tertiary pediatric hospital. All otolaryngology providers and otolaryngology operating room staff at our institution. Time-driven activity-based costing demonstrated precise per-case and per-minute calculation of personnel costs. We identified several areas of unused personnel capacity in a basic staffing model. Per-case personnel costs decreased by 23.2% by allowing a surgeon to run 2 operating rooms, despite doubling all other staff. Further cost reductions up to a total of 26.4% were predicted with additional staffing rearrangements. Time-driven activity-based costing allows detailed understanding of not only personnel costs but also how personnel time is used. This in turn allows testing of alternative staffing models to decrease unused personnel capacity and increase efficiency. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
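    The core arithmetic of time-driven activity-based costing is a capacity cost rate (cost per available minute) multiplied by the minutes each case consumes. The salaries, session lengths, and per-case minutes below are hypothetical; the ~23% saving in the paper emerges from its actual figures, not these.

```python
def cost_per_case(staff, session_minutes=480.0):
    """Sum each role's capacity cost rate ($/min) times minutes used per case."""
    return sum((cost_per_session / session_minutes) * minutes_per_case
               for cost_per_session, minutes_per_case in staff.values())

# Hypothetical ($ per 8-hour session, minutes consumed per case):
one_room = {
    "surgeon":    (2400.0, 30.0),
    "nurse":      (480.0,  45.0),
    "anesthesia": (1600.0, 40.0),
}
# Surgeon running two rooms: surgeon minutes charged per case are halved.
two_room = dict(one_room, surgeon=(2400.0, 15.0))
saving = 1.0 - cost_per_case(two_room) / cost_per_case(one_room)
```

    The same calculation, run role by role, is what exposes unused capacity: any role whose paid minutes exceed the minutes consumed by the caseload is a candidate for restaffing.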

  15. Case-Based Modeling for Learning: Socially Constructed Skill Development

    ERIC Educational Resources Information Center

    Lyons, Paul; Bandura, Randall P.

    2018-01-01

    Purpose: Grounded on components of experiential learning theory (ELT) and self-regulation of learning (SRL) theory, augmented by elements of action theory and script development, the purpose of this paper is to demonstrate the case-based modeling (CBM) instructional approach that stimulates learning in groups or teams. CBM is related to individual…

  16. a Study on Satellite Diagnostic Expert Systems Using Case-Based Approach

    NASA Astrophysics Data System (ADS)

    Park, Young-Tack; Kim, Jae-Hoon; Park, Hyun-Soo

    1997-06-01

    Much research is ongoing on monitoring and diagnosing diverse malfunctions of satellite systems as their complexity and number increase. Currently, much monitoring and diagnosis work is carried out by human experts, but there is a need to automate their routine tasks. Hence, it is worth studying expert systems that can perform routine work automatically, allowing human experts to devote their expertise to the more critical areas of monitoring and diagnosis. In this paper, we employ artificial intelligence techniques to model human experts' knowledge and to reason over the constructed knowledge base. In particular, case-based approaches are used to construct a knowledge base that models human expert capabilities using typical past exemplars. We have designed and implemented a prototype case-based system for diagnosing satellite malfunctions. Our system remembers typical failure cases and diagnoses a current malfunction by indexing into the case base. Diverse methods are used to build a user-friendly interface that allows human experts to build a knowledge base easily.

  17. Use of machine learning methods to reduce predictive error of groundwater models.

    PubMed

    Xu, Tianfang; Valocchi, Albert J; Choi, Jaesik; Amir, Eyal

    2014-01-01

    Quantitative analyses of groundwater flow and transport typically rely on a physically-based model, which is inherently subject to error. Errors in model structure, parameter and data lead to both random and systematic error even in the output of a calibrated model. We develop complementary data-driven models (DDMs) to reduce the predictive error of physically-based groundwater models. Two machine learning techniques, the instance-based weighting and support vector regression, are used to build the DDMs. This approach is illustrated using two real-world case studies of the Republican River Compact Administration model and the Spokane Valley-Rathdrum Prairie model. The two groundwater models have different hydrogeologic settings, parameterization, and calibration methods. In the first case study, cluster analysis is introduced for data preprocessing to make the DDMs more robust and computationally efficient. The DDMs reduce the root-mean-square error (RMSE) of the temporal, spatial, and spatiotemporal prediction of piezometric head of the groundwater model by 82%, 60%, and 48%, respectively. In the second case study, the DDMs reduce the RMSE of the temporal prediction of piezometric head of the groundwater model by 77%. It is further demonstrated that the effectiveness of the DDMs depends on the existence and extent of the structure in the error of the physically-based model. © 2013, National GroundWater Association.
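    The residual-correction idea can be sketched with a toy instance-based model: learn the physics model's error at observed points, predict it elsewhere from nearest neighbours, and subtract it. This is a deliberately minimal stand-in for the instance-based weighting and support vector regression models in the study.

```python
def knn_residual_model(train_x, train_err, k=2):
    """Predict the physics model's error at x as the mean error of the
    k nearest training points (1-D, brute force)."""
    def predict(x):
        nearest = sorted(zip(train_x, train_err), key=lambda p: abs(p[0] - x))[:k]
        return sum(err for _, err in nearest) / k
    return predict

# Hypothetical calibration data: the physics model over-predicts head by ~0.5 m.
xs   = [0.0, 1.0, 2.0, 3.0]
errs = [0.5, 0.4, 0.6, 0.5]       # physics prediction minus observation
ddm = knn_residual_model(xs, errs)
residual_after = 0.5 - ddm(1.5)   # error left after correcting at x = 1.5
```

    As the abstract notes, this only helps where the physics model's error has structure; a purely random error field gives the data-driven model nothing to learn.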

  18. High laboratory cost predicted per tuberculosis case diagnosed with increased case finding without a triage strategy.

    PubMed

    Dunbar, R; Naidoo, P; Beyers, N; Langley, I

    2017-09-01

    Cape Town, South Africa. To model the effects of increased case finding and triage strategies on laboratory costs per tuberculosis (TB) case diagnosed. We used a validated operational model and published laboratory cost data. We modelled the effect of varying the proportion with TB among presumptive cases and Xpert cartridge price reductions on cost per TB case and per additional TB case diagnosed in the Xpert-based vs. smear/culture-based algorithms. In our current scenario (18.3% with TB among presumptive cases), the proportion of cases diagnosed increased by 8.7% (16.7% vs. 15.0%), and the cost per case diagnosed increased by 142% (US$121 vs. US$50). The cost per additional case diagnosed was US$986. This would increase to US$1619 if the proportion with TB among presumptive cases was 10.6%. At 25.9-30.8% of TB prevalence among presumptive cases and a 50% reduction in Xpert cartridge price, the cost per TB case diagnosed would range from US$50 to US$59 (comparable to the US$48.77 found in routine practice with smear/culture). The operational model illustrates the effect of increased case finding on laboratory costs per TB case diagnosed. Unless triage strategies are identified, the approach will not be sustainable, even if Xpert cartridge prices are reduced.
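The cost-per-additional-case arithmetic can be illustrated as follows. The cohort size is hypothetical, and the yields and per-case costs are the rounded figures quoted in the abstract, so the result only illustrates the formula and will not reproduce the published US$986, which was computed from the study's full laboratory cost streams.

```python
# Hypothetical cohort of 1000 presumptive TB cases. Yields and per-case
# costs are rounded figures from the abstract; totals are illustrative.
presumptive = 1000

smear_yield, xpert_yield = 0.150, 0.167            # proportion diagnosed
smear_cost, xpert_cost = 50.0, 121.0               # US$ per case diagnosed

smear_cases = smear_yield * presumptive
xpert_cases = xpert_yield * presumptive

# Cost per ADDITIONAL case: extra laboratory spend / extra cases found.
extra_cost = xpert_cases * xpert_cost - smear_cases * smear_cost
extra_cases = xpert_cases - smear_cases
cost_per_additional = extra_cost / extra_cases
print(round(cost_per_additional, 2))
```

The formula makes the abstract's sensitivity result intuitive: as the proportion with TB among presumptive cases falls, `extra_cases` shrinks faster than `extra_cost`, so the cost per additional case diagnosed rises sharply.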

  19. COLLABORATE©: A Universal Competency-Based Paradigm for Professional Case Management, Part II: Competency Clarification.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2013-01-01

    The purpose of this second article of a 3-article series is to clarify the competencies for a new paradigm of case management built upon a value-driven foundation that is applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby™ or Pokey™. This is exactly the time to define a competency-based case management model, highlighting one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. Although there is an inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  20. [Study on the ARIMA model application to predict echinococcosis cases in China].

    PubMed

    En-Li, Tan; Zheng-Feng, Wang; Wen-Ce, Zhou; Shi-Zhu, Li; Yan, Lu; Lin, Ai; Yu-Chun, Cai; Xue-Jiao, Teng; Shun-Xian, Zhang; Zhi-Sheng, Dang; Chun-Li, Yang; Jia-Xu, Chen; Wei, Hu; Xiao-Nong, Zhou; Li-Guang, Tian

    2018-02-26

    To predict the monthly reported echinococcosis cases in China with the autoregressive integrated moving average (ARIMA) model, so as to provide a reference for the prevention and control of echinococcosis. SPSS 24.0 software was used to construct ARIMA models based on the monthly reported echinococcosis cases for the time series 2007-2015 and 2007-2014, respectively, and the accuracies of the two models were compared. The model based on the monthly reported cases from 2007 to 2015 was ARIMA(1, 0, 0)(1, 1, 0)12; the relative error between reported and predicted cases was -13.97%, with AR(1) = 0.367 (t = 3.816, P < 0.001), SAR(1) = -0.328 (t = -3.361, P = 0.001), and Ljung-Box Q = 14.119 (df = 16, P = 0.590). The model based on the monthly reported cases from 2007 to 2014 was ARIMA(1, 0, 0)(1, 0, 1)12; the relative error between reported and predicted cases was 0.56%, with AR(1) = 0.413 (t = 4.244, P < 0.001), SAR(1) = 0.809 (t = 9.584, P < 0.001), SMA(1) = 0.356 (t = 2.278, P = 0.025), and Ljung-Box Q = 18.924 (df = 15, P = 0.217). Different time series may yield different ARIMA models for the same infectious disease. Whether accumulating more data and shortening the prediction horizon reduces the mean relative error remains to be verified. The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted and optimized continuously as data accumulate; meanwhile, full consideration should be given to the intensity of related infectious disease reporting work (such as disease censuses and special investigations).
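A simplified sketch of the seasonal ARIMA idea: seasonal differencing (the "(1, 1, 0)12" component) followed by a least-squares fit of the first-order and seasonal autoregressive terms. The monthly series below is synthetic, not the study's surveillance data, and the full model would be fitted by maximum likelihood, as in SPSS.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly counts with trend + yearly cycle (stand-in data; the
# study's national surveillance series is not reproduced here).
t = np.arange(108)                                   # 9 years of months
cases = (200 + 0.5 * t + 30 * np.sin(2 * np.pi * t / 12)
         + rng.normal(0, 5, t.size))

# Seasonal differencing: the "(1, 1, 0)12" part removes the yearly cycle.
w = cases[12:] - cases[:-12]

# Fit w_t = phi * w_{t-1} + Phi * w_{t-12} + c by ordinary least squares
# (a crude stand-in for full maximum-likelihood ARIMA estimation).
y = w[12:]
X = np.column_stack([w[11:-1], w[:-12], np.ones(y.size)])
phi, Phi, c = np.linalg.lstsq(X, y, rcond=None)[0]

# One-step-ahead forecast on the differenced scale, then undifference.
w_next = phi * w[-1] + Phi * w[-12] + c
forecast = cases[-12] + w_next
print(round(forecast, 1))   # should sit near the noise-free value of 254
```

The undifferencing step (adding the forecast increment back onto the value twelve months earlier) is what lets the model track both the trend and the seasonal cycle without modeling either explicitly.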

  1. Coding response to a case-mix measurement system based on multiple diagnoses.

    PubMed

    Preyra, Colin

    2004-08-01

    To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post.

  2. “Docs 'n Drugs” - A System for Case-Oriented and Web-based Training in Medicine

    PubMed Central

    Martens, A.; Bernauer, J.

    1999-01-01

    The tutoring process of conventional case-oriented medical training systems can be characterised as either guided or unguided. In contrast, the aim of the system “Docs 'n Drugs” is to distinguish between different levels of guidance. The author can realise a tutoring case as a guided, half-guided, or unguided tutoring process. The system architecture distinguishes between an authoring system and a tutoring system. Both are founded on the tutoring process model and the case-based knowledge model. This structure allows elements of existing tutoring cases to be reused. Tutoring cases can be realised in German and English.

  3. Reaction time for trimolecular reactions in compartment-based reaction-diffusion models

    NASA Astrophysics Data System (ADS)

    Li, Fei; Chen, Minghan; Erban, Radek; Cao, Yang

    2018-05-01

    Trimolecular reaction models are investigated in the compartment-based (lattice-based) framework for stochastic reaction-diffusion modeling. The formulae for the first collision time and the mean reaction time are derived for the case where three molecules are present in the solution under periodic boundary conditions. For the case of reflecting boundary conditions, similar formulae are obtained using a computer-assisted approach. The accuracy of these formulae is further verified through comparison with numerical results. The presented derivation is based on the first passage time analysis of Montroll [J. Math. Phys. 10, 753 (1969)]. Montroll's results for two-dimensional lattice-based random walks are adapted and applied to compartment-based models of trimolecular reactions, which are studied in one-dimensional or pseudo one-dimensional domains.
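The compartment-based setting can be illustrated with a small Monte Carlo estimate of the first collision time of three molecules on a periodic one-dimensional lattice. The lattice size, initial positions, and jump rule below are illustrative; the simulation estimates numerically the kind of quantity the paper derives analytically.

```python
import random

random.seed(2)

def first_collision_time(n_sites=10, max_steps=100_000):
    """Events until three independent random walkers on a ring of
    compartments (periodic boundary) first occupy the same compartment."""
    pos = [0, n_sites // 3, 2 * n_sites // 3]       # start well separated
    for step in range(1, max_steps + 1):
        i = random.randrange(3)                     # one molecule jumps
        pos[i] = (pos[i] + random.choice((-1, 1))) % n_sites
        if pos[0] == pos[1] == pos[2]:
            return step
    return max_steps

samples = [first_collision_time() for _ in range(300)]
mean_time = sum(samples) / len(samples)
print(round(mean_time, 1))
```

Analytical formulae of the kind derived in the paper are valuable precisely because such Monte Carlo estimates converge slowly: the sample mean above carries statistical error that shrinks only as the square root of the number of trials.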

  4. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the model type changes, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the model type is changed. PMID:25302314
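The core SBST loop, searching an input space against a fitness function such as branch distance, can be sketched independently of any particular model type. The function under test and the hill-climbing parameters below are invented for illustration.

```python
import random

random.seed(3)

def branch_distance(x, y):
    """Fitness for covering a target branch in a toy function under test:
    `if x == 2 * y and x > 100: ...` -- distance 0 means branch taken."""
    return abs(x - 2 * y) + max(0, 101 - x)

def hill_climb(iterations=50_000):
    """Minimize branch distance by accepting non-worsening neighbours."""
    x, y = random.randint(-1000, 1000), random.randint(-1000, 1000)
    best = branch_distance(x, y)
    for _ in range(iterations):
        nx, ny = x + random.randint(-10, 10), y + random.randint(-10, 10)
        d = branch_distance(nx, ny)
        if d <= best:                  # equal moves allow plateau drift
            x, y, best = nx, ny, d
        if best == 0:
            break
    return x, y, best

x, y, dist = hill_climb()
print(dist, x == 2 * y and x > 100)
```

Only `branch_distance` depends on the system under test; the search loop itself is model-independent, which is the separation the proposed framework exploits.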

  5. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    NASA Astrophysics Data System (ADS)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One of the urgent directions for enhancing the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in a PO CKB, the use of a case-based reasoning approach is encouraged. Under this approach, the content of a case as a knowledge base component has been defined; based on the situation tree, a PO CKB knowledge model has been developed in which knowledge about typical situations, as well as specific examples of situations and solutions, is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation have been suggested. The obtained models allow creating and using corporate knowledge bases to support decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in the work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested that the developed models be used to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
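A minimal sketch of case retrieval in such a knowledge base: each case pairs a situation (attribute-value pairs) with a solution, and retrieval returns the stored case most similar to the query situation. All attribute names, values, and cases below are invented.

```python
# Invented attributes and cases; retrieval returns the stored case whose
# situation best matches the query (fraction of matching attributes).

def similarity(a, b):
    """Fraction of attributes on which two situations agree."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

case_base = [
    ({"subsystem": "pump", "symptom": "pressure_drop", "load": "high"},
     "inspect seal, check for cavitation"),
    ({"subsystem": "pump", "symptom": "vibration", "load": "high"},
     "check bearing wear and shaft alignment"),
    ({"subsystem": "valve", "symptom": "pressure_drop", "load": "low"},
     "verify actuator position, flush line"),
]

def retrieve(query):
    """Return the (situation, solution) case most similar to the query."""
    return max(case_base, key=lambda case: similarity(case[0], query))

situation, solution = retrieve(
    {"subsystem": "pump", "symptom": "pressure_drop", "load": "high"})
print(solution)   # exact match -> "inspect seal, check for cavitation"
```

A situation tree, as proposed in the article, would replace the flat scan of `case_base` with a walk from general to specific situation classes, but the retrieve-by-similarity principle is the same.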

  6. Genetic programming based quantitative structure-retention relationships for the prediction of Kovats retention indices.

    PubMed

    Goel, Purva; Bapat, Sanket; Vyas, Renu; Tambe, Amruta; Tambe, Sanjeev S

    2015-11-13

    The development of quantitative structure-retention relationships (QSRR) aims at constructing an appropriate linear/nonlinear model for the prediction of the retention behavior (such as Kovats retention index) of a solute on a chromatographic column. Commonly, multi-linear regression and artificial neural networks are used in the QSRR development in the gas chromatography (GC). In this study, an artificial intelligence based data-driven modeling formalism, namely genetic programming (GP), has been introduced for the development of quantitative structure based models predicting Kovats retention indices (KRI). The novelty of the GP formalism is that given an example dataset, it searches and optimizes both the form (structure) and the parameters of an appropriate linear/nonlinear data-fitting model. Thus, it is not necessary to pre-specify the form of the data-fitting model in the GP-based modeling. These models are also less complex, simple to understand, and easy to deploy. The effectiveness of GP in constructing QSRRs has been demonstrated by developing models predicting KRIs of light hydrocarbons (case study-I) and adamantane derivatives (case study-II). In each case study, two-, three- and four-descriptor models have been developed using the KRI data available in the literature. The results of these studies clearly indicate that the GP-based models possess an excellent KRI prediction accuracy and generalization capability. Specifically, the best performing four-descriptor models in both the case studies have yielded high (>0.9) values of the coefficient of determination (R(2)) and low values of root mean squared error (RMSE) and mean absolute percent error (MAPE) for training, test and validation set data. The characteristic feature of this study is that it introduces a practical and an effective GP-based method for developing QSRRs in gas chromatography that can be gainfully utilized for developing other types of data-driven models in chromatography science. 
Copyright © 2015 Elsevier B.V. All rights reserved.
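The GP idea can be sketched in a heavily simplified, mutation-only form (no crossover, no real descriptor selection), evolving expression trees over two invented "descriptors" toward a synthetic target that stands in for a KRI relation. Nothing below reproduces the study's datasets or its GP configuration.

```python
import math
import random

random.seed(5)

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def random_tree(depth=3):
    """Random expression tree over descriptors x0, x1 and constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x0", "x1", round(random.uniform(-2, 2), 2)])
    return (random.choice(list(OPS)),
            random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x0, x1):
    if tree == "x0":
        return x0
    if tree == "x1":
        return x1
    if not isinstance(tree, tuple):
        return tree                      # numeric constant leaf
    op, left, right = tree
    return OPS[op](evaluate(left, x0, x1), evaluate(right, x0, x1))

def mutate(tree):
    if not isinstance(tree, tuple):
        return random_tree(depth=0)      # swap a leaf for another leaf
    if random.random() < 0.2:
        return random_tree()             # replace a whole subtree
    return (tree[0], mutate(tree[1]), mutate(tree[2]))

# Invented dataset: target = 3*x0 + x0*x1 over a small descriptor grid.
data = [(x0, x1, 3 * x0 + x0 * x1)
        for x0 in range(1, 6) for x1 in range(1, 6)]

def rmse(tree):
    err = [(evaluate(tree, x0, x1) - y) ** 2 for x0, x1, y in data]
    return math.sqrt(sum(err) / len(err))

pop = [random_tree() for _ in range(200)]
for _ in range(30):                      # mutation-only evolution, elitism
    pop.sort(key=rmse)
    pop = pop[:50] + [mutate(random.choice(pop[:50])) for _ in range(150)]

best = min(pop, key=rmse)
print(round(rmse(best), 2))
```

The key property the abstract highlights is visible even in this toy: the evolved object is an explicit expression tree, so both the model's form and its constants are products of the search rather than pre-specified.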

  7. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provide the worst-case differential proton spectrum. This model is based on data from IMP-8 and GOES spacecraft that provide a data base extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.

  8. Performance measurement and modeling of component applications in a high performance computing environment : a case study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Robert C.; Ray, Jaideep; Malony, A.

    2003-11-01

    We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and to construct performance models for two of them. Both computational and message-passing performance are addressed.

  9. Combined Application of Study Design and Case-Based Learning Comprehensive Model in Epidemiology Teaching

    ERIC Educational Resources Information Center

    Shi, Xiuquan; Zhou, Yanna; Wang, Haiyan; Wang, Tao; Nie, Chan; Shi, Shangpeng

    2017-01-01

    This paper aims to apply SD-CBL (study design combined with case-based learning) in Epidemiology teaching and to evaluate its effect. Students from five classes were recruited, and a combined comprehensive teaching model of SD-CBL was used in the "Injury Epidemiology" chapter, while other chapters in "Epidemiology"…

  10. Vibration modelling and verifications for whole aero-engine

    NASA Astrophysics Data System (ADS)

    Chen, G.

    2015-08-01

    In this study, a new rotor-ball-bearing-casing coupling dynamic model for a practical aero-engine is established. In the coupling system, the rotor and casing systems are modelled using the finite element method, support systems are modelled as lumped parameter models, nonlinear factors of ball bearings and faults are included, and four types of supports and connection models are defined to model the complex rotor-support-casing coupling system of the aero-engine. A new numerical integral method that combines the Newmark-β method and the improved Newmark-β method (Zhai method) is used to obtain the system responses. Finally, the new model is verified in three ways: (1) modal experiment based on rotor-ball bearing rig, (2) modal experiment based on rotor-ball-bearing-casing rig, and (3) fault simulations for a certain type of missile turbofan aero-engine vibration. The results show that the proposed model can not only simulate the natural vibration characteristics of the whole aero-engine but also effectively perform nonlinear dynamic simulations of a whole aero-engine with faults.
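The Newmark-β part of the hybrid integrator can be sketched for the simplest possible case: an undamped single-degree-of-freedom oscillator with the average-acceleration parameters β = 1/4, γ = 1/2. This toy omits everything that makes the paper's problem hard (nonlinear bearings, coupled rotor-support-casing structure, the Zhai refinement), and all parameters are illustrative.

```python
import math

# Undamped single-DOF oscillator m*u'' + k*u = 0, average-acceleration
# Newmark-beta (beta = 1/4, gamma = 1/2), integrated over one period.
m, k = 1.0, 4.0 * math.pi ** 2          # natural period T = 1 s
beta, gamma = 0.25, 0.5
dt, steps = 0.01, 100

u, v = 1.0, 0.0                          # initial displacement, velocity
a = -k * u / m                           # initial acceleration

for _ in range(steps):
    # Newmark predictors from the current state
    u_pred = u + dt * v + dt * dt * (0.5 - beta) * a
    v_pred = v + dt * (1.0 - gamma) * a
    # enforce m*a_new + k*u_new = 0 with u_new = u_pred + beta*dt^2*a_new
    a_new = -k * u_pred / (m + k * beta * dt * dt)
    u = u_pred + beta * dt * dt * a_new
    v = v_pred + gamma * dt * a_new
    a = a_new

# After one full period the exact solution returns to u = 1; average
# acceleration is unconditionally stable with only a small period error.
print(round(u, 2))
```

For linear terms this implicit update is solved exactly, as above; the nonlinear bearing and fault forces in the whole-engine model are what motivate combining it with an explicit scheme.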

  11. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background Integration of metabolic pathways resources and regulatory metabolic network models, and deploying new tools on the integrated platform can help perform more effective and more efficient systems biology research on understanding the regulation in metabolic networks. Therefore, the tasks of (a) integrating under a single database environment regulatory metabolic networks and existing models, and (b) building tools to help with modeling and analysis are desirable and intellectually challenging computational tasks. Description PathCase Systems Biology (PathCase-SB) is built and released. The PathCase-SB database provides data and API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools towards facilitating the development of kinetic models for biological systems. PathCase-SB aims to integrate data of selected biological data sources on the web (currently, BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  12. On quantum integrable models related to nonlinear quantum optics. An algebraic Bethe ansatz approach

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav

    1989-08-01

    A unified approach based on Bethe ansatz in a large variety of integrable models in quantum optics is given. Second harmonics generation, three-boson interaction, the Dicke model, and some cases of four-boson interaction as special cases of su(2)⊕su(1,1)-Gaudin models are included.

  13. COLLABORATE©: a universal competency-based paradigm for professional case management, part i: introduction, historical validation, and competency presentation.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2013-01-01

    The purpose of this first of a three-article series is to provide context and justification for a new paradigm of case management built upon a value-driven foundation that is applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby or Pokey. This is exactly why the time has come to define a competency-based case management model, one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than by the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. While there is inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  14. A Study of Fan Stage/Casing Interaction Models

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Carney, Kelly; Gallardo, Vicente

    2003-01-01

    The purpose of the present study is to investigate the performance of several existing and new blade-case interaction modeling capabilities that are compatible with the large system simulations used to capture structural response during blade-out events. Three contact models are examined for simulating the interactions between a rotor bladed disk and a case: a radial gap element, a linear gap element, and a new element based on a hydrodynamic formulation. The first two models are currently available in commercial finite element codes such as NASTRAN and have been shown to perform adequately for simulating rotor-case interactions. The hydrodynamic model, although not readily available in commercial codes, may prove better able to characterize rotor-case interactions.

  15. Coding Response to a Case-Mix Measurement System Based on Multiple Diagnoses

    PubMed Central

    Preyra, Colin

    2004-01-01

    Objective To examine the hospital coding response to a payment model using a case-mix measurement system based on multiple diagnoses and the resulting impact on a hospital cost model. Data Sources Financial, clinical, and supplementary data for all Ontario short stay hospitals from years 1997 to 2002. Study Design Disaggregated trends in hospital case-mix growth are examined for five years following the adoption of an inpatient classification system making extensive use of combinations of secondary diagnoses. Hospital case mix is decomposed into base and complexity components. The longitudinal effects of coding variation on a standard hospital payment model are examined in terms of payment accuracy and impact on adjustment factors. Principal Findings Introduction of the refined case-mix system provided incentives for hospitals to increase reporting of secondary diagnoses and resulted in growth in highest complexity cases that were not matched by increased resource use over time. Despite a pronounced coding response on the part of hospitals, the increase in measured complexity and case mix did not reduce the unexplained variation in hospital unit cost nor did it reduce the reliance on the teaching adjustment factor, a potential proxy for case mix. The main implication was changes in the size and distribution of predicted hospital operating costs. Conclusions Jurisdictions introducing extensive refinements to standard diagnostic related group (DRG)-type payment systems should consider the effects of induced changes to hospital coding practices. Assessing model performance should include analysis of the robustness of classification systems to hospital-level variation in coding practices. Unanticipated coding effects imply that case-mix models hypothesized to perform well ex ante may not meet expectations ex post. PMID:15230940

  16. Ethical Implications of Case-Based Payment in China: A Systematic Analysis.

    PubMed

    Jin, Pingyue; Biller-Andorno, Nikola; Wild, Verina

    2015-12-01

    How health care providers are paid affects how medicine is practiced. It is thus important to assess provider payment models not only from the economic perspective but also from the ethical perspective. China recently started to reform the provider payment model in the health care system from fee-for-service to case-based payment. This paper aims to examine this transition from an ethical perspective. We collected empirical studies on the impact of case-based payment in the Chinese health care system and applied a systematic ethical matrix that integrates clinical ethics and public health ethics to analyze the empirical findings. We identified eleven prominent ethical issues related to case-based payment. Some ethical problems of case-based payment in China are comparable to ethical problems of managed care and diagnosis related groups in high-income countries. However, in this paper we discuss in greater detail four specific ethical issues in the Chinese context: professionalism, the patient-physician relationship, access to care and patient autonomy. Based on the analysis, we cautiously infer that case-based payment is currently more ethically acceptable than fee-for-service in the context of China, mainly because it seems to lower financial barriers to access care. Nonetheless, it will be difficult to justify the implementation of case-based payment if no additional measures are taken to monitor and minimize its existing negative ethical implications. © 2014 John Wiley & Sons Ltd.

  17. Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise

    NASA Astrophysics Data System (ADS)

    Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej

    2010-11-01

    Simulation models are used for decision support and learning in enterprises and in schools. Three cases of successful applications demonstrate the usefulness of weak anticipative information. Job shop scheduling with a makespan criterion presents a real case of customized flexible furniture production optimization; a genetic algorithm for job shop scheduling optimization is presented. Simulation-based inventory control describes inventory optimization for products with stochastic lead time and demand; dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. All three cases are discussed from the optimization, modeling, and learning points of view.

  18. Investigating the Effect of Damage Progression Model Choice on Prognostics Performance

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Narasimhan, Sriram; Saha, Sankalita; Saha, Bhaskar; Goebel, Kai

    2011-01-01

    The success of model-based approaches to systems health management depends largely on the quality of the underlying models. In model-based prognostics, it is especially the quality of the damage progression models, i.e., the models describing how damage evolves as the system operates, that determines the accuracy and precision of remaining useful life predictions. Several common forms of these models are generally assumed in the literature, but are often not supported by physical evidence or physics-based analysis. In this paper, using a centrifugal pump as a case study, we develop different damage progression models. In simulation, we investigate how model changes influence prognostics performance. Results demonstrate that, in some cases, simple damage progression models are sufficient. But, in general, the results show a clear need for damage progression models that are accurate over long time horizons under varied loading conditions.
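The paper's central point, that damage models which agree early in life can disagree sharply in their remaining-useful-life (RUL) predictions, can be sketched with two hypothetical scalar damage laws. The functional forms, parameters, and threshold below are invented, not the centrifugal pump models from the study.

```python
import math

D0, D_FAIL = 0.1, 1.0          # initial damage and failure threshold

def linear_damage(t, rate=0.005):
    """Hypothetical linear damage growth."""
    return D0 + rate * t

def exponential_damage(t, lam=0.03):
    """Hypothetical exponential damage growth."""
    return D0 * math.exp(lam * t)

def rul(model, t_now, horizon=1000):
    """Time from t_now until the model first crosses the threshold."""
    for t in range(t_now, t_now + horizon):
        if model(t) >= D_FAIL:
            return t - t_now
    return horizon

# Early in life the two models are nearly indistinguishable...
assert abs(linear_damage(10) - exponential_damage(10)) < 0.02
# ...yet their remaining-useful-life predictions differ sharply.
print(rul(linear_damage, 10), rul(exponential_damage, 10))   # → 170 67
```

This is why short-horizon validation can be misleading: data that cannot discriminate between candidate damage laws early on still leaves large uncertainty in end-of-life prediction.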

  19. A Corpus-Based Discourse Information Analysis of Chinese EFL Learners' Autonomy in Legal Case Brief Writing

    ERIC Educational Resources Information Center

    Chen, Jinshi

    2017-01-01

    Legal case brief writing is pedagogically important yet insufficiently discussed for Chinese EFL learners majoring in law. Based on process genre approach and discourse information theory (DIT), the present study designs a corpus-based analytical model for Chinese EFL learners' autonomy in legal case brief writing and explores the process of case…

  20. A general science-based framework for dynamical spatio-temporal models

    USGS Publications Warehouse

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic nonlinearity and demonstrate that it accommodates many different classes of science-based parameterizations as special cases. The model is presented in a hierarchical Bayesian framework and is illustrated with examples from ecology and oceanography. © 2010 Sociedad de Estadística e Investigación Operativa.
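The linear, first-order Markovian, Gaussian-error setting discussed in this abstract is easy to make concrete. The sketch below simulates such a model, y_t = M y_{t-1} + η_t, with an illustrative diffusion-like propagator M; all dimensions and coefficients are invented for the example, not taken from the paper.

```python
import numpy as np

# Linear first-order Markovian spatio-temporal model:
#   y_t = M @ y_{t-1} + eta_t,  eta_t ~ N(0, 0.01 I)
# The propagator M here is an illustrative nearest-neighbour
# diffusion stencil; a science-based parameterization would
# instead derive M from, e.g., a discretized PDE or IDE kernel.
rng = np.random.default_rng(0)

n_sites, n_steps = 10, 50
M = (0.5 * np.eye(n_sites)
     + 0.2 * np.eye(n_sites, k=1)
     + 0.2 * np.eye(n_sites, k=-1))   # stable: spectral radius < 1

y = np.zeros((n_steps, n_sites))
y[0] = rng.normal(size=n_sites)
for t in range(1, n_steps):
    y[t] = M @ y[t - 1] + 0.1 * rng.normal(size=n_sites)
```

A hierarchical Bayesian treatment would place priors on the entries of M (often structured by the scientific model) rather than fixing them as here.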

  1. Equilibrium pricing in an order book environment: Case study for a spin model

    NASA Astrophysics Data System (ADS)

    Meudt, Frederik; Schmitt, Thilo A.; Schäfer, Rudi; Guhr, Thomas

    2016-07-01

When modeling stock market dynamics, the price formation is often based on an equilibrium mechanism. In real stock exchanges, however, the price formation is governed by the order book. It is thus interesting to check if the resulting stylized facts of a model with equilibrium pricing change, remain the same or, more generally, are compatible with the order book environment. We tackle this issue in the framework of a case study by embedding the Bornholdt-Kaizoji-Fujiwara spin model into the order book dynamics. To this end, we use a recently developed agent-based model that realistically incorporates the order book. We find realistic stylized facts. We conclude for the studied case that equilibrium pricing is not needed and that the corresponding assumption of a "fundamental" price may be abandoned.

  2. How Polish Children Switch from One Case to Another when Using Novel Nouns: Challenges for Models of Inflectional Morphology

    ERIC Educational Resources Information Center

    Krajewski, Grzegorz; Theakston, Anna L.; Lieven, Elena V. M.; Tomasello, Michael

    2011-01-01

    The two main models of children's acquisition of inflectional morphology--the Dual-Mechanism approach and the usage-based (schema-based) approach--have both been applied mainly to languages with fairly simple morphological systems. Here we report two studies of 2-3-year-old Polish children's ability to generalise across case-inflectional endings…

  3. Training Post-9/11 Police Officers with a Counter-Terrorism Reality-Based Training Model: A Case Study

    ERIC Educational Resources Information Center

    Biddle, Christopher J.

    2013-01-01

    The purpose of this qualitative holistic multiple-case study was to identify the optimal theoretical approach for a Counter-Terrorism Reality-Based Training (CTRBT) model to train post-9/11 police officers to perform effectively in their counter-terrorism assignments. Post-9/11 police officers assigned to counter-terrorism duties are not trained…

  4. A Hybrid Approach Using Case-Based Reasoning and Rule-Based Reasoning to Support Cancer Diagnosis: A Pilot Study.

    PubMed

    Saraiva, Renata M; Bezerra, João; Perkusich, Mirko; Almeida, Hyggo; Siebra, Clauirton

    2015-01-01

Recently there has been an increasing interest in applying information technology to support the diagnosis of diseases such as cancer. In this paper, we present a hybrid approach using case-based reasoning (CBR) and rule-based reasoning (RBR) to support cancer diagnosis. We used symptoms, signs, and personal information from patients as inputs to our model. To form specialized diagnoses, we used rules to define the input factors' importance according to the patient's characteristics. The model's output presents the probability of the patient having a type of cancer. To carry out this research, we had the approval of the ethics committee at Napoleão Laureano Hospital, in João Pessoa, Brazil. To define our model's cases, we collected real patient data at Napoleão Laureano Hospital. To define our model's rules and weights, we researched specialized literature and interviewed health professionals. To validate our model, we used K-fold cross validation with the data collected at Napoleão Laureano Hospital. The results showed that our approach is an effective CBR system to diagnose cancer.
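The retrieval half of such a CBR/RBR hybrid can be sketched as a weighted nearest-neighbour lookup in which rules set the feature weights. Everything below (features, weights, cases) is a made-up illustration of the general technique, not the paper's model.

```python
import numpy as np

# Weighted nearest-neighbour case retrieval: a rule base would set
# `weights` per patient profile; retrieval then ranks stored cases
# by weighted Euclidean distance. Data here are purely illustrative.
def retrieve(case_base, query, weights):
    """Return the index of the most similar stored case."""
    diffs = case_base - query
    dists = np.sqrt((weights * diffs ** 2).sum(axis=1))
    return int(np.argmin(dists))

case_base = np.array([[1.0, 0.0, 3.0],
                      [0.0, 1.0, 0.0],
                      [1.0, 1.0, 2.9]])
weights = np.array([1.0, 1.0, 2.0])   # a rule up-weights feature 3
best = retrieve(case_base, np.array([1.0, 0.9, 3.0]), weights)  # -> 2
```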

  5. Numerical modeling of subsidence induced by hydrocarbon production in a reservoir in coastal Louisiana

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Voyiadjis, G.

    2017-12-01

Subsidence has caused significant wetland losses in coastal Louisiana due to various anthropogenic and geologic processes. Releveling data from the National Geodetic Survey show that one of the governing factors in coastal Louisiana is hydrocarbon production, which has led to the acceleration of spatially and temporally dependent subsidence. This work investigates the influence of hydrocarbon production on subsidence for a typical reservoir, the Valentine field in coastal Louisiana, based on finite element modeling in the framework of poroelasticity and poroplasticity. Geertsma's analytical model is first used in this work to interpret the observed subsidence, for a disc-shaped reservoir embedded in a semi-infinite homogeneous elastic medium. Based on the calibrated elastic material properties, the authors set up a 3D finite element model and validate the numerical results with Geertsma's analytical model. As the plastic deformation of a reservoir in an inhomogeneous medium plays an important role in the compaction of the reservoir and the land subsidence, the authors further adopt a modified Cam-Clay model to take account of the plastic compaction of the reservoir. The material properties in the Cam-Clay model are calibrated based on the subsidence observed in the field and that in the homogeneous elastic case. The observed trend and magnitude of subsidence in the Valentine field can be approximately reproduced through finite element modeling in both the homogeneous elastic case and the inhomogeneous plastic case, by using the calibrated material properties. The maximum compaction in the inhomogeneous plastic case is around half of that in the homogeneous elastic case, and thus the ratio of subsidence over compaction is larger in the inhomogeneous plastic case for a softer reservoir embedded in a stiffer medium.

  6. Discovering relevance knowledge in data: a growing cell structures approach.

    PubMed

    Azuaje, F; Dubitzky, W; Black, N; Adamson, K

    2000-01-01

Both information retrieval and case-based reasoning systems rely on effective and efficient selection of relevant data. Typically, relevance in such systems is approximated by similarity or indexing models. However, the definition of what makes data items similar or how they should be indexed is often nontrivial and time-consuming. Based on growing cell structure artificial neural networks, this paper presents a method that automatically constructs a case retrieval model from existing data. Within the case-based reasoning (CBR) framework, the method is evaluated for two medical prognosis tasks, namely, colorectal cancer survival and coronary heart disease risk prognosis. The results of the experiments suggest that the proposed method is effective and robust. To gain a deeper insight and understanding of the underlying mechanisms of the proposed model, a detailed empirical analysis of the model's structural and behavioral properties is also provided.

  7. Robust radio interferometric calibration using the t-distribution

    NASA Astrophysics Data System (ADS)

    Kazemi, S.; Yatawatta, S.

    2013-10-01

    A major stage of radio interferometric data processing is calibration or the estimation of systematic errors in the data and the correction for such errors. A stochastic error (noise) model is assumed, and in most cases, this underlying model is assumed to be Gaussian. However, outliers in the data due to interference or due to errors in the sky model would have adverse effects on processing based on a Gaussian noise model. Most of the shortcomings of calibration such as the loss in flux or coherence, and the appearance of spurious sources, could be attributed to the deviations of the underlying noise model. In this paper, we propose to improve the robustness of calibration by using a noise model based on Student's t-distribution. Student's t-noise is a special case of Gaussian noise when the variance is unknown. Unlike Gaussian-noise-model-based calibration, traditional least-squares minimization would not directly extend to a case when we have a Student's t-noise model. Therefore, we use a variant of the expectation-maximization algorithm, called the expectation-conditional maximization either algorithm, when we have a Student's t-noise model and use the Levenberg-Marquardt algorithm in the maximization step. We give simulation results to show the robustness of the proposed calibration method as opposed to traditional Gaussian-noise-model-based calibration, especially in preserving the flux of weaker sources that are not included in the calibration model.
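The outlier-downweighting that Student's t calibration buys can be seen in a toy EM-style location estimate. This is a generic sketch with a fixed ν (degrees of freedom), not the paper's ECM calibration of interferometric gains: each residual rᵢ gets weight (ν+1)/(ν + rᵢ²/σ²), so gross outliers contribute almost nothing to the update.

```python
import numpy as np

# EM-style robust mean under a Student's t noise model with fixed
# nu: the E-step computes per-point weights that shrink for large
# residuals; the M-step is a weighted update of mean and scale.
# (Toy illustration; the paper embeds this idea in an ECM loop
# with Levenberg-Marquardt for the full calibration problem.)
def t_robust_mean(x, nu=2.0, iters=50):
    mu, sigma2 = float(np.median(x)), float(np.var(x))
    for _ in range(iters):
        r2 = (x - mu) ** 2
        w = (nu + 1.0) / (nu + r2 / sigma2)    # E-step weights
        mu = float(np.sum(w * x) / np.sum(w))  # weighted mean update
        sigma2 = float(np.sum(w * r2) / len(x))
    return mu

x = np.concatenate([np.full(20, 1.0), [100.0]])  # one gross outlier
mu = t_robust_mean(x)   # close to 1.0; the plain mean would be ~5.7
```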

  8. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is regarded as a key bottleneck for applying such models in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The kernel algorithms of automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte-Carlo Sampling) is used for automatic identification of parameters; a detailed technical route based on RSA and MCS is presented. A module for automatic identification of water pipe network model parameters is developed. Finally, taking a typical water pipe network as a case, a case study on automatic parameter identification is conducted and satisfactory results are achieved.
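The MCS identification step can be caricatured in a few lines. The one-parameter "network model" below is a toy stand-in function, since the actual hydraulic simulation and SCADA data are specific to the paper's toolchain:

```python
import numpy as np

# Monte-Carlo Sampling for parameter identification: draw candidate
# parameter values, run the model for each, and keep the candidate
# whose output best matches the observation. `simulate_pressure` is
# a toy stand-in for a hydraulic network solver.
rng = np.random.default_rng(1)

def simulate_pressure(roughness):
    return 50.0 - 0.3 * roughness      # toy pressure-vs-roughness law

observed = simulate_pressure(110.0)    # pretend SCADA measurement

candidates = rng.uniform(80.0, 150.0, size=5000)
errors = np.abs(simulate_pressure(candidates) - observed)
best = candidates[np.argmin(errors)]   # lands near the true value, 110
```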

  9. Black-Box System Testing of Real-Time Embedded Systems Using Random and Search-Based Testing

    NASA Astrophysics Data System (ADS)

    Arcuri, Andrea; Iqbal, Muhammad Zohaib; Briand, Lionel

Testing real-time embedded systems (RTES) is in many ways challenging. Thousands of test cases can be potentially executed on an industrial RTES. Given the magnitude of testing at the system level, only a fully automated approach can really scale up to test industrial RTES. In this paper we take a black-box approach and model the RTES environment using the UML/MARTE international standard. Our main motivation is to provide a more practical approach to the model-based testing of RTES by allowing system testers, who are often not familiar with the system design but know the application domain well enough, to model the environment to enable test automation. Environment models can support the automation of three tasks: the code generation of an environment simulator, the selection of test cases, and the evaluation of their expected results (oracles). In this paper, we focus on the second task (test case selection) and investigate three test automation strategies using inputs from UML/MARTE environment models: Random Testing (baseline), Adaptive Random Testing, and Search-Based Testing (using Genetic Algorithms). Based on one industrial case study and three artificial systems, we show how, in general, no technique is better than the others. Which test selection technique to use is determined by the failure rate (testing stage) and the execution time of test cases. Finally, we propose a practical process to combine the use of all three test strategies.

  10. Case-Mix Adjusting Performance Measures in a Veteran Population: Pharmacy- and Diagnosis-Based Approaches

    PubMed Central

    Liu, Chuan-Fen; Sales, Anne E; Sharp, Nancy D; Fishman, Paul; Sloan, Kevin L; Todd-Stenberg, Jeff; Nichol, W Paul; Rosen, Amy K; Loveland, Susan

    2003-01-01

Objective: To compare the rankings for health care utilization performance measures at the facility level in a Veterans Health Administration (VHA) health care delivery network using pharmacy- and diagnosis-based case-mix adjustment measures. Data Sources/Study Setting: The study included veterans who used inpatient or outpatient services in Veterans Integrated Service Network (VISN) 20 during fiscal year 1998 (October 1997 to September 1998; N=126,076). Utilization and pharmacy data were extracted from VHA national databases and the VISN 20 data warehouse. Study Design: We estimated concurrent regression models using pharmacy or diagnosis information in the base year (FY1998) to predict health service utilization in the same year. Utilization measures included bed days of care for inpatient care and provider visits for outpatient care. Principal Findings: Rankings of predicted utilization measures across facilities vary by case-mix adjustment measure. There is greater consistency within the diagnosis-based models than between the diagnosis- and pharmacy-based models. The eight facilities were ranked differently by the diagnosis- and pharmacy-based models. Conclusions: Choice of case-mix adjustment measure affects rankings of facilities on performance measures, raising concerns about the validity of profiling practices. Differences in rankings may reflect differences in comparability of data capture across facilities between pharmacy and diagnosis data sources, and unstable estimates due to small numbers of patients in a facility. PMID:14596393

  11. Sound scattering by several zooplankton groups. II. Scattering models.

    PubMed

    Stanton, T K; Chu, D; Wiebe, P H

    1998-01-01

    Mathematical scattering models are derived and compared with data from zooplankton from several gross anatomical groups--fluidlike, elastic shelled, and gas bearing. The models are based upon the acoustically inferred boundary conditions determined from laboratory backscattering data presented in part I of this series [Stanton et al., J. Acoust. Soc. Am. 103, 225-235 (1998)]. The models use a combination of ray theory, modal-series solution, and distorted wave Born approximation (DWBA). The formulations, which are inherently approximate, are designed to include only the dominant scattering mechanisms as determined from the experiments. The models for the fluidlike animals (euphausiids in this case) ranged from the simplest case involving two rays, which could qualitatively describe the structure of target strength versus frequency for single pings, to the most complex case involving a rough inhomogeneous asymmetrically tapered bent cylinder using the DWBA-based formulation which could predict echo levels over all angles of incidence (including the difficult region of end-on incidence). The model for the elastic shelled body (gastropods in this case) involved development of an analytical model which takes into account irregularities and discontinuities of the shell. The model for gas-bearing animals (siphonophores) is a hybrid model which is composed of the summation of the exact solution to the gas sphere and the approximate DWBA-based formulation for arbitrarily shaped fluidlike bodies. There is also a simplified ray-based model for the siphonophore. The models are applied to data involving single pings, ping-to-ping variability, and echoes averaged over many pings. There is reasonable qualitative agreement between the predictions and single ping data, and reasonable quantitative agreement between the predictions and variability and averages of echo data.

  12. A comparison between conventional and LANDSAT based hydrologic modeling: The Four Mile Run case study

    NASA Technical Reports Server (NTRS)

    Ragan, R. M.; Jackson, T. J.; Fitch, W. N.; Shubinski, R. P.

    1976-01-01

    Models designed to support the hydrologic studies associated with urban water resources planning require input parameters that are defined in terms of land cover. Estimating the land cover is a difficult and expensive task when drainage areas larger than a few sq. km are involved. Conventional and LANDSAT based methods for estimating the land cover based input parameters required by hydrologic planning models were compared in a case study of the 50.5 sq. km (19.5 sq. mi) Four Mile Run Watershed in Virginia. Results of the study indicate that the LANDSAT based approach is highly cost effective for planning model studies. The conventional approach to define inputs was based on 1:3600 aerial photos, required 110 man-days and a total cost of $14,000. The LANDSAT based approach required 6.9 man-days and cost $2,350. The conventional and LANDSAT based models gave similar results relative to discharges and estimated annual damages expected from no flood control, channelization, and detention storage alternatives.

  13. Development of a Medicaid Behavioral Health Case-Mix Model

    ERIC Educational Resources Information Center

    Robst, John

    2009-01-01

    Many Medicaid programs have either fully or partially carved out mental health services. The evaluation of carve-out plans requires a case-mix model that accounts for differing health status across Medicaid managed care plans. This article develops a diagnosis-based case-mix adjustment system specific to Medicaid behavioral health care. Several…

  14. Description of Exemplar Cases in the Intensive Mental Health Program: Illustrations of Application of the Therapeutic Model

    ERIC Educational Resources Information Center

    Nelson, Timothy D.; Mashunkashey, Joanna O.; Mitchell, Montserrat C.; Benson, Eric R.; Vernberg, Eric M.; Roberts, Michael C.

    2008-01-01

    We describe cases from the clinical records in the Intensive Mental Health Program to illustrate the diverse presenting problems, intervention strategies, therapeutic process, and outcomes for children receiving services in this school-based, community-oriented treatment model. Cases reflect varying degrees of treatment response and potential…

  15. Performance analysis and dynamic modeling of a single-spool turbojet engine

    NASA Astrophysics Data System (ADS)

    Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin

    2017-01-01

The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis consists in the investigation of the operating (equilibrium) regimes and is based on appropriate modeling of the engine's operation at design and off-design regimes; it yields the performance analysis, concluded by the engine's operational maps (i.e. the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performances of a single-spool turbojet is detailed. An in-house code was developed, and its calibration was done for the J85 turbojet engine as the test case. The dynamic modeling of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, the fuel flow rate. The design and management of aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as the test case), with appropriate modeling for steady-state and dynamic analysis.

  16. Mixed Model Association with Family-Biased Case-Control Ascertainment.

    PubMed

    Hayeck, Tristan J; Loh, Po-Ru; Pollack, Samuela; Gusev, Alexander; Patterson, Nick; Zaitlen, Noah A; Price, Alkes L

    2017-01-05

Mixed models have become the tool of choice for genetic association studies; however, standard mixed model methods may be poorly calibrated or underpowered under family sampling bias and/or case-control ascertainment. Previously, we introduced a liability threshold-based mixed model association statistic (LTMLM) to address case-control ascertainment in unrelated samples. Here, we consider family-biased case-control ascertainment, where case and control subjects are ascertained non-randomly with respect to family relatedness. Previous work has shown that this type of ascertainment can severely bias heritability estimates; we show here that it also impacts mixed model association statistics. We introduce a family-based association statistic (LT-Fam) that is robust to this problem. Similar to LTMLM, LT-Fam is computed from posterior mean liabilities (PML) under a liability threshold model; however, LT-Fam uses published narrow-sense heritability estimates to avoid the problem of biased heritability estimation, enabling correct calibration. In simulations with family-biased case-control ascertainment, LT-Fam was correctly calibrated (average χ² = 1.00-1.02 for null SNPs), whereas the Armitage trend test (ATT), standard mixed model association (MLM), and case-control retrospective association test (CARAT) were mis-calibrated (e.g., average χ² = 0.50-1.22 for MLM, 0.89-2.65 for CARAT). LT-Fam also attained higher power than other methods in some settings. In 1,259 type 2 diabetes-affected case subjects and 5,765 control subjects from the CARe cohort, downsampled to induce family-biased ascertainment, LT-Fam was correctly calibrated whereas ATT, MLM, and CARAT were again mis-calibrated. Our results highlight the importance of modeling family sampling bias in case-control datasets with related samples. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  17. Proof of Economic Viability of Blended Learning Business Models

    ERIC Educational Resources Information Center

    Druhmann, Carsten; Hohenberg, Gregor

    2014-01-01

    The discussion on economically sustainable business models with respect to information technology is lacking in many aspects of proven approaches. In the following contribution the economic viability is valued based on a procedural model for design and evaluation of e-learning business models in the form of a case study. As a case study object a…

  18. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    NASA Astrophysics Data System (ADS)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on the metrics adapted from literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  19. Design considerations for case series models with exposure onset measurement error.

    PubMed

    Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V

    2013-02-28

    The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.

  20. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    ERIC Educational Resources Information Center

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  1. Model-based prediction of nephropathia epidemica outbreaks based on climatological and vegetation data and bank vole population dynamics.

    PubMed

    Haredasht, S Amirpour; Taylor, C J; Maes, P; Verstraeten, W W; Clement, J; Barrios, M; Lagrou, K; Van Ranst, M; Coppin, P; Berckmans, D; Aerts, J-M

    2013-11-01

Wildlife-originated zoonotic diseases in general are a major contributor to emerging infectious diseases. Hantaviruses more specifically cause thousands of human disease cases annually worldwide, while understanding and predicting human hantavirus epidemics pose numerous unsolved challenges. Nephropathia epidemica (NE) is a human infection caused by Puumala virus, which is naturally carried and shed by bank voles (Myodes glareolus). The objective of this study was to develop a method that allows model-based prediction of the occurrence of NE epidemics 3 months ahead. Two data sets were utilized to develop and test the models. These data sets were concerned with NE cases in Finland and Belgium. In this study, we selected the most relevant inputs from all the available data for use in a dynamic linear regression (DLR) model. The number of NE cases in Finland was modelled using data from 1996 to 2008. The NE cases were predicted based on the time series data of average monthly air temperature (°C) and the bank voles' trapping index using a DLR model. The bank voles' trapping index data were interpolated using a related dynamic harmonic regression (DHR) model. Here, the DLR and DHR models used time-varying parameters. Both the DHR and DLR models were based on a unified state-space estimation framework. For the Belgium case, no time series of the bank voles' population dynamics were available. Several studies, however, have suggested that the population of bank voles is related to the variation in seed production of beech and oak trees in Northern Europe. Therefore, the NE occurrence pattern in Belgium was predicted based on a DLR model by using remotely sensed phenology parameters of broad-leaved forests, together with the oak and beech seed categories and average monthly air temperature (°C), using data from 2001 to 2009.
Our results suggest that even without any knowledge about hantavirus dynamics in the host population, the time variation in NE outbreaks in Finland could be predicted 3 months ahead with a 34% mean relative prediction error (MRPE). This took into account solely the population dynamics of the carrier species (bank voles). The time series analysis also revealed that climate change, as represented by the vegetation index, changes in forest phenology derived from satellite images and directly measured air temperature, may affect the mechanics of NE transmission. NE outbreaks in Belgium were predicted 3 months ahead with a 40% MRPE, based only on the climatological and vegetation data, in this case, without any knowledge of the bank vole's population dynamics. In this research, we demonstrated that NE outbreaks can be predicted using climate and vegetation data or the bank vole's population dynamics, by using dynamic data-based models with time-varying parameters. Such a predictive modelling approach might be used as a step towards the development of new tools for the prevention of future NE outbreaks. © 2012 Blackwell Verlag GmbH.
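A dynamic linear regression with time-varying parameters can be approximated by recursive least squares with a forgetting factor. The sketch below uses synthetic data (not the NE series) and is a simplification: the paper estimates its DLR/DHR models in a unified state-space framework rather than by forgetting-factor RLS.

```python
import numpy as np

# Recursive least squares with forgetting factor `lam`: older samples
# are discounted geometrically, so the coefficient estimate can track
# a slowly drifting "true" parameter, as in a DLR model.
def rls(X, y, lam=0.95):
    n, p = X.shape
    theta, P = np.zeros(p), np.eye(p) * 1e3
    history = np.zeros((n, p))
    for t in range(n):
        x = X[t]
        k = P @ x / (lam + x @ P @ x)            # gain vector
        theta = theta + k * (y[t] - x @ theta)   # coefficient update
        P = (P - np.outer(k, x @ P)) / lam       # covariance update
        history[t] = theta
    return history

rng = np.random.default_rng(2)
z = rng.normal(size=200)
true_slope = np.linspace(1.0, 3.0, 200)          # drifting coefficient
y = 0.5 + true_slope * z + 0.05 * rng.normal(size=200)
est = rls(np.column_stack([np.ones(200), z]), y)
```

The final row of `est` tracks the drifting slope with a small lag set by the forgetting factor.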

  2. Numerical Investigation of Flapwise-Torsional Vibration Model of a Smart Section Blade with Microtab

    DOE PAGES

    Li, Nailu; Balas, Mark J.; Yang, Hua; ...

    2015-01-01

This paper presents a method to develop an aeroelastic model of a smart section blade equipped with a microtab. The model is suitable for potential passive vibration control study of the blade section in classic flutter. Equations of the model are described by the nondimensional flapwise and torsional vibration modes coupled with the aerodynamic model based on the Theodorsen theory and aerodynamic effects of the microtab based on the wind tunnel experimental data. The aeroelastic model is validated using numerical data available in the literature and then utilized to analyze the microtab control capability on the flutter instability case and the divergence instability case. The effectiveness of the microtab is investigated with the scenarios of different output controllers and actuation deployments for both instability cases. The numerical results show that the microtab can effectively suppress both vibration modes with the appropriate choice of the output feedback controller.

  3. Coupled incompressible Smoothed Particle Hydrodynamics model for continuum-based modelling sediment transport

    NASA Astrophysics Data System (ADS)

    Pahar, Gourabananda; Dhar, Anirban

    2017-04-01

A coupled solenoidal Incompressible Smoothed Particle Hydrodynamics (ISPH) model is presented for simulation of sediment displacement in an erodible bed. The coupled framework consists of two separate incompressible modules: (a) a granular module, (b) a fluid module. The granular module considers a friction-based rheology model to calculate deviatoric stress components from pressure. The module is validated for the Bagnold flow profile and two standardized test cases of sediment avalanching. The fluid module resolves fluid flow inside and outside the porous domain. An interaction force pair containing fluid pressure, viscous term and drag force acts as a bridge between the two flow modules. The coupled model is validated against three dam-break flow cases with different initial conditions of the movable bed. The simulated results are in good agreement with experimental data. A demonstrative case considering the effect of granular column failure under full/partial submergence highlights the capability of the coupled model for application in generalized scenarios.

  4. Numerical Investigation of Flapwise-Torsional Vibration Model of a Smart Section Blade with Microtab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Nailu; Balas, Mark J.; Yang, Hua

    2015-01-01

This study presents a method to develop an aeroelastic model of a smart section blade equipped with a microtab. The model is suitable for potential passive vibration control study of the blade section in classic flutter. Equations of the model are described by the nondimensional flapwise and torsional vibration modes coupled with the aerodynamic model based on the Theodorsen theory and aerodynamic effects of the microtab based on the wind tunnel experimental data. The aeroelastic model is validated using numerical data available in the literature and then utilized to analyze the microtab control capability on the flutter instability case and the divergence instability case. The effectiveness of the microtab is investigated with the scenarios of different output controllers and actuation deployments for both instability cases. The numerical results show that the microtab can effectively suppress both vibration modes with the appropriate choice of the output feedback controller.

  5. Statistical Analysis of Q-matrix Based Diagnostic Classification Models

    PubMed Central

    Chen, Yunxiao; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2014-01-01

    Diagnostic classification models have recently gained prominence in educational assessment, psychiatric evaluation, and many other disciplines. Central to the model specification is the so-called Q-matrix, which provides a qualitative specification of the item-attribute relationship. In this paper, we develop theories on the identifiability of the Q-matrix under the DINA and the DINO models. We further propose an estimation procedure for the Q-matrix through regularized maximum likelihood. The applicability of this procedure is not limited to the DINA or the DINO model; it can be applied to essentially all Q-matrix based diagnostic classification models. Simulation studies are conducted to illustrate its performance. Furthermore, two case studies are presented. The first is a data set on fraction subtraction (an educational application) and the second is a subsample of the National Epidemiological Survey on Alcohol and Related Conditions concerning social anxiety disorder (a psychiatric application). PMID:26294801
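    As a concrete illustration of the item-attribute mechanics central to these models, a minimal sketch of the DINA item response function (the two-attribute Q-matrix and the slip/guess values are hypothetical, not from the paper):

```python
import numpy as np

def dina_response_prob(alpha, Q, slip, guess):
    """P(correct) per item under the DINA model.

    alpha: (K,) binary attribute profile of one respondent.
    Q:     (J, K) binary Q-matrix; Q[j, k] = 1 means item j requires attribute k.
    slip, guess: (J,) item slip and guessing parameters (hypothetical values).
    """
    # eta[j] = 1 iff the respondent masters every attribute item j requires
    eta = np.all(alpha >= Q, axis=1).astype(float)
    return (1.0 - slip) ** eta * guess ** (1.0 - eta)

Q = np.array([[1, 0], [1, 1]])   # item 1 needs attribute 1; item 2 needs both
alpha = np.array([1, 0])         # respondent masters attribute 1 only
p = dina_response_prob(alpha, Q, np.array([0.1, 0.1]), np.array([0.2, 0.2]))
# item 1: all required attributes present -> 1 - slip = 0.9
# item 2: a required attribute missing   -> guess    = 0.2
```

    Identifiability questions then concern whether Q can be recovered from such response probabilities alone.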

  6. School-Based Cognitive-Behavioral Therapy for an Adolescent Presenting with ADHD and Explosive Anger: A Case Study

    ERIC Educational Resources Information Center

    Parker, Janise; Zaboski, Brian; Joyce-Beaulieu, Diana

    2016-01-01

    This case demonstrates the efficacy of utilizing an intensive, multi-faceted behavioral intervention paradigm. A comprehensive, integrative, school-based service model was applied to address attention deficit hyperactivity disorder symptomology, oppositional behaviors, and explosive anger at the secondary level. The case reviews a multi-modal…

  7. COLLABORATE©: a universal competency-based paradigm for professional case management, Part III: key considerations for making the paradigm shift.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2014-01-01

    The purpose of the third of this 3-article series is to provide context and justification for a new paradigm of case management built upon a value-driven foundation that * improves the patient's experience of health care delivery, * provides consistency in approach applicable across health care populations, and * optimizes the potential for return on investment. Applicable to all health care sectors where case management is practiced. Moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby or Pokey. This is exactly why the time has come for a competency-based case management model, one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. While there is inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  8. Agent-Based Modeling of Growth Processes

    ERIC Educational Resources Information Center

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  9. [Case finding in early prevention networks - a heuristic for ambulatory care settings].

    PubMed

    Barth, Michael; Belzer, Florian

    2016-06-01

    One goal of early prevention is the support of families with small children up to three years of age who are exposed to psychosocial risks. The identification of these cases is often complex and not well directed, especially in the ambulatory care setting. The aim is the development of a feasible, empirically based model for case finding in ambulatory care. Based on the risk factors of postpartal depression, lack of maternal responsiveness, parental stress with regulation disorders, and poverty, a lexicographic and non-compensatory heuristic model with simple decision rules is constructed and empirically tested. To this end, the original data set from an evaluation of the pediatric documentation form on psychosocial issues of families with small children in well-child visits is reanalyzed. The first diagnostic step in the non-compensatory, hierarchical classification process is the assessment of postpartal depression, followed by maternal responsiveness, parental stress, and poverty. The classification model identifies 89.0 % of the cases from the original study. Compared with the original study, the decision process becomes clearer and more concise. The evidence-based, data-driven model exemplifies a strategy for the assessment of psychosocial risk factors in ambulatory care settings. It is based on four evidence-based risk factors and offers a quick and reliable classification. A further advantage of this model is that once a risk factor is identified the diagnostic procedure stops and the counselling process can commence. For further validation of the model, studies in well-suited early prevention networks are needed.
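    The lexicographic, non-compensatory screening order described above can be sketched as a short decision routine that stops at the first risk factor found (the field names and risk coding are assumptions for illustration, not the study's instrument):

```python
def classify_family(case):
    """Lexicographic, non-compensatory screen: check risk factors in a fixed
    order and stop at the first one present; factors do not compensate for
    each other. Field names are hypothetical."""
    checks = [
        "postpartal_depression",
        "low_maternal_responsiveness",
        "parental_stress_with_regulation_disorder",
        "poverty",
    ]
    for field in checks:
        if case.get(field) is True:
            # Stop screening: a risk is identified, counselling can commence.
            return ("at_risk", field)
    return ("not_at_risk", None)

result = classify_family({"postpartal_depression": False, "poverty": True})
```

    Early stopping is what makes the heuristic fast: later factors are never assessed once an earlier one fires.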

  10. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 2-Hazard Modeling.

    PubMed

    Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia

    2018-04-25

    Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.

  11. Physician-owned Surgical Hospitals Outperform Other Hospitals in the Medicare Value-based Purchasing Program

    PubMed Central

    Ramirez, Adriana G; Tracci, Margaret C; Stukenborg, George J; Turrentine, Florence E; Kozower, Benjamin D; Jones, R Scott

    2016-01-01

    Background: The Hospital Value-Based Purchasing Program measures the value of care provided by participating Medicare hospitals while creating financial incentives for quality improvement and fostering increased transparency. Limited information is available comparing hospital performance across healthcare business models. Study Design: 2015 hospital Value-Based Purchasing Program results were used to examine hospital performance by business model. General linear modeling assessed differences in mean total performance score, hospital case mix index, and differences after adjustment for differences in hospital case mix index. Results: Of 3089 hospitals with Total Performance Scores (TPS), categories of representative healthcare business models included 104 Physician-Owned Surgical Hospitals (POSH), 111 University HealthSystem Consortium (UHC), 14 US News & World Report (USNWR) Honor Roll hospitals, 33 Kaiser Permanente, and 124 Pioneer Accountable Care Organization affiliated hospitals. Estimated mean TPS for POSH (64.4, 95% CI 61.83-66.38) and Kaiser (60.79, 95% CI 56.56-65.03) were significantly higher compared with all remaining hospitals, while UHC members (36.8, 95% CI 34.51-39.17) performed below the mean (p < 0.0001). Significant differences in mean hospital case mix index included POSH (mean = 2.32, p < 0.0001), USNWR honorees (mean = 2.24, p = 0.0140) and UHC members (mean = 1.99, p < 0.0001), while Kaiser Permanente hospitals had a lower case mix value (mean = 1.54, p < 0.0001). Re-estimation of TPS after adjustment for differences in hospital case mix index did not change the original results. Conclusions: The Hospital Value-Based Purchasing Program revealed superior hospital performance associated with business model. Closer inspection of high-value hospitals may guide value improvement and policy-making decisions for all Medicare Value-Based Purchasing Program hospitals. PMID:27502368

  12. Development of Modified Incompressible Ideal Gas Model for Natural Draft Cooling Tower Flow Simulation

    NASA Astrophysics Data System (ADS)

    Hyhlík, Tomáš

    2018-06-01

    The article deals with the development of an incompressible-ideal-gas-like model, which can be used as part of a mathematical model describing natural draft wet-cooling tower flow, heat and mass transfer. It is shown, based on the results of a complex mathematical model of natural draft wet-cooling tower flow, that the behaviour of pressure, temperature and density is very similar to the case of hydrostatics of moist air, where heat and mass transfer in the fill zone must be taken into account. The behaviour inside the cooling tower is documented using density, pressure and temperature distributions. The proposed equation for the density is based on the same idea as the incompressible ideal gas model, but is dependent only on temperature, specific humidity and, in this case, elevation. It is shown that the normalized difference between the density based on the proposed model and the density based on the non-simplified model is of the order of 10^-4. The classical incompressible ideal gas model, the Boussinesq model and a generalised Boussinesq model are also tested; these models show deviations on the order of percent.
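    A minimal sketch of such a density relation, dependent only on temperature, specific humidity and elevation: the pressure is replaced by a fixed hydrostatic reference profile rather than the computed field. The reference values and the linearized pressure profile below are illustrative assumptions, not the article's exact formulation:

```python
# Incompressible-ideal-gas-like density rho(T, w, z): hypothetical sketch.
R_DRY = 287.058   # J/(kg K), dry air
R_VAP = 461.495   # J/(kg K), water vapour
G = 9.81          # m/s^2

def gas_constant_moist(w):
    """Specific gas constant of moist air for specific humidity w."""
    return R_DRY * (1.0 - w) + R_VAP * w

def density(T, w, z, p0=101325.0, rho0=1.2):
    """rho = p_ref(z) / (R_m(w) * T), with a fixed linearized hydrostatic
    reference pressure p_ref(z) = p0 - rho0 * g * z (assumption)."""
    p_ref = p0 - rho0 * G * z
    return p_ref / (gas_constant_moist(w) * T)
```

    Because the pressure profile is prescribed, the density varies only with the local temperature, humidity and elevation, mirroring the incompressible ideal gas idea.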

  13. An Intelligent Case-Based Help Desk Providing Web-Based Support for EOSDIS Customers

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.; Thurman, David A.

    1998-01-01

    This paper describes a project that extends the concept of help desk automation by offering World Wide Web access to a case-based help desk. It explores the use of case-based reasoning and cognitive engineering models to create an 'intelligent' help desk system, one that learns. It discusses the AutoHelp architecture for such a help desk and summarizes the technologies used to create a help desk for NASA data users.
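    The core retrieval step of such a case-based help desk can be sketched as nearest-case lookup by feature overlap; the feature scheme and the stored cases below are hypothetical, not AutoHelp's actual representation:

```python
def retrieve(query, case_base):
    """Return the stored case sharing the most symptom features with the
    query; its recorded solution is then reused or adapted."""
    def overlap(case):
        return len(set(query["symptoms"]) & set(case["symptoms"]))
    return max(case_base, key=overlap)

# Hypothetical past support cases (a real system would also learn new ones).
case_base = [
    {"symptoms": {"login_failure", "timeout"}, "solution": "reset session token"},
    {"symptoms": {"corrupt_download"}, "solution": "re-request granule"},
]
best = retrieve({"symptoms": {"timeout"}}, case_base)
```

    "Learning" in this paradigm amounts to appending each newly solved case to `case_base`, so later queries can match it.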

  14. The oral case presentation: toward a performance-based rhetorical model for teaching and learning.

    PubMed

    Chan, Mei Yuit

    2015-01-01

    The oral case presentation is an important communicative activity in the teaching and assessment of students. Despite its importance, not much attention has been paid to providing support for teachers to teach this difficult task to medical students, who are novices to this form of communication. As a formalized piece of talk that takes a regularized form and is used for a specific communicative goal, the case presentation is regarded as a rhetorical activity, and awareness of its rhetorical and linguistic characteristics should be given due consideration in teaching. This paper reviews the practitioner literature and the limited research literature on medical educators' expectations of what makes a good case presentation, and explains the rhetorical aspect of the activity. It finds that there is currently no comprehensive model of the case presentation that captures the rhetorical and linguistic skills needed to produce and deliver a good presentation. Attempts to describe the structure of the case presentation have used predominantly opinion-based methodologies. In this paper, I argue for a performance-based model that would not only allow a description of the rhetorical structure of the oral case presentation, but also enable a systematic examination of the tacit genre knowledge that differentiates the expert from the novice. Such a model will be a useful resource for medical educators in providing more structured feedback and teaching support to medical students learning this important genre.

  15. The oral case presentation: toward a performance-based rhetorical model for teaching and learning

    PubMed Central

    Chan, Mei Yuit

    2015-01-01

    The oral case presentation is an important communicative activity in the teaching and assessment of students. Despite its importance, not much attention has been paid to providing support for teachers to teach this difficult task to medical students, who are novices to this form of communication. As a formalized piece of talk that takes a regularized form and is used for a specific communicative goal, the case presentation is regarded as a rhetorical activity, and awareness of its rhetorical and linguistic characteristics should be given due consideration in teaching. This paper reviews the practitioner literature and the limited research literature on medical educators' expectations of what makes a good case presentation, and explains the rhetorical aspect of the activity. It finds that there is currently no comprehensive model of the case presentation that captures the rhetorical and linguistic skills needed to produce and deliver a good presentation. Attempts to describe the structure of the case presentation have used predominantly opinion-based methodologies. In this paper, I argue for a performance-based model that would not only allow a description of the rhetorical structure of the oral case presentation, but also enable a systematic examination of the tacit genre knowledge that differentiates the expert from the novice. Such a model will be a useful resource for medical educators in providing more structured feedback and teaching support to medical students learning this important genre. PMID:26194482

  16. On the application of semantic technologies to the domain of forensic investigations in financial crimes

    NASA Astrophysics Data System (ADS)

    Scheidat, Tobias; Merkel, Ronny; Krummel, Volker; Gerlach, Andreas; Weisensee, Michala; Zeihe, Jana; Dittmann, Jana

    2017-10-01

    In daily police practice, forensic investigation of criminal cases is mainly based on manual work and the experience of individual forensic experts, using basic storage and data processing technologies. However, an individual criminal case does not consist only of the actual offence, but also of a variety of different aspects involved. For example, in order to solve a financial criminal case, an investigator has to find interrelations between different case entities as well as to other cases. The required information about these different entities is often stored in various databases and mostly has to be manually requested and processed by forensic investigators. We propose the application of semantic technologies to the domain of forensic investigations, using the example of financial crimes. Such a combination allows for modelling specific case entities and their interrelations within and between cases. As a result, an explorative search for connections between case entities in the scope of an investigation, as well as an automated derivation of conclusions from an established fact base, is enabled. The proposed model is presented in the form of a crime field ontology, based on different types of knowledge obtained from three individual sources: open source intelligence, forensic investigators and captive interviews of detained criminals. The modelled crime field ontology is illustrated with two examples, using the well-known crime type of explosive attacks on ATMs and the potentially upcoming crime type of data theft by NFC crowd skimming. For these criminal modi operandi, anonymized fictional cases are modelled, visualized and exploratively searched. Modelled case entities include modi operandi, events, actors, resources, exploited weaknesses, as well as flows of money, data and know-how. The potential exploration of interrelations between the different case entities of such examples is illustrated in the scope of a fictitious investigation, highlighting the potential of the approach.

  17. Two-year outcome of team-based intensive case management for patients with schizophrenia.

    PubMed

    Aberg-Wistedt, A; Cressell, T; Lidberg, Y; Liljenberg, B; Osby, U

    1995-12-01

    Two-year outcomes of patients with schizophrenic disorders who were assigned to an intensive, team-based case management program and patients who received standard psychiatric services were assessed. The case management model featured increased staff contact time with patients, rehabilitation plans based on patients' expressed needs, and patients' attendance at team meetings where their rehabilitation plan was discussed. Forty patients were randomly assigned to either the case management group or the control group that received standard services. Patients' use of emergency and inpatient services, their quality of life, the size of their social networks, and their relatives' burden of care were assessed at assignment to the study groups and at two-year follow-up. Patients in the case management group had significantly fewer emergency visits compared with the two years before the study, and their relatives reported significantly reduced burden of care associated with relationships with psychiatric services over the two-year period. The size of patients' social networks increased for the case management group and decreased for the control group. A team-based intensive case management model is an effective intervention in the rehabilitation of patients with chronic schizophrenia.

  18. Enhancing Large-Group Problem-Based Learning in Veterinary Medical Education.

    ERIC Educational Resources Information Center

    Pickrell, John A.

    This project for large-group, problem-based learning at Kansas State University College of Veterinary Medicine developed 47 case-based videotapes that are used to model clinical conditions, and also involved veterinary practitioners in formulating true practice cases into student learning opportunities. Problem-oriented, computer-assisted diagnostic…

  19. A systematic review and qualitative analysis to inform the development of a new emergency department-based geriatric case management model.

    PubMed

    Sinha, Samir K; Bessman, Edward S; Flomenbaum, Neal; Leff, Bruce

    2011-06-01

    We inform the future development of a new geriatric emergency management practice model. We perform a systematic review of the existing evidence for emergency department (ED)-based case management models designed to improve the health, social, and health service utilization outcomes for noninstitutionalized older patients within the context of an index ED visit. This was a systematic review of English-language articles indexed in MEDLINE and CINAHL (1966 to 2010), describing ED-based case management models for older adults. Bibliographies of the retrieved articles were reviewed to identify additional references. A systematic qualitative case study analytic approach was used to identify the core operational components and outcome measures of the described clinical interventions. The authors of the included studies were also invited to verify our interpretations of their work. The determined patterns of component adherence were then used to postulate the relative importance and effect of the presence or absence of a particular component in influencing the overall effectiveness of their respective interventions. Eighteen of 352 studies (reported in 20 articles) met study criteria. Qualitative analyses identified 28 outcome measures and 8 distinct model characteristic components that included having an evidence-based practice model, nursing clinical involvement or leadership, high-risk screening processes, focused geriatric assessments, the initiation of care and disposition planning in the ED, interprofessional and capacity-building work practices, post-ED discharge follow-up with patients, and evaluation and monitoring processes. Of the 15 positive study results, 6 had all 8 characteristic components and 9 were found to be lacking at least 1 component. Two studies with positive results lacked 2 characteristic components and none lacked more than 2 components. 
Of the 3 studies with negative results demonstrating no positive effects based on any outcome tested, one lacked 2, one lacked 3, and one lacked 4 of the 8 model components. Successful models of ED-based case management models for older adults share certain key characteristics. This study builds on the emerging literature in this area and leverages the differences in these models and their associated outcomes to support the development of an evidence-based normative and effective geriatric emergency management practice model designed to address the special care needs and thereby improve the health and health service utilization outcomes of older patients. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  20. Achieving Accreditation Council for Graduate Medical Education duty hours compliance within advanced surgical training: a simulation-based feasibility assessment.

    PubMed

    Obi, Andrea; Chung, Jennifer; Chen, Ryan; Lin, Wandi; Sun, Siyuan; Pozehl, William; Cohn, Amy M; Daskin, Mark S; Seagull, F Jacob; Reddy, Rishindra M

    2015-11-01

    Certain operative cases occur unpredictably and/or have long operative times, creating a conflict between Accreditation Council for Graduate Medical Education (ACGME) rules and adequate training experience. A ProModel-based simulation was developed from historical data. Probabilistic distributions of operative time were calculated and combined with an ACGME-compliant call schedule. For the advanced surgical cases modeled (cardiothoracic transplants), the 80-hour rule was violated at a rate of 6.07% and the minimum number of days off was violated at a rate of 22.50%. There was a 36% chance of failure to fulfill at least one minimum case requirement (heart or lung) despite adequate volume. The variable nature of emergency cases inevitably leads to work hour violations under ACGME regulations. Unpredictable cases mandate higher operative volume to ensure achievement of adequate caseloads. Publicly available simulation technology provides a valuable avenue to identify adequacy of case volumes for trainees in both the elective and emergency settings. Copyright © 2015 Elsevier Inc. All rights reserved.
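    The feasibility question lends itself to a simple Monte Carlo sketch: draw weekly emergency-case workloads from an operative-time distribution and count 80-hour violations. The lognormal parameters, base hours, and caseload below are hypothetical placeholders, not the article's calibrated ProModel inputs:

```python
import random

def violation_rate(n_weeks=10000, base_hours=60.0, cases_per_week=2,
                   mu=1.5, sigma=0.6, seed=42):
    """Fraction of simulated weeks exceeding the 80-hour limit.

    base_hours: scheduled non-operative duty hours per week (assumption).
    Each emergency case adds a lognormal operative time (hours)."""
    rng = random.Random(seed)
    violations = 0
    for _ in range(n_weeks):
        op_hours = sum(rng.lognormvariate(mu, sigma)
                       for _ in range(cases_per_week))
        if base_hours + op_hours > 80.0:
            violations += 1
    return violations / n_weeks
```

    Sweeping `cases_per_week` or the distribution parameters shows how heavier or more variable caseloads trade off compliance against case-volume requirements.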

  1. Applying the compound Poisson process model to the reporting of injury-related mortality rates.

    PubMed

    Kegler, Scott R

    2007-02-16

    Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
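    The adjustment idea, variance driven by squared incident sizes rather than the raw case count, can be sketched as a simplified Wald-type interval (this is an illustration of the principle, not the paper's exact estimators):

```python
import math

def rate_ci(incident_sizes, population, z=1.96):
    """Mortality rate per 100,000 with a compound-Poisson-adjusted interval.

    incident_sizes: number of deaths in each incident. When every incident
    has size 1, the variance term reduces to the case count and the familiar
    Poisson-based interval is recovered."""
    n = sum(incident_sizes)
    var = sum(s * s for s in incident_sizes)   # compound-Poisson variance of N
    rate = 1e5 * n / population
    half = 1e5 * z * math.sqrt(var) / population
    return rate, rate - half, rate + half

single = rate_ci([1] * 10, 1_000_000)   # ten single-fatality incidents
paired = rate_ci([2] * 5, 1_000_000)    # same death count, clustered in pairs
```

    Both scenarios give the same point estimate, but clustering the deaths into multi-fatality incidents widens the interval.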

  2. A Case Analysis of a Model Program for the Leadership Development of Women Faculty and Staff Seeking to Advance Their Careers in Higher Education

    ERIC Educational Resources Information Center

    Calizo, Lee Scherer Hawthorne

    2011-01-01

    The purpose of this case study was to explore a model of leadership development for women faculty and staff in higher education. This study is significant because it explored the only identified campus-based program open to both faculty and staff. The campus-based Women's Institute for Leadership Development (WILD) program at the University of…

  3. Appendix 2: Risk-based framework and risk case studies. Risk Assessment for two bird species in northern Wisconsin.

    Treesearch

    Megan M. Friggens; Stephen N. Matthews

    2012-01-01

    Species distribution models for 147 bird species have been derived using climate, elevation, and distribution of current tree species as potential predictors (Matthews et al. 2011). In this case study, a risk matrix was developed for two bird species (fig. A2-5), with projected change in bird habitat (the x axis) based on models of changing suitable habitat resulting...

  4. Case-based Influence in Conflict Management

    DTIC Science & Technology

    2014-10-31

    AFRL-OSR-VA-TR-2014-0337: Case-Based Influence in Conflict Management. Final Report, 10/31/2014. Grant FA9550-10-1-0373; PI: Dr. Robert Axelrod; PD: Dr. Richard Davis. ARTIS Research & Risk Modeling, 5741 Canyon Ridge North, Cave Creek, AZ 85331-9318. The project included an analysis of the timing of cyber conflict that quickly received attention from over 30 countries.

  5. Life cycle assessment based environmental impact estimation model for pre-stressed concrete beam bridge in the early design phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kyong Ju, E-mail: kjkim@cau.ac.kr; Yun, Won Gun, E-mail: ogun78@naver.com; Cho, Namho, E-mail: nhc51@cau.ac.kr

    The recent rise in global concern for environmental issues such as global warming and air pollution is accentuating the need for environmental assessments in the construction industry. Promptly evaluating the environmental loads of the various design alternatives during the early stages of a construction project and adopting the most environmentally sustainable candidate is therefore of great importance. Yet, research on the early evaluation of a construction project's environmental load to aid the decision-making process is hitherto lacking. In light of this dilemma, this study proposes a model for estimating the environmental load by employing only the most basic information accessible during the early design phases of a project for the pre-stressed concrete (PSC) beam bridge, the most common bridge structure. Firstly, a life cycle assessment (LCA) was conducted on the data from 99 bridges by integrating the bills of quantities (BOQ) with a life cycle inventory (LCI) database. The processed data were then utilized to construct a case-based reasoning (CBR) model for estimating the environmental load. The accuracy of the estimation model was then validated using five test cases; the model's mean absolute error rate (MAER) for the total environmental load was calculated as 7.09%. These test results were shown to be superior to those obtained from a multiple-regression-based model and a slab-area base-unit analysis model. Henceforth, application of this model during the early stages of a project is expected to greatly complement environmentally friendly design and construction by facilitating the swift evaluation of the environmental load from multiple standpoints. - Highlights: • This study develops a model for assessing environmental impacts based on LCA. • Bills of quantities from completed PSC beam bridge designs were linked with the LCI DB. • Previous cases are used to estimate the environmental load of a new case via a CBR model. • The CBR model produces more accurate estimations (MAER 7.09%) than other conventional models. • This study supports the decision-making process in the early stage of a new construction case.
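    The reported accuracy measure is a mean absolute error rate; a minimal sketch of its computation (the sample load values are hypothetical):

```python
def maer(actual, predicted):
    """Mean absolute error rate (%): average of |actual - predicted| / actual
    over the test cases, as a percentage."""
    errs = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 100.0 * sum(errs) / len(errs)

# Hypothetical environmental loads for two test bridges.
score = maer([100.0, 200.0], [110.0, 180.0])
```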

  6. Gene Model Annotations for Drosophila melanogaster: The Rule-Benders

    PubMed Central

    Crosby, Madeline A.; Gramates, L. Sian; dos Santos, Gilberto; Matthews, Beverley B.; St. Pierre, Susan E.; Zhou, Pinglei; Schroeder, Andrew J.; Falls, Kathleen; Emmert, David B.; Russo, Susan M.; Gelbart, William M.

    2015-01-01

    In the context of the FlyBase annotated gene models in Drosophila melanogaster, we describe the many exceptional cases we have curated from the literature or identified in the course of FlyBase analysis. These range from atypical but common examples such as dicistronic and polycistronic transcripts, noncanonical splices, trans-spliced transcripts, noncanonical translation starts, and stop-codon readthroughs, to single exceptional cases such as ribosomal frameshifting and HAC1-type intron processing. In FlyBase, exceptional genes and transcripts are flagged with Sequence Ontology terms and/or standardized comments. Because some of the rule-benders create problems for handlers of high-throughput data, we discuss plans for flagging these cases in bulk data downloads. PMID:26109356

  7. Estimation of inlet flow rates for image-based aneurysm CFD models: where and how to begin?

    PubMed

    Valen-Sendstad, Kristian; Piccinelli, Marina; KrishnankuttyRema, Resmi; Steinman, David A

    2015-06-01

    Patient-specific flow rates are rarely available for image-based computational fluid dynamics models. Instead, flow rates are often assumed to scale according to the diameters of the arteries of interest. Our goal was to determine how choice of inlet location and scaling law affect such model-based estimation of inflow rates. We focused on 37 internal carotid artery (ICA) aneurysm cases from the Aneurisk cohort. An average ICA flow rate of 245 mL/min was assumed from the literature, and then rescaled for each case according to its inlet diameter squared (assuming a fixed velocity) or cubed (assuming a fixed wall shear stress). Scaling was based on diameters measured at various consistent anatomical locations along the models. Choice of location introduced a modest 17% average uncertainty in model-based flow rate, but within individual cases estimated flow rates could vary by >100 mL/min. A square law was found to be more consistent with physiological flow rates than a cube law. Although impact of parent artery truncation on downstream flow patterns is well studied, our study highlights a more insidious and potentially equal impact of truncation site and scaling law on the uncertainty of assumed inlet flow rates and thus, potentially, downstream flow patterns.
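    The diameter-based rescaling described above reduces to a one-line power law: a square law holds inlet velocity fixed, a cube law holds wall shear stress fixed. The reference diameter below is a hypothetical value, not a cohort statistic from the paper:

```python
def scaled_flow(d_mm, d_ref_mm=4.8, q_ref=245.0, exponent=2):
    """Flow rate in mL/min scaled from an assumed average ICA flow q_ref
    by (d / d_ref) ** exponent (exponent 2 = fixed velocity,
    exponent 3 = fixed wall shear stress). d_ref_mm is an assumption."""
    return q_ref * (d_mm / d_ref_mm) ** exponent

q_square = scaled_flow(5.8, exponent=2)
q_cube = scaled_flow(5.8, exponent=3)
```

    For an inlet wider than the reference, the cube law inflates the assumed flow faster than the square law, which is one way the choice of scaling law propagates into downstream uncertainty.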

  8. Educating resident physicians using virtual case-based simulation improves diabetes management: a randomized controlled trial.

    PubMed

    Sperl-Hillen, JoAnn; O'Connor, Patrick J; Ekstrom, Heidi L; Rush, William A; Asche, Stephen E; Fernandes, Omar D; Appana, Deepika; Amundson, Gerald H; Johnson, Paul E; Curran, Debra M

    2014-12-01

    To test a virtual case-based Simulated Diabetes Education intervention (SimDE) developed to teach primary care residents how to manage diabetes. Nineteen primary care residency programs, with 341 volunteer residents in all postgraduate years (PGY), were randomly assigned to a SimDE intervention group or control group (CG). The Web-based interactive educational intervention used computerized virtual patients who responded to provider actions through programmed simulation models. Eighteen distinct learning cases (L-cases) were assigned to SimDE residents over six months from 2010 to 2011. Impact was assessed using performance on four virtual assessment cases (A-cases), an objective knowledge test, and pre-post changes in self-assessed diabetes knowledge and confidence. Group comparisons were analyzed using generalized linear mixed models, controlling for clustering of residents within residency programs and differences in baseline knowledge. The percentages of residents appropriately achieving A-case composite clinical goals for glucose, blood pressure, and lipids were as follows: A-case 1: SimDE = 21.2%, CG = 1.8%, P = .002; A-case 2: SimDE = 15.7%, CG = 4.7%, P = .02; A-case 3: SimDE = 48.0%, CG = 10.4%, P < .001; and A-case 4: SimDE = 42.1%, CG = 18.7%, P = .004. The mean knowledge score and pre-post changes in self-assessed knowledge and confidence were significantly better for the SimDE group than for CG participants. A virtual case-based simulated diabetes education intervention improved diabetes management skills, knowledge, and confidence for primary care residents.

  9. Comparing microscopic activity-based and traditional models of travel demand : an Austin area case study

    DOT National Transportation Integrated Search

    2007-09-01

    Two competing approaches to travel demand modeling exist today. The more traditional 4-step travel demand models rely on aggregate demographic data at a traffic analysis zone (TAZ) level. Activity-based microsimulation methods employ more robust...

  10. Case Studies' Effect on Undergraduates' Achievement, Attitudes, and Team Shared Mental Models in Educational Psychology

    ERIC Educational Resources Information Center

    Razzouk, Rim; Johnson, Tristan E.

    2013-01-01

    The purpose of this study was to examine the effect of case studies on learning outcomes, attitudes toward instruction, and team shared mental models (SMM) in a team-based learning environment in an undergraduate educational psychology course. Approximately 105 students who participated in this study were randomly assigned to either a case-study…

  11. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    PubMed

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half of the explained variance, in comparison with the variance related to case specificity. Pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and new evidence supports the idea that a substantial amount of variance is attributable to both aspects of performance. It can be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  12. DAMS: A Model to Assess Domino Effects by Using Agent-Based Modeling and Simulation.

    PubMed

    Zhang, Laobing; Landucci, Gabriele; Reniers, Genserik; Khakzad, Nima; Zhou, Jianfeng

    2017-12-19

    Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases. © 2017 Society for Risk Analysis.
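    As a rough illustration of the bottom-up, agent-based view described above, the following minimal sketch models installations as agents that ignite when the heat radiation received from burning neighbors exceeds a threshold. The point-source radiation formula, thresholds, source intensity, and layout are all invented assumptions, not the paper's calibrated physics:

    ```python
    import math

    # Minimal agent-based sketch of domino-effect propagation. All numbers
    # (thresholds, positions, source power) are illustrative assumptions.

    class Installation:
        def __init__(self, name, x, y, threshold_kw_m2):
            self.name = name
            self.x, self.y = x, y
            self.threshold = threshold_kw_m2  # failure threshold for received heat
            self.burning = False

        def radiation_at(self, other, source_kw=50000.0):
            """Idealized point-source radiation received by `other` (inverse-square)."""
            d2 = (self.x - other.x) ** 2 + (self.y - other.y) ** 2
            return source_kw / (4 * math.pi * d2)

    def simulate(units, steps=10):
        """Agent rule: an intact unit ignites when total received heat >= threshold."""
        for _ in range(steps):
            ignited = [
                u for u in units if not u.burning
                and sum(s.radiation_at(u) for s in units if s.burning) >= u.threshold
            ]
            if not ignited:
                break
            for u in ignited:
                u.burning = True
        return [u.name for u in units if u.burning]

    tanks = [Installation("T1", 0, 0, 10), Installation("T2", 10, 0, 15),
             Installation("T3", 30, 0, 15)]
    tanks[0].burning = True  # primary accident
    print(simulate(tanks))   # escalation reaches T2 but not the distant T3
    ```

    The escalation stops at T2 because T3's received heat, even with two burning sources, stays just below its threshold; this distance-dependent propagation is what the bottom-up agent rules capture.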

  13. Transient thermal and stress analysis of maxillary second premolar tooth using an exact three-dimensional model.

    PubMed

    Hashemipour, Maryam Alsadat; Mohammadpour, Ali; Nassab, Seiied Abdolreza Gandjalikhan

    2010-01-01

    In this paper, the temperature and stress distributions in an exact 3D model of a restored maxillary second premolar tooth are obtained with a finite element approach. Carious teeth need to be restored with appropriate restorative materials. Many restorative materials can be used in place of tooth structure; because tooth structure is being replaced, the restorative material should resemble the original structure as closely as possible. In the present study, a Mesial Occlusal Distal (MOD) type of restoration is chosen and applied to a sound tooth model. Four cases of restoration are investigated: two in which a base is used under the restorative material and two in which the base is omitted. The restorative materials are amalgam and composite, and glass-ionomer is used as the base material. Modeling is done in the SolidWorks environment by means of exact measurements of typical human tooth dimensions. Tooth behavior under thermal load due to consuming hot liquids is analyzed by means of a three-dimensional finite element method using ANSYS software. The highest values of tensile and compressive stress are compared with the tensile and compressive strength of the tooth and restorative materials, and the shear stress at the tooth-restoration junction is compared with the bond strength. A sound tooth under the same thermal load is also analyzed, and the results are compared with those obtained for the restored models. Temperature and stress distributions in the tooth are calculated for each case, with special consideration of the vicinity of the pulp and the restoration region. Numerical results show that in the two amalgam cases, using the base material (glass-ionomer) under the restorative material decreases the maximum temperature in the restored tooth. The stress analysis shows that the principal stress reaches its maximum values in the composite restorations. The maximum temperatures are found in the case of amalgam restoration without a base. In addition, restoration is found to have no influence on the stress values at the DEJ: for all cases, these values are close to the sound-tooth results.

  14. iHelp: an intelligent online helpdesk system.

    PubMed

    Wang, Dingding; Li, Tao; Zhu, Shenghuo; Gong, Yihong

    2011-02-01

    Due to the importance of high-quality customer service, many companies use intelligent helpdesk systems (e.g., case-based systems) to improve customer service quality. However, these systems face two challenges: 1) case retrieval measures: most case-based systems use traditional keyword-matching-based ranking schemes for case retrieval and have difficulty capturing the semantic meanings of cases; and 2) result representation: most case-based systems return a list of past cases ranked by their relevance to a new request, and customers have to go through the list and examine the cases one by one to identify their desired cases. To address these challenges, we develop iHelp, an intelligent online helpdesk system, to automatically find problem-solution patterns from past customer-representative interactions. When a new customer request arrives, iHelp searches and ranks the past cases based on their semantic relevance to the request, groups the relevant cases into different clusters using a mixture language model and symmetric matrix factorization, and summarizes each case cluster to generate recommended solutions. Case and user studies have been conducted to show the full functionality and effectiveness of iHelp.
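    The retrieval side of such a system can be illustrated with a toy ranking sketch. Plain term-frequency cosine similarity stands in for iHelp's actual semantic ranking (which uses a mixture language model), and the helpdesk cases below are invented:

    ```python
    import math
    from collections import Counter

    # Toy case retrieval: rank past helpdesk cases by cosine similarity of
    # term-frequency vectors against a new request. This is a simplified
    # stand-in for semantic ranking; the cases are invented examples.

    def tf_cosine(a, b):
        va, vb = Counter(a.lower().split()), Counter(b.lower().split())
        dot = sum(va[t] * vb[t] for t in va)
        na = math.sqrt(sum(c * c for c in va.values()))
        nb = math.sqrt(sum(c * c for c in vb.values()))
        return dot / (na * nb) if na and nb else 0.0

    cases = ["printer driver install fails on windows",
             "email sync error after password change",
             "cannot install printer driver update"]
    request = "printer driver install error"
    ranked = sorted(cases, key=lambda c: tf_cosine(request, c), reverse=True)
    print(ranked[0])
    ```

    A keyword-overlap scheme like this is exactly what the abstract says struggles with semantics: a case phrased as "hardcopy output device setup problem" would score zero here despite being relevant, which motivates the language-model-based ranking.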

  15. A continuum mechanics-based musculo-mechanical model for esophageal transport

    NASA Astrophysics Data System (ADS)

    Kou, Wenjun; Griffith, Boyce E.; Pandolfino, John E.; Kahrilas, Peter J.; Patankar, Neelesh A.

    2017-11-01

    In this work, we extend our previous esophageal transport model, which used an immersed boundary (IB) method with a discrete fiber-based structural model, to one using a continuum mechanics-based model that is approximated with finite elements (IB-FE). To deal with the leakage of flow when the Lagrangian mesh becomes coarser than the fluid mesh, we employ adaptive interaction quadrature points for the Lagrangian-Eulerian interaction equations, based on previous work (Griffith and Luo [1]). In particular, we introduce a new anisotropic adaptive interaction quadrature rule. The new rule permits us to vary the interaction quadrature points not only at each time step and element but also at different orientations per element. This helps to avoid the leakage issue without sacrificing computational efficiency or accuracy in handling the interaction equations. For the material model, we extend our previous fiber-based model to a continuum-based model, and we present formulations for general fiber-reinforced material models in the IB-FE framework. The new material model can handle nonlinear elasticity and fiber-matrix interactions, and thus permits us to consider more realistic material behavior of biological tissues. To validate our method, we first study a case in which a three-dimensional short tube is dilated. Results on the pressure-displacement relationship and the stress distribution match very well with those obtained from the implicit FE method. We remark that in our IB-FE case, the three-dimensional tube undergoes a very large deformation and the Lagrangian mesh size becomes about six times the Eulerian mesh size in the circumferential orientation. To validate the performance of the method in handling fiber-matrix material models, we perform a second study on dilating a long fiber-reinforced tube. Errors are small when we compare numerical solutions with analytical solutions. The technique is then applied to the problem of esophageal transport. We use two fiber-reinforced models for the esophageal tissue: a bi-linear model and an exponential model. We present three cases of esophageal transport that differ in the material model and the muscle fiber architecture. The overall transport features are consistent with those observed from the previous model. We remark that the continuum-based model can handle more realistic and complicated material behavior. This is demonstrated in our third case, where a spatially varying fiber architecture is included based on experimental study. We find that this unique muscle fiber architecture can generate a so-called pressure transition zone, a luminal pressure pattern that is of clinical interest. This suggests an important role of muscle fiber architecture in esophageal transport.

  16. Case studies, cross-site comparisons, and the challenge of generalization: comparing agent-based models of land-use change in frontier regions

    PubMed Central

    Parker, Dawn C.; Entwisle, Barbara; Rindfuss, Ronald R.; Vanwey, Leah K.; Manson, Steven M.; Moran, Emilio; An, Li; Deadman, Peter; Evans, Tom P.; Linderman, Marc; Rizi, S. Mohammad Mussavi; Malanson, George

    2009-01-01

    Cross-site comparisons of case studies have been identified as an important priority by the land-use science community. From an empirical perspective, such comparisons potentially allow generalizations that may contribute to production of global-scale land-use and land-cover change projections. From a theoretical perspective, such comparisons can inform development of a theory of land-use science by identifying potential hypotheses and supporting or refuting evidence. This paper undertakes a structured comparison of four case studies of land-use change in frontier regions that follow an agent-based modeling approach. Our hypothesis is that each case study represents a particular manifestation of a common process. Given differences in initial conditions among sites and the time at which the process is observed, actual mechanisms and outcomes are anticipated to differ substantially between sites. Our goal is to reveal both commonalities and differences among research sites, model implementations, and ultimately, conclusions derived from the modeling process. PMID:19960107

  17. Case studies, cross-site comparisons, and the challenge of generalization: comparing agent-based models of land-use change in frontier regions.

    PubMed

    Parker, Dawn C; Entwisle, Barbara; Rindfuss, Ronald R; Vanwey, Leah K; Manson, Steven M; Moran, Emilio; An, Li; Deadman, Peter; Evans, Tom P; Linderman, Marc; Rizi, S Mohammad Mussavi; Malanson, George

    2008-01-01

    Cross-site comparisons of case studies have been identified as an important priority by the land-use science community. From an empirical perspective, such comparisons potentially allow generalizations that may contribute to production of global-scale land-use and land-cover change projections. From a theoretical perspective, such comparisons can inform development of a theory of land-use science by identifying potential hypotheses and supporting or refuting evidence. This paper undertakes a structured comparison of four case studies of land-use change in frontier regions that follow an agent-based modeling approach. Our hypothesis is that each case study represents a particular manifestation of a common process. Given differences in initial conditions among sites and the time at which the process is observed, actual mechanisms and outcomes are anticipated to differ substantially between sites. Our goal is to reveal both commonalities and differences among research sites, model implementations, and ultimately, conclusions derived from the modeling process.

  18. Dynamic Forecasting of Zika Epidemics Using Google Trends

    PubMed Central

    Jin, Yuan; Huang, Yong; Lin, Baihan; An, Xiaoping; Feng, Dan; Tong, Yigang

    2017-01-01

    We developed a dynamic forecasting model for Zika virus (ZIKV) based on real-time online search data from Google Trends (GTs). It was designed to provide Zika virus disease (ZVD) surveillance and detection for health departments, along with predicted numbers of infection cases, allowing them sufficient time to implement interventions. In this study, we found a strong correlation between Zika-related GTs and the cumulative numbers of reported cases (confirmed, suspected, and total cases; p<0.001). We then used the correlation data from Zika-related online searches in GTs and ZIKV epidemics between 12 February and 20 October 2016 to construct an autoregressive integrated moving average (ARIMA) model (0, 1, 3) for the dynamic estimation of ZIKV outbreaks. The forecasting results indicated that the data predicted by the ARIMA model, which used the online search data as an external regressor to enhance the forecasting model and assist the historical epidemic data in improving the quality of the predictions, are quite similar to the actual data during the ZIKV epidemic in early November 2016. Integer-valued autoregression provides a useful base predictive model for ZVD cases. This is enhanced by the incorporation of GTs data, confirming the prognostic utility of search-query-based surveillance. This accessible and flexible dynamic forecast model could be used in the monitoring of ZVD to provide advance warning of future ZIKV outbreaks. PMID:28060809

  19. Dynamic Forecasting of Zika Epidemics Using Google Trends.

    PubMed

    Teng, Yue; Bi, Dehua; Xie, Guigang; Jin, Yuan; Huang, Yong; Lin, Baihan; An, Xiaoping; Feng, Dan; Tong, Yigang

    2017-01-01

    We developed a dynamic forecasting model for Zika virus (ZIKV) based on real-time online search data from Google Trends (GTs). It was designed to provide Zika virus disease (ZVD) surveillance and detection for health departments, along with predicted numbers of infection cases, allowing them sufficient time to implement interventions. In this study, we found a strong correlation between Zika-related GTs and the cumulative numbers of reported cases (confirmed, suspected, and total cases; p<0.001). We then used the correlation data from Zika-related online searches in GTs and ZIKV epidemics between 12 February and 20 October 2016 to construct an autoregressive integrated moving average (ARIMA) model (0, 1, 3) for the dynamic estimation of ZIKV outbreaks. The forecasting results indicated that the data predicted by the ARIMA model, which used the online search data as an external regressor to enhance the forecasting model and assist the historical epidemic data in improving the quality of the predictions, are quite similar to the actual data during the ZIKV epidemic in early November 2016. Integer-valued autoregression provides a useful base predictive model for ZVD cases. This is enhanced by the incorporation of GTs data, confirming the prognostic utility of search-query-based surveillance. This accessible and flexible dynamic forecast model could be used in the monitoring of ZVD to provide advance warning of future ZIKV outbreaks.
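    A minimal stand-in for the forecasting setup described above: the published model is ARIMA(0, 1, 3) with the search index as an external regressor (in practice fitted with a package such as statsmodels' SARIMAX). The dependency-light sketch below keeps only the order-1 differencing and the exogenous regression, dropping the MA(3) terms, and all case counts and search-index values are synthetic:

    ```python
    import numpy as np

    # Drastically simplified ARIMAX stand-in: regress first differences of
    # cumulative case counts on a search index, then forecast one step ahead.
    # All data below are synthetic placeholders, not the study's series.

    cases = np.array([10, 25, 55, 110, 200, 330, 500], dtype=float)   # cumulative cases
    trends = np.array([5, 12, 27, 55, 100, 165, 250], dtype=float)    # search index

    d_cases = np.diff(cases)   # "I" part: difference once to get new cases per period
    x = trends[1:]             # align the exogenous regressor with the differenced series

    beta, intercept = np.polyfit(x, d_cases, 1)   # exogenous regression (OLS)
    next_trend = 260.0                            # hypothetical next search-index value
    forecast_new = beta * next_trend + intercept
    forecast_cumulative = cases[-1] + forecast_new
    print(round(forecast_cumulative, 1))
    ```

    The point of the exogenous regressor is visible even in this toy version: the forecast of new cases is driven by the current search volume rather than extrapolated from case history alone.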

  20. Expert knowledge maps for knowledge management: a case study in Traditional Chinese Medicine research.

    PubMed

    Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan

    2013-10-01

    To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
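    Literature-based annotation of the kind described above can be sketched as a small expert-to-topic aggregation: each paper links its authors to its topics, and accumulating those links yields a per-expert topic profile. The paper records below are invented examples, not data from the study:

    ```python
    from collections import defaultdict, Counter

    # Toy expert knowledge map: aggregate literature annotations into a
    # weighted expert -> topic mapping. Records are invented examples.

    papers = [
        {"authors": ["Cui M", "Yu T"], "topics": ["ginseng", "network pharmacology"]},
        {"authors": ["Yu T"], "topics": ["ginseng", "data integration"]},
    ]

    expert_topics = defaultdict(Counter)
    for p in papers:
        for author in p["authors"]:
            expert_topics[author].update(p["topics"])  # weight = annotation count

    # The heaviest edge in an expert's profile suggests their "hot topic".
    print(expert_topics["Yu T"].most_common(1))
    ```

    Summing edge weights over all experts in an organization gives the organization-level view of topics and trends that the abstract describes.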

  1. Modelling for reactor-style aerobic composting based on coupling theory of mass-heat-momentum transport and Contois equation.

    PubMed

    He, Xueqin; Han, Lujia; Ge, Jinyi; Huang, Guangqun

    2018-04-01

    This study establishes an optimal mathematical model to rationally describe the dynamic changes and spatial distribution of temperature and oxygen concentration in the aerobic composting process, using coupled mass-heat-momentum transfer based on the microbial mechanism. Two composting experiments under different conditions, namely continuous aeration and intermittent aeration, were performed to verify the proposed model. The results show that the model accurately predicted the dynamic changes in temperature (case I: R² = 0.93, RMSE = 1.95 K; case II: R² = 0.86, RMSE = 4.69 K) and oxygen concentration (case I: R² = 0.90, RMSE = 1.26%; case II: R² = 0.75, RMSE = 2.93%) at the central point of the compost substrate. It also systematically simulated fluctuations in oxygen concentration caused by boundary conditions, as well as the spatial distribution of the actual temperature and oxygen concentration. The proposed model exhibits good applicability in simulating the actual working conditions of the aerobic composting process. Copyright © 2018 Elsevier Ltd. All rights reserved.
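    The goodness-of-fit metrics quoted in this abstract (R² and RMSE between measured and model-predicted values) can be computed as follows; the temperature series here are synthetic placeholders, not the study's data:

    ```python
    import numpy as np

    # R^2 and RMSE between observed and model-predicted series, as used to
    # report model validation. The temperature values are synthetic.

    def r2_rmse(observed, predicted):
        observed = np.asarray(observed, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        resid = observed - predicted
        ss_res = np.sum(resid ** 2)                          # residual sum of squares
        ss_tot = np.sum((observed - observed.mean()) ** 2)   # total sum of squares
        r2 = 1.0 - ss_res / ss_tot
        rmse = float(np.sqrt(np.mean(resid ** 2)))
        return r2, rmse

    obs = [298.0, 305.0, 318.0, 330.0, 335.0, 328.0]   # measured temperature (K)
    pred = [299.5, 303.0, 320.0, 328.5, 336.0, 326.0]  # model-predicted temperature (K)
    r2, rmse = r2_rmse(obs, pred)
    print(round(r2, 3), round(rmse, 2))
    ```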

  2. Genotype-Based Association Mapping of Complex Diseases: Gene-Environment Interactions with Multiple Genetic Markers and Measurement Error in Environmental Exposures

    PubMed Central

    Lobach, Iryna; Fan, Ruzong; Carroll, Raymond J.

    2011-01-01

    With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method of Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of the association between calcium intake and the risk of colorectal adenoma development. PMID:21031455

  3. Developing High PV Penetration Cases for Frequency Response Study of U.S. Western Interconnection: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Jin; Zhang, Yingchen; Veda, Santosh

    Recent large penetrations of solar photovoltaic (PV) generation and the inertial characteristics of inverter-based generation technologies have caught the attention of those in the electric power industry in the United States. This paper presents a systematic approach to developing test cases of high penetrations of PV for the Western Interconnection. First, to examine the accuracy of the base case model, the Western Electricity Coordinating Council (WECC) model is validated by using measurement data from synchronized phasor measurement units. Based on the 2022 Light Spring case, we developed four high PV penetration cases for the WECC system that are of interest to the industry: 5% PV + 15% wind, 25% PV + 15% wind, 45% PV + 15% wind, and 65% PV + 15% wind. Additionally, a method to project PV is proposed that is based on collected, realistic PV distribution information, including the current and future PV power plant locations and penetrations in the WECC system. Both utility-scale PV plants and residential rooftop PV are included in this study.

  4. Developing High PV Penetration Cases for Frequency Response Study of U.S. Western Interconnection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Jin; Zhang, Yingchen; Veda, Santosh

    Recent large penetrations of solar photovoltaic (PV) generation and the inertial characteristics of inverter-based generation technologies have caught the attention of those in the electric power industry in the United States. This paper presents a systematic approach to developing test cases of high penetrations of PV for the Western Interconnection. First, to examine the accuracy of the base case model, the Western Electricity Coordinating Council (WECC) model is validated by using measurement data from synchronized phasor measurement units. Based on the 2022 Light Spring case, we developed four high PV penetration cases for the WECC system that are of interest to the industry: 5% PV + 15% wind, 25% PV + 15% wind, 45% PV + 15% wind, and 65% PV + 15% wind. Additionally, a method to project PV is proposed that is based on collected, realistic PV distribution information, including the current and future PV power plant locations and penetrations in the WECC system. Both utility-scale PV plants and residential rooftop PV are included in this study.

  5. Developing High PV Penetration Cases for Frequency Response Study of U.S. Western Interconnection: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Jin; Zhang, Yingchen; Veda, Santosh

    2017-04-11

    Recent large penetrations of solar photovoltaic (PV) generation and the inertial characteristics of inverter-based generation technologies have caught the attention of those in the electric power industry in the United States. This paper presents a systematic approach to developing test cases of high penetrations of PV for the Western Interconnection. First, to examine the accuracy of the base case model, the Western Electricity Coordinating Council (WECC) model is validated by using measurement data from synchronized phasor measurement units. Based on the 2022 Light Spring case, we developed four high PV penetration cases for the WECC system that are of interest to the industry: 5% PV + 15% wind, 25% PV + 15% wind, 45% PV + 15% wind, and 65% PV + 15% wind. Additionally, a method to project PV is proposed that is based on collected, realistic PV distribution information, including the current and future PV power plant locations and penetrations in the WECC system. Both utility-scale PV plants and residential rooftop PV are included in this study.

  6. Locating, characterizing and minimizing sources of error for a paper case-based structured oral examination in a multi-campus clerkship.

    PubMed

    Kumar, A; Bridgham, R; Potts, M; Gushurst, C; Hamp, M; Passal, D

    2001-01-01

    To determine the consistency of assessment in a new paper case-based structured oral examination in a multi-community pediatrics clerkship, and to identify correctable problems in the administration of the examination and the assessment process. Nine paper case-based oral examinations were audio-taped. From the audio-tapes, five community coordinators scored examiner behaviors and graded student performance. Correlations among examiner behavior scores were examined. Graphs identified the grading patterns of evaluators. The effect of exam-giving on evaluators was assessed by t-test. Reliability of grades was calculated and the effect of reducing assessment problems was modeled. Exam-givers differed most in their "teaching-guiding" behavior, and this correlated negatively with student grades. Exam reliability was lowered mainly by evaluator differences in leniency and grading pattern; less important was the absence of standardization in cases. While grade reliability was low in early use of the paper case-based oral examination, modeling of the plausible effects of training and monitoring for greater uniformity in administering the examination and assigning scores suggests that more adequate reliabilities can be attained.

  7. Beyond the Central Dogma: Model-Based Learning of How Genes Determine Phenotypes

    ERIC Educational Resources Information Center

    Reinagel, Adam; Speth, Elena Bray

    2016-01-01

    In an introductory biology course, we implemented a learner-centered, model-based pedagogy that frequently engaged students in building conceptual models to explain how genes determine phenotypes. Model-building tasks were incorporated within case studies and aimed at eliciting students' understanding of 1) the origin of variation in a population…

  8. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
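    The residual-monitoring idea at the core of this architecture can be sketched simply: flag samples where the gap between sensed and model-predicted output exceeds a threshold. The data and threshold below are invented, and the "model" output is a stand-in series rather than NASA's piecewise-linear engine model:

    ```python
    import numpy as np

    # Residual-based anomaly detection sketch: compare sensed outputs against
    # model predictions and flag samples whose residual exceeds a threshold.
    # Data and threshold are illustrative assumptions.

    def detect_anomalies(sensed, predicted, threshold):
        residuals = np.abs(np.asarray(sensed) - np.asarray(predicted))
        return np.flatnonzero(residuals > threshold)  # indices of flagged samples

    predicted = np.array([100.0, 102.0, 104.0, 106.0, 108.0])  # model output
    sensed = np.array([100.4, 101.7, 110.2, 106.3, 108.1])     # engine measurement
    print(detect_anomalies(sensed, predicted, threshold=2.0).tolist())
    ```

    The abstract's central point follows directly from this structure: detection quality is bounded by model fidelity, since any bias in the predicted series shifts every residual.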

  9. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  10. Flexible system model reduction and control system design based upon actuator and sensor influence functions

    NASA Technical Reports Server (NTRS)

    Yam, Yeung; Johnson, Timothy L.; Lang, Jeffrey H.

    1987-01-01

    A model reduction technique based on aggregation with respect to sensor and actuator influence functions rather than modes is presented for large systems of coupled second-order differential equations. Perturbation expressions which can predict the effects of spillover on both the reduced-order plant model and the neglected plant model are derived. For the special case of collocated actuators and sensors, these expressions lead to the derivation of constraints on the controller gains that are, given the validity of the perturbation technique, sufficient to guarantee the stability of the closed-loop system. A case study demonstrates the derivation of stabilizing controllers based on the present technique. The use of control and observation synthesis in modifying the dimension of the reduced-order plant model is also discussed. A numerical example is provided for illustration.

  11. From translational research to open technology innovation systems.

    PubMed

    Savory, Clive; Fortune, Joyce

    2015-01-01

    The purpose of this paper is to question whether the emphasis placed within translational research on a linear model of innovation provides the most effective model for managing health technology innovation. Several alternative perspectives are presented that have the potential to enhance the existing model of translational research. A case study is presented of the innovation of a clinical decision support system. The paper concludes from the case study that extending the triple helix model of technology transfer to one based on a quadruple helix presents a basis for improving the performance of translational research. A case study approach is used to help understand the development of an innovative technology within a teaching hospital. The case is then used to develop and refine a model of the health technology innovation system. The paper concludes from the case study that existing models of translational research could be refined further through the development of a quadruple helix model of health technology innovation that encompasses greater emphasis on user-led and open innovation perspectives. The paper presents several implications for future research based on the need to enhance the model of health technology innovation used to guide policy and practice. The quadruple helix model of innovation that is proposed can potentially guide alterations to the existing model of translational research in the healthcare sector. Several suggestions are made for how innovation activity can be better supported at both a policy and operational level. This paper presents a synthesis of the innovation literature applied to a theoretically important case of open innovation in the UK National Health Service. It draws in perspectives from other industrial sectors and applies them specifically to the management and organisation of innovation activities around health technology and the services in which they are embedded.

  12. Computer aided system engineering and analysis (CASE/A) modeling package for ECLS systems - An overview

    NASA Technical Reports Server (NTRS)

    Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.

    1990-01-01

    An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has delivered engineering productivity gains during ECLSS design activities. A component verification program was performed to ensure component modeling validity based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, and component data editing have been incorporated to enhance the engineer's productivity during a modeling program.

  13. Identification and calibration of the structural model of historical masonry building damaged during the 2016 Italian earthquakes: The case study of Palazzo del Podestà in Montelupone

    NASA Astrophysics Data System (ADS)

    Catinari, Federico; Pierdicca, Alessio; Clementi, Francesco; Lenci, Stefano

    2017-11-01

    The results of an ambient-vibration-based investigation conducted on the "Palazzo del Podestà" in Montelupone (Italy) are presented. The case-study building was damaged during the 2016 Italian earthquakes that struck central Italy. The assessment procedure includes full-scale ambient vibration testing, modal identification from ambient vibration responses, finite element modeling, and dynamic-based identification of the uncertain structural parameters of the model. A very good match between theoretical and experimental modal parameters was reached, and the model updating was performed by identifying some structural parameters.

  14. New mathematics for old physics: The case of lattice fluids

    NASA Astrophysics Data System (ADS)

    Barberousse, Anouk; Imbert, Cyrille

    2013-08-01

    We analyze the effects of the introduction of new mathematical tools on an old branch of physics by focusing on lattice fluids, which are cellular automata (CA)-based hydrodynamical models. We examine the nature of these discrete models, the type of novelty they bring about within scientific practice and the role they play in the field of fluid dynamics. We critically analyze Rohrlich's, Fox Keller's and Hughes' claims about CA-based models. We distinguish between different senses of the predicates "phenomenological" and "theoretical" for scientific models and argue that it is erroneous to conclude, as they do, that CA-based models are necessarily phenomenological in any sense of the term. We conversely claim that CA-based models of fluids, though at first sight blatantly misrepresenting fluids, are in fact conservative as far as the basic laws of statistical physics are concerned and no less theoretical than more traditional models in the field. Based on our case study, we propose a general discussion of the prospect of CA for modeling in physics. We finally emphasize that lattice fluids are not just exotic oddities but do bring about new advantages in the investigation of fluids' behavior.

  15. Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.

    PubMed

    Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M

    2017-01-01

    Sensemaking theories help designers understand the cognitive processes of a user when he/she performs a complicated task. This paper introduces a two-step approach of incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented based on a case study of the tumor contouring clinical task for radiotherapy planning. In the first step of the approach, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which eight physicians each performed three contouring tasks. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from those twenty-four cases. Further in-depth study of each of the navigation interactions and interaction sequence patterns in relation to the contextualized sensemaking model revealed five main areas for design improvements to increase sensemaking support. Outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support. Copyright © 2016. Published by Elsevier Inc.

  16. Modelling of occupational respirable crystalline silica exposure for quantitative exposure assessment in community-based case-control studies.

    PubMed

    Peters, Susan; Vermeulen, Roel; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2011-11-01

    We describe an empirical model for exposure to respirable crystalline silica (RCS) to create a quantitative job-exposure matrix (JEM) for community-based studies. Personal measurements of exposure to RCS from Europe and Canada were obtained for exposure modelling. A mixed-effects model was developed, with region/country and job titles as random-effect terms. The fixed-effect terms included year of measurement, measurement strategy (representative or worst-case), sampling duration (minutes) and an a priori exposure intensity rating for each job from an independently developed JEM (none, low, high). In total, 23,640 personal RCS exposure measurements, covering the period from 1976 to 2009, were available for modelling. The model indicated an overall downward time trend in RCS exposure levels of -6% per year. Exposure levels were higher in the UK and Canada, and lower in Northern Europe and Germany. Worst-case sampling was associated with higher reported exposure levels, and an increase in sampling duration was associated with lower reported exposure levels. The highest predicted RCS exposure levels in the reference year (1998) were for chimney bricklayers (geometric mean 0.11 mg m⁻³) and monument carvers and other stone cutters and carvers (0.10 mg m⁻³). The resulting model enables us to predict time-, job-, and region/country-specific exposure levels of RCS. These predictions will be used in the SYNERGY study, an ongoing pooled multinational community-based case-control study on lung cancer.
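The reported -6% per year downward trend can be applied to a job-specific geometric mean as a back-of-the-envelope extrapolation. The full mixed-effects model in the paper has many more terms (region, strategy, duration, intensity rating), so this sketch covers only the time-trend component, with the chimney bricklayer value from the abstract as input.

```python
def predicted_exposure(gm_1998, year, annual_trend=-0.06):
    """Extrapolate a geometric-mean exposure level (mg/m^3) from the 1998 reference
    year using a constant multiplicative annual trend. Time-trend term only."""
    return gm_1998 * (1.0 + annual_trend) ** (year - 1998)

# Chimney bricklayers: geometric mean 0.11 mg/m^3 in 1998.
level_2008 = predicted_exposure(0.11, 2008)
print(round(level_2008, 4))  # → 0.0592
```

A multiplicative trend on a geometric mean corresponds to a linear trend on the log scale, which is how such mixed-effects exposure models are typically fit.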

  17. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
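The unscented transform itself is standard and can be sketched in one dimension: a few deterministically chosen sigma points are passed through the nonlinear function and reweighted to approximate the output mean and variance, in place of a full Monte Carlo simulation. This generic sketch is not the authors' EOL code; in their setting the nonlinear function `g` would be the EOL simulation.

```python
import math

def unscented_transform_1d(mu, var, g, kappa=2.0):
    """Approximate the mean/variance of g(x) for scalar x with mean mu and
    variance var, using 3 sigma points (n = 1)."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    points = [mu, mu + spread, mu - spread]
    weights = [kappa / (n + kappa), 1.0 / (2 * (n + kappa)), 1.0 / (2 * (n + kappa))]
    ys = [g(p) for p in points]
    mean = sum(w * y for w, y in zip(weights, ys))
    variance = sum(w * (y - mean) ** 2 for w, y in zip(weights, ys))
    return mean, variance

# x ~ N(0, 1) through g(x) = x^2: the true mean is 1 and true variance is 2,
# and with kappa = 2 the transform recovers both from only 3 evaluations.
mean, var = unscented_transform_1d(0.0, 1.0, lambda x: x * x)
print(round(mean, 6), round(var, 6))
```

The computational saving in the paper comes from exactly this trade: a handful of sigma-point simulations instead of many sampled trajectories.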

  18. The effect of precrash velocity reduction on occupant response using a human body finite element model.

    PubMed

    Guleyupoglu, B; Schap, J; Kusano, K D; Gayzik, F S

    2017-07-04

    The objective of this study is to use a validated finite element model of the human body and a certified model of an anthropomorphic test dummy (ATD) to evaluate the effect of simulated precrash braking on driver kinematics, restraint loads, body loads, and computed injury criteria in 4 commonly injured body regions. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant (M50-O) and the Humanetics Hybrid III 50th percentile models were gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and driver airbag. Fifteen simulations per model (30 total) were conducted, including 4 scenarios at 3 severity levels (median, severe, and the U.S. New Car Assessment Program [U.S.-NCAP]), plus 3 extra per model with high-intensity braking. The 4 scenarios were no precollision system (no PCS), forward collision warning (FCW), FCW with prebraking assist (FCW + PBA), and FCW and PBA with autonomous precrash braking (FCW + PBA + PB). The baseline ΔV was 17, 34, and 56.4 kph for the median, severe, and U.S.-NCAP scenarios, respectively, based on crash reconstructions from NASS/CDS. Pulses were then developed based on the assumed precrash systems equipped. Restraint properties and the generic pulse used were based on literature. In median crash severity cases, little to no risk (<10% risk for Abbreviated Injury Scale [AIS] 3+) was found for all injury measures for both models. In the severe set of cases, little to no risk for AIS 3+ injury was also found for all injury measures. In NCAP cases, the highest risk was typically found with no PCS and the lowest with FCW + PBA + PB. In the higher intensity braking cases (1.0-1.4 g), head injury criterion (HIC), brain injury criterion (BrIC), and chest deflection injury measures increased with increased braking intensity. All other measures for these cases tended to decrease.
The ATD predictions also trended similarly to those of the human body model across the median, severe, and NCAP cases. Forward excursion for both models decreased across the median, severe, and NCAP cases, and the models diverged from each other in cases above 1.0 g of braking intensity. The addition of precrash systems, simulated through reduced precrash speeds, caused reductions in some injury criteria, whereas others (chest deflection, HIC, and BrIC) increased due to a modified occupant position. The human model and ATD trended similarly in nearly all cases, with greater risk indicated in the human model. These results suggest the need for integrated safety systems with restraints that optimize the occupant's position during precrash braking and prior to impact.

  19. Optimizing global liver function in radiation therapy treatment planning

    NASA Astrophysics Data System (ADS)

    Wu, Victor W.; Epelman, Marina A.; Wang, Hesheng; Romeijn, H. Edwin; Feng, Mary; Cao, Yue; Ten Haken, Randall K.; Matuszak, Martha M.

    2016-09-01

    Liver stereotactic body radiation therapy (SBRT) patients differ in both pre-treatment liver function (e.g. due to degree of cirrhosis and/or prior treatment) and radiosensitivity, leading to high variability in potential liver toxicity at similar doses. This work investigates three treatment planning optimization models that minimize risk of toxicity: two consider both voxel-based pre-treatment liver function and local-function-based radiosensitivity with dose; one considers only dose. Each model optimizes a different objective function (varying in the complexity with which the influence of dose on liver function is captured) subject to the same dose constraints, and the models are tested on 2D synthesized and 3D clinical cases. The normal-liver-based objective functions are the linearized equivalent uniform dose (ℓEUD) (the conventional 'ℓEUD model'), the so-called perfusion-weighted ℓEUD (fEUD) (the proposed 'fEUD model'), and post-treatment global liver function (GLF) (the proposed 'GLF model'), predicted by a new liver-perfusion-based dose-response model. The resulting ℓEUD, fEUD, and GLF plans delivering the same target ℓEUD are compared with respect to their post-treatment function and various dose-based metrics. Voxel-based portal venous liver perfusion, used as a measure of local function, is computed using DCE-MRI. In the cases used in our experiments, the GLF plan preserves up to 4.6% (7.5%) more liver function than the fEUD (ℓEUD) plan does in 2D cases, and up to 4.5% (5.6%) in 3D cases. The GLF and fEUD plans worsen the ℓEUD of functional liver on average by 1.0 Gy and 0.5 Gy in 2D and 3D cases, respectively. Liver perfusion information can be used during treatment planning to minimize the risk of toxicity by improving expected GLF; the degree of benefit varies with perfusion pattern. Although fEUD model optimization is computationally inexpensive and often achieves better GLF than ℓEUD model optimization does, the GLF model directly optimizes a more clinically relevant metric and can further improve fEUD plan quality.
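As a hedged illustration of how local-function weights can enter an EUD-style objective, the sketch below contrasts a generalized EUD over voxel doses with a per-voxel weighted variant. These generic forms are standard EUD constructions, not the paper's exact linearized EUD or fEUD definitions; the doses and weights are toy values.

```python
def eud(doses, a):
    """Generalized equivalent uniform dose over voxel doses (a = 1 gives the mean)."""
    return (sum(d ** a for d in doses) / len(doses)) ** (1.0 / a)

def weighted_eud(doses, weights, a):
    """Same metric with per-voxel weights, e.g. a perfusion-based function measure,
    so that well-functioning tissue contributes more to the objective."""
    total = sum(weights)
    return (sum(w * d ** a for w, d in zip(weights, doses)) / total) ** (1.0 / a)

doses = [10.0, 20.0, 30.0]                      # Gy, illustrative
print(round(eud(doses, 1), 2))                  # → 20.0 (a = 1 is the mean dose)
print(round(weighted_eud(doses, [1.0, 1.0, 2.0], 1), 2))  # → 22.5 (high-dose voxel up-weighted)
```

Up-weighting a voxel shifts the metric toward its dose, which is the mechanism by which perfusion information steers the optimizer away from well-functioning liver.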

  20. “And Yet It Was a Blessing”: The Case for Existential Maturity

    PubMed Central

    Reddy, Neha; Hauser, Joshua; Sonnenfeld, Sarah B.

    2017-01-01

    Abstract We are interested in the kind of well-being that can occur as a person approaches death; we call it “existential maturity.” We describe a conceptual model of this state that we felt was realized in an individual case, illustrating the state by describing the case. Our goal is to articulate a generalizable, working model of existential maturity in concepts and terms taken from fundamentals of psychodynamic theory. We hope that a recognizable case and a model-based way of thinking about what was going on can both help guide care that fosters existential maturity and stimulate more theoretical modeling of the state. PMID:28128674

  1. History Places: A Case Study for Relational Database and Information Retrieval System Design

    ERIC Educational Resources Information Center

    Hendry, David G.

    2007-01-01

    This article presents a project-based case study that was developed for students with diverse backgrounds and varied inclinations for engaging technical topics. The project, called History Places, requires that student teams develop a vision for a kind of digital library, propose a conceptual model, and use the model to derive a logical model and…

  2. An Application Practice of the IFLA FRBR Model: A Metadata Case Study for the National Palace Museum in Taipei.

    ERIC Educational Resources Information Center

    Chen, Ya-ning; Lin, Simon C.; Chen, Shu-jiun

    2002-01-01

    Explains the Functional Requirements for Bibliographic Records (FRBR) model which was proposed by the International Federation of Library Associations and Institutions (IFLA) as a framework to proceed content-based analysis and developing metadata format. Presents a case study that examines the feasibility of the FRBR model at the National Palace…

  3. Comparison between phenomenological and ab-initio reaction and relaxation models in DSMC

    NASA Astrophysics Data System (ADS)

    Sebastião, Israel B.; Kulakhmetov, Marat; Alexeenko, Alina

    2016-11-01

    New state-specific vibrational-translational energy exchange and dissociation models, based on ab-initio data, are implemented in the direct simulation Monte Carlo (DSMC) method and compared to the established Larsen-Borgnakke (LB) and total collision energy (TCE) phenomenological models. For consistency, both the LB and TCE models are calibrated with QCT-calculated O2+O data. The model comparison test cases include 0-D thermochemical relaxation under adiabatic conditions and 1-D normal shockwave calculations. The results show that both the ME-QCT-VT and LB models can reproduce vibrational relaxation accurately, but the TCE model is unable to reproduce nonequilibrium rates even when it is calibrated to accurate equilibrium rates. The new reaction model does capture QCT-calculated nonequilibrium rates. For all investigated cases, we discuss the prediction differences based on the new model features.

  4. Disease-Free Survival after Hepatic Resection in Hepatocellular Carcinoma Patients: A Prediction Approach Using Artificial Neural Network

    PubMed Central

    Ho, Wen-Hsien; Lee, King-Teh; Chen, Hong-Yaw; Ho, Te-Wei; Chiu, Herng-Chia

    2012-01-01

    Background A database for hepatocellular carcinoma (HCC) patients who had received hepatic resection was used to develop prediction models for 1-, 3- and 5-year disease-free survival based on a set of clinical parameters for this patient group. Methods The three prediction models included an artificial neural network (ANN) model, a logistic regression (LR) model, and a decision tree (DT) model. Data for 427, 354 and 297 HCC patients with histories of 1-, 3- and 5-year disease-free survival after hepatic resection, respectively, were extracted from the HCC patient database. From each of the three groups, 80% of the cases (342, 283 and 238 cases of 1-, 3- and 5-year disease-free survival, respectively) were selected to provide training data for the prediction models. The remaining 20% of cases in each group (85, 71 and 59 cases in the three respective groups) were assigned to validation groups for performance comparisons of the three models. The area under the receiver operating characteristic curve (AUROC) was used as the performance index for evaluating the three models. Conclusions The ANN model outperformed the LR and DT models in terms of prediction accuracy. This study demonstrated the feasibility of using ANNs in medical decision support systems for predicting disease-free survival based on clinical databases in HCC patients who have received hepatic resection. PMID:22235270
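The AUROC comparison described above reduces to a rank statistic: the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one (the Mann-Whitney U formulation). A minimal sketch, with illustrative scores standing in for the outputs of two competing models:

```python
def auroc(labels, scores):
    """Probability that a random positive case scores above a random negative one;
    ties count as half a win (Mann-Whitney formulation of AUROC)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels  = [1, 1, 1, 0, 0, 0]
model_a = [0.9, 0.8, 0.4, 0.7, 0.2, 0.1]   # one ranking error
model_b = [0.9, 0.8, 0.7, 0.4, 0.2, 0.1]   # perfect ranking
print(auroc(labels, model_a), auroc(labels, model_b))
```

On a held-out 20% validation split, comparing such AUROC values across the ANN, LR, and DT scores is exactly the performance index the abstract describes.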

  5. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images

    NASA Astrophysics Data System (ADS)

    Erdt, Marius; Sakas, Georgios

    2010-03-01

    This work presents a novel approach for model-based segmentation of the kidney in images acquired by Computed Tomography (CT). The developed computer-aided segmentation system is expected to support computer-aided diagnosis and operation planning. We have developed a deformable-model approach based on local shape constraints that prevents the model from deforming into neighboring structures while allowing the global shape to adapt freely to the data. Those local constraints are derived from the anatomical structure of the kidney and the presence and appearance of neighboring organs. The adaptation process is guided by a rule-based deformation logic in order to improve the robustness of the segmentation in areas of diffuse organ boundaries. Our workflow consists of two steps: (1) user-guided positioning and (2) automatic model adaptation using affine and free-form deformation in order to robustly extract the kidney. In cases which show pronounced pathologies, the system also offers real-time mesh editing tools for a quick refinement of the segmentation result. Evaluation results based on 30 clinical cases using CT data sets show an average Dice coefficient of 93% compared to the ground truth. The results are therefore in most cases comparable to manual delineation. Computation times of the automatic adaptation step are below 6 seconds, which makes the proposed system suitable for application in clinical practice.
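The Dice coefficient used for evaluation is simple to state: twice the overlap between the two segmentations, divided by their total size. A minimal sketch on toy voxel sets (the voxel ids are illustrative):

```python
def dice(a, b):
    """Dice coefficient between two binary masks given as sets of voxel ids:
    2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    inter = len(a & b)
    return 2.0 * inter / (len(a) + len(b))

auto   = {1, 2, 3, 4, 5}   # automatic segmentation
manual = {2, 3, 4, 5, 6}   # manual ground truth
print(dice(auto, manual))  # → 0.8
```

An average Dice of 0.93 over 30 cases, as reported, therefore means the automatic and manual kidney masks overlap almost completely.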

  6. Toward Creating Synergy Among Policy, Procedures, and Implementation of Evidence-Based Models in Child Welfare Systems: Two Case Examples.

    PubMed

    Chamberlain, Patricia

    2017-03-01

    Over the past four to five decades, multiple randomized controlled trials have verified that preventive interventions targeting key parenting skills can have far-reaching effects on improving a diverse array of child outcomes. Further, these studies have shown that parenting skills can be taught, and they are malleable. Given these advances, prevention scientists are in a position to make solid empirically based recommendations to public child service systems on using parent-mediated interventions to optimize positive outcomes for the children and families that they serve. Child welfare systems serve some of this country's most vulnerable children and families, yet they have been slow (compared to juvenile justice and mental health systems) to adopt empirically based interventions. This paper describes two child-welfare-initiated, policy-based case studies that have sought to scale up research-based parenting skills into the routine services that caseworkers deliver to the families that they serve. In both case studies, the child welfare system leaders worked with evaluators and model developers to tailor policy, administrative, and fiscal system practices to institutionalize and sustain evidence-based practices in usual foster care services. Descriptions of the implementations, intervention models, and preliminary results are provided.

  7. Generation of “Virtual” Control Groups for Single Arm Prostate Cancer Adjuvant Trials

    PubMed Central

    Koziol, James A.; Chen, Xin; Xia, Xiao-Qin; Wang, Yipeng; Skarecky, Douglas; Sutton, Manuel; Sawyers, Anne; Ruckle, Herbert; Carpenter, Philip M.; Wang-Rodriguez, Jessica; Jiang, Jun; Deng, Mingsen; Pan, Cong; Zhu, Jian-guo; McLaren, Christine E.; Gurley, Michael J.; Lee, Chung; McClelland, Michael; Ahlering, Thomas; Kattan, Michael W.; Mercola, Dan

    2014-01-01

    It is difficult to construct a control group for trials of adjuvant therapy (Rx) of prostate cancer after radical prostatectomy (RP) due to ethical issues and patient acceptance. We utilized 8 curve-fitting models to estimate the time to 60%, 65%, … 95% chance of progression-free survival (PFS) based on data derived from the Kattan post-RP nomogram. The 8 models were systematically applied to a training set of 153 post-RP cases without adjuvant Rx to develop 8 subsets of cases (reference case sets) whose observed PFS times were most accurately predicted by each model. To prepare a virtual control group for a single-arm adjuvant Rx trial, we first select the optimal model for the trial cases based on the minimum weighted Euclidean distance between the trial case set and the reference case set in terms of clinical features, and then compare the virtual PFS times calculated by the optimal model with the observed PFS times of the trial cases by the logrank test. The method was validated using an independent dataset of 155 post-RP patients without adjuvant Rx. We then applied the method to patients on a Phase II trial of adjuvant chemo-hormonal Rx post RP, which indicated that the adjuvant Rx is highly effective in prolonging PFS after RP in patients at high risk for prostate cancer recurrence. The method can accurately generate control groups for single-arm, post-RP adjuvant Rx trials for prostate cancer, facilitating development of new therapeutic strategies. PMID:24465467

  8. Generation of "virtual" control groups for single arm prostate cancer adjuvant trials.

    PubMed

    Jia, Zhenyu; Lilly, Michael B; Koziol, James A; Chen, Xin; Xia, Xiao-Qin; Wang, Yipeng; Skarecky, Douglas; Sutton, Manuel; Sawyers, Anne; Ruckle, Herbert; Carpenter, Philip M; Wang-Rodriguez, Jessica; Jiang, Jun; Deng, Mingsen; Pan, Cong; Zhu, Jian-Guo; McLaren, Christine E; Gurley, Michael J; Lee, Chung; McClelland, Michael; Ahlering, Thomas; Kattan, Michael W; Mercola, Dan

    2014-01-01

    It is difficult to construct a control group for trials of adjuvant therapy (Rx) of prostate cancer after radical prostatectomy (RP) due to ethical issues and patient acceptance. We utilized 8 curve-fitting models to estimate the time to 60%, 65%, … 95% chance of progression-free survival (PFS) based on data derived from the Kattan post-RP nomogram. The 8 models were systematically applied to a training set of 153 post-RP cases without adjuvant Rx to develop 8 subsets of cases (reference case sets) whose observed PFS times were most accurately predicted by each model. To prepare a virtual control group for a single-arm adjuvant Rx trial, we first select the optimal model for the trial cases based on the minimum weighted Euclidean distance between the trial case set and the reference case set in terms of clinical features, and then compare the virtual PFS times calculated by the optimal model with the observed PFS times of the trial cases by the logrank test. The method was validated using an independent dataset of 155 post-RP patients without adjuvant Rx. We then applied the method to patients on a Phase II trial of adjuvant chemo-hormonal Rx post RP, which indicated that the adjuvant Rx is highly effective in prolonging PFS after RP in patients at high risk for prostate cancer recurrence. The method can accurately generate control groups for single-arm, post-RP adjuvant Rx trials for prostate cancer, facilitating development of new therapeutic strategies.
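The model-selection step above, choosing the curve-fitting model whose reference case set is nearest to the trial cases in weighted Euclidean distance over clinical features, can be sketched as follows. The feature centroids, weights, and model names are illustrative, not the study's actual values:

```python
import math

def weighted_distance(a, b, weights):
    """Weighted Euclidean distance between two feature vectors."""
    return math.sqrt(sum(w * (x - y) ** 2 for x, y, w in zip(a, b, weights)))

def select_model(trial_centroid, reference_centroids, weights):
    """Return the key of the reference case set nearest to the trial centroid."""
    return min(reference_centroids,
               key=lambda k: weighted_distance(trial_centroid,
                                               reference_centroids[k], weights))

# Toy clinical-feature centroids (e.g. PSA, Gleason score, stage), equal weights.
references = {"model_1": [6.0, 7.0, 2.0], "model_2": [12.0, 8.0, 3.0]}
best = select_model([11.0, 8.0, 3.0], references, weights=[1.0, 1.0, 1.0])
print(best)  # → model_2
```

The selected model's curve would then supply the virtual PFS times compared against the observed trial PFS by the logrank test.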

  9. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    PubMed

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
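The graphical calibration check described above, comparing predicted versus observed events across risk strata, can be sketched as follows. Quintiles are used here for brevity (the study used deciles), and the risks and outcomes are toy values:

```python
def calibration_groups(risks, outcomes, n_groups=5):
    """Sort cases by predicted risk, split into equal-size groups, and return
    (total predicted events, observed events) per group. Close agreement across
    groups indicates good calibration, independent of the c-statistic."""
    paired = sorted(zip(risks, outcomes))
    size = len(paired) // n_groups
    rows = []
    for g in range(n_groups):
        chunk = paired[g * size:(g + 1) * size]
        predicted = sum(r for r, _ in chunk)
        observed = sum(y for _, y in chunk)
        rows.append((round(predicted, 2), observed))
    return rows

# Toy data: 10 cases whose predicted risks track the observed outcomes well.
risks    = [0.05, 0.1, 0.15, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 0.9]
outcomes = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(calibration_groups(risks, outcomes))
```

Plotting these pairs per decile gives exactly the predicted-vs-observed calibration curve the authors propose as a complement to the c-statistic.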

  10. Teacher Conceptions and Approaches Associated with an Immersive Instructional Implementation of Computer-Based Models and Assessment in a Secondary Chemistry Classroom

    ERIC Educational Resources Information Center

    Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.; Smith, Erica; Park, Mihwa

    2014-01-01

    This paper reports on a case study of an immersive and integrated multi-instructional approach (namely computer-based model introduction and connection with content; facilitation of individual student exploration guided by exploratory worksheet; use of associated differentiated labs and use of model-based assessments) in the implementation of…

  11. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan.

    PubMed

    Chang, Hsien-Yen; Weiner, Jonathan P

    2010-01-18

    Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. 
Given the widespread availability of claims data and the superior explanatory power of claims-based risk adjustment models over demographics-only models, Taiwan's government should consider using claims-based models for policy-relevant applications. The performance of the ACG case-mix system in Taiwan was comparable to that found in other countries. This suggested that the ACG system could be applied to Taiwan's NHI even though it was originally developed in the USA. Many of the findings in this paper are likely to be relevant to other diagnosis-based risk adjustment methodologies.
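The performance measures named in the abstract (adjusted R2, mean absolute prediction error at the individual level, predictive ratio at the group level) can be sketched from their standard definitions; this is a generic illustration, not the study's code:

```python
import numpy as np

def adjusted_r2(y, y_hat, n_params):
    """Adjusted R^2: explained variance penalized for the number of predictors."""
    n = len(y)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

def mean_absolute_prediction_error(y, y_hat):
    """Mean absolute difference between actual and predicted expenditure."""
    return np.mean(np.abs(y - y_hat))

def predictive_ratio(y, y_hat, group):
    """Predicted / observed total expenditure per group (e.g., quintile);
    values below 1 indicate underprediction for that group."""
    return {g: y_hat[group == g].sum() / y[group == g].sum()
            for g in np.unique(group)}
```

A predictive ratio below 1 for the top expenditure quintile corresponds to the underprediction of the highest-expenditure group reported above.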

  12. Using information communication technology in models of integrated community-based primary health care: learning from the iCOACH case studies.

    PubMed

    Steele Gray, Carolyn; Barnsley, Jan; Gagnon, Dominique; Belzile, Louise; Kenealy, Tim; Shaw, James; Sheridan, Nicolette; Wankah Nji, Paul; Wodchis, Walter P

    2018-06-26

    Information communication technology (ICT) is a critical enabler of integrated models of community-based primary health care; however, little is known about how existing technologies have been used to support new models of integrated care. To address this gap, we draw on data from an international study of integrated models, exploring how ICT is used to support activities of integrated care and the organizational and environmental barriers and enablers to its adoption. We take an embedded comparative multiple-case study approach using data from a study of implementation of nine models of integrated community-based primary health care, the Implementing Integrated Care for Older Adults with Complex Health Needs (iCOACH) study. Six cases from Canada, three each in Ontario and Quebec, and three in New Zealand, were studied. As part of the case studies, interviews were conducted with managers and front-line health care providers from February 2015 to March 2017. A qualitative descriptive approach was used to code data from 137 interviews and generate word tables to guide analysis. Despite different models and contexts, we found strikingly similar accounts of the types of activities supported through ICT systems in each of the cases. ICT systems were used most frequently to support activities like care coordination by inter-professional teams through information sharing. However, providers were limited in their ability to efficiently share patient data due to data access issues across organizational and professional boundaries and due to system functionality limitations, such as a lack of interoperability. Even in innovative models of care, managers and providers in our cases mainly use technology to enable traditional ways of working. Technology limitations prevent more innovative uses of technology that could support disruption necessary to improve care delivery. 
We argue the barriers to more innovative use of technology are linked to three factors: (1) information access barriers, (2) limited functionality of available technology, and (3) organizational and provider inertia.

  13. Tuberculosis active case finding in Cambodia: a pragmatic, cost-effectiveness comparison of three implementation models.

    PubMed

    James, Richard; Khim, Keovathanak; Boudarene, Lydia; Yoong, Joanne; Phalla, Chea; Saint, Saly; Koeut, Pichenda; Mao, Tan Eang; Coker, Richard; Khan, Mishal Sameer

    2017-08-22

Globally, almost 40% of tuberculosis (TB) patients remain undiagnosed, and those that are diagnosed often experience prolonged delays before initiating correct treatment, leading to ongoing transmission. While there is a push for active case finding (ACF) to improve early detection and treatment of TB, there is extremely limited evidence about the relative cost-effectiveness of different ACF implementation models. Cambodia presents a unique opportunity for addressing this gap in evidence, as ACF has been implemented using different models but no comparisons have been conducted. The objective of our study is to contribute to knowledge and methodology on comparing the cost-effectiveness of alternative ACF implementation models from the health service perspective, using programmatic data, in order to inform national policy and practice. We retrospectively compared three distinct ACF implementation models - door-to-door symptom screening in urban slums, checking contacts of TB patients, and door-to-door symptom screening focusing on rural populations aged above 55 - in terms of the number of new bacteriologically-positive pulmonary TB cases diagnosed and the cost of implementation, assuming activities are conducted by the national TB program of Cambodia. We calculated the cost per additional case detected using the alternative ACF models. Our analysis, which is the first of its kind for TB, revealed that the ACF model based on door-to-door screening in poor urban areas of Phnom Penh was the most cost-effective (249 USD per case detected, 737 cases diagnosed), followed by the model based on testing contacts of TB patients (308 USD per case detected, 807 cases diagnosed), and symptomatic screening of older rural populations (316 USD per case detected, 397 cases diagnosed). 
Our study provides new evidence on the relative effectiveness and economics of three implementation models for enhanced TB case finding, in line with calls for data from 'routine conditions' to be included in disease control program strategic planning. Such cost-effectiveness comparisons are essential to inform the resource allocation decisions of national policy makers in resource-constrained settings. We applied a novel, pragmatic methodological approach designed to provide results that are directly relevant to policy makers, costing the interventions from the perspective of Cambodia's national TB program and using case finding data from implementation activities rather than experimental settings.
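The comparison metric is a simple ratio: implementation cost divided by cases diagnosed. A sketch using the figures above (total costs are back-calculated from the reported ratios, so they are approximations, not the study's raw cost data):

```python
# Cost per additional TB case detected = implementation cost / cases diagnosed.
# Totals are implied by the reported per-case costs and case counts.
models = {
    "urban door-to-door": (249 * 737, 737),   # (implied total cost USD, cases)
    "contact tracing":    (308 * 807, 807),
    "rural age > 55":     (316 * 397, 397),
}
# Rank models from most to least cost-effective.
for name, (cost, cases) in sorted(models.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: {cost / cases:.0f} USD per case detected")
```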

  14. An Algebraic Implicitization and Specialization of Minimum KL-Divergence Models

    NASA Astrophysics Data System (ADS)

    Dukkipati, Ambedkar; Manathara, Joel George

In this paper we study the representation of KL-divergence minimization, in cases where integer sufficient statistics exist, using tools from polynomial algebra. We show that the estimation of parametric statistical models in this case can be transformed into solving a system of polynomial equations. In particular, we also study the case of the Kullback-Csiszár iteration scheme. We present implicit descriptions of these models and show that implicitization preserves specialization of the prior distribution. This result leads us to a Gröbner bases method to compute an implicit representation of minimum KL-divergence models.
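To make the setup concrete: minimizing KL divergence from a prior $q$ under linear moment constraints yields an exponential family, and integer-valued sufficient statistics turn the stationarity conditions into polynomial equations. A sketch under standard assumptions (notation ours, not the paper's):

```latex
\min_{p}\; D(p\,\|\,q) = \sum_i p_i \log\frac{p_i}{q_i}
\quad \text{s.t.} \quad \sum_i p_i\,T_j(i) = b_j, \qquad \sum_i p_i = 1 .
```

The minimizer has the exponential-family form $p_i = Z(\lambda)^{-1}\, q_i \exp\!\big(\sum_j \lambda_j T_j(i)\big)$. Substituting $\theta_j = e^{\lambda_j}$, with each $T_j(i)$ a non-negative integer, gives $p_i = Z^{-1} q_i \prod_j \theta_j^{T_j(i)}$, so the normalization and moment constraints become polynomial equations in the $\theta_j$, which is what makes Gröbner-basis computation applicable.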

  15. Case-based synthesis in automatic advertising creation system

    NASA Astrophysics Data System (ADS)

    Zhuang, Yueting; Pan, Yunhe

    1995-08-01

Advertising (ads) is an important design area. Though many interactive ad-design software packages are in commercial use, none of them supports the intelligent part of the work: automatic ad creation. The potential for this is enormous. This paper describes our current work on an automatic advertising creation system (AACS). After careful analysis of the mental behavior of a human ad designer, we conclude that a case-based approach is appropriate for modeling it. A model for AACS is given in the paper. A case in ads is described in two parts, the creation process and the configuration of the ad picture, with detailed data structures given in the paper. Along with the case representation, we put forward an algorithm for case-based synthesis. Issues such as similarity measure computation and case adaptation are also discussed.

  16. Case formulation and management using pattern-based formulation (PBF) methodology: clinical case 1.

    PubMed

    Fernando, Irosh; Cohen, Martin

    2014-02-01

A tool for psychiatric case formulation known as pattern-based formulation (PBF) has recently been introduced. This paper presents an application of this methodology in formulating and managing complex clinical cases. The symptomatology of the clinical presentation is parsed into individual clinical phenomena and interpreted by selecting explanatory models. The clinical presentation demonstrates how PBF has been used as a clinical tool to guide clinicians' thinking, taking a structured approach to managing multiple issues with a broad range of management strategies. In doing so, the paper also introduces a number of patterns related to the observed clinical phenomena that can be re-used as explanatory models when formulating other clinical cases. It is expected that this paper will assist clinicians, and particularly trainees, to better understand the PBF methodology and apply it to improve their formulation skills.

  17. Integrating distributional, spatial prioritization, and individual-based models to evaluate potential critical habitat networks: A case study using the Northern Spotted Owl

    EPA Science Inventory

    As part of the northern spotted owl recovery planning effort, we evaluated a series of alternative critical habitat scenarios using a species-distribution model (MaxEnt), a conservation-planning model (Zonation), and an individual-based population model (HexSim). With this suite ...

  18. Prediction model for the return to work of workers with injuries in Hong Kong.

    PubMed

    Xu, Yanwen; Chan, Chetwyn C H; Lo, Karen Hui Yu-Ling; Tang, Dan

    2008-01-01

This study attempts to formulate a prediction model of return to work for a group of workers in Hong Kong who had been suffering from chronic pain and physical injury and were out of work. The study used the Case-Based Reasoning (CBR) method and compared the results with a statistical logistic regression model. The CBR case base was composed of 67 cases, which were also used to fit the logistic regression model. The testing cases were 32 participants with backgrounds and characteristics similar to those in the case base. Hard constraints and a Euclidean distance metric were used in CBR to retrieve the cases closest to the trial case. When the algorithm was tested on the 32 new participants, the accuracy of predicting return-to-work outcomes was 62.5%, which was no better than the 71.2% accuracy of the logistic regression model. Comparing CBR with conventional regression analysis gives a better understanding of CBR as applied in the field of occupational rehabilitation. The findings also shed light on the development of relevant interventions for the return-to-work process of these workers.
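The retrieval step, constrained Euclidean nearest-neighbour matching against a case base, can be sketched generically. The attribute names and data below are illustrative, not from the study:

```python
import numpy as np

def retrieve(case_base, outcomes, query, k=1, constraints=None):
    """Case-based reasoning retrieval: return the outcomes of the k cases
    closest to the query by Euclidean distance, after applying optional
    hard constraints that filter out incomparable cases."""
    mask = np.ones(len(case_base), dtype=bool)
    if constraints is not None:
        mask = constraints(case_base)          # e.g., same injury category
    candidates = case_base[mask]
    dists = np.linalg.norm(candidates - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return outcomes[mask][nearest]

# Toy case base: [age, pain score, months off work] -> returned to work (1/0)
cases = np.array([[30, 2, 3], [55, 8, 18], [42, 5, 9]], dtype=float)
rtw   = np.array([1, 0, 1])
print(retrieve(cases, rtw, np.array([33.0, 3.0, 4.0])))  # nearest case's outcome
```

In practice the features would be standardized first so that no single attribute dominates the distance.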

  19. Use of a business case model for organizational change.

    PubMed

    Shirey, Maria R

    2011-01-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author discusses the concept of a business case and introduces a 3-phase business case model for organizational change.

  20. Physician-Owned Surgical Hospitals Outperform Other Hospitals in Medicare Value-Based Purchasing Program.

    PubMed

    Ramirez, Adriana G; Tracci, Margaret C; Stukenborg, George J; Turrentine, Florence E; Kozower, Benjamin D; Jones, R Scott

    2016-10-01

The Hospital Value-Based Purchasing Program measures the value of care provided by participating Medicare hospitals, creates financial incentives for quality improvement, and fosters increased transparency. Limited information is available comparing hospital performance across health care business models. The 2015 Hospital Value-Based Purchasing Program results were used to examine hospital performance by business model. General linear modeling assessed differences in mean total performance score and in hospital case mix index, as well as differences in performance after adjustment for hospital case mix index. Of 3,089 hospitals with total performance scores, categories of representative health care business models included 104 physician-owned surgical hospitals, 111 University HealthSystem Consortium hospitals, 14 US News & World Report Honor Roll hospitals, 33 Kaiser Permanente hospitals, and 124 Pioneer accountable care organization-affiliated hospitals. Estimated mean total performance scores for physician-owned surgical hospitals (64.4; 95% CI, 61.83-66.38) and Kaiser Permanente (60.79; 95% CI, 56.56-65.03) were significantly higher than those of all remaining hospitals, and University HealthSystem Consortium members (36.8; 95% CI, 34.51-39.17) performed below the mean (p < 0.0001). Significant differences in mean hospital case mix index included physician-owned surgical hospitals (mean 2.32; p < 0.0001), US News & World Report honorees (mean 2.24; p = 0.0140), and University HealthSystem Consortium members (mean 1.99; p < 0.0001), whereas Kaiser Permanente hospitals had a lower case mix value (mean 1.54; p < 0.0001). Re-estimation of total performance scores after adjustment for differences in hospital case mix index did not change the original results. The Hospital Value-Based Purchasing Program revealed superior hospital performance associated with business model. 
Closer inspection of high-value hospitals can guide value improvement and policy-making decisions for all Medicare Value-Based Purchasing Program Hospitals. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet

    2010-05-01

This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas of Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and vegetation index, respectively. Maps of topography, soil type, lineaments, and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall precipitation, and normalized difference vegetation index (NDVI), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also from the coefficients calculated for each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification, the results of the analyses were compared with the field-verified landslide locations. Among the three cases applying logistic regression coefficients in the same study area, the case of Selangor based on the Selangor coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases of cross-applying coefficients to the other two areas, the case of Selangor based on the Cameron coefficients showed the highest prediction accuracy (90%), whereas the case of Penang based on the Selangor coefficients showed the lowest accuracy (79%). 
Qualitatively, the cross application model yields reasonable results which can be used for preliminary landslide hazard mapping.
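The cross-application scheme, fitting logistic-regression coefficients in one area and reusing them to score another, can be sketched as follows. The data are synthetic and the factor list abbreviated, so this is illustrative only:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent logistic regression (intercept included)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)   # log-likelihood gradient step
    return w

def hazard(X, w):
    """Landslide hazard (occurrence probability) under fitted coefficients."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 1 / (1 + np.exp(-Xb @ w))

rng = np.random.default_rng(1)
# Columns (standardized): slope, distance from drainage, NDVI
X_penang = rng.normal(size=(500, 3))
true_logit = X_penang @ [2.0, -1.0, -0.5]
y_penang = (rng.uniform(size=500) < 1 / (1 + np.exp(-true_logit))).astype(int)

w = fit_logistic(X_penang, y_penang)        # coefficients from one area...
X_selangor = rng.normal(size=(200, 3))      # ...cross-applied to another
print(hazard(X_selangor, w)[:3].round(2))
```

Cross-application implicitly assumes the factor-hazard relationships transfer between areas, which is exactly what the accuracy comparison above tests.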

  2. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.

  3. Discussion on accuracy degree evaluation of accident velocity reconstruction model

    NASA Astrophysics Data System (ADS)

    Zou, Tiefang; Dai, Yingbiao; Cai, Ming; Liu, Jike

In order to investigate the applicability of accident velocity reconstruction models in different cases, a method for evaluating the accuracy degree of such models is given. Based on the theoretical and the calculated pre-crash velocities, an accuracy-degree evaluation formula is obtained. In a numerical simulation case, the accuracy degrees and applicability of two accident velocity reconstruction models are analyzed; the results show that the method is feasible in practice.
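The abstract does not give the formula itself; one natural accuracy-degree measure, comparing the theoretical pre-crash velocity with the reconstructed one as a relative error, might look like this (our formulation, not necessarily the authors'):

```python
def accuracy_degree(v_theory, v_model):
    """Accuracy degree in [0, 1]: 1 means the reconstructed pre-crash
    velocity matches the theoretical (simulation) value exactly; large
    relative errors are clipped to 0."""
    return max(0.0, 1.0 - abs(v_model - v_theory) / abs(v_theory))

# Two reconstruction models for the same simulated crash (values illustrative)
v_true = 14.2                        # m/s, known in the numerical simulation
print(accuracy_degree(v_true, 13.6))
print(accuracy_degree(v_true, 11.9))
```

Comparing the two scores indicates which reconstruction model is more applicable to the simulated case.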

  4. Search algorithm complexity modeling with application to image alignment and matching

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2014-05-01

    Search algorithm complexity modeling, in the form of penetration rate estimation, provides a useful way to estimate search efficiency in application domains which involve searching over a hypothesis space of reference templates or models, as in model-based object recognition, automatic target recognition, and biometric recognition. The penetration rate quantifies the expected portion of the database that must be searched, and is useful for estimating search algorithm computational requirements. In this paper we perform mathematical modeling to derive general equations for penetration rate estimates that are applicable to a wide range of recognition problems. We extend previous penetration rate analyses to use more general probabilistic modeling assumptions. In particular we provide penetration rate equations within the framework of a model-based image alignment application domain in which a prioritized hierarchical grid search is used to rank subspace bins based on matching probability. We derive general equations, and provide special cases based on simplifying assumptions. We show how previously-derived penetration rate equations are special cases of the general formulation. We apply the analysis to model-based logo image alignment in which a hierarchical grid search is used over a geometric misalignment transform hypothesis space. We present numerical results validating the modeling assumptions and derived formulation.
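A simplified discrete version of the penetration-rate calculation, the expected fraction of the database examined by a best-first search over hypothesis bins, can be sketched as follows (our notation and simplifying assumptions, not the paper's derivation):

```python
import numpy as np

def penetration_rate(match_prob, bin_sizes):
    """Expected fraction of the database searched when bins are visited in
    decreasing order of match probability and the search stops on a match.

    match_prob[k] : probability the true hypothesis lies in bin k
    bin_sizes[k]  : fraction of the database occupied by bin k
    """
    order = np.argsort(match_prob)[::-1]        # prioritized (best-first) order
    p = np.asarray(match_prob)[order]
    s = np.asarray(bin_sizes)[order]
    searched = np.cumsum(s)                     # fraction searched by end of bin k
    return float(np.sum(p * searched))

# Sharply peaked prior: most mass in one small bin -> low penetration rate
print(penetration_rate([0.8, 0.15, 0.05], [0.1, 0.3, 0.6]))
# Uniform prior over equal bins -> roughly two-thirds of the database searched
print(penetration_rate([1/3, 1/3, 1/3], [1/3, 1/3, 1/3]))
```

The contrast between the two calls shows why a good prioritization (a sharply peaked matching-probability distribution) reduces expected search cost.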

  5. [Comparing case management care models for people with dementia and their caregivers: the design of the COMPAS study].

    PubMed

    van Hout, H P J; Macneil Vroomen, J L; Van Mierlo, L D; Meiland, F J M; Moll van Charante, E P; Joling, K J; van den Dungen, P; Dröes, R M; van der Horst, H E; de Rooij, S E J A

    2014-04-01

Dementia care in The Netherlands is shifting from fragmented, ad hoc care to more coordinated and personalized care. Case management contributes to this shift. The linkage model and a combination of intensive case management and joint agency care models were selected based on their emerging prominence in The Netherlands. It is unclear whether these forms of case management are more effective than usual care in improving or preserving the functioning and well-being of patients and caregivers, and at what societal cost. The objective of this article is to describe the design of a study comparing these two case management care models against usual care. Clinical and cost outcomes are investigated, while care processes and the facilitators of and barriers to implementation of these models are considered. Mixed methods include a prospective, observational, controlled cohort study among persons with dementia and their primary informal caregivers in regions of The Netherlands with and without case management, including a qualitative process evaluation. Community-dwelling individuals with a dementia diagnosis and an informal caregiver are included. The primary outcome measure is the Neuropsychiatric Inventory for the people with dementia and the General Health Questionnaire for their caregivers. Costs are measured from a societal perspective. Semi-structured interviews with stakeholders, based on the theoretical model of adaptive implementation, are planned. In total, 521 pairs of persons with dementia and their primary informal caregivers were included and followed over two years. In the linkage model, substantially more impeding factors for implementation were identified than in the combined model. This article describes the design of an evaluation study of two case management models along with clinical and economic data from persons with dementia and caregivers. The impeding and facilitating factors differed substantially between the two models. 
Further results on cost-effectiveness are expected by the beginning of 2015. This is a Dutch adaptation of MacNeil Vroomen et al., Comparing Dutch case management care models for people with dementia and their caregivers: The design of the COMPAS study.

  6. ALL-PATHWAYS DOSE ANALYSIS FOR THE PORTSMOUTH ON-SITE WASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Phifer, M.

A Portsmouth On-Site Waste Disposal Facility (OSWDF) All-Pathways analysis has been conducted that considers the radiological impacts to a resident farmer. It is assumed that the resident farmer utilizes a farm pond contaminated by the OSWDF to irrigate a garden and pasture and to water livestock from which food for the resident farmer is obtained, and that the farmer utilizes groundwater from the Berea sandstone aquifer for domestic purposes (i.e., drinking water and showering). As described by FBP 2014b, the Hydrologic Evaluation of Landfill Performance (HELP) model (Schroeder et al. 1994) and the Subsurface Transport Over Multiple Phases (STOMP) model (White and Oostrom 2000, 2006) were used to model the flow and transport from the OSWDF to the Points of Assessment (POAs) associated with the 680-ft elevation sandstone layer (680 SSL) and the Berea sandstone aquifer. From this modeling, the activity concentrations of radionuclides were projected over time at the POAs. The activity concentrations were utilized as input to a GoldSimTM (GTG 2010) dose model, described herein, in order to project the dose to a resident farmer over time. A base case and five sensitivity cases were analyzed. The sensitivity cases included an evaluation of the impacts of using a conservative inventory, an uncased well to the Berea sandstone aquifer, a low waste-zone uranium distribution coefficient (Kd), different transfer factors, and reference-person exposure parameters (i.e., at the 95th percentile). The maximum base case dose within the 1,000-year assessment period was projected to be 1.5E-14 mrem/yr, and the maximum base case dose at any time less than 10,000 years was projected to be 0.002 mrem/yr. The maximum projected dose of any sensitivity case was approximately 2.6 mrem/yr, associated with the use of an uncased well to the Berea sandstone aquifer. 
This sensitivity case is considered very unlikely because it assumes leakage over time from the location of greatest concentration in the 680 SSL into the Berea sandstone aquifer and does not conform to standard private water well construction practices. The bottom line is that all predicted doses from the base case and the five sensitivity cases fall well below the DOE all-pathways 25 mrem/yr Performance Objective.

  7. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  8. Application research of 3D additive manufacturing technology in the nail shell

    NASA Astrophysics Data System (ADS)

    Xiao, Shanhua; Yan, Ruiqiang; Song, Ning

    2018-04-01

Based on an analysis of the hierarchical slicing algorithm, a 3D scan of an enterprise's nail-shell handle product is carried out, point-cloud data processing is performed on the source file, and the surface modeling and innovative design of the nail-shell handle are completed. Layered samples are 3D-printed on a MakerBot Replicator 2X printer, providing reverse-modeling and rapid-prototyping technical support for the development of the new nail product.

  9. Forecasting landslide activations by means of GA-SAKe. An example of application to three case studies in Calabria (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Iovine, Giulio G. R.; De Rango, Alessio; Gariano, Stefano L.; Terranova, Oreste G.

    2016-04-01

GA-SAKe - the genetic-algorithm-based release of the hydrological model SAKe (Self-Adaptive Kernel) - forecasts the timing of landslide activations [1, 2] based on dates of past activations and rainfall series. The model can be applied to either a single landslide or a set of similar landslides in a homogeneous context. Calibration of the model is performed through a genetic algorithm and provides families of optimal, discretized solutions (kernels) that maximize the fitness function. The mobility functions are obtained through convolution of the optimal kernels with the rain series. The shape of the kernel, including its base time, is related to the magnitude of the landslide and the hydro-geological complexity of the slope. Once validated, the model can be applied to estimate the timing of future landslide activations in the same study area, employing measured or forecasted rainfall. GA-SAKe is here employed to analyse the historical activations of three rock slides in Calabria (Southern Italy) that threaten villages and main infrastructure. In particular: 1) the Acri-Serra di Buda case, developed within a Sackung, involving weathered crystalline and metamorphic rocks; for this case study, 6 dates of activation are available; 2) the San Fili-Uncino case, developed in clay and conglomerate overlying gneiss and biotitic schist; for this case study, 7 dates of activation are available [2]; 3) the San Benedetto Ullano-San Rocco case, developed in weathered metamorphic rocks; for this case study, 3 dates of activation are available [1, 3, 4, 5]. The obtained results are quite promising, given the high performance of the model against slope movements characterized by numerous historical activations. The results, in terms of shape and base time of the kernels, are compared taking into account the types and sizes of the considered case studies and the rock types involved. References [1] Terranova O.G., Iaquinta P., Gariano S.L., Greco R. & Iovine G. 
(2013) In: Landslide Science and Practice, Margottini, Canuti, Sassa (Eds.), Vol. 3, pp.73-79. [2] Terranova O.G., Gariano S.L., Iaquinta P. & Iovine G.G.R. (2015). Geosci. Model Dev., 8, 1955-1978. [3] Iovine G., Iaquinta P. & Terranova O. (2009). In Anderssen, Braddock & Newham (Eds.), Proc. 18th World IMACS Congr. and MODSIM09 Int. Congr. on Modelling and Simulation, pp. 2686-2693. [4] Iovine G., Lollino P., Gariano S.L. & Terranova O.G. (2010). NHESS, 10, 2341-2354. [5] Capparelli G., Iaquinta P., Iovine G., Terranova O.G. & Versace P. (2012). Natural Hazards, 61(1), pp.247-256.

  10. Purpose, Processes, Partnerships, and Products: 4Ps to advance Participatory Socio-Environmental Modeling

    NASA Astrophysics Data System (ADS)

    Gray, S. G.; Voinov, A. A.; Jordan, R.; Paolisso, M.

    2016-12-01

    Model-based reasoning is a basic part of human understanding, decision-making, and communication. Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding environmental change, since stakeholders often hold valuable knowledge about socio-environmental dynamics and since collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches have grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework that includes reporting on: (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of environmental changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of environmental policy development. We suggest that our 4P framework and reporting approach provide a way for new hypotheses to be identified and tested in the growing field of PM.

  11. A Database-Based and Web-Based Meta-CASE System

    NASA Astrophysics Data System (ADS)

    Eessaar, Erki; Sgirka, Rünno

    Each Computer Aided Software Engineering (CASE) system supports a software process, or specific tasks and activities that are part of one. A meta-CASE system allows us to create new CASE systems. The creators of a new CASE system have to specify the abstract syntax of the language used in the system, as well as the functionality and non-functional properties of the new system. Many meta-CASE systems record their data directly in files. In this paper, we introduce a meta-CASE system whose enabling technology is an object-relational database management system (ORDBMS). The system allows users to manage specifications of languages and to create models by using these languages. The system has a web-based, form-based user interface. We have created a proof-of-concept prototype of the system using the PostgreSQL ORDBMS and the PHP scripting language.

  12. Quantitative analysis of factors that affect oil pipeline network accident based on Bayesian networks: A case study in China

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan

    2018-06-01

    Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although there are qualitative models of risk factors' effects, quantitative models still need to be developed. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. An incident evolution diagram is built to identify the risk factors, and the BN model is built from the deployment rule for factor nodes in a BN together with expert knowledge combined via Dempster-Shafer evidence theory. The probabilities of incident consequences and of risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for choosing optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
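The kind of quantitative estimate described above - probabilities of consequences given evidence about risk factors - can be illustrated with a toy Bayesian network evaluated by enumeration. The network structure and all conditional probabilities below are invented for illustration; they are not the paper's calibrated values.

```python
from itertools import product

# Hypothetical CPTs for a minimal pipeline-accident BN (all numbers illustrative):
# leak size L in {small, large}; ignition I in {no, yes}; consequence severity.
p_L = {"small": 0.8, "large": 0.2}
p_I = {"no": 0.7, "yes": 0.3}
p_severe = {  # P(consequence = "severe" | L, I)
    ("small", "no"): 0.01, ("small", "yes"): 0.20,
    ("large", "no"): 0.10, ("large", "yes"): 0.85,
}

def prob_severe(evidence=None):
    """Conditional probability of a severe consequence, by full enumeration."""
    evidence = evidence or {}
    num = den = 0.0
    for L, I in product(p_L, p_I):
        # Skip joint states inconsistent with the observed evidence.
        if evidence.get("L", L) != L or evidence.get("I", I) != I:
            continue
        w = p_L[L] * p_I[I]
        num += w * p_severe[(L, I)]
        den += w
    return num / den

print(prob_severe())                # marginal risk of a severe consequence
print(prob_severe({"L": "large"}))  # risk given evidence of a large leak
```

Real BN tools replace this brute-force enumeration with efficient inference, but the conditioning logic is the same.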

  13. Uncertainty and Variability in Physiologically-Based ...

    EPA Pesticide Factsheets

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physiologically-based pharmacokinetic models and their predictions for use in risk assessment.

  14. Violent reinjury risk assessment instrument (VRRAI) for hospital-based violence intervention programs.

    PubMed

    Kramer, Erik J; Dodington, James; Hunt, Ava; Henderson, Terrell; Nwabuo, Adaobi; Dicker, Rochelle; Juillard, Catherine

    2017-09-01

    Violent injury is the second most common cause of death among 15- to 24-year-olds in the US. Up to 58% of violently injured youth return to the hospital with a second violent injury. Hospital-based violence intervention programs (HVIPs) have been shown to reduce injury recidivism through intensive case management. However, no validated guidelines for risk assessment strategies in the HVIP setting have been reported. We aimed to use qualitative methods to investigate the key components of risk assessments employed by HVIP case managers and to propose a risk assessment model based on this qualitative analysis. An established academic hospital-affiliated HVIP served as the nexus for this research. Thematic saturation was reached with 11 semi-structured interviews and two focus groups conducted with HVIP case managers and key informants identified through snowball sampling. Interactions were analyzed by a four-member team using NVivo 10, employing the constant comparison method. The risk factors identified were used to create a set of models presented in two follow-up focus groups with HVIP case managers and leadership. Eighteen key themes within seven domains (environment, identity, mental health, behavior, conflict, indicators of lower risk, and case management) and 141 potential risk factors for use in the risk assessment framework were identified. The most salient factors were incorporated into eight models that were presented to the HVIP case managers. A 29-item algorithmic structured professional judgment model was chosen. We identified four tiers of risk factors for violent reinjury that were incorporated into a proposed risk assessment instrument, the VRRAI. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Leveraging the Zachman framework implementation using action - research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value for constructing an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology, based on action-research, for implementing the business, system and technology models of the Zachman framework, in order to assist and facilitate its implementation. Following an explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  16. Modeling the public health impact of malaria vaccines for developers and policymakers.

    PubMed

    Nunes, Julia K; Cárdenas, Vicky; Loucq, Christian; Maire, Nicolas; Smith, Thomas; Shaffer, Craig; Måseide, Kårstein; Brooks, Alan

    2013-07-01

    Efforts to develop malaria vaccines show promise. Mathematical model-based estimates of the potential demand, public health impact, and cost and financing requirements can be used to inform investment and adoption decisions by vaccine developers and policymakers on the use of malaria vaccines as complements to existing interventions. However, the complexity of such models may make their outputs inaccessible to non-modeling specialists. This paper describes a Malaria Vaccine Model (MVM) developed to address the specific needs of developers and policymakers, who need to access sophisticated modeling results and to test various scenarios in a user-friendly interface. The model's functionality is demonstrated through a hypothetical vaccine. The MVM has three modules: supply and demand forecast; public health impact; and implementation cost and financing requirements. These modules include pre-entered reference data and also allow for user-defined inputs. The model includes an integrated sensitivity analysis function. Model functionality was demonstrated by estimating the public health impact of a hypothetical pre-erythrocytic malaria vaccine with 85% efficacy against uncomplicated disease and a vaccine efficacy decay rate of four years, based on internationally-established targets. Demand for this hypothetical vaccine was estimated based on historical vaccine implementation rates for routine infant immunization in 40 African countries over a 10-year period. Assumed purchase price was $5 per dose and injection equipment and delivery costs were $0.40 per dose. The model projects the number of doses needed, uncomplicated and severe cases averted, deaths and disability-adjusted life years (DALYs) averted, and cost to avert each. In the demonstration scenario, based on a projected demand of 532 million doses, the MVM estimated that 150 million uncomplicated cases of malaria and 1.1 million deaths would be averted over 10 years. 
This is equivalent to 943 uncomplicated cases and 7 deaths averted per 1,000 vaccinees. In discounted 2011 US dollars, this represents $11 per uncomplicated case averted and $1,482 per death averted. If vaccine efficacy were reduced to 75%, the estimated uncomplicated cases and deaths averted over 10 years would decrease by 14% and 19%, respectively. The MVM can provide valuable information to assist decision-making by vaccine developers and policymakers, information which will be refined and strengthened as field studies progress allowing further validation of modeling assumptions.
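The cost-per-outcome arithmetic in such a module reduces, in its simplest undiscounted form, to dividing total program cost by outcomes averted. The sketch below uses made-up inputs; the MVM additionally discounts costs and outcomes over the 10-year horizon, so its published figures cannot be reproduced this directly.

```python
def cost_per_outcome(doses, price_per_dose, delivery_per_dose, outcomes_averted):
    """Total programme cost divided by outcomes averted (no discounting)."""
    total_cost = doses * (price_per_dose + delivery_per_dose)
    return total_cost / outcomes_averted

# Illustrative inputs only -- not the MVM's discounted 2011-USD figures.
print(cost_per_outcome(doses=1_000_000, price_per_dose=5.0,
                       delivery_per_dose=0.40, outcomes_averted=300_000))
```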

  17. Fuzzy Temporal Logic Based Railway Passenger Flow Forecast Model

    PubMed Central

    Dou, Fei; Jia, Limin; Wang, Li; Xu, Jie; Huang, Yakun

    2014-01-01

    Passenger flow forecasting is of essential importance to the organization of railway transportation and is one of the most important bases for decisions on transportation patterns and train operation planning. Passenger flow on high-speed railways features quasi-periodic variations over short times and complex nonlinear fluctuation because of the existence of many influencing factors. In this study, a fuzzy temporal logic based passenger flow forecast model (FTLPFFM), built on fuzzy logical relationship recognition techniques, is presented that predicts short-term passenger flow for high-speed railways with significantly improved forecast accuracy. An applied case using real-world data illustrates the precision and accuracy of FTLPFFM. For this applied case, the proposed model performs better than the k-nearest neighbor (KNN) and autoregressive integrated moving average (ARIMA) models. PMID:25431586
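A minimal first-order fuzzy time-series forecaster conveys the flavor of such a model: observations are fuzzified into intervals, fuzzy logical relationships between consecutive states are recorded, and the forecast is defuzzified from the successor group of the last state. This is a generic simplification, not the FTLPFFM itself, and the flow values are synthetic.

```python
import numpy as np

def fuzzy_forecast(series, n_intervals=4):
    """One-step forecast from first-order fuzzy logical relationships."""
    lo, hi = min(series), max(series)
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # Fuzzify: label each observation with its interval of maximum membership.
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1,
                     0, n_intervals - 1)
    # Fuzzy logical relationship groups: A_i -> {A_j observed to follow A_i}.
    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(a, []).append(b)
    # Defuzzify: forecast as the mean midpoint of the successor group.
    successors = groups.get(labels[-1], [labels[-1]])
    return float(np.mean(mids[successors]))

flow = [120, 135, 150, 140, 125, 138, 152, 141]  # synthetic passenger counts
print(fuzzy_forecast(flow))
```

The paper's model layers temporal-logic relationships and quasi-periodic structure on top of this basic fuzzify/relate/defuzzify cycle.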

  18. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, the transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transition, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  19. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-06-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs representing conditions as they occurred during August through September 2006, and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between 2, 4 and 12 km resolution runs, but 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements of the Clean Air Act, the uncertainty associated with human health impacts and therefore the results reported in this study, we conclude that health impacts calculated from population weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2 and 4 km resolution. On average, when modeling at 36 km resolution, 7 deaths per ozone month were avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval was 2-9). 
When modeling at finer resolution (2, 4, or 12 km), on average 5 deaths were avoided due to the same reductions (95% confidence interval was 2-7). Initial results for this specific region show that modeling at a resolution finer than 12 km is unlikely to reduce uncertainty in benefits analysis. We suggest that 12 km resolution may be appropriate for uncertainty analyses in areas with similar chemistry, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
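The health-impact chain described above - population-weighting modeled ozone reductions and passing them through a concentration-response function - can be sketched in a few lines. All populations, ozone reductions, and coefficients below are illustrative, not the study's values.

```python
import numpy as np

# Illustrative grid: population and modeled ozone reduction per cell.
pop = np.array([2.0e5, 5.0e5, 1.0e5])      # people per grid cell
delta_o3 = np.array([4.0, 6.0, 2.0])       # ppb reduction per cell

beta = 0.00052          # hypothetical concentration-response coefficient per ppb
baseline_rate = 8.0e-5  # hypothetical baseline mortality per person per season

# Population-weighted ozone reduction (ppb).
pw_delta = float(np.sum(pop * delta_o3) / np.sum(pop))

# Avoided deaths via a log-linear concentration-response function, summed over cells.
avoided = float(np.sum(baseline_rate * (1.0 - np.exp(-beta * delta_o3)) * pop))
print(round(pw_delta, 2), round(avoided, 3))
```

Coarser model resolution changes `delta_o3` per cell (and hence the population weighting), which is how grid size propagates into the health-impact estimate.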

  20. Diagnosis of Parkinsonian disorders using a channelized Hotelling observer model: Proof of principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bal, H.; Bal, G.; Acton, P. D.

    2007-10-15

    Imaging dopamine transporters using PET and SPECT probes is a powerful technique for the early diagnosis of Parkinsonian disorders. In order to perform automated, accurate diagnosis of these diseases, a channelized Hotelling observer (CHO) based model was developed and evaluated using the SPECT tracer [Tc-99m]TRODAT-1. Computer simulations were performed using a digitized striatal phantom to characterize early stages of the disease (20 lesion-present cases with varying lesion size and contrast). Projection data, modeling the effects of attenuation and geometric response function, were obtained for each case. Statistical noise levels corresponding to those observed clinically were added to the projection data to obtain 100 noise realizations for each case. All the projection data were reconstructed, and a subset of the transaxial slices containing the striatum was summed and used for further analysis. CHO models, using the Laguerre-Gaussian functions as channels, were designed for two cases: (1) by training the model using individual lesion-present samples and (2) by training the model using pooled lesion-present samples. A decision threshold obtained for each CHO model was used to classify the study population (n=40). It was observed that CHO models trained on individual lesions gave high diagnostic accuracy for lesions larger than those used to train the model, and vice versa. On the other hand, the pooled CHO model was found to give high diagnostic accuracy for all the lesion cases (average diagnostic accuracy = 0.95±0.07; p<0.0001, Fisher's exact test). Based on our results, we conclude that a CHO model has the potential to provide early and accurate diagnosis of Parkinsonian disorders, thereby improving patient management.
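A channelized Hotelling observer reduces each image to a small vector of channel outputs and applies the Hotelling template w = S^-1 (v1_bar - v0_bar), where S is the pooled channel covariance. The sketch below skips the Laguerre-Gaussian channelization and draws synthetic channel outputs directly (an assumed simplification); the signal amplitude, covariance, and sample sizes are invented.

```python
import numpy as np

# Simulated channel outputs for signal-absent and signal-present classes.
rng = np.random.default_rng(0)
n_ch, n = 6, 200
mean_absent = np.zeros(n_ch)
mean_present = np.full(n_ch, 0.8)   # hypothetical lesion signal in channel space
cov = np.eye(n_ch)

absent = rng.multivariate_normal(mean_absent, cov, n)
present = rng.multivariate_normal(mean_present, cov, n)

# Hotelling template: w = S^-1 (v1_bar - v0_bar), with pooled channel covariance S.
s = 0.5 * (np.cov(absent.T) + np.cov(present.T))
w = np.linalg.solve(s, present.mean(0) - absent.mean(0))

# Decision variable t = w . v, thresholded at the midpoint of the class means.
t0, t1 = absent @ w, present @ w
thr = 0.5 * (t0.mean() + t1.mean())
accuracy = 0.5 * ((t0 < thr).mean() + (t1 >= thr).mean())
print(round(float(accuracy), 3))
```

In the real pipeline, `absent` and `present` would be the Laguerre-Gaussian channel responses of the reconstructed striatal slices.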

  1. Spatial Pattern of Cell Damage in Tissue from Heavy Ions

    NASA Technical Reports Server (NTRS)

    Ponomarev, Artem L.; Huff, Janice L.; Cucinotta, Francis A.

    2007-01-01

    A new Monte Carlo algorithm was developed that can model the passage of heavy ions through tissue, and their action on the cellular matrix, for 2- or 3-dimensional cases. The build-up of secondaries such as projectile fragments, target fragments, other light fragments, and delta-rays was simulated. In one example, cells were modeled as a cell culture monolayer, with the data taken directly from microscopy (2-d cell matrix). A simple model of tissue was given as abstract spheres closely approximating real cell geometries (3-d cell matrix), and a realistic model of tissue based on microscopy images was also proposed. Image segmentation was used to identify cells in an irradiated cell culture monolayer or in slices of tissue. The cells were then inserted into the model box pixel by pixel. In the case of cell monolayers (2-d), the image size may exceed the modeled box size; such an image is moved with respect to the box in order to sample as many cells as possible. In the case of the simple tissue (3-d), the tissue box is modeled with periodic boundary conditions, which extrapolate the technique to macroscopic volumes of tissue. For real tissue, specific spatial patterns of cell apoptosis and necrosis are expected. The cell patterns were modeled using action cross sections for apoptosis and necrosis estimated from BNL data and other experimental data.

  2. Applying Service-Oriented Architecture on The Development of Groundwater Modeling Support System

    NASA Astrophysics Data System (ADS)

    Li, C. Y.; WANG, Y.; Chang, L. C.; Tsai, J. P.; Hsiao, C. T.

    2016-12-01

    Groundwater simulation has become an essential step in groundwater resources management and assessment. There are many stand-alone pre- and post-processing software packages that alleviate the model simulation workload, but stand-alone software neither supports centralized management of data and simulation results nor provides network sharing functions. Hence, it is difficult to share and reuse the data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater modeling support system to assist model construction. The system is based on a service-oriented architecture and allows remote users to develop their modeling cases over the internet. The data and cases (knowledge) are thus easy to manage centrally. The modeling engine of the system is MODFLOW, the most widely used groundwater model in the world. The system provides a data warehouse to store groundwater observations, along with a MODFLOW Support Service, a MODFLOW Input File & Shapefile Convert Service, a MODFLOW Service, and an Expert System Service to assist researchers in building models. Since the system architecture is service-oriented, it is scalable and flexible, and it can easily be extended with scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.

  3. Mapping environmental susceptibility to Saint Louis encephalitis virus, based on a decision tree model of remotely-sensed data.

    PubMed

    Rotela, Camilo H; Spinsanti, Lorena I; Lamfri, Mario A; Contigiani, Marta S; Almirón, Walter R; Scavuzzo, Carlos M

    2011-11-01

    In response to the first human outbreak (January-May 2005) of Saint Louis encephalitis (SLE) virus in Córdoba province, Argentina, we developed an environmental SLE virus risk map for the capital, i.e. Córdoba city. The aim was to provide a map capable of detecting macro-environmental factors associated with the spatial distribution of SLE cases, based on remotely sensed data and a geographical information system. Vegetation, soil brightness, humidity status, distances to water bodies, and areas covered by vegetation were assessed based on pre-outbreak images provided by the Landsat 5TM satellite. A strong inverse relationship between the number of humans infected by SLEV and distance to high-vigor vegetation was noted. A statistical non-hierarchic decision tree model was constructed, based on environmental variables representing the areas surrounding patient residences. From this point of view, 18% of the city could be classified as being at high risk of SLEV infection, while 34% carried a low risk or none at all. Taking the whole 2005 epidemic into account, 80% of the cases came from areas classified by the model as medium-high or high risk. Almost 46% of the cases were registered in high-risk areas, while there were no cases (0%) in areas classified as risk-free.
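A decision tree of this kind classifies locations into risk strata by thresholding environmental variables. The two-level tree below is purely illustrative: the variables, thresholds, and labels are hypothetical, not the fitted model from the study.

```python
# Hypothetical two-level decision tree in the spirit of the risk map; the
# variables, thresholds, and risk labels are illustrative, not the fitted model.
def sle_risk(dist_to_veg_m, dist_to_water_m):
    """Classify a location's SLEV risk from two environmental distances (m)."""
    if dist_to_veg_m < 250:                      # near high-vigor vegetation
        return "high" if dist_to_water_m < 500 else "medium-high"
    return "low" if dist_to_veg_m > 1000 else "medium"

print(sle_risk(100, 300), sle_risk(1500, 300))
```

Applied cell-by-cell to a raster of remotely sensed layers, such a function yields the categorical risk map described above.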

  4. Case-Deletion Diagnostics for Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Lu, Bin

    2003-01-01

    In this article, a case-deletion procedure is proposed to detect influential observations in a nonlinear structural equation model. The key idea is to develop the diagnostic measures based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm. A one-step pseudo approximation is proposed to reduce the…

  5. Innovating Education with an Educational Modeling Language: Two Case Studies

    ERIC Educational Resources Information Center

    Sloep, Peter B.; van Bruggen, Jan; Tattersall, Colin; Vogten, Hubert; Koper, Rob; Brouns, Francis; van Rosmalen, Peter

    2006-01-01

    The intent of this study was to investigate how to maximize the chances of success of an educational innovation--specifically one based on the implementation of the educational modeling language called EML. This language is both technically and organizationally demanding. Two different implementation cases were investigated, one situated in an…

  6. Technique for ranking potential predictor layers for use in remote sensing analysis

    Treesearch

    Andrew Lister; Mike Hoppus; Rachel Riemann

    2004-01-01

    Spatial modeling using GIS-based predictor layers often requires that extraneous predictors be culled before conducting analysis. In some cases, using extraneous predictor layers might improve model accuracy, but at the expense of increased complexity and reduced interpretability. In other cases, using extraneous layers can dilute the relationship between predictors and target...

  7. A Different Call to Arms: Women in the Core of the Communications Revolution.

    ERIC Educational Resources Information Center

    Rush, Ramona R.

    A "best case" model for the role of women in the postindustrial communications era predicts positive leadership roles based on the preindustrial work characteristics of cooperation and consensus. A "worst case" model finds women entrepreneurs succumbing to the competitive male ethos and extracting the maximum amount of work…

  8. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  9. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  10. Modeling subjective evaluation of soundscape quality in urban open spaces: An artificial neural network approach.

    PubMed

    Yu, Lei; Kang, Jian

    2009-09-01

    This research aims to explore the feasibility of using computer-based models to predict the soundscape quality evaluation of potential users in urban open spaces at the design stage. With the data from large scale field surveys in 19 urban open spaces across Europe and China, the importance of various physical, behavioral, social, demographical, and psychological factors for the soundscape evaluation has been statistically analyzed. Artificial neural network (ANN) models have then been explored at three levels. It has been shown that for both subjective sound level and acoustic comfort evaluation, a general model for all the case study sites is less feasible due to the complex physical and social environments in urban open spaces; models based on individual case study sites perform well but the application range is limited; and specific models for certain types of location/function would be reliable and practical. The performance of acoustic comfort models is considerably better than that of sound level models. Based on the ANN models, soundscape quality maps can be produced and this has been demonstrated with an example.
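A small feed-forward network of the sort used for such prediction can be sketched in plain NumPy: two input predictors, one tanh hidden layer, and a linear output trained by batch gradient descent on a synthetic comfort score. The features, target function, and architecture are stand-ins, not those of the study.

```python
import numpy as np

# Synthetic data: two predictors -> "acoustic comfort" score in (0, 1).
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, (200, 2))
y = 1.0 / (1.0 + np.exp(3.0 * X[:, 0] - X[:, 1]))  # invented target function

# One hidden tanh layer (8 units), linear output, batch gradient descent.
w1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
w2 = rng.normal(0.0, 0.5, 8);      b2 = 0.0
lr, n = 0.1, len(y)
for _ in range(5000):
    h = np.tanh(X @ w1 + b1)          # hidden activations
    err = h @ w2 + b2 - y             # prediction error
    # Backpropagated gradients of the mean-squared error.
    gh = np.outer(err, w2) * (1.0 - h ** 2)
    w2 -= lr * (h.T @ err) / n; b2 -= lr * err.mean()
    w1 -= lr * (X.T @ gh) / n;  b1 -= lr * gh.mean(0)

mse = float(np.mean((np.tanh(X @ w1 + b1) @ w2 + b2 - y) ** 2))
print(round(mse, 4))
```

In the study's setting the inputs would be the survey-derived physical, behavioral, and social factors, and separate networks would be fitted per location type.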

  11. Contributions of the Model of Modelling Diagram to the Learning of Ionic Bonding: Analysis of a Case Study

    ERIC Educational Resources Information Center

    Mendonca, Paula Cristina Cardoso; Justi, Rosaria

    2011-01-01

    Current proposals for science education recognise the importance of students' involvement in activities aimed at favouring the understanding of science as a human, dynamic and non-linear construct. Modelling-based teaching is one of the alternatives through which to address such issues. Modelling-based teaching activities for ionic bonding were…

  12. Semantics-Based Composition of Integrated Cardiomyocyte Models Motivated by Real-World Use Cases.

    PubMed

    Neal, Maxwell L; Carlson, Brian E; Thompson, Christopher T; James, Ryan C; Kim, Karam G; Tran, Kenneth; Crampin, Edmund J; Cook, Daniel L; Gennari, John H

    2015-01-01

    Semantics-based model composition is an approach for generating complex biosimulation models from existing components that relies on capturing the biological meaning of model elements in a machine-readable fashion. This approach allows the user to work at the biological rather than computational level of abstraction and helps minimize the amount of manual effort required for model composition. To support this compositional approach, we have developed the SemGen software, and here report on SemGen's semantics-based merging capabilities using real-world modeling use cases. We successfully reproduced a large, manually-encoded, multi-model merge: the "Pandit-Hinch-Niederer" (PHN) cardiomyocyte excitation-contraction model, previously developed using CellML. We describe our approach for annotating the three component models used in the PHN composition and for merging them at the biological level of abstraction within SemGen. We demonstrate that we were able to reproduce the original PHN model results in a semi-automated, semantics-based fashion and also rapidly generate a second, novel cardiomyocyte model composed using an alternative, independently-developed tension generation component. We discuss the time-saving features of our compositional approach in the context of these merging exercises, the limitations we encountered, and potential solutions for enhancing the approach.

  13. Semantics-Based Composition of Integrated Cardiomyocyte Models Motivated by Real-World Use Cases

    PubMed Central

    Neal, Maxwell L.; Carlson, Brian E.; Thompson, Christopher T.; James, Ryan C.; Kim, Karam G.; Tran, Kenneth; Crampin, Edmund J.; Cook, Daniel L.; Gennari, John H.

    2015-01-01

    Semantics-based model composition is an approach for generating complex biosimulation models from existing components that relies on capturing the biological meaning of model elements in a machine-readable fashion. This approach allows the user to work at the biological rather than computational level of abstraction and helps minimize the amount of manual effort required for model composition. To support this compositional approach, we have developed the SemGen software, and here report on SemGen’s semantics-based merging capabilities using real-world modeling use cases. We successfully reproduced a large, manually-encoded, multi-model merge: the “Pandit-Hinch-Niederer” (PHN) cardiomyocyte excitation-contraction model, previously developed using CellML. We describe our approach for annotating the three component models used in the PHN composition and for merging them at the biological level of abstraction within SemGen. We demonstrate that we were able to reproduce the original PHN model results in a semi-automated, semantics-based fashion and also rapidly generate a second, novel cardiomyocyte model composed using an alternative, independently-developed tension generation component. We discuss the time-saving features of our compositional approach in the context of these merging exercises, the limitations we encountered, and potential solutions for enhancing the approach. PMID:26716837

  14. Ontology-Based Method for Fault Diagnosis of Loaders.

    PubMed

    Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-02-28

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) in order to cover the shortcomings of the CBR method due to the lack of relevant cases, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study.
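    The four-step CBR cycle in component (2) can be sketched in miniature. The loader symptoms, similarity weights, and matching threshold below are invented for illustration and do not come from the paper:

```python
# Minimal case-based reasoning (CBR) loop for fault diagnosis.
# Features, weights, threshold, and diagnoses are hypothetical.

def similarity(query, case, weights):
    """Weighted fraction of matching symptom features."""
    total = sum(weights.values())
    score = sum(w for f, w in weights.items() if query.get(f) == case["features"].get(f))
    return score / total

def retrieve(query, case_base, weights, threshold=0.75):
    """Steps 2-3: case retrieval and case matching."""
    best = max(case_base, key=lambda c: similarity(query, c, weights))
    sim = similarity(query, best, weights)
    return (best, sim) if sim >= threshold else (None, sim)

def update(case_base, query, diagnosis):
    """Step 4: case updating - store the newly solved case."""
    case_base.append({"features": dict(query), "diagnosis": diagnosis})

case_base = [
    {"features": {"oil_pressure": "low", "noise": "knocking"}, "diagnosis": "worn bearing"},
    {"features": {"oil_pressure": "normal", "noise": "whine"}, "diagnosis": "pump cavitation"},
]
weights = {"oil_pressure": 2.0, "noise": 1.0}

case, sim = retrieve({"oil_pressure": "low", "noise": "knocking"}, case_base, weights)
print(case["diagnosis"], round(sim, 2))  # -> worn bearing 1.0
```

    When no stored case clears the threshold, a real system would fall back to the RBR rules of component (3); here `retrieve` simply signals the miss by returning `None`.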

  15. Ontology-Based Method for Fault Diagnosis of Loaders

    PubMed Central

    Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-01-01

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) in order to cover the shortcomings of the CBR method due to the lack of relevant cases, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study. PMID:29495646

  16. Development of Airport Noise Mapping using Matlab Software (Case Study: Adi Soemarmo Airport - Boyolali, Indonesia)

    NASA Astrophysics Data System (ADS)

    Andarani, Pertiwi; Setiyo Huboyo, Haryono; Setyanti, Diny; Budiawan, Wiwik

    2018-02-01

    Noise is considered one of the main environmental impacts of Adi Soemarmo International Airport (ASIA), the second largest airport in Central Java Province, Indonesia. In order to manage airport noise, airport noise mapping is necessary. However, a model that requires only simple input yet remains reliable was not available at ASIA. Therefore, the objectives of this study are to develop a model using Matlab software, to verify its reliability by measuring actual noise exposure, and to analyze the areas of each noise level. The model was developed based on interpolation or extrapolation of identified Noise-Power-Distance (NPD) data. In accordance with Indonesian Government Ordinance No. 40/2012, the noise metric used is WECPNL (Weighted Equivalent Continuous Perceived Noise Level). Based on this model simulation, there are residential areas in the regions of noise levels II (1.912 km2) and III (1.16 km2) and 18 school buildings in the areas of noise levels I, II, and III. These land uses are actually prohibited unless noise insulation is installed. The model using Matlab in the case of Adi Soemarmo International Airport is valid based on comparison with field measurements (6 sampling points). However, it is important to validate the model again if it is applied to a different airport.
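    The interpolation of NPD data that the model is built on can be sketched as follows; the table values are illustrative, not the actual NPD data or the WECPNL aggregation used for Adi Soemarmo:

```python
import math

# Hypothetical Noise-Power-Distance (NPD) table for one engine power setting:
# (slant distance in metres, noise level in dB). Values are illustrative only.
NPD = [(200, 94.0), (400, 88.0), (800, 82.0), (1600, 76.0), (3200, 70.0)]

def npd_level(distance):
    """Interpolate (or extrapolate) the noise level at a slant distance,
    linearly in log10(distance), as is conventional for NPD data."""
    xs = [math.log10(d) for d, _ in NPD]
    ys = [lvl for _, lvl in NPD]
    x = math.log10(distance)
    # pick the bracketing segment (the outermost segment handles extrapolation)
    for i in range(len(xs) - 1):
        if x <= xs[i + 1] or i == len(xs) - 2:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])

print(round(npd_level(400), 1))    # on a table point -> 88.0
print(round(npd_level(565.7), 1))  # log-midpoint of 400 m and 800 m -> 85.0
```

    A full noise map would evaluate this per flight event over a grid and then combine events into the WECPNL metric with day/evening/night weightings.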

  17. Multipole-Based Cable Braid Electromagnetic Penetration Model: Electric Penetration Case

    DOE PAGES

    Campione, Salvatore; Warne, Larry K.; Langston, William L.; ...

    2017-07-11

    In this paper, we investigate the electric penetration case of the first-principles multipole-based cable braid electromagnetic penetration model reported in Progress in Electromagnetics Research B 66, 63–89 (2016). We first analyze the case of a 1-D array of wires: this is a problem which is interesting on its own, and we report its modeling based on a multipole-conformal mapping expansion and extension by means of Laplace solutions in bipolar coordinates. We then compare the elastance (inverse of capacitance) results from our first-principles cable braid electromagnetic penetration model to those obtained using the multipole-conformal mapping bipolar solution. These results are found to be in good agreement up to a radius to half-spacing ratio of 0.6, demonstrating a robustness needed for many commercial cables. We then analyze realistic cable implementations without dielectrics and compare the results from our first-principles braid electromagnetic penetration model to the semiempirical results reported by Kley in the IEEE Transactions on Electromagnetic Compatibility 35, 1–9 (1993). Finally, although we find results on the same order of magnitude as Kley's results, the full dependence on the actual cable geometry is accounted for only in our proposed multipole model, which, in addition, enables us to treat perturbations from those commercial cables measured.

  18. Multipole-Based Cable Braid Electromagnetic Penetration Model: Electric Penetration Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campione, Salvatore; Warne, Larry K.; Langston, William L.

    In this paper, we investigate the electric penetration case of the first-principles multipole-based cable braid electromagnetic penetration model reported in Progress in Electromagnetics Research B 66, 63–89 (2016). We first analyze the case of a 1-D array of wires: this is a problem which is interesting on its own, and we report its modeling based on a multipole-conformal mapping expansion and extension by means of Laplace solutions in bipolar coordinates. We then compare the elastance (inverse of capacitance) results from our first-principles cable braid electromagnetic penetration model to those obtained using the multipole-conformal mapping bipolar solution. These results are found to be in good agreement up to a radius to half-spacing ratio of 0.6, demonstrating a robustness needed for many commercial cables. We then analyze realistic cable implementations without dielectrics and compare the results from our first-principles braid electromagnetic penetration model to the semiempirical results reported by Kley in the IEEE Transactions on Electromagnetic Compatibility 35, 1–9 (1993). Finally, although we find results on the same order of magnitude as Kley's results, the full dependence on the actual cable geometry is accounted for only in our proposed multipole model, which, in addition, enables us to treat perturbations from those commercial cables measured.

  19. Using Object Oriented Bayesian Networks to Model Linkage, Linkage Disequilibrium and Mutations between STR Markers

    PubMed Central

    Kling, Daniel; Egeland, Thore; Mostad, Petter

    2012-01-01

    In a number of applications there is a need to determine the most likely pedigree for a group of persons based on genetic markers. Adequate models are needed to reach this goal. The markers used to perform the statistical calculations can be linked and there may also be linkage disequilibrium (LD) in the population. The purpose of this paper is to present a graphical Bayesian Network framework to deal with such data. Potential LD is normally ignored and it is important to verify that the resulting calculations are not biased. Even if linkage does not influence results for regular paternity cases, it may have substantial impact on likelihood ratios involving other, more extended pedigrees. Models for LD influence likelihoods for all pedigrees to some degree and an initial estimate of the impact of ignoring LD and/or linkage is desirable, going beyond mere rules of thumb based on marker distance. Furthermore, we show how one can readily include a mutation model in the Bayesian Network; extending other programs or formulas to include such models may require considerable amounts of work and will in many cases not be practical. As an example, we consider the two STR markers vWa and D12S391. We estimate probabilities for population haplotypes to account for LD using a method based on data from trios, while an estimate for the degree of linkage is taken from the literature. The results show that accounting for haplotype frequencies is unnecessary in most cases for this specific pair of markers. When doing calculations on regular paternity cases, the markers can be considered statistically independent. In more complex cases of disputed relatedness, for instance cases involving siblings or so-called deficient cases, or when small differences in the LR matter, independence should not be assumed. (The networks are freely available at http://arken.umb.no/~dakl/BayesianNetworks.) PMID:22984448

  20. Corrosion-Fatigue Crack Growth in Plates: A Model Based on the Paris Law

    PubMed Central

    Toribio, Jesús; Matos, Juan-Carlos; González, Beatriz

    2017-01-01

    In this paper, a Paris law-based model is presented whereby crack propagation occurs under cyclic loading in air (fatigue) and in an aggressive environment (corrosion-fatigue) for the case of corner cracks (with a wide range of initial-crack aspect ratios) in finite-thickness plates of 316L austenitic stainless steel subjected to tension, bending, or combined (tension + bending) loading. Results show that the cracks tend during their growth towards a preferential propagation path, exhibiting aspect ratios slightly lower than unity only for the case of very shallow cracks, and diminishing as the crack grows (increasing the relative crack depth), more intensely in the case of bending than in the case of tension (the mixed tension/bending loading representing an intermediate case). In addition, the crack aspect ratios during propagation are lower in fatigue (in air) than in corrosion-fatigue (in an aggressive environment). PMID:28772798
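    A minimal numerical sketch of Paris-law life integration is shown below. The constants C and m, the geometry factor, and the stress range are generic placeholder values, not the fitted 316L parameters or the corner-crack geometry of the paper:

```python
import math

# Fatigue-life sketch from the Paris law, da/dN = C * (dK)^m, with
# dK = Y * dsigma * sqrt(pi * a). All constants below are generic
# illustrative values in MPa*sqrt(m) units, not fitted material data.
C, m = 1e-11, 3.0          # Paris constants
Y = 1.12                   # geometry factor (edge-crack approximation)
dsigma = 100.0             # stress range, MPa

def cycles_to_grow(a0, af, steps=10000):
    """Integrate dN = da / (C * dK^m) from a0 to af with the midpoint rule."""
    da = (af - a0) / steps
    N = 0.0
    for i in range(steps):
        a = a0 + (i + 0.5) * da
        dK = Y * dsigma * math.sqrt(math.pi * a)
        N += da / (C * dK ** m)
    return N

N = cycles_to_grow(1e-3, 10e-3)  # grow from 1 mm to 10 mm
print(f"{N:.3e} cycles")
```

    The paper's model additionally tracks the two crack-front points of a corner crack separately (hence the evolving aspect ratio) and uses environment-dependent Paris constants for the corrosion-fatigue case.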

  1. Economic impacts of a transition to higher oil prices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tessmer, Jr, R. G.; Carhart, S. C.; Marcuse, W.

    1978-06-01

    Economic impacts of sharply higher oil and gas prices in the eighties are estimated using a combination of optimization and input-output models. A 1985 Base Case is compared with a High Case in which crude oil and crude natural gas are, respectively, 2.1 and 1.4 times as expensive as in the Base Case. Impacts examined include delivered energy prices and demands, resource consumption, emission levels and costs, aggregate and compositional changes in gross national product, balance of payments, output, employment, and sectoral prices. Methodology is developed for linking models in both quantity and price space for energy service-specific fuel demands. A set of energy demand elasticities is derived which is consistent between alternative 1985 cases and between the 1985 cases and an historical year (1967). A framework and methodology are also presented for allocating portions of the DOE Conservation budget according to broad policy objectives and allocation rules.

  2. Conceptualizing race in economic models of medical utilization: a case study of community-based elders and the emergency room.

    PubMed Central

    White-Means, S I

    1995-01-01

    There is no consensus on the appropriate conceptualization of race in economic models of health care. This is because race is rarely the primary focus for analysis of the market. This article presents an alternative framework for conceptualizing race in health economic models. A case study is analyzed to illustrate the value of the alternative conceptualization. The case study findings clearly document the importance of model stratification according to race. Moreover, the findings indicate that empirical results are improved when medical utilization models are refined in a way that reflects the unique experiences of the population that is studied. PMID:7721593

  3. Modeled Forecasts of Dengue Fever in San Juan, Puerto Rico Using NASA Satellite Enhanced Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.

    2015-12-01

    Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.
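    The weekly re-parameterization loop can be caricatured as follows. The one-parameter "model" is a deliberate toy stand-in for the process-based DF model, and all numbers are invented; only the structure (refit on observed weeks, then forecast the next week from forecast weather) mirrors the description above:

```python
# Skeleton of the weekly forecast-update cycle. The dengue "model" here is
# a toy placeholder with a single temperature-dependent growth parameter;
# the real process-based model and its WRF weather inputs are far richer.
def toy_df_model(cases, temp_c, beta):
    """Toy weekly update: warmer weeks amplify transmission."""
    return max(0.0, cases * (1.0 + beta * (temp_c - 25.0) / 10.0))

def fit_beta(obs_cases, obs_temps):
    """Re-parameterize beta each week by 1-D grid search on past weeks."""
    def sse(beta):
        return sum((toy_df_model(c, t, beta) - nxt) ** 2
                   for c, t, nxt in zip(obs_cases, obs_temps, obs_cases[1:]))
    return min((b / 100.0 for b in range(0, 201)), key=sse)

obs_cases = [10.0, 12.0, 15.0, 18.0]   # invented weekly case counts
obs_temps = [27.0, 28.0, 27.5, 28.5]   # invented observed temperatures
beta = fit_beta(obs_cases, obs_temps)
forecast = toy_df_model(obs_cases[-1], 29.0, beta)  # forecast temp for next week
print(round(beta, 2), round(forecast, 1))
```

    Each week the loop repeats: the newly observed count is appended, beta is refit, and the next forecast uses the latest forecast weather rather than observed weather.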

  4. Vector-model-supported approach in prostate plan optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model base, retrieving similar radiotherapy cases, was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iteration with vector-model-supported optimization by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration number without compromising the plan quality.
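    The retrieval step of a vector-model-supported approach can be sketched as a nearest-neighbour search over case feature vectors. The features and values below are hypothetical, not the DICOM-derived structural and physiologic features used in the study:

```python
import math

# Sketch of vector-model retrieval: each treated case is reduced to a
# feature vector (here, made-up geometric features), and the most similar
# past case supplies the starting planning parameters. In practice the
# features should be normalized so no single scale dominates the distance.
reference_db = {
    "case_001": [80.0, 0.12, 0.30],   # [target volume cc, overlap A, overlap B]
    "case_002": [45.0, 0.25, 0.18],
    "case_003": [82.0, 0.10, 0.28],
}

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def most_similar(query):
    """Return the reference case closest to the query features."""
    return min(reference_db, key=lambda k: distance(query, reference_db[k]))

print(most_similar([79.0, 0.11, 0.29]))  # -> case_001
```

    The retrieved case's planning parameters then seed the optimizer, replacing the trial-and-error warm-up phase that the study reports cutting by almost half.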

  5. Health care use and costs of adverse drug events emerging from outpatient treatment in Germany: a modelling approach.

    PubMed

    Stark, Renee G; John, Jürgen; Leidl, Reiner

    2011-01-13

    This study's aim was to develop a first quantification of the frequency and costs of adverse drug events (ADEs) originating in ambulatory medical practice in Germany. The frequencies and costs of ADEs were quantified for a base case, building on an existing cost-of-illness model for ADEs. The model originates from the U.S. health care system; its structure of treatment probabilities linked to ADEs was transferred to Germany. Sensitivity analyses based on values determined from a literature review were used to test the postulated results. For Germany, the base case postulated that about 2 million adults ingesting medications will have an ADE in 2007. Health care costs related to ADEs in this base case totalled 816 million Euros; mean costs per case were 381 Euros. About 58% of costs resulted from hospitalisations, 11% from emergency department visits and 21% from long-term care. Base case estimates of frequency and costs of ADEs were lower than all estimates of the sensitivity analyses. The postulated frequency and costs of ADEs illustrate the possible size of the health problems and economic burden related to ADEs in Germany. The validity of the U.S. treatment structure used remains to be determined for Germany. The sensitivity analysis used assumptions from different studies and thus further quantified the information gap in Germany regarding ADEs. This study found costs of ADEs in the ambulatory setting in Germany to be significant. Due to data scarcity, results are only a rough indication.
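    The structure of such a cost-of-illness calculation (ADE cases split across treatment settings, each with a use probability and a unit cost, then summed) can be sketched as follows. The probabilities and unit costs are invented placeholders chosen only to roughly mirror the reported cost shares, not the study's actual inputs:

```python
# Back-of-envelope structure of the base-case cost model. All probabilities
# and unit costs below are invented for illustration.
ade_cases = 2_000_000
settings = {                      # (probability of use per ADE case, cost per use in EUR)
    "hospitalisation":  (0.040, 5800.0),
    "emergency_visit":  (0.150,  300.0),
    "long_term_care":   (0.004, 21000.0),
    "other":            (0.500,   82.0),
}

total = sum(ade_cases * p * c for p, c in settings.values())
per_case = total / ade_cases
shares = {s: ade_cases * p * c / total for s, (p, c) in settings.items()}

print(f"total EUR {total/1e6:.0f}M, per case EUR {per_case:.0f}")
```

    Sensitivity analysis then amounts to re-running the same sum with the probability and cost ranges found in the literature.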

  6. Overcoming rule-based rigidity and connectionist limitations through massively-parallel case-based reasoning

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. A promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning. Case-based reasoning, which has received much attention recently, is essentially the same as analogy-based reasoning, and avoids many of the problems leveled at traditional artificial intelligence. Further problems are avoided by doing many strands of case-based reasoning in parallel, and by implementing the whole system as a neural net. In addition, such a system provides an approach to some aspects of the problems of noise, uncertainty and novelty in reasoning systems. The current neural net system (Conposit), which performs standard rule-based reasoning, is being modified into a massively parallel case-based reasoning version.

  7. An Export-Marketing Model for Pharmaceutical Firms (The Case of Iran)

    PubMed Central

    Mohammadzadeh, Mehdi; Aryanpour, Narges

    2013-01-01

    Internationalization is a matter of committed decision-making that starts with export marketing, in which an organization tries to diagnose and use opportunities in target markets based on realistic evaluation of internal strengths and weaknesses with analysis of macro- and microenvironments in order to gain presence in other countries. A developed model for export and international marketing of pharmaceutical companies is introduced. The paper reviews common theories of the internationalization process, followed by examining different methods and models for assessing preparation for export activities and examining a conceptual model based on a single case study method on a basket of seven leading domestic firms, using mainly questionnaires as the data-gathering tool along with interviews for bias reduction. Finally, in keeping with the study objectives, the special aspects of the pharmaceutical marketing environment have been covered, revealing special dimensions of pharmaceutical marketing that have been embedded within the appropriate base model. The new model for international activities of pharmaceutical companies was refined by expert opinions extracted from the results of questionnaires. PMID:24250597

  8. An export-marketing model for pharmaceutical firms (the case of Iran).

    PubMed

    Mohammadzadeh, Mehdi; Aryanpour, Narges

    2013-01-01

    Internationalization is a matter of committed decision-making that starts with export marketing, in which an organization tries to diagnose and use opportunities in target markets based on realistic evaluation of internal strengths and weaknesses with analysis of macro- and microenvironments in order to gain presence in other countries. A developed model for export and international marketing of pharmaceutical companies is introduced. The paper reviews common theories of the internationalization process, followed by examining different methods and models for assessing preparation for export activities and examining a conceptual model based on a single case study method on a basket of seven leading domestic firms, using mainly questionnaires as the data-gathering tool along with interviews for bias reduction. Finally, in keeping with the study objectives, the special aspects of the pharmaceutical marketing environment have been covered, revealing special dimensions of pharmaceutical marketing that have been embedded within the appropriate base model. The new model for international activities of pharmaceutical companies was refined by expert opinions extracted from the results of questionnaires.

  9. A Correlation-Based Transition Model using Local Variables. Part 2; Test Cases and Industrial Applications

    NASA Technical Reports Server (NTRS)

    Langtry, R. B.; Menter, F. R.; Likki, S. R.; Suzen, Y. B.; Huang, P. G.; Volker, S.

    2006-01-01

    A new correlation-based transition model has been developed, which is built strictly on local variables. As a result, the transition model is compatible with modern computational fluid dynamics (CFD) methods using unstructured grids and massive parallel execution. The model is based on two transport equations, one for the intermittency and one for the transition onset criteria in terms of momentum thickness Reynolds number. The proposed transport equations do not attempt to model the physics of the transition process (unlike, e.g., turbulence models), but form a framework for the implementation of correlation-based models into general-purpose CFD methods.

  10. Potential-based and non-potential-based cohesive zone formulations under mixed-mode separation and over-closure-Part II: Finite element applications

    NASA Astrophysics Data System (ADS)

    Máirtín, Éamonn Ó.; Parry, Guillaume; Beltz, Glenn E.; McGarry, J. Patrick

    2014-02-01

    This paper, the second of two parts, presents three novel finite element case studies to demonstrate the importance of normal-tangential coupling in cohesive zone models (CZMs) for the prediction of mixed-mode interface debonding. Specifically, four new CZMs proposed in Part I of this study are implemented, namely the potential-based MP model and the non-potential-based NP1, NP2 and SMC models. For comparison, simulations are also performed for the well established potential-based Xu-Needleman (XN) model and the non-potential-based model of van den Bosch, Schreurs and Geers (BSG model). Case study 1: Debonding and rebonding of a biological cell from a cyclically deforming silicone substrate is simulated when the mode II work of separation is higher than the mode I work of separation at the cell-substrate interface. An active formulation for the contractility and remodelling of the cell cytoskeleton is implemented. It is demonstrated that when the XN potential function is used at the cell-substrate interface repulsive normal tractions are computed, preventing rebonding of significant regions of the cell to the substrate. In contrast, the proposed MP potential function at the cell-substrate interface results in negligible repulsive normal tractions, allowing for the prediction of experimentally observed patterns of cell cytoskeletal remodelling. Case study 2: Buckling of a coating from the compressive surface of a stent is simulated. It is demonstrated that during expansion of the stent the coating is initially compressed into the stent surface, while simultaneously undergoing tangential (shear) tractions at the coating-stent interface. It is demonstrated that when either the proposed NP1 or NP2 model is implemented at the stent-coating interface mixed-mode over-closure is correctly penalised. Further expansion of the stent results in the prediction of significant buckling of the coating from the stent surface, as observed experimentally. 
In contrast, the BSG model does not correctly penalise mixed-mode over-closure at the stent-coating interface, significantly altering the stress state in the coating and preventing the prediction of buckling. Case study 3: Application of a displacement to the base of a bi-layered composite arch results in a symmetric sinusoidal distribution of normal and tangential traction at the arch interface. The traction defined mode mixity at the interface ranges from pure mode II at the base of the arch to pure mode I at the top of the arch. It is demonstrated that predicted debonding patterns are highly sensitive to normal-tangential coupling terms in a CZM. The NP2, XN, and BSG models exhibit a strong bias towards mode I separation at the top of the arch, while the NP1 model exhibits a bias towards mode II debonding at the base of the arch. Only the SMC model provides mode-independent behaviour in the early stages of debonding. This case study provides a practical example of the importance of the behaviour of CZMs under conditions of traction controlled mode mixity, following from the theoretical analysis presented in Part I of this study.

  11. Conformational Transitions upon Ligand Binding: Holo-Structure Prediction from Apo Conformations

    PubMed Central

    Seeliger, Daniel; de Groot, Bert L.

    2010-01-01

    Biological function of proteins is frequently associated with the formation of complexes with small-molecule ligands. Experimental structure determination of such complexes at atomic resolution, however, can be time-consuming and costly. Computational methods for structure prediction of protein/ligand complexes, particularly docking, are as yet restricted by their limited consideration of receptor flexibility, rendering them not applicable for predicting protein/ligand complexes if large conformational changes of the receptor upon ligand binding are involved. Accurate receptor models in the ligand-bound state (holo structures), however, are a prerequisite for successful structure-based drug design. Hence, if only an unbound (apo) structure is available distinct from the ligand-bound conformation, structure-based drug design is severely limited. We present a method to predict the structure of protein/ligand complexes based solely on the apo structure, the ligand and the radius of gyration of the holo structure. The method is applied to ten cases in which proteins undergo structural rearrangements of up to 7.1 Å backbone RMSD upon ligand binding. In all cases, receptor models within 1.6 Å backbone RMSD to the target were predicted and close-to-native ligand binding poses were obtained for 8 of 10 cases in the top-ranked complex models. A protocol is presented that is expected to enable structure modeling of protein/ligand complexes and structure-based drug design for cases where crystal structures of ligand-bound conformations are not available. PMID:20066034

  12. Long-term Evaluation of Landuse Changes On Landscape Water Balance - A Case Study From North-east Germany

    NASA Astrophysics Data System (ADS)

    Wegehenkel, M.

    In this paper, long-term effects of different afforestation scenarios on landscape water balance will be analyzed taking into account the results of a regional case study. This analysis is based on using a GIS-coupled simulation model for the spatially distributed calculation of water balance. For this purpose, the modelling system THESEUS with a simple GIS-interface will be used. To take into account the special case of change in forest cover proportion, THESEUS was enhanced with a simple forest growth model. In the regional case study, model runs will be performed using a detailed spatial data set from North-East Germany. This data set covers a mesoscale catchment located in the moraine landscape of North-East Germany. Based on this data set, the influence of the actual land use and of different land-use change scenarios on water balance dynamics will be investigated taking into account the spatially distributed modelling results from THESEUS. The model was tested using different experimental data sets from field plots as well as observed catchment discharge. In addition to such conventional validation techniques, remote sensing data were used to check the simulated regional distribution of water balance components such as evapotranspiration in the catchment.

  13. Physiologically Based Absorption Modeling to Impact Biopharmaceutics and Formulation Strategies in Drug Development-Industry Case Studies.

    PubMed

    Kesisoglou, Filippos; Chung, John; van Asperen, Judith; Heimbach, Tycho

    2016-09-01

    In recent years, there has been a significant increase in use of physiologically based pharmacokinetic models in drug development and regulatory applications. Although most of the published examples have focused on aspects such as first-in-human (FIH) dose predictions or drug-drug interactions, several publications have highlighted the application of these models in the biopharmaceutics field and their use to inform formulation development. In this report, we present 5 case studies of use of such models in this biopharmaceutics/formulation space across different pharmaceutical companies. The case studies cover different aspects of biopharmaceutics or formulation questions including (1) prediction of absorption prior to FIH studies; (2) optimization of formulation and dissolution method post-FIH data; (3) early exploration of a modified-release formulation; (4) addressing bridging questions for late-stage formulation changes; and (5) prediction of pharmacokinetics in the fed state for a Biopharmaceutics Classification System class I drug with fasted state data. The discussion of the case studies focuses on how such models can facilitate decisions and biopharmaceutic understanding of drug candidates and the opportunities for increased use and acceptance of such models in drug development and regulatory interactions. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Logic-Based Models for the Analysis of Cell Signaling Networks†

    PubMed Central

    2010-01-01

    Computational models are increasingly used to analyze the operation of complex biochemical networks, including those involved in cell signaling networks. Here we review recent advances in applying logic-based modeling to mammalian cell biology. Logic-based models represent biomolecular networks in a simple and intuitive manner without describing the detailed biochemistry of each interaction. A brief description of several logic-based modeling methods is followed by six case studies that demonstrate biological questions recently addressed using logic-based models and point to potential advances in model formalisms and training procedures that promise to enhance the utility of logic-based methods for studying the relationship between environmental inputs and phenotypic or signaling state outputs of complex signaling networks. PMID:20225868

  15. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    PubMed Central

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-01-01

    Cloud computing is changing the way software is developed and managed in enterprises, and thereby the way business is done, in that dynamically scalable and virtualized resources are provided as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews, with experimental results, is presented to verify the system performance; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment. PMID:24686728

  16. A new approach to integrate Internet-of-things and software-as-a-service model for logistic systems: a case study.

    PubMed

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-03-28

    Cloud computing is changing the way software is developed and managed in enterprises, and thereby the way business is done, in that dynamically scalable and virtualized resources are provided as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews, with experimental results, is presented to verify the system performance; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment.

  17. Cognitive problem solving patterns of medical students correlate with success in diagnostic case solutions.

    PubMed

    Kiesewetter, Jan; Ebersbach, René; Görlitz, Anja; Holzer, Matthias; Fischer, Martin R; Schmidmaier, Ralf

    2013-01-01

    Problem-solving in terms of clinical reasoning is regarded as a key competence of medical doctors. Little is known about the general cognitive actions underlying the problem-solving strategies of medical students. In this study, a theory-based model was used and adapted in order to investigate which cognitive actions medical students engage in when dealing with a case and how patterns of these actions are related to the correct solution. Twenty-three medical students worked on three cases in clinical nephrology using the think-aloud method. The transcribed recordings were coded using a theory-based model consisting of eight different cognitive actions. The coded data were analysed as time sequences using graphical representation software. Furthermore, the relationship between the coded data and the accuracy of diagnosis was investigated with inferential statistical methods. The observation of all main actions in a case elaboration, including evaluation, representation and integration, was considered a complete model and was found in the majority of cases (56%). This pattern was significantly related to the accuracy of the case solution (φ = 0.55; p < .001). Extent of prior knowledge was related neither to the complete model nor to the correct solution. The proposed model is suitable for empirically verifying the cognitive actions of problem-solving of medical students. The cognitive actions evaluation, representation and integration are crucial for the complete model and therefore for the accuracy of the solution. The educational implication which may be drawn from this study is to foster students' reasoning by focusing on higher-level reasoning.
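    The reported φ = 0.55 is the phi coefficient of the 2×2 association between showing the complete action pattern and solving the case correctly. A minimal sketch of that statistic, computed on hypothetical counts (the abstract does not report the raw table):

    ```python
    import math

    def phi(a, b, c, d):
        """Phi coefficient for a 2x2 table [[a, b], [c, d]]:
        rows = complete pattern yes/no, cols = correct solution yes/no."""
        num = a * d - b * c
        den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
        return num / den

    # Hypothetical counts for illustration only; the paper reports the
    # coefficient (0.55) but not the underlying table.
    print(round(phi(30, 9, 9, 21), 2))
    ```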

  18. Validation of ACG Case-mix for equitable resource allocation in Swedish primary health care.

    PubMed

    Zielinski, Andrzej; Kronogård, Maria; Lenhoff, Håkan; Halling, Anders

    2009-09-18

    Adequate resource allocation is an important factor to ensure equity in health care. Previous reimbursement models have been based on age, gender and socioeconomic factors. An explanatory model based on individual need of primary health care (PHC) has not yet been used in Sweden to allocate resources. The aim of this study was to examine to what extent the ACG case-mix system could explain concurrent costs in Swedish PHC. Diagnoses were obtained from electronic PHC records of inhabitants in Blekinge County (approx. 150,000) listed with public PHC (approx. 120,000) for three consecutive years, 2004-2006. The inhabitants were then classified into six different resource utilization bands (RUB) using the ACG case-mix system. The mean costs for primary health care were calculated for each RUB and year. Using linear regression models with log-cost as the dependent variable, the adjusted R² was calculated for the unadjusted model (gender) and for consecutive models in which age, listing with specific PHC and RUB were added. In an additional model the ACG groups were added. Gender, age and listing with specific PHC explained 14.48-14.88% of the variance in individual costs for PHC. When information on the level of co-morbidity, as measured by the ACG case-mix system, was also added, the adjusted R² increased to 60.89-63.41%. The ACG case-mix system explains patient costs in primary care to a high degree. Age and gender are important explanatory factors, but most of the variance in concurrent patient costs was explained by the ACG case-mix system.
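    The modelling step described here, comparing adjusted R² across nested regressions on log-cost, can be sketched on simulated data. The coefficients and data below are assumptions for illustration, not the study's records:

    ```python
    import numpy as np

    def adjusted_r2(y, X):
        """Fit OLS of y on X (with intercept) and return adjusted R^2."""
        n = len(y)
        X1 = np.column_stack([np.ones(n), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        p = X1.shape[1] - 1  # predictors excluding intercept
        return 1 - (1 - r2) * (n - 1) / (n - p - 1)

    # Simulated individuals whose log-cost is driven mostly by morbidity (RUB).
    rng = np.random.default_rng(0)
    n = 5000
    age = rng.uniform(0, 90, n)
    gender = rng.integers(0, 2, n)
    rub = rng.integers(0, 6, n)  # resource utilization band 0-5
    log_cost = 0.004 * age + 0.05 * gender + 0.6 * rub + rng.normal(0, 0.8, n)

    base = adjusted_r2(log_cost, np.column_stack([gender, age]))
    full = adjusted_r2(log_cost, np.column_stack([gender, age, rub]))
    print(f"demographics only: {base:.2f}, with RUB added: {full:.2f}")
    ```

    As in the study, adding the morbidity band raises the explained variance far above what demographics alone achieve.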

  19. Modeling the public health impact of malaria vaccines for developers and policymakers

    PubMed Central

    2013-01-01

    Background Efforts to develop malaria vaccines show promise. Mathematical model-based estimates of the potential demand, public health impact, and cost and financing requirements can be used to inform investment and adoption decisions by vaccine developers and policymakers on the use of malaria vaccines as complements to existing interventions. However, the complexity of such models may make their outputs inaccessible to non-modeling specialists. This paper describes a Malaria Vaccine Model (MVM) developed to address the specific needs of developers and policymakers, who need to access sophisticated modeling results and to test various scenarios in a user-friendly interface. The model’s functionality is demonstrated through a hypothetical vaccine. Methods The MVM has three modules: supply and demand forecast; public health impact; and implementation cost and financing requirements. These modules include pre-entered reference data and also allow for user-defined inputs. The model includes an integrated sensitivity analysis function. Model functionality was demonstrated by estimating the public health impact of a hypothetical pre-erythrocytic malaria vaccine with 85% efficacy against uncomplicated disease and a vaccine efficacy decay rate of four years, based on internationally-established targets. Demand for this hypothetical vaccine was estimated based on historical vaccine implementation rates for routine infant immunization in 40 African countries over a 10-year period. Assumed purchase price was $5 per dose and injection equipment and delivery costs were $0.40 per dose. Results The model projects the number of doses needed, uncomplicated and severe cases averted, deaths and disability-adjusted life years (DALYs) averted, and cost to avert each. In the demonstration scenario, based on a projected demand of 532 million doses, the MVM estimated that 150 million uncomplicated cases of malaria and 1.1 million deaths would be averted over 10 years. 
This is equivalent to 943 uncomplicated cases and 7 deaths averted per 1,000 vaccinees. In discounted 2011 US dollars, this represents $11 per uncomplicated case averted and $1,482 per death averted. If vaccine efficacy were reduced to 75%, the estimated uncomplicated cases and deaths averted over 10 years would decrease by 14% and 19%, respectively. Conclusions The MVM can provide valuable information to assist decision-making by vaccine developers and policymakers, information which will be refined and strengthened as field studies progress allowing further validation of modeling assumptions. PMID:23815273
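    The cost-per-outcome arithmetic the MVM reports can be sketched from the headline figures above. Note this sketch is undiscounted, so it will not reproduce the paper's discounted $11 and $1,482 values:

    ```python
    # Headline inputs from the demonstration scenario; the division below is
    # a simple undiscounted illustration, not the MVM's discounted output.
    doses = 532e6
    cost_per_dose = 5.00 + 0.40      # purchase price + injection/delivery cost
    cases_averted = 150e6
    deaths_averted = 1.1e6

    total_cost = doses * cost_per_dose
    print(f"cost per uncomplicated case averted: ${total_cost / cases_averted:,.2f}")
    print(f"cost per death averted:              ${total_cost / deaths_averted:,.2f}")
    ```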

  20. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Does the organisational model of dementia case management make a difference in satisfaction with case management and caregiver burden? An evaluation study.

    PubMed

    Peeters, José M; Pot, Anne Margriet; de Lange, Jacomine; Spreeuwenberg, Peter M; Francke, Anneke L

    2016-03-09

    In the Netherlands, various organisational models of dementia case management exist. In this study the following four models are distinguished, based on differences in the availability of the service and in the case management function: Model 1: the case management service is available from first dementia symptoms + is always a separate specialist function; Model 2: the case management service is only available after a formal dementia diagnosis + is always a separate specialist function; Model 3: the case management service is available from first dementia symptoms + is often a combined function; Model 4: the case management service is only available after a formal dementia diagnosis + is often a combined function. The objectives of this study are to give insight into whether satisfaction with dementia case management and the development of caregiver burden depend on the organisational model. A survey was carried out in regional dementia care networks in the Netherlands among 554 informal carers for people with dementia at the start of case management (response of 85 %), and one year later. Descriptive statistics and multilevel models were used to analyse the data. The satisfaction with the case manager was high in general (an average of 8.0 within a possible range of 1 to 10), although the caregiver burden did not decrease in the first year after starting with case management. No differences were found between the four organisational models regarding the development of caregiver burden. However, statistically significant differences (p < 0.05) were found regarding satisfaction: informal carers in the organisational model where case management is only available after formal diagnosis of dementia and is often a combined function had on average the lowest satisfaction scores. Nevertheless, the satisfaction of informal carers within all organisational models was high (ranging from 7.51 to 8.40 within a range of 1 to 10). 
Organisational features of case management seem to make little or no difference to the development in caregiver burden and the satisfaction of informal carers. Future research is needed to explore whether the individual characteristics of the case managers themselves are associated with case management outcomes.

  2. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

    While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics model as well as of the overall coupled model.
This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.

  3. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.

  4. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented as an Excel Visual Basic for Applications (VBA) algorithm that uses trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on this trigonometry approach, the algorithm horizontally translates succeeding recession segments of the time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of the preceding recession segment. The new method and algorithm continue the development of methods and algorithms for MRC generation, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented in a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free-of-charge software. © 2017, National Ground Water Association.
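    A minimal sketch of the horizontal-translation idea described above (not the published VBA implementation, whose details may differ): each later recession segment is shifted in time so that its vertex lands on the line interpolated through the preceding master-curve points.

    ```python
    import numpy as np

    def master_recession_curve(segments):
        """Overlap recession segments by horizontal translation.

        Each segment is a (t, h) pair: times and strictly decreasing heads.
        The vertex (first, highest value) of each later segment is slid onto
        the curve interpolated through the preceding master-curve points."""
        t0, h0 = segments[0]
        mt, mh = list(t0 - t0[0]), list(h0)
        for t, h in segments[1:]:
            # time on the current master curve where head equals the new vertex
            # (np.interp needs increasing x, so interpolate on reversed arrays)
            t_shift = np.interp(h[0], mh[::-1], mt[::-1])
            mt.extend(t - t[0] + t_shift)
            mh.extend(h)
        order = np.argsort(mt)
        return np.asarray(mt)[order], np.asarray(mh)[order]

    # Two synthetic exponential recessions; the second starts at a lower head.
    t = np.linspace(0, 10, 11)
    seg1 = (t, 10.0 * np.exp(-0.2 * t))
    seg2 = (t, 6.0 * np.exp(-0.2 * t))
    mt, mh = master_recession_curve([seg1, seg2])
    ```

    With these synthetic inputs the second segment is shifted to where the first segment's interpolated head equals 6.0, so the two overlap into a single curve.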

  5. Evaluation of strategies for nature-based solutions to drought: a decision support model at the national scale

    NASA Astrophysics Data System (ADS)

    Simpson, Mike; Ives, Matthew; Hall, Jim

    2016-04-01

    There is an increasing body of evidence in support of the use of nature based solutions as a strategy to mitigate drought. Restored or constructed wetlands, grasslands and in some cases forests have been used with success in numerous case studies. Such solutions remain underused in the UK, where they are not considered as part of long-term plans for supply by water companies. An important step is the translation of knowledge on the benefits of nature based solutions at the upland/catchment scale into a model of the impact of these solutions on national water resource planning in terms of financial costs, carbon benefits and robustness to drought. Our project, 'A National Scale Model of Green Infrastructure for Water Resources', addresses this issue through development of a model that can show the costs and benefits associated with a broad roll-out of nature based solutions for water supply. We have developed generalised models of both the hydrological effects of various classes and implementations of nature-based approaches and their economic impacts in terms of construction costs, running costs, time to maturity, land use and carbon benefits. Our next step will be to compare this work with our recent evaluation of conventional water infrastructure, allowing a case to be made in financial terms and in terms of security of water supply. By demonstrating the benefits of nature based solutions under multiple possible climate and population scenarios we aim to demonstrate the potential value of using nature based solutions as a component of future long-term water resource plans. Strategies for decision making regarding the selection of nature based and conventional approaches, developed through discussion with government and industry, will be applied to the final model. Our focus is on keeping our work relevant to the requirements of decision-makers involved in conventional water planning. 
We propose to present the outcomes of our model for the evaluation of nature-based solutions at catchment scale and ongoing results of our national-scale model.

  6. Short-term solar flare prediction using image-case-based reasoning

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Fu; Li, Fei; Zhang, Huai-Peng; Yu, Da-Ren

    2017-10-01

    Solar flares strongly influence space weather and human activities, and their prediction is highly complex. Existing solutions such as data-based and model-based approaches share a common shortcoming: the lack of human engagement in the forecasting process. An image-case-based reasoning method is introduced to address this. The image case library is composed of SOHO/MDI longitudinal magnetograms, from which the maximum horizontal gradient, the length of the neutral line and the number of singular points are extracted as features for retrieving similar image cases. Genetic optimization algorithms are employed to optimize the weights assigned to the image features and the number of similar image cases retrieved. The similar image cases and a prediction derived by majority voting over them are output and shown to the forecaster, so that his or her experience can be integrated into the final prediction. Experimental results demonstrate that the case-based reasoning approach performs slightly better than other methods, and is more efficient, with forecasts improved by humans.
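    The retrieval-plus-voting step described here can be sketched as a weighted nearest-neighbour search. The feature values, weights and k below are hypothetical stand-ins for the quantities the paper tunes with genetic algorithms:

    ```python
    import numpy as np
    from collections import Counter

    def retrieve_and_vote(library_feats, library_labels, query, weights, k):
        """Case retrieval sketch: weighted Euclidean distance over image
        features (e.g. max horizontal gradient, neutral-line length,
        singular-point count), then a majority vote over the k most similar
        cases. The weights and k are the tuned quantities."""
        d = np.sqrt(((library_feats - query) ** 2 * weights).sum(axis=1))
        nearest = np.argsort(d)[:k]
        votes = Counter(library_labels[i] for i in nearest)
        return votes.most_common(1)[0][0], nearest

    # Toy case library: rows are normalized feature vectors with labels.
    feats = np.array([[0.9, 0.8, 0.7], [0.8, 0.9, 0.9],
                      [0.1, 0.2, 0.1], [0.2, 0.1, 0.2], [0.15, 0.3, 0.1]])
    labels = np.array(["flare", "flare", "quiet", "quiet", "quiet"])
    pred, cases = retrieve_and_vote(feats, labels, np.array([0.85, 0.8, 0.8]),
                                    weights=np.array([0.5, 0.3, 0.2]), k=3)
    print(pred)  # the retrieved cases would be shown to the forecaster too
    ```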

  7. Integration of system identification and finite element modelling of nonlinear vibrating structures

    NASA Astrophysics Data System (ADS)

    Cooper, Samson B.; DiMaio, Dario; Ewins, David J.

    2018-03-01

    The Finite Element Method (FEM), experimental modal analysis (EMA) and other linear analysis techniques have been established as reliable tools for the dynamic analysis of engineering structures. They are often applied to small and large structures and a wide variety of other cases in structural dynamics, even those exhibiting a certain degree of nonlinearity. Unfortunately, when the nonlinear effects are substantial or the accuracy of the predicted response is of vital importance, a linear finite element model will generally prove to be unsatisfactory. As a result, the validated linear FE model requires further enhancement so that it can represent and predict the nonlinear behaviour exhibited by the structure. In this paper, a pragmatic approach to integrating test-based system identification and FE modelling of a nonlinear structure is presented. This integration is based on three phases: the first phase involves the derivation of an underlying linear model (ULM) of the structure, the second phase includes experiment-based nonlinear identification using measured time series, and the third phase covers augmenting the linear FE model and experimental validation of the nonlinear FE model. The approach is demonstrated on a case study of a twin cantilever beam assembly coupled with a flexible arch-shaped beam. In this case, polynomial-type nonlinearities are identified and validated with force-controlled stepped-sine test data at several excitation levels.

  8. Preserving privacy whilst maintaining robust epidemiological predictions.

    PubMed

    Werkman, Marleen; Tildesley, Michael J; Brooks-Pollock, Ellen; Keeling, Matt J

    2016-12-01

    Mathematical models are invaluable tools for quantifying potential epidemics and devising optimal control strategies in case of an outbreak. State-of-the-art models increasingly require detailed individual farm-based and sensitive data, which may not be available due to either lack of capacity for data collection or privacy concerns. However, in many situations, aggregated data are available for use. In this study, we systematically investigate the accuracy of predictions made by mathematical models initialised with varying data aggregations, using the UK 2001 Foot-and-Mouth Disease Epidemic as a case study. We consider the scenario when the only data available are aggregated into spatial grid cells, and develop a metapopulation model where individual farms in a single subpopulation are assumed to behave uniformly and transmit randomly. We also adapt this standard metapopulation model to capture heterogeneity in farm size and composition, using farm census data. Our results show that homogeneous models based on aggregated data overestimate final epidemic size but can perform well for predicting spatial spread. Recognising heterogeneity in farm sizes improves predictions of the final epidemic size, identifying risk areas, determining the likelihood of epidemic take-off and identifying the optimal control strategy. In conclusion, in cases where individual farm-based data are not available, models can still generate meaningful predictions, although care must be taken in their interpretation and use. Copyright © 2016. Published by Elsevier B.V.

  9. Predicting the effect of cytochrome P450 inhibitors on substrate drugs: analysis of physiologically based pharmacokinetic modeling submissions to the US Food and Drug Administration.

    PubMed

    Wagner, Christian; Pan, Yuzhuo; Hsu, Vicky; Grillo, Joseph A; Zhang, Lei; Reynolds, Kellie S; Sinha, Vikram; Zhao, Ping

    2015-01-01

    The US Food and Drug Administration (FDA) has seen a recent increase in the application of physiologically based pharmacokinetic (PBPK) modeling towards assessing the potential of drug-drug interactions (DDI) in clinically relevant scenarios. To continue our assessment of such approaches, we evaluated the predictive performance of PBPK modeling in predicting cytochrome P450 (CYP)-mediated DDI. This evaluation was based on 15 substrate PBPK models submitted by nine sponsors between 2009 and 2013. For these 15 models, a total of 26 DDI studies (cases) with various CYP inhibitors were available. Sponsors developed the PBPK models, reportedly without considering clinical DDI data. Inhibitor models were either developed by sponsors or provided by PBPK software developers and applied with minimal or no modification. The metric for assessing predictive performance of the sponsors' PBPK approach was the R(predicted/observed) value, where R(predicted/observed) = [predicted mean exposure ratio]/[observed mean exposure ratio], with the exposure ratio defined as [Cmax (maximum plasma concentration) or AUC (area under the plasma concentration-time curve) in the presence of CYP inhibition]/[Cmax or AUC in the absence of CYP inhibition]. In 81% (21/26) and 77% (20/26) of cases, respectively, the R(predicted/observed) values for AUC and Cmax ratios were within a pre-defined threshold of 1.25-fold of the observed data. For all cases, the R(predicted/observed) values for AUC and Cmax were within a 2-fold range. These results suggest that, based on the submissions to the FDA to date, there is a high degree of concordance between PBPK-predicted and observed effects of CYP inhibition, especially CYP3A-based, on the exposure of drug substrates.
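    The R(predicted/observed) metric and its fold-range acceptance test reduce to a small calculation; the exposure ratios below are hypothetical values for illustration:

    ```python
    def r_pred_obs(pred_ratio, obs_ratio):
        """R(predicted/observed): predicted mean exposure ratio divided by the
        observed mean exposure ratio (AUC or Cmax with vs. without inhibitor)."""
        return pred_ratio / obs_ratio

    def within_fold(r, fold=1.25):
        """True if r lies within the fold-range threshold [1/fold, fold]."""
        return 1 / fold <= r <= fold

    # Hypothetical case: model predicts a 3.2-fold AUC increase under CYP
    # inhibition, while the clinical study observed a 2.8-fold increase.
    r = r_pred_obs(3.2, 2.8)
    print(round(r, 3), within_fold(r), within_fold(r, fold=2.0))
    ```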

  10. Spatial Prediction of Coxiella burnetii Outbreak Exposure via Notified Case Counts in a Dose-Response Model.

    PubMed

    Brooke, Russell J; Kretzschmar, Mirjam E E; Hackert, Volker; Hoebe, Christian J P A; Teunis, Peter F M; Waller, Lance A

    2017-01-01

We develop a novel approach to study an outbreak of Q fever in 2009 in the Netherlands by combining a human dose-response model with geostatistical prediction to relate the probability of infection and the associated probability of illness to an effective dose of Coxiella burnetii. The spatial distribution of the 220 notified cases in the at-risk population is translated into a smooth spatial field of dose. Based on these symptomatic cases, the dose-response model predicts a median of 611 asymptomatic infections (95% range: 410, 1,084) for the 220 reported symptomatic cases in the at-risk population; 2.78 (95% range: 1.86, 4.93) asymptomatic infections for each reported case. The attack rates observed during the outbreak were low (the values appear as equations in the full-text article). The estimated peak levels of exposure extend to the north-east from the point source, with an increasing proportion of asymptomatic infections further from the source. Our work combines established methodology from model-based geostatistics and dose-response modeling, allowing a novel approach to study outbreaks. Unobserved infections and the spatially varying effective dose can be predicted using the flexible framework without assuming any underlying spatial structure of the outbreak process. Such predictions are important for targeting interventions during an outbreak, estimating future disease burden, and determining acceptable risk levels.
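The paper's fitted dose-response relation is not reproduced in this abstract; a common single-hit exponential form, with an assumed parameter, illustrates how infection probability is tied to effective dose:

```python
import math

def p_infection(dose: float, r: float) -> float:
    """Single-hit exponential dose-response: P(infection) at a given dose."""
    return 1.0 - math.exp(-r * dose)

# Illustrative parameter only; the paper fits its own dose-response
# parameters for C. burnetii, which are not reproduced here.
r_param = 0.05
for dose in (1, 10, 50):
    print(dose, round(p_infection(dose, r_param), 3))
```

Applying such a curve to a smooth spatial dose field yields, per location, the infection probabilities from which unobserved asymptomatic infections are predicted.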

  11. Possible superconductivity in Sr₂IrO₄ probed by quasiparticle interference.

    PubMed

    Gao, Yi; Zhou, Tao; Huang, Huaixiang; Wang, Qiang-Hua

    2015-03-18

    Based on the possible superconducting (SC) pairing symmetries recently proposed, the quasiparticle interference (QPI) patterns in electron- and hole-doped Sr₂IrO₄ are theoretically investigated. In the electron-doped case, the QPI spectra can be explained based on a model similar to the octet model of the cuprates while in the hole-doped case, both the Fermi surface topology and the sign of the SC order parameter resemble those of the iron pnictides and there exists a QPI vector resulting from the interpocket scattering between the electron and hole pockets. In both cases, the evolution of the QPI vectors with energy and their behaviors in the nonmagnetic and magnetic impurity scattering cases can well be explained based on the evolution of the constant-energy contours and the sign structure of the SC order parameter. The QPI spectra presented in this paper can be compared with future scanning tunneling microscopy experiments to test whether there are SC phases in electron- and hole-doped Sr₂IrO₄ and what the pairing symmetry is.

  12. Performance Prediction of a MongoDB-Based Traceability System in Smart Factory Supply Chains

    PubMed Central

    Kang, Yong-Shin; Park, Il-Ha; Youm, Sekyoung

    2016-01-01

In the future, with the advent of the smart factory era, manufacturing and logistics processes will become more complex, and the complexity and criticality of traceability will further increase. This research aims at developing a performance assessment method to verify scalability when implementing traceability systems based on key technologies for smart factories, such as the Internet of Things (IoT) and Big Data. To this end, based on existing research, we analyzed traceability requirements and an event schema for storing traceability data in MongoDB, a document-based Not Only SQL (NoSQL) database. Next, we analyzed the algorithm of the most representative traceability query and defined a query-level performance model, composed of response times for the components of the traceability query algorithm. This performance model was then solidified as a linear regression model, because a benchmark test showed that the response times increase linearly. Finally, for a case analysis, we applied the performance model to a virtual automobile parts logistics scenario. As a result of the case study, we verified the scalability of a MongoDB-based traceability system and predicted the point at which data node servers should be expanded in this case. The traceability system performance assessment method proposed in this research can be used as a decision-making tool for hardware capacity planning during the initial stage of construction of traceability systems and during their operational phase. PMID:27983654
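The query-level performance model amounts to fitting a line to benchmarked response times and solving for the data volume at which a service-level target is breached; a sketch with hypothetical benchmark numbers (ordinary least squares in pure Python):

```python
# Hypothetical benchmark results: (number of stored events, response time in ms).
bench = [(1_000_000, 120.0), (2_000_000, 235.0), (4_000_000, 480.0), (8_000_000, 950.0)]

n = len(bench)
sx = sum(x for x, _ in bench)
sy = sum(y for _, y in bench)
sxx = sum(x * x for x, _ in bench)
sxy = sum(x * y for x, y in bench)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: ms per stored event
a = (sy - b * sx) / n                          # intercept: fixed overhead, ms

# Predict the volume at which response time exceeds a 2-second target,
# i.e. the point where data node servers should be expanded.
sla_ms = 2000.0
expansion_point = (sla_ms - a) / b
print(f"expand at ~{expansion_point:,.0f} events")
```

With these invented numbers the fitted line crosses the 2 s target at roughly 17 million stored events; the paper's actual thresholds depend on its own benchmark data.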

  14. A Prediction Model for ROS1-Rearranged Lung Adenocarcinomas based on Histologic Features.

    PubMed

    Zhou, Jianya; Zhao, Jing; Zheng, Jing; Kong, Mei; Sun, Ke; Wang, Bo; Chen, Xi; Ding, Wei; Zhou, Jianying

    2016-01-01

To identify the clinical and histological characteristics of ROS1-rearranged non-small-cell lung carcinomas (NSCLCs) and build a prediction model to prescreen suitable patients for molecular testing, we identified 27 cases of ROS1-rearranged lung adenocarcinoma among 1165 patients with NSCLC, confirmed by real-time PCR and FISH, performed univariate and multivariate analyses to identify predictive factors associated with ROS1 rearrangement, and finally developed a prediction model. By ROS1 immunochemistry, 59 of the 1165 patients showed some degree of ROS1 expression. Among these, 19 cases (68%, 19/28) with 3+ and 8 cases (47%, 8/17) with 2+ staining had ROS1 rearrangement verified by real-time PCR and FISH. In the resected group, the acinar-predominant growth pattern was the most commonly observed (57%, 8/14), while in the biopsy group, solid patterns were the most frequently observed (78%, 7/13). Based on multiple logistic regression analysis, we determined that female sex, cribriform structure, and the presence of psammoma bodies were the three most powerful indicators of ROS1 rearrangement, and we developed a predictive model for the presence of ROS1 rearrangements in lung adenocarcinomas. The predictive formula is helpful in screening for ROS1-rearranged NSCLC, especially in cases where ROS1 immunochemistry is equivocal.
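A multivariate logistic model over the three indicators can be sketched as follows; the coefficients are hypothetical, since the paper's fitted values are not given in this abstract:

```python
import math

# Hypothetical coefficients for illustration only; the paper's fitted
# regression values are not reproduced in the abstract.
INTERCEPT = -4.0
COEF = {"female": 1.2, "cribriform": 2.0, "psammoma": 1.5}

def ros1_probability(female: bool, cribriform: bool, psammoma: bool) -> float:
    """Logistic-regression-style probability of ROS1 rearrangement."""
    z = (INTERCEPT
         + COEF["female"] * female
         + COEF["cribriform"] * cribriform
         + COEF["psammoma"] * psammoma)
    return 1.0 / (1.0 + math.exp(-z))

print(round(ros1_probability(True, True, True), 3))    # all three indicators
print(round(ros1_probability(False, False, False), 3))  # none present
```

In practice such a score would be thresholded to decide which patients to send for confirmatory FISH or PCR testing.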

  15. Development of Conceptual Models for Internet Search: A Case Study.

    ERIC Educational Resources Information Center

    Uden, Lorna; Tearne, Stephen; Alderson, Albert

    This paper describes the creation and evaluation of a World Wide Web-based courseware module, using conceptual models based on constructivism, that teaches novices how to use the Internet for searching. Questionnaires and interviews were used to understand the difficulties of a group of novices. The conceptual model of the experts for the task was…

  16. Problem-Based Learning--Buginese Cultural Knowledge Model--Case Study: Teaching Mathematics at Junior High School

    ERIC Educational Resources Information Center

    Cheriani, Cheriani; Mahmud, Alimuddin; Tahmir, Suradi; Manda, Darman; Dirawan, Gufran Darma

    2015-01-01

This study aims to determine the differences in learning outcomes produced by using a Problem-Based Learning model combined with "Buginese" local cultural knowledge (PBL-Culture). It also explores students' activities in learning the mathematics subject using the PBL-Culture model. This research uses a mixed-methods approach that combined quantitative…

  17. [Automated detection of estrus and mastitis in dairy cows].

    PubMed

    de Mol, R M

    2001-02-15

The development and testing of detection models for oestrus and mastitis in dairy cows is described in a PhD thesis that was defended in Wageningen on June 5, 2000. These models were based on sensors for milk yield, milk temperature, electrical conductivity of milk, cow activity and concentrate intake, and on combined processing of the sensor data. The models alert farmers to cows that need attention because of possible oestrus or mastitis. A first detection model, for cows milked twice a day, was based on time series models for the sensor variables. A time series model describes the dependence between successive observations. The parameters of the time series models were fitted on-line for each cow after each milking by means of a Kalman filter, a mathematical method to estimate the state of a system on-line. The Kalman filter gives the best estimate of the current state of a system based on all preceding observations. This model was tested for 2 years on two experimental farms, and under field conditions on four farms over several years. A second detection model, for cows milked in an automatic milking system (AMS), was based on a generalization of the first model. Two data sets (one small, one large) were used for testing. The results for oestrus detection were good for both models. The results for mastitis detection varied (good in some cases, moderate in others). Fuzzy logic was used to classify mastitis and oestrus alerts with both detection models, to reduce the number of false positive alerts. Fuzzy logic makes approximate reasoning possible, where statements can be partly true or false. Inputs to the fuzzy logic model were alerts from the detection models and additional information. The number of false positive alerts decreased considerably, while the number of detected cases remained at the same level. These models make automated detection possible in practice.
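The on-line fitting step can be illustrated with a minimal scalar Kalman filter over a single sensor variable; the readings and noise variances below are invented for illustration:

```python
def kalman_update(est: float, var: float, obs: float,
                  proc_var: float, obs_var: float):
    """One scalar Kalman-filter step: predict (random-walk model), then correct."""
    var = var + proc_var                 # predict: uncertainty grows
    gain = var / (var + obs_var)         # Kalman gain
    est = est + gain * (obs - est)       # correct with the new observation
    var = (1.0 - gain) * var             # updated uncertainty
    return est, var

# Illustrative milk-yield readings (kg per milking); a sudden jump like the
# last value is the kind of deviation the detection models flag for attention.
readings = [12.1, 12.3, 11.9, 12.2, 14.8]
est, var = readings[0], 1.0
for obs in readings[1:]:
    est, var = kalman_update(est, var, obs, proc_var=0.05, obs_var=0.4)
    print(f"obs={obs:.1f}  filtered={est:.2f}")
```

A large gap between the filtered estimate and a new observation is the kind of residual the thesis's detection models turn into oestrus or mastitis alerts.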

  18. Weather-based prediction of Plasmodium falciparum malaria in epidemic-prone regions of Ethiopia II. Weather-based prediction systems perform comparably to early detection systems in identifying times for interventions.

    PubMed

    Teklehaimanot, Hailay D; Schwartz, Joel; Teklehaimanot, Awash; Lipsitch, Marc

    2004-11-19

    Timely and accurate information about the onset of malaria epidemics is essential for effective control activities in epidemic-prone regions. Early warning methods that provide earlier alerts (usually by the use of weather variables) may permit control measures to interrupt transmission earlier in the epidemic, perhaps at the expense of some level of accuracy. Expected case numbers were modeled using a Poisson regression with lagged weather factors in a 4th-degree polynomial distributed lag model. For each week, the numbers of malaria cases were predicted using coefficients obtained using all years except that for which the prediction was being made. The effectiveness of alerts generated by the prediction system was compared against that of alerts based on observed cases. The usefulness of the prediction system was evaluated in cold and hot districts. The system predicts the overall pattern of cases well, yet underestimates the height of the largest peaks. Relative to alerts triggered by observed cases, the alerts triggered by the predicted number of cases performed slightly worse, within 5% of the detection system. The prediction-based alerts were able to prevent 10-25% more cases at a given sensitivity in cold districts than in hot ones. The prediction of malaria cases using lagged weather performed well in identifying periods of increased malaria cases. Weather-derived predictions identified epidemics with reasonable accuracy and better timeliness than early detection systems; therefore, the prediction of malarial epidemics using weather is a plausible alternative to early detection systems.
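The structure of the lagged-weather Poisson model can be sketched as follows; the polynomial coefficients, rainfall values and alert threshold are hypothetical, not the fitted Ethiopian values:

```python
import math

# Distributed-lag sketch: log expected weekly cases is a lag-weighted sum of
# past rainfall, with lag weights constrained to a smooth 4th-degree
# polynomial. Coefficients are illustrative only.
POLY = (0.02, 0.015, -0.004, 0.0003, -0.000007)  # hypothetical

def lag_weight(lag: int) -> float:
    return sum(c * lag ** k for k, c in enumerate(POLY))

def expected_cases(baseline: float, rainfall: list) -> float:
    """rainfall[0] is last week's value (lag 1), rainfall[1] lag 2, and so on."""
    eta = math.log(baseline) + sum(
        lag_weight(lag) * r for lag, r in enumerate(rainfall, start=1)
    )
    return math.exp(eta)

# Alert when the prediction exceeds a threshold, mirroring the paper's
# comparison of prediction-based alerts against observed-case alerts.
pred = expected_cases(baseline=20.0, rainfall=[5.0, 12.0, 8.0, 3.0])
print(f"predicted cases: {pred:.1f}, alert: {pred > 30.0}")
```

The leave-one-year-out fitting described in the abstract would re-estimate the polynomial coefficients for each predicted year.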

  19. Cooling tower plume - model and experiment

    NASA Astrophysics Data System (ADS)

    Cizek, Jan; Gemperle, Jiri; Strob, Miroslav; Nozicka, Jiri

The paper presents a simple model of the so-called steam plume, which in many cases forms during the operation of the evaporative cooling systems of power plants or large technological units. The model is based on semi-empirical equations that describe the behaviour of a mixture of two gases in a free jet stream. The paper concludes with a simple experiment through which the results of the designed model will be validated in subsequent work.

  20. A Comparative Analysis of Reynolds-Averaged Navier-Stokes Model Predictions for Rayleigh-Taylor Instability and Mixing with Constant and Complex Accelerations

    NASA Astrophysics Data System (ADS)

    Schilling, Oleg

    2016-11-01

Two-, three- and four-equation, single-velocity, multicomponent Reynolds-averaged Navier-Stokes (RANS) models, based on the turbulent kinetic energy dissipation rate or lengthscale, are used to simulate At = 0.5 Rayleigh-Taylor turbulent mixing with constant and complex accelerations. The constant acceleration case is inspired by the Cabot and Cook (2006) DNS, and the complex acceleration cases are inspired by the unstable/stable and unstable/neutral cases simulated using DNS (Livescu, Wei & Petersen 2011) and the unstable/stable/unstable case simulated using ILES (Ramaprabhu, Karkhanis & Lawrie 2013). The four-equation models couple equations for the mass flux a and the negative density-specific volume correlation b to the K-ε or K-L equations, while the three-equation models use a two-fluid algebraic closure for b. The lengthscale-based models are also applied with no buoyancy production in the L equation to explore the consequences of neglecting this term. Predicted mixing widths, turbulence statistics, fields, and turbulent transport equation budgets are compared among these models to identify similarities and differences in the turbulence production, dissipation and diffusion physics represented by the closures used in these models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  1. Testing CMAQ chemistry sensitivities in base case and emissions control runs at SEARCH and SOS99 surface sites in the southeastern US

    NASA Astrophysics Data System (ADS)

    Arnold, J. R.; Dennis, Robin L.

CMAQ was run to simulate urban and regional tropospheric conditions in the southeastern US over 14 days in July 1999 at 32, 8 and 2 km grid spacings. Runs were made with either of two older mechanisms, Carbon Bond IV (CB4) and the Regional Acid Deposition Model, version 2 (RADM2), and with the more recent and complete California Statewide Air Pollution Research Center, version 1999 mechanism (SAPRC99) in a sensitivity matrix with a full emissions base case and separate 50% control scenarios for emissions of nitrogen oxides (NOx) and volatile organic compounds (VOC). Results from the base case were compared to observations at the Southeastern Aerosol Research and Characterization Study (SEARCH) site at Jefferson Street in Atlanta, GA (JST) and the Southern Oxidant Study (SOS) Cornelia Fort Airpark (CFA) site downwind of Nashville, TN. In the base case, SAPRC99 predicted more ozone (O3) than CB4 or RADM2 almost every hour and especially for afternoon maxima at both JST and CFA. Performance of the 8 km models at JST was better than that of the 32 km ones for all chemistries, reducing the 1 h peak bias by as much as 30 percentage points; at CFA only the RADM2 8 km model improved. The 2 km solutions did not show improved performance over the 8 km ones at either site, with normalized 1 h bias in the peak O3 ranging from 21% at CFA to 43% at JST. In the emissions control cases, SAPRC99 was generally more responsive than CB4 and RADM2 to NOx and VOC controls, excepting hours at JST with predicted increased O3 from NOx control. Differential sensitivity to chemical mechanism varied by more than ±10% for NOx control at JST and CFA, and in a similar range for VOC control at JST. VOC control at the more strongly NOx-limited urban CFA site produced a differential sensitivity response of <5%. However, even when differential sensitivities in control cases were small, neither their sign nor their magnitude could be reliably determined from model performance in the full emissions case, meaning that the degree of O3 response to a change in chemical mechanism can differ substantially with the level of precursor emissions. Hence we conclude that properly understanding the effects of changes in a model's chemical mechanism always requires emissions control cases as part of model sensitivity analysis.

  2. Improved Horvitz-Thompson Estimation of Model Parameters from Two-phase Stratified Samples: Applications in Epidemiology

    PubMed Central

    Breslow, Norman E.; Lumley, Thomas; Ballantyne, Christie M; Chambless, Lloyd E.; Kulich, Michal

    2009-01-01

    The case-cohort study involves two-phase sampling: simple random sampling from an infinite super-population at phase one and stratified random sampling from a finite cohort at phase two. Standard analyses of case-cohort data involve solution of inverse probability weighted (IPW) estimating equations, with weights determined by the known phase two sampling fractions. The variance of parameter estimates in (semi)parametric models, including the Cox model, is the sum of two terms: (i) the model based variance of the usual estimates that would be calculated if full data were available for the entire cohort; and (ii) the design based variance from IPW estimation of the unknown cohort total of the efficient influence function (IF) contributions. This second variance component may be reduced by adjusting the sampling weights, either by calibration to known cohort totals of auxiliary variables correlated with the IF contributions or by their estimation using these same auxiliary variables. Both adjustment methods are implemented in the R survey package. We derive the limit laws of coefficients estimated using adjusted weights. The asymptotic results suggest practical methods for construction of auxiliary variables that are evaluated by simulation of case-cohort samples from the National Wilms Tumor Study and by log-linear modeling of case-cohort data from the Atherosclerosis Risk in Communities Study. Although not semiparametric efficient, estimators based on adjusted weights may come close to achieving full efficiency within the class of augmented IPW estimators. PMID:20174455
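The phase-two IPW step can be illustrated with a toy stratified sample; the strata, sampling fractions and influence-function values below are invented:

```python
# Stratified phase-two sample: per stratum, (sampling fraction, sampled
# influence-function contributions). Values are illustrative, not taken
# from the studies analyzed in the paper.
strata = {
    "case":    (1.0,  [0.8, 1.1, 0.9]),        # cases sampled with certainty
    "control": (0.25, [0.2, -0.1, 0.3, 0.0]),  # 25% of controls sampled
}

# Horvitz-Thompson (IPW) estimate of the cohort total of IF contributions:
# each sampled value is weighted by the inverse of its sampling fraction.
ht_total = sum(
    sum(values) / fraction
    for fraction, values in strata.values()
)
print(f"IPW estimate of cohort total: {ht_total:.2f}")
```

The paper's adjustment methods replace these known-fraction weights with calibrated or estimated weights built from auxiliary variables, shrinking the design-based variance component.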

  3. Energy Performance Assessment of Radiant Cooling System through Modeling and Calibration at Component Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Yasin; Mathur, Jyotirmay; Bhandari, Mahabir S

    2016-01-01

The paper describes a case study of an information technology office building with a radiant cooling system and a conventional variable air volume (VAV) system installed side by side so that performance can be compared. First, a 3D model of the building involving architecture, occupancy, and HVAC operation was developed in EnergyPlus, a simulation tool. Second, a calibration methodology was applied to develop the base case for assessing the energy saving potential. This paper details the calibration of the whole building energy model down to the component level, including lighting, equipment, and HVAC components such as chillers, pumps, cooling towers, and fans. A new methodology for the systematic selection of influence parameters was also developed for the calibration of a simulated model whose execution requires substantial time. The error at the whole building level [measured as mean bias error (MBE)] is 0.2%, and the coefficient of variation of root mean square error (CvRMSE) is 3.2%. The total errors in HVAC at the hourly level are MBE = 8.7% and CvRMSE = 23.9%, which meet the criteria of ASHRAE Guideline 14 (2002) for hourly calibration. Several suggestions are made for generalizing the energy savings of radiant cooling systems to existing building systems. A base case model was then developed from the calibrated model to quantify the energy saving potential of the radiant cooling system. It was found that a radiant cooling system integrated with a dedicated outdoor air system (DOAS) can save 28% energy compared with the conventional VAV system.
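The two calibration metrics are straightforward to compute; a sketch using the common ASHRAE Guideline 14 definitions, with invented hourly readings:

```python
def mbe_percent(measured, simulated):
    """Mean bias error (%), per the usual ASHRAE Guideline 14 definition."""
    return 100.0 * sum(m - s for m, s in zip(measured, simulated)) / sum(measured)

def cv_rmse_percent(measured, simulated):
    """Coefficient of variation of the RMSE (%)."""
    n = len(measured)
    mean_m = sum(measured) / n
    rmse = (sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n) ** 0.5
    return 100.0 * rmse / mean_m

# Illustrative hourly energy readings (kWh), not the case-study data.
measured  = [50.0, 55.0, 60.0, 58.0]
simulated = [48.0, 56.0, 63.0, 57.0]
print(f"MBE = {mbe_percent(measured, simulated):.1f}%")
print(f"CvRMSE = {cv_rmse_percent(measured, simulated):.1f}%")
```

Guideline 14's hourly criteria (roughly ±10% MBE and 30% CvRMSE) are the thresholds the paper's HVAC-level errors are judged against.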

  4. Web-Based Tools for Modelling and Analysis of Multivariate Data: California Ozone Pollution Activity

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas

    2011-01-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting…

  5. A neurosurgical simulation of skull base tumors using a 3D printed rapid prototyping model containing mesh structures.

    PubMed

    Kondo, Kosuke; Harada, Naoyuki; Masuda, Hiroyuki; Sugo, Nobuo; Terazono, Sayaka; Okonogi, Shinichi; Sakaeyama, Yuki; Fuchinoue, Yutaka; Ando, Syunpei; Fukushima, Daisuke; Nomoto, Jun; Nemoto, Masaaki

    2016-06-01

Deep regions are not visible in three-dimensional (3D) printed rapid prototyping (RP) models prepared from opaque materials, unlike in translucent images. The objectives of this study were to develop an RP model in which a skull base tumor was simulated using mesh, and to investigate its usefulness for surgical simulations by evaluating the visibility of its deep regions. A 3D printer that employs binder jetting and is mainly used to prepare plaster models was used. RP models containing a solid tumor, no tumor, or a mesh tumor were prepared based on computed tomography, magnetic resonance imaging, and angiographic data for four cases of petroclival tumor. Twelve neurosurgeons graded the three types of RP model into the following four categories: 'clearly visible,' 'visible,' 'difficult to see,' and 'invisible,' based on the visibility of the internal carotid artery, basilar artery, and brain stem through a craniotomy performed via the combined transpetrosal approach. In addition, the 3D positional relationships between these structures and the tumor were assessed. The internal carotid artery, basilar artery, and brain stem, and the positional relationships of these structures with the tumor, were significantly more visible in the RP models with mesh tumors than in those with solid or no tumors. The deep regions of RP models containing mesh skull base tumors were easy to visualize. This 3D printing-based method might be applicable to various surgical simulations.

  6. Comparative effectiveness of incorporating a hypothetical DCIS prognostic marker into breast cancer screening.

    PubMed

    Trentham-Dietz, Amy; Ergun, Mehmet Ali; Alagoz, Oguzhan; Stout, Natasha K; Gangnon, Ronald E; Hampton, John M; Dittus, Kim; James, Ted A; Vacek, Pamela M; Herschorn, Sally D; Burnside, Elizabeth S; Tosteson, Anna N A; Weaver, Donald L; Sprague, Brian L

    2018-02-01

    Due to limitations in the ability to identify non-progressive disease, ductal carcinoma in situ (DCIS) is usually managed similarly to localized invasive breast cancer. We used simulation modeling to evaluate the potential impact of a hypothetical test that identifies non-progressive DCIS. A discrete-event model simulated a cohort of U.S. women undergoing digital screening mammography. All women diagnosed with DCIS underwent the hypothetical DCIS prognostic test. Women with test results indicating progressive DCIS received standard breast cancer treatment and a decrement to quality of life corresponding to the treatment. If the DCIS test indicated non-progressive DCIS, no treatment was received and women continued routine annual surveillance mammography. A range of test performance characteristics and prevalence of non-progressive disease were simulated. Analysis compared discounted quality-adjusted life years (QALYs) and costs for test scenarios to base-case scenarios without the test. Compared to the base case, a perfect prognostic test resulted in a 40% decrease in treatment costs, from $13,321 to $8005 USD per DCIS case. A perfect test produced 0.04 additional QALYs (16 days) for women diagnosed with DCIS, added to the base case of 5.88 QALYs per DCIS case. The results were sensitive to the performance characteristics of the prognostic test, the proportion of DCIS cases that were non-progressive in the model, and the frequency of mammography screening in the population. A prognostic test that identifies non-progressive DCIS would substantially reduce treatment costs but result in only modest improvements in quality of life when averaged over all DCIS cases.
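The headline comparison can be reproduced directly from the per-case figures quoted above:

```python
# Per-DCIS-case figures quoted in the abstract: base case vs. a perfect
# prognostic test that identifies non-progressive DCIS.
base_cost, test_cost = 13321.0, 8005.0  # USD treatment cost per DCIS case
base_qaly, qaly_gain = 5.88, 0.04       # discounted QALYs per DCIS case

cost_saving_pct = 100.0 * (base_cost - test_cost) / base_cost
print(f"treatment cost reduction: {cost_saving_pct:.0f}%")      # ~40%
print(f"QALYs with perfect test: {base_qaly + qaly_gain:.2f}")
```

The asymmetry is the abstract's main point: a large relative cost saving against a small absolute QALY gain when averaged over all DCIS cases.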

  7. Medicare: Better Controls Needed for Peer Review Organizations’ Evaluations.

    DTIC Science & Technology

    1987-10-01

reflected significant uncertainties (percent of cases reviewed and cost per case). The national funding model utilized to develop the Government estimates...differences. As GAO stated, HCFA's funding model was based on data from HCFA's actuary, the Department of Labor, professional medical associations and... funding model extremely valuable in negotiating PRO contracts. The conclusion reached by GAO that the overall average cost contracted per review was less

  8. GLISTRboost: Combining Multimodal MRI Segmentation, Registration, and Biophysical Tumor Growth Modeling with Gradient Boosting Machines for Glioma Segmentation.

    PubMed

    Bakas, Spyridon; Zeng, Ke; Sotiras, Aristeidis; Rathore, Saima; Akbari, Hamed; Gaonkar, Bilwaj; Rozycki, Martin; Pati, Sarthak; Davatzikos, Christos

    2016-01-01

    We present an approach for segmenting low- and high-grade gliomas in multimodal magnetic resonance imaging volumes. The proposed approach is based on a hybrid generative-discriminative model. Firstly, a generative approach based on an Expectation-Maximization framework that incorporates a glioma growth model is used to segment the brain scans into tumor, as well as healthy tissue labels. Secondly, a gradient boosting multi-class classification scheme is used to refine tumor labels based on information from multiple patients. Lastly, a probabilistic Bayesian strategy is employed to further refine and finalize the tumor segmentation based on patient-specific intensity statistics from the multiple modalities. We evaluated our approach in 186 cases during the training phase of the BRAin Tumor Segmentation (BRATS) 2015 challenge and report promising results. During the testing phase, the algorithm was additionally evaluated in 53 unseen cases, achieving the best performance among the competing methods.

  9. The modeler's influence on calculated solubilities for performance assessments at the Aspo Hard-rock Laboratory

    USGS Publications Warehouse

    Ernren, A.T.; Arthur, R.; Glynn, P.D.; McMurry, J.

    1999-01-01

Four researchers were asked to provide independent modeled estimates of the solubility of a radionuclide solid phase, specifically Pu(OH)4, under five specified sets of conditions. The objectives of the study were to assess the variability in the results obtained and to determine the primary causes for this variability. In the exercise, modelers were supplied with the composition, pH and redox properties of the water and with a description of the mineralogy of the surrounding fracture system. A standard thermodynamic data base was provided to all modelers. Each modeler was encouraged to use other data bases in addition to the standard data base and to try different approaches to solving the problem. In all, about fifty approaches were used, some of which included a large number of solubility calculations. For each of the five test cases, the calculated solubilities from different approaches covered several orders of magnitude. The variability resulting from the use of different thermodynamic data bases was, in most cases, far smaller than that resulting from the use of different approaches to solving the problem.

  10. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  11. Target prioritization and strategy selection for active case-finding of pulmonary tuberculosis: a tool to support country-level project planning.

    PubMed

    Nishikiori, Nobuyuki; Van Weezenbeek, Catharina

    2013-02-02

Despite the progress made in the past decade, tuberculosis (TB) control still faces significant challenges. In many countries with declining TB incidence, the disease tends to concentrate in vulnerable populations that often have limited access to health care. In light of the limitations of the current case-finding approach and the global urgency to improve case detection, active case-finding (ACF) has been suggested as an important complementary strategy to accelerate tuberculosis control especially among high-risk populations. The present exercise aims to develop a model that can be used for country-level project planning. A simple deterministic model was developed to calculate the number of estimated TB cases diagnosed and the associated costs of diagnosis. The model was designed to compare cost-effectiveness parameters, such as the cost per case detected, for different diagnostic algorithms when they are applied to different risk populations. The model was transformed into a web-based tool that can support national TB programmes and civil society partners in designing ACF activities. According to the model output, tuberculosis active case-finding can be a costly endeavor, depending on the target population and the diagnostic strategy. The analysis suggests the following: (1) Active case-finding activities are cost-effective only if the tuberculosis prevalence among the target population is high. (2) Extensive diagnostic methods (e.g. X-ray screening for the entire group, use of sputum culture or molecular diagnostics) can be applied only to very high-risk groups such as TB contacts, prisoners or people living with human immunodeficiency virus (HIV) infection. (3) Basic diagnostic approaches such as TB symptom screening are always applicable although the diagnostic yield is very limited. The cost-effectiveness parameter was sensitive to local diagnostic costs and the tuberculosis prevalence of target populations.
The prioritization of appropriate target populations and careful selection of cost-effective diagnostic strategies are critical prerequisites for rational active case-finding activities. A decision to conduct such activities should be based on the setting-specific cost-effectiveness analysis and programmatic assessment. A web-based tool was developed and is available to support national tuberculosis programmes and partners in the formulation of cost-effective active case-finding activities at the national and subnational levels.

  12. ARMA-Based SEM When the Number of Time Points T Exceeds the Number of Cases N: Raw Data Maximum Likelihood.

    ERIC Educational Resources Information Center

    Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.

    2003-01-01

    Demonstrated, through simulation, that stationary autoregressive moving average (ARMA) models may be fitted readily when T>N, using normal theory raw maximum likelihood structural equation modeling. Also provides some illustrations based on real data. (SLD)
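    As a toy illustration of fitting in the T > N regime, the conditional maximum-likelihood estimate of a first-order autoregressive parameter can be recovered from a single long series (N = 1); this sketch uses a plain AR(1) regression rather than the paper's SEM formulation, and all values are simulated:

```python
import random

random.seed(42)

# Simulate a single stationary AR(1) series (N = 1 case, T = 2000 time
# points): x_t = phi * x_{t-1} + e_t. With T far exceeding N, the
# autoregressive parameter can be recovered from one long series.
phi_true = 0.6
T = 2000
x = [0.0]
for _ in range(T - 1):
    x.append(phi_true * x[-1] + random.gauss(0.0, 1.0))

# The conditional maximum-likelihood estimate of phi for an AR(1) with
# Gaussian errors reduces to an ordinary least-squares regression of
# x_t on its own lagged value x_{t-1}.
num = sum(x[t] * x[t - 1] for t in range(1, T))
den = sum(x[t - 1] ** 2 for t in range(1, T))
phi_hat = num / den

print(round(phi_hat, 2))
```

With a long enough series the estimate lands close to the true value of 0.6; a full ARMA(p, q) fit would add moving-average terms to the likelihood but follows the same raw-ML logic.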

  13. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    PubMed

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model by suitably combining the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using the Chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up execution through the use of a critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ2 CBR model at the preprocessing stage of the predictive procedures. The implementation of the proposed risk calculator is achieved through the use of an in-house developed PHP program running on a XAMPP/Apache HTTP server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperforms the baseline methods studied. The result of our experiment shows that the precision rate and predictive quality of our system in most cases are equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk-level prediction to both patients and physicians at any time, online and on a real-time basis.
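    The retrieval step at the heart of a Chi-square case-based reasoner can be sketched as follows; the feature vectors, outcome labels, and this particular chi-square distance form are illustrative assumptions, not the study's implementation:

```python
# Chi-square distance between two nonnegative feature vectors:
# d(a, b) = sum_i (a_i - b_i)^2 / (a_i + b_i), skipping zero denominators.
def chi_square_distance(a, b):
    return sum((x - y) ** 2 / (x + y) for x, y in zip(a, b) if x + y > 0)

# A toy case base: feature vectors (e.g. normalized risk-factor values)
# paired with known outcomes from past cases.
case_base = [
    ([0.8, 0.1, 0.3], "high risk"),
    ([0.2, 0.7, 0.1], "low risk"),
    ([0.6, 0.2, 0.5], "high risk"),
]

def retrieve(query):
    # Retrieve the stored case with the smallest chi-square distance
    # to the query; its outcome serves as the prediction.
    return min(case_base, key=lambda c: chi_square_distance(query, c[0]))

features, outcome = retrieve([0.7, 0.15, 0.4])
print(outcome)  # nearest case is ([0.6, 0.2, 0.5], "high risk")
```

The paper's engine additionally prunes retrieval using a critical χ2 value, so that cases beyond the threshold distance are never scored in full.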

  14. Occupational lead poisoning: who should conduct surveillance and training?

    PubMed

    Keogh, J P; Gordon, J

    1994-11-01

    This commentary challenges the current employer-controlled model for delivering occupational health services. Problems emanating from traditional employer-based medical surveillance and worker education programs for occupational lead poisoning are identified. A new public health model for delivering these services is proposed. This model utilizes a case-based and hazard-based method for bringing workplaces and employers into the program and features direct delivery of surveillance and training services by public health agencies.

  15. An approach to verification and validation of a reliable multicasting protocol: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the process of implementing a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally using a combination of formal and informal techniques in an attempt to ensure the correctness of its implementation. Our development process involved three concurrent activities: (1) the initial construction and incremental enhancement of a formal state model of the protocol machine; (2) the initial coding and incremental enhancement of the implementation; and (3) model-based testing of iterative implementations of the protocol. These activities were carried out by two separate teams: a design team and a V&V team. The design team built the first version of RMP with limited functionality to handle only nominal requirements of data delivery. This initial version did not handle off-nominal cases such as network partitions or site failures. Meanwhile, the V&V team concurrently developed a formal model of the requirements using a variant of SCR-based state tables. Based on these requirements tables, the V&V team developed test cases to exercise the implementation. In a series of iterative steps, the design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation. This was done by generating test cases based on suspected errant or off-nominal behaviors predicted by the current model. If the execution of a test in the model and implementation agreed, then the test either found a potential problem or verified a required behavior. However, if the execution of a test was different in the model and implementation, then the differences helped identify inconsistencies between the model and implementation. In either case, the dialogue between both teams drove the co-evolution of the model and implementation. 
We have found that this interactive, iterative approach to development allows software designers to focus on delivery of nominal functionality while the V&V team can focus on analysis of off-nominal cases. Testing serves as the vehicle for keeping the model and implementation in fidelity with each other. This paper describes (1) our experiences in developing our process model; and (2) three example problems found during the development of RMP. Although RMP has provided our research effort with a rich set of test cases, it also has practical applications within NASA. For example, RMP is being considered for use in the NASA EOSDIS project due to its significant performance benefits in applications that need to replicate large amounts of data to many network sites.

  16. Bayes factors based on robust TDT-type tests for family trio design.

    PubMed

    Yuan, Min; Pan, Xiaoqing; Yang, Yaning

    2015-06-01

Adaptive transmission disequilibrium test (aTDT) and MAX3 test are two robust-efficient association tests for case-parent family trio data. Both tests incorporate information of common genetic models including recessive, additive and dominant models and are efficient in power and robust to genetic model specifications. The aTDT uses information of departure from Hardy-Weinberg disequilibrium to identify the potential genetic model underlying the data and then applies the corresponding TDT-type test, and the MAX3 test is defined as the maximum of the absolute value of three TDT-type tests under the three common genetic models. In this article, we propose three robust Bayes procedures, the aTDT based Bayes factor, MAX3 based Bayes factor and Bayes model averaging (BMA), for association analysis with case-parent trio design. The asymptotic distributions of aTDT under the null and alternative hypotheses are derived in order to calculate its Bayes factor. Extensive simulations show that the Bayes factors and the p-values of the corresponding tests are generally consistent and these Bayes factors are robust to genetic model specifications, especially so when the priors on the genetic models are equal. When equal priors are used for the underlying genetic models, the Bayes factor method based on aTDT is more powerful than those based on MAX3 and Bayes model averaging. When the prior places a small (large) probability on the true model, the Bayes factor based on aTDT (BMA) is more powerful. An analysis of simulated rheumatoid arthritis (RA) data from Genetic Analysis Workshop 15 (GAW15) is presented to illustrate applications of the proposed methods.

  17. Applying Corpus-Based Findings to Form-Focused Instruction: The Case of Reported Speech

    ERIC Educational Resources Information Center

    Barbieri, Federica; Eckhardt, Suzanne E. B.

    2007-01-01

    Arguing that the introduction of corpus linguistics in teaching materials and the language classroom should be informed by theories and principles of SLA, this paper presents a case study illustrating how corpus-based findings on reported speech can be integrated into a form-focused model of instruction. After overviewing previous work which…

  18. Rural Governance, Community Empowerment and the New Institutionalism: A Case Study of the Isle of Wight

    ERIC Educational Resources Information Center

    Clark, David; Southern, Rebekah; Beer, Julian

    2007-01-01

    This article compares two different institutional models--state-sponsored rural partnerships and community-based development trusts--for engaging and empowering local communities in area-based regeneration, using the Isle of Wight as a case study. Following a critical review of the literature on community governance, we evaluate the effectiveness…

  19. Case-Based Planning: An Integrated Theory of Planning, Learning and Memory

    DTIC Science & Technology

    1986-10-01

    (Scanned report documentation page; recoverable keywords: planning, case-based reasoning, learning, artificial intelligence. Cited works include A Computational Model of Analogical Problem Solving, Proceedings of the Seventh International Joint Conference on Artificial Intelligence, and Understanding and Generalizing Plans, Proceedings of the Eighth International Joint Conference on Artificial Intelligence, IJCAI, Karlsruhe, Germany.)

  20. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    PubMed

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.

  1. Combining Model-Based and Feature-Driven Diagnosis Approaches - A Case Study on Electromechanical Actuators

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Roychoudhury, Indranil; Balaban, Edward; Saxena, Abhinav

    2010-01-01

Model-based diagnosis typically uses analytical redundancy to compare predictions from a model against observations from the system being diagnosed. However, this approach does not work very well when it is not feasible to create analytic relations describing all the observed data, e.g., for vibration data, which is usually sampled at very high rates and requires very detailed finite element models to describe its behavior. In such cases, features (in time and frequency domains) that contain diagnostic information are extracted from the data. Since this is a computationally intensive process, it is not efficient to extract all the features all the time. In this paper we present an approach that combines the analytic model-based and feature-driven diagnosis approaches. The analytic approach is used to reduce the set of possible faults, and then features are chosen to best distinguish among the remaining faults. We describe an implementation of this approach on the Flyable Electro-mechanical Actuator (FLEA) test bed.

  2. Improving Parolees' Participation in Drug Treatment and Other Services through Strengths Case Management.

    PubMed

    Prendergast, Michael; Cartier, Jerome J

    2008-01-01

In an effort to increase participation in community aftercare treatment for substance-abusing parolees, an intervention based on a transitional case management (TCM) model that focuses mainly on offenders' strengths has been developed and is under testing. This model consists of completion, by the inmate, of a self-assessment of strengths that informs the development of the continuing care plan, a case conference call shortly before release, and strengths case management for three months post-release to promote retention in substance abuse treatment and support the participant's access to designated services in the community. The post-release component consists of a minimum of one weekly client/case manager meeting (in person or by telephone) for 12 weeks. The intervention is intended to improve the transition process from prison to community at both the individual and systems level. Specifically, the intervention is designed to improve outcomes in parolee admission to, and retention in, community-based substance-abuse treatment, parolee access to other needed services, and recidivism rates during the first year of parole. On the systems level, the intervention is intended to improve the communication and collaboration between criminal justice agencies, community-based treatment organizations, and other social and governmental service providers. The TCM model is being tested in a multisite study through the Criminal Justice Drug Abuse Treatment Studies (CJ-DATS) research cooperative funded by the National Institute on Drug Abuse.

  3. Comparing Dutch case management care models for people with dementia and their caregivers: The design of the COMPAS study.

    PubMed

    MacNeil Vroomen, Janet; Van Mierlo, Lisa D; van de Ven, Peter M; Bosmans, Judith E; van den Dungen, Pim; Meiland, Franka J M; Dröes, Rose-Marie; Moll van Charante, Eric P; van der Horst, Henriëtte E; de Rooij, Sophia E; van Hout, Hein P J

    2012-05-28

Dementia care in the Netherlands is shifting from fragmented, ad hoc care to more coordinated and personalised care. Case management contributes to this shift. The linkage model and a combination of intensive case management and joint agency care models were selected based on their emerging prominence in the Netherlands. It is unclear if these different forms of case management are more effective than usual care in improving or preserving the functioning and well-being at the patient and caregiver level, and at what societal cost. The objective of this article is to describe the design of a study comparing these two case management care models against usual care. Clinical and cost outcomes are investigated while care processes and the facilitators and barriers for implementation of these models are considered. Mixed methods include a prospective, observational, controlled, cohort study among persons with dementia and their primary informal caregiver in regions of the Netherlands with and without case management including a qualitative process evaluation. Inclusion criteria for the cohort study are: community-dwelling individuals with a dementia diagnosis who are not terminally-ill or anticipate admission to a nursing home within 6 months and with an informal caregiver who speaks fluent Dutch. Person with dementia-informal caregiver dyads are followed for two years. The primary outcome measure is the Neuropsychiatric Inventory for the people with dementia and the General Health Questionnaire for their caregivers. Secondary outcomes include: quality of life and needs assessment in both persons with dementia and caregivers, activity of daily living, competence of care, and number of crises. Costs are measured from a societal perspective using cost diaries. Process indicators measure the quality of care from the participant's perspective. The qualitative study uses purposive sampling methods to ensure a wide variation of respondents.
Semi-structured interviews with stakeholders based on the theoretical model of adaptive implementation are planned. This study provides relevant insights into care processes, description of two case management models along with clinical and economic data from persons with dementia and caregivers to clarify important differences in two case management care models compared to usual care.

  4. Forecasting invasive pneumococcal disease trends after the introduction of 13-valent pneumococcal conjugate vaccine in the United States, 2010-2020.

    PubMed

    Link-Gelles, Ruth; Taylor, Thomas; Moore, Matthew R

    2013-05-24

    Pneumococcal vaccines are highly effective at preventing invasive pneumococcal disease (IPD), a leading cause of global morbidity. Because pneumococcal vaccines can be expensive, it is useful to estimate what impact might be expected from their introduction. Our objective was to develop a statistical model that could predict rates of IPD following introduction of 13-valent pneumococcal conjugate vaccine (PCV13) in the U.S. We used active surveillance data to design and validate a Poisson model forecasting the reductions in IPD observed after U.S. introduction of 7-valent pneumococcal conjugate vaccine (PCV7) in 2000. We used this model to forecast rates of IPD from 2010 to 2020 in the presence of PCV13. Because increases in non-PCV7-type IPD were evident following PCV7 introduction, we evaluated varying levels of increase in non-PCV13-type IPD ("serotype replacement") by sensitivity analyses. A total of 43,507 cases of IPD were identified during 1998-2009; cases from this period were used to develop the model, which accurately predicted indirect effects of PCV7 in adults, as well as serotype replacement. Assuming that PCV13 provides similar protection against PCV13 serotypes as PCV7 did against PCV7 serotypes, the base-case model predicted approximately 168,000 cases of IPD prevented from 2011 to 2020. When serotype replacement was varied in sensitivity analyses from 0 to levels comparable to that seen with serotype 19A (the most common replacement serotype since PCV7 was introduced), the model predicted 167,000-170,000 cases prevented. The base-case model predicted rates of IPD in children under five years of age decreasing from 21.9 to 9.3 cases per 100,000 population. This model provides a "benchmark" for assessing progress in the prevention of IPD in the years after PCV13 introduction. The amount of serotype replacement is unlikely to greatly affect the overall number of cases prevented by PCV13. Published by Elsevier Ltd.
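    A deterministic toy projection in the spirit of such a model can be sketched as follows; the baseline rates, decline rate, and replacement parameter are illustrative placeholders, not the paper's fitted values:

```python
# Toy projection of IPD incidence after vaccine introduction:
# vaccine-type disease declines each year while non-vaccine-type
# ("replacement") disease may slowly rise. All numbers are invented
# for illustration, not the study's fitted Poisson model parameters.
baseline_vt = 15.0       # vaccine-type cases per 100,000 before introduction
baseline_nvt = 7.0       # non-vaccine-type cases per 100,000
annual_decline = 0.30    # fractional decline in vaccine-type IPD per year
replacement_rise = 0.02  # fractional rise in non-vaccine-type IPD per year

rates = []
for year in range(11):  # years 0..10 after introduction
    vt = baseline_vt * (1 - annual_decline) ** year
    nvt = baseline_nvt * (1 + replacement_rise) ** year
    rates.append(vt + nvt)

print(round(rates[0], 1), round(rates[10], 1))  # 22.0 9.0
```

Even with modest serotype replacement, the overall rate falls sharply because vaccine-type disease dominates the baseline, which mirrors the paper's finding that replacement has little effect on total cases prevented.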

  5. Comparing Dutch Case management care models for people with dementia and their caregivers: The design of the COMPAS study

    PubMed Central

    2012-01-01

Background: Dementia care in the Netherlands is shifting from fragmented, ad hoc care to more coordinated and personalised care. Case management contributes to this shift. The linkage model and a combination of intensive case management and joint agency care models were selected based on their emerging prominence in the Netherlands. It is unclear if these different forms of case management are more effective than usual care in improving or preserving the functioning and well-being at the patient and caregiver level, and at what societal cost. The objective of this article is to describe the design of a study comparing these two case management care models against usual care. Clinical and cost outcomes are investigated while care processes and the facilitators and barriers for implementation of these models are considered. Design: Mixed methods include a prospective, observational, controlled, cohort study among persons with dementia and their primary informal caregiver in regions of the Netherlands with and without case management including a qualitative process evaluation. Inclusion criteria for the cohort study are: community-dwelling individuals with a dementia diagnosis who are not terminally-ill or anticipate admission to a nursing home within 6 months and with an informal caregiver who speaks fluent Dutch. Person with dementia-informal caregiver dyads are followed for two years. The primary outcome measure is the Neuropsychiatric Inventory for the people with dementia and the General Health Questionnaire for their caregivers. Secondary outcomes include: quality of life and needs assessment in both persons with dementia and caregivers, activity of daily living, competence of care, and number of crises. Costs are measured from a societal perspective using cost diaries. Process indicators measure the quality of care from the participant’s perspective. The qualitative study uses purposive sampling methods to ensure a wide variation of respondents. 
Semi-structured interviews with stakeholders based on the theoretical model of adaptive implementation are planned. Discussion: This study provides relevant insights into care processes, description of two case management models along with clinical and economic data from persons with dementia and caregivers to clarify important differences in two case management care models compared to usual care. PMID:22640695

  6. A modeled economic analysis of a digital tele-ophthalmology system as used by three federal health care agencies for detecting proliferative diabetic retinopathy.

    PubMed

    Whited, John D; Datta, Santanu K; Aiello, Lloyd M; Aiello, Lloyd P; Cavallerano, Jerry D; Conlin, Paul R; Horton, Mark B; Vigersky, Robert A; Poropatich, Ronald K; Challa, Pratap; Darkins, Adam W; Bursell, Sven-Erik

    2005-12-01

The objective of this study was to compare, using a 12-month time frame, the cost-effectiveness of a non-mydriatic digital tele-ophthalmology system (Joslin Vision Network) versus traditional clinic-based ophthalmoscopy examinations with pupil dilation to detect proliferative diabetic retinopathy and its consequences. Decision analysis techniques, including Monte Carlo simulation, were used to model the use of the Joslin Vision Network versus conventional clinic-based ophthalmoscopy among the entire diabetic populations served by the Indian Health Service, the Department of Veterans Affairs, and the active duty Department of Defense. The economic perspective analyzed was that of each federal agency. Data sources for costs and outcomes included the published literature, epidemiologic data, administrative data, market prices, and expert opinion. Outcome measures included the number of true positive cases of proliferative diabetic retinopathy detected, the number of patients treated with panretinal laser photocoagulation, and the number of cases of severe vision loss averted. In the base-case analyses, the Joslin Vision Network was the dominant strategy in all but two of the nine modeled scenarios, meaning that it was both less costly and more effective. In the active duty Department of Defense population, the Joslin Vision Network would be more effective but cost an extra 1,618 dollars per additional patient treated with panretinal laser photocoagulation and an additional 13,748 dollars per severe vision loss event averted. Based on our economic model, the Joslin Vision Network has the potential to be more effective than clinic-based ophthalmoscopy for detecting proliferative diabetic retinopathy and averting cases of severe vision loss, and may do so at lower cost.

  7. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for a comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).

  8. Unsteady Computational Tests of a Non-Equilibrium

    NASA Astrophysics Data System (ADS)

    Jirasek, Adam; Hamlington, Peter; Lofthouse, Andrew; USAFA Collaboration; CU Boulder Collaboration

    2017-11-01

A non-equilibrium turbulence model is assessed on simulations of three practically relevant unsteady test cases: oscillating channel flow, transonic flow around an oscillating airfoil, and transonic flow around the Benchmark Super-Critical Wing. The first case is related to piston-driven flows, while the remaining cases are relevant to unsteady aerodynamics at high angles of attack and transonic speeds. Non-equilibrium turbulence effects arise in each of these cases in the form of a lag between the mean strain rate and Reynolds stresses, resulting in reduced kinetic energy production compared to classical equilibrium turbulence models that are based on the gradient transport (or Boussinesq) hypothesis. As a result of the improved representation of unsteady flow effects, the non-equilibrium model provides substantially better agreement with available experimental data than do classical equilibrium turbulence models. This suggests that the non-equilibrium model may be ideally suited for simulations of modern high-speed, high-angle-of-attack aerodynamics problems.

  9. A jazz-based approach for optimal setting of pressure reducing valves in water distribution networks

    NASA Astrophysics Data System (ADS)

    De Paola, Francesco; Galdiero, Enzo; Giugni, Maurizio

    2016-05-01

    This study presents a model for valve setting in water distribution networks (WDNs), with the aim of reducing the level of leakage. The approach is based on the harmony search (HS) optimization algorithm. The HS mimics a jazz improvisation process able to find the best solutions, in this case corresponding to valve settings in a WDN. The model also interfaces with the improved version of a popular hydraulic simulator, EPANET 2.0, to check the hydraulic constraints and to evaluate the performances of the solutions. Penalties are introduced in the objective function in case of violation of the hydraulic constraints. The model is applied to two case studies, and the obtained results in terms of pressure reductions are comparable with those of competitive metaheuristic algorithms (e.g. genetic algorithms). The results demonstrate the suitability of the HS algorithm for water network management and optimization.
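    A minimal harmony search, applied here to a toy one-dimensional objective standing in for the leakage cost, can be sketched as follows; the parameter values and objective are illustrative, not those used in the study:

```python
import random

random.seed(1)

# Toy objective standing in for network leakage cost as a function of a
# single valve setting x; the study's real objective comes from EPANET
# hydraulic simulations with penalties for constraint violations.
def objective(x):
    return (x - 3.0) ** 2

# Standard harmony search parameters: harmony memory size, harmony
# memory considering rate (hmcr), pitch adjusting rate (par), and the
# bandwidth of each pitch adjustment.
hm_size, hmcr, par, bandwidth, iters = 10, 0.9, 0.3, 0.5, 2000
lo, hi = 0.0, 10.0
memory = [random.uniform(lo, hi) for _ in range(hm_size)]

for _ in range(iters):
    if random.random() < hmcr:
        x = random.choice(memory)          # draw from harmony memory
        if random.random() < par:          # pitch adjustment (local tweak)
            x = min(hi, max(lo, x + random.uniform(-bandwidth, bandwidth)))
    else:
        x = random.uniform(lo, hi)         # random improvisation
    worst = max(memory, key=objective)
    if objective(x) < objective(worst):    # replace the worst harmony
        memory[memory.index(worst)] = x

best = min(memory, key=objective)
print(round(best, 2))
```

In the full model, x would be a vector of valve settings, and each objective evaluation would call the hydraulic solver; the improvisation loop itself is unchanged.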

  10. Iowa Case Management for Rural Drug Abuse

    ERIC Educational Resources Information Center

    Hall, James A.; Vaughan Sarrazin, Mary S.; Huber, Diane L.; Vaughn, Thomas; Block, Robert I.; Reedy, Amanda R.; Jang, MiJin

    2009-01-01

    Objective: The purpose of this research was to evaluate the effectiveness of a comprehensive, strengths-based model of case management for clients in drug abuse treatment. Method: 503 volunteers from residential or intensive outpatient treatment were randomly assigned to one of three conditions of Iowa Case Management (ICM) plus treatment as usual…

  11. Continuum Fatigue Damage Modeling for Use in Life Extending Control

    NASA Technical Reports Server (NTRS)

    Lorenzo, Carl F.

    1994-01-01

This paper develops a simplified continuum (continuous with respect to time, stress, etc.) fatigue damage model for use in Life Extending Control (LEC) studies. The work is based on zero-mean-stress local strain cyclic damage modeling. New nonlinear explicit equation forms of cyclic damage in terms of stress amplitude are derived to facilitate the continuum modeling. Stress-based continuum models are derived, and an extension to plastic strain-strain rate models is also presented. The application of these models to LEC is considered. Progress toward a nonzero-mean-stress continuum model is presented, with new nonlinear explicit equation forms in terms of stress amplitude derived for this case.

  12. The Second Prototype of the Development of a Technological Pedagogical Content Knowledge Based Instructional Design Model: An Implementation Study in a Technology Integration Course

    ERIC Educational Resources Information Center

    Lee, Chia-Jung; Kim, ChanMin

    2014-01-01

    This study presents a refined technological pedagogical content knowledge (also known as TPACK) based instructional design model, which was revised using findings from the implementation study of a prior model. The refined model was applied in a technology integration course with 38 preservice teachers. A case study approach was used in this…

  13. A sorption model for alkalis in cement-based materials - Correlations with solubility and electrokinetic properties

    NASA Astrophysics Data System (ADS)

    Henocq, Pierre

    2017-06-01

In cement-based materials, radionuclide uptake is mainly controlled by calcium silicate hydrates (C-S-H). This work presents an approach for defining a unique set of parameters of a surface complexation model describing the sorption behavior of alkali ions on the C-S-H surface. Alkali sorption processes are modeled using the CD-MUSIC function integrated in the Phreeqc V.3.0.6 geochemical code. Parameterization of the model was performed based on (1) retention, (2) zeta potential, and (3) solubility experimental data from the literature. This paper shows an application of this model to sodium ions. It was shown that retention, i.e. surface interactions, and solubility are closely related, and a consistent sorption model for radionuclides in cement-based materials requires a coupled surface interaction/chemical equilibrium model. In the case of C-S-H with low calcium-to-silicon ratios, sorption of sodium ions on the C-S-H surface strongly influences the chemical equilibrium of the C-S-H + NaCl system by significantly increasing the aqueous calcium concentration. The close relationship between sorption and chemical equilibrium was successfully illustrated by modeling the effect of the solid-to-liquid ratio on the calcium content in solution in the case of C-S-H + NaCl systems.

  14. A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.

    PubMed

    Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao

    2016-07-01

Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive value. To develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records. We conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression. Variables with p value <0.25 in the univariate analysis were further evaluated in multivariate models using backward elimination. Discrimination was assessed using the receiver operating characteristic (ROC) curve. Calibration was evaluated using the McFadden's R2. Net reclassification index (NRI) associated with incorporation of laboratory results was calculated. Results were internally validated. A model similar to existing CRC prediction models which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with a NRI of 47.6 %. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and a NRI of 60.7 %. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.
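    A toy logistic score built from the same kinds of laboratory inputs can be sketched as follows; the coefficients and reference values are hypothetical placeholders, not the study's fitted model:

```python
import math

# Hypothetical logistic risk score using laboratory inputs of the kind
# in the study's model. Coefficients and centering values are invented
# for illustration: lower hematocrit/MCV and a higher
# neutrophil-lymphocyte ratio push the score up.
def crc_risk_score(hematocrit, mcv, lymphocytes, neutrophils):
    nlr = neutrophils / lymphocytes  # neutrophil-lymphocyte ratio
    z = -0.08 * (hematocrit - 42) - 0.05 * (mcv - 90) + 0.4 * (nlr - 2)
    return 1 / (1 + math.exp(-z))    # logistic link -> score in (0, 1)

low = crc_risk_score(hematocrit=44, mcv=92, lymphocytes=2.0, neutrophils=3.6)
high = crc_risk_score(hematocrit=34, mcv=76, lymphocytes=1.2, neutrophils=6.0)
print(round(low, 2), round(high, 2))  # 0.42 0.93
```

A real model would estimate the coefficients by conditional logistic regression on matched case-control data and then assess discrimination via the ROC curve, as the abstract describes.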

  15. Stemflow estimation in a redwood forest using model-based stratified random sampling

    Treesearch

    Jack Lewis

    2003-01-01

    Model-based stratified sampling is illustrated by a case study of stemflow volume in a redwood forest. The approach is actually a model-assisted sampling design in which auxiliary information (tree diameter) is utilized in the design of stratum boundaries to optimize the efficiency of a regression or ratio estimator. The auxiliary information is utilized in both the...
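The model-assisted ratio estimation described in this abstract can be sketched with made-up numbers: within each diameter-based stratum, the sampled stemflow-to-auxiliary ratio is scaled by the known stratum total of the auxiliary variable.

```python
def stratified_ratio_estimate(strata):
    """Ratio estimator of a total, stratified. Each stratum is a dict
    with 'sample': a list of (x, y) pairs (x = auxiliary variable,
    e.g. derived from tree diameter; y = measured stemflow) and
    'X_total': the known stratum total of x. The stratum estimate is
    (sum y / sum x) * X_total, summed over strata."""
    total = 0.0
    for stratum in strata:
        sum_x = sum(x for x, _ in stratum["sample"])
        sum_y = sum(y for _, y in stratum["sample"])
        total += (sum_y / sum_x) * stratum["X_total"]
    return total
```

The efficiency gain over simple expansion comes from y tracking x closely within each stratum, which is exactly what the stratum boundaries are designed to achieve.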

  16. Comparing effects of fire modeling methods on simulated fire patterns and succession: a case study in the Missouri Ozarks

    Treesearch

    Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson

    2008-01-01

    We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...

  17. Evaluation of Student Models on Current Socio-Scientific Topics Based on System Dynamics

    ERIC Educational Resources Information Center

    Nuhoglu, Hasret

    2014-01-01

    This study aims to 1) enable primary school students to develop models that will help them understand and analyze a system, through a learning process based on system dynamics approach, 2) examine and evaluate students' models related to socio-scientific issues using certain criteria. The research method used is a case study. The study sample…

  18. Modeling and projection of dengue fever cases in Guangzhou based on variation of weather factors.

    PubMed

    Li, Chenlu; Wang, Xiaofeng; Wu, Xiaoxu; Liu, Jianing; Ji, Duoying; Du, Juan

    2017-12-15

    Dengue fever is one of the most serious vector-borne infectious diseases, especially in Guangzhou, China. Dengue viruses and their vectors Aedes albopictus are sensitive to climate change primarily in relation to weather factors. Previous research has mainly focused on identifying the relationship between climate factors and dengue cases, or developing dengue case models with some non-climate factors. However, there has been little research addressing the modeling and projection of dengue cases only from the perspective of climate change. This study considered this topic using long time series data (1998-2014). First, sensitive weather factors were identified through meta-analysis that included literature review screening, lagged analysis, and collinear analysis. Then, the key factors were determined: monthly average temperature at a lag of two months, and monthly average relative humidity and monthly average precipitation at lags of three months. Second, time series Poisson analysis was used with the generalized additive model approach to develop a dengue model based on key weather factors for January 1998 to December 2012. Data from January 2013 to July 2014 were used to validate that the model was reliable and reasonable. Finally, future weather data (January 2020 to December 2070) were input into the model to project the occurrence of dengue cases under different climate scenarios (RCP 2.6 and RCP 8.5). Longer time series analysis and scientifically selected weather variables were used to develop a dengue model to ensure reliability. The projections suggested that seasonal disease control (especially in summer and fall) and mitigation of greenhouse gas emissions could help reduce the incidence of dengue fever. The results of this study are intended to provide a scientific and theoretical basis for the prevention and control of dengue fever in Guangzhou. Copyright © 2017 Elsevier B.V. All rights reserved.
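The lag structure described above (temperature at a two-month lag; humidity and precipitation at three-month lags) amounts to shifting each predictor series before model fitting. A minimal alignment sketch with hypothetical monthly series; the GAM fitting itself is omitted:

```python
def build_design(cases, temp, humidity, precip):
    """Pair monthly case counts with lagged weather predictors:
    temperature at lag 2, humidity and precipitation at lag 3
    (the lags stated in the abstract). Returns rows of
    (temp[t-2], humidity[t-3], precip[t-3], cases[t])."""
    start = 3  # largest lag: the first 3 months lack a full predictor set
    rows = []
    for t in range(start, len(cases)):
        rows.append((temp[t - 2], humidity[t - 3], precip[t - 3], cases[t]))
    return rows
```

The aligned rows would then be fed to a Poisson GAM (e.g. smooth terms per weather variable) as in the study.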

  19. Development of a traffic noise prediction model for an urban environment.

    PubMed

    Sharma, Asheesh; Bodhe, G L; Schimak, G

    2014-01-01

    The objective of this study is to develop a traffic noise model under diverse traffic conditions in metropolitan cities. The model calculates equivalent traffic noise from four input variables: equivalent traffic flow (Qe), equivalent vehicle speed (Se), distance (d), and honking (h). Traffic data were collected and statistically analyzed for three cases over 15-min periods during morning and evening rush hours. Case I represents congested traffic where equivalent vehicle speed is <30 km/h, case II represents free-flowing traffic where equivalent vehicle speed is >30 km/h, and case III represents calm traffic where no honking is recorded. The noise model showed better results than an earlier noise model developed for Indian traffic conditions; a comparative assessment between the present and earlier models is also presented in the study. The model is validated against measured noise levels, and the correlation coefficients between measured and predicted noise levels were found to be 0.75, 0.83, and 0.86 for cases I, II, and III respectively. The noise model performs reasonably well under different traffic conditions and could be applied to traffic noise prediction in other regions as well.
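The general shape of such a model can be sketched as follows. The passenger-car-equivalent weights and regression coefficients below are hypothetical placeholders (the abstract does not give the fitted values); the sketch only shows how heterogeneous traffic is folded into an equivalent flow Qe and combined with speed, distance, and honking.

```python
import math

# Hypothetical passenger-car-equivalent (PCU) weights per vehicle class.
PCU = {"car": 1.0, "bus": 3.0, "truck": 3.0, "two_wheeler": 0.5}

def equivalent_flow(counts):
    """Qe: PCU-weighted vehicle count for a 15-min interval."""
    return sum(PCU[v] * n for v, n in counts.items())

def leq(qe, se, d, h, a=10.0, b=-0.1, c=0.2, base=55.0):
    """Illustrative model form (coefficients are made up):
    Leq = base + a*log10(Qe) + b*Se + c*h - 10*log10(d)."""
    return base + a * math.log10(qe) + b * se + c * h - 10.0 * math.log10(d)
```

With any positive coefficient on log10(Qe), predicted noise rises with equivalent flow, which is the qualitative behavior the model captures.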

  20. Linear control of oscillator and amplifier flows*

    NASA Astrophysics Data System (ADS)

    Schmid, Peter J.; Sipp, Denis

    2016-08-01

    Linear control applied to fluid systems near an equilibrium point has important applications for many flows of industrial or fundamental interest. In this article we give an exposition of tools and approaches for the design of control strategies for globally stable or unstable flows. For unstable oscillator flows a feedback configuration and a model-based approach is proposed, while for stable noise-amplifier flows a feedforward setup and an approach based on system identification is advocated. Model reduction and robustness issues are addressed for the oscillator case; statistical learning techniques are emphasized for the amplifier case. Effective suppression of global and convective instabilities could be demonstrated for either case, even though the system-identification approach results in a superior robustness to off-design conditions.

  1. Competency-Based Human Resource Development Strategy

    ERIC Educational Resources Information Center

    Gangani, Noordeen T.; McLean, Gary N.; Braden, Richard A.

    2004-01-01

    This paper explores issues in developing and implementing a competency-based human resource development strategy. The paper summarizes a literature review on how competency models can improve HR performance. A case study is presented of American Medical Systems (AMS), a mid-sized health-care and medical device company, where the model is being…

  2. Goal Structuring Notation in a Radiation Hardening Assurance Case for COTS-Based Spacecraft

    NASA Technical Reports Server (NTRS)

    Witulski, A.; Austin, R.; Evans, J.; Mahadevan, N.; Karsai, G.; Sierawski, B.; LaBel, K.; Reed, R.

    2016-01-01

    The attached presentation is a summary of how mission assurance is supported by model-based representations of spacecraft systems that can define sub-system functionality and interfacing, reliability parameters, as well as detailing a new paradigm for assurance, a model-centric and not document-centric process.

  3. Improving Conceptual Understanding and Representation Skills through Excel-Based Modeling

    ERIC Educational Resources Information Center

    Malone, Kathy L.; Schunn, Christian D.; Schuchardt, Anita M.

    2018-01-01

    The National Research Council framework for science education and the Next Generation Science Standards have created a need for additional research and development of curricula that are both technologically model-based and include engineering practices. This is especially the case for biology education. This paper describes a quasi-experimental…

  4. A Suggested Model for a Working Cyberschool.

    ERIC Educational Resources Information Center

    Javid, Mahnaz A.

    2000-01-01

    Suggests a model for a working cyberschool based on a case study of Kamiak Cyberschool (Washington), a technology-driven public high school. Topics include flexible hours; one-to-one interaction with teachers; a supportive school environment; use of computers, interactive media, and online resources; and self-paced, project-based learning.…

  5. Demonstration of risk based, goal driven framework for hydrological field campaigns and inverse modeling with case studies

    NASA Astrophysics Data System (ADS)

    Harken, B.; Geiges, A.; Rubin, Y.

    2013-12-01

    There are several stages in any hydrological modeling campaign, including: formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and forward modeling and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration, plume travel time, or aquifer recharge rate. These predictions often have significant bearing on some decision that must be made. Examples include: how to allocate limited remediation resources between multiple contaminated groundwater sites, where to place a waste repository site, and what extraction rates can be considered sustainable in an aquifer. Providing an answer to these questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in model parameters, such as hydraulic conductivity, leads to uncertainty in EPM predictions. Often, field campaigns and inverse modeling efforts are planned and undertaken with reduction of parametric uncertainty as the objective. The tool of hypothesis testing allows this to be taken one step further by considering uncertainty reduction in the ultimate prediction of the EPM as the objective and gives a rational basis for weighing costs and benefits at each stage. When using the tool of statistical hypothesis testing, the EPM is cast into a binary outcome. This is formulated as null and alternative hypotheses, which can be accepted and rejected with statistical formality. When accounting for all sources of uncertainty at each stage, the level of significance of this test provides a rational basis for planning, optimization, and evaluation of the entire campaign. Case-specific information, such as the consequences of prediction error and site-specific costs, can be used in establishing selection criteria based on what level of risk is deemed acceptable.
This framework is demonstrated and discussed using various synthetic case studies. The case studies involve contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a given location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical value of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. Different field campaigns are analyzed based on effectiveness in reducing the probability of selecting the wrong hypothesis, which in this case corresponds to reducing uncertainty in the prediction of plume arrival time. To examine the role of inverse modeling in this framework, case studies involving both Maximum Likelihood parameter estimation and Bayesian inversion are used.
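The hypothesis test on plume arrival time described above can be sketched by Monte Carlo: propagate parameter uncertainty through a travel-time model and estimate the probability of arrival after the critical time. The uncertainty model below (lognormal conductivity, fixed distance) is a toy stand-in for the paper's synthetic aquifers.

```python
import random

def prob_late_arrival(sample_travel_time, t_crit, n=10000, seed=0):
    """Monte Carlo estimate of P(arrival time > t_crit) under the
    current parameter uncertainty. H0 ('plume arrives before t_crit')
    would be rejected when this probability exceeds the chosen
    significance threshold; a field campaign is judged by how far it
    pushes this probability away from the decision boundary."""
    rng = random.Random(seed)
    late = sum(1 for _ in range(n) if sample_travel_time(rng) > t_crit)
    return late / n

def sampler(rng):
    """Toy uncertainty model: travel time = distance / conductivity,
    with lognormal conductivity (hypothetical units)."""
    return 100.0 / rng.lognormvariate(0.0, 0.5)
```

With the median conductivity equal to 1, the toy plume is equally likely to arrive before or after t_crit = 100, so the estimate sits near 0.5.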

  6. An intelligent case-adjustment algorithm for the automated design of population-based quality auditing protocols.

    PubMed

    Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A

    2004-01-01

    We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue involved is the trade-off between increased reliability with more general population-based quality measures versus increased validity from individually case-adjusted but more restricted measures done at a greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
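The generalizability-coefficient criterion can be illustrated with a one-facet (provider × audited item) sketch. The variance-component estimates below use standard one-way ANOVA identities for a balanced layout; this is a simplification of the paper's hierarchical, incrementally case-adjusted constraints.

```python
def g_coefficient(scores):
    """One-facet generalizability coefficient for a balanced design:
    scores maps provider -> list of n item scores. Estimates the
    provider variance s2p and residual variance (msw) by one-way
    ANOVA, then returns s2p / (s2p + msw / n)."""
    groups = list(scores.values())
    n = len(groups[0])   # items per provider (assumed balanced)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / (n * k)
    msb = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (k - 1)
    msw = sum((x - sum(g) / n) ** 2 for g in groups for x in g) / (k * (n - 1))
    s2p = max((msb - msw) / n, 0.0)
    denom = s2p + msw / n
    return s2p / denom if denom else 0.0
```

The coefficient approaches 1 when providers differ consistently across items (reliable measure) and 0 when within-provider noise dominates, which is the reliability/validity trade-off the abstract optimizes over.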

  7. Renewable generation technology choice and policies in a competitive electricity supply industry

    NASA Astrophysics Data System (ADS)

    Sarkar, Ashok

    Renewable energy generation technologies have lower externality costs but higher private costs than fossil fuel-based generation. As a result, the choice of renewables in the future generation mix could be affected by the industry's future market-oriented structure because market objectives based on private value judgments may conflict with social policy objectives toward better environmental quality. This research assesses how renewable energy generation choices would be affected in a restructured electricity generation market. A multi-period linear programming-based model (Resource Planning Model) is used to characterize today's electricity supply market in the United States. The model simulates long-range (2000-2020) generation capacity planning and operation decisions under alternative market paradigms. Price-sensitive demand is used to simulate customer preferences in the market. Dynamically changing costs for renewables and a two-step load duration curve are used. A Reference Case represents the benchmark for a socially-optimal diffusion of renewables and a basis for comparing outcomes under alternative market structures. It internalizes externality costs associated with emissions of sulfur dioxide (SO2), nitrogen oxides (NOx), and carbon dioxide (CO2). A Competitive Case represents a market with many generation suppliers and decision-making based on private costs. Finally, a Market Power Case models the extreme case of market power: monopoly. The results suggest that the share of renewables would decrease (and emissions would increase) considerably in both the Competitive and the Market Power Cases with respect to the Reference Case. The reduction is greater in the Market Power Case due to pricing decisions under existing supply capability.
    The research evaluates the following environmental policy options that could overcome market failures in achieving an appropriate level of renewable generation: CO2 emissions tax, SO2 emissions cap, renewable portfolio standards (RPS), and enhanced research and development (R&D). RPS would best ensure an appropriate share of renewables, whereas SO2 emissions caps would not support a shift to renewables in an era of inexpensive natural gas. The effectiveness of the policies is dependent on the market structure. If market power exists, the analyses indicate that generally higher levels of intervention would be necessary to achieve a shift to renewables.

  8. Target space pseudoduality in supersymmetric sigma models on symmetric spaces

    NASA Astrophysics Data System (ADS)

    Sarisaman, Mustafa

    We discuss the target space pseudoduality in supersymmetric sigma models on symmetric spaces. We first consider the case where sigma models are based on real compact connected Lie groups of the same dimensionality, and give examples using three-dimensional models on target spaces. We show explicit construction of nonlocal conserved currents on the pseudodual manifold. We then switch the Lie group valued pseudoduality equations to Lie algebra valued ones, which leads to an infinite number of pseudoduality equations. We obtain an infinite number of conserved currents on the tangent bundle of the pseudodual manifold. Since pseudoduality imposes the condition that sigma models pseudodual to each other are based on symmetric spaces with opposite curvatures (i.e. dual symmetric spaces), we investigate pseudoduality transformation on the symmetric space sigma models in the third chapter. We see that there can be mixing of decomposed spaces with each other, which leads to mixing of the corresponding expressions. We obtain the pseudodual conserved currents, which are viewed as the orthonormal frame on the pullback bundle of the tangent space of G˜, the Lie group on which the pseudodual model is based. Hence we obtain the mixing forms of curvature relations and the one loop renormalization group beta function by means of these currents. In chapter four, we generalize the classical construction of the pseudoduality transformation to the supersymmetric case. We perform this both by the component expansion method on manifold M and by the orthonormal coframe method on manifold SO(M). The component method produces the result that the pseudoduality transformation is not invertible at all points, and occurs from all points on one manifold to only one point where Riemann normal coordinates are valid on the second manifold. Torsion of the sigma model on M must vanish while it is nonvanishing on M˜, and curvatures of the manifolds must be constant and the same because of anticommuting Grassmann numbers.
    We obtain results similar to the classical case with the orthonormal coframe method. In the case of super WZW sigma models, the pseudoduality equations result in three different pseudoduality conditions: flat space, chiral, and antichiral pseudoduality. Finally we study the pseudoduality transformations on symmetric spaces using two different methods again. These two methods yield results similar to the classical cases, with the exception that the commuting bracket relations of the classical case turn out to be anticommuting ones because of the appearance of Grassmann numbers. It is understood that the constraint relations in the case of non-mixing pseudoduality are the remnants of mixing pseudoduality. Once mixing terms are included in the pseudoduality, the constraint relations disappear.

  9. Modelling of Fluidised Geomaterials: The Case of the Aberfan and the Gypsum Tailings Impoundment Flowslides

    PubMed Central

    Dutto, Paola; Stickle, Miguel Martin; Pastor, Manuel; Manzanal, Diego; Yague, Angel; Moussavi Tayyebi, Saeid; Lin, Chuan; Elizalde, Maria Dolores

    2017-01-01

    The choice of a pure cohesive or a pure frictional viscoplastic model to represent the rheological behaviour of a flowslide is of paramount importance in order to obtain accurate results for real cases. The principal goal of the present work is to clarify the influence of the type of viscous model—pure cohesive versus pure frictional—with the numerical reproduction of two different real flowslides that occurred in 1966: the Aberfan flowslide and the Gypsum tailings impoundment flowslide. In the present work, a depth-integrated model based on the v-pw Biot–Zienkiewicz formulation, enhanced with a diffusion-like equation to account for the pore pressure evolution within the soil mass, is applied to both 1966 cases. For the Aberfan flowslide, a frictional viscous model based on Perzyna viscoplasticity is considered, while a pure cohesive viscous model (Bingham model) is considered for the case of the Gypsum flowslide. The numerical approach followed is the SPH method, which has been enriched by adding a 1D finite difference grid to each SPH node in order to improve the description of the pore water evolution in the propagating mixture. The results obtained by the performed simulations are in agreement with the documentation obtained through the UK National Archive (Aberfan flowslide) and the International Commission of large Dams (Gypsum flowslide). PMID:28772924

  10. Large-eddy simulation of turbulent flow with a surface-mounted two-dimensional obstacle

    NASA Technical Reports Server (NTRS)

    Yang, Kyung-Soo; Ferziger, Joel H.

    1993-01-01

    In this paper, we perform a large eddy simulation (LES) of turbulent flow in a channel containing a two-dimensional obstacle on one wall using a dynamic subgrid-scale model (DSGSM) at Re = 3210, based on bulk velocity above the obstacle and obstacle height; the wall layers are fully resolved. The low Re enables us to perform a DNS (Case 1) against which to validate the LES results. The LES with the DSGSM is designated Case 2. In addition, an LES with the conventional fixed model constant (Case 3) is conducted to allow identification of improvements due to the DSGSM. We also include LES at Re = 82,000 (Case 4) using conventional Smagorinsky subgrid-scale model and a wall-layer model. The results will be compared with the experiment of Dimaczek et al.

  11. Printed three-dimensional anatomic templates for virtual preoperative planning before reconstruction of old pelvic injuries: initial results.

    PubMed

    Wu, Xin-Bao; Wang, Jun-Qiang; Zhao, Chun-Peng; Sun, Xu; Shi, Yin; Zhang, Zi-An; Li, Yu-Neng; Wang, Man-Yi

    2015-02-20

    Old pelvic fractures are among the most challenging fractures to treat because of their complex anatomy, difficult-to-access surgical sites, and the relatively low incidence of such cases. Proper evaluation and surgical planning are necessary to achieve pelvic ring symmetry and stable fixation of the fracture. The goal of this study was to assess the use of three-dimensional (3D) printing techniques for surgical management of old pelvic fractures. First, 16 dried human cadaveric pelvises were used to confirm the anatomical accuracy of the 3D models printed based on radiographic data. Next, nine clinical cases between January 2009 and April 2013 were used to evaluate the surgical reconstruction based on the 3D printed models. The pelvic injuries were all type C, and the average time from injury to reconstruction was 11 weeks (range: 8-17 weeks). The workflow consisted of: (1) printing patient-specific bone models based on preoperative computed tomography (CT) scans, (2) virtual fracture reduction using the printed 3D anatomic template, (3) virtual fracture fixation using Kirschner wires, and (4) preoperatively measuring the osteotomy and implant position relative to landmarks using the virtually defined deformation. These models aided communication between surgical team members during the procedure. This technique was validated by comparing the preoperative planning to the intraoperative procedure. The accuracy of the 3D printed models was within specification. Production of a model from standard CT DICOM data took 7 hours (range: 6-9 hours). Preoperative planning using the 3D printed models was feasible in all cases. Good correlation was found between the preoperative planning and postoperative follow-up X-ray in all nine cases. The patients were followed for 3-29 months (median: 5 months). The fracture healing time was 9-17 weeks (mean: 10 weeks). No delayed incision healing, wound infection, or nonunions occurred.
The results were excellent in two cases, good in five, and poor in two based on the Majeed score. The 3D printing planning technique for pelvic surgery was successfully integrated into a clinical workflow to improve patient-specific preoperative planning by providing a visual and haptic model of the injury and allowing patient-specific adaptation of each osteosynthesis implant to the virtually reduced pelvis.

  12. Case-Mix for Performance Management: A Risk Algorithm Based on ICD-10-CM.

    PubMed

    Gao, Jian; Moran, Eileen; Almenoff, Peter L

    2018-06-01

    Accurate risk adjustment is the key to a reliable comparison of cost and quality performance among providers and hospitals. However, the existing case-mix algorithms based on age, sex, and diagnoses can only explain up to 50% of the cost variation. More accurate risk adjustment is desired for provider performance assessment and improvement. To develop a case-mix algorithm that hospitals and payers can use to measure and compare cost and quality performance of their providers. All 6,048,895 patients with valid diagnoses and cost recorded in the US Veterans health care system in fiscal year 2016 were included in this study. The dependent variable was total cost at the patient level, and the explanatory variables were age, sex, and comorbidities represented by 762 clinically homogeneous groups, which were created by expanding the 283 categories from Clinical Classifications Software based on ICD-10-CM codes. The split-sample method was used to assess model overfitting and coefficient stability. The predictive power of the algorithms was ascertained by comparing the R2, mean absolute percentage error, root mean square error, predictive ratios, and c-statistics. The expansion of the Clinical Classifications Software categories resulted in higher predictive power. The R2 reached 0.72 and 0.52 for the transformed and raw scale cost, respectively. The case-mix algorithm we developed based on age, sex, and diagnoses outperformed the existing case-mix models reported in the literature. The method developed in this study can be used by other health systems to produce tailored risk models for their specific purpose.
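The split-sample check used above can be sketched for a single predictor: fit on one half of the data, score on the held-out half. The case-mix model itself has hundreds of regressors; this one-variable ordinary-least-squares version only shows the mechanics and uses made-up data.

```python
def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

def r_squared(xs, ys, a, b):
    """Fraction of outcome variance explained by the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot
```

Split-sample validation fits on the first split and evaluates r_squared on the second; a large drop between the two R2 values signals overfitting or unstable coefficients.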

  13. Predictive models of alcohol use based on attitudes and individual values.

    PubMed

    García del Castillo Rodríguez, José A; López-Sánchez, Carmen; Quiles Soler, M Carmen; García del Castillo-López, Alvaro; Gázquez Pertusa, Mónica; Marzo Campos, Juan Carlos; Inglés, Candido J

    2013-01-01

    Two predictive models are developed in this article: the first is designed to predict people's attitudes to alcoholic drinks, while the second sets out to predict the use of alcohol in relation to selected individual values. University students (N = 1,500) were recruited through stratified sampling based on sex and academic discipline. The questionnaire used obtained information on participants' alcohol use, attitudes and personal values. The results show that the attitudes model correctly classifies 76.3% of cases. Likewise, the model for level of alcohol use correctly classifies 82% of cases. According to our results, we can conclude that there are a series of individual values that influence drinking and attitudes to alcohol use, which therefore provides us with a potentially powerful instrument for developing preventive intervention programs.

  14. Fan broadband interaction noise modeling using a low-order method

    NASA Astrophysics Data System (ADS)

    Grace, S. M.

    2015-06-01

    A low-order method for simulating broadband interaction noise downstream of the fan stage in a turbofan engine is explored in this paper. The particular noise source of interest is due to the interaction of the fan rotor wake with the fan exit guide vanes (FEGVs). The vanes are modeled as flat plates and the method utilizes strip theory relying on unsteady aerodynamic cascade theory at each strip. This paper shows predictions for 6 of the 9 cases from NASA's Source Diagnostic Test (SDT) and all 4 cases from the 2014 Fan Broadband Workshop Fundamental Case 2 (FC2). The turbulence in the rotor wake is taken from hot-wire data for the low speed SDT cases and the FC2 cases. Additionally, four different computational simulations of the rotor wake flow for all of the SDT rotor speeds have been used to determine the rotor wake turbulence parameters. Comparisons between predictions based on the different inputs highlight the possibility of a potential effect present in the hot-wire data for the SDT as well as the importance of accurately describing the turbulence length scale when using this model. The method produces accurate predictions of the spectral shape for all of the cases. It also predicts reasonably well all of the trends that can be considered based on the included cases such as vane geometry, vane count, turbulence level, and rotor speed.

  15. A Pathway Based Classification Method for Analyzing Gene Expression for Alzheimer's Disease Diagnosis.

    PubMed

    Voyle, Nicola; Keohane, Aoife; Newhouse, Stephen; Lunnon, Katie; Johnston, Caroline; Soininen, Hilkka; Kloszewska, Iwona; Mecocci, Patrizia; Tsolaki, Magda; Vellas, Bruno; Lovestone, Simon; Hodges, Angela; Kiddle, Steven; Dobson, Richard Jb

    2016-01-01

    Recent studies indicate that gene expression levels in blood may be able to differentiate subjects with Alzheimer's disease (AD) from normal elderly controls and mild cognitively impaired (MCI) subjects. However, there is limited replicability at the single marker level. A pathway-based interpretation of gene expression may prove more robust. This study aimed to investigate whether a case/control classification model built on pathway level data was more robust than a gene level model and may consequently perform better in test data. The study used two batches of gene expression data from the AddNeuroMed (ANM) and Dementia Case Registry (DCR) cohorts. Our study used Illumina Human HT-12 Expression BeadChips to collect gene expression from blood samples. Random forest modeling with recursive feature elimination was used to predict case/control status. Age and APOE ɛ4 status were used as covariates for all analysis. Gene and pathway level models performed similarly to each other and to a model based on demographic information only. Any potential increase in concordance from the novel pathway level approach used here has not led to a greater predictive ability in these datasets. However, we have only tested one method for creating pathway level scores. Further, we have been able to benchmark pathways against genes in datasets that had been extensively harmonized. Further work should focus on the use of alternative methods for creating pathway level scores, in particular those that incorporate pathway topology, and the use of an endophenotype based approach.
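Recursive feature elimination, mentioned above, can be sketched as a loop that repeatedly discards the weakest feature. The paper ranks features by random-forest importance; the univariate absolute-correlation score below is a stand-in to keep the sketch self-contained, and the feature data are hypothetical.

```python
def rfe(features, labels, score, n_keep):
    """Recursive feature elimination skeleton: drop the feature with
    the weakest score until n_keep remain. features maps name -> list
    of values; score(values, labels) ranks a single feature."""
    kept = dict(features)
    while len(kept) > n_keep:
        weakest = min(kept, key=lambda f: score(kept[f], labels))
        del kept[weakest]
    return sorted(kept)

def abs_corr(xs, ys):
    """Absolute Pearson correlation as a simple univariate score."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return abs(cov / (vx * vy)) if vx and vy else 0.0
```

In the full method the score comes from refitting the random forest at each elimination step, so feature interactions influence the ranking; the univariate version here cannot capture that.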

  16. Trajectory-Based Loads for the Ares I-X Test Flight Vehicle

    NASA Technical Reports Server (NTRS)

    Vause, Roland F.; Starr, Brett R.

    2011-01-01

    In trajectory-based loads, the structural engineer treats each point on the trajectory as a load case. Distributed aero, inertial, and propulsion forces are developed for the structural model which are equivalent to the integrated values of the trajectory model. Free-body diagrams are then used to solve for the internal forces, or loads, that keep the applied aero, inertial, and propulsion forces in dynamic equilibrium. There are several advantages to using trajectory-based loads. First, consistency is maintained between the integrated equilibrium equations of the trajectory analysis and the distributed equilibrium equations of the structural analysis. Second, the structural loads equations are tied to the uncertainty model for the trajectory systems analysis model. Atmosphere, aero, propulsion, mass property, and controls uncertainty models all feed into the dispersions that are generated for the trajectory systems analysis model. Changes in any of these input models will affect structural loads response. The trajectory systems model manages these inputs as well as the output from the structural model over thousands of dispersed cases. Large structural models with hundreds of thousands of degrees of freedom would execute too slowly to be an efficient part of several thousand system analyses. Trajectory-based loads provide a means for the structures discipline to be included in the integrated systems analysis. Successful applications of trajectory-based loads methods for the Ares I-X vehicle are covered in this paper. Preliminary design loads were based on 2000 trajectories using Monte Carlo dispersions. Range safety loads were tied to 8423 malfunction turn trajectories. In addition, active control system loads were based on 2000 preflight trajectories using Monte Carlo dispersions.
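The free-body bookkeeping described above can be sketched on a 1-D stick model. When the rigid-body acceleration comes from the same trajectory model as the applied forces, the running internal load closes to zero at the free end, which is exactly the consistency between integrated and distributed equilibrium that the paper emphasizes. Segment values below are hypothetical.

```python
def internal_loads(masses, applied, accel):
    """Internal force at each cut of a 1-D stick model. masses and
    applied give per-segment mass and applied external force; accel
    is the rigid-body acceleration from the trajectory model. The
    internal load at cut i balances the applied minus inertial force
    of all segments up to and including segment i."""
    loads = []
    running = 0.0
    for m, f in zip(masses, applied):
        running += f - m * accel
        loads.append(running)
    return loads
```

With accel = sum(applied) / sum(masses), the final entry is zero: the applied and inertial force distributions are in dynamic equilibrium, so no residual load leaks out the free end.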

  17. Maximum likelihood estimation for Cox's regression model under nested case-control sampling.

    PubMed

    Scheike, Thomas H; Juul, Anders

    2004-04-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used to obtain information additional to the relative risk estimates of covariates.
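
    The risk-set sampling that defines a nested case-control design can be sketched as follows; the subject records and the one-control-per-case choice are illustrative, and the study's matching on age and gender is omitted for brevity:

```python
import random

# Nested case-control (risk-set) sampling sketch: for each case, draw controls
# at random from subjects still under observation at the case's event time.

def nested_case_control(cohort, n_controls=1, seed=0):
    """cohort: list of (subject_id, follow_up_time, is_case) tuples.
    Returns {case_id: [case_id, control_ids...]} sampled risk sets."""
    rng = random.Random(seed)
    sampled = {}
    for sid, t, is_case in cohort:
        if not is_case:
            continue
        # the risk set: everyone still at risk at time t, excluding the case
        at_risk = [s for s, t2, _ in cohort if t2 >= t and s != sid]
        sampled[sid] = [sid] + rng.sample(at_risk, min(n_controls, len(at_risk)))
    return sampled

# Toy cohort: subjects 1 and 4 are cases with events at times 5.0 and 3.0.
cohort = [(1, 5.0, True), (2, 8.0, False), (3, 10.0, False), (4, 3.0, True)]
sets = nested_case_control(cohort)
```

    The partial likelihood for Cox's model is then evaluated over these sampled risk sets instead of the full cohort, which is what reduces data-collection costs.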

  18. Contagion Shocks in One Dimension

    NASA Astrophysics Data System (ADS)

    Bertozzi, Andrea L.; Rosado, Jesus; Short, Martin B.; Wang, Li

    2015-02-01

    We consider an agent-based model of emotional contagion coupled with motion in one dimension that has recently been studied in the computer science community. The model involves movement with a speed proportional to a "fear" variable that undergoes a temporal consensus averaging based on distance to other agents. We study the effect of Riemann initial data for this problem, leading to shock dynamics that are studied both within the agent-based model as well as in a continuum limit. We examine the behavior of the model under distinguished limits as the characteristic contagion interaction distance and the interaction timescale both approach zero. The limiting behavior is related to a classical model for pressureless gas dynamics with "sticky" particles. In comparison, we observe a threshold for the interaction distance vs. interaction timescale that produce qualitatively different behavior for the system - in one case particle paths do not cross and there is a natural Eulerian limit involving nonlocal interactions and in the other case particle paths can cross and one may consider only a kinetic model in the continuum limit.
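
    Under the assumptions stated above (speed proportional to fear, fear relaxing toward a local distance-based consensus average), one explicit Euler step of the agent dynamics might look like the sketch below; the top-hat interaction kernel, parameter values, and Riemann-type initial data are illustrative choices, not the paper's exact formulation:

```python
# One Euler step of a 1D emotional-contagion model: agents move rightward with
# speed equal to their fear level, while fear relaxes toward the average fear
# of agents within interaction distance R on timescale tau.

def step(xs, qs, dt=0.05, R=1.0, tau=0.5):
    n = len(xs)
    new_q = []
    for i in range(n):
        nbrs = [qs[j] for j in range(n) if abs(xs[j] - xs[i]) <= R]
        avg = sum(nbrs) / len(nbrs)                   # consensus average
        new_q.append(qs[i] + (dt / tau) * (avg - qs[i]))
    new_x = [x + dt * q for x, q in zip(xs, qs)]      # speed = fear
    return new_x, new_q

# Riemann-type initial data: high fear on the left, low fear on the right.
xs, qs = [0.0, 0.5, 1.0, 1.5], [1.0, 1.0, 0.0, 0.0]
xs, qs = step(xs, qs)
```

    Sending R and tau to zero at different relative rates is what produces the two qualitatively different regimes described in the abstract.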

  19. Log-Multiplicative Association Models as Item Response Models

    ERIC Educational Resources Information Center

    Anderson, Carolyn J.; Yu, Hsiu-Ting

    2007-01-01

    Log-multiplicative association (LMA) models, which are special cases of log-linear models, have interpretations in terms of latent continuous variables. Two theoretical derivations of LMA models based on item response theory (IRT) arguments are presented. First, we show that Anderson and colleagues (Anderson & Vermunt, 2000; Anderson & Bockenholt,…

  20. Practical strategies for developing the business case for hospital glycemic control teams.

    PubMed

    Magee, Michelle F; Beck, Adam

    2008-09-01

    Many business models may be used to make the business case for support of a multidisciplinary team to implement targeted glucose control in the hospital. Models may be hospital-supported or self-supporting. In the former, the hospital provides financial support based on improved documentation opportunities, reduction in length of stay, and improved resource utilization. In the latter, clinical revenue for diabetes management offsets the costs of salary, fringe benefits, and overheads. A combination of these strategies may also be used. The business plan presented to administration must justify return on investment. It is imperative to involve hospital administration, particularly representatives from coding, billing, and finance, in the development of the business plan. The business case for hospital support will be based on opportunities related to improving accuracy of documentation and coding for diabetes-related diagnoses, including level of control and complications present, on reduction in length of stay, and on optimization of resource utilization through reduction in morbidity and mortality (cost aversion). The case for revenue generation through billing for clinical services will be based on opportunities to increase the provision of glycemic management services in the hospital. Examples from the literature and of analyses to support each of these models are presented. (c) 2008 Society of Hospital Medicine.

  1. Stability Analysis Susceptible, Exposed, Infected, Recovered (SEIR) Model for Spread of Dengue Fever in Medan

    NASA Astrophysics Data System (ADS)

    Side, Syafruddin; Molliq Rangkuti, Yulita; Gerhana Pane, Dian; Setia Sinaga, Marlina

    2018-01-01

    Dengue fever is an endemic disease spread by the vector Aedes aegypti. The disease is found in more than 100 countries, including the United States as well as Africa and Asia, especially countries with tropical climates. The mathematical modeling in this paper addresses the speed of the spread of dengue fever. The model divides the population into four classes: Susceptible (S), Exposed (E), Infected (I), and Recovered (R). The SEIR model is further analyzed to determine the basic reproduction number based on the number of dengue cases reported in Medan city. The stability analysis distinguishes an asymptotically stable equilibrium, indicating the endemic case, from an unstable one. Simulations of the SEIR model showed that a very long time is required before the infected human population becomes free of dengue virus infection, because transmission of the dengue virus occurs continuously between the human and vector populations.
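
    The SEIR system described above can be sketched with a simple Euler integration; the rates (beta, sigma, gamma) and initial fractions below are illustrative, not values fitted to the Medan data:

```python
# Euler integration of the SEIR model. beta = transmission rate, sigma =
# incubation rate (E -> I), gamma = recovery rate (I -> R); all illustrative.
# The basic reproduction number here is beta/gamma = 5, above the epidemic
# threshold of 1, so the infection initially spreads.

def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    ds = -beta * s * i
    de = beta * s * i - sigma * e
    di = sigma * e - gamma * i
    dr = gamma * i
    return s + dt * ds, e + dt * de, i + dt * di, r + dt * dr

def simulate(steps=2000, dt=0.1, beta=0.5, sigma=0.2, gamma=0.1):
    s, e, i, r = 0.99, 0.0, 0.01, 0.0     # population fractions
    for _ in range(steps):
        s, e, i, r = seir_step(s, e, i, r, beta, sigma, gamma, dt)
    return s, e, i, r

s, e, i, r = simulate()
```

    Because the four rates cancel pairwise, the total population fraction s + e + i + r is conserved at every step, which is a convenient sanity check on any implementation.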

  2. Evaluation of Cirrus Cloud Simulations using ARM Data-Development of Case Study Data Set

    NASA Technical Reports Server (NTRS)

    Starr, David OC.; Demoz, Belay; Wang, Yansen; Lin, Ruei-Fong; Lare, Andrew; Mace, Jay; Poellot, Michael; Sassen, Kenneth; Brown, Philip

    2002-01-01

    Cloud-resolving models (CRMs) are being increasingly used to develop parametric treatments of clouds and related processes for use in global climate models (GCMs). CRMs represent the integrated knowledge of the physical processes acting to determine cloud system lifecycle and are well matched to typical observational data in terms of physical parameters/measurables and scale-resolved physical processes. Thus, they are suitable for direct comparison to field observations for model validation and improvement. The goal of this project is to improve state-of-the-art CRMs used for studies of cirrus clouds and to establish a relative calibration with GCMs through comparisons among CRMs, single column model (SCM) versions of the GCMs, and observations. The objective is to compare and evaluate a variety of CRMs and SCMs, under the auspices of the GEWEX Cloud Systems Study (GCSS) Working Group on Cirrus Cloud Systems (WG2), using ARM data acquired at the Southern Great Plains (SGP) site. This poster will report on progress in developing a suitable WG2 case study data set based on the September 26, 1996 ARM IOP case - the Hurricane Nora outflow case. Progress in assessing cloud and other environmental conditions will be described. Results of preliminary simulations using a regional cloud system model (MM5) and a CRM will be discussed. Focal science questions for the model comparison are strongly based on results of the idealized GCSS WG2 cirrus cloud model comparison projects (Idealized Cirrus Cloud Model Comparison Project and Cirrus Parcel Model Comparison Project), which will also be briefly summarized.

  3. FAST Model Calibration and Validation of the OC5-DeepCwind Floating Offshore Wind System Against Wave Tank Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  4. Importance of aggregation and small ice crystals in cirrus clouds, based on observations and an ice particle growth model

    NASA Technical Reports Server (NTRS)

    Mitchell, David L.; Chai, Steven K.; Dong, Yayi; Arnott, W. Patrick; Hallett, John

    1993-01-01

    The 1 November 1986 FIRE I case study was used to test an ice particle growth model which predicts bimodal size spectra in cirrus clouds. The model was developed from an analytically based model which predicts the height evolution of monomodal ice particle size spectra from the measured ice water content (IWC). Size spectra from the monomodal model are represented by a gamma distribution, N(D) = N0 D^ν exp(−λD), where D is the ice particle maximum dimension. The slope parameter, λ, and the parameter N0 are predicted from the IWC through the growth processes of vapor diffusion and aggregation. The model formulation is analytical, computationally efficient, and well suited for incorporation into larger models. The monomodal model has been validated against two other cirrus cloud case studies. From the monomodal size spectra, the size distributions which determine concentrations of ice particles less than about 150 μm are predicted.
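
    The gamma size distribution N(D) = N0 * D**nu * exp(-lam*D) has closed-form moments, which is part of what makes the formulation analytically tractable; the sketch below checks the zeroth moment (total number concentration) numerically, with parameter values chosen purely for illustration:

```python
import math

# Gamma ice-particle size distribution and its analytic zeroth moment:
#   N(D) = N0 * D**nu * exp(-lam*D)
#   integral of N(D) over D in (0, inf) = N0 * gamma(nu+1) / lam**(nu+1)
# Parameter values below are illustrative, not the paper's fitted values.

def gamma_spectrum(D, N0, nu, lam):
    return N0 * D**nu * math.exp(-lam * D)

def total_number(N0, nu, lam):
    return N0 * math.gamma(nu + 1) / lam ** (nu + 1)

# Midpoint-rule numerical check of the analytic moment.
N0, nu, lam = 1.0, 2.0, 50.0
dD = 1e-4
numeric = sum(gamma_spectrum((k + 0.5) * dD, N0, nu, lam)
              for k in range(20000)) * dD
```

    Higher moments (e.g. the IWC-weighted moment) follow the same pattern with nu replaced by nu plus the moment order.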

  5. Model Averaging for Predicting the Exposure to Aflatoxin B1 Using DNA Methylation in White Blood Cells of Infants

    NASA Astrophysics Data System (ADS)

    Rahardiantoro, S.; Sartono, B.; Kurnia, A.

    2017-03-01

    In recent years, DNA methylation has become a key tool for revealing the patterns of many human diseases, and the data involved are inevitably huge. Some researchers are interested in making predictions based on these high-dimensional data, especially using regression analysis, where the classical approach fails. Model averaging by Ando and Li [1] is an alternative approach to this problem. This research applied model averaging to obtain the best prediction in high-dimensional data. As a practical case study, model averaging was applied to the data of Vargas et al [3] on exposure to aflatoxin B1 (AFB1) and DNA methylation in white blood cells of infants in The Gambia. The best ensemble model was selected based on the minimum MAPE, MAE, and MSE of the predictions. The result is an ensemble model obtained by model averaging with 15 predictors in each candidate model.

  6. A Prediction Model for ROS1-Rearranged Lung Adenocarcinomas based on Histologic Features

    PubMed Central

    Zheng, Jing; Kong, Mei; Sun, Ke; Wang, Bo; Chen, Xi; Ding, Wei; Zhou, Jianying

    2016-01-01

    Aims To identify the clinical and histological characteristics of ROS1-rearranged non-small-cell lung carcinomas (NSCLCs) and build a prediction model to prescreen suitable patients for molecular testing. Methods and Results We identified 27 cases of ROS1-rearranged lung adenocarcinoma among 1165 patients with NSCLC, confirmed by real-time PCR and FISH, performed univariate and multivariate analyses to identify predictive factors associated with ROS1 rearrangement, and finally developed a prediction model. On ROS1 immunochemistry, 59 of the 1165 patients had some degree of ROS1 expression. Among these cases, 19 cases (68%, 19/28) with 3+ and 8 cases (47%, 8/17) with 2+ staining had ROS1 rearrangement verified by real-time PCR and FISH. In the resected group, the acinar-predominant growth pattern was the most commonly observed (57%, 8/14), while in the biopsy group, solid patterns were the most frequently observed (78%, 7/13). Based on multiple logistic regression analysis, we determined that female sex, cribriform structure, and the presence of psammoma bodies were the three most powerful indicators of ROS1 rearrangement, and we developed a predictive model for the presence of ROS1 rearrangements in lung adenocarcinomas. Conclusions Female sex, cribriform structure, and the presence of psammoma bodies were the three most powerful indicators of ROS1 rearrangement status, and the predictive formula was helpful in screening for ROS1-rearranged NSCLC, especially for cases with equivocal ROS1 immunochemistry. PMID:27648828
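
    A multiple logistic regression of the kind described reduces to a scoring formula over the three indicators; the coefficients below are hypothetical placeholders and do not reproduce the paper's fitted values:

```python
import math

# Hypothetical logistic scoring formula combining the three reported
# indicators of ROS1 rearrangement. All coefficients are placeholders.

def ros1_probability(female, cribriform, psammoma,
                     b0=-3.0, b_female=1.2, b_cribriform=1.5, b_psammoma=1.8):
    # inverse-logit of the linear predictor over the three 0/1 indicators
    z = b0 + b_female * female + b_cribriform * cribriform + b_psammoma * psammoma
    return 1.0 / (1.0 + math.exp(-z))

low = ros1_probability(0, 0, 0)    # none of the indicators present
high = ros1_probability(1, 1, 1)   # all three indicators present
```

    A prescreening rule then thresholds this probability to decide which cases to send for confirmatory FISH/PCR testing.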

  7. Creating Success for Students with Autism Spectrum Disorders and Their Teachers: Implementing District-Based Support Teams

    ERIC Educational Resources Information Center

    McCollow, Meaghan; Davis, Carol Ann; Copland, Michael

    2013-01-01

    This case study is intended for use in an educational leadership class to facilitate conversation on providing effective instructional practices to students on the autism spectrum. In particular, this case study demonstrates how a school district incorporated a research-based model into their system to provide support to teachers of students with…

  8. Push and pull models to manage patient consent and licensing of multimedia resources in digital repositories for case-based reasoning.

    PubMed

    Kononowicz, Andrzej A; Zary, Nabil; Davies, David; Heid, Jörn; Woodham, Luke; Hege, Inga

    2011-01-01

    Patient consents for distribution of multimedia constitute a significant element of case-based repositories in medicine. A technical challenge is posed by the right of patients to withdraw permission to disseminate their images or videos, so a technical mechanism for spreading information about changes in multimedia usage licenses is sought. The authors gained their experience by developing and managing a large (>340 cases) repository of virtual patients within the European project eViP. The solution for dissemination of license status should reuse and extend existing metadata standards in medical education. Two methods, PUSH and PULL, are described, differing in the moment of update and the division of responsibilities between parties in the learning object exchange process. The authors recommend usage of the PUSH scenario because it is better adapted to legal requirements in many countries. It needs to be stressed that the solution is based on mutual trust between the exchange partners and therefore is most appropriate for use in educational alliances and consortia. It is hoped that the proposed models for exchanging consent and licensing information will become a crucial part of the technical frameworks for building case-based repositories.

  9. Generating temporal model using climate variables for the prediction of dengue cases in Subang Jaya, Malaysia

    PubMed Central

    Dom, Nazri Che; Hassan, A Abu; Latif, Z Abd; Ismail, Rodziah

    2013-01-01

    Objective To develop a forecasting model for the incidence of dengue cases in Subang Jaya using time series analysis. Methods The model was built using the Autoregressive Integrated Moving Average (ARIMA) method based on data collected from 2005 to 2010. The fitted model was then used to predict dengue incidence for the year 2010 by extrapolating dengue patterns over three different horizons (i.e. 52, 13 and 4 weeks ahead). Finally, cross-correlation between dengue incidence and climate variables was computed over a range of lags in order to identify significant variables to be included as external regressors. Results The results of this study revealed that the ARIMA (2,0,0)(0,0,1)52 model developed closely described the trends of dengue incidence and confirmed the existence of dengue fever cases in Subang Jaya for the years 2005 to 2010. The prediction horizon of 4 weeks ahead for ARIMA (2,0,0)(0,0,1)52 was found to be the best fit and consistent with the observed dengue incidence based on the training data from 2005 to 2010 (Root Mean Square Error=0.61). The predictive power of ARIMA (2,0,0)(0,0,1)52 is enhanced by the inclusion of climate variables as external regressors to forecast the dengue cases for the year 2010. Conclusions The ARIMA model with weekly variation is a useful tool for disease control and prevention programs as it is able to effectively predict the number of dengue cases in Malaysia.
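
    The fitted ARIMA (2,0,0)(0,0,1)52 structure corresponds to the recursion y_t = c + phi1*y[t-1] + phi2*y[t-2] + e_t + theta*e[t-52], i.e. an AR(2) in the weekly counts plus a seasonal MA(1) term at the 52-week lag. A simulation sketch with illustrative coefficients (not the paper's fitted values):

```python
import random

# Simulate an ARIMA(2,0,0)(0,0,1)52 process: AR(2) plus a seasonal MA(1)
# term at the 52-week lag. All coefficients are illustrative placeholders.

def simulate_sarima(n, phi1=0.5, phi2=0.2, theta52=0.3, c=1.0, seed=42):
    rng = random.Random(seed)
    y, e = [], []
    for t in range(n):
        e.append(rng.gauss(0.0, 1.0))
        ar = (phi1 * y[t - 1] if t >= 1 else 0.0) \
           + (phi2 * y[t - 2] if t >= 2 else 0.0)
        ma = theta52 * e[t - 52] if t >= 52 else 0.0   # seasonal shock carry-over
        y.append(c + ar + ma + e[t])
    return y

series = simulate_sarima(6 * 52)   # six "years" of weekly incidence
```

    External regressors such as lagged climate variables would enter this recursion as additional additive terms in the mean equation.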

  10. 3D modelling of slow landslides: the Portalet case study (Spain)

    NASA Astrophysics Data System (ADS)

    Fernandez-Merodo, Jose Antonio; Bru, Guadalupe; García-Davalillo, Juan Carlos; Herrera, Gerardo; Fernandez, Jose

    2014-05-01

    Slow landslide deformation evolution is generally cast using 1D or 2D numerical models. This paper aims to explore 3D effects on the kinematic behavior of a real landslide, the Portalet landslide (Central Spanish Pyrenees). This is a very well characterized and documented active paleo-landslide that has been reactivated by the construction of a parking area at the toe of the slope. The proposed 3D model is based on a time dependent hydro-mechanical finite element formulation that takes into account i) groundwater changes due to daily rainfall records and ii) viscous behavior and delayed creep deformation through a viscoplastic constitutive model based on Perzyna's theory. The model reproduces the nearly constant strain rate (secondary creep) and the acceleration/deceleration of the moving mass due to hydrological changes. Furthermore, the model is able to capture the superficial 3D kinematics revealed by advanced in-situ monitoring like ground based SAR or DInSAR processing of satellite SAR images. References [1] Herrera G, Fernández-Merodo JA, Mulas J, Pastor M, Luzi G, Monserrat O (2009) A landslide forecasting model using ground based SAR data: The Portalet case study. Engineering Geology 105: 220-230 [2] Fernández-Merodo JA, Herrera G, Mira P, Mulas J, Pastor M, Noferini L, Mecatti D and Luzi G (2008). Modelling the Portalet landslide mobility (Formigal, Spain). iEMSs 2008: International Congress on Environmental Modelling and Software. Sànchez-Marrè M, Béjar J, Comas J, Rizzoli A and Guariso G (Eds.) International Environmental Modelling and Software Society (iEMSs) [3] Fernández-Merodo JA, García-Davalillo JC, Herrera G, Mira P, Pastor M (2012). 2D visco-plastic finite element modelling of slow landslides: the Portalet case study (Spain). Landslides, DOI: 10.1007/s10346-012-0370-4

  11. Application of a Gaussian multilayer diffusion model to characterize dispersion of vertical HCl column density in rocket exhaust clouds

    NASA Technical Reports Server (NTRS)

    Pellett, G. L.; Staton, W. L.

    1981-01-01

    Solid rocket exhaust cloud dispersion cases, based on seven meteorological regimes for overland advection in the Cape Canaveral, Florida, area, are examined for launch vehicle environmental impacts. They include a space shuttle case and all seven meteorological cases for the Titan 3, which exhausts 60% less HCl. The C(HCl) decays are also compared with recent in-cloud peak HCl data from eight Titan 3 launches. It is stipulated that while good overall agreement provides validation of the model, its limitations are considerable and a dynamics model is needed to handle local convective situations.

  12. Optimized model tuning in medical systems.

    PubMed

    Kléma, Jirí; Kubalík, Jirí; Lhotská, Lenka

    2005-12-01

    In medical systems it is often advantageous to utilize specific problem situations (cases) in addition to or instead of a general model. Decisions are then based on relevant past cases retrieved from a case memory. The reliability of such decisions depends directly on the ability to identify cases of practical relevance to the current situation. This paper discusses issues of automated tuning in order to obtain a proper definition of mutual case similarity in a specific medical domain. The main focus is on a reasonably time-consuming optimization of the parameters that determine case retrieval and further utilization in decision making/prediction. The two case studies - mortality prediction after cardiological intervention, and resource allocation at a spa - document that the optimization process is influenced by various characteristics of the problem domain.

  13. Cost-minimization model of a multidisciplinary antibiotic stewardship team based on a successful implementation on a urology ward of an academic hospital.

    PubMed

    Dik, Jan-Willem H; Hendrix, Ron; Friedrich, Alex W; Luttjeboer, Jos; Panday, Prashant Nannan; Wilting, Kasper R; Lo-Ten-Foe, Jerome R; Postma, Maarten J; Sinha, Bhanu

    2015-01-01

    In order to stimulate appropriate antimicrobial use and thereby lower the chances of resistance development, an Antibiotic Stewardship Team (A-Team) has been implemented at the University Medical Center Groningen, the Netherlands. The focus of the A-Team was a pro-active day-2 case audit, which was financially evaluated here to calculate the return on investment from a hospital perspective. Effects were evaluated by comparing audited patients with a historic cohort with the same diagnosis-related groups. Based upon this evaluation a cost-minimization model was created that can be used to predict the financial effects of a day-2 case audit. Sensitivity analyses were performed to deal with uncertainties. Finally, the model was used to financially evaluate the A-Team. One full year, including 114 patients, was evaluated. Implementation costs were calculated to be €17,732, representing the total spent to implement this A-Team. For this specific patient group, admitted to a urology ward and consulted on day 2 by the A-Team, the model estimated total savings of €60,306 after one year for this single department, leading to a return on investment of 5.9. The implemented multidisciplinary A-Team performing a day-2 case audit in the hospital had a positive return on investment caused by a reduced length of stay due to more appropriate antibiotic therapy. Based on the extensive data analysis, a model of this intervention could be constructed. This model could be used by other institutions, using their own data to estimate the effects of a day-2 case audit in their hospital.

  14. Multivariate Radiological-Based Models for the Prediction of Future Knee Pain: Data from the OAI

    PubMed Central

    Galván-Tejada, Jorge I.; Celaya-Padilla, José M.; Treviño, Victor; Tamez-Peña, José G.

    2015-01-01

    In this work, the potential of X-ray based multivariate prognostic models to predict the onset of chronic knee pain is presented. Using quantitative X-ray image assessments of joint-space-width (JSW) and paired semiquantitative central X-ray scores from the Osteoarthritis Initiative (OAI), a case-control study is presented. The pain assessments of the right knee at the baseline and the 60-month visits were used to screen for case/control subjects. Scores were analyzed at the time of pain incidence (T-0), the year prior to incidence (T-1), and two years before pain incidence (T-2). Multivariate models were created by a cross-validated elastic-net regularized generalized linear model feature selection tool. Univariate differences between cases and controls were reported by AUC, C-statistics, and odds ratios. Univariate analysis indicated that medial osteophytes were significantly more prevalent in cases than controls: C-stat 0.62, 0.62, and 0.61, at T-0, T-1, and T-2, respectively. The multivariate JSW models significantly predicted pain: AUC = 0.695, 0.623, and 0.620, at T-0, T-1, and T-2, respectively. Semiquantitative multivariate models predicted pain with C-stat = 0.671, 0.648, and 0.645 at T-0, T-1, and T-2, respectively. Multivariate models derived from plain X-ray radiography assessments may be used to predict subjects that are at risk of developing knee pain. PMID:26504490
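
    The elastic-net feature selection step can be sketched with a plain coordinate-descent update; the study used a cross-validated elastic-net regularized GLM, while the toy data, penalty values, and soft-thresholding implementation here are illustrative only:

```python
# Elastic-net linear regression by coordinate descent. The L1 part
# (alpha * l1_ratio) zeroes out weak features, which is what drives
# feature selection; the toy data below has one informative column.

def soft_threshold(z, g):
    return z - g if z > g else z + g if z < -g else 0.0

def elastic_net(X, y, alpha=0.1, l1_ratio=0.9, iters=100):
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # correlation of column j with the partial residual (b_j held out)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * b[k]
                      for k in range(p) if k != j)) for i in range(n)) / n
            zj = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, alpha * l1_ratio) / (zj + alpha * (1 - l1_ratio))
    return b

X = [[1.0, -1.0], [2.0, 0.5], [3.0, -0.2], [4.0, 0.3]]
y = [1.0, 2.0, 3.0, 4.0]          # depends on column 0 only
coefs = elastic_net(X, y)
```

    The variables whose coefficients survive the L1 shrinkage are the ones that enter the final multivariate model.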

  15. Systems-Oriented Workplace Learning Experiences for Early Learners: Three Models.

    PubMed

    O'Brien, Bridget C; Bachhuber, Melissa R; Teherani, Arianne; Iker, Theresa M; Batt, Joanne; O'Sullivan, Patricia S

    2017-05-01

    Early workplace learning experiences may be effective for learning systems-based practice. This study explores systems-oriented workplace learning experiences (SOWLEs) for early learners to suggest a framework for their development. The authors used a two-phase qualitative case study design. In Phase 1 (spring 2014), they prepared case write-ups based on transcribed interviews from 10 SOWLE leaders at the authors' institution and, through comparative analysis of cases, identified three SOWLE models. In Phase 2 (summer 2014), studying seven 8-week SOWLE pilots, the authors used interview and observational data collected from the seven participating medical students, two pharmacy students, and site leaders to construct case write-ups of each pilot and to verify and elaborate the models. In Model 1, students performed specific patient care activities that addressed a system gap. Some site leaders helped students connect the activities to larger systems problems and potential improvements. In Model 2, students participated in predetermined systems improvement (SI) projects, gaining experience in the improvement process. Site leaders had experience in SI and often had significant roles in the projects. In Model 3, students worked with key stakeholders to develop a project and conduct a small test of change. They experienced most elements of an improvement cycle. Site leaders often had experience with SI and knew how to guide and support students' learning. Each model could offer systems-oriented learning opportunities provided that key elements are in place including site leaders facile in SI concepts and able to guide students in SOWLE activities.

  16. Risk adjustment models for short-term outcomes after surgical resection for oesophagogastric cancer.

    PubMed

    Fischer, C; Lingsma, H; Hardwick, R; Cromwell, D A; Steyerberg, E; Groene, O

    2016-01-01

    Outcomes for oesophagogastric cancer surgery are compared with the aim of benchmarking quality of care. Adjusting for patient characteristics is crucial to avoid biased comparisons between providers. The study objective was to develop a case-mix adjustment model for comparing 30- and 90-day mortality and anastomotic leakage rates after oesophagogastric cancer resections. The study reviewed existing models, considered expert opinion and examined audit data in order to select predictors that were subsequently used to develop a case-mix adjustment model for the National Oesophago-Gastric Cancer Audit, covering England and Wales. Models were developed on patients undergoing surgical resection between April 2011 and March 2013 using logistic regression. Model calibration and discrimination were quantified using a bootstrap procedure. Most existing risk models for oesophagogastric resections were methodologically weak, outdated or based on detailed laboratory data that are not generally available. In 4882 patients with oesophagogastric cancer used for model development, 30- and 90-day mortality rates were 2·3 and 4·4 per cent respectively, and 6·2 per cent of patients developed an anastomotic leak. The internally validated models, based on predictors selected from the literature, showed moderate discrimination (area under the receiver operating characteristic (ROC) curve 0·646 for 30-day mortality, 0·664 for 90-day mortality and 0·587 for anastomotic leakage) and good calibration. Based on available data, three case-mix adjustment models for postoperative outcomes in patients undergoing curative surgery for oesophagogastric cancer were developed. These models should be used for risk adjustment when assessing hospital performance in the National Health Service, and tested in other large health systems. © 2015 BJS Society Ltd Published by John Wiley & Sons Ltd.

  17. Comparison between collective coordinate models for domain wall motion in PMA nanostrips in the presence of the Dzyaloshinskii-Moriya interaction

    NASA Astrophysics Data System (ADS)

    Vandermeulen, J.; Nasseri, S. A.; Van de Wiele, B.; Durin, G.; Van Waeyenberge, B.; Dupré, L.

    2018-03-01

    Lagrangian-based collective coordinate models for magnetic domain wall (DW) motion rely on an ansatz for the DW profile and a Lagrangian approach to describe the DW motion in terms of a set of time-dependent collective coordinates: the DW position, the DW magnetization angle, the DW width and the DW tilting angle. Another approach was recently used to derive similar equations of motion by averaging the Landau-Lifshitz-Gilbert equation without any ansatz, and identifying the relevant collective coordinates afterwards. In this paper, we use an updated version of the semi-analytical equations to compare the Lagrangian-based collective coordinate models with micromagnetic simulations for field- and STT-driven (spin-transfer torque-driven) DW motion in Pt/CoFe/MgO and Pt/Co/AlOx nanostrips. Through this comparison, we assess the accuracy of the different models, and provide insight into the deviations of the models from simulations. It is found that the lack of terms related to DW asymmetry in the Lagrangian-based collective coordinate models significantly contributes to the discrepancy between the predictions of the most accurate Lagrangian-based model and the micromagnetic simulations in the field-driven case. This is in contrast to the STT-driven case where the DW remains symmetric.

  18. A classification tree based modeling approach for segment related crashes on multilane highways.

    PubMed

    Pande, Anurag; Abdel-Aty, Mohamed; Das, Abhishek

    2010-10-01

    This study presents a classification tree based alternative to crash frequency analysis for analyzing crashes on mid-block segments of multilane arterials. The traditional approach of modeling counts of crashes that occur over a period of time works well for intersection crashes, where each intersection itself provides a well-defined unit over which to aggregate the crash data. In the case of mid-block segments, however, the crash frequency based approach requires segmentation of the arterial corridor into segments of arbitrary lengths. In this study we used random samples of time, day of week, and location (i.e., milepost) combinations and compared them with the sample of crashes from the same arterial corridor. For crash and non-crash cases, geometric design/roadside and traffic characteristics were derived based on their milepost locations. The variables used in the analysis are non-event specific and therefore more relevant for roadway safety feature improvement programs. The first classification tree model compares all crashes with the non-crash data; four groups of crashes (rear-end, lane-change related, pedestrian, and single-vehicle/off-road crashes) are then separately compared to the non-crash cases. The classification tree models provide a list of significant variables as well as a measure to classify crash from non-crash cases. ADT along with time of day/day of week are significantly related to all crash types, with different groups of crashes being more likely to occur at different times. From the classification performance of the different models it was apparent that using non-event specific information may not be suitable for single-vehicle/off-road crashes. The study provides the safety analysis community an additional tool to assess safety without having to aggregate corridor crash data over arbitrary segment lengths. Copyright © 2010. Published by Elsevier Ltd.
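
    The core of a classification tree is an impurity-minimizing split search; below is a minimal Gini-based sketch on toy crash/non-crash data (the feature values and labels are illustrative, not the study's):

```python
# Gini-impurity split search: the building block of a classification tree.
# Labels: 1 = crash case, 0 = non-crash case (toy data).

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(xs, ys):
    """Return (threshold, weighted_impurity) of the best binary split x <= t."""
    best_t, best_w = None, float("inf")
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if w < best_w:
            best_t, best_w = t, w
    return best_t, best_w

# e.g. a hypothetical traffic-volume feature (ADT in thousands) vs. crash label
adt = [1, 2, 3, 10, 11, 12]
crash = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(adt, crash)
```

    A full tree applies this search recursively to each resulting partition, over all candidate variables (ADT, time of day, day of week, geometry), until a stopping rule is met.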

  19. Agent-Based Modeling in Systems Pharmacology.

    PubMed

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogeneous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  20. Formal methods for test case generation

    NASA Technical Reports Server (NTRS)

    Rushby, John (Inventor); De Moura, Leonardo Mendonga (Inventor); Hamon, Gregoire (Inventor)

    2011-01-01

The invention relates to the use of model checkers to generate efficient test sets for hardware and software systems. The method provides for extending existing tests to reach new coverage targets; searching *to* some or all of the uncovered targets in parallel; searching in parallel *from* some or all of the states reached in previous tests; and slicing the model relative to the current set of coverage targets. The invention provides efficient test case generation and test set formation. Deep regions of the state space can be reached within allotted time and memory. The approach has been applied with the model checkers of SRI's SAL system to model-based designs developed in Stateflow. Stateflow models achieving complete state and transition coverage in a single test case are reported.

  1. Epidemiology of measles in Southwest Nigeria: an analysis of measles case-based surveillance data from 2007 to 2012.

    PubMed

    Fatiregun, Akinola A; Adebowale, Ayodeji S; Fagbamigbe, Adeniyi F

    2014-03-01

In Nigeria, a system of measles case-based surveillance with laboratory confirmation of suspected cases was introduced in 2005 as one of the strategies for the control of measles morbidity and mortality. In this report, we provide an epidemiological distribution of confirmed cases of measles reported from the southwest of the country between 2007 and 2012, and predict the expected number of cases for the ensuing years. A descriptive analysis by person, place, and time of confirmed measles cases (confirmed by laboratory testing or epidemiological link) reported in the case-based surveillance data was carried out. Using an additive time series model, we predicted the expected number of cases to the year 2015, assuming that current interventional efforts were sustained. From the 10 187 suspected cases investigated during the time period, 1631 (16.0%) cases of measles were confirmed. The annual incidence rose from <1 case per million in 2007 to 23 cases per million in 2011. Cases were confirmed from all six states within the zone and most (97.4%) were in individuals aged less than 20 years. Seasonal variation existed with peaks of infection in the first and second quarters of the year. There was an increasing trend in the number of expected cases based on projections. Case-based surveillance provided an insight into understanding the epidemiology of measles infection in Southwest Nigeria. There is a need to work out alternate strategies for control of measles and to strengthen the surveillance system.

  2. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  3. Early childhood measles vaccinations are not associated with paediatric IBD: a population-based analysis.

    PubMed

    Shaw, Souradet Y; Blanchard, James F; Bernstein, Charles N

    2015-04-01

Early childhood vaccinations have been hypothesized to contribute to the emergence of paediatric inflammatory bowel disease [IBD] in developed countries. Using linked population-based administrative databases, we aimed to explore the association between vaccination with measles-containing vaccines and the risk for IBD. This was a case-control study using the University of Manitoba IBD Epidemiology Database [UMIBDED]. The UMIBDED was linked to the Manitoba Immunization Monitoring System [MIMS], a population-based database of immunizations administered in Manitoba. All paediatric IBD cases in Manitoba, born after 1989 and diagnosed before March 31, 2008, were included. Controls were matched to cases on the basis of age, sex, and region of residence at time of diagnosis. Measles-containing vaccinations received in the first 2 years of life were documented, with vaccinations categorized as 'None' or 'Complete', with completeness defined according to Manitoba's vaccination schedule. Conditional logistic regression models were fitted to the data, with models adjusted for physician visits in the first 2 years of life and area-level socioeconomic status at case date. A total of 951 individuals [117 cases and 834 controls] met eligibility criteria, with average age of diagnosis among cases at 11 years. The proportion of IBD cases with completed vaccinations was 97%, compared with 94% of controls. In models adjusted for physician visits and area-level socioeconomic status, no statistically significant association was detected between completed measles vaccinations and the risk of IBD (adjusted odds ratio [AOR]: 1.5; 95% confidence interval [CI]: 0.5-4.4; p = 0.419). No significant association between completed measles-containing vaccination in the first 2 years of life and paediatric IBD could be demonstrated in this population-based study. Copyright © 2015 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved.

  4. Three Cases of Adolescent Childbearing Decision-Making: The Importance of Ambivalence

    ERIC Educational Resources Information Center

    Bender, Soley S.

    2008-01-01

    Limited information is available about the childbearing decision-making experience by the pregnant adolescent. The purpose of this case study was to explore this experience with three pregnant teenagers. The study is based on nine qualitative interviews. Within-case descriptions applying the theoretical model of decision-making regarding unwanted…

  5. Predicting the impact of the 2011 conflict in Libya on population mental health: PTSD and depression prevalence and mental health service requirements.

    PubMed

    Charlson, Fiona J; Steel, Zachary; Degenhardt, Louisa; Chey, Tien; Silove, Derrick; Marnane, Claire; Whiteford, Harvey A

    2012-01-01

Mental disorders are likely to be elevated in the Libyan population during the post-conflict period. We estimated cases of severe PTSD and depression and related health service requirements using modelling from existing epidemiological data and current recommended mental health service targets in low- and middle-income countries (LMICs). Post-conflict prevalence estimates were derived from models based on a previously conducted systematic review and meta-regression analysis of mental health among populations living in conflict. Political terror ratings and intensity of exposure to traumatic events were used in predictive models. Prevalence of severe cases was applied to chosen populations along with uncertainty ranges. Six populations deemed to be affected by the conflict were chosen for modelling: Misrata (population of 444,812), Benghazi (pop. 674,094), Zintan (pop. 40,000), displaced people within Tripoli/Zlitan (pop. 49,000), displaced people within Misrata (pop. 25,000) and Ras Jdir camps (pop. 3,700). Proposed targets for service coverage, resource utilisation and full-time equivalent staffing for management of severe cases of major depression and post-traumatic stress disorder (PTSD) are based on a published model for LMICs. Severe PTSD prevalence in populations exposed to a high level of political terror and traumatic events was estimated at 12.4% (95%CI 8.5-16.7) and was 19.8% (95%CI 14.0-26.3) for severe depression. Across all six populations (total population 1,236,600), the conflict could be associated with 123,200 (71,600-182,400) cases of severe PTSD and 228,100 (134,000-344,200) cases of severe depression; 50% of PTSD cases were estimated to co-occur with severe depression. Based upon service coverage targets, approximately 154 full-time equivalent staff would be required to respond to these cases sufficiently which is substantially below the current level of resource estimates for these regions.
This is the first attempt to predict the mental health burden and consequent service response needs of such a conflict, and is crucially timed for Libya.
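The mechanical core of the estimate above is straightforward: cases = population × prevalence, applied per population with an uncertainty range. The sketch below reproduces only that step using the six populations and the overall severe-PTSD prevalence from the abstract; note the paper applies population-specific exposure levels, so a flat 12.4% will not match its reported 123,200 cases.

```python
# Applying the abstract's overall severe-PTSD prevalence (12.4%,
# 95% CI 8.5-16.7%) uniformly to the six affected populations.
# Illustrative only: the study used population-specific prevalences.

populations = {
    "Misrata": 444_812,
    "Benghazi": 674_094,
    "Zintan": 40_000,
    "Displaced, Tripoli/Zlitan": 49_000,
    "Displaced, Misrata": 25_000,
    "Ras Jdir camps": 3_700,
}

def expected_cases(prevalence):
    """Point estimate of cases per population at a given prevalence."""
    return {name: pop * prevalence for name, pop in populations.items()}

total_pop = sum(populations.values())
point, low, high = 0.124, 0.085, 0.167
print(f"total population: {total_pop:,}")
print(f"severe PTSD cases at flat prevalence: {total_pop * point:,.0f} "
      f"(range {total_pop * low:,.0f}-{total_pop * high:,.0f})")
```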

  6. Predicting the Impact of the 2011 Conflict in Libya on Population Mental Health: PTSD and Depression Prevalence and Mental Health Service Requirements

    PubMed Central

    Charlson, Fiona J.; Steel, Zachary; Degenhardt, Louisa; Chey, Tien; Silove, Derrick; Marnane, Claire; Whiteford, Harvey A.

    2012-01-01

Background Mental disorders are likely to be elevated in the Libyan population during the post-conflict period. We estimated cases of severe PTSD and depression and related health service requirements using modelling from existing epidemiological data and current recommended mental health service targets in low- and middle-income countries (LMICs). Methods Post-conflict prevalence estimates were derived from models based on a previously conducted systematic review and meta-regression analysis of mental health among populations living in conflict. Political terror ratings and intensity of exposure to traumatic events were used in predictive models. Prevalence of severe cases was applied to chosen populations along with uncertainty ranges. Six populations deemed to be affected by the conflict were chosen for modelling: Misrata (population of 444,812), Benghazi (pop. 674,094), Zintan (pop. 40,000), displaced people within Tripoli/Zlitan (pop. 49,000), displaced people within Misrata (pop. 25,000) and Ras Jdir camps (pop. 3,700). Proposed targets for service coverage, resource utilisation and full-time equivalent staffing for management of severe cases of major depression and post-traumatic stress disorder (PTSD) are based on a published model for LMICs. Findings Severe PTSD prevalence in populations exposed to a high level of political terror and traumatic events was estimated at 12.4% (95%CI 8.5–16.7) and was 19.8% (95%CI 14.0–26.3) for severe depression. Across all six populations (total population 1,236,600), the conflict could be associated with 123,200 (71,600–182,400) cases of severe PTSD and 228,100 (134,000–344,200) cases of severe depression; 50% of PTSD cases were estimated to co-occur with severe depression. Based upon service coverage targets, approximately 154 full-time equivalent staff would be required to respond to these cases sufficiently which is substantially below the current level of resource estimates for these regions.
Discussion This is the first attempt to predict the mental health burden and consequent service response needs of such a conflict, and is crucially timed for Libya. PMID:22808201

  7. Updated Global Burden of Cholera in Endemic Countries

    PubMed Central

    Ali, Mohammad; Nelson, Allyson R.; Lopez, Anna Lena; Sack, David A.

    2015-01-01

Background The global burden of cholera is largely unknown because the majority of cases are not reported. The low reporting can be attributed to limited capacity of epidemiological surveillance and laboratories, as well as social, political, and economic disincentives for reporting. We previously estimated 2.8 million cases and 91,000 deaths annually due to cholera in 51 endemic countries. A major limitation in our previous estimate was that the endemic and non-endemic countries were defined based on the countries’ reported cholera cases. We overcame the limitation with the use of a spatial modelling technique in defining endemic countries, and accordingly updated the estimates of the global burden of cholera. Methods/Principal Findings Countries were classified as cholera endemic, cholera non-endemic, or cholera-free based on whether a spatial regression model predicted an incidence rate over a certain threshold in at least three of five years (2008-2012). The at-risk populations were calculated for each country based on the percent of the country without sustainable access to improved sanitation facilities. Incidence rates from population-based published studies were used to calculate the estimated annual number of cases in endemic countries. The number of annual cholera deaths was calculated using an inverse variance-weighted average case-fatality rate (CFR) from literature-based CFR estimates. We found that approximately 1.3 billion people are at risk for cholera in endemic countries. An estimated 2.86 million cholera cases (uncertainty range: 1.3m-4.0m) occur annually in endemic countries. Among these cases, there are an estimated 95,000 deaths (uncertainty range: 21,000-143,000). Conclusion/Significance The global burden of cholera remains high. Sub-Saharan Africa accounts for the majority of this burden. Our findings can inform programmatic decision-making for cholera control. PMID:26043000
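The death estimate above hinges on pooling literature CFR estimates by inverse-variance weighting, where each estimate is weighted by the reciprocal of its sampling variance. A minimal sketch of that pooling step follows; the CFRs and variances are illustrative placeholders, not values from the paper.

```python
# Inverse-variance-weighted average of case-fatality rates (CFRs):
# each study's CFR is weighted by 1/variance, so more precise studies
# dominate the pooled estimate. Input values are hypothetical.

def inverse_variance_weighted_mean(estimates, variances):
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)

cfrs      = [0.021, 0.035, 0.048]   # hypothetical literature CFR estimates
variances = [4e-5, 9e-5, 2.5e-4]    # hypothetical sampling variances

pooled_cfr = inverse_variance_weighted_mean(cfrs, variances)
annual_cases = 2_860_000            # point estimate from the abstract
print(f"pooled CFR: {pooled_cfr:.4f}, "
      f"implied deaths: {annual_cases * pooled_cfr:,.0f}")
```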

  8. The Application of Simulation Method in Isothermal Elastic Natural Gas Pipeline

    NASA Astrophysics Data System (ADS)

    Xing, Chunlei; Guan, Shiming; Zhao, Yue; Cao, Jinggang; Chu, Yanji

    2018-02-01

An elastic pipeline mathematical model is of crucial importance in natural gas pipeline simulation because it reflects practical industrial cases. The elasticity of the pipeline introduces nonlinear complexity into the discretized equations, so the Newton-Raphson method cannot achieve fast convergence on this kind of problem. Therefore, a new Newton-based method with the Powell-Wolfe condition for simulating isothermal elastic pipeline flow is presented. The results obtained by the new method are given for the defined boundary conditions. It is shown that the method converges in all cases and significantly reduces computational cost.
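The idea of globalizing Newton's method with a line search can be illustrated on a toy scalar equation. The sketch below substitutes a simpler Armijo-style sufficient-decrease backtracking rule for the paper's Powell-Wolfe condition, so it shows the damping idea rather than the paper's exact method.

```python
# Damped Newton iteration with backtracking for a scalar equation g(x) = 0.
# The step is shortened (t halved) until the residual drops enough --
# a simplified stand-in for the Powell-Wolfe line-search conditions.

def damped_newton(g, dg, x0, tol=1e-10, alpha=1e-4, max_iter=100):
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:
            return x
        step = -gx / dg(x)          # full Newton step
        t = 1.0
        # Backtrack until the residual decreases sufficiently.
        while abs(g(x + t * step)) > (1.0 - alpha * t) * abs(gx):
            t *= 0.5
            if t < 1e-12:
                break
        x += t * step
    return x

# Example: x^3 - 2x - 5 = 0 (unique real root near 2.0945)
root = damped_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=3.0)
print(round(root, 6))   # prints 2.094551
```

For the pipeline problem the same scheme applies componentwise to the discretized nonlinear system, with the Jacobian in place of the scalar derivative.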

  9. Creating an In-School Pastoral System for Student Teachers in School-Based Initial Teacher Education

    ERIC Educational Resources Information Center

    Philpott, Carey

    2015-01-01

    Recent developments in initial teacher education (ITE) have produced a number of school-centred models. These mean that student teachers may now spend more of their time in schools than has historically been the case. In some of these models, student teachers are more clearly part of the school as an institution than might be the case in more…

  10. Model-Based Prognostics of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Bregon, Anibal

    2015-01-01

    Model-based prognostics has become a popular approach to solving the prognostics problem. However, almost all work has focused on prognostics of systems with continuous dynamics. In this paper, we extend the model-based prognostics framework to hybrid systems models that combine both continuous and discrete dynamics. In general, most systems are hybrid in nature, including those that combine physical processes with software. We generalize the model-based prognostics formulation to hybrid systems, and describe the challenges involved. We present a general approach for modeling hybrid systems, and overview methods for solving estimation and prediction in hybrid systems. As a case study, we consider the problem of conflict (i.e., loss of separation) prediction in the National Airspace System, in which the aircraft models are hybrid dynamical systems.

  11. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean times to death and recovery and the case fatality rate.
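A minimal version of the ingredients named above — agents on a small-world contact network with a per-contact transmission probability — can be sketched as follows. The network is a Watts-Strogatz-style rewired ring, and all parameters are illustrative, not the fitted age-specific probabilities from the study.

```python
# Toy agent-based epidemic on a small-world network: a ring lattice with
# randomly rewired edges, per-contact transmission probability p_transmit,
# and a crude constant daily removal probability. Illustrative parameters.
import random

def small_world(n, k, p_rewire, rng):
    """Ring of n nodes, each linked to k nearest neighbours per side,
    with each edge rewired to a random node with probability p_rewire."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < p_rewire:
                b = rng.randrange(n)
            if a != b:
                edges.add((min(a, b), max(a, b)))
    return edges

def simulate(n=300, k=3, p_rewire=0.1, p_transmit=0.05, days=100, seed=1):
    rng = random.Random(seed)
    edges = small_world(n, k, p_rewire, rng)
    state = ["S"] * n          # S: susceptible, I: infectious, R: removed
    state[0] = "I"
    for _ in range(days):
        new_infections = []
        for a, b in edges:      # each edge is one daily contact
            for src, dst in ((a, b), (b, a)):
                if state[src] == "I" and state[dst] == "S":
                    if rng.random() < p_transmit:
                        new_infections.append(dst)
        for i in new_infections:
            state[i] = "I"
        for i in range(n):      # crude removal: 10% chance per day
            if state[i] == "I" and rng.random() < 0.1:
                state[i] = "R"
    return sum(s != "S" for s in state)

print("final epidemic size:", simulate())
```

Fitting p_transmit (by age group, in the paper) then amounts to tuning such a simulation until its case counts match the reported WHO data.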

  12. Modeling of Continuum Manipulators Using Pythagorean Hodograph Curves.

    PubMed

    Singh, Inderjeet; Amara, Yacine; Melingui, Achille; Mani Pathak, Pushparaj; Merzouki, Rochdi

    2018-05-10

Research on continuum manipulators is increasingly developing in the context of bionic robotics because of their many advantages over conventional rigid manipulators. Due to their soft structure, they have inherent flexibility, which makes controlling them with high performance a huge challenge. Before elaborating a control strategy for such robots, it is essential first to reconstruct the behavior of the robot through the development of an approximate behavioral model. This can be kinematic or dynamic depending on the conditions of operation of the robot itself. Kinematically, two types of modeling methods exist to describe robot behavior: quantitative (model-based) methods and qualitative (learning-based) methods. In kinematic modeling of continuum manipulators, the assumption of constant curvature is often made to simplify the model formulation. In this work, a quantitative modeling method is proposed, based on Pythagorean hodograph (PH) curves. The aim is to obtain a three-dimensional reconstruction of the shape of the continuum manipulator with variable curvature, allowing the calculation of its inverse kinematic model (IKM). The PH-based kinematic modeling of continuum manipulators performs well relative to other kinematic modeling methods in terms of position accuracy, shape reconstruction, and time/cost of the model calculation, for two cases: free-load manipulation and variable-load manipulation. This modeling method is applied to the compact bionic handling assistant (CBHA) manipulator for validation. The results are compared with other IKMs developed for the CBHA manipulator.

  13. An analytical study of the dual mass mechanical system stability

    NASA Astrophysics Data System (ADS)

    Nikolov, Svetoslav; Sinapov, Petko; Kralov, Ivan; Ignatov, Ignat

    2011-12-01

In this paper an autonomous, nonlinear model of five ordinary differential equations modeling the motion of a dual mass mechanical system with a universal joint is studied. The model is investigated qualitatively. On the basis of the stability analysis performed, we find that the system either (i) is in an equilibrium state, or (ii) exhibits structurally unstable behavior when the equilibrium states disappear. In case (i) the system is in a normal technical condition, and in case (ii) hard breakdowns take place.

  14. Modeling river total bed material load discharge using artificial intelligence approaches (based on conceptual inputs)

    NASA Astrophysics Data System (ADS)

    Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal

    2014-06-01

This study presents Artificial Intelligence (AI)-based modeling of total bed material load, improving upon the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for estimation. Sediment data from the Qotur River (Northwestern Iran) were used for development and validation of the applied techniques. In order to assess the applied techniques against traditional models, stream power-based and shear stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. It was also revealed that the k-fold test is a practical but computationally costly technique for completely scanning the applied data and avoiding over-fitting.

  15. IDC Re-Engineering Phase 2 Iteration E2 Use Case Realizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Hamlet, Benjamin R.

    2016-06-01

This architecturally significant use case describes how the System acquires meteorological data to build atmospheric models used in automatic and interactive processing of infrasound data. The System requests the latest available high-resolution global meteorological data from external data centers and puts it into the correct formats for generation of infrasound propagation models. The system moves the meteorological data from the Data Acquisition Partition to the Data Processing Partition and stores the meteorological data. The System builds a new atmospheric model based on the meteorological data. This use case is architecturally significant because it describes acquiring meteorological data from various sources and creating a dynamic atmospheric transmission model to support the prediction of infrasonic signal detection.

  16. Possible superconductivity in Sr2IrO4 probed by quasiparticle interference

    PubMed Central

    Gao, Yi; Zhou, Tao; Huang, Huaixiang; Wang, Qiang-Hua

    2015-01-01

    Based on the possible superconducting (SC) pairing symmetries recently proposed, the quasiparticle interference (QPI) patterns in electron- and hole-doped Sr2IrO4 are theoretically investigated. In the electron-doped case, the QPI spectra can be explained based on a model similar to the octet model of the cuprates while in the hole-doped case, both the Fermi surface topology and the sign of the SC order parameter resemble those of the iron pnictides and there exists a QPI vector resulting from the interpocket scattering between the electron and hole pockets. In both cases, the evolution of the QPI vectors with energy and their behaviors in the nonmagnetic and magnetic impurity scattering cases can well be explained based on the evolution of the constant-energy contours and the sign structure of the SC order parameter. The QPI spectra presented in this paper can be compared with future scanning tunneling microscopy experiments to test whether there are SC phases in electron- and hole-doped Sr2IrO4 and what the pairing symmetry is. PMID:25783417

  17. Modelling a flows in supply chain with analytical models: Case of a chemical industry

    NASA Astrophysics Data System (ADS)

    Benhida, Khalid; Azougagh, Yassine; Elfezazi, Said

    2016-02-01

This study is concerned with the modelling of logistics flows in a supply chain composed of production sites and a logistics platform. The contribution of this research is to develop an analytical model (an integrated linear programming model), based on a case study of a real company operating in the phosphate field, that considers the various constraints in this supply chain in order to resolve planning problems and support better decision-making. The objective of this model is to determine the optimal quantities of the different products to route, to and from the various entities in the supply chain studied.

  18. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model which includes the interaction among the financial assets, reflecting the market structure. Our study is at the cutting edge in the sense that it takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of a European call option with short-time memory noise.
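The case study above extends a standard baseline: pricing a European call by Monte Carlo simulation of the asset price. The sketch below shows that baseline under plain geometric Brownian motion (no mean-field interaction or memory noise, which are the paper's contributions); parameters are illustrative.

```python
# Monte Carlo pricing of a European call under geometric Brownian motion:
# simulate terminal prices, average the discounted payoff. This is the
# standard benchmark the mean-field model generalizes, not the paper's model.
import math
import random

def mc_european_call(s0, strike, r, sigma, T, n_paths=200_000, seed=42):
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * T
    vol = sigma * math.sqrt(T)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s_T = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff_sum += max(s_T - strike, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

price = mc_european_call(s0=100.0, strike=100.0, r=0.05, sigma=0.2, T=1.0)
print(f"estimated call price: {price:.2f}")  # Black-Scholes value is ~10.45
```

A mean-field variant would replace the independent per-path dynamics with a drift/noise term that depends on the empirical distribution of all simulated assets.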

  19. Estimation of sojourn time in chronic disease screening without data on interval cases.

    PubMed

    Chen, T H; Kuo, H S; Yen, M F; Lai, M S; Tabar, L; Duffy, S W

    2000-03-01

Estimation of the sojourn time in the preclinical detectable period in disease screening, or of transition rates for the natural history of chronic disease, usually relies on interval cases (diagnosed between screens). However, ascertaining such cases might be difficult in developing countries due to incomplete registration systems and difficulties in follow-up. To overcome this problem, we propose three Markov models to estimate parameters without using interval cases. A three-state Markov model, a five-state Markov model related to regional lymph node spread, and a five-state Markov model pertaining to tumor size are applied to data on breast cancer screening in female relatives of breast cancer cases in Taiwan. Results based on the three-state Markov model give a mean sojourn time (MST) of 1.90 (95% CI: 1.18-4.86) years for this high-risk group. Validation of these models on the basis of data on breast cancer screening in the age groups 50-59 and 60-69 years from the Swedish Two-County Trial shows the estimates from a three-state Markov model that does not use interval cases are very close to those from previous Markov models taking interval cancers into account. For the five-state Markov models, a reparameterized procedure using auxiliary information on clinically detected cancers is performed to estimate the relevant parameters. A good fit in internal and external validation demonstrates the feasibility of using these models to estimate parameters that have previously required interval cancers. This method can be applied to other screening data in which there are no data on interval cases.
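In the three-state progressive model (disease-free → preclinical detectable → clinical), transitions are governed by rates, and with an exponentially distributed preclinical phase the mean sojourn time (MST) is simply the reciprocal of the preclinical-to-clinical rate. The sketch below checks that relation by simulation; the rate is chosen so MST = 1.9 years, matching the abstract's estimate, but is otherwise illustrative.

```python
# Three-state progressive Markov model, preclinical phase only:
# with transition rate lam2 (preclinical -> clinical, per year),
# sojourn times are Exponential(lam2) and MST = 1/lam2.
import random

LAM2 = 1 / 1.9  # rate chosen so that MST = 1.9 years (abstract's estimate)

def simulate_sojourns(n, rate, seed=0):
    """Draw n exponential sojourn times in the preclinical state."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

times = simulate_sojourns(100_000, LAM2)
print(f"theoretical MST: {1 / LAM2:.2f} years, "
      f"simulated mean: {sum(times) / len(times):.2f} years")
```

The estimation problem in the paper runs in the other direction: given screen-detected prevalence and incidence data (but no interval cases), infer the rates by maximum likelihood.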

  20. Not Funding the Evidence-Based Model in Ohio

    ERIC Educational Resources Information Center

    Edlefson, Carla

    2010-01-01

    The purpose of this descriptive case study was to describe the implementation of Ohio's version of the Evidence-Based Model (OEBM) state school finance system in 2009. Data sources included state budget documents and analyses as well as interviews with local school officials. The new system was responsive to three policy objectives ordered by the…

  1. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling

    USGS Publications Warehouse

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre D.; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt-Olabisi, Laura; Singer, Alison; Sterling, Eleanor J.; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human–environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM.

  2. Purpose, processes, partnerships, and products: four Ps to advance participatory socio-environmental modeling.

    PubMed

    Gray, Steven; Voinov, Alexey; Paolisso, Michael; Jordan, Rebecca; BenDor, Todd; Bommel, Pierre; Glynn, Pierre; Hedelin, Beatrice; Hubacek, Klaus; Introne, Josh; Kolagani, Nagesh; Laursen, Bethany; Prell, Christina; Schmitt Olabisi, Laura; Singer, Alison; Sterling, Eleanor; Zellner, Moira

    2018-01-01

    Including stakeholders in environmental model building and analysis is an increasingly popular approach to understanding ecological change. This is because stakeholders often hold valuable knowledge about socio-environmental dynamics and collaborative forms of modeling produce important boundary objects used to collectively reason about environmental problems. Although the number of participatory modeling (PM) case studies and the number of researchers adopting these approaches has grown in recent years, the lack of standardized reporting and limited reproducibility have prevented PM's establishment and advancement as a cohesive field of study. We suggest a four-dimensional framework (4P) that includes reporting on dimensions of (1) the Purpose for selecting a PM approach (the why); (2) the Process by which the public was involved in model building or evaluation (the how); (3) the Partnerships formed (the who); and (4) the Products that resulted from these efforts (the what). We highlight four case studies that use common PM software-based approaches (fuzzy cognitive mapping, agent-based modeling, system dynamics, and participatory geospatial modeling) to understand human-environment interactions and the consequences of ecological changes, including bushmeat hunting in Tanzania and Cameroon, agricultural production and deforestation in Zambia, and groundwater management in India. We demonstrate how standardizing communication about PM case studies can lead to innovation and new insights about model-based reasoning in support of ecological policy development. We suggest that our 4P framework and reporting approach provides a way for new hypotheses to be identified and tested in the growing field of PM. © 2017 by the Ecological Society of America.

  3. Developing and pretesting case studies in dental and dental hygiene education: using the diffusion of innovations model.

    PubMed

    Cragun, Deborah L; DeBate, Rita DiGioacchino; Severson, Herbert H; Shaw, Tracy; Christiansen, Steve; Koerber, Anne; Tomar, Scott L; Brown, Kelli McCormack; Tedesco, Lisa A; Hendricson, William D

    2012-05-01

    Case-based learning offers exposure to clinical situations that health professions students may not encounter in their training. The purposes of this study were to apply the Diffusion of Innovations conceptual framework to 1) identify characteristics of case studies that would increase their adoption among dental and dental hygiene faculty members and 2) develop and pretest interactive web-based case studies on sensitive oral-systemic health issues. The formative study spanned two phases using mixed methods (Phase 1: eight focus groups and four interviews; Phase 2: ten interviews and satisfaction surveys). Triangulation of quantitative and qualitative data revealed the following positive attributes of the developed case studies: relative advantage of active learning and modeling; compatibility with a variety of courses; observability of case-related knowledge and skills; independent learning; and modifiability for use with other oral-systemic health issues. These positive attributes are expected to increase the likelihood that dental and dental hygiene faculty members will adopt the developed case study once it is available for use. The themes identified in this study could be applied to the development of future case studies and may provide broader insight that might prove useful for exploring differences in case study use across dental and dental hygiene curricula.

  4. Forecasting malaria incidence based on monthly case reports and environmental factors in Karuzi, Burundi, 1997–2003

    PubMed Central

    Gomez-Elipe, Alberto; Otero, Angel; van Herp, Michel; Aguirre-Jaime, Armando

    2007-01-01

    Background The objective of this work was to develop a model to predict malaria incidence in an area of unstable transmission by studying the association between environmental variables and disease dynamics. Methods The study was carried out in Karuzi, a province in the Burundi highlands, using time series of monthly notifications of malaria cases from local health facilities, data from rain and temperature records, and the normalized difference vegetation index (NDVI). Using autoregressive integrated moving average (ARIMA) methodology, a model showing the relation between monthly notifications of malaria cases and the environmental variables was developed. Results The best forecasting model (R²adj = 82%, p < 0.0001, and 93% forecasting accuracy in the range ± 4 cases per 100 inhabitants) included the NDVI, mean maximum temperature, rainfall and number of malaria cases in the preceding month. Conclusion This model is a simple and useful tool for producing reasonably reliable forecasts of the malaria incidence rate in the study area. PMID:17892540
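
    As a rough illustration of the forecasting structure described above (this month's cases regressed on the previous month's cases and environmental covariates), the sketch below fits a one-month-lag linear model to simulated data with NumPy. All series and coefficients are invented for illustration; the actual study used full ARIMA methodology, not plain least squares.

```python
import numpy as np

# Simulated monthly series standing in for the Karuzi data: cases per
# 100 inhabitants, rainfall, maximum temperature, and NDVI (all invented).
rng = np.random.default_rng(0)
n = 72
rain = rng.gamma(2.0, 40.0, n)
tmax = 25 + 3 * np.sin(2 * np.pi * np.arange(n) / 12)
ndvi = 0.3 + 0.002 * rain + rng.normal(0, 0.02, n)
cases = np.empty(n)
cases[0] = 5.0
for t in range(1, n):
    cases[t] = (0.6 * cases[t - 1] + 0.01 * rain[t - 1]
                + 0.1 * tmax[t - 1] + 2.0 * ndvi[t - 1] + rng.normal(0, 0.5))

# Regress this month's cases on last month's cases and covariates,
# mirroring the one-month-lag predictors reported in the abstract.
X = np.column_stack([np.ones(n - 1), cases[:-1], rain[:-1], tmax[:-1], ndvi[:-1]])
y = cases[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

r2 = 1 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
forecast = np.array([1.0, cases[-1], rain[-1], tmax[-1], ndvi[-1]]) @ beta
```

    Because the simulated process really is driven by the lagged predictors, the fitted R² is high here; on real surveillance data the ARIMA error structure matters.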

  5. Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety

    NASA Astrophysics Data System (ADS)

    Mikula, J. F. Kip

    2005-12-01

    This paper explores and defines the currently accepted concept and philosophy of safety improvement based on reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory, a reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model, a Fault Tree Analysis (FTA) of the system, or an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in this calculation are hardware failure rates gleaned from past similar programs. As part of this paper, a fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. A safety determination and enhancement approach based on hazard analysis, worst-case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is then defined and detailed using the same example case study as the REBST case study. In the end, it is concluded that an approach combining the two theories works best to reduce safety risk.

  6. [Prediction method of rural landscape pattern evolution based on life cycle: a case study of Jinjing Town, Hunan Province, China].

    PubMed

    Ji, Xiang; Liu, Li-Ming; Li, Hong-Qing

    2014-11-01

    Taking Jinjing Town in the Dongting Lake area as a case, this paper analyzed the evolution of rural landscape patterns by means of life cycle theory, simulated the evolution cycle curve, and calculated its evolution period; then, combined with a CA-Markov model, a complete prediction model was built based on the rules of rural landscape change. The results showed that the rural settlement and paddy landscapes of Jinjing Town would change most by 2020, with the rural settlement landscape increasing to 1194.01 hm² and the paddy landscape greatly reduced to 3090.24 hm². The quantitative and spatial prediction accuracies of the model were up to 99.3% and 96.4%, respectively, more explicit than the single CA-Markov model. The prediction model of rural landscape pattern change proposed in this paper would be helpful for future rural landscape planning.
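
    The Markov half of a CA-Markov projection can be sketched in a few lines: a row-stochastic transition matrix (in practice estimated from two past land-use maps) is applied to the current area vector. The classes, areas, and probabilities below are invented for illustration, not the Jinjing Town data.

```python
import numpy as np

# Hypothetical land-use areas (hm^2) for three classes.
classes = ["settlement", "paddy", "forest"]
area_now = np.array([900.0, 3600.0, 1500.0])

# Row-stochastic transition matrix: P[i, j] is the probability that a
# cell of class i becomes class j by the next period.
P = np.array([
    [0.95, 0.03, 0.02],   # settlement mostly persists
    [0.06, 0.90, 0.04],   # some paddy converts to settlement
    [0.02, 0.03, 0.95],
])

area_next = area_now @ P  # expected areas one period ahead
```

    The CA (cellular automaton) step then allocates these expected areas spatially using suitability maps and neighborhood rules. Because each row of P sums to one, total area is conserved by the Markov step.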

  7. Functional Risk Modeling for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Thomson, Fraser; Mathias, Donovan; Go, Susie; Nejad, Hamed

    2010-01-01

    We introduce an approach to risk modeling that we call functional modeling, which we have developed to estimate the capabilities of a lunar base. The functional model tracks the availability of functions provided by systems, in addition to the operational state of those systems' constituent strings. By tracking functions, we are able to identify cases where identical functions are provided by elements (rovers, habitats, etc.) that are connected together on the lunar surface. We credit functional diversity in those cases, and in doing so compute more realistic estimates of operational mode availabilities. The functional modeling approach yields more realistic estimates of the availability of the various operational modes provided to astronauts by the ensemble of surface elements included in a lunar base architecture. By tracking functional availability, the effect of diverse backup, which often exists when two or more independent elements are connected together, is properly accounted for.
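
    In its simplest form, the "credit for functional diversity" idea reduces to treating a function as available unless every element that provides it is down. The element availabilities below are illustrative numbers, not values from the NASA study.

```python
# Two hypothetical surface elements, each independently providing the
# same function (e.g. life support) with the given availabilities.
A_rover, A_habitat = 0.90, 0.95

# Function-level availability with diverse-backup credit: the function
# fails only if both independent providers fail simultaneously.
A_function = 1 - (1 - A_rover) * (1 - A_habitat)

# Element-level bookkeeping alone would credit only the best provider.
A_single = max(A_rover, A_habitat)
```

    With these numbers the function is available 99.5% of the time, noticeably better than either element alone, which is the effect the abstract says element-only tracking misses.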

  8. Scalable methodology for large scale building energy improvement: Relevance of calibration in model-based retrofit analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane

    2015-05-01

    The increasing interest in retrofitting of existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emission by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intense application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.

  9. Monte Carlo grain growth modeling with local temperature gradients

    NASA Astrophysics Data System (ADS)

    Tan, Y.; Maniatty, A. M.; Zheng, C.; Wen, J. T.

    2017-09-01

    This work investigated the development of a Monte Carlo (MC) simulation approach to modeling grain growth in the presence of a non-uniform temperature field that may vary with time. We first scale the MC model to physical growth processes by fitting experimental data. Based on the scaling relationship, we derive a grid site selection probability (SSP) function to account for the effect of a spatially varying temperature field. The SSP function is based on the differential MC step, which allows it to naturally handle time-varying temperature fields as well. We verify the model and compare the predictions to other existing formulations (Godfrey and Martin 1995 Phil. Mag. A 72 737-49; Radhakrishnan and Zacharia 1995 Metall. Mater. Trans. A 26 2123-30) in simple two-dimensional cases with only spatially varying temperature fields, where the predicted grain growth in regions of constant temperature is expected to be the same as in the isothermal case. We also test the model in a more realistic three-dimensional case with a temperature field varying in both space and time, modeling grain growth in the heat-affected zone of a weld. We believe the newly proposed approach is promising for modeling grain growth in material manufacturing processes that involve time-dependent local temperature gradients.
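
    A minimal 2D Potts-model sketch of the general approach: sites are selected with a probability weighted by a temperature-dependent factor (a crude stand-in for the paper's SSP function), and reorientation flips that do not raise the grain-boundary energy are accepted. Grid size, activation energy, and the temperature field are all illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)
L, Q = 32, 1.0
spins = rng.integers(0, 8, size=(L, L))              # grain orientations
T = np.linspace(0.5, 1.5, L)[None, :].repeat(L, 0)   # hotter on the right

# Site-selection probability ~ exp(-Q/kT(x)): hot sites are picked more
# often, mimicking locally faster grain growth.
ssp = np.exp(-Q / T)
flat = (ssp / ssp.sum()).ravel()

def boundary_energy(s):
    # count unlike nearest-neighbour bonds (periodic boundaries)
    return (s != np.roll(s, 1, 0)).sum() + (s != np.roll(s, 1, 1)).sum()

e0 = boundary_energy(spins)
for _ in range(20000):
    idx = rng.choice(L * L, p=flat)                  # SSP-weighted site pick
    i, j = divmod(idx, L)
    new = spins[(i + rng.integers(-1, 2)) % L, (j + rng.integers(-1, 2)) % L]
    old = spins[i, j]
    nbrs = [spins[(i - 1) % L, j], spins[(i + 1) % L, j],
            spins[i, (j - 1) % L], spins[i, (j + 1) % L]]
    dE = sum(new != n for n in nbrs) - sum(old != n for n in nbrs)
    if dE <= 0:                                      # curvature-driven growth only
        spins[i, j] = new
e1 = boundary_energy(spins)                          # boundary energy drops as grains coarsen
```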

  10. Expert knowledge elicitation using computer simulation: the organization of frail elderly case management as an illustration.

    PubMed

    Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean

    2014-08-01

    Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can be usefully informed by field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model was implemented and used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.

  11. Bifurcation study of phase oscillator systems with attractive and repulsive interaction.

    PubMed

    Burylko, Oleksandr; Kazanovich, Yakov; Borisyuk, Roman

    2014-08-01

    We study a model of globally coupled phase oscillators that contains two groups of oscillators with positive (synchronizing) and negative (desynchronizing) incoming connections for the first and second groups, respectively. This model was previously studied by Hong and Strogatz (the Hong-Strogatz model) in the case of a large number of oscillators. We consider a generalized Hong-Strogatz model with a constant phase shift in coupling. Our approach is based on the study of invariant manifolds and bifurcation analysis of the system. In the case of zero phase shift, various invariant manifolds are analytically described and a new dynamical mode is found. In the case of a nonzero phase shift we obtained a set of bifurcation diagrams for various systems with three or four oscillators. It is shown that in these cases system dynamics can be complex enough and include multistability and chaotic oscillations.
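
    A rough numerical sketch of the system described above: a Kuramoto-type mean-field model in which one group of identical oscillators has attractive and the other repulsive incoming coupling, with a phase shift alpha as in the generalized model. Group sizes, coupling strengths, and alpha = 0 are illustrative choices; with a 3:1 attractive majority the population settles into a partially synchronized state.

```python
import numpy as np

# Group 1 (30 oscillators): positive incoming coupling (synchronizing).
# Group 2 (10 oscillators): negative incoming coupling (desynchronizing).
rng = np.random.default_rng(5)
K = np.concatenate([np.full(30, 1.0), np.full(10, -0.5)])
theta = rng.uniform(0, 2 * np.pi, 40)
alpha, dt = 0.0, 0.01

for _ in range(5000):                       # forward-Euler integration
    z = np.exp(1j * theta).mean()           # complex Kuramoto order parameter
    r, psi = np.abs(z), np.angle(z)
    # dtheta_i/dt = K_i * r * sin(psi - theta_i + alpha)  (mean-field form)
    theta += dt * K * r * np.sin(psi - theta + alpha)

r_final = np.abs(np.exp(1j * theta).mean())
```

    In the long run the attractive group locks to the mean phase while the repulsive group drifts to the opposite phase, so the order parameter settles near the group-size imbalance rather than 1.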

  12. Bifurcation study of phase oscillator systems with attractive and repulsive interaction

    NASA Astrophysics Data System (ADS)

    Burylko, Oleksandr; Kazanovich, Yakov; Borisyuk, Roman

    2014-08-01

    We study a model of globally coupled phase oscillators that contains two groups of oscillators with positive (synchronizing) and negative (desynchronizing) incoming connections for the first and second groups, respectively. This model was previously studied by Hong and Strogatz (the Hong-Strogatz model) in the case of a large number of oscillators. We consider a generalized Hong-Strogatz model with a constant phase shift in coupling. Our approach is based on the study of invariant manifolds and bifurcation analysis of the system. In the case of zero phase shift, various invariant manifolds are analytically described and a new dynamical mode is found. In the case of a nonzero phase shift we obtained a set of bifurcation diagrams for various systems with three or four oscillators. It is shown that in these cases system dynamics can be complex enough and include multistability and chaotic oscillations.

  13. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries.

    PubMed

    Kannan, Vaishnavi; Fish, Jason C; Willett, DuWayne L

    2016-02-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system's requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. "Agile Modeling" retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams.

  14. [Strategic health planning based on determinants: case of the municipality of Campo Bom, Rio Grande do Sul State. A methodological proposal for the decentralized management].

    PubMed

    González, Martín Maximino León

    2009-10-01

    With the purpose to analyze the health strategic planning model based on determinants experienced in the municipality of Campo Bom, Rio Grande do Sul State, it was conducted an observational, qualitative study, of documental analysis as well as an evaluation of new process technologies in local health administration. This study contains an analysis of the methodological coherency and applicability of this model, based on the revision of the elaborated plans. The plans presented at Campo Bom case shows the possibility of integration and applicability at local level, of a health strategic planning model oriented to the new health concepts considering elements of different theoretical developments that enables the response to the most common local needs and situations. It was identified evolutional stages of health planning and analyzed integrative elements of the model and limitations of its application, pointing to the need of support the deepening on the study and the development of the field.

  15. Systems engineering interfaces: A model based approach

    NASA Astrophysics Data System (ADS)

    Fosse, E.; Delp, C. L.

    The engineering of interfaces is a critical function of the discipline of Systems Engineering. Included in interface engineering are instances of interaction. Interfaces provide the specifications of the relevant properties of a system or component that can be connected to other systems or components while instances of interaction are identified in order to specify the actual integration to other systems or components. Current Systems Engineering practices rely on a variety of documents and diagrams to describe interface specifications and instances of interaction. The SysML[1] specification provides a precise model based representation for interfaces and interface instance integration. This paper will describe interface engineering as implemented by the Operations Revitalization Task using SysML, starting with a generic case and culminating with a focus on a Flight System to Ground Interaction. The reusability of the interface engineering approach presented as well as its extensibility to more complex interfaces and interactions will be shown. Model-derived tables will support the case studies shown and are examples of model-based documentation products.

  16. Using a multinomial tree model for detecting mixtures in perceptual detection

    PubMed Central

    Chechile, Richard A.

    2014-01-01

    In the area of memory research there have been two rival approaches for memory measurement—signal detection theory (SDT) and multinomial processing trees (MPT). Both approaches provide measures for the quality of the memory representation, and both approaches provide for corrections for response bias. In recent years there has been a strong case advanced for the MPT approach because of the finding of stochastic mixtures on both target-present and target-absent tests. In this paper a case is made that perceptual detection, like memory recognition, involves a mixture of processes that are readily represented as a MPT model. The Chechile (2004) 6P memory measurement model is modified in order to apply to the case of perceptual detection. This new MPT model is called the Perceptual Detection (PD) model. The properties of the PD model are developed, and the model is applied to some existing data of a radiologist examining CT scans. The PD model brings out novel features that were absent from a standard SDT analysis. Also the topic of optimal parameter estimation on an individual-observer basis is explored with Monte Carlo simulations. These simulations reveal that the mean of the Bayesian posterior distribution is a more accurate estimator than the corresponding maximum likelihood estimator (MLE). Monte Carlo simulations also indicate that model estimates based on only the data from an individual observer can be improved upon (in the sense of being more accurate) by an adjustment that takes into account the parameter estimate based on the data pooled across all the observers. The adjustment of the estimate for an individual is discussed as an analogous statistical effect to the improvement over the individual MLE demonstrated by the James–Stein shrinkage estimator in the case of the multiple-group normal model. PMID:25018741
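
    The abstract's claim that the Bayesian posterior mean can beat the MLE is easy to reproduce in miniature for a single detection probability. The sketch below compares mean squared errors for a binomial parameter under a uniform prior; the true value, trial count, and replication count are arbitrary choices, not the paper's simulation design.

```python
import numpy as np

# For a detection probability estimated from n Bernoulli trials, compare
# the MLE k/n with the posterior mean under a uniform Beta(1,1) prior,
# which is (k + 1) / (n + 2).
rng = np.random.default_rng(2)
theta_true, n, reps = 0.7, 20, 5000
k = rng.binomial(n, theta_true, size=reps)

mle = k / n
post_mean = (k + 1) / (n + 2)        # shrinks estimates toward 0.5

mse_mle = np.mean((mle - theta_true) ** 2)
mse_bayes = np.mean((post_mean - theta_true) ** 2)
```

    The shrinkage trades a small bias for a larger variance reduction, so the posterior mean has lower MSE here, the same qualitative effect the abstract reports for the PD model parameters.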

  17. Convergence of methods for coupling of microscopic and mesoscopic reaction-diffusion simulations

    NASA Astrophysics Data System (ADS)

    Flegg, Mark B.; Hellander, Stefan; Erban, Radek

    2015-05-01

    In this paper, three multiscale methods for coupling of mesoscopic (compartment-based) and microscopic (molecular-based) stochastic reaction-diffusion simulations are investigated. Two of the three methods that will be discussed in detail have been previously reported in the literature: the two-regime method (TRM) and the compartment-placement method (CPM). The third method that is introduced and analysed in this paper is called the ghost cell method (GCM), since it works by constructing a "ghost cell" in which molecules can disappear and jump into the compartment-based simulation. A comparison of the sources of error is presented. The convergence properties of this error are studied as the time step Δt (for updating the molecular-based part of the model) approaches zero. It is found that the error behaviour depends on another fundamental computational parameter h, the compartment size in the mesoscopic part of the model. Two important limiting cases, which appear in applications, are considered: (i) Δt → 0 with h fixed; and (ii) Δt → 0 and h → 0 such that √Δt/h is fixed. The error of the previously developed approaches (the TRM and CPM) converges to zero only in limiting case (ii), but not in case (i). It is shown that the error of the GCM converges in limiting case (i). Thus the GCM is superior to previous coupling techniques if the mesoscopic description is much coarser than the microscopic part of the model.

  18. A Biomechanical Model for Lung Fibrosis in Proton Beam Therapy

    NASA Astrophysics Data System (ADS)

    King, David J. S.

    The physics of protons makes them well-suited to conformal radiotherapy due to the well-known Bragg peak effect. Despite a proton's sharply defined stopping range, uncertainty effects can cause a small amount of dose to overflow into an organ at risk (OAR). Previous models for calculating normal tissue complication probabilities (NTCPs) relied on the equivalent uniform dose (EUD) model, in which the organ was split into 1/3, 2/3 or whole-organ irradiation. However, the problem of dealing with volumes <1/3 of the total volume renders this EUD-based approach no longer applicable. In this work, the case for an experimental data-based replacement at low volumes is investigated. Lung fibrosis is investigated as an NTCP effect typically arising from dose overflow during irradiation of tumours at the spinal base. Considering a 3D geometrical model of the lungs, irradiations are modelled with variable parameters of dose overflow. To calculate NTCPs without the EUD model, experimental data are used from the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) study. Additional side projects are also investigated, introduced, and explained at various points. A typical radiotherapy course of 30 × 2 Gy fractions is simulated, and a range of target-volume geometries and irradiation types is investigated. Investigations with X-rays found the majority of the data-point ratios (the ratio of EUD values found from the calculation-based and data-based methods) within 20% of unity, showing relatively close agreement. The ratios did not systematically prefer one particular type of predictive method, and no Vx metric was found to consistently outperform another. Agreement is good in certain cases and not in others, as predicted in the literature. The overall results lead to the conclusion that there is no reason to discount the use of the data-based predictive method, particularly as a low-volume replacement predictive method.
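
    For context, the conventional EUD-based route that the work argues against at low volumes is usually the Lyman model: a generalized EUD is computed from the dose-volume histogram and pushed through a normal CDF. The parameter values (TD50, m, n) and the toy DVH below are placeholders, not fitted lung-fibrosis values.

```python
import numpy as np
from math import erf, sqrt

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose from DVH bins."""
    v = volumes / volumes.sum()
    return (v @ doses ** (1.0 / n)) ** n

def ntcp_lyman(eud, TD50, m):
    """Lyman NTCP: standard normal CDF of (EUD - TD50) / (m * TD50)."""
    t = (eud - TD50) / (m * TD50)
    return 0.5 * (1 + erf(t / sqrt(2)))

doses = np.array([5.0, 20.0, 40.0, 60.0])   # Gy, per DVH bin (illustrative)
vols = np.array([0.4, 0.3, 0.2, 0.1])       # relative volumes
eud = gEUD(doses, vols, n=1.0)              # n = 1 reduces to mean dose
p = ntcp_lyman(eud, TD50=30.0, m=0.3)       # complication probability
```

    With n = 1 the gEUD is just the volume-weighted mean dose (22 Gy here); a data-based method would instead interpolate complication rates directly from clinical tables such as QUANTEC.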

  19. Numerical study of turbulence-influence mechanism on arc characteristics in an air direct current circuit breaker

    NASA Astrophysics Data System (ADS)

    Wu, Mingliang; Yang, Fei; Rong, Mingzhe; Wu, Yi; Qi, Yang; Cui, Yufei; Liu, Zirui; Guo, Anxiang

    2016-04-01

    This paper focuses on the numerical investigation of arc characteristics in an air direct current circuit breaker (air DCCB). Using magneto-hydrodynamics (MHD) theory, 3D laminar model and turbulence model are constructed and calculated. The standard k-epsilon model is utilized to consider the turbulence effect in the arc chamber of the DCCB. Several important phenomena are found: the arc column in the turbulence-model case is more extensive, moves much more slowly than the counterpart in the laminar-model case, and shows stagnation at the entrance of the chamber, unlike in the laminar-model case. Moreover, the arc voltage in the turbulence-model case is much lower than in the laminar-model case. However, the results in the turbulence-model case show a much better agreement with the results of the breaking experiments under DC condition than in the laminar-model case, which is contradictory to the previous conclusions from the arc researches of both the low-voltage circuit breaker and the sulfur hexafluoride (SF6) nozzle. First, in the previous air-arc research of the low-voltage circuit breaker, it is assumed that the air plasma inside the chamber is in the state of laminar, and the laminar-model application gives quite satisfactory results compared with the experiments, while in this paper, the laminar-model application works badly. Second, the turbulence-model application in the arc research of the SF6-nozzle performs much better and gives higher arc voltage than the laminar-model application does, whereas in this paper, the turbulence-model application predicts lower arc voltage than the laminar-model application does. Based on the analysis of simulation results in detail, the mechanism of the above phenomena is revealed. The transport coefficients are strongly changed by turbulence, which will enhance the arc diffusion and make the arc volume much larger. 
Consequently, the arc appearance and the distribution of Lorentz force in the turbulence-model case substantially differ from the arc appearance and the distribution of Lorentz force in the laminar-model case. Thus, the moving process of the arc in the turbulence-model case is slowed down and slower than in the laminar-model case. Moreover, the more extensive arc column in the turbulence-model case reduces the total arc resistance, which results in a lower arc voltage, more consistent with the experimental results than the arc voltage in the laminar-model case. Therefore, the air plasma inside this air DCCB is believed to be in the turbulence state, and the turbulence model is more suitable than the laminar model for the arc simulation of this kind of air DCCB.

  20. Numerical study of turbulence-influence mechanism on arc characteristics in an air direct current circuit breaker

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Mingliang; Yang, Fei, E-mail: yfei2007@mail.xjtu.edu.cn; Rong, Mingzhe

    This paper focuses on the numerical investigation of arc characteristics in an air direct current circuit breaker (air DCCB). Using magneto-hydrodynamics (MHD) theory, 3D laminar model and turbulence model are constructed and calculated. The standard k-epsilon model is utilized to consider the turbulence effect in the arc chamber of the DCCB. Several important phenomena are found: the arc column in the turbulence-model case is more extensive, moves much more slowly than the counterpart in the laminar-model case, and shows stagnation at the entrance of the chamber, unlike in the laminar-model case. Moreover, the arc voltage in the turbulence-model case is much lower than in the laminar-model case. However, the results in the turbulence-model case show a much better agreement with the results of the breaking experiments under DC condition than in the laminar-model case, which is contradictory to the previous conclusions from the arc researches of both the low-voltage circuit breaker and the sulfur hexafluoride (SF6) nozzle. First, in the previous air-arc research of the low-voltage circuit breaker, it is assumed that the air plasma inside the chamber is in the state of laminar, and the laminar-model application gives quite satisfactory results compared with the experiments, while in this paper, the laminar-model application works badly. Second, the turbulence-model application in the arc research of the SF6-nozzle performs much better and gives higher arc voltage than the laminar-model application does, whereas in this paper, the turbulence-model application predicts lower arc voltage than the laminar-model application does. Based on the analysis of simulation results in detail, the mechanism of the above phenomena is revealed. The transport coefficients are strongly changed by turbulence, which will enhance the arc diffusion and make the arc volume much larger. Consequently, the arc appearance and the distribution of Lorentz force in the turbulence-model case substantially differ from the arc appearance and the distribution of Lorentz force in the laminar-model case. Thus, the moving process of the arc in the turbulence-model case is slowed down and slower than in the laminar-model case. Moreover, the more extensive arc column in the turbulence-model case reduces the total arc resistance, which results in a lower arc voltage, more consistent with the experimental results than the arc voltage in the laminar-model case. Therefore, the air plasma inside this air DCCB is believed to be in the turbulence state, and the turbulence model is more suitable than the laminar model for the arc simulation of this kind of air DCCB.

  1. Network-based regularization for matched case-control analysis of high-dimensional DNA methylation data.

    PubMed

    Sun, Hokeun; Wang, Shuang

    2013-05-30

    The matched case-control designs are commonly used to control for potential confounding factors in genetic epidemiology studies especially epigenetic studies with DNA methylation. Compared with unmatched case-control studies with high-dimensional genomic or epigenetic data, there have been few variable selection methods for matched sets. In an earlier paper, we proposed the penalized logistic regression model for the analysis of unmatched DNA methylation data using a network-based penalty. However, for popularly applied matched designs in epigenetic studies that compare DNA methylation between tumor and adjacent non-tumor tissues or between pre-treatment and post-treatment conditions, applying ordinary logistic regression ignoring matching is known to bring serious bias in estimation. In this paper, we developed a penalized conditional logistic model using the network-based penalty that encourages a grouping effect of (1) linked Cytosine-phosphate-Guanine (CpG) sites within a gene or (2) linked genes within a genetic pathway for analysis of matched DNA methylation data. In our simulation studies, we demonstrated the superiority of using conditional logistic model over unconditional logistic model in high-dimensional variable selection problems for matched case-control data. We further investigated the benefits of utilizing biological group or graph information for matched case-control data. We applied the proposed method to a genome-wide DNA methylation study on hepatocellular carcinoma (HCC) where we investigated the DNA methylation levels of tumor and adjacent non-tumor tissues from HCC patients by using the Illumina Infinium HumanMethylation27 Beadchip. Several new CpG sites and genes known to be related to HCC were identified but were missed by the standard method in the original paper. Copyright © 2012 John Wiley & Sons, Ltd.
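
    The core of the conditional likelihood for 1:1 matched pairs is simple to sketch: it depends only on the within-pair covariate difference, so fitting reduces to a logistic model on those differences with no intercept. The simulation below uses a small ridge penalty as a simple stand-in for the paper's network-based penalty (whose CpG/pathway graph structure is not reproduced here); all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
pairs, p = 300, 4
beta_true = np.array([1.0, -0.5, 0.0, 0.0])   # last two features are null

# Within-pair differences d = x_case - x_control, oriented so that the
# conditional likelihood of the observed labelling is sigmoid(beta . d).
d = rng.normal(size=(pairs, p))
keep = rng.random(pairs) < 1 / (1 + np.exp(-d @ beta_true))
d[~keep] *= -1

# Gradient ascent on the penalized conditional log-likelihood
# sum_i log sigmoid(d_i . beta) - (lam/2) * ||beta||^2.
beta = np.zeros(p)
lam = 0.01
for _ in range(500):
    prob = 1 / (1 + np.exp(-d @ beta))
    beta += 0.5 * (d.T @ (1 - prob) / pairs - lam * beta)
```

    Note that ignoring the matching and running an ordinary logistic regression on cases and controls separately would not reduce to this differenced form, which is the bias the abstract warns about.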

  2. Re-resection rates after breast-conserving surgery as a performance indicator: introduction of a case-mix model to allow comparison between Dutch hospitals.

    PubMed

    Talsma, A K; Reedijk, A M J; Damhuis, R A M; Westenend, P J; Vles, W J

    2011-04-01

    Re-resection rate after breast-conserving surgery (BCS) has been introduced as an indicator of the quality of surgical treatment in the international literature. The present study aims to develop a case-mix model for re-resection rates and to evaluate its performance in comparing results between hospitals. Electronic records of eligible patients diagnosed with in-situ and invasive breast cancer in 2006 and 2007 were derived from 16 hospitals in the Rotterdam Cancer Registry (RCR) (n = 961). A model was built in which prognostic factors for re-resections after BCS were identified and the expected re-resection rate could be assessed for hospitals based on their case mix. To illustrate the opportunities of monitoring re-resections over time, after risk adjustment for patient profile, a VLAD chart was drawn for patients in one hospital. In general, three out of every ten women had re-surgery; in about 50% this meant an additional mastectomy. Independent prognostic factors of re-resection after multivariate analysis were histological type, sublocalisation, tumour size, lymph node involvement and multifocal disease. After correction for case mix, one hospital was performing significantly fewer re-resections than the reference hospital, while two were performing significantly more re-resections than expected based on their patient mix. Our population-based study confirms earlier reports that re-resection is frequently required after an initial breast-conserving operation. Case-mix models such as the one we constructed can be used to correct for variation between hospitals' performance. VLAD charts are valuable tools for monitoring quality of care within individual hospitals. Copyright © 2011 Elsevier Ltd. All rights reserved.
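
    A VLAD (variable life-adjusted display) chart of the kind mentioned above is just the running sum of expected minus observed events, where each patient's expected risk comes from the case-mix model. The risks below are simulated, not RCR data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
# Case-mix-adjusted re-resection risk per patient (hypothetical model output).
expected = rng.uniform(0.1, 0.5, n)
# Observed outcomes: 1 = re-resection occurred.
observed = rng.binomial(1, expected)

# VLAD statistic: positive values mean fewer events than the case mix
# predicts (better than expected performance), negative values the reverse.
vlad = np.cumsum(expected - observed)
```

    Plotted against patient sequence number, a sustained downward drift in this curve flags a hospital performing more re-resections than its patient mix predicts.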

  3. Computation of iodine species concentrations in water

    NASA Technical Reports Server (NTRS)

    Schultz, John R.; Mudgett, Paul D.; Flanagan, David T.; Sauer, Richard L.

    1994-01-01

    During an evaluation of the use of iodine as a water disinfectant and the development of methods for measuring various iodine species in water onboard Space Station Freedom, it became necessary to compute the concentrations of the various species based on equilibrium principles alone. Of particular concern was the case when various amounts of iodine, iodide, strong acid, and strong base are added to water. Such solutions can be used to evaluate the performance of various monitoring methods being considered. The authors of this paper present an overview of aqueous iodine chemistry, a set of nonlinear equations which can be used to model the above case, and a computer program for solving this system of equations using the Newton-Raphson method. The program was validated by comparing results over a range of concentrations and pH values with those previously presented by Gottardi for a given pH. Use of this program indicated that many cases have multiple roots, so selecting an appropriate initial guess is important. Comparison of program results with laboratory results for the case when only iodine is added to water indicates the program gives pH values that are too high for the iodine concentrations normally used for water disinfection. Extending the model to include the effects of iodate formation brings the computed pH values closer to those observed, but the model with iodate does not agree well for the case in which base is added in addition to iodine to raise the pH. Potential explanations include failure to obtain equilibrium conditions in the lab, inaccuracies in published values for the equilibrium constants, an inadequate model of iodine chemistry, and/or the lack of adequate analytical methods for measuring the various iodine species in water.
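The Newton-Raphson iteration named in the abstract can be sketched generically: repeatedly linearize F(x) = 0 with a Jacobian and solve for the update. The toy two-equation system below stands in for the iodine equilibrium equations, which the abstract does not reproduce:

```python
# Hedged Newton-Raphson sketch for a 2-equation nonlinear system F(x) = 0,
# using a forward-difference Jacobian and a 2x2 Cramer's-rule solve.
def newton(F, x0, tol=1e-10, max_iter=50, h=1e-7):
    x = list(x0)
    for _ in range(max_iter):
        fx = F(x)
        if max(abs(v) for v in fx) < tol:
            break
        # Numerical Jacobian J[i][j] = dF_i / dx_j
        J = [[0.0, 0.0], [0.0, 0.0]]
        for j in range(2):
            xp = list(x)
            xp[j] += h
            fp = F(xp)
            for i in range(2):
                J[i][j] = (fp[i] - fx[i]) / h
        # Solve J * dx = -fx (Cramer's rule for the 2x2 case)
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        x[0] += (-fx[0] * J[1][1] + fx[1] * J[0][1]) / det
        x[1] += (fx[0] * J[1][0] - fx[1] * J[0][0]) / det
    return x

# Toy system: x^2 + y^2 = 1 and y = x^3 (a stand-in, not the iodine chemistry).
root = newton(lambda v: [v[0]**2 + v[1]**2 - 1, v[1] - v[0]**3], [0.8, 0.5])
```

As the abstract notes, systems like this can have multiple roots, so convergence depends on the initial guess passed in.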

  4. Case mix management education in a Canadian hospital.

    PubMed

    Moffat, M; Prociw, M

    1992-01-01

    The Sunnybrook Health Science Centre's matrix organization model includes a traditional departmental structure, a strategic program-based structure and a case management-based structure--the Clinical Unit structure. The Clinical Unit structure allows the centre to give responsibility for the management of case mix and volume to decentralized Clinical Unit teams, each of which manages its own budget. To train physicians and nurses in their respective roles of Medical Unit directors and Nursing Unit directors, Sunnybrook designed unique short courses on financial management and budgeting, and case-costing and case mix management. This paper discusses how these courses were organized, details their contents and explains how they fit into Sunnybrook's program of decentralized management.

  5. Mechanistic modeling of biocorrosion caused by biofilms of sulfate reducing bacteria and acid producing bacteria.

    PubMed

    Xu, Dake; Li, Yingchao; Gu, Tingyue

    2016-08-01

    Biocorrosion is also known as microbiologically influenced corrosion (MIC). Most anaerobic MIC cases can be classified into two major types. Type I MIC involves non-oxygen oxidants such as sulfate and nitrate that require biocatalysis for their reduction in the cytoplasm of microbes such as sulfate reducing bacteria (SRB) and nitrate reducing bacteria (NRB). This means that the extracellular electrons from the oxidation of metal such as iron must be transported across cell walls into the cytoplasm. Type II MIC involves oxidants such as protons that are secreted by microbes such as acid producing bacteria (APB). The biofilms in this case supply the locally high concentrations of oxidants that are corrosive without biocatalysis. This work describes a mechanistic model that is based on the biocatalytic cathodic sulfate reduction (BCSR) theory. The model utilizes charge transfer and mass transfer concepts to describe the SRB biocorrosion process. The model also includes a mechanism to describe APB attack based on the local acidic pH at a pit bottom. A pitting prediction software package has been created based on the mechanisms. It predicts long-term pitting rates and worst-case scenarios after calibration using SRB short-term pit depth data. Various parameters can be investigated through computer simulation. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Reduction in symptomatic malaria prevalence through proactive community treatment in rural Senegal.

    PubMed

    Linn, Annē M; Ndiaye, Youssoupha; Hennessee, Ian; Gaye, Seynabou; Linn, Patrick; Nordstrom, Karin; McLaughlin, Matt

    2015-11-01

    We piloted a community-based proactive malaria case detection model in rural Senegal to evaluate whether this model can increase testing and treatment and reduce the prevalence of symptomatic malaria in target communities. Home care providers conducted weekly sweeps of every household in their village throughout the transmission season to identify patients with symptoms of malaria, perform rapid diagnostic tests (RDT) on symptomatic patients and provide treatment for positive cases. The model was implemented in 15 villages from July to November 2013, the high transmission season. Fifteen comparison villages were chosen from those implementing Senegal's original, passive model of community case management of malaria. Three sweeps were conducted in the comparison villages to compare the prevalence of symptomatic malaria using difference-in-differences analysis. At baseline, prevalence of symptomatic malaria confirmed by RDT for all symptomatic individuals found during sweeps was similar in both sets of villages (P = 0.79). At endline, prevalence was 16 times higher in the comparison villages than in the intervention villages (P = 0.003). Adjusting for potential confounders, the intervention was associated with a 30-fold reduction in the odds of symptomatic malaria in the intervention villages (AOR = 0.033; 95% CI: 0.017, 0.065). Treatment seeking also increased in the intervention villages, with 57% of consultations by home care providers conducted between sweeps through routine community case management. This pilot study suggests that community-based proactive case detection reduces symptomatic malaria prevalence, likely through more timely case management and improved care-seeking behaviour. A randomised controlled trial is needed to further evaluate the impact of this model. © 2015 John Wiley & Sons Ltd.
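The difference-in-differences estimate used in the study compares the change over time in the intervention group with the change in the comparison group. A minimal sketch with invented prevalence figures (not the study's data):

```python
# Hedged difference-in-differences sketch: the estimated intervention effect is
# the pre-to-post change in the intervention group minus the pre-to-post change
# in the comparison group, which nets out shared time trends.
def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Illustrative symptomatic-malaria prevalences (invented, not the study's data):
effect = diff_in_diff(0.12, 0.01, 0.12, 0.16)
```

A negative effect here means prevalence fell in the intervention villages relative to the comparison villages.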

  7. RACORO continental boundary layer cloud investigations. Part I: Case study development and ensemble large-scale forcings

    DOE PAGES

    Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; ...

    2015-06-19

    Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60-hour case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in-situ measurements from the RACORO field campaign and remote-sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be ~0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing datasets are derived from the ARM variational analysis, ECMWF forecasts, and a multi-scale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in 'trial' large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.

  8. RACORO Continental Boundary Layer Cloud Investigations: 1. Case Study Development and Ensemble Large-Scale Forcings

    NASA Technical Reports Server (NTRS)

    Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang; hide

    2015-01-01

    Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, kappa, are derived from observations to be approximately 0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.

  9. RACORO continental boundary layer cloud investigations: 1. Case study development and ensemble large-scale forcings

    NASA Astrophysics Data System (ADS)

    Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang; Li, Zhijin; Xie, Shaocheng; Ackerman, Andrew S.; Zhang, Minghua; Khairoutdinov, Marat

    2015-06-01

    Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be 0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.
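Fitting a lognormal function to an aerosol number size distribution, as done for the RACORO aircraft measurements, reduces each mode to a total number N, a geometric-median diameter Dg, and a geometric standard deviation sigma_g. A minimal method-of-moments sketch on invented particle diameters (not the campaign's actual fitting code):

```python
import math

# Hedged sketch: with particle-by-particle diameters, the geometric moments of
# the log-diameters give the lognormal parameters directly.
def fit_lognormal(diameters_nm):
    logs = [math.log(d) for d in diameters_nm]
    mu = sum(logs) / len(logs)                     # mean of log-diameters
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return {"N": len(diameters_nm),
            "Dg_nm": math.exp(mu),                 # geometric-median diameter
            "sigma_g": math.exp(math.sqrt(var))}   # geometric std. deviation

# Illustrative diameters in nanometres (invented):
params = fit_lognormal([80, 100, 100, 125, 100])
```

In practice the campaign data are binned size distributions, so the fit would minimize the misfit between the measured bins and the lognormal form rather than use raw moments, but the parameterization is the same.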

  10. Application of spectral decomposition algorithm for mapping water quality in a turbid lake (Lake Kasumigaura, Japan) from Landsat TM data

    NASA Astrophysics Data System (ADS)

    Oyama, Youichi; Matsushita, Bunkei; Fukushima, Takehiko; Matsushige, Kazuo; Imai, Akio

    The remote sensing of Case 2 water has been far less successful than that of Case 1 water, due mainly to the complex interactions among optically active substances (e.g., phytoplankton, suspended sediments, colored dissolved organic matter, and water) in the former. To address this problem, we developed a spectral decomposition algorithm (SDA) based on a spectral linear mixture modeling approach. Through a tank experiment, we found that the SDA-based models were superior to conventional empirical models (e.g., using a single band, a band ratio, or an arithmetic combination of bands) for accurate estimates of water quality parameters. In this paper, we develop a method for applying the SDA to Landsat-5 TM data on Lake Kasumigaura, a eutrophic lake in Japan characterized by high concentrations of suspended sediment, for mapping chlorophyll-a (Chl-a) and non-phytoplankton suspended sediment (NPSS) distributions. The results show that the SDA-based estimation model can be obtained from a tank experiment. Moreover, by combining this estimation model with satellite-SRSs (standard reflectance spectra, i.e., spectral end-members) derived from bio-optical modeling, we can directly apply the model to a satellite image. The same SDA-based estimation model for Chl-a concentration was applied to two Landsat-5 TM images, one acquired in April 1994 and the other in February 2006. The average Chl-a estimation error between the two was 9.9%, a result that indicates the potential robustness of the SDA-based estimation model. The average estimation error of NPSS concentration from the 2006 Landsat-5 TM image was 15.9%. The key point for successfully applying the SDA-based estimation model to satellite data is the method used to obtain a suitable satellite-SRS for each end-member.
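The spectral linear mixture idea underlying the SDA treats an observed spectrum as a weighted sum of end-member spectra and recovers the weights by least squares. A minimal sketch with invented two-end-member, three-band spectra (the real SDA uses measured standard reflectance spectra over more bands):

```python
# Hedged sketch of linear spectral unmixing: obs ≈ w1*e1 + w2*e2, solved by
# ordinary least squares via the 2x2 normal equations.
def unmix(e1, e2, obs):
    a = sum(x * x for x in e1)
    b = sum(x * y for x, y in zip(e1, e2))
    c = sum(y * y for y in e2)
    r1 = sum(x * o for x, o in zip(e1, obs))
    r2 = sum(y * o for y, o in zip(e2, obs))
    det = a * c - b * b
    return (c * r1 - b * r2) / det, (a * r2 - b * r1) / det

chl = [0.02, 0.05, 0.30]   # phytoplankton-like end-member (invented)
sed = [0.10, 0.12, 0.15]   # sediment-like end-member (invented)
obs = [0.4 * x + 0.6 * y for x, y in zip(chl, sed)]  # synthetic mixture
w_chl, w_sed = unmix(chl, sed, obs)
```

Because the synthetic observation is an exact mixture, the weights are recovered exactly; with real satellite spectra the residual of the fit measures how well the chosen end-members explain the pixel.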

  11. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    PubMed

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation, using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovery of the relationships among the key features extracted, and (iv) graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper for visually analyzing the resulting process model to identify process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Additive Partial Least Squares for efficient modelling of independent variance sources demonstrated on practical case studies.

    PubMed

    Luoma, Pekka; Natschläger, Thomas; Malli, Birgit; Pawliczek, Marcin; Brandstetter, Markus

    2018-05-12

    A model recalibration method based on additive Partial Least Squares (PLS) regression is generalized for multi-adjustment scenarios of independent variance sources (referred to as additive PLS - aPLS). aPLS allows for effortless model readjustment under changing measurement conditions and the combination of independent variance sources with the initial model by means of additive modelling. We demonstrate these distinguishing features on two NIR spectroscopic case studies. In case study 1, aPLS was used as a readjustment method for an emerging offset. The achieved RMS error of prediction (1.91 a.u.) was of a similar level as before the offset occurred (2.11 a.u.). In case study 2, a calibration combining different variance sources was conducted. The achieved performance was sufficient, with an absolute error below 0.8% of the mean concentration, thereby compensating the negative effects of two independent variance sources. The presented results show the applicability of the aPLS approach. The main advantages of the method are that the original model stays unadjusted and that the modelling is conducted on concrete changes in the spectra, thus supporting efficient (in most cases straightforward) modelling. Additionally, the method is put into the context of existing machine learning algorithms. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. The Political Economy of Interlibrary Organizations: Two Case Studies.

    ERIC Educational Resources Information Center

    Townley, Charles T.

    J. Kenneth Benson's political economy model for interlibrary cooperation identifies linkages and describes interactions between the environment, the interlibrary organization, and member libraries. A tentative general model for interlibrary organizations based on the Benson model was developed, and the fit of this adjusted model to the realities…

  14. Sparsity-based Poisson denoising with dictionary learning.

    PubMed

    Giryes, Raja; Elad, Michael

    2014-12-01

    The problem of Poisson denoising appears in various imaging applications, such as low-light photography, medical imaging, and microscopy. In cases of high SNR, several transformations exist that convert the Poisson noise into additive, independent and identically distributed Gaussian noise, for which many effective algorithms are available. However, in the low-SNR regime, these transformations are significantly less accurate, and a strategy that relies directly on the true noise statistics is required. Salmon et al. took this route, proposing a patch-based exponential image representation model based on a Gaussian mixture model, leading to state-of-the-art results. In this paper, we propose to harness sparse-representation modeling of the image patches, adopting the same exponential idea. Our scheme uses a greedy pursuit with a bootstrapping-based stopping condition and dictionary learning within the denoising process. The reconstruction performance of the proposed scheme is competitive with leading methods at high SNR and achieves state-of-the-art results in cases of low SNR.
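The variance-stabilizing transformations alluded to for the high-SNR case are commonly the Anscombe transform and its variants: A(x) = 2*sqrt(x + 3/8) maps Poisson counts to approximately unit-variance Gaussian data, after which any Gaussian denoiser can be applied. A minimal sketch of the classical transform (not the paper's sparse-coding method, which targets the low-SNR regime where this approximation breaks down):

```python
import math

# Anscombe variance-stabilizing transform for Poisson data and a simple
# algebraic inverse (exact unbiased inverses exist but are more involved).
def anscombe(x):
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    return (y / 2.0) ** 2 - 3.0 / 8.0

counts = [0, 1, 4, 25, 100]
roundtrip = [inverse_anscombe(anscombe(c)) for c in counts]
```

After the transform, a standard Gaussian denoiser is applied to the stabilized values, and the inverse maps the result back to the count domain.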

  15. Agile Implementation: A Blueprint for Implementing Evidence-Based Healthcare Solutions.

    PubMed

    Boustani, Malaz; Alder, Catherine A; Solid, Craig A

    2018-03-07

    To describe the essential components of an Agile Implementation (AI) process, which rapidly and effectively implements evidence-based healthcare solutions, and present a case study demonstrating its utility. Case demonstration study. Integrated, safety net healthcare delivery system in Indianapolis. Interdisciplinary team of clinicians and administrators. Reduction in dementia symptoms and caregiver burden; inpatient and outpatient care expenditures. Implementation scientists were able to implement a collaborative care model for dementia care and sustain it for more than 9 years. The model was implemented and sustained by using the elements of the AI process: proactive surveillance and confirmation of clinical opportunities, selection of the right evidence-based healthcare solution, localization (i.e., tailoring to the local environment) of the selected solution, development of an evaluation plan and performance feedback loop, development of a minimally standardized operation manual, and updating such manual annually. The AI process provides an effective model to implement and sustain evidence-based healthcare solutions. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.

  16. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  17. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package.

    PubMed

    El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter

    2012-01-01

    Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org.

  18. Whole vertebral bone segmentation method with a statistical intensity-shape model based approach

    NASA Astrophysics Data System (ADS)

    Hanaoka, Shouhei; Fritscher, Karl; Schuler, Benedikt; Masutani, Yoshitaka; Hayashi, Naoto; Ohtomo, Kuni; Schubert, Rainer

    2011-03-01

    An automatic segmentation algorithm for the vertebrae in human body CT images is presented. In particular, we focused on constructing and utilizing four different statistical intensity-shape combined models for the cervical, upper thoracic, lower thoracic and lumbar vertebrae, respectively. For this purpose, two previously reported methods were combined: a deformable model-based initial segmentation method and a statistical shape-intensity model-based precise segmentation method. The former is used as pre-processing to detect the position and orientation of each vertebra, which determines the initial condition for the latter precise segmentation method. The precise segmentation method needs prior knowledge of both the intensities and the shapes of the objects. After PCA of such shape-intensity expressions obtained from training image sets, vertebrae were parametrically modeled as a linear combination of the principal component vectors. The segmentation of each target vertebra was performed by fitting this parametric model to the target image by maximum a posteriori estimation, combined with the geodesic active contour method. In an experiment using 10 cases, the initial segmentation was successful in 6 cases and only partially failed in 4 (2 in the cervical area and 2 in the lumbo-sacral area). In the precise segmentation, the mean error distances were 2.078, 1.416, 0.777 and 0.939 mm for the cervical, upper thoracic, lower thoracic and lumbar spines, respectively. In conclusion, our automatic segmentation algorithm for the vertebrae in human body CT images showed a fair performance for cervical, thoracic and lumbar vertebrae.
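The parametric model described above represents each vertebra as the training mean plus a linear combination of principal component vectors, so fitting amounts to estimating the coefficients. A minimal sketch with invented 4-dimensional stand-ins for the high-dimensional shape-intensity vectors:

```python
# Hedged PCA-model sketch: sample ≈ mean + sum_k b_k * v_k over orthonormal
# components; for orthonormal v_k the coefficients are simple projections.
def project(target, mean, components):
    """Return coefficients b_k = <target - mean, v_k>."""
    centered = [t - m for t, m in zip(target, mean)]
    return [sum(c * v for c, v in zip(centered, comp)) for comp in components]

def reconstruct(mean, components, coeffs):
    out = list(mean)
    for bk, comp in zip(coeffs, components):
        for i, v in enumerate(comp):
            out[i] += bk * v
    return out

mean = [1.0, 1.0, 1.0, 1.0]                             # invented training mean
comps = [[0.5, 0.5, 0.5, 0.5], [0.5, -0.5, 0.5, -0.5]]  # orthonormal components
target = [2.0, 0.0, 2.0, 0.0]
b = project(target, mean, comps)
approx = reconstruct(mean, comps, b)
```

In the actual algorithm these coefficients are regularized by the training distribution (the maximum a posteriori step) rather than fit by unconstrained projection.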

  19. Mechanical energy of the trunk during walking--does the model used influence the results?

    PubMed

    Syczewska, Małgorzata

    2009-01-01

    The paper presents two trunk models. In the first, the trunk is modelled as a series of seven segments whose dimensions and inertial properties are parametrically based on body stature and body mass. In the second, the trunk is modelled as one rigid segment. These models are used to calculate the kinetic energy of the trunk's movement relative to the body centre of mass. The results show that in the case of healthy subjects both models give similar results, but in the case of stroke subjects the simplified model leads to underestimation of the energy and does not reflect all phases of gait in which energy is generated.
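The quantity the two models compute is the kinetic energy of trunk motion relative to the body centre of mass, KE = (1/2) m |v_segment - v_COM|^2 summed over segments. A minimal sketch with invented masses and velocities; it also illustrates why the models agree when all segments move together, as in healthy gait:

```python
# Hedged sketch: sum of 0.5 * m * |v_segment - v_COM|^2 over trunk segments.
# Masses (kg) and velocities (m/s, 3-D) are invented for illustration.
def relative_kinetic_energy(segments, v_com):
    ke = 0.0
    for mass, v in segments:
        rel = [a - b for a, b in zip(v, v_com)]
        ke += 0.5 * mass * sum(c * c for c in rel)
    return ke

v_com = (1.0, 0.0, 0.0)
# One rigid trunk segment vs. the same mass split into two segments that
# happen to move identically (the healthy-gait situation):
one = relative_kinetic_energy([(30.0, (1.2, 0.1, 0.0))], v_com)
two = relative_kinetic_energy([(15.0, (1.2, 0.1, 0.0)),
                               (15.0, (1.2, 0.1, 0.0))], v_com)
```

When segments move with different velocities, as in pathological gait, the multi-segment sum exceeds the single-segment estimate, which is the underestimation the abstract reports.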

  20. Reconstructing Exposures from Biomarkers using Exposure-Pharmacokinetic Modeling - A Case Study with Carbaryl

    EPA Science Inventory

    Sources of uncertainty involved in exposure reconstruction for a short half-life chemical, carbaryl, were characterized using the Cumulative and Aggregate Risk Evaluation System (CARES), an exposure model, and a human physiologically based pharmacokinetic (PBPK) model. CARES was...

  1. Tourism Village Model Based on Local Indigenous: Case Study of Nongkosawit Tourism Village, Gunungpati, Semarang

    NASA Astrophysics Data System (ADS)

    Kurniasih; Nihayah, Dyah Maya; Sudibyo, Syafitri Amalia; Winda, Fajri Nur

    2018-02-01

    Officially, Nongkosawit Village became a tourism village in 2012. However, the economic impact has not yet reached the community because of an inappropriate tourism village model. Therefore, this study aims to find the best model for the development of Nongkosawit Tourism Village. This research used the Analytical Hierarchy Process method. The results of this research show that the tourism village model best suited to the local indigenous character of Nongkosawit Tourism Village was the culture-based tourism village, with a weight of 58%. Therefore, it is necessary to reorient from the nature-based village model to the culture-based village model by raising and exploring the existing culture through unique and distinctive tourism products.
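In the Analytical Hierarchy Process, a priority vector is derived from a pairwise comparison matrix on the Saaty 1-9 scale; the row geometric-mean method is a standard approximation to the principal eigenvector. A minimal sketch with an invented comparison matrix (not the study's survey data):

```python
import math

# Hedged AHP sketch: normalize the geometric mean of each row of a pairwise
# comparison matrix to approximate the priority (weight) vector.
def ahp_priorities(matrix):
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Three hypothetical alternatives, e.g. culture- vs nature- vs craft-based:
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0]]
weights = ahp_priorities(pairwise)
```

A full AHP analysis would also check the consistency ratio of each matrix before accepting the weights.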

  2. Moderating Factors of Video-Modeling with Other as Model: A Meta-Analysis of Single-Case Studies

    ERIC Educational Resources Information Center

    Mason, Rose A.; Ganz, Jennifer B.; Parker, Richard I.; Burke, Mack D.; Camargo, Siglia P.

    2012-01-01

    Video modeling with other as model (VMO) is a more practical method for implementing video-based modeling techniques, such as video self-modeling, which requires significantly more editing. Despite this, identification of contextual factors such as participant characteristics and targeted outcomes that moderate the effectiveness of VMO has not…

  3. A study on the predictability of acute lymphoblastic leukaemia response to treatment using a hybrid oncosimulator.

    PubMed

    Ouzounoglou, Eleftherios; Kolokotroni, Eleni; Stanulla, Martin; Stamatakos, Georgios S

    2018-02-06

    Efficient use of Virtual Physiological Human (VPH)-type models for personalized treatment response prediction requires precise model parameterization. In cases where the available personalized data are not sufficient to fully determine the parameter values, an appropriate prediction task may be followed. In this study, a hybrid combination of computational optimization and machine learning methods with an already developed mechanistic model, the acute lymphoblastic leukaemia (ALL) Oncosimulator, which simulates ALL progression and treatment response, is presented. These methods are used so that the parameters of the model can be estimated for retrospective cases and predicted for prospective ones. The parameter value prediction is based on a regression model trained on retrospective cases. The proposed Hybrid ALL Oncosimulator system has been evaluated in predicting the pre-phase treatment outcome in ALL. This has been achieved correctly for a significant percentage of the patient cases tested (approx. 70% of patients). Moreover, the system is capable of declining to classify cases for which the results are not trustworthy enough. In that case, potentially misleading predictions for a number of patients are avoided, while the classification accuracy for the remaining patient cases further increases. The results obtained are particularly encouraging regarding the soundness of the proposed methodologies and their relevance to the process of achieving clinical applicability of the proposed Hybrid ALL Oncosimulator system and of VPH models in general.

  4. Dragon pulse information management system (DPIMS): A unique model-based approach to implementing domain agnostic system of systems and behaviors

    NASA Astrophysics Data System (ADS)

    Anderson, Thomas S.

    2016-05-01

    The Global Information Network Architecture is an information technology based on Vector Relational Data Modeling, a unique computational paradigm, network certified by the US Army as the Dragon Pulse Information Management System. It is a network-available environment for modeling models, in which models are configured using domain-relevant semantics, use network-available systems, sensors, databases, and services as loosely coupled component objects, and are executable applications. Solutions are based on mission tactics, techniques, and procedures, and on subject matter input. Three recent Army use cases are discussed: (a) an ISR system of systems; (b) modeling and simulation behavior validation; (c) a networked digital library with behaviors.

  5. A 2D flood inundation model based on cellular automata approach

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Todini, Ezio

    2010-05-01

    In the past years, the cellular automata approach has been successfully applied to two-dimensional modelling of flood events. When used in experimental applications, models based on this approach have provided good results, comparable to those obtained with more complex 2D models; moreover, CA models have proven significantly faster and easier to apply than most existing models, and these features make them a valuable tool for flood analysis, especially when dealing with large areas. However, to date the real degree of accuracy of such models has not been demonstrated, since they have mainly been used in experimental applications, while very few comparisons with theoretical solutions have been made. Also, the use of an explicit solution scheme, which is inherent in cellular automata models, forces them to work only with small time steps, thus reducing model computation speed. The present work describes a cellular automata model based on the continuity and diffusive wave equations. Several model versions based on different solution schemes have been realized and tested in a number of numerical cases, both 1D and 2D, comparing the results with theoretical and numerical solutions. In all cases, the model performed well compared to the reference solutions and proved to be both stable and accurate. Finally, the version providing the best results in terms of stability was tested on a real flood event and compared with different hydraulic models. Again, the cellular automata model provided very good results, both in terms of computational speed and reproduction of the simulated event.
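    The continuity-plus-diffusive-wave idea described above can be sketched as a minimal cellular-automata update, in which each cell exchanges water with its neighbours in proportion to the difference in water surface elevation. This is an illustrative toy, not the authors' scheme; the conductance coefficient `k`, grid size, and time step are made-up values.

```python
import numpy as np

def ca_flood_step(bed, depth, dt=0.1, dx=1.0, k=0.5):
    """One explicit update of a toy CA flood model: water moves between
    4-connected cells in proportion to the head (water surface
    elevation) difference, a crude diffusive-wave stand-in."""
    new = depth.copy()
    for axis in (0, 1):
        wse = bed + new                      # water surface elevation
        hi = [slice(None), slice(None)]
        lo = [slice(None), slice(None)]
        hi[axis] = slice(1, None)
        lo[axis] = slice(None, -1)
        hi, lo = tuple(hi), tuple(lo)
        dh = wse[hi] - wse[lo]               # head difference between neighbours
        q = 0.25 * k * dh * dt / dx          # water exchanged per unit area
        # a donor cell cannot give away more water than it holds
        q = np.clip(q, -new[lo], new[hi])
        new[lo] += q
        new[hi] -= q
    return new

# a mound of water on a flat bed spreads out while conserving mass
bed = np.zeros((21, 21))
depth = np.zeros((21, 21))
depth[10, 10] = 1.0
d = depth
for _ in range(50):
    d = ca_flood_step(bed, d)
print(round(d.sum(), 6), d.max() < depth.max())
```

Because every exchanged volume is added to one cell and subtracted from its neighbour, mass is conserved exactly, which is the continuity half of the scheme; the small explicit time step mirrors the stability constraint the abstract mentions.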

  6. How to get the most out of your orthopaedic fellowship: thinking about practice-based learning.

    PubMed

    Templeman, David

    2012-09-01

    Practice-based learning and improvement is an important skill set to develop during an orthopaedic trauma fellowship and is 1 of the 6 core competencies stated by the ACGME. The review of clinic cases is best done using a few simple models to develop a structured approach for studying cases. Three common sense and easy-to-use strategies to improve clinical practice are as follows: performing each case three times, studying the 4 quadrants of patient outcomes, and the application of the Pareto 80/20 rule. These principles help to develop a structured approach for analyzing and thinking about practice-based experiences.

  7. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality needed to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  8. Application of Psychological Theories in Agent-Based Modeling: The Case of the Theory of Planned Behavior.

    PubMed

    Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo

    2018-01-01

    It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) to agent-based modeling: It is argued that this framework may be more helpful than others in developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the model proposed by the TPB inside computer simulations and suggests potential solutions, in the hope of helping to shorten the distance between the fields of psychology and computer science.

  9. An object-relational model for structured representation of medical knowledge.

    PubMed

    Koch, S; Risch, T; Schneider, W; Wagner, I V

    2006-07-01

    Domain-specific knowledge is often not static but continuously evolving. This is especially true for the medical domain. Furthermore, the lack of standardized structures for presenting knowledge makes it difficult or often impossible to assess new knowledge in the context of existing knowledge. Possibilities to compare knowledge easily and directly are often not given. It is therefore of utmost importance to create a model that allows for comparability, consistency, and quality assurance of medical knowledge in specific work situations. For this purpose, we have designed an object-relational model based on structured knowledge elements that are dynamically reusable by different multimedia-based tools for case-based documentation, disease course simulation, and decision support. With this model, high-level components, such as patient case reports or simulations of the course of a disease, and low-level components (e.g., diagnoses, symptoms, or treatments), as well as the relationships between these components, are modeled. The resulting schema has been implemented in AMOS II, an object-relational multi-database system supporting different views with regard to search and analysis depending on different work situations.

  10. Cadaver-based Necrotizing Fasciitis Model for Medical Training.

    PubMed

    Mohty, Kurt M; Cravens, Matthew G; Adamas-Rappaport, William J; Amini-Shervin, Bahareh; Irving, Steven C; Stea, Nicholas; Adhikari, Srikar; Amini, Richard

    2017-04-14

    Necrotizing fasciitis is a devastating infectious disease process that is characterized by extensive soft tissue necrosis along deep fascial planes, systemic toxicity, and high mortality. Ultrasound imaging is a rapid and non-invasive tool that can be used to help make the diagnosis of necrotizing fasciitis by identifying several distinctive sonographic findings. The purpose of this study is to describe the construction of a realistic diagnostic training model for necrotizing fasciitis using fresh frozen cadavers and common, affordable materials. Fresh, non-embalmed cadavers are presently used at medical institutions for various educational sessions, including cadaver-based ultrasound training. Details for the preparation and construction of a necrotizing fasciitis cadaver model are presented here. This paper shows that the images obtained from the cadaver model closely imitate the ultrasound appearance of fluid and gas seen in actual clinical cases of necrotizing fasciitis. It can therefore be concluded that this cadaver-based model produces high-quality sonographic images that simulate those found in true cases of necrotizing fasciitis and is ideal for demonstrating the sonographic findings of the disease.

  11. Remote control missile model test

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric body/tail fin data base was gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but can also be used as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analysis of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Comparisons between these data and calculations from the SWINT Euler code are also presented.

  12. A medical social work perspective on rehabilitation.

    PubMed

    Fugl-Meyer, Kerstin Sjögren

    2016-10-12

    This paper introduces a biopsychosocial model for use as a tool by medical social workers and other rehabilitation professionals for the descriptive analysis of the case history and follow-up of patients needing rehabilitative support. The model is based on action theory and emphasizes the demands on evidence-based clarification of the interplay between a subject's contextual life situation, their ability to act in order to realize their goals, and their emotional adaptation. Using clinical experience and literature searches, a standard operations procedure to adequately document the case history in clinical practice is suggested, thus providing strategies through which the work of medical social workers can be based on evidence. Some specific areas of concern for the medical social worker within the rehabilitation of disabled people are highlighted.

  13. Unified Deep Learning Architecture for Modeling Biology Sequence.

    PubMed

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

    Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequencing models, characteristics, such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences, usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions by designing the optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm to support the training of sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% based on multiple benchmarks.
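    The bidirectional recurrence, merge, and pooling operators the abstract describes can be sketched with plain tanh recurrent cells standing in for the LSTM/GRU units. All weights, sizes, and the pooling choice below are illustrative, not the paper's architecture; the point is that variable-length sequences map to a fixed-size representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_pass(xs, Wx, Wh, b):
    """Run a simple tanh RNN over a variable-length sequence."""
    h = np.zeros(Wh.shape[0])
    out = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        out.append(h)
    return np.array(out)

def bidirectional_encode(xs, params_f, params_b, pool="max"):
    """Bidirectional encoding with a merge (concatenate) operator and a
    pooling operator over time, as sketched in the abstract."""
    hf = rnn_pass(xs, *params_f)                  # forward direction
    hb = rnn_pass(xs[::-1], *params_b)[::-1]      # backward direction
    merged = np.concatenate([hf, hb], axis=1)     # merge per time step
    if pool == "max":                             # pool to a fixed-size vector
        return merged.max(axis=0)
    return merged.mean(axis=0)

d_in, d_h = 4, 8
make_params = lambda: (rng.normal(size=(d_h, d_in)) * 0.3,
                       rng.normal(size=(d_h, d_h)) * 0.3,
                       np.zeros(d_h))
pf, pb = make_params(), make_params()
# sequences of different lengths yield the same fixed-size representation
for T in (5, 9, 13):
    seq = rng.normal(size=(T, d_in))
    print(T, bidirectional_encode(seq, pf, pb).shape)
```

The pooling step is what lets downstream layers handle the variable sequence lengths the abstract highlights; swapping `pool="mean"` for `"max"` corresponds to choosing a different pooling operator.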

  14. Implementing inquiry-based kits within a professional development school model

    NASA Astrophysics Data System (ADS)

    Jones, Mark Thomas

    2005-07-01

    Implementation of guided inquiry teaching for the first time carries inherent problems for science teachers. Reform efforts on inquiry-based science teaching are often unsustainable and are not sensitive to teachers' needs and abilities as professionals. Professional development schools are meant to provide a research-based partnership between a public school and a university. These collaborations can provide support for the professional development of teachers. This dissertation reports a study focused on the implementation of inquiry-based science kits within the support of one of these collaborations. The researcher describes the difficulties and successful adaptations experienced by science teachers and how a coteaching model provided support. These types of data are needed in order to develop a bottom-up, sustainable process that will allow teachers to implement inquiry-based science. A qualitative methodology with "researcher as participant" was used in this study of two science teachers during 2002--2003. These two teachers were supported by a coteaching model, which included preservice teachers for each teacher as well as a supervising professor. Data were collected from the researcher's direct observations of coteachers' practice. Data were also collected from interviews and reflective pieces from the coteachers. Triangulation of the data on each teacher's case supported the validity of the findings. Case reports were prepared from these data for each classroom teacher. These case reports were used and cross-case analysis was conducted to search for major themes and findings in the study. Major findings described the hurdles teachers encounter, examples of adaptations observed in the teachers' cases and the supportive interactions with their coteachers while implementing the inquiry-based kits. In addition, the data were used to make recommendations for future training and use of the kits and the coteaching model. 
Results from this study showed that the kit's guided structure of inquiry and the collaboration both affected the inservice teachers in the following ways: The coteaching model supported behavioral and material management issues caused by the implementation of the kits; collaboration with preservice teachers created a "smaller-class-size" effect, which allowed teachers to attend to a smaller number of students for cooperative learning and assessment, and the elementary inservice teachers learned pedagogical strategies and science content from collaborating with secondary preservice teachers in kit use and from the kits' curriculum. Results were used as a self-study for future training and support for implementation of inquiry-based kits.

  15. Experiment evaluates ocean models and data assimilation in the Gulf Stream

    NASA Astrophysics Data System (ADS)

    Willems, Robert C.; Glenn, S. M.; Crowley, M. F.; Malanotte-Rizzoli, P.; Young, R. E.; Ezer, T.; Mellor, G. L.; Arango, H. G.; Robinson, A. R.; Lai, C.-C. A.

    Using data sets of known quality as the basis for comparison, a recent experiment explored the Gulf Stream region at 27°-47°N and 80°-50°W to assess the nowcast/forecast capability of specific ocean models and the impact of data assimilation. Scientists from five universities and the Naval Research Laboratory/Stennis Space Center participated in the Data Assimilation and Model Evaluation Experiment (DAMEE-GSR). DAMEE-GSR was based on case studies, each successively more complex, and was divided into three phases using case studies (data) from 1987 and 1988. Phase I evaluated the models' forecast capability using common initial conditions and comparing model forecast fields with observational data at forecast time over a 2-week period. Phase II added data assimilation and assessed its impact on forecast capability, using the same case studies as in Phase I, and Phase III added a 2-month case study overlapping some periods in Phases I and II.

  16. EXAMINING TATOOINE: ATMOSPHERIC MODELS OF NEPTUNE-LIKE CIRCUMBINARY PLANETS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    May, E. M.; Rauscher, E.

    2016-08-01

    Circumbinary planets experience a time-varying irradiation pattern as they orbit their two host stars. In this work, we present the first detailed study of the atmospheric effects of this irradiation pattern on known and hypothetical gaseous circumbinary planets. Using both a one-dimensional energy balance model (EBM) and a three-dimensional general circulation model (GCM), we look at the temperature differences between circumbinary planets and their equivalent single-star cases in order to determine the nature of the atmospheres of these planets. We find that for circumbinary planets on stable orbits around their host stars, temperature differences are on average no more than 1.0% in the most extreme cases. Based on detailed modeling with the GCM, we find that these temperature differences are not large enough to excite circulation differences between the two cases. We conclude that gaseous circumbinary planets can be treated as their equivalent single-star case in future atmospheric modeling efforts.
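    The basic effect, a time-varying irradiation that perturbs the equilibrium temperature only at the sub-percent level, can be illustrated with a zero-dimensional energy balance estimate. The stellar luminosities, binary separation, planet distance, and albedo below are made-up values for a simplified coplanar, equal-mass geometry, not the paper's model inputs.

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
AU = 1.495978707e11      # astronomical unit, m
LSUN = 3.828e26          # solar luminosity, W

def teq(flux, albedo=0.3):
    """Instantaneous equilibrium temperature for a given stellar flux."""
    return (flux * (1 - albedo) / (4 * SIGMA)) ** 0.25

def binary_flux(L1, L2, a_bin, a_pl, phase):
    """Total flux at a planet from two equal-mass stars on a circular
    inner orbit (planet held fixed over one binary period)."""
    x1, y1 = (a_bin / 2) * math.cos(phase), (a_bin / 2) * math.sin(phase)
    d1 = math.hypot(a_pl - x1, y1)           # distance to star 1
    d2 = math.hypot(a_pl + x1, y1)           # star 2 is diametrically opposite
    return L1 / (4 * math.pi * d1**2) + L2 / (4 * math.pi * d2**2)

# two 0.5 L_sun stars 0.1 au apart, planet at 1 au (illustrative values)
temps = [teq(binary_flux(0.5 * LSUN, 0.5 * LSUN, 0.1 * AU, 1.0 * AU,
                         i * 2 * math.pi / 360)) for i in range(360)]
single = teq(1.0 * LSUN / (4 * math.pi * AU**2))   # equivalent single star
variation = (max(temps) - min(temps)) / single * 100
print(round(single, 1), round(variation, 2))
```

Because equilibrium temperature scales as the quarter power of flux, a percent-level swing in irradiation over the binary phase translates to only a fraction-of-a-percent swing in temperature, consistent with the abstract's sub-1% finding.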

  17. Impact of meteorological factors on the incidence of bacillary dysentery in Beijing, China: A time series analysis (1970-2012).

    PubMed

    Yan, Long; Wang, Hong; Zhang, Xuan; Li, Ming-Yue; He, Juan

    2017-01-01

    The influence of meteorological variables on the transmission of bacillary dysentery (BD) is an under-investigated topic, and effective forecasting models to serve as public health tools are lacking. This paper aims to quantify the relationship between meteorological variables and BD cases in Beijing and to establish an effective forecasting model. A time series analysis was conducted for the Beijing area based upon monthly data on weather variables (i.e. temperature, rainfall, relative humidity, vapor pressure, and wind speed) and on the number of BD cases during the period 1970-2012. Autoregressive integrated moving average models with explanatory variables (ARIMAX) were built based on the data from 1970 to 2004. Predictions of monthly BD cases from 2005 to 2012 were made using the established models, and prediction accuracy was evaluated by the mean square error (MSE). First, temperature with 2-month and 7-month lags and rainfall with a 12-month lag were found to be positively correlated with the number of BD cases in Beijing. Second, the ARIMAX model with covariates of temperature at a 7-month lag (β = 0.021, 95% confidence interval (CI): 0.004-0.038) and rainfall at a 12-month lag (β = 0.023, 95% CI: 0.009-0.037) displayed the highest prediction accuracy. The ARIMAX model developed in this study showed a good fit and precise short-term prediction accuracy, which would help government departments take early public health measures to prevent and control possible BD outbreaks.
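    The core of the reported model, regressing monthly case counts on temperature at a 7-month lag and rainfall at a 12-month lag, can be sketched as a lagged-covariate regression on synthetic data. This is a plain least-squares stand-in for the full ARIMAX (no ARIMA error structure), and all series and coefficients below are simulated, not the Beijing data.

```python
import numpy as np

rng = np.random.default_rng(1)

def lag(x, k):
    """Lag-k covariate: the first k months have no predictor value (NaN)."""
    out = np.full(len(x), np.nan)
    out[k:] = x[:-k]
    return out

# synthetic monthly series standing in for the observed data
n = 240
t = np.arange(n)
temp = 15 + 12 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, n)
rain = np.clip(50 + 40 * np.sin(2 * np.pi * (t - 2) / 12)
               + rng.normal(0, 5, n), 0, None)
cases = 50 + 2.1 * lag(temp, 7) + 0.23 * lag(rain, 12) + rng.normal(0, 3, n)

# design matrix with the two lagged covariates; drop months without lags
X = np.column_stack([np.ones(n), lag(temp, 7), lag(rain, 12)])
ok = ~np.isnan(X).any(axis=1)
train = ok & (t < 192)               # fit on the earlier years...
test = ok & (t >= 192)               # ...and predict the later ones
beta, *_ = np.linalg.lstsq(X[train], cases[train], rcond=None)
mse = np.mean((X[test] @ beta - cases[test]) ** 2)
print(np.round(beta, 2), round(mse, 1))
```

Holding out the final years and scoring with MSE mirrors the evaluation design in the abstract (fit on 1970-2004, predict 2005-2012).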

  18. HDOCK: a web server for protein–protein and protein–DNA/RNA docking based on a hybrid strategy

    PubMed Central

    Yan, Yumeng; Zhang, Di; Zhou, Pei; Li, Botong

    2017-01-01

    Protein–protein and protein–DNA/RNA interactions play a fundamental role in a variety of biological processes. Determining the complex structures of these interactions is valuable, and molecular docking has played an important role in doing so. To automatically make use of the binding information from the PDB in docking, we present HDOCK, a novel web server implementing our hybrid docking algorithm of template-based modeling and free docking, in which cases with misleading templates can be rescued by the free docking protocol. The server supports protein–protein and protein–DNA/RNA docking and accepts both sequence and structure inputs for proteins. The docking process is fast, consuming about 10–20 min per run. Tested on cases with weakly homologous complexes of <30% sequence identity from five docking benchmarks, the HDOCK pipeline tied with template-based modeling on the protein–protein and protein–DNA benchmarks and performed better than template-based modeling on the three protein–RNA benchmarks when the top 10 predictions were considered. The performance of HDOCK improved when more predictions were considered. Combining the results of HDOCK and template-based modeling, by ranking the template-based model first, further improved the predictive power of the server. The HDOCK web server is available at http://hdock.phys.hust.edu.cn/. PMID:28521030

  19. Optimization of seasonal ARIMA models using differential evolution - simulated annealing (DESA) algorithm in forecasting dengue cases in Baguio City

    NASA Astrophysics Data System (ADS)

    Addawe, Rizavel C.; Addawe, Joel M.; Magadia, Joselito C.

    2016-10-01

    Accurate forecasting of dengue cases would significantly improve epidemic prevention and control capabilities. This paper attempts to provide useful models for forecasting dengue epidemics specific to the young and adult populations of Baguio City. To capture the seasonal variations in dengue incidence, this paper develops a robust modeling approach to identify and estimate seasonal autoregressive integrated moving average (SARIMA) models in the presence of additive outliers. Since least squares estimators are not robust in the presence of outliers, we suggest a robust estimation based on winsorized and reweighted least squares estimators. A hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is used to identify and estimate the parameters of the optimal SARIMA model. The method is applied to the monthly reported dengue cases in Baguio City, Philippines.
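    The hybrid idea, differential evolution mutation and crossover combined with a simulated-annealing acceptance rule, can be sketched on a toy curve-fitting objective. The operators, cooling schedule, and objective below are an illustrative reading of the DESA concept, not the authors' implementation for SARIMA parameter estimation.

```python
import numpy as np

rng = np.random.default_rng(2)

def desa_minimize(f, bounds, pop_size=20, iters=200, F=0.6, CR=0.9, T0=1.0):
    """Differential evolution where trial vectors are accepted by a
    simulated-annealing rule instead of pure greedy selection."""
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for k in range(iters):
        T = T0 * (1 - k / iters) + 1e-9          # linear cooling schedule
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(dim) < CR
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            # SA rule: always accept improvements, sometimes accept worse
            if ft < fit[i] or rng.random() < np.exp((fit[i] - ft) / T):
                pop[i], fit[i] = trial, ft
    j = fit.argmin()
    return pop[j], fit[j]

# toy objective: recover the amplitude and offset of a noisy seasonal signal
t = np.arange(120)
y = 3.0 * np.sin(2 * np.pi * t / 12) + 1.5 + rng.normal(0, 0.2, 120)
obj = lambda p: np.mean((p[0] * np.sin(2 * np.pi * t / 12) + p[1] - y) ** 2)
best, err = desa_minimize(obj, [(-10, 10), (-10, 10)])
print(np.round(best, 1), round(err, 3))
```

Early in the run the annealing temperature lets worse trial vectors survive, preserving diversity; as the temperature cools, the selection becomes effectively greedy, which is the usual motivation for hybridizing the two heuristics.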

  20. Costs of providing infusion therapy for patients with inflammatory bowel disease in a hospital-based infusion center setting.

    PubMed

    Afzali, Anita; Ogden, Kristine; Friedman, Michael L; Chao, Jingdong; Wang, Anthony

    2017-04-01

    Inflammatory bowel disease (IBD) (e.g. ulcerative colitis [UC] and Crohn's disease [CD]) severely impacts patient quality of life. Moderate-to-severe disease is often treated with biologics requiring infusion therapy, which adds incremental costs beyond drug costs. This study evaluates US hospital-based infusion service costs for the treatment of UC or CD patients receiving infliximab or vedolizumab therapy. A model was developed to estimate the annual costs of providing monitored infusions using an activity-based costing framework. Multiple sources (published literature, treatment product inserts) informed the base-case model input estimates. The total modeled per-patient infusion therapy costs in Year 1 with infliximab and vedolizumab were $38,782 and $41,320, respectively, and in Year 2+, $49,897 and $36,197, respectively. Drug acquisition cost was the largest driver of total costs (90-93%), followed by costs associated with hospital-based infusion provision: labor (53-56% of non-drug costs), allocated overhead (23%), non-labor (23%), and laboratory (7-10%). Limitations included reliance on published estimates, base-case cost estimates for infusion drugs and supplies that did not account for volume pricing, the assumption of a small hospital infusion center, and the fact that, because the model adopts the hospital perspective, costs to the patient were not included in the infusion administration base-case estimates. This model is an early step towards a framework for fully analyzing the costs associated with infusion therapies. Given the lack of published data, it would be beneficial for hospital administrators to assess total costs and trade-offs with alternative means of providing biologic therapies. This analysis highlights the value to hospital administrators of assessing the cost associated with infusion patient mix to make more informed resource allocation decisions.
As the landscape for reimbursement changes, tools for evaluating the costs of infusion therapy may help hospital administrators make informed choices and weigh trade-offs associated with providing infusion services for IBD patients.

  1. Integration of EEG lead placement templates into traditional technologist-based staffing models reduces costs in continuous video-EEG monitoring service.

    PubMed

    Kolls, Brad J; Lai, Amy H; Srinivas, Anang A; Reid, Robert R

    2014-06-01

    The purpose of this study was to determine the relative cost reductions within different staffing models for continuous video-electroencephalography (cvEEG) service by introducing a template system for 10/20 lead application. We compared six staffing models using decision tree modeling based on historical service line utilization data from the cvEEG service at our center. Templates were integrated into technologist-based service lines in six different ways. The six models studied were templates for all studies, templates for intensive care unit (ICU) studies, templates for on-call studies, templates for studies of ≤ 24-hour duration, technologists for on-call studies, and technologists for all studies. Cost was linearly related to the study volume for all models with the "templates for all" model incurring the lowest cost. The "technologists for all" model carried the greatest cost. Direct cost comparison shows that any introduction of templates results in cost savings, with the templates being used for patients located in the ICU being the second most cost efficient and the most practical of the combined models to implement. Cost difference between the highest and lowest cost models under the base case produced an annual estimated savings of $267,574. Implementation of the ICU template model at our institution under base case conditions would result in a $205,230 savings over our current "technologist for all" model. Any implementation of templates into a technologist-based cvEEG service line results in cost savings, with the most significant annual savings coming from using the templates for all studies, but the most practical implementation approach with the second highest cost reduction being the template used in the ICU. The lowered costs determined in this work suggest that a template-based cvEEG service could be supported at smaller centers with significantly reduced costs and could allow for broader use of cvEEG patient monitoring.
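    The study's finding that cost is linear in study volume for every staffing model can be sketched as a per-study cost comparison. The per-study dollar figures, annual volume, and ICU fraction below are hypothetical placeholders, not values from the paper; only the linear structure and the ordering of the models reflect the abstract.

```python
# Hypothetical per-study lead-application costs (placeholders, not study data)
TEMPLATE_COST = 40.0    # per study when a lead-placement template is used
TECH_COST = 180.0       # per study when a technologist applies leads

def annual_cost(volume, template_fraction):
    """Annual cvEEG lead-application cost when a given fraction of
    studies (e.g. only ICU studies) uses templates and the rest
    uses technologists; cost is linear in study volume."""
    per_study = (template_fraction * TEMPLATE_COST
                 + (1 - template_fraction) * TECH_COST)
    return volume * per_study

volume = 1500                       # hypothetical annual study volume
for name, frac in [("templates for all", 1.0),
                   ("templates for ICU only", 0.45),
                   ("technologists for all", 0.0)]:
    print(f"{name}: ${annual_cost(volume, frac):,.0f}")
```

Under any such parameterization, the "templates for all" model is cheapest and "technologists for all" most expensive, with mixed models (such as ICU-only templates) in between, matching the ordering reported in the abstract.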

  2. Solar Wind Plasma Interaction with Asteroid 16 Psyche: Implication for Formation Theories

    NASA Astrophysics Data System (ADS)

    Fatemi, Shahab; Poppe, Andrew R.

    2018-01-01

    The asteroid 16 Psyche is a primitive metal-rich asteroid that has not yet been visited by spacecraft. Based on remote observations, Psyche is most likely composed of iron and nickel metal; however, the history of its formation and solidification is still unknown. If Psyche is a remnant core of a differentiated planetesimal exposed by collisions, it opens a unique window toward understanding the cores of the terrestrial bodies, including the Earth and Mercury. If not, it is perhaps a reaccreted rubble pile that has never melted. In the former case, Psyche may have a remanent, dipolar magnetic field; in the latter case, Psyche may have no intrinsic field, but nevertheless would be a conductive object in the solar wind. We use Advanced Modeling Infrastructure in Space Simulation (AMITIS), a three-dimensional GPU-based hybrid model of plasma that self-consistently couples the interior electromagnetic response of Psyche (i.e., magnetic diffusion) to its ambient plasma environment in order to quantify the different interactions under these two cases. The model results provide estimates for the electromagnetic environment of Psyche, showing that the magnetized case and the conductive case present very different signatures in the solar wind. These results have implications for an accurate interpretation of magnetic field observations by NASA's Discovery mission (Psyche mission) to the asteroid 16 Psyche.

  3. Unsolved homicides in Sweden: A population-based study of 264 homicides.

    PubMed

    Sturup, Joakim; Karlberg, Daniel; Kristiansson, Marianne

    2015-12-01

    The clearance rates for homicides have decreased internationally. This retrospective population-based study of all Swedish homicide incidents between 2007 and 2009 (n=264) aims to investigate factors associated with solvability in homicides. Victims were identified in an autopsy registry and offenders in a criminal-conviction registry. Autopsy reports, police files, court verdicts, and criminal records were systematically collected and linked. The clearance rate was 86.4% (n=228), and almost three quarters of cases (71.9%) were solved within the first week. Nine factors were significantly associated with case status; however, only four remained significant in the multivariate logistic-regression model. Cases were more likely to be solved if there was an eyewitness and if the victim was intoxicated with alcohol. Moreover, cases were less likely to be solved if the victim had a criminal record in the past five years or was killed by a firearm. In the final model, a Cox proportional-hazards model in which time to arrest was taken into account, only alcohol intoxication (positively) and firearms (negatively) were significantly associated with clearance status. The study concludes that cases involving these factors should be granted extra, intensive, and lasting resources. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Magazine Influence on Cartridge Case Ejection Patterns with Glock Pistols.

    PubMed

    Kerkhoff, Wim; Alberink, Ivo; Mattijssen, Erwin J A T

    2018-01-01

    In this study, the cartridge case ejection patterns of six different Glock model pistols (one specimen per model) were compared under three conditions: firing with a loaded magazine, an empty magazine, and without magazine. The distances, covered by the ejected cartridge cases given these three conditions, were compared for each of the six models. A significant difference was found between the groups of data for each of the tested specimens. This indicates that it is important that, to reconstruct a shooting scene incident based on the ejection patterns of a pistol, test shots are fired with the same pistol type and under the correct magazine condition. © 2017 American Academy of Forensic Sciences.

  5. An integrated hypnotherapeutic model for the treatment of childhood sexual trauma: a case study.

    PubMed

    Fourie, Gerda; Guse, Tharina

    2011-01-01

    Sexual abuse appears to constitute a major risk factor for a variety of problems in adult life. The effects of abuse on adult living are not uniform; therefore, intervention strategies should be individualized to address unique symptom constellations. The purpose of this paper is to introduce an integrated Ericksonian and Ego state therapy approach, based on a strengths perspective, for the treatment of survivors of childhood sexual abuse. The theoretical foundation for this model is described, followed by a case study. The case study demonstrates how application of this model enabled the client to resolve the experience of sexual abuse, as well as to enhance her sense of general psychological well-being.

  6. Models of the First-Term Reenlistment Decision.

    DTIC Science & Technology

    1980-09-01

    [Indexed text consists of OCR fragments of the report's tables; no abstract is available. Recoverable notes: the numbers in parentheses show the number of cases in each cell; amounts were computed from October 1975 pay tables based on the individual's …]

  7. A discriminant analysis prediction model of non-syndromic cleft lip with or without cleft palate based on risk factors.

    PubMed

    Li, Huixia; Luo, Miyang; Luo, Jiayou; Zheng, Jianfei; Zeng, Rong; Du, Qiyun; Fang, Junqun; Ouyang, Na

    2016-11-23

    A risk prediction model of non-syndromic cleft lip with or without cleft palate (NSCL/P) was established by discriminant analysis to predict the individual risk of NSCL/P in pregnant women. A hospital-based case-control study was conducted with 113 cases of NSCL/P and 226 controls without NSCL/P. The cases and controls were obtained from 52 birth-defect surveillance hospitals in Hunan Province, China. A questionnaire was administered in face-to-face interviews to collect the variables relevant to NSCL/P. Logistic regression models were used to analyze the influencing factors of NSCL/P, and a stepwise Fisher discriminant analysis was subsequently used to construct the prediction model. In the univariate analysis, 13 influencing factors were related to NSCL/P, of which the following 8 determined the discriminant prediction model: family income, maternal occupational hazards exposure, premarital medical examination, housing renovation, milk/soymilk intake in the first trimester of pregnancy, paternal occupational hazards exposure, paternal strong tea drinking, and family history of NSCL/P. The model was statistically significant (lambda = 0.772, chi-square = 86.044, df = 8, P < 0.001). Self-verification showed that 83.8% of the participants were correctly predicted to be NSCL/P cases or controls, with a sensitivity of 74.3% and a specificity of 88.5%. The area under the receiver operating characteristic curve (AUC) was 0.846. The prediction model established using these risk factors can be useful for predicting the risk of NSCL/P. Further research is needed to improve the model and to confirm its validity and reliability.
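
    An AUC such as the 0.846 reported above can be estimated directly from model scores via the rank-based (Mann-Whitney) formulation, without plotting a curve. The sketch below is illustrative only: the scores are invented, not the study's data.

```python
# Rank-based (Mann-Whitney) estimate of the area under the ROC curve.
# The score lists below are hypothetical, not taken from the study.

def auc(case_scores, control_scores):
    """Probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count half)."""
    wins = 0.0
    for s in case_scores:
        for t in control_scores:
            if s > t:
                wins += 1.0
            elif s == t:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

cases = [0.9, 0.8, 0.7, 0.4]       # hypothetical discriminant scores for cases
controls = [0.6, 0.3, 0.2, 0.1]    # hypothetical scores for controls
print(auc(cases, controls))        # 15 of 16 case/control pairs ranked correctly
```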

  8. Partial Ambiguity Resolution for Ground and Space-Based Applications in a GPS+Galileo scenario: A simulation study

    NASA Astrophysics Data System (ADS)

    Nardo, A.; Li, B.; Teunissen, P. J. G.

    2016-01-01

    Integer Ambiguity Resolution (IAR) is the key to fast and precise GNSS positioning. The proper diagnostic metric for successful IAR is the ambiguity success rate, i.e. the probability of correct integer estimation. In this contribution we analyse the performance of different GPS+Galileo models in terms of the number of epochs needed to reach a pre-determined success rate, for various ground and space-based applications. The simulation-based, controlled model environment enables us to gain insight into the factors contributing to the ambiguity resolution strength of the different GPS+Galileo models. Different scenarios of modernized GPS+Galileo are studied, encompassing the long-baseline ground case as well as the medium-dynamics case (airplane) and the space-based Low Earth Orbiter (LEO) case. In our analyses of these models the capabilities of partial ambiguity resolution (PAR) are demonstrated and compared to the limitations of full ambiguity resolution (FAR). The results show that PAR is generally a more efficient way than FAR to reduce the time needed to achieve centimetre-level positioning precision. For long single baselines, PAR can achieve time reductions of fifty percent, while for multiple baselines it becomes even more effective, reaching reductions of up to eighty percent for four-station networks. For a LEO, the rapidly changing observation geometry does not allow FAR at all, while PAR is still possible for both dual- and triple-frequency scenarios. With the triple-frequency GPS+Galileo model the availability of precise positioning improves by fifteen percent with respect to the dual-frequency scenario.

  9. Image-based modeling of tumor shrinkage in head and neck radiation therapy1

    PubMed Central

    Chao, Ming; Xie, Yaoqin; Moros, Eduardo G.; Le, Quynh-Thu; Xing, Lei

    2010-01-01

    Purpose: Understanding the kinetics of tumor growth/shrinkage represents a critical step in quantitative assessment of therapeutics and realization of adaptive radiation therapy. This article presents a novel framework for image-based modeling of tumor change and demonstrates its performance with synthetic images and clinical cases. Methods: Due to significant tumor tissue content changes, similarity-based models are not suitable for describing the process of tumor volume changes. Under the hypothesis that tissue features in a tumor volume or at the boundary region are partially preserved, the kinetic change was modeled in two steps: (1) Autodetection of homologous tissue features shared by two input images using the scale invariance feature transformation (SIFT) method; and (2) establishment of a voxel-to-voxel correspondence between the images for the remaining spatial points by interpolation. The correctness of the tissue feature correspondence was assured by a bidirectional association procedure, in which SIFT features were mapped from template to target images and in reverse. A series of digital phantom experiments and five head and neck clinical cases were used to assess the performance of the proposed technique. Results: The proposed technique can faithfully identify the known changes introduced when constructing the digital phantoms. The subsequent feature-guided thin plate spline calculation reproduced the “ground truth” with accuracy better than 1.5 mm. For the clinical cases, the new algorithm worked reliably for a volume change as large as 30%. Conclusions: An image-based tumor kinetic algorithm was developed to model the tumor response to radiation therapy. The technique provides a practical framework for future application in adaptive radiation therapy. PMID:20527569

  10. Modeling the dissipation rate in rotating turbulent flows

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.; Raj, Rishi; Gatski, Thomas B.

    1990-01-01

    A variety of modifications to the modeled dissipation rate transport equation that have been proposed during the past two decades to account for rotational strains are examined. The models are subjected to two crucial test cases: the decay of isotropic turbulence in a rotating frame and homogeneous shear flow in a rotating frame. It is demonstrated that these modifications do not yield substantially improved predictions for these two test cases and in many instances give rise to unphysical behavior. An alternative proposal, based on the use of the tensor dissipation rate, is made for the development of improved models.

  11. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-10-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs simulating conditions as they occurred during August through September 2006 (a period representative of conditions leading to high ozone), and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between the 2, 4 and 12 km resolution runs, but the 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements motivated by Executive Order 12866 as it applies to the Clean Air Act, the uncertainty associated with human health impacts and therefore the results reported in this study, we conclude that health impacts calculated from population weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2, 4 or 12 km resolution. 
On average, when modeling at 36 km resolution, an estimated 5 deaths per week during the May through September ozone season are avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval was 2-8). When modeling at 2, 4 or 12 km finer scale resolution, on average 4 deaths are avoided due to the same reductions (95% confidence interval was 1-7). Study results show that ozone modeling at a resolution finer than 12 km is unlikely to reduce uncertainty in benefits analysis for this specific region. We suggest that 12 km resolution may be appropriate for uncertainty analyses of health impacts due to ozone control scenarios, in areas with similar chemistry, meteorology and population density, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
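
    A population-weighted concentration of the kind fed into the concentration-response functions above is simply a weighted average over grid cells. This is a minimal sketch with invented cell values, not the study's CAMx output.

```python
# Population-weighted concentration across model grid cells.
# Cell concentrations and populations below are hypothetical.

def population_weighted(concentrations, populations):
    """Weighted mean concentration, weighting each cell by its population."""
    total_pop = sum(populations)
    return sum(c * p for c, p in zip(concentrations, populations)) / total_pop

ozone_ppb = [62.0, 75.0, 80.0]       # hypothetical cell-average ozone (ppb)
population = [100000, 50000, 25000]  # hypothetical cell populations
print(population_weighted(ozone_ppb, population))
```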

  12. Immigration, Racial Profiling, and White Privilege: Community-Based Challenges and Practices for Adult Educators

    ERIC Educational Resources Information Center

    Kong, Luis J.

    2010-01-01

    In this chapter, the author will explore the significance of race from a social constructionist perspective. He will focus on immigration laws and on examples of legal cases that have set the stage for current definitions of whiteness and racial identification. A community-based transformational organizing model will be presented. The model will…

  13. The Preliminary Investigation of the Factors that Influence the E-Learning Adoption in Higher Education Institutes: Jordan Case Study

    ERIC Educational Resources Information Center

    Al-hawari, Maen; Al-halabi, Sanaa

    2010-01-01

    Creativity and high performance in learning processes are the main concerns of educational institutions. E-learning contributes to the creativity and performance of these institutions and reproduces a traditional learning model based primarily on knowledge transfer into more innovative models based on collaborative learning. In this paper, the…

  14. An ontology-driven, case-based clinical decision support model for removable partial denture design

    NASA Astrophysics Data System (ADS)

    Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao

    2016-06-01

    We present the initial work toward developing a clinical decision support model for specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient’s oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting the true-positive rate against the false-positive rate at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67, and at the top of the curve it was 0.96. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74; all of these metrics demonstrated the efficient performance of our model. This methodology merits further research and development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.

  15. An ontology-driven, case-based clinical decision support model for removable partial denture design.

    PubMed

    Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao

    2016-06-14

    We present the initial work toward developing a clinical decision support model for specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient's oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting the true-positive rate against the false-positive rate at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67, and at the top of the curve it was 0.96. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74; all of these metrics demonstrated the efficient performance of our model. This methodology merits further research and development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.
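
    Cosine-similarity case retrieval of the kind described above can be sketched in a few lines, assuming cases are encoded as numeric feature vectors. The feature encodings and case names below are invented for illustration; the paper's actual ontology attributes are not reproduced.

```python
# Minimal case-based retrieval by cosine similarity over feature vectors.
# Case names and feature encodings are hypothetical.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical binary encodings of oral-condition features for stored cases.
case_base = {
    "case_A": [1, 0, 1, 1],
    "case_B": [0, 1, 0, 1],
    "case_C": [1, 1, 1, 1],
}
patient = [1, 0, 1, 0]  # hypothetical input patient encoding

# Rank stored cases by similarity to the input patient, most similar first.
ranked = sorted(case_base,
                key=lambda c: cosine_similarity(patient, case_base[c]),
                reverse=True)
print(ranked[0])  # most similar case, whose design would be retrieved
```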

  16. Total inpatient treatment costs in patients with severe burns: towards a more accurate reimbursement model.

    PubMed

    Mehra, Tarun; Koljonen, Virve; Seifert, Burkhardt; Volbracht, Jörk; Giovanoli, Pietro; Plock, Jan; Moos, Rudolf Maria

    2015-01-01

    Reimbursement systems have difficulties depicting the actual cost of burn treatment, leaving care providers with a significant financial burden. Our aim was to establish a simple and accurate reimbursement model compatible with prospective payment systems. A total of 370 966 electronic medical records of patients discharged in 2012 to 2013 from Swiss university hospitals were reviewed. A total of 828 cases of burns, including 109 cases of severe burns, were retained. Costs, revenues and earnings for severe and nonsevere burns were analysed, and a linear regression model predicting total inpatient treatment costs was established. The median total cost per case for severe burns was tenfold higher than for nonsevere burns (179 949 CHF [167 353 EUR] vs 11 312 CHF [10 520 EUR], interquartile ranges 96 782 to 328 618 CHF vs 4 874 to 27 783 CHF, p < 0.001). The median earnings per case for nonsevere burns were 588 CHF (547 EUR) (interquartile range -6 720 to 5 354 CHF), whereas severe burns incurred a large financial loss to care providers, with median earnings of -33 178 CHF (-30 856 EUR) (interquartile range -95 533 to 23 662 CHF). Differences were highly significant (p < 0.001). Our linear regression model predicting total costs per case with length of stay (LOS) as the independent variable had an adjusted R2 of 0.67 (p < 0.001 for LOS). Severe burns are systematically underfunded within the Swiss reimbursement system. Flat-rate DRG-based refunds poorly reflect the actual treatment costs. In conclusion, we suggest a reimbursement model based on a per diem rate for the treatment of severe burns.
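
    A one-variable least-squares model of cost against length of stay, as used in the study, has a simple closed form. The data points and coefficients below are invented for illustration; the study's fitted model is not reproduced here.

```python
# Ordinary least squares with one predictor: total cost per case vs LOS.
# Data points are hypothetical; the example data happen to lie exactly on
# cost = 3000 * LOS + 2000, so the fit recovers those coefficients.

def fit_ols(xs, ys):
    """Closed-form slope and intercept minimizing squared error."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

los = [2, 5, 10, 20, 40]                        # hypothetical days
cost = [8000, 17000, 32000, 62000, 122000]      # hypothetical CHF
slope, intercept = fit_ols(los, cost)
print(round(slope), round(intercept))
```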

  17. Stabilization of nonlinear systems using sampled-data output-feedback fuzzy controller based on polynomial-fuzzy-model-based control approach.

    PubMed

    Lam, H K

    2012-02-01

    This paper investigates the stability of sampled-data output-feedback (SDOF) polynomial-fuzzy-model-based control systems. Representing the nonlinear plant using a polynomial fuzzy model, an SDOF fuzzy controller is proposed to perform the control process using the system output information. As only the system output is available for feedback compensation, it is more challenging for the controller design and system analysis compared to the full-state-feedback case. Furthermore, because of the sampling activity, the control signal is kept constant by the zero-order hold during the sampling period, which complicates the system dynamics and makes the stability analysis more difficult. In this paper, two cases of SDOF fuzzy controllers, which either share the same number of fuzzy rules or not, are considered. The system stability is investigated based on the Lyapunov stability theory using the sum-of-squares (SOS) approach. SOS-based stability conditions are obtained to guarantee the system stability and synthesize the SDOF fuzzy controller. Simulation examples are given to demonstrate the merits of the proposed SDOF fuzzy control approach.

  18. VHBuild.com: A Web-Based System for Managing Knowledge in Projects.

    ERIC Educational Resources Information Center

    Li, Heng; Tang, Sandy; Man, K. F.; Love, Peter E. D.

    2002-01-01

    Describes an intelligent Web-based construction project management system called VHBuild.com which integrates project management, knowledge management, and artificial intelligence technologies. Highlights include an information flow model; time-cost optimization based on genetic algorithms; rule-based drawing interpretation; and a case-based…

  19. Challenges and Rewards on the Road to Translational Systems Biology in Acute Illness: Four Case Reports from Interdisciplinary Teams

    PubMed Central

    An, Gary; Hunt, C. Anthony; Clermont, Gilles; Neugebauer, Edmund; Vodovotz, Yoram

    2007-01-01

    Introduction Translational systems biology approaches can be distinguished from mainstream systems biology in that their goal is to drive novel therapies and streamline clinical trials in critical illness. One systems biology approach, dynamic mathematical modeling (DMM), is increasingly used in dealing with the complexity of the inflammatory response and organ dysfunction. The use of DMM often requires a broadening of research methods and a multidisciplinary team approach that includes bioscientists, mathematicians, engineers, and computer scientists. However, the development of these groups must overcome domain-specific barriers to communication and understanding. Methods We present four case studies of successful translational, interdisciplinary systems biology efforts, which differ by organizational level from an individual to an entire research community. Results Case 1 is a single investigator involved in DMM of the acute inflammatory response at Cook County Hospital, in which extensive translational progress was made using agent-based models of inflammation and organ damage. Case 2 is a community-level effort from the University of Witten-Herdecke in Cologne, whose efforts have led to the formation of the Society for Complexity in Acute Illness. Case 3 is an institution-based group, the Biosystems Group at the University of California, San Francisco, whose work has included a focus on a common lexicon for DMM. Case 4 is an institution-based, trans-disciplinary research group (the Center for Inflammation and Regenerative Modeling at the University of Pittsburgh), whose modeling work has led to internal education efforts, grant support, and commercialization. Conclusion A transdisciplinary approach, which involves team interaction in an iterative fashion to address ambiguity and is supported by educational initiatives, is likely to be necessary for DMM in acute illness. 
Community-wide organizations such as the Society for Complexity in Acute Illness (SCAI) must strive to facilitate the implementation of DMM in sepsis/trauma research across the research community as a whole. PMID:17548029

  20. Ensembles generated from crystal structures of single distant homologues solve challenging molecular-replacement cases in AMPLE.

    PubMed

    Rigden, Daniel J; Thomas, Jens M H; Simkovic, Felix; Simpkin, Adam; Winn, Martyn D; Mayans, Olga; Keegan, Ronan M

    2018-03-01

    Molecular replacement (MR) is the predominant route to solution of the phase problem in macromolecular crystallography. Although routine in many cases, it becomes more effortful and often impossible when the available experimental structures typically used as search models are only distantly homologous to the target. Nevertheless, with current powerful MR software, relatively small core structures shared between the target and known structure, of 20-40% of the overall structure for example, can succeed as search models where they can be isolated. Manual sculpting of such small structural cores is rarely attempted and is dependent on the crystallographer's expertise and understanding of the protein family in question. Automated search-model editing has previously been performed on the basis of sequence alignment, in order to eliminate, for example, side chains or loops that are not present in the target, or on the basis of structural features (e.g. solvent accessibility) or crystallographic parameters (e.g. B factors). Here, based on recent work demonstrating a correlation between evolutionary conservation and protein rigidity/packing, novel automated ways to derive edited search models from a given distant homologue over a range of sizes are presented. A variety of structure-based metrics, many readily obtained from online webservers, can be fed to the MR pipeline AMPLE to produce search models that succeed with a set of test cases where expertly manually edited comparators, further processed in diverse ways with MrBUMP, fail. Further significant performance gains result when the structure-based distance geometry method CONCOORD is used to generate ensembles from the distant homologue. To our knowledge, this is the first such approach whereby a single structure is meaningfully transformed into an ensemble for the purposes of MR. Additional cases further demonstrate the advantages of the approach. 
CONCOORD is freely available and computationally inexpensive, so these novel methods offer readily available new routes to solve difficult MR cases.

  1. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they yield self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement utilizing the systems engineering spiral method to expose finer levels of detail as necessary. 
For example, the bottom-up, Use Case-driven approach will be deployed in the future to develop the detailed work procedures required to successfully execute each operational activity.

  2. Ensembles generated from crystal structures of single distant homologues solve challenging molecular-replacement cases in AMPLE

    PubMed Central

    Simpkin, Adam; Mayans, Olga; Keegan, Ronan M.

    2018-01-01

    Molecular replacement (MR) is the predominant route to solution of the phase problem in macromolecular crystallography. Although routine in many cases, it becomes more effortful and often impossible when the available experimental structures typically used as search models are only distantly homologous to the target. Nevertheless, with current powerful MR software, relatively small core structures shared between the target and known structure, of 20–40% of the overall structure for example, can succeed as search models where they can be isolated. Manual sculpting of such small structural cores is rarely attempted and is dependent on the crystallographer’s expertise and understanding of the protein family in question. Automated search-model editing has previously been performed on the basis of sequence alignment, in order to eliminate, for example, side chains or loops that are not present in the target, or on the basis of structural features (e.g. solvent accessibility) or crystallographic parameters (e.g. B factors). Here, based on recent work demonstrating a correlation between evolutionary conservation and protein rigidity/packing, novel automated ways to derive edited search models from a given distant homologue over a range of sizes are presented. A variety of structure-based metrics, many readily obtained from online webservers, can be fed to the MR pipeline AMPLE to produce search models that succeed with a set of test cases where expertly manually edited comparators, further processed in diverse ways with MrBUMP, fail. Further significant performance gains result when the structure-based distance geometry method CONCOORD is used to generate ensembles from the distant homologue. To our knowledge, this is the first such approach whereby a single structure is meaningfully transformed into an ensemble for the purposes of MR. Additional cases further demonstrate the advantages of the approach. 
CONCOORD is freely available and computationally inexpensive, so these novel methods offer readily available new routes to solve difficult MR cases. PMID:29533226

  3. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Base Models (ABM) to optimize the large-scale network handling capabilities of large system inventories and to implement strategies for reducing capital expenses. The models used in this paper employ computational algorithms or procedures implemented in Matlab to simulate agent-based models, run on computing clusters that provide high-performance parallel execution. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  4. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  5. Comprehensive Fault Tolerance and Science-Optimal Attitude Planning for Spacecraft Applications

    NASA Astrophysics Data System (ADS)

    Nasir, Ali

    Spacecraft operate in a harsh environment, are costly to launch, and experience unavoidable communication delay and bandwidth constraints. These factors motivate the need for effective onboard mission and fault management. This dissertation presents an integrated framework to optimize science goal achievement while identifying and managing encountered faults. Goal-related tasks are defined by pointing the spacecraft instrumentation toward distant targets of scientific interest. The relative value of science data collection is traded with risk of failures to determine an optimal policy for mission execution. Our major innovation in fault detection and reconfiguration is to incorporate fault information obtained from two types of spacecraft models: one based on the dynamics of the spacecraft and the second based on the internal composition of the spacecraft. For fault reconfiguration, we consider possible changes in both dynamics-based control law configuration and the composition-based switching configuration. We formulate our problem as a stochastic sequential decision problem or Markov Decision Process (MDP). To avoid the computational complexity involved in a fully-integrated MDP, we decompose our problem into multiple MDPs. These MDPs include planning MDPs for different fault scenarios; a fault detection MDP based on a logic-based model of spacecraft component and system functionality; an MDP for resolving conflicts between fault information from the logic-based model and the dynamics-based spacecraft models; and the reconfiguration MDP that generates a policy optimized over the relative importance of the mission objectives versus spacecraft safety. Approximate Dynamic Programming (ADP) methods for the decomposition of the planning and fault detection MDPs are applied. To show the performance of the MDP-based frameworks and ADP methods, a suite of spacecraft attitude planning case studies are described. 
These case studies are used to analyze the content and behavior of computed policies in response to the changes in design parameters. A primary case study is built from the Far Ultraviolet Spectroscopic Explorer (FUSE) mission for which component models and their probabilities of failure are based on realistic mission data. A comparison of our approach with an alternative framework for spacecraft task planning and fault management is presented in the context of the FUSE mission.
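    The MDP machinery described above can be illustrated with a toy "safing" problem solved by value iteration. This is only a sketch: the three states, two actions, rewards, and transition probabilities below are invented for illustration and are not taken from the dissertation.

```python
# Value iteration for a toy 3-state "spacecraft safing" MDP.
# States, actions, rewards and transition probabilities are illustrative,
# not taken from the dissertation.
states = ["nominal", "fault", "safe_mode"]
actions = ["continue_science", "reconfigure"]

# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward
P = {
    "nominal":   {"continue_science": [("nominal", 0.95), ("fault", 0.05)],
                  "reconfigure":      [("safe_mode", 1.0)]},
    "fault":     {"continue_science": [("fault", 1.0)],
                  "reconfigure":      [("nominal", 0.8), ("safe_mode", 0.2)]},
    "safe_mode": {"continue_science": [("safe_mode", 1.0)],
                  "reconfigure":      [("nominal", 1.0)]},
}
R = {
    "nominal":   {"continue_science": 10.0, "reconfigure": 0.0},
    "fault":     {"continue_science": -50.0, "reconfigure": -5.0},
    "safe_mode": {"continue_science": 0.0, "reconfigure": -1.0},
}

def q_value(V, s, a, gamma):
    return R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])

def value_iteration(gamma=0.9, tol=1e-8):
    V = {s: 0.0 for s in states}
    while True:
        V_new = {s: max(q_value(V, s, a, gamma) for a in actions) for s in states}
        diff = max(abs(V_new[s] - V[s]) for s in states)
        V = V_new
        if diff < tol:
            break
    policy = {s: max(actions, key=lambda a: q_value(V, s, a, gamma)) for s in states}
    return V, policy

V, policy = value_iteration()
print(policy)
```

    The Bellman backup is a contraction for discount factors below one, so the iteration converges; the resulting policy trades science reward against fault risk, reconfiguring in the fault state and continuing science when nominal.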

  6. Expectation maximization-based likelihood inference for flexible cure rate models with Weibull lifetimes.

    PubMed

    Balakrishnan, Narayanaswamy; Pal, Suvra

    2016-08-01

    Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored, and the expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and, assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with real data on cancer recurrence. © The Author(s) 2013.
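    A minimal sketch of the EM idea for the standard mixture cure model, one of the special cases of the flexible family above: lifetimes are taken as exponential (a Weibull with shape fixed at 1) so the M-step is closed form; the general Weibull case requires a numerical M-step. All numbers are illustrative.

```python
import math
import random

# EM for the standard mixture cure model with exponential lifetimes.
# E-step: posterior probability that a censored subject is susceptible.
# M-step: closed-form updates for the susceptible fraction and event rate.
random.seed(42)

pi_true, lam_true, cens_rate = 0.7, 0.5, 0.1   # P(susceptible), event rate, censoring rate
n = 4000
times, events = [], []
for _ in range(n):
    c = random.expovariate(cens_rate)          # right-censoring time
    if random.random() < pi_true:              # susceptible subject
        t = random.expovariate(lam_true)
        times.append(min(t, c))
        events.append(1 if t <= c else 0)
    else:                                      # cured subject: always censored
        times.append(c)
        events.append(0)

pi, lam = 0.5, 1.0                             # initial guesses
for _ in range(200):
    # E-step: P(susceptible | censored at t) = pi*S(t) / (1 - pi + pi*S(t))
    w = [1.0 if d else pi * math.exp(-lam * t) / (1 - pi + pi * math.exp(-lam * t))
         for t, d in zip(times, events)]
    # M-step: cure fraction from mean weight; rate from weighted exposure
    pi = sum(w) / n
    lam = sum(events) / sum(wi * t for wi, t in zip(w, times))

print(round(pi, 2), round(lam, 2))
```

    With simulated data the estimates land close to the generating values, mirroring on a small scale what the paper's Monte Carlo study checks systematically.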

  7. Challenges and Opportunities in Disease Forecasting in Outbreak Settings: A Case Study of Measles in Lola Prefecture, Guinea

    PubMed Central

    Graham, Matthew; Suk, Jonathan E.; Takahashi, Saki; Metcalf, C. Jessica; Jimenez, A. Paez; Prikazsky, Vladimir; Ferrari, Matthew J.; Lessler, Justin

    2018-01-01

    Abstract. We report on and evaluate the process and findings of a real-time modeling exercise in response to an outbreak of measles in Lola prefecture, Guinea, in early 2015 in the wake of the Ebola crisis. Multiple statistical methods for the estimation of the size of the susceptible (i.e., unvaccinated) population were applied to weekly reported measles case data on seven subprefectures throughout Lola. Stochastic compartmental models were used to project future measles incidence in each subprefecture in both an initial and a follow-up iteration of forecasting. Measles susceptibility among 1- to 5-year-olds was estimated to be between 24% and 43% at the beginning of the outbreak. Based on this high baseline susceptibility, initial projections forecasted a large outbreak occurring over approximately 10 weeks and infecting 40 children per 1,000. Subsequent forecasts based on updated data mitigated this initial projection, but still predicted a significant outbreak. A catch-up vaccination campaign took place at the same time as this second forecast and measles cases quickly receded. Of note, case reports used to fit models changed significantly between forecast rounds. Model-based projections of both current population risk and future incidence can help in setting priorities and planning during an outbreak response. A swiftly changing situation on the ground, coupled with data uncertainties and the need to adjust standard analytical approaches to deal with sparse data, presents significant challenges. Appropriate presentation of results as planning scenarios, as well as presentations of uncertainty and two-way communication, is essential to the effective use of modeling studies in outbreak response. PMID:29532773
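    The flavor of the stochastic compartmental forecasts can be conveyed with a chain-binomial (Reed-Frost style) projection. The 33% susceptibility sits inside the 24-43% range from the abstract; the population size, R0, number of seed cases, and 10-generation horizon are assumptions for illustration, so the attack rate below is not the paper's 40-per-1,000 forecast.

```python
import random

# Stochastic chain-binomial projection of a measles outbreak in a closed
# population of children, with no vaccination during the projection window.
random.seed(1)

def project(n_children, suscept_frac, r0=15.0, i0=5, steps=10):
    """One realization; one step is roughly one measles generation (~2 weeks)."""
    s = int(n_children * suscept_frac)
    i = i0
    cum = i0
    for _ in range(steps):
        p_inf = 1.0 - (1.0 - r0 / n_children) ** i   # per-susceptible risk this step
        new = sum(1 for _ in range(s) if random.random() < p_inf)
        s -= new
        i = new
        cum += new
    return cum

runs = [project(10000, 0.33) for _ in range(200)]
attack_per_1000 = 1000.0 * sum(runs) / len(runs) / 10000
print(round(attack_per_1000))
```

    Averaging many realizations gives a planning-scenario attack rate; the run-to-run spread is one way to present forecast uncertainty, as the abstract recommends.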

  9. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries

    PubMed Central

    Kannan, Vaishnavi; Fish, Jason C.; Willett, DuWayne L.

    2018-01-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system’s requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. “Agile Modeling” retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams. PMID:29750222

  10. Incorporating single-side sparing in models for predicting parotid dose sparing in head and neck IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, Lulin, E-mail: lulin.yuan@duke.edu; Wu, Q. Jackie; Yin, Fang-Fang

    2014-02-15

    Purpose: Sparing of single-side parotid gland is a common practice in head-and-neck (HN) intensity modulated radiation therapy (IMRT) planning. It is a special case of dose sparing tradeoff between different organs-at-risk. The authors describe an improved mathematical model for predicting achievable dose sparing in parotid glands in HN IMRT planning that incorporates single-side sparing considerations based on patient anatomy and learning from prior plan data. Methods: Among 68 HN cases analyzed retrospectively, 35 cases had physician prescribed single-side parotid sparing preferences. The single-side sparing model was trained with cases which had single-side sparing preferences, while the standard model was trained with the remainder of cases. A receiver operating characteristics (ROC) analysis was performed to determine the best criterion that separates the two case groups using the physician's single-side sparing prescription as ground truth. The final predictive model (combined model) takes into account the single-side sparing by switching between the standard and single-side sparing models according to the single-side sparing criterion. The models were tested with 20 additional cases. The significance of the improvement of prediction accuracy by the combined model over the standard model was evaluated using the Wilcoxon rank-sum test. Results: Using the ROC analysis, the best single-side sparing criterion is (1) the predicted median dose of one parotid is higher than 24 Gy; and (2) that of the other is higher than 7 Gy. This criterion gives a true positive rate of 0.82 and a false positive rate of 0.19, respectively. For the bilateral sparing cases, the combined and the standard models performed equally well, with the median of the prediction errors for parotid median dose being 0.34 Gy by both models (p = 0.81).
For the single-side sparing cases, the standard model overestimates the median dose by 7.8 Gy on average, while the predictions by the combined model differ from actual values by only 2.2 Gy (p = 0.005). Similarly, the sum of residues between the modeled and the actual plan DVHs is the same for the bilateral sparing cases by both models (p = 0.67), while the standard model predicts significantly higher DVHs than the combined model for the single-side sparing cases (p = 0.01). Conclusions: The combined model for predicting parotid sparing that takes into account single-side sparing improves the prediction accuracy over the previous model.
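    The ROC step in the Methods amounts to sweeping a dose threshold and keeping the operating point closest to the ideal (FPR, TPR) = (0, 1) corner. The dose values below are synthetic stand-ins, not the 68 clinical cases, and a single threshold is used where the paper combines two.

```python
# Pick a separating dose threshold by ROC analysis: the physician's
# single-side-sparing prescription is the ground-truth label.
single_side = [25.1, 30.2, 28.7, 33.0, 27.5, 31.4]   # predicted parotid median dose (Gy)
bilateral   = [18.3, 22.0, 20.5, 23.9, 19.2, 21.1]   # bilateral-sparing group (Gy)

def roc_point(threshold):
    tpr = sum(d > threshold for d in single_side) / len(single_side)
    fpr = sum(d > threshold for d in bilateral) / len(bilateral)
    return tpr, fpr

thresholds = sorted(single_side + bilateral)
best = min(thresholds,
           key=lambda th: (roc_point(th)[0] - 1.0) ** 2 + roc_point(th)[1] ** 2)
tpr, fpr = roc_point(best)
print(best, tpr, fpr)
```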

  12. COLLABORATE©, Part IV: Ramping Up Competency-Based Performance Management.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    This fourth part of the COLLABORATE© article series provides an expansion and application of previously presented concepts pertaining to the COLLABORATE paradigm of professional case management practice. The model is built upon a value-driven foundation. PRIMARY PRACTICE SETTING(S): Applicable to all health care sectors where case management is practiced. As an industry, health care continues to evolve. Terrain shifts and new influences continually surface to challenge professional case management practice. The need for top-performing and nimble professionals who are knowledgeable and proficient in the workplace continues to challenge human resource departments. In addition to care setting knowledge, professional case managers must continually invest in their practice competence toolbox to grow skills and abilities that transcend policies and processes. These individuals demonstrate agility in framing (and reframing) their professional practice to facilitate the best possible outcomes for their clients. Therefore, the continued emphasis on practice competence conveyed through the performance management cycle is an essential ingredient to performance management focused on customer service excellence and organizational improvement. Professional case management transcends professional disciplines, educational levels, and practice settings. Business objectives continue to drive work process and priorities in many practice settings. However, competencies that align with regulatory and accreditation requirements should be the critical driver for consistent, high-quality case management practice. Although there is inherent value in what various disciplines bring to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served).
This model fosters case management's expanding career advancement opportunities.

  13. Time Prediction Models for Echinococcosis Based on Gray System Theory and Epidemic Dynamics.

    PubMed

    Zhang, Liping; Wang, Li; Zheng, Yanling; Wang, Kai; Zhang, Xueliang; Zheng, Yujian

    2017-03-04

    Echinococcosis, which can seriously harm human health and animal husbandry production, has become endemic in the Xinjiang Uygur Autonomous Region of China. In order to explore an effective human Echinococcosis forecasting model in Xinjiang, three grey models, namely, the traditional grey GM(1,1) model, the Grey-Periodic Extensional Combinatorial Model (PECGM(1,1)), and the Modified Grey Model using Fourier Series (FGM(1,1)), in addition to a multiplicative seasonal ARIMA(1,0,1)(1,1,0)₄ model, are applied in this study for short-term predictions. The accuracy of the different grey models is also investigated. The simulation results show that the FGM(1,1) model has a higher performance ability, not only for model fitting, but also for forecasting. Furthermore, considering the stability and the modeling precision in the long run, a dynamic epidemic prediction model based on the transmission mechanism of Echinococcosis is also established for long-term predictions. Results demonstrate that the dynamic epidemic prediction model is capable of identifying the future tendency. The number of human Echinococcosis cases will increase steadily over the next 25 years, reaching a peak of about 1250 cases, before declining slowly and eventually dying out.
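    The baseline among the grey models, GM(1,1), is compact enough to sketch in full: accumulate the series, fit the grey differential equation by least squares, and difference the exponential time response back. The input series of yearly counts below is hypothetical, not the Xinjiang case data.

```python
import math

# Minimal GM(1,1) grey forecasting model.
def gm11(x0, horizon=3):
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]               # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    # Least squares for x0(k) + a*z1(k) = b (development coefficient a, grey input b)
    m = n - 1
    sz = sum(z1)
    szz = sum(z * z for z in z1)
    sx = sum(x0[1:])
    szx = sum(z * x for z, x in zip(z1, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sx - m * szx) / det
    b = (szz * sx - sz * szx) / det
    # Time response: x1_hat(k) = (x0(0) - b/a) * exp(-a*k) + b/a
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    series = [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + horizon)]
    return series[:n], series[n:]

counts = [120, 134, 151, 168, 189, 212]   # hypothetical yearly case counts
fit, forecast = gm11(counts)
print([round(v, 1) for v in forecast])
```

    Because the time response is a single exponential, plain GM(1,1) can only extrapolate monotone trends; the PECGM and FGM variants in the abstract add periodic correction terms on top of this backbone.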

  14. Simulation of shoreline development in a groyne system, with a case study Sanur Bali beach

    NASA Astrophysics Data System (ADS)

    Gunawan, P. H.; Pudjaprasetya, S. R.

    2018-03-01

    The process of shoreline changes due to transport of sediment by littoral drift is studied in this paper. The Pelnard-Considère equation is the commonly adopted model; it is based on the principle of sediment conservation, without diffraction. In this research, we adopt the Pelnard-Considère equation with diffraction, and a numerical scheme based on the finite volume method is implemented. Shoreline development in a groyne system is then simulated. For a case study, the Sanur Bali beach, Indonesia is considered, for which Google Earth photos show changes of the coastline caused by sediment trapped in a groyne system.
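    A bare-bones version of such a scheme, for the classical one-line model without diffraction: shoreline position obeys dy/dt = -(1/d)·dq/dx with longshore transport q = q0 - D·dy/dx, and a groyne is a face where q = 0. All parameter values are illustrative, not a calibration for Sanur Bali beach.

```python
# Explicit finite-volume solution of the one-line (Pelnard-Considere) model
# for a single groyne at the downdrift end that blocks all transport.
D, q0, d = 2.0, 0.5, 1.0            # diffusivity, ambient drift, closure depth
L, N, T = 100.0, 50, 500.0          # domain length, cells, simulated time
dx = L / N
dt = 0.4 * dx * dx * d / (2.0 * D)  # explicit stability limit with margin

y = [0.0] * N                       # initially straight shoreline
t = 0.0
while t < T:
    q = [q0] * (N + 1)              # fluxes at cell faces, positive toward groyne
    for i in range(1, N):
        q[i] = q0 - D * (y[i] - y[i - 1]) / dx
    q[N] = 0.0                      # groyne blocks all longshore transport
    y = [y[i] - dt * (q[i + 1] - q[i]) / (dx * d) for i in range(N)]
    t += dt

print(round(y[-1], 2), round(y[0], 4))
```

    Sediment accretes against the groyne and the fillet diffuses updrift, the qualitative pattern visible in the Google Earth photos of a groyne field.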

  15. Assessing the British Isles CH4 flux using aircraft and ground-based sampling: a case study on 12 May 2015

    NASA Astrophysics Data System (ADS)

    Pitt, Joseph

    2017-04-01

    Aircraft and ground-based sampling of atmospheric greenhouse gas composition over the British Isles was conducted between 2014 and 2016 as part of the Greenhouse gAs UK and Global Emissions (GAUGE) project. We report a case study focussing on two research aircraft flights conducted on 12 May 2015 to sample inflow and outflow across the British Isles. We have employed the NAME Lagrangian dispersion model to simulate CH4 mole fraction enhancements corresponding to aircraft and ground-based sample times and locations, using CH4 surface fluxes derived from a composite flux inventory, which included both anthropogenic and natural sources. For each sampling location, variations in the baseline CH4 mole fraction were derived using the MOZART global chemical transport model, and added to the NAME enhancements to produce a dataset of modelled CH4 mole fractions which can be compared to the measurements. Using a multiple variable regression technique, we derive CH4 fluxes for the British Isles region from both aircraft and ground-based datasets. We discuss the applicability of our approach for both datasets, and conclude that in this case the assumptions inherent in our method are much better satisfied for the aircraft data than for the ground-based data. Using the aircraft data we derive a possible range of scale factors for the prior inventory flux of 0.53-0.97, with a central estimate of 0.82 based on our assessment of the most likely apportionment of model uncertainty. This leads to a posterior estimate of the British Isles CH4 flux of 67-121 kg s⁻¹, with a central value of 103 kg s⁻¹.
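    The scaling-factor step reduces to a regression of measured enhancements on modelled ones, with the fitted slope multiplying the prior flux. The data pairs and the prior flux below are synthetic stand-ins for the NAME/MOZART output and the inventory, and a single-predictor fit stands in for the paper's multiple variable regression.

```python
# Least-squares slope through the origin: beta = sum(m*o) / sum(m*m)
modelled = [12.0, 25.0, 8.0, 40.0, 18.0, 30.0]   # prior-inventory enhancements (ppb)
measured = [10.1, 20.3, 6.4, 33.0, 14.5, 24.8]   # observed enhancements (ppb)

beta = sum(m * o for m, o in zip(modelled, measured)) / sum(m * m for m in modelled)
prior_flux = 125.0                                # hypothetical prior flux (kg/s)
posterior_flux = beta * prior_flux
print(round(beta, 2), round(posterior_flux, 1))
```

    A slope below one means the inventory overpredicts the observed enhancements, so the posterior flux is scaled down accordingly.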

  16. Second-Moment RANS Model Verification and Validation Using the Turbulence Modeling Resource Website (Invited)

    NASA Technical Reports Server (NTRS)

    Eisfeld, Bernhard; Rumsey, Chris; Togiti, Vamshi

    2015-01-01

    The implementation of the SSG/LRR-omega differential Reynolds stress model into the NASA flow solvers CFL3D and FUN3D and the DLR flow solver TAU is verified by studying the grid convergence of the solution of three different test cases from the Turbulence Modeling Resource Website. The model's predictive capabilities are assessed based on four basic and four extended validation cases also provided on this website, involving attached and separated boundary layer flows, effects of streamline curvature and secondary flow. Simulation results are compared against experimental data and predictions by the eddy-viscosity models of Spalart-Allmaras (SA) and Menter's Shear Stress Transport (SST).

  17. Applying CBR to machine tool product configuration design oriented to customer requirements

    NASA Astrophysics Data System (ADS)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.

  18. Offshore safety case approach and formal safety assessment of ships.

    PubMed

    Wang, J

    2002-01-01

    Tragic marine and offshore accidents have caused serious consequences including loss of lives, loss of property, and damage of the environment. A proactive, risk-based "goal setting" regime is introduced to the marine and offshore industries to increase the level of safety. To maximize marine and offshore safety, risks need to be modeled and safety-based decisions need to be made in a logical and confident way. Risk modeling and decision-making tools need to be developed and applied in a practical environment. This paper describes both the offshore safety case approach and formal safety assessment of ships in detail with particular reference to the design aspects. The current practices and the latest development in safety assessment in both the marine and offshore industries are described. The relationship between the offshore safety case approach and formal ship safety assessment is described and discussed. Three examples are used to demonstrate both the offshore safety case approach and formal ship safety assessment. The study of risk criteria in marine and offshore safety assessment is carried out. The recommendations on further work required are given. This paper gives safety engineers in the marine and offshore industries an overview of the offshore safety case approach and formal ship safety assessment. The significance of moving toward a risk-based "goal setting" regime is given.

  19. Game-Theoretic Models of Information Overload in Social Networks

    NASA Astrophysics Data System (ADS)

    Borgs, Christian; Chayes, Jennifer; Karrer, Brian; Meeder, Brendan; Ravi, R.; Reagans, Ray; Sayedi, Amin

    We study the effect of information overload on user engagement in an asymmetric social network like Twitter. We introduce simple game-theoretic models that capture rate competition between celebrities producing updates in such networks where users non-strategically choose a subset of celebrities to follow based on the utility derived from high quality updates as well as disutility derived from having to wade through too many updates. Our two variants model the two behaviors of users dropping some potential connections (followership model) or leaving the network altogether (engagement model). We show that under a simple formulation of celebrity rate competition, there is no pure strategy Nash equilibrium under the first model. We then identify special cases in both models when pure rate equilibria exist for the celebrities: For the followership model, we show existence of a pure rate equilibrium when there is a global ranking of the celebrities in terms of the quality of their updates to users. This result also generalizes to the case when there is a partial order consistent with all the linear orders of the celebrities based on their qualities to the users. Furthermore, these equilibria can be computed in polynomial time. For the engagement model, pure rate equilibria exist when all users are interested in the same number of celebrities, or when they are interested in at most two. Finally, we also give a finite though inefficient procedure to determine if pure equilibria exist in the general case of the followership model.

  20. A decision support model for improving a multi-family housing complex based on CO2 emission from electricity consumption.

    PubMed

    Hong, Taehoon; Koo, Choongwan; Kim, Hyunjoong

    2012-12-15

    The number of deteriorated multi-family housing complexes in South Korea continues to rise, and consequently their electricity consumption is also increasing. This needs to be addressed as part of the nation's efforts to reduce energy consumption. The objective of this research was to develop a decision support model for determining the need to improve multi-family housing complexes. In this research, 1664 cases located in Seoul were selected for model development. The research team collected the characteristics and electricity energy consumption data of these projects in 2009-2010. The following were carried out in this research: (i) using the Decision Tree, multi-family housing complexes were clustered based on their electricity energy consumption; (ii) using Case-Based Reasoning, similar cases were retrieved from the same cluster; and (iii) using a combination of Multiple Regression Analysis, Artificial Neural Network, and Genetic Algorithm, the prediction performance of the developed model was improved. The results of this research can be used as follows: (i) as basic research data for continuously managing the energy consumption data of multi-family housing complexes; (ii) as advanced research data for predicting energy consumption based on the project characteristics; (iii) as practical research data for selecting the most optimal multi-family housing complex with the most potential in terms of energy savings; and (iv) as consistent and objective criteria for incentives and penalties. Copyright © 2012 Elsevier Ltd. All rights reserved.
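    Step (ii), the Case-Based Reasoning retrieval, can be sketched as a nearest-neighbor search within a cluster produced by the decision tree. The feature vectors below (e.g. age, number of units, floor area, normalized to [0, 1]) are illustrative, not the 1664 Seoul cases.

```python
import math

# Retrieve the most similar complexes by Euclidean distance on normalized features.
cluster = {
    "complex_01": [0.80, 0.30, 0.60],
    "complex_02": [0.70, 0.40, 0.50],
    "complex_03": [0.20, 0.90, 0.10],
    "complex_04": [0.75, 0.35, 0.55],
}
query = [0.78, 0.32, 0.58]   # new complex to assess

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

retrieved = sorted(cluster, key=lambda c: distance(cluster[c], query))[:2]
print(retrieved)
```

    The retrieved neighbors' consumption records then seed the regression/ANN/GA prediction stage described in step (iii).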

  1. Fitting Data to Model: Structural Equation Modeling Diagnosis Using Two Scatter Plots

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro

    2010-01-01

    This article introduces two simple scatter plots for model diagnosis in structural equation modeling. One plot contrasts a residual-based M-distance of the structural model with the M-distance for the factor score. It contains information on outliers, good leverage observations, bad leverage observations, and normal cases. The other plot contrasts…
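    The residual-based M-distance behind such plots is the Mahalanobis distance: a point far from the data bulk in this metric flags a potential outlier or leverage observation. The 2-D data below, with one planted outlier, are synthetic.

```python
import math

# Mahalanobis distance of each point from the sample mean, written out
# explicitly for the 2x2 covariance case.
data = [(1.0, 1.1), (2.0, 2.1), (3.0, 2.9), (4.0, 4.2), (5.0, 4.8), (10.0, 1.0)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
sxx = sum((x - mx) ** 2 for x, _ in data) / n
syy = sum((y - my) ** 2 for _, y in data) / n
sxy = sum((x - mx) * (y - my) for x, y in data) / n
det = sxx * syy - sxy * sxy

def m_distance(x, y):
    dx, dy = x - mx, y - my
    # (dx, dy) * inverse covariance * (dx, dy)^T
    return math.sqrt((syy * dx * dx - 2.0 * sxy * dx * dy + sxx * dy * dy) / det)

dists = [m_distance(x, y) for x, y in data]
print(dists.index(max(dists)), round(max(dists), 2))
```

    Plotting one such distance against another (structural-model residuals versus factor scores) separates normal cases from outliers and from good and bad leverage observations, as the article describes.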

  2. An analytic model for acoustic scattering from an impedance cylinder placed normal to an impedance plane

    NASA Astrophysics Data System (ADS)

    Swearingen, Michelle E.

    2004-04-01

    An analytic model, developed in cylindrical coordinates, is described for the scattering of a spherical wave off a semi-infinite right cylinder placed normal to a ground surface. The motivation for the research is to have a model with which one can simulate scattering from a single tree and which can be used as a fundamental element in a model for estimating the attenuation in a forest comprised of multiple tree trunks. Comparisons are made to the plane wave case, the transparent cylinder case, and the rigid and soft ground cases as a method of theoretically verifying the model for the contemplated range of model parameters. Agreement is regarded as excellent for these benchmark cases. Model sensitivity to five parameters is also explored. An experiment was performed to study the scattering from a cylinder normal to a ground surface. The data from the experiment are analyzed with a transfer function method to yield frequency and impulse responses, and calculations based on the analytic model are compared to the experimental data. Thesis advisor: David C. Swanson.

  3. Long-term hydrological simulation based on the Soil Conservation Service curve number

    NASA Astrophysics Data System (ADS)

    Mishra, Surendra Kumar; Singh, Vijay P.

    2004-05-01

    Presenting a critical review of daily flow simulation models based on the Soil Conservation Service curve number (SCS-CN), this paper introduces a more versatile model based on the modified SCS-CN method, which specializes into seven cases. The proposed model was applied to the Hemavati watershed (area = 600 km²) in India and was found to yield satisfactory results in both calibration and validation. The model conserved monthly and annual runoff volumes satisfactorily. A sensitivity analysis of the model parameters was performed, including the effect of variation in storm duration. Finally, to investigate the model components, all seven variants of the modified version were tested for their suitability.
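    At the core of all the reviewed models is the SCS-CN rainfall-runoff relation: Q = (P - Ia)² / (P - Ia + S) for P > Ia, else Q = 0, with the potential retention S derived from the curve number CN (depths in mm). The storm depths and CN = 75 below are illustrative, not Hemavati calibration values.

```python
# Event runoff depth from the SCS curve number relation (metric units).
def scs_cn_runoff(p_mm, cn, ia_ratio=0.2):
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = ia_ratio * s                 # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

storms = [5.0, 40.0, 90.0]            # daily rainfall depths (mm)
print([round(scs_cn_runoff(p, 75), 1) for p in storms])
```

    Small storms falling below the initial abstraction produce no runoff, while the runoff coefficient rises with storm depth; the modified method in the paper builds its seven cases around refinements of this relation.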

  4. Output-Feedback Model Predictive Control of a Pasteurization Pilot Plant based on an LPV model

    NASA Astrophysics Data System (ADS)

    Karimi Pour, Fatemeh; Ocampo-Martinez, Carlos; Puig, Vicenç

    2017-01-01

    This paper presents a model predictive control (MPC) of a pasteurization pilot plant based on an LPV model. Since not all the states are measured, an observer is also designed, which allows implementing an output-feedback MPC scheme. However, the model of the plant is not completely observable when augmented with the disturbance models. In order to solve this problem, the following strategies are used: (i) the whole system is decoupled into two subsystems, and (ii) an inner state-feedback controller is embedded in the MPC control scheme. A real-time example based on the pasteurization pilot plant is simulated as a case study for testing the behavior of the approaches.
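    The output-feedback ingredient can be sketched with a Luenberger observer reconstructing the unmeasured state of a toy 2-state discrete-time plant, so that an MPC (not shown here) can act on the estimate. The matrices and observer gain are illustrative, not the pasteurization plant's LPV model.

```python
# Luenberger observer: x_hat+ = A*x_hat + Lg*(y - C*x_hat).
A = [[1.0, 0.10], [0.0, 0.95]]   # plant dynamics
C = [1.0, 0.0]                   # only the first state is measured
Lg = [0.6, 0.5]                  # observer gain; A - Lg*C has eigenvalues ~0.84, 0.52

def step(x):
    return [A[0][0] * x[0] + A[0][1] * x[1], A[1][0] * x[0] + A[1][1] * x[1]]

x = [1.0, -0.5]                  # true state (unknown to the controller)
xh = [0.0, 0.0]                  # observer estimate
for _ in range(60):
    innov = (C[0] * x[0] + C[1] * x[1]) - (C[0] * xh[0] + C[1] * xh[1])
    x, xh = step(x), [v + g * innov for v, g in zip(step(xh), Lg)]

err = abs(x[0] - xh[0]) + abs(x[1] - xh[1])
print(err)
```

    The estimation error obeys e+ = (A - Lg·C)e, so placing those eigenvalues inside the unit circle drives the estimate to the true state; the paper's observability complication arises only once disturbance models are appended to A.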

  5. A SQL-Database Based Meta-CASE System and its Query Subsystem

    NASA Astrophysics Data System (ADS)

    Eessaar, Erki; Sgirka, Rünno

    Meta-CASE systems simplify the creation of CASE (Computer Aided System Engineering) systems. In this paper, we present a meta-CASE system that provides a web-based user interface and uses an object-relational database management system (ORDBMS) as its basis. The use of ORDBMSs allows us to integrate different parts of the system and simplify the creation of meta-CASE and CASE systems. ORDBMSs provide a powerful query mechanism. The proposed system allows developers to use queries to evaluate and gradually improve artifacts and calculate values of software measures. We illustrate the use of the system by using the SimpleM modeling language and discuss the use of SQL in the context of queries about artifacts. We have created a prototype of the meta-CASE system by using the PostgreSQL™ ORDBMS and the PHP scripting language.
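    One way such a query subsystem might compute a software measure over stored artifacts, shown here with SQLite standing in for PostgreSQL. The schema and the measure (connections per element) are hypothetical, not the system's actual repository design.

```python
import sqlite3

# Model elements stored in tables; a software measure computed by a query.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE element (id INTEGER PRIMARY KEY, diagram TEXT, kind TEXT, name TEXT);
CREATE TABLE connection (src INTEGER REFERENCES element(id),
                         dst INTEGER REFERENCES element(id));
INSERT INTO element VALUES (1,'d1','entity','Order'),(2,'d1','entity','Customer'),
                           (3,'d1','entity','Invoice'),(4,'d2','entity','Product');
INSERT INTO connection VALUES (1,2),(1,3),(2,3);
""")
# Measure: average number of outgoing connections per element in diagram 'd1'
row = db.execute("""
    SELECT CAST(a.cnt AS REAL) / b.n FROM
      (SELECT COUNT(*) AS cnt FROM connection c
         JOIN element e ON c.src = e.id WHERE e.diagram = 'd1') AS a,
      (SELECT COUNT(*) AS n FROM element WHERE diagram = 'd1') AS b
""").fetchone()
print(row[0])
```

    Keeping artifacts in relational tables is what lets developers evaluate and incrementally improve models with ordinary SQL, as the abstract describes.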

  6. Cooperation, Technology, and Performance: A Case Study.

    ERIC Educational Resources Information Center

    Cavanagh, Thomas; Dickenson, Sabrina; Brandt, Suzanne

    1999-01-01

    Describes the CTP (Cooperation, Technology, and Performance) model and explains how it is used by the Department of Veterans Affairs-Veteran's Benefit Administration (VBA) for training. Discusses task analysis; computer-based training; cooperative-based learning environments; technology-based learning; performance-assessment methods; courseware…

  7. Clinical Case Reporting in the Peer-Reviewed Physical Therapy Literature: Time to Move Toward Functioning.

    PubMed

    Davenport, Todd E

    2015-12-01

    Physical therapists increasingly are contributing clinical case reports to the health literature, which form the basis for higher-quality evidence that can be incorporated into clinical practice guidelines. Yet few resources exist to assist physical therapists with the basic mechanics and quality standards of producing a clinical case report. This situation is further complicated by the absence of uniform standards for quality in case reporting. The importance of including a concise yet comprehensive description of patient functioning in all physical therapy case reports suggests the appropriateness of basing quality guidelines on the World Health Organization's International Classification of Functioning, Disability and Health (ICF) model. The purpose of this paper is to assist physical therapists in creating high-quality clinical case reports for the peer-reviewed literature using the ICF model as a guiding framework. Current recommendations on the basic mechanics of writing a successful clinical case report are reviewed, and a proposal for uniform, ICF-informed clinical case reporting requirements is introduced with the aim of improving the quality and feasibility of clinical case reporting in physical therapy.

  8. Implementing Value-Based Payment Reform: A Conceptual Framework and Case Examples.

    PubMed

    Conrad, Douglas A; Vaughn, Matthew; Grembowski, David; Marcus-Smith, Miriam

    2016-08-01

    This article develops a conceptual framework for implementation of value-based payment (VBP) reform and then draws on that framework to systematically examine six distinct multi-stakeholder coalition VBP initiatives in three different regions of the United States. The VBP initiatives deploy the following payment models: reference pricing, "shadow" primary care capitation, bundled payment, pay for performance, shared savings within accountable care organizations, and global payment. The conceptual framework synthesizes prior models of VBP implementation. It describes how context, project objectives, payment and care delivery strategies, and the barriers and facilitators to translating strategy into implementation affect VBP implementation and value for patients. We next apply the framework to six case examples of implementation, and conclude by discussing the implications of the case examples and the conceptual framework for future practice and research.

  9. Contact problem for an elastic reinforcement bonded to an elastic plate

    NASA Technical Reports Server (NTRS)

    Erdogan, F.; Civelek, M. B.

    1973-01-01

    The stiffening layer is treated as an elastic membrane and the base plate is assumed to be an elastic continuum. The bonding between the two materials is assumed to be either direct adhesion or adhesion through a thin adhesive layer, which is treated as a shear spring. The solution for the simple case in which both the stiffener and the base plate are treated as membranes is also given. The contact stress is obtained for a series of numerical examples. In the direct-adhesion case the contact stress becomes infinite at the stiffener ends, with a typical square-root singularity for the continuum model and delta-function behavior for the membrane model. In the case of bonding through an adhesive layer, the contact stress becomes finite and continuous along the entire contact area.

  10. Optimizing clinical and organizational practice in cancer survivor transitions between specialized oncology and primary care teams: a realist evaluation of multiple case studies.

    PubMed

    Tremblay, Dominique; Prady, Catherine; Bilodeau, Karine; Touati, Nassera; Chouinard, Maud-Christine; Fortin, Martin; Gaboury, Isabelle; Rodrigue, Jean; L'Italien, Marie-France

    2017-12-16

    Cancer is now viewed as a chronic disease, presenting challenges to follow-up and survivorship care. Models to shift from haphazard, suboptimal and fragmented episodes of care to an integrated cancer care continuum must be developed, tested and implemented. Numerous studies demonstrate improved care when follow-up is assured by both oncology and primary care providers rather than either group alone. However, there is little data on the roles assumed by specialized oncology teams and primary care providers and the extent to which they work together. This study aims to develop, pilot test and measure outcomes of an innovative risk-based coordinated cancer care model for patients transitioning from specialized oncology teams to primary care providers. This multiple case study using a sequential mixed-methods design rests on a theory-driven realist evaluation approach to understand how transitions might be improved. The cases are two health regions in Quebec, Canada, defined by their geographic territory. Each case includes a Cancer Centre and three Family Medicine Groups selected based on differences in their determining characteristics. Qualitative data will be collected from document review (scientific journal, grey literature, local documentation), semi-directed interviews with key informants, and observation of care coordination practices. Qualitative data will be supplemented with a survey to measure the outcome of the coordinated model among providers (scope of practice, collaboration, relational coordination, leadership) and patients diagnosed with breast, colorectal or prostate cancer (access to care, patient-centredness, communication, self-care, survivorship profile, quality of life). Results from descriptive and regression analyses will be triangulated with thematic analysis of qualitative data. 
Qualitative, quantitative, and mixed methods data will be interpreted within and across cases in order to identify context-mechanism associations that explain outcomes. The study will provide empirical data on a risk-based coordinated model of cancer care to guide actions at different levels in the health system. This in-depth multiple case study using a realist approach considers both the need for context-specific intervention research and the imperative to address research gaps regarding coordinated models of cancer care.

  11. Fuel Cycle Analysis Framework Base Cases for the IAEA/INPRO GAINS Collaborative Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brent Dixon

    Thirteen countries participated in the Collaborative Project GAINS “Global Architecture of Innovative Nuclear Energy Systems Based on Thermal and Fast Reactors Including a Closed Fuel Cycle”, which was the primary activity within the IAEA/INPRO Program Area B: “Global Vision on Sustainable Nuclear Energy” for the last three years. The overall objective of GAINS was to develop a standard framework for assessing future nuclear energy systems taking into account sustainable development, and to validate results through sample analyses. This paper details the eight scenarios that constitute the GAINS framework base cases for analysis of the transition to future innovative nuclear energy systems. The framework base cases provide a reference for users of the framework to start from in developing and assessing their own alternate systems. Each base case is described along with performance results against the GAINS sustainability evaluation metrics. The eight cases include four using a moderate growth projection and four using a high growth projection for global nuclear electricity generation through 2100. The cases are divided into two sets, addressing homogeneous and heterogeneous scenarios developed by GAINS to model global fuel cycle strategies. The heterogeneous world scenario considers three separate nuclear groups based on their fuel cycle strategies, with non-synergistic and synergistic cases. The framework base case analyses results show the impact of these different fuel cycle strategies while providing references for future users of the GAINS framework. A large number of scenario alterations are possible and can be used to assess different strategies, different technologies, and different assumptions about possible futures of nuclear power. Results can be compared to the framework base cases to assess where these alternate cases perform differently versus the sustainability indicators.

  12. A Model for Effective Teaching and Learning in Research Methods.

    ERIC Educational Resources Information Center

    Poindexter, Paula M.

    1998-01-01

    Proposes a teaching model for making research relevant. Presents a case study of the model as used in advertising and public relations research classes. Notes that the model consists of a knowledge base, team process, a realistic goal-oriented experience, self-management, expert consultation, and evaluation and synthesis. Discusses resulting…

  13. Capital planning for operating theatres based on projecting future theatre requirements.

    PubMed

    Sheehan, Jennifer A; Tyler, Peter; Jayasinha, Hirani; Meleady, Kathleen T; Jones, Neill

    2011-05-01

    During 2006, the NSW and ACT Health Departments jointly engaged KPMG to develop an Operating Theatre Requirements Projection Model and an accompanying planning guideline. A research scan was carried out to identify drivers of surgical demand, theatre capacity and theatre performance, and to locate existing approaches to modelling operating theatre requirements for planning purposes. The project delivered a Microsoft Excel-based model for projecting future operating theatre requirements, together with an accompanying guideline for use of the model and interpretation of its outputs. It provides a valuable addition to the suite of tools available to Health staff for service and capital planning. The model operates with several limitations, largely because it is data dependent and constrained by the state and completeness of available theatre activity data. However, the operational flexibility built into the model allows users to compensate for these limitations on a case-by-case basis when they have access to suitable local data. The model's design flexibility also means that updating it as improved data become available is straightforward, so revisions can be made quickly and disseminated to users rapidly.
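    The KPMG model itself is an Excel workbook and its internals are not described here, but the capacity arithmetic such projection tools automate can be sketched generically. Everything below (the function name, the parameter defaults, the demand/capacity ratio) is an illustrative assumption, not the model's actual logic:

```python
import math

def theatres_required(annual_cases, mean_case_min, turnover_min,
                      hours_per_day=8, days_per_year=250, utilisation=0.85):
    """Rough count of operating theatres needed to serve projected demand.

    Demand is total theatre-minutes (case time plus turnover); capacity is
    staffed theatre-minutes per theatre per year, discounted by a target
    utilisation rate. All defaults are hypothetical planning assumptions.
    """
    demand_min = annual_cases * (mean_case_min + turnover_min)
    capacity_min = hours_per_day * 60 * days_per_year * utilisation
    return math.ceil(demand_min / capacity_min)
```

    With, say, 10,000 projected cases a year at 90 minutes each plus 30 minutes turnover, the heuristic rounds the fractional requirement up to whole theatres, mirroring how a planner would read the model's output.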

  14. FAST Model Calibration and Validation of the OC5- DeepCwind Floating Offshore Wind System Against Wave Tank Test Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  15. A Study and Model of Operating Level Financial Management Philosophy Under RMS.

    DTIC Science & Technology

    The lack of financial management education has prevented base level managers from using PRIME data as intended. This study examines the Air Force...operating level financial management philosophy before and after PRIME and the environment of PRIME adoption. A model in the form of two case problems...with solutions is created to portray the financial management concepts under PRIME to help educate base level Air Force logistic managers. The model

  16. One size does not fit all: Adapting mark-recapture and occupancy models for state uncertainty

    USGS Publications Warehouse

    Kendall, W.L.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    Multistate capture-recapture models continue to be employed with greater frequency to test hypotheses about metapopulation dynamics and life history, and more recently disease dynamics. In recent years efforts have begun to adjust these models for cases where there is uncertainty about an animal's state upon capture. These efforts can be categorized into models that permit misclassification between two states to occur in either direction or one direction, where state is certain for a subset of individuals or is always uncertain, and where estimation is based on one sampling occasion per period of interest or multiple sampling occasions per period. State uncertainty also arises in modeling patch occupancy dynamics. I consider several case studies involving bird and marine mammal studies that illustrate how misclassified states can arise, and outline model structures for properly utilizing the data that are produced. In each case misclassification occurs in only one direction (thus there is a subset of individuals or patches where state is known with certainty), and there are multiple sampling occasions per period of interest. For the cases involving capture-recapture data I allude to a general model structure that could include each example as a special case. However, this collection of cases also illustrates how difficult it is to develop a model structure that can be directly useful for answering every ecological question of interest and account for every type of data from the field.

  17. Introduction of the 2nd Phase of the Integrated Hydrologic Model Intercomparison Project

    NASA Astrophysics Data System (ADS)

    Kollet, Stefan; Maxwell, Reed; Dages, Cecile; Mouche, Emmanuel; Mugler, Claude; Paniconi, Claudio; Park, Young-Jin; Putti, Mario; Shen, Chaopeng; Stisen, Simon; Sudicky, Edward; Sulis, Mauro; Ji, Xinye

    2015-04-01

    The 2nd Phase of the Integrated Hydrologic Model Intercomparison Project commenced in June 2013 with a workshop at Bonn University funded by the German Science Foundation and the US National Science Foundation. Three test cases, available online at www.hpsc-terrsys.de, were defined and compared: a tilted V-catchment case; a case called "superslab," based on multiple slab heterogeneities in the hydraulic conductivity along a hillslope; and the Borden site case, based on a published field experiment. The goal of this phase is to further interrogate the coupling of surface-subsurface flow implemented in various integrated hydrologic models, and to understand and quantify the impact of differences in the conceptual and technical implementations on the simulation results, which may constitute an additional source of uncertainty. The focus has been broadened considerably to include, for example, saturated and unsaturated subsurface storage, saturated surface area, and ponded surface storage, in addition to discharge and pressure/saturation profiles and cross-sections. Here, first results are presented and discussed, demonstrating the conceptual and technical challenges in implementing essentially the same governing equations describing highly non-linear moisture redistribution processes and surface-groundwater interactions.

  18. Is the Bifactor Model a Better Model or is it Just Better at Modeling Implausible Responses? Application of Iteratively Reweighted Least Squares to the Rosenberg Self-Esteem Scale

    PubMed Central

    Reise, Steven P.; Kim, Dale S.; Mansolf, Maxwell; Widaman, Keith F.

    2017-01-01

    Although the structure of the Rosenberg Self-Esteem Scale (RSES; Rosenberg, 1965) has been exhaustively evaluated, questions regarding dimensionality and direction of wording effects continue to be debated. To shed new light on these issues, we ask: (1) for what percentage of individuals is a unidimensional model adequate, (2) what additional percentage of individuals can be modeled with multidimensional specifications, and (3) what percentage of individuals respond so inconsistently that they cannot be well modeled? To estimate these percentages, we applied iteratively reweighted least squares (IRLS; Yuan & Bentler, 2000) to examine the structure of the RSES in a large, publicly available dataset. A distance measure, ds, reflecting a distance between a response pattern and an estimated model, was used for case weighting. We found that a bifactor model provided the best overall model fit, with one general factor and two wording-related group factors. But, based on dr values, a distance measure based on individual residuals, we concluded that approximately 86% of cases were adequately modeled through a unidimensional structure, and only an additional 3% required a bifactor model. Roughly 11% of cases were judged as “unmodelable” due to their significant residuals in all models considered. Finally, analysis of ds revealed that some, but not all, of the superior fit of the bifactor model is owed to that model’s ability to better accommodate implausible and possibly invalid response patterns, and not necessarily because it better accounts for the effects of direction of wording. PMID:27834509
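    The case-weighting idea above, downweighting response patterns that lie far from the fitted model, can be illustrated with a generic IRLS loop. This is a minimal Huber-weighted linear fit, not the Yuan-Bentler estimator or the ds-based weighting used in the study; all names and constants are illustrative:

```python
import numpy as np

def irls_huber(X, y, c=1.345, iters=50):
    """Huber-weighted least squares: cases with large scaled residuals get
    weight c/|r| instead of 1, so implausible observations pull less on
    the fit. (A toy analogue of distance-based case weighting, not a
    structural-equation estimator.)"""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS starting values
    w = np.ones(len(y))
    for _ in range(iters):
        r = y - X @ beta
        scale = np.median(np.abs(r)) / 0.6745 or 1.0   # robust scale (MAD)
        u = np.abs(r / scale)
        w = np.where(u <= c, 1.0, c / u)               # Huber case weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta, w
```

    On data lying on a line with one gross outlier, the outlier's weight collapses toward zero and the fitted slope stays near the clean-data value, whereas ordinary least squares would be pulled far off.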

  19. Modeling the outcomes of nursing home care.

    PubMed

    Rohrer, J E; Hogan, A J

    1987-01-01

    In this exploratory analysis using data on 290 patients, we use regression analysis to model patient outcomes in two Veterans Administration nursing homes. We find resource use, as measured with minutes of nursing time, to be associated with outcomes when case mix is controlled. Our results suggest that, under case-based reimbursement systems, nursing homes could increase their revenues by withholding unskilled and psychosocial care and discouraging physicians' visits. Implications for nursing home policy are discussed.

  20. Use of the Equity Implementation Model to Review Clinical System Implementation Efforts

    PubMed Central

    Lauer, Thomas W.; Joshi, Kailash; Browdy, Thomas

    2000-01-01

    This paper presents the equity implementation model (EIM) in the context of a case that describes the implementation of a medical scheduling system. The model is based on equity theory, a well-established theory in the social sciences that has been tested in hundreds of experimental and field studies. The predictions of equity theory have been supported in organizational, societal, family, and other social settings. Thus, the EIM helps provide a theory-based understanding for collecting and reviewing users' reactions to, and acceptance or rejection of, a new technology or system. The case study (implementation of a patient scheduling and appointment setting system in a large health maintenance organization) illustrates how the EIM can be used to examine users' reactions to the implementation of a new system. PMID:10641966

  1. Modeling the Car Crash Crisis Management System Using HiLA

    NASA Astrophysics Data System (ADS)

    Hölzl, Matthias; Knapp, Alexander; Zhang, Gefei

    An aspect-oriented modeling approach to the Car Crash Crisis Management System (CCCMS) using the High-Level Aspect (HiLA) language is described. HiLA is a language for expressing aspects for UML static structures and UML state machines. In particular, HiLA supports both a static graph transformational and a dynamic approach of applying aspects. Furthermore, it facilitates methodologically turning use case descriptions into state machines: for each main success scenario, a base state machine is developed; all extensions to this main success scenario are covered by aspects. Overall, the static structure of the CCCMS is modeled in 43 classes, the main success scenarios in 13 base machines, the use case extensions in 47 static and 31 dynamic aspects, most of which are instantiations of simple aspect templates.

  2. Wisconsin District Case Study. A Report and Estimating Tool for K-12 School Districts

    ERIC Educational Resources Information Center

    Consortium for School Networking, 2004

    2004-01-01

    The Wisconsin case study school district is primarily urban and growing, with 21,500 students on 40 campuses. This document contains case studies that are presented in the same format as the 2003 studies, but with an added focus on technologies beyond the base distributed computing model. These new technologies are voice/data integration,…

  3. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task, and an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large volumes of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink report generator for creating design documents from the models is presented, along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports a single set of test cases across several testing levels, with a test procedure independent of the software and hardware platform, is also presented.

  4. Convective Systems over the South China Sea: Cloud-Resolving Model Simulations.

    NASA Astrophysics Data System (ADS)

    Tao, W.-K.; Shie, C.-L.; Simpson, J.; Braun, S.; Johnson, R. H.; Ciesielski, P. E.

    2003-12-01

    The two-dimensional version of the Goddard Cumulus Ensemble (GCE) model is used to simulate two South China Sea Monsoon Experiment (SCSMEX) convective periods [18-26 May (prior to and during the monsoon onset) and 2-11 June (after the onset of the monsoon) 1998]. Observed large-scale advective tendencies for potential temperature, water vapor mixing ratio, and horizontal momentum are used as the main forcing governing the GCE model in a semiprognostic manner. The June SCSMEX case has stronger forcing in both temperature and water vapor, stronger low-level vertical shear of the horizontal wind, and larger convective available potential energy (CAPE). The temporal variations of the model-simulated rainfall and the time- and domain-averaged heating and moisture budgets compare well with those diagnostically determined from soundings, although the model results have higher temporal variability. The model underestimates the rainfall by 17% to 20% compared to that based on soundings. The GCE model-simulated rainfall for June is in very good agreement with the Tropical Rainfall Measuring Mission (TRMM) precipitation radar (PR) and the Global Precipitation Climatology Project (GPCP). Overall, the model agrees better with observations for the June case than for the May case. The model-simulated energy budgets indicate that the two largest terms for both cases are net condensation (heating/drying) and imposed large-scale forcing (cooling/moistening); these two terms are opposite in sign, however. The model results also show larger latent heat fluxes for the May case, yet more rainfall is simulated for the June case. Net radiation (solar heating and longwave cooling) is about 34% and 25% of the net condensation (condensation minus evaporation) for the May and June cases, respectively. Sensible heat fluxes do not contribute to rainfall in either of the SCSMEX cases.
Two types of organized convective systems, unicell (May case) and multicell (June case), are simulated by the model. They are determined by the observed mean U wind shear (unidirectional versus reverse shear profiles above midlevels). Several sensitivity tests are performed to examine the impact of the radiation, microphysics, and large-scale mean horizontal wind on the organization and intensity of the SCSMEX convective systems.

  5. Modeling Joint Exposures and Health Outcomes for Cumulative Risk Assessment: the Case of Radon and Smoking

    EPA Science Inventory

    Community-based cumulative risk assessment requires characterization of exposures to multiple chemical and non-chemical stressors, with consideration of how the non-chemical stressors may influence risks from chemical stressors. Residential radon provides an interesting case exam...

  6. Utility of genetic and non-genetic risk factors in prediction of type 2 diabetes: Whitehall II prospective cohort study.

    PubMed

    Talmud, Philippa J; Hingorani, Aroon D; Cooper, Jackie A; Marmot, Michael G; Brunner, Eric J; Kumari, Meena; Kivimäki, Mika; Humphries, Steve E

    2010-01-14

    To assess the performance of a panel of common single nucleotide polymorphisms (genotypes) associated with type 2 diabetes in distinguishing incident cases of future type 2 diabetes (discrimination), and to examine the effect of adding genetic information to previously validated non-genetic (phenotype based) models developed to estimate the absolute risk of type 2 diabetes. Workplace based prospective cohort study with three 5 yearly medical screenings. 5535 initially healthy people (mean age 49 years; 33% women), of whom 302 developed new onset type 2 diabetes over 10 years. Non-genetic variables included in two established risk models (the Cambridge type 2 diabetes risk score: age, sex, drug treatment, family history of type 2 diabetes, body mass index, smoking status; and the Framingham offspring study type 2 diabetes risk score: age, sex, parental history of type 2 diabetes, body mass index, high density lipoprotein cholesterol, triglycerides, fasting glucose), and 20 single nucleotide polymorphisms associated with susceptibility to type 2 diabetes. Cases of incident type 2 diabetes were defined on the basis of a standard oral glucose tolerance test, self report of a doctor's diagnosis, or the use of anti-diabetic drugs. A genetic score based on the number of risk alleles carried (range 0-40; area under receiver operating characteristics curve 0.54, 95% confidence interval 0.50 to 0.58) and a genetic risk function in which carriage of risk alleles was weighted according to the summary odds ratios of their effect from meta-analyses of genetic studies (area under receiver operating characteristics curve 0.55, 0.51 to 0.59) did not effectively discriminate cases of diabetes. The Cambridge risk score (area under curve 0.72, 0.69 to 0.76) and the Framingham offspring risk score (area under curve 0.78, 0.75 to 0.82) led to better discrimination of cases than did genotype based tests.
Adding genetic information to phenotype based risk models did not improve discrimination and provided only a small improvement in model calibration and a modest net reclassification improvement of about 5% when added to the Cambridge risk score but not when added to the Framingham offspring risk score. The phenotype based risk models provided greater discrimination for type 2 diabetes than did models based on 20 common independently inherited diabetes risk alleles. The addition of genotypes to phenotype based risk models produced only minimal improvement in accuracy of risk estimation assessed by recalibration and, at best, a minor net reclassification improvement. The major translational application of the currently known common, small effect genetic variants influencing susceptibility to type 2 diabetes is likely to come from the insight they provide on causes of disease and potential therapeutic targets.
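    The unweighted genetic score and the discrimination statistic used above are simple to reproduce. The genotype vectors below are made up, and the AUC is computed as the Mann-Whitney probability that a random case outscores a random control, which is what the area under an ROC curve measures:

```python
def allele_count_score(genotypes):
    """Unweighted genetic score: total risk alleles carried across the SNP
    panel (each genotype is 0, 1, or 2 copies); for 20 SNPs, range 0-40."""
    return sum(genotypes)

def auc(scores_cases, scores_controls):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen case scores higher than a randomly
    chosen control, counting ties as 1/2."""
    wins = 0.0
    for a in scores_cases:
        for b in scores_controls:
            wins += 1.0 if a > b else 0.5 if a == b else 0.0
    return wins / (len(scores_cases) * len(scores_controls))
```

    An AUC near 0.5, like the 0.54 reported for the allele-count score, means the score barely separates cases from controls; the phenotype-based scores' 0.72-0.78 is what useful discrimination looks like on this scale.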

  7. An initial investigation on developing a new method to predict short-term breast cancer risk based on deep learning technology

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Wang, Yunzhi; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2016-03-01

    In order to establish a new personalized breast cancer screening paradigm, it is critically important to accurately predict the short-term risk of a woman having image-detectable cancer after a negative mammographic screening. In this study, we developed and tested a novel short-term risk assessment model based on a deep learning method. A set of 270 "prior" negative screening cases was assembled; in the next sequential ("current") screening mammography, 135 cases were positive and 135 remained negative. These cases were randomly divided into a training set of 200 cases and a testing set of 70 cases. A deep learning based computer-aided diagnosis (CAD) scheme was then developed for the risk assessment, consisting of two modules: an adaptive feature identification module and a risk prediction module. The adaptive feature identification module is composed of three pairs of convolution-max-pooling layers, which contain 20, 10, and 5 feature maps, respectively. The risk prediction module is implemented by a multilayer perceptron (MLP) classifier, which produces a risk score predicting the likelihood of the woman developing short-term mammography-detectable cancer. The new CAD-based risk model yielded a positive predictive value of 69.2% and a negative predictive value of 74.2%, with a total prediction accuracy of 71.4%. This study demonstrated that applying new deep learning technology may have significant potential for developing a short-term risk prediction scheme with improved performance in detecting early abnormal symptoms from negative mammograms.

  8. Performability modeling based on real data: A case study

    NASA Technical Reports Server (NTRS)

    Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.

    1988-01-01

    Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
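    The reward computation such a model leads to can be sketched for a generic semi-Markov process: weight each state's reward rate by how often the embedded chain visits it and how long it stays there. The two-state example in the test is hypothetical, not the measured multiprocessor model:

```python
import numpy as np

def long_run_reward(P, hold, rate):
    """Steady-state reward rate of a semi-Markov process.

    P    : embedded-chain transition matrix (rows sum to 1)
    hold : mean holding time in each state
    rate : reward earned per unit time in each state
    The long-run reward is the holding-time-weighted average of the state
    reward rates under the embedded chain's stationary distribution.
    """
    n = P.shape[0]
    # stationary distribution: solve pi P = pi with sum(pi) = 1
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    return float(pi @ (hold * rate) / (pi @ hold))
```

    For a system that alternates between an "up" state (reward rate 1, mean holding time 3) and a "down" state (reward rate 0, mean holding time 1), the long-run reward rate is the fraction of time spent up, 0.75.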

  9. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series; streamflow was less random and more complex than precipitation. This reflects the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of the simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
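
    The quantile symbolization and the first of the three metrics can be sketched as follows; the four-symbol alphabet and block length are illustrative choices, not values stated in the abstract.

    ```python
    import math
    from collections import Counter

    def symbolize(series, n_symbols=4):
        """Map each value to the quantile bin (0..n_symbols-1) it falls in."""
        sorted_vals = sorted(series)
        n = len(series)
        edges = [sorted_vals[int(n * q / n_symbols)] for q in range(1, n_symbols)]
        return [sum(v > e for e in edges) for v in series]

    def block_entropy(symbols, length):
        """Shannon entropy (bits) of overlapping blocks of the given length."""
        blocks = [tuple(symbols[i:i + length])
                  for i in range(len(symbols) - length + 1)]
        counts = Counter(blocks)
        total = len(blocks)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    def mean_information_gain(symbols, length=2):
        """Average new information per symbol: H_L - H_{L-1}."""
        return block_entropy(symbols, length) - block_entropy(symbols, length - 1)
    ```

    A constant signal yields a gain of zero, while an i.i.d. signal over a four-letter alphabet approaches the 2-bit maximum; real streamflow strings fall in between, which is what makes the metric a useful randomness measure.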

  10. Template-based protein-protein docking exploiting pairwise interfacial residue restraints.

    PubMed

    Xue, Li C; Rodrigues, João P G L M; Dobbs, Drena; Honavar, Vasant; Bonvin, Alexandre M J J

    2017-05-01

    Although many advanced and sophisticated ab initio approaches for modeling protein-protein complexes have been proposed in past decades, template-based modeling (TBM) remains the most accurate and widely used approach, provided a reliable template is available. However, there are many different ways to exploit template information in the modeling process. Here, we systematically evaluate and benchmark a TBM method that uses conserved interfacial residue pairs as docking distance restraints [referred to as alpha carbon-alpha carbon (CA-CA)-guided docking]. We compare it with two other template-based protein-protein modeling approaches: a conserved non-pairwise interfacial residue restrained docking approach [referred to as ambiguous interaction restraint (AIR)-guided docking] and a simple superposition-based modeling approach. Our results show that, for most cases, the CA-CA-guided docking method outperforms both superposition with refinement and the AIR-guided docking method. We emphasize the superiority of CA-CA-guided docking in cases with medium to large conformational changes and in interactions mediated through loops, tails, or disordered regions. Our results also underscore the importance of properly refining superposition models to reduce steric clashes. In summary, we provide a benchmarked TBM protocol that uses conserved pairwise interface distances as restraints to generate realistic 3D protein-protein interaction models when reliable templates are available. The described CA-CA-guided docking protocol is based on the HADDOCK platform, which allows users to incorporate additional prior knowledge of the target system to further improve the quality of the resulting models. © The Author 2016. Published by Oxford University Press.

  11. Research misconduct oversight: defining case costs.

    PubMed

    Gammon, Elizabeth; Franzini, Luisa

    2013-01-01

    This study uses a sequential mixed-methods study design to define the cost elements of research misconduct among faculty at academic medical centers. Using time-driven activity-based costing, the model estimates a per-case cost for 17 cases of research misconduct reported by the Office of Research Integrity for the period 2000-2005. The per-case cost of research misconduct was found to range from $116,160 to $2,192,620. Research misconduct cost drivers are identified.
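
    Time-driven activity-based costing assigns each case the time spent on each activity multiplied by the capacity cost rate of the resource performing it. The roles, hours, and rates below are hypothetical illustrations, not figures from the study.

    ```python
    def tdabc_case_cost(activities, hourly_rates):
        """Time-driven activity-based costing: cost = sum(hours * capacity cost rate)."""
        return sum(hours * hourly_rates[role] for role, hours in activities)

    # Hypothetical misconduct inquiry: hours spent by each role on one case.
    case = [("committee_member", 120), ("legal_counsel", 40), ("administrator", 200)]
    rates = {"committee_member": 150.0, "legal_counsel": 250.0, "administrator": 60.0}
    cost = tdabc_case_cost(case, rates)   # 120*150 + 40*250 + 200*60
    ```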

  12. Simulations of the vortex in the Dellenback abrupt expansion, resembling a hydro turbine draft tube operating at part-load

    NASA Astrophysics Data System (ADS)

    Nilsson, H.

    2012-11-01

    This work presents an OpenFOAM case study based on the experimental studies of swirling flow in an abrupt expansion by Dellenback et al. [1]. The case yields flow conditions similar to those of a helical vortex rope in a hydro turbine draft tube operating at part-load. The case study is set up similarly to the ERCOFTAC Conical Diffuser and Centrifugal Pump OpenFOAM case studies [2,3], making all the files available and the results fully reproducible using open-source software. The mesh is generated using m4 scripting and the OpenFOAM built-in blockMesh mesh generator. The swirling inlet boundary condition is specified as an axisymmetric profile. The outlet boundary condition uses the zeroGradient condition for all variables except the pressure, which uses the fixed mean value boundary condition. The wall static pressure is probed at a number of locations during the simulations, and post-processing of the time-averaged solution is done using the OpenFOAM sample utility. Gnuplot scripts are provided for plotting the results. The computational results are compared to one of the operating conditions studied by Dellenback, and measurements for all the experimentally studied operating conditions are available in the case study. Results from five cases are presented, based on the kEpsilon model, the kOmegaSST model, and a filtered version of the kOmegaSST model, named kOmegaSSTF [4,5]. Two different inlet boundary conditions are evaluated. It is shown that kEpsilon and kOmegaSST give steady solutions, while kOmegaSSTF gives a highly unsteady solution. The time-averaged solution of the kOmegaSSTF model is much more accurate than those of the other models. The kEpsilon and kOmegaSST models are thus unable to accurately model the effect of the large-scale unsteadiness, while kOmegaSSTF resolves those scales and models only the smaller scales. The use of two different boundary conditions shows that, for the results just after the abrupt expansion, the boundary conditions are more important than the choice between kEpsilon and kOmegaSST.

  13. Using an Animated Case Scenario Based on Constructivist 5E Model to Enhance Pre-Service Teachers' Awareness of Electrical Safety

    ERIC Educational Resources Information Center

    Hirca, Necati

    2013-01-01

    The objective of this study is to help pre-service teachers develop awareness of first-aid knowledge and skills related to electric shock and electrical safety through a scenario-based animation grounded in the constructivist 5E model. The sample of the study was composed of 78 (46 girls and 32 boys) pre-service classroom teachers from two faculties of…

  14. Models of inertial range spectra of interplanetary magnetohydrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Matthaeus, William H.

    1990-01-01

    A framework based on turbulence theory is presented to develop approximations for the local turbulence effects that are required in transport models. An approach based on Kolmogoroff-style dimensional analysis is presented as well as one based on a wave-number diffusion picture. Particular attention is given to the case of MHD turbulence with arbitrary cross helicity and with arbitrary ratios of the Alfven time scale and the nonlinear time scale.

  15. Comparing statistical and process-based flow duration curve models in ungauged basins and changing rain regimes

    NASA Astrophysics Data System (ADS)

    Müller, M. F.; Thompson, S. E.

    2016-02-01

    The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
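
    Both models are judged on how well they reproduce observed flow duration curves, with Nash-Sutcliffe coefficients as the score. A minimal sketch of those two building blocks follows; the Weibull plotting position is one common convention, assumed here rather than taken from the paper.

    ```python
    def flow_duration_curve(flows):
        """Return (exceedance probability, flow) pairs using Weibull plotting
        positions: rank flows in descending order, p = rank / (n + 1)."""
        ranked = sorted(flows, reverse=True)
        n = len(ranked)
        return [((i + 1) / (n + 1), q) for i, q in enumerate(ranked)]

    def nash_sutcliffe(observed, simulated):
        """NSE = 1 - SSE / variance about the observed mean.
        1.0 is a perfect fit; values below 0 are worse than predicting the mean."""
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1 - sse / ss_tot

    obs = [3.0, 1.0, 2.0, 5.0]
    fdc = flow_duration_curve(obs)
    ```

    An FDC computed this way is a distributional summary, which is why a process-based model of the streamflow distribution can target it directly without reproducing the exact time sequence.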

  16. A design of experiments approach to validation sampling for logistic regression modeling with error-prone medical records.

    PubMed

    Ouyang, Liwen; Apley, Daniel W; Mehrotra, Sanjay

    2016-04-01

    Electronic medical record (EMR) databases offer significant potential for developing clinical hypotheses and identifying disease risk associations by fitting statistical models that capture the relationship between a binary response variable and a set of predictor variables representing clinical, phenotypical, and demographic data for the patient. However, EMR response data may be error-prone for a variety of reasons. Performing a manual chart review to validate data accuracy is time-consuming, which limits the number of chart reviews in a large database. The authors' objective is to develop a new design-of-experiments-based systematic chart validation and review (DSCVR) approach that is more powerful than the random validation sampling used in existing approaches. The DSCVR approach judiciously and efficiently selects the cases to validate (i.e., validate whether the response values are correct for those cases) for maximum information content, based only on their predictor variable values. The final predictive model is fit using only the validation sample, ignoring the remainder of the unvalidated and unreliable error-prone data. A Fisher information based D-optimality criterion is used, and an algorithm for optimizing it is developed. The method is tested in a simulation comparison based on a sudden cardiac arrest case study with 23 041 patients' records. The DSCVR approach, using the Fisher information based D-optimality criterion, results in a fitted model with much better predictive performance, as measured by the receiver operating characteristic curve and the accuracy in predicting whether a patient will experience the event, than a model fitted using a random validation sample. The simulation comparisons demonstrate that the DSCVR approach can produce predictive models that are significantly better than those produced from random validation sampling, especially when the event rate is low. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
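
    The D-optimality idea can be sketched as a greedy row selection over the logistic-regression Fisher information. The greedy exchange, the zero-coefficient starting point, and the small ridge term are simplifying assumptions of this sketch; the paper's actual optimization algorithm may differ.

    ```python
    import numpy as np

    def greedy_d_optimal(X, n_select, beta=None):
        """Greedily pick rows of X maximizing the log-determinant of the
        logistic Fisher information sum_i w_i x_i x_i^T, w_i = p_i (1 - p_i).

        At beta = 0 every w_i = 0.25, so the criterion reduces to classical
        D-optimality for the design matrix of the selected cases."""
        n, p = X.shape
        beta = np.zeros(p) if beta is None else beta
        prob = 1.0 / (1.0 + np.exp(-X @ beta))
        w = prob * (1.0 - prob)
        info = 1e-6 * np.eye(p)     # small ridge keeps early determinants nonsingular
        chosen = []
        for _ in range(n_select):
            best_i, best_logdet = -1, -np.inf
            for i in range(n):
                if i in chosen:
                    continue
                sign, logdet = np.linalg.slogdet(info + w[i] * np.outer(X[i], X[i]))
                if sign > 0 and logdet > best_logdet:
                    best_i, best_logdet = i, logdet
            chosen.append(best_i)
            info += w[best_i] * np.outer(X[best_i], X[best_i])
        return chosen

    # Four candidate records with two predictors: the criterion favors
    # informative (large, spread-out) predictor vectors over near-duplicates.
    X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.1, 0.1]])
    sel = greedy_d_optimal(X, 2)
    ```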

  17. Model Driven Engineering with Ontology Technologies

    NASA Astrophysics Data System (ADS)

    Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva

    Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.

  18. Identifying Model-Based Reconfiguration Goals through Functional Deficiencies

    NASA Technical Reports Server (NTRS)

    Benazera, Emmanuel; Trave-Massuyes, Louise

    2004-01-01

    Model-based diagnosis is now advanced to the point where autonomous systems can face uncertain and faulty situations with success. The next step toward more autonomy is to have the system recover itself after faults occur, a process known as model-based reconfiguration. Given a prediction of the nominal behavior of the system and the result of the diagnosis operation, this paper details how to automatically determine the functional deficiencies of the system after faults occur. These deficiencies are characterized in the case of uncertain state estimates. A methodology is then presented to determine the reconfiguration goals based on the deficiencies. Finally, a recovery process interleaves planning and model predictive control to restore the functionalities in prioritized order.

  19. Prognostics of lithium-ion batteries based on Dempster-Shafer theory and the Bayesian Monte Carlo method

    NASA Astrophysics Data System (ADS)

    He, Wei; Williard, Nicholas; Osterman, Michael; Pecht, Michael

    A new method for state of health (SOH) and remaining useful life (RUL) estimations for lithium-ion batteries using Dempster-Shafer theory (DST) and the Bayesian Monte Carlo (BMC) method is proposed. In this work, an empirical model based on the physical degradation behavior of lithium-ion batteries is developed. Model parameters are initialized by combining sets of training data based on DST. BMC is then used to update the model parameters and predict the RUL based on available data through battery capacity monitoring. As more data become available, the accuracy of the model in predicting RUL improves. Two case studies demonstrating this approach are presented.
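
    A rough sketch of the BMC update step follows. It assumes a double-exponential capacity-fade form and uses a Gaussian prior as a stand-in for the DST-combined training estimates; all parameter values are illustrative, and the Dempster-Shafer combination step itself is not implemented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def capacity(k, theta):
        """Assumed empirical fade model: Q(k) = a*exp(b*k) + c*exp(d*k)."""
        a, b, c, d = theta
        return a * np.exp(b * k) + c * np.exp(d * k)

    # Prior particles around initial parameter guesses (stand-in for the
    # DST-fused training estimates).
    n_part = 500
    prior_mean = np.array([1.0, -0.002, 0.05, -0.05])
    particles = prior_mean + rng.normal(
        scale=[0.05, 0.001, 0.02, 0.02], size=(n_part, 4))

    def bmc_update(particles, cycles, observed, sigma=0.01):
        """Reweight and resample particles by the Gaussian likelihood of the
        monitored capacities -- a simple importance-resampling Monte Carlo step."""
        pred = np.array([capacity(cycles, th) for th in particles])
        loglik = -0.5 * np.sum((pred - observed) ** 2, axis=1) / sigma ** 2
        w = np.exp(loglik - loglik.max())
        w /= w.sum()
        idx = rng.choice(len(particles), size=len(particles), p=w)
        return particles[idx]

    def rul(theta, threshold=0.8, horizon=3000):
        """Cycles until predicted capacity first drops below the failure threshold."""
        ks = np.arange(horizon)
        q = capacity(ks, theta)
        below = np.nonzero(q < threshold * q[0])[0]
        return int(below[0]) if below.size else horizon

    # Simulated monitoring data from a "true" cell, then one posterior update.
    true_theta = np.array([1.0, -0.003, 0.04, -0.04])
    cycles = np.arange(0, 100, 10)
    obs = capacity(cycles, true_theta) + rng.normal(scale=0.005, size=cycles.size)
    posterior = bmc_update(particles, cycles, obs)
    ```

    As more capacity measurements arrive, repeating the update concentrates the particles on parameters that fit the observed fade, which is how the RUL prediction improves over time.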

  20. Development and External Validation of a Melanoma Risk Prediction Model Based on Self-assessed Risk Factors.

    PubMed

    Vuong, Kylie; Armstrong, Bruce K; Weiderpass, Elisabete; Lund, Eiliv; Adami, Hans-Olov; Veierod, Marit B; Barrett, Jennifer H; Davies, John R; Bishop, D Timothy; Whiteman, David C; Olsen, Catherine M; Hopper, John L; Mann, Graham J; Cust, Anne E; McGeechan, Kevin

    2016-08-01

    Identifying individuals at high risk of melanoma can optimize primary and secondary prevention strategies. This study aimed to develop and externally validate a risk prediction model for incident first-primary cutaneous melanoma using self-assessed risk factors. We used unconditional logistic regression to develop a multivariable risk prediction model. Relative risk estimates from the model were combined with Australian melanoma incidence and competing mortality rates to obtain absolute risk estimates. The risk prediction model was developed using the Australian Melanoma Family Study (629 cases and 535 controls) and externally validated using 4 independent population-based studies: the Western Australia Melanoma Study (511 case-control pairs), the Leeds Melanoma Case-Control Study (960 cases and 513 controls), the Epigene-QSkin Study (44 544 participants, of whom 766 had melanoma), and the Swedish Women's Lifestyle and Health Cohort Study (49 259 women, of whom 273 had melanoma). We validated model performance internally and externally by assessing discrimination using the area under the receiver operating characteristic curve (AUC). Additionally, using the Swedish Women's Lifestyle and Health Cohort Study, we assessed model calibration and clinical usefulness. The risk prediction model included hair color, nevus density, first-degree family history of melanoma, previous nonmelanoma skin cancer, and lifetime sunbed use. On internal validation, the AUC was 0.70 (95% CI, 0.67-0.73). On external validation, the AUC was 0.66 (95% CI, 0.63-0.69) in the Western Australia Melanoma Study, 0.67 (95% CI, 0.65-0.70) in the Leeds Melanoma Case-Control Study, 0.64 (95% CI, 0.62-0.66) in the Epigene-QSkin Study, and 0.63 (95% CI, 0.60-0.67) in the Swedish Women's Lifestyle and Health Cohort Study. Model calibration showed close agreement between predicted and observed numbers of incident melanomas across all deciles of predicted risk. In the external validation setting, there was higher net benefit when using the risk prediction model to classify individuals as high risk than when classifying all individuals as high risk. The melanoma risk prediction model performs well and may be useful in prevention interventions that rely on risk assessment from self-assessed risk factors.
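
    The step of combining a relative risk with population incidence and competing mortality can be sketched with discrete annual hazards. The rates below are purely illustrative, not actual Australian statistics, and the paper's exact actuarial formula may differ.

    ```python
    def absolute_risk(rel_risk, melanoma_rates, mortality_rates):
        """Cumulative absolute melanoma risk from an individual relative risk,
        annual population incidence, and annual competing mortality hazards."""
        surv, risk = 1.0, 0.0
        for h_mel, h_mort in zip(melanoma_rates, mortality_rates):
            h = h_mel * rel_risk
            risk += surv * h                     # melanoma this year, given alive
            surv *= (1.0 - h) * (1.0 - h_mort)   # survive both causes
        return risk

    # Illustrative constant rates (per person-year) over a 10-year horizon.
    mel = [0.0005] * 10
    mort = [0.01] * 10
    r_high = absolute_risk(3.0, mel, mort)   # individual with relative risk 3
    r_avg = absolute_risk(1.0, mel, mort)    # population-average individual
    ```

    Because higher-risk individuals are depleted faster by both causes, the absolute risk grows slightly less than proportionally with the relative risk.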
