Sample records for systematic model development

  1. Modeling Systematicity and Individuality in Nonlinear Second Language Development: The Case of English Grammatical Morphemes

    ERIC Educational Resources Information Center

    Murakami, Akira

    2016-01-01

    This article introduces two sophisticated statistical modeling techniques that allow researchers to analyze systematicity, individual variation, and nonlinearity in second language (L2) development. Generalized linear mixed-effects models can be used to quantify individual variation and examine systematic effects simultaneously, and generalized…
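    The core idea behind the mixed-effects approach described here, separating a shared systematic effect from learner-specific variation, can be illustrated with a minimal partial-pooling sketch in plain NumPy. The simulated learners, score values, and shrinkage formula below are illustrative assumptions, not Murakami's actual generalized linear mixed-effects model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate morpheme-accuracy scores for 20 learners, 10 observations each:
# a shared (systematic) mean plus learner-specific (individual) deviations.
n_learners, n_obs = 20, 10
true_mean = 0.7
learner_effects = rng.normal(0.0, 0.10, n_learners)   # individual variation
scores = (true_mean + learner_effects[:, None]
          + rng.normal(0.0, 0.15, (n_learners, n_obs)))

grand_mean = scores.mean()
learner_means = scores.mean(axis=1)                   # "no pooling" estimates

# Empirical-Bayes shrinkage: pull each learner mean toward the grand mean
# in proportion to within- vs between-learner variance (the mixed-model idea).
var_within = scores.var(axis=1, ddof=1).mean() / n_obs
var_between = max(learner_means.var(ddof=1) - var_within, 1e-9)
shrink = var_between / (var_between + var_within)
partial_pooled = grand_mean + shrink * (learner_means - grand_mean)

# Shrunken estimates are less spread out than raw per-learner means.
print(learner_means.std(), partial_pooled.std())
```

    A full mixed-effects analysis would estimate these variance components by (restricted) maximum likelihood and, for binary accuracy data, use a logistic link.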

  2. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews.

    PubMed

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost unanimously to solely depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Logic models have the potential to be an aid integral throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions.

  3. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews

    PubMed Central

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Background Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to ‘think’ conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. Methods and Findings In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost unanimously to solely depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Conclusions Logic models have the potential to be an aid integral throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions. PMID:26575182

  4. Policy-Relevant Systematic Reviews to Strengthen Health Systems: Models and Mechanisms to Support Their Production

    ERIC Educational Resources Information Center

    Oliver, Sandra; Dickson, Kelly

    2016-01-01

    Support for producing systematic reviews about health systems is less well developed than for those about clinical practice. From interviewing policy makers and systematic reviewers we identified institutional mechanisms which bring systematic reviews and policy priorities closer by harnessing organisational and individual motivations, emphasising…

  5. The Concept of C2 Communication and Information Support

    DTIC Science & Technology

    2004-06-01

    • Communication and information literacy, • Sensors: technology and systematic development as a branch, • Military prognosis research (combat models), intelligence, • Visualization of actions, suitable forms of information presentation, • Techniques of learning of CIS users, • Man-machine interface.

  6. SU-E-J-257: A PCA Model to Predict Adaptive Changes for Head&neck Patients Based On Extraction of Geometric Features From Daily CBCT Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, M; Henry Ford Health System, Detroit, MI; Siddiqui, F

    2015-06-15

    Purpose: Using daily cone beam CTs (CBCTs) to develop principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients and to assess the possibility of using these prospectively in adaptive radiation therapy (ART). Methods: Planning CT (pCT) images of 4 H&N patients were deformed to model several different systematic changes in patient anatomy during the course of the radiation therapy (RT). A Pinnacle plugin was used to linearly interpolate the systematic change in patient anatomy over the 35-fraction RT course and to generate a set of 35 synthetic CBCTs. Each synthetic CBCT represents the systematic change in patient anatomy for each fraction. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CBCTs, and random fraction-to-fraction changes were superimposed on the DVFs. A patient-specific PCA model was built using these DVFs containing systematic plus random changes. It was hypothesized that the resulting eigenDVFs (EDVFs) with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: For all 4 patients, the PCA model provided different results depending on the type and size of the systematic change in the patient's body. PCA was more successful in capturing the systematic changes early in the treatment course when these were of a larger scale with respect to the random fraction-to-fraction changes in the patient's anatomy. For smaller-scale systematic changes, random changes in the patient could completely "hide" the systematic change. Conclusion: The leading EDVF from the patient-specific PCA models could tentatively be identified as a major systematic change during treatment if the systematic change is large enough with respect to random fraction-to-fraction changes. Otherwise, the leading EDVF could not represent systematic changes reliably. This work is expected to facilitate development of population-based PCA models that can be used to prospectively identify significant anatomical changes early in treatment. This work is supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
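    The PCA step described above can be sketched with synthetic data: stack per-fraction deformation vector fields as rows of a matrix, mean-center, and take the leading principal component. The toy DVFs below (a linearly growing systematic deformation plus random fraction-to-fraction noise) are assumptions for illustration only, not the paper's Pinnacle-generated data:

```python
import numpy as np

rng = np.random.default_rng(1)

n_fractions, n_voxels = 35, 300               # e.g. 100 voxels x 3 components
systematic = rng.normal(0.0, 1.0, n_voxels)   # fixed anatomical-change direction

# DVF for fraction t = (t / n) * systematic (progressive change) + random noise.
t = np.arange(1, n_fractions + 1)[:, None] / n_fractions
dvfs = t * systematic + rng.normal(0.0, 0.3, (n_fractions, n_voxels))

# PCA via SVD of the mean-centered DVF matrix.
centered = dvfs - dvfs.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
eigen_dvf = Vt[0]                             # leading "eigenDVF"
explained = S**2 / np.sum(S**2)               # variance explained per component

# The leading eigenDVF should align with the planted systematic direction.
alignment = abs(np.dot(eigen_dvf, systematic)) / np.linalg.norm(systematic)
print(explained[0], alignment)
```

    Shrinking the planted deformation relative to the noise reproduces the paper's observation that random changes can "hide" the systematic component.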

  7. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended in building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the probability estimates for predicting postoperative pancreatic fistula.
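    The events-per-variable (EPV) criterion flagged above is simple to check. A small helper, with hypothetical event and predictor counts for illustration, applies the conventional threshold of roughly 10 events per candidate predictor:

```python
def events_per_variable(n_events: int, n_candidate_predictors: int) -> float:
    """EPV = number of outcome events divided by candidate predictors considered."""
    return n_events / n_candidate_predictors

def epv_adequate(n_events: int, n_candidate_predictors: int,
                 threshold: float = 10.0) -> bool:
    # Rule of thumb for logistic prediction models: at least ~10 events
    # per candidate predictor to limit overfitting.
    return events_per_variable(n_events, n_candidate_predictors) >= threshold

# Hypothetical study: 45 fistula events, 9 candidate predictors -> EPV = 5.
print(events_per_variable(45, 9), epv_adequate(45, 9))
```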

  8. [Master trainer concept "structured specialist further education" : A joint project of the German Professional Associations of Internal Medicine, Surgeons and Orthopedic/Trauma Surgeons].

    PubMed

    Siebolds, M; Ansorg, J; Dittmar, R; Hennes, N; Radau, T; Ruff, S; Denkinger, M D

    2017-10-01

    The quality requirements for the practice of postgraduate medical further education, below the level of the formal further education regulations, are a barely developed scientific field in Germany, and systematic use of internationally accepted scientific evidence barely exists. This research and development project was initiated in 2001 in order to implement a practical but evidence-based model compatible with the existing structure of postgraduate medical education. The project has been supported since 2013 by the Professional Associations of Internal Medicine (BDI), Surgeons (BDC), and Orthopedic and Trauma Surgeons (BVOU). The development phase of this complex intervention was based on three stages: stakeholder interviews with relevant groups, identification of a theoretical model for the construction, and systematic literature reviews to identify the relevant evidence. The basic model for structured specialist further education includes the creation and implementation of a simple core curriculum for every department, a tool for systematic feedback within the framework of the annual further education interviews, and a simple clinical assessment to evaluate the actual clinical performance of physicians in further education. A pilot test of this model was carried out in 150 specialist departments in Germany, and the model is being continually developed. The project shows that such a program can be systematically developed and piloted. The central implementation problems involve the traditional, informal further education culture, which as a rule provides neither a systematic elicitation of the state of learning, continuously distributed over the whole period of further education, nor practical testing of competence development.

  9. Systematic Treatment Selection (STS): A Review and Future Directions

    ERIC Educational Resources Information Center

    Nguyen, Tam T.; Bertoni, Matteo; Charvat, Mylea; Gheytanchi, Anahita; Beutler, Larry E.

    2007-01-01

    Systematic Treatment Selection (STS) is a form of technical eclecticism that develops and plans treatments using empirically founded principles of psychotherapy. It is a model that provides systematic guidelines for the utilization of different psychotherapeutic strategies based on patient qualities and problem characteristics. Historically, it…

  10. Systematic review of prediction models for delirium in the older adult inpatient.

    PubMed

    Lindroth, Heidi; Bratzke, Lisa; Purvis, Suzanne; Brown, Roger; Coburn, Mark; Mrkobrada, Marko; Chan, Matthew T V; Davis, Daniel H J; Pandharipande, Pratik; Carlsson, Cynthia M; Sanders, Robert D

    2018-04-28

    To identify existing prognostic delirium prediction models and evaluate their validity and statistical methodology in the older adult (≥60 years) acute hospital population. Systematic review. PubMed, CINAHL, PsycINFO, SocINFO, Cochrane, Web of Science and Embase were searched from 1 January 1990 to 31 December 2016. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses and the CHARMS Statement guided protocol development. Inclusion criteria: age ≥60 years, inpatient, developed/validated a prognostic delirium prediction model. Exclusion criteria: alcohol-related delirium, sample size ≤50. The primary performance measures were calibration and discrimination statistics. Two authors independently conducted the search and extracted data. The synthesis of data was done by the first author. Disagreement was resolved by the mentoring author. The initial search resulted in 7,502 studies. Following full-text review of 192 studies, 33 were excluded based on age criteria (<60 years) and 27 met the defined criteria. Twenty-three delirium prediction models were identified; 14 were externally validated and 3 were internally validated. The following populations were represented: 11 medical, 3 medical/surgical and 13 surgical. The assessment of delirium was often non-systematic, resulting in varied incidence. The 14 externally validated models had areas under the receiver operating curve ranging from 0.52 to 0.94. Limitations in design, data collection methods and model metric reporting statistics were identified. Delirium prediction models for older adults show variable and typically inadequate predictive capabilities. Our review highlights the need for development of robust models to predict delirium in older inpatients. We provide recommendations for the development of such models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
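    The discrimination statistic used above, the area under the receiver operating curve (c-statistic), can be computed directly as the proportion of case/non-case pairs that the model ranks correctly. The predicted risks and outcomes below are toy values for illustration:

```python
import numpy as np

def c_statistic(risks, outcomes):
    """AUC as the probability that a randomly chosen case receives a higher
    predicted risk than a randomly chosen non-case (ties count 1/2)."""
    risks = np.asarray(risks, dtype=float)
    outcomes = np.asarray(outcomes, dtype=int)
    pos = risks[outcomes == 1]                    # cases (e.g. delirium)
    neg = risks[outcomes == 0]                    # non-cases
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical predicted delirium risks and observed outcomes.
auc = c_statistic([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1])
print(auc)  # 0.75: 3 of 4 case/non-case pairs ranked correctly
```

    An AUC of 0.5 corresponds to chance ranking, which is why a reported range starting at 0.52 signals essentially uninformative models at the low end.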

  11. Systematic Development of Intelligent Systems for Public Road Transport.

    PubMed

    García, Carmelo R; Quesada-Arencibia, Alexis; Cristóbal, Teresa; Padrón, Gabino; Alayón, Francisco

    2016-07-16

    This paper presents an architecture model for the development of intelligent systems for public passenger transport by road. The main objective of our proposal is to provide a framework for the systematic development and deployment of telematics systems to improve various aspects of this type of transport, such as efficiency, accessibility and safety. The architecture model presented herein is based on international standards on intelligent transport system architectures, ubiquitous computing and service-oriented architecture for distributed systems. To illustrate the utility of the model, we also present a use case of a monitoring system for stops on a public passenger road transport network.

  12. Systematic Development of Intelligent Systems for Public Road Transport

    PubMed Central

    García, Carmelo R.; Quesada-Arencibia, Alexis; Cristóbal, Teresa; Padrón, Gabino; Alayón, Francisco

    2016-01-01

    This paper presents an architecture model for the development of intelligent systems for public passenger transport by road. The main objective of our proposal is to provide a framework for the systematic development and deployment of telematics systems to improve various aspects of this type of transport, such as efficiency, accessibility and safety. The architecture model presented herein is based on international standards on intelligent transport system architectures, ubiquitous computing and service-oriented architecture for distributed systems. To illustrate the utility of the model, we also present a use case of a monitoring system for stops on a public passenger road transport network. PMID:27438836

  13. Remodeling of legacy systems in health care using UML.

    PubMed

    Garde, Sebastian; Knaup, Petra; Herold, Ralf

    2002-01-01

    Research projects in the field of Medical Informatics often involve the development of application systems. Usually they are developed over a longer period of time, so that at a certain point of time a systematically planned reimplementation is necessary. The first step of reimplementation should be a systematic and comprehensive remodeling. When using UML for this task a systematic approach for remodeling activities is missing. Therefore, we developed a method for remodeling of legacy systems (Qumquad) and applied it to DOSPO, a documentation and therapy planning system for pediatric oncology. Qumquad helps to systematically carry out three steps: the modeling of the current actual state of the application system, the systematic identification of weak points and the development of a target concept for reimplementation considering the identified weak points. Results show that this approach is valuable and feasible and could be applied to various application systems in health care.

  14. A Model of the Temporal Dynamics of Knowledge Brokerage in Sustainable Development

    ERIC Educational Resources Information Center

    Hukkinen, Janne I.

    2016-01-01

    I develop a conceptual model of the temporal dynamics of knowledge brokerage for sustainable development. Brokerage refers to efforts to make research and policymaking more accessible to each other. The model enables unbiased and systematic consideration of knowledge brokerage as part of policy evolution. The model is theoretically grounded in…

  15. [Information system for supporting the Nursing Care Systematization].

    PubMed

    Malucelli, Andreia; Otemaier, Kelly Rafaela; Bonnet, Marcel; Cubas, Marcia Regina; Garcia, Telma Ribeiro

    2010-01-01

    The importance, relevance, and necessity of implementing the Nursing Care Systematization in the different environments of professional practice are unquestionable. With this as a starting point, the motivation emerged to develop an information system to support the Nursing Care Systematization, based on the steps of the Nursing Process and on Human Needs, using the language of nursing diagnoses, interventions, and outcomes for documentation of professional practice. This paper describes the methodological steps and results of the information system's development: requirements elicitation, modeling, object-relational mapping, implementation, and system validation.

  16. Prognostic models for complete recovery in ischemic stroke: a systematic review and meta-analysis.

    PubMed

    Jampathong, Nampet; Laopaiboon, Malinee; Rattanakanokchai, Siwanon; Pattanittum, Porjai

    2018-03-09

    Prognostic models have been increasingly developed to predict complete recovery in ischemic stroke. However, questions arise about the performance characteristics of these models. The aim of this study was to systematically review and synthesize performance of existing prognostic models for complete recovery in ischemic stroke. We searched journal publications indexed in PUBMED, SCOPUS, CENTRAL, ISI Web of Science and OVID MEDLINE from inception until 4 December, 2017, for studies designed to develop and/or validate prognostic models for predicting complete recovery in ischemic stroke patients. Two reviewers independently examined titles and abstracts, and assessed whether each study met the pre-defined inclusion criteria and also independently extracted information about model development and performance. We evaluated validation of the models by medians of the area under the receiver operating characteristic curve (AUC) or c-statistic and calibration performance. We used a random-effects meta-analysis to pool AUC values. We included 10 studies with 23 models developed from elderly patients with a moderately severe ischemic stroke, mainly in three high income countries. Sample sizes for each study ranged from 75 to 4441. Logistic regression was the only analytical strategy used to develop the models. The number of various predictors varied from one to 11. Internal validation was performed in 12 models with a median AUC of 0.80 (95% CI 0.73 to 0.84). One model reported good calibration. Nine models reported external validation with a median AUC of 0.80 (95% CI 0.76 to 0.82). Four models showed good discrimination and calibration on external validation. The pooled AUC of the two validation models of the same developed model was 0.78 (95% CI 0.71 to 0.85). The performance of the 23 models found in the systematic review varied from fair to good in terms of internal and external validation. 
Further models should be developed with internal and external validation in low and middle income countries.
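    The pooling of validation AUCs described above can be sketched with a random-effects meta-analysis using the DerSimonian–Laird estimator. The two AUC estimates and standard errors below are invented for illustration, not the values from the review:

```python
import numpy as np

def dersimonian_laird(estimates, std_errors):
    """Random-effects pooled estimate and 95% CI (DerSimonian-Laird tau^2)."""
    theta = np.asarray(estimates, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se**2                              # fixed-effect weights
    theta_fe = np.sum(w * theta) / np.sum(w)
    q = np.sum(w * (theta - theta_fe) ** 2)      # Cochran's Q
    df = len(theta) - 1
    # Between-study variance, truncated at zero.
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
    pooled = np.sum(w_re * theta) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Two hypothetical external-validation AUCs of the same developed model.
pooled, ci = dersimonian_laird([0.74, 0.82], [0.03, 0.04])
print(round(pooled, 3), ci)
```

    With only two studies the tau² estimate is unstable, which is one reason pooled values like the review's 0.78 carry wide confidence intervals.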

  17. The human placental perfusion model: a systematic review and development of a model to predict in vivo transfer of therapeutic drugs.

    PubMed

    Hutson, J R; Garcia-Bournissen, F; Davis, A; Koren, G

    2011-07-01

    Dual perfusion of a single placental lobule is the only experimental model to study human placental transfer of substances in organized placental tissue. To date, there has not been any attempt at a systematic evaluation of this model. The aim of this study was to systematically evaluate the perfusion model in predicting placental drug transfer and to develop a pharmacokinetic model to account for nonplacental pharmacokinetic parameters in the perfusion results. In general, the fetal-to-maternal drug concentration ratios matched well between placental perfusion experiments and in vivo samples taken at the time of delivery of the infant. After modeling for differences in maternal and fetal/neonatal protein binding and blood pH, the perfusion results were able to accurately predict in vivo transfer at steady state (R² = 0.85, P < 0.0001). Placental perfusion experiments can be used to predict placental drug transfer when adjusting for extra parameters and can be useful for assessing drug therapy risks and benefits in pregnancy.

  18. Flipping the classroom to teach systematic reviews: the development of a continuing education course for librarians*

    PubMed Central

    Conte, Marisa L.; MacEachern, Mark P.; Mani, Nandita S.; Townsend, Whitney A.; Smith, Judith E.; Masters, Chase; Kelley, Caitlin

    2015-01-01

    Objective: The researchers used the flipped classroom model to develop and conduct a systematic review course for librarians. Setting: The research took place at an academic health sciences library. Method: A team of informationists developed and conducted a pilot course. Assessment informed changes to both course components; a second course addressed gaps in the pilot. Main Results: Both the pilot and subsequent course received positive reviews. Changes based on assessment data will inform future iterations. Conclusion: The flipped classroom model can be successful in developing and implementing a course that is well rated by students. PMID:25918484

  19. Discovering a Gold Mine of Strategies for At-Risk Students through Systematic Staff Development.

    ERIC Educational Resources Information Center

    Bernal, Jesse R.; Villarreal, Diana

    This paper discusses an effective model of systematic staff development focusing on prevention and intervention strategies used with at-risk students. The following are key elements: (1) matching of the purposes of training to the goals of the school districts; (2) multiple and integrated activities; (3) participants' thorough orientation to the…

  20. How can animal models inform on the transition to chronic symptoms in whiplash?

    PubMed Central

    Winkelstein, Beth A.

    2011-01-01

    Study Design A non-systematic review of the literature. Objective The objective was to present a general schema for the mechanisms of whiplash pain and to review the role of animal models in understanding the development of chronic pain from whiplash injury. Summary of Background Data Extensive biomechanical and clinical studies of whiplash have been performed to understand the injury mechanisms and symptoms of whiplash injury. However, only recently have animal models of this painful disorder been developed, based on other pain models in the literature. Methods A non-systematic review was performed and findings were integrated to formulate a generalized picture of the mechanisms by which chronic whiplash pain develops from mechanical tissue injuries. Results The development of chronic pain from tissue injuries in the neck due to whiplash involves complex interactions between the injured tissue and spinal neuroimmune circuits. A variety of animal models are beginning to define these mechanisms. Conclusion Continued work is needed in developing appropriate animal models to investigate chronic pain from whiplash injuries, and care must be taken to determine whether such models aim to model the injury event or the pain symptom. PMID:22020616
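    Not applicable — no code is added for this narrative review record; see the surrounding methodological records for worked examples.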

  1. Collaborative Modeling: Experience of the U.S. Preventive Services Task Force.

    PubMed

    Petitti, Diana B; Lin, Jennifer S; Owens, Douglas K; Croswell, Jennifer M; Feuer, Eric J

    2018-01-01

    Models can be valuable tools to address uncertainty, trade-offs, and preferences when trying to understand the effects of interventions. The availability of results from two or more independently developed models that examine the same question (comparative modeling) allows systematic exploration of differences between models and of the effect of these differences on model findings. Guideline groups sometimes commission comparative modeling to support their recommendation process. In this commissioned collaborative modeling, modelers work with the people who are developing a recommendation or policy not only to define the questions to be addressed but also, ideally, to work side by side with each other and with systematic reviewers to standardize selected inputs and incorporate selected common assumptions. This paper describes the use of commissioned collaborative modeling by the U.S. Preventive Services Task Force (USPSTF), highlighting the general challenges and opportunities encountered and specific challenges for some topics. It delineates other approaches to using modeling to support evidence-based recommendations and the many strengths of collaborative modeling compared with other approaches. Unlike systematic reviews prepared for the USPSTF, the commissioned collaborative modeling reports used by the USPSTF in making recommendations about screening have not been required to follow a common format, sometimes making it challenging to understand key model features. This paper presents a checklist developed to critically appraise commissioned collaborative modeling reports about cancer screening topics prepared for the USPSTF. Copyright © 2017 American Journal of Preventive Medicine. All rights reserved.

  2. Predictive Microbiology and Food Safety Applications

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is the science of systematic study of recurrent events or phenomena. When models are properly developed, their applications may save costs and time. For microbial food safety research and applications, predictive microbiology models may be developed based on the fact that most ...
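    A typical primary model in predictive microbiology is the modified Gompertz curve for log bacterial counts over time (Zwietering parameterization). The parameter values below, initial count, asymptote, growth rate, and lag time, are illustrative assumptions, not values from the manuscript:

```python
import math

def gompertz_log_count(t, n0=3.0, a=6.0, mu=0.5, lam=4.0):
    """Modified Gompertz (Zwietering) model for log10 bacterial count.

    t   : time (h)
    n0  : initial log10 count
    a   : asymptotic increase in log10 count
    mu  : maximum specific growth rate (log10 units/h)
    lam : lag time (h)
    """
    return n0 + a * math.exp(-math.exp(mu * math.e / a * (lam - t) + 1.0))

# Counts stay near n0 during the lag phase, then rise toward n0 + a.
early = gompertz_log_count(0.0)
late = gompertz_log_count(48.0)
print(round(early, 2), round(late, 2))
```

    Fitting such a curve to challenge-study data yields the growth parameters that safety applications then use to predict, say, time to reach an unacceptable count.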

  3. Conceptualising paediatric health disparities: a metanarrative systematic review and unified conceptual framework.

    PubMed

    Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J

    2017-08-04

    There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities, even though these have been linked to the prevalence of adult health disparities, including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform the development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into a unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Development of Prototype Driver Models for Highway Design: Research Update

    DOT National Transportation Integrated Search

    1999-06-01

    One of the high-priority research areas of the Federal Highway Administration (FHWA) is the development of the Interactive Highway Safety Design Model (IHSDM). The goal of the IHSDM research program is to develop a systematic approach that will allow...

  5. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully utilize these benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features and variability, and designing quality services. In this paper, we present a systematic process for developing high-quality SaaS and highlight the essential role of commonality and variability (C&V) modeling in maximizing reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process accompanied by engineering instructions. Using the proposed process, SaaS services with high quality can be developed effectively.

  6. A systematic literature review of the key challenges for developing the structure of public health economic models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-04-01

    To identify the key methodological challenges for public health economic modelling and set an agenda for future research. An iterative literature search identified papers describing methodological challenges for developing the structure of public health economic models. Additional multidisciplinary literature searches helped expand upon important ideas raised within the review. Fifteen articles were identified within the formal literature search, highlighting three key challenges: inclusion of non-healthcare costs and outcomes; inclusion of equity; and modelling complex systems and multi-component interventions. Based upon these and multidisciplinary searches about dynamic complexity, the social determinants of health, and models of human behaviour, six areas for future research were specified. Future research should focus on: the use of systems approaches within health economic modelling; approaches to assist the systematic consideration of the social determinants of health; methods for incorporating models of behaviour and social interactions; consideration of equity; and methodology to help modellers develop valid, credible and transparent public health economic model structures.

  7. Complete Systematic Error Model of SSR for Sensor Registration in ATC Surveillance Networks

    PubMed Central

    Besada, Juan A.

    2017-01-01

    In this paper, a complete and rigorous mathematical model for secondary surveillance radar systematic errors (biases) is developed. The model takes into account the physical effects systematically affecting the measurement processes. The azimuth biases are calculated from the physical error of the antenna calibration and the errors of the angle-determination device. Distance bias is calculated from the delay of the signal produced by the refractivity index of the atmosphere, and from clock errors, while the altitude bias is calculated taking into account the atmosphere conditions (pressure and temperature). It will be shown, using simulated and real data, that adapting a classical bias estimation process to use the complete parametrized model results in improved accuracy in the bias estimation. PMID:28934157
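
    The parametrized-bias idea can be sketched in miniature (invented parameter values and a simplified correction, not the paper's full model): once the systematic terms are estimated, they are simply subtracted from each raw measurement.

```python
import math

def correct_ssr_measurement(rho, theta, clock_bias_m, tropo_delay_m,
                            azimuth_offset_rad):
    """Remove parametrized systematic errors from one raw SSR measurement.

    rho: measured slant range (m); theta: measured azimuth (rad).
    The bias terms are illustrative stand-ins for the paper's full model
    (clock error, tropospheric delay, antenna/encoder azimuth offset).
    """
    rho_corrected = rho - clock_bias_m - tropo_delay_m
    theta_corrected = (theta - azimuth_offset_rad) % (2 * math.pi)
    return rho_corrected, theta_corrected

# A target measured at 100 km, with a 150 m clock bias, 30 m tropospheric
# delay, and a 2 mrad azimuth offset (all hypothetical values):
rho_c, theta_c = correct_ssr_measurement(100_000.0, 0.5, 150.0, 30.0, 0.002)
```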

  8. Nonbleeding adenomas: Evidence of systematic false-negative fecal immunochemical test results and their implications for screening effectiveness-A modeling study.

    PubMed

    van der Meulen, Miriam P; Lansdorp-Vogelaar, Iris; van Heijningen, Else-Mariëtte B; Kuipers, Ernst J; van Ballegooijen, Marjolein

    2016-06-01

    If some adenomas do not bleed over several years, they will cause systematic false-negative fecal immunochemical test (FIT) results. The long-term effectiveness of FIT screening has been estimated without accounting for such systematic false-negativity. There are now data with which to evaluate this issue. The authors developed one microsimulation model (MISCAN [MIcrosimulation SCreening ANalysis]-Colon) without systematic false-negative FIT results and one model that allowed a percentage of adenomas to be systematically missed in successive FIT screening rounds. Both variants were adjusted to reproduce the first-round findings of the Dutch CORERO FIT screening trial. The authors then compared simulated detection rates in the second screening round with those observed, and adjusted the simulated percentage of systematically missed adenomas to those data. Finally, the authors calculated the impact of systematic false-negative FIT results on the effectiveness of repeated FIT screening. The model without systematic false-negativity simulated higher detection rates in the second screening round than observed. These observed rates could be reproduced when assuming that FIT systematically missed 26% of advanced and 73% of nonadvanced adenomas. To reduce the false-positive rate in the second round to the observed level, the authors also had to assume that 30% of false-positive findings were systematically false-positive. Systematic false-negative FIT testing limits the long-term reduction achieved by biennial FIT screening in colorectal cancer incidence (35.6% vs 40.9%) and mortality (55.2% vs 59.0%) among participants. The results of the current study provide convincing evidence based on the combination of real-life and modeling data that a percentage of adenomas are systematically missed by repeat FIT screening. This impairs the efficacy of FIT screening. Cancer 2016;122:1680-8. © 2016 American Cancer Society.
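
    The distinction between random and systematic false negatives can be sketched with a toy simulation (illustrative parameters only; this is not the MISCAN-Colon model): adenomas that are systematically FIT-negative are never caught in later rounds, so second-round detection falls well below what a random-miss model predicts.

```python
import random

def simulate_detection(n_adenomas, sensitivity, frac_systematic, rounds, seed=1):
    """Toy cohort: a fraction of adenomas never test positive (systematic
    false negatives); the rest are missed independently each round.
    Returns detections per round. Parameters are illustrative, not MISCAN's."""
    rng = random.Random(seed)
    remaining = [rng.random() < frac_systematic for _ in range(n_adenomas)]
    counts = []
    for _ in range(rounds):
        detected, still = 0, []
        for is_systematic in remaining:
            if not is_systematic and rng.random() < sensitivity:
                detected += 1            # found and removed this round
            else:
                still.append(is_systematic)
        counts.append(detected)
        remaining = still
    return counts

# 73% systematically missed (the nonadvanced-adenoma estimate) vs. none:
with_sys = simulate_detection(10_000, 0.8, 0.73, 2)
no_sys = simulate_detection(10_000, 0.8, 0.0, 2)
```

    Comparing the two arms round by round reproduces the qualitative finding: the systematic-miss model detects fewer adenomas in every round, which is why ignoring this effect overstates long-term screening benefit.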

  9. AQUATIC TOXICITY MODE OF ACTION STUDIES APPLIED TO QSAR DEVELOPMENT

    EPA Science Inventory

    A series of QSAR models for predicting fish acute lethality were developed using systematically collected data on more than 600 chemicals. These models were developed based on the assumption that chemicals producing toxicity through a common mechanism will have commonality in the...

  10. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
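
    The statistical-analysis step can be sketched as follows (a minimal stand-in for the framework's metrics, with hypothetical measurements): quantify agreement between field measurements and model output rather than relying on a visual check.

```python
import math

def validation_metrics(measured, simulated):
    """Quantify load-model accuracy with two standard statistics:
    root-mean-square error and the coefficient of determination (R^2).
    A simplified stand-in for the paper's statistical analysis."""
    n = len(measured)
    residuals = [m - s for m, s in zip(measured, simulated)]
    rmse = math.sqrt(sum(r * r for r in residuals) / n)
    mean_m = sum(measured) / n
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    ss_res = sum(r * r for r in residuals)
    r2 = 1.0 - ss_res / ss_tot
    return rmse, r2

# Hypothetical field measurements (MW) vs. simulated load response:
measured = [100.0, 102.0, 98.0, 101.0]
simulated = [99.0, 103.0, 97.0, 100.0]
rmse, r2 = validation_metrics(measured, simulated)
```

    Reporting numbers like these, with confidence intervals over many disturbance events, is what turns a qualitative "looks close" into the quantitative confidence level the framework calls for.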

  11. Systematic Assessment for University Sexuality Programming.

    ERIC Educational Resources Information Center

    Westefeld, John S.; Winkelpleck, Judy M.

    1982-01-01

    Suggests systematic empirical assessment is needed to plan university sexuality programming. Proposes the traditional approach of asking about students' attitudes, knowledge, and behavior is useful for developing specific programming content. Presents an assessment model emphasizing assessment of students' desires for sexuality programming in terms…

  12. Design, Development, and Initial Evaluation of a Terminology for Clinical Decision Support and Electronic Clinical Quality Measurement.

    PubMed

    Lin, Yanhua; Staes, Catherine J; Shields, David E; Kandula, Vijay; Welch, Brandon M; Kawamoto, Kensaku

    2015-01-01

    When coupled with a common information model, a common terminology for clinical decision support (CDS) and electronic clinical quality measurement (eCQM) could greatly facilitate the distributed development and sharing of CDS and eCQM knowledge resources. To enable such scalable knowledge authoring and sharing, we systematically developed an extensible and standards-based terminology for CDS and eCQM in the context of the HL7 Virtual Medical Record (vMR) information model. The development of this terminology entailed three steps: (1) systematic, physician-curated concept identification from sources such as the Health Information Technology Standards Panel (HITSP) and the SNOMED-CT CORE problem list; (2) concept de-duplication leveraging the Unified Medical Language System (UMLS) MetaMap and Metathesaurus; and (3) systematic concept naming using standard terminologies and heuristic algorithms. This process generated 3,046 concepts spanning 68 domains. Evaluation against representative CDS and eCQM resources revealed approximately 50-70% concept coverage, indicating the need for continued expansion of the terminology.
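
    Step (2), concept de-duplication, can be caricatured in a few lines (a naive normalization sketch; the actual work leveraged UMLS MetaMap and the Metathesaurus, which resolve synonymy and inflection far beyond this):

```python
def deduplicate_concepts(concepts):
    """Naive stand-in for UMLS-based de-duplication: collapse concepts whose
    normalized names (lowercase, sorted tokens) coincide, keeping the first
    surface form encountered."""
    seen = {}
    for name in concepts:
        key = " ".join(sorted(name.lower().split()))
        seen.setdefault(key, name)
    return list(seen.values())

unique = deduplicate_concepts(
    ["Diabetes Mellitus", "diabetes mellitus", "Mellitus Diabetes", "Asthma"]
)
```

    Even this crude keying collapses case and word-order variants; mapping to UMLS concept identifiers additionally merges true synonyms that share no surface tokens.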

  13. Design, Development, and Initial Evaluation of a Terminology for Clinical Decision Support and Electronic Clinical Quality Measurement

    PubMed Central

    Lin, Yanhua; Staes, Catherine J; Shields, David E; Kandula, Vijay; Welch, Brandon M; Kawamoto, Kensaku

    2015-01-01

    When coupled with a common information model, a common terminology for clinical decision support (CDS) and electronic clinical quality measurement (eCQM) could greatly facilitate the distributed development and sharing of CDS and eCQM knowledge resources. To enable such scalable knowledge authoring and sharing, we systematically developed an extensible and standards-based terminology for CDS and eCQM in the context of the HL7 Virtual Medical Record (vMR) information model. The development of this terminology entailed three steps: (1) systematic, physician-curated concept identification from sources such as the Health Information Technology Standards Panel (HITSP) and the SNOMED-CT CORE problem list; (2) concept de-duplication leveraging the Unified Medical Language System (UMLS) MetaMap and Metathesaurus; and (3) systematic concept naming using standard terminologies and heuristic algorithms. This process generated 3,046 concepts spanning 68 domains. Evaluation against representative CDS and eCQM resources revealed approximately 50–70% concept coverage, indicating the need for continued expansion of the terminology. PMID:26958220

  14. In Search of the Elusive ADDIE Model.

    ERIC Educational Resources Information Center

    Molenda, Michael

    2003-01-01

    Discusses the origin of the ADDIE model of instructional design and concludes that the term came into use by word of mouth as a label for the whole family of systematic instructional development models. Examines the underlying ideas behind the acronym analysis, design, development, implementation, and evaluation. (Author/LRW)

  15. The Use of Problem-Solving Techniques to Develop Semiotic Declarative Knowledge Models about Magnetism and Their Role in Learning for Prospective Science Teachers

    ERIC Educational Resources Information Center

    Ismail, Yilmaz

    2016-01-01

    This study aims to develop a semiotic declarative knowledge model, which is a positive constructive behavior model that systematically facilitates understanding in order to ensure that learners think accurately and ask the right questions about a topic. The data used to develop the experimental model were obtained using four measurement tools…

  16. Incorporation of habitat information in the development of indices of larval bluefin tuna (Thunnus thynnus) in the Western Mediterranean Sea (2001-2005 and 2012-2013)

    NASA Astrophysics Data System (ADS)

    Ingram, G. Walter; Alvarez-Berastegui, Diego; Reglero, Patricia; Balbín, Rosa; García, Alberto; Alemany, Francisco

    2017-06-01

    Fishery-independent indices of bluefin tuna larvae in the Western Mediterranean Sea are presented utilizing ichthyoplankton survey data collected from 2001 through 2005 and 2012 through 2013. Indices were developed using larval catch rates collected using two different types of bongo sampling, by first standardizing catch rates by gear/fishing-style and then employing a delta-lognormal modeling approach. The delta-lognormal models were developed three ways: 1) a basic larval index including the following covariates: time of day, a systematic geographic area variable, month and year; 2) a standard environmental larval index including the following covariates: mean water temperature over the mixed layer depth, mean salinity over the mixed layer depth, geostrophic velocity, time of day, a systematic geographic area variable, month and year; and 3) a habitat-adjusted larval index including the following covariates: a potential habitat variable, time of day, a systematic geographic area variable, month and year. Results indicated that all three model types had similar precision in index values. However, the habitat-adjusted larval index demonstrated a high correlation with estimates of spawning stock biomass from the previous stock assessment model, and, therefore, is recommended as a tuning index in future stock assessment models.
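
    The delta-lognormal idea combines two components: the probability of a positive tow and the mean catch given a positive tow. A bare-bones sketch without covariates (the published indices model both parts with GLMs including the covariates listed above):

```python
import math

def delta_lognormal_index(catch_rates):
    """Two-part (delta-lognormal) index for a single stratum/year:
    the proportion of positive tows times the back-transformed mean of
    log(catch) over positive tows. Covariate-free toy version."""
    n = len(catch_rates)
    positives = [c for c in catch_rates if c > 0]
    if not positives:
        return 0.0
    p_pos = len(positives) / n
    mean_log = sum(math.log(c) for c in positives) / len(positives)
    return p_pos * math.exp(mean_log)

# Hypothetical larval catch rates for five tows (many tows catch nothing):
index = delta_lognormal_index([0.0, 0.0, 1.0, math.e, math.e ** 2])
```

    Splitting the zeros from the positive catches in this way is exactly what makes the delta approach robust to the many empty tows typical of larval surveys.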

  17. Strong systematicity through sensorimotor conceptual grounding: an unsupervised, developmental approach to connectionist sentence processing

    NASA Astrophysics Data System (ADS)

    Jansen, Peter A.; Watter, Scott

    2012-03-01

    Connectionist language modelling typically has difficulty with syntactic systematicity, or the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word categories using a developmentally plausible infant-scale database of grounded sensorimotor conceptual representations, as well as a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar using correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour indicative of the general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream, outperforms previous contemporary models that contain primarily noun and verb word categories, and successfully generalises broadly to novel untrained sensorimotor grounded sentences composed of unfamiliar nouns and verbs. Limitations as well as implications to later grammar learning are discussed.

  18. Use of regularized principal component analysis to model anatomical changes during head and neck radiation therapy for treatment adaptation and response assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, Mikhail A., E-mail: chetvertkov@wayne

    2016-10-15

    Purpose: To develop standard (SPCA) and regularized (RPCA) principal component analysis models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients and assess their potential use in adaptive radiation therapy, and for extracting quantitative information for treatment response assessment. Methods: Planning CT images of ten H&N patients were artificially deformed to create “digital phantom” images, which modeled systematic anatomical changes during radiation therapy. Artificial deformations closely mirrored patients’ actual deformations and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms) and between pCT and clinical CBCTs. Patient-specific SPCA and RPCA models were built from these synthetic and clinical DVF sets. EigenDVFs (EDVFs) having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Results: Principal component analysis (PCA) models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade PCA’s ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. Conclusions: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical change during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in a treatment course, or based on population models.
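
    The SPCA step can be sketched in miniature (pure-Python power iteration on toy two-component "DVFs"; the study used full 3D deformation fields and, for RPCA, a regularized decomposition):

```python
def leading_eigendvf(dvfs, iters=200):
    """Standard PCA via power iteration: find the leading principal
    component (the dominant 'EigenDVF') of a set of flattened deformation
    vector fields. Pure-Python sketch for tiny examples only."""
    n, d = len(dvfs), len(dvfs[0])
    mean = [sum(v[j] for v in dvfs) / n for j in range(d)]
    centered = [[v[j] - mean[j] for j in range(d)] for v in dvfs]
    # d x d covariance matrix of the centered fields
    cov = [[sum(c[a] * c[b] for c in centered) / n
            for b in range(d)] for a in range(d)]
    vec = [1.0] * d
    for _ in range(iters):
        nxt = [sum(cov[a][b] * vec[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in nxt) ** 0.5
        vec = [x / norm for x in nxt]
    return vec

# Four fractions whose systematic change lies along the first component,
# with small random motion in the second (hypothetical numbers):
dvfs = [[1.0, 0.1], [2.0, -0.1], [3.0, 0.05], [4.0, -0.05]]
pc1 = leading_eigendvf(dvfs)
```

    The leading eigenvector aligns with the systematic trend and largely ignores the small random component, which is the behaviour the study exploits; RPCA additionally separates a sparse "random" term so that smaller systematic changes survive.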

  19. Qumquad: a UML-based approach for remodeling of legacy systems in health care.

    PubMed

    Garde, Sebastian; Knaup, Petra; Herold, Ralf

    2003-07-01

    Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems a thorough remodeling is indispensable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematic identification of weak points in this model and (iii) development of a target concept for the reimplementation considering the identified weak points. We applied Qumquad for remodeling a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.

  20. Systematic, theoretically-grounded development and feasibility testing of an innovative, preventive web-based game for children exposed to acute trauma

    PubMed Central

    Marsac, Meghan L.; Winston, Flaura K.; Hildenbrand, Aimee K.; Kohser, Kristen L.; March, Sonja; Kenardy, Justin; Kassam-Adams, Nancy

    2015-01-01

    Background Millions of children are affected by acute medical events annually, creating a need for resources to promote recovery. While web-based interventions promise wide reach and low cost for users, development can be time- and cost-intensive. A systematic approach to intervention development can help to minimize costs and increase likelihood of effectiveness. Using a systematic approach, our team integrated evidence on the etiology of traumatic stress, an explicit program theory, and a user-centered design process into intervention development. Objective To describe evidence and the program theory model applied to the Coping Coach intervention and present pilot data evaluating intervention feasibility and acceptability. Method Informed by empirical evidence on traumatic stress prevention, an overarching program theory model was articulated to delineate pathways from a) specific intervention content to b) program targets and proximal outcomes to c) key longer-term health outcomes. Systematic user-testing with children ages 8–12 (N = 42) exposed to an acute medical event and their parents was conducted throughout intervention development. Results Functionality challenges in early prototypes necessitated revisions. Child engagement was positive throughout revisions to the Coping Coach intervention. Final pilot-testing demonstrated promising feasibility and high user-engagement and satisfaction. Conclusion Applying a systematic approach to the development of Coping Coach led to the creation of a functional intervention that is accepted by children and parents. Development of new e-health interventions may benefit from a similar approach. Future research should evaluate the efficacy of Coping Coach in achieving targeted outcomes of reduced trauma symptoms and improved health-related quality of life. PMID:25844276

  1. The Regional Climate Model Evaluation System: A Systematic Evaluation Of CORDEX Simulations Using Obs4MIPs

    NASA Astrophysics Data System (ADS)

    Goodman, A.; Lee, H.; Waliser, D. E.; Guttowski, W.

    2017-12-01

    Observation-based evaluations of global climate models (GCMs) have been a key element for identifying systematic model biases that can be targeted for model improvements and for establishing uncertainty associated with projections of global climate change. However, GCMs are limited in their ability to represent physical phenomena which occur on smaller, regional scales, including many types of extreme weather events. In order to help facilitate projections of changes in such phenomena, simulations from regional climate models (RCMs) for 14 different domains around the world are being provided by the Coordinated Regional Climate Downscaling Experiment (CORDEX; www.cordex.org). However, although CORDEX specifies standard simulation and archiving protocols, these simulations are conducted independently by individual research and modeling groups representing each of these domains, often with different output requirements and data archiving and exchange capabilities. Thus, with respect to similar efforts using GCMs (e.g., the Coupled Model Intercomparison Project, CMIP), it is more difficult to achieve a standardized, systematic evaluation of the RCMs for each domain and across all the CORDEX domains. Using the Regional Climate Model Evaluation System (RCMES; rcmes.jpl.nasa.gov) developed at JPL, we are developing easy-to-use templates for performing systematic evaluations of CORDEX simulations. Results from the application of a number of evaluation metrics (e.g., biases, centered RMS, and pattern correlations) will be shown for a variety of physical quantities and CORDEX domains. These evaluations are performed using products from obs4MIPs, an activity initiated by DOE and NASA, and now shepherded by the World Climate Research Program's Data Advisory Council.
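
    The evaluation metrics named above (bias, centered RMS difference, pattern correlation) are the standard Taylor-diagram statistics, and can be computed directly (sketch with made-up gridpoint values):

```python
import math

def taylor_stats(model, obs):
    """Mean bias, centered RMS difference, and pattern correlation between
    flattened model and observation fields — the metric trio commonly used
    for systematic climate-model evaluation."""
    n = len(model)
    mbar = sum(model) / n
    obar = sum(obs) / n
    bias = mbar - obar
    m_anom = [m - mbar for m in model]
    o_anom = [o - obar for o in obs]
    crms = math.sqrt(sum((a - b) ** 2 for a, b in zip(m_anom, o_anom)) / n)
    num = sum(a * b for a, b in zip(m_anom, o_anom))
    den = math.sqrt(sum(a * a for a in m_anom) * sum(b * b for b in o_anom))
    corr = num / den
    return bias, crms, corr

# Three hypothetical gridpoints: the model doubles the observed values,
# so it is biased and over-amplified but perfectly pattern-correlated.
bias, crms, corr = taylor_stats([2.0, 4.0, 6.0], [1.0, 2.0, 3.0])
```

    Removing the means before the RMS step is what separates amplitude/pattern errors (centered RMS, correlation) from the overall offset (bias), so each metric isolates a different failure mode.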

  2. Governance for public health and health equity: The Tröndelag model for public health work.

    PubMed

    Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn SkJei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim

    2018-06-01

    Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacities of municipalities to work in a more systematic, knowledge-based and multi-sectoral way in promoting health and health equity in the population. Process evaluation was conducted, applying a mixed-methods research design, combining qualitative and quantitative data collection methods. Processes strengthening systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life, and health equity in, for and with municipalities were revealed. A step-by-step model that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies, has been developed. Implementation of systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and impact factors, anchoring in plans, new skills and methods for selection and implementation of measures, as well as the development of trust, ownership, and shared ethics and goals among those involved.

  3. Improved water resource management for a highly complex environment using three-dimensional groundwater modelling

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Affolter, Annette; Radny, Dirk; Dressmann, Horst; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario

    2018-02-01

    A three-dimensional groundwater model was used to improve water resource management for a study area in north-west Switzerland, where drinking-water production is close to former landfills and industrial areas. To avoid drinking-water contamination, artificial groundwater recharge with surface water is used to create a hydraulic barrier between the contaminated sites and drinking-water extraction wells. The model was used for simulating existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction between existing observation points using a developed three-point estimation method for a large number of scenarios was carried out. It is demonstrated that systematically applying the developed methodology helps to identify vulnerable locations which are sensitive to changing boundary conditions such as those arising from changes to artificial groundwater recharge rates. At these locations, additional investigations and protection are required. The presented integrated approach, using the groundwater flow direction between observation points, can be easily transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
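
    The three-point estimation of flow direction between observation points can be sketched as the classic three-point problem (hypothetical well coordinates; the paper's method may differ in detail): fit a plane to the hydraulic heads at three wells and take the down-gradient direction.

```python
import math

def flow_direction(p1, p2, p3):
    """Classic three-point problem: fit the plane h = a*x + b*y + c through
    three (x, y, head) observations and return the down-gradient (flow)
    direction in degrees counterclockwise from the +x axis."""
    (x1, y1, h1), (x2, y2, h2), (x3, y3, h3) = p1, p2, p3
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    a = ((h2 - h1) * (y3 - y1) - (h3 - h1) * (y2 - y1)) / det
    b = ((x2 - x1) * (h3 - h1) - (x3 - x1) * (h2 - h1)) / det
    # Flow runs opposite the head gradient (a, b)
    return math.degrees(math.atan2(-b, -a)) % 360.0

# Head falls from west to east across hypothetical wells, so flow points
# in the +x (east) direction, i.e. 0 degrees:
azimuth = flow_direction((0, 0, 10.0), (100, 0, 9.0), (0, 100, 10.0))
```

    Recomputing this direction for each observation triangle under many recharge scenarios is one simple way to flag locations where the hydraulic barrier could reverse under changed boundary conditions.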

  4. Computational Models of Relational Processes in Cognitive Development

    ERIC Educational Resources Information Center

    Halford, Graeme S.; Andrews, Glenda; Wilson, William H.; Phillips, Steven

    2012-01-01

    Acquisition of relational knowledge is a core process in cognitive development. Relational knowledge is dynamic and flexible, entails structure-consistent mappings between representations, has properties of compositionality and systematicity, and depends on binding in working memory. We review three types of computational models relevant to…

  5. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.

  6. A systematic review of predictive models for asthma development in children.

    PubMed

    Luo, Gang; Nkoy, Flory L; Stone, Bryan L; Schmick, Darell; Johnson, Michael D

    2015-11-28

    Asthma is the most common pediatric chronic disease, affecting 9.6% of American children. Delay in asthma diagnosis is prevalent, resulting in suboptimal asthma management. To help avoid delay in asthma diagnosis and advance asthma prevention research, researchers have proposed various models to predict asthma development in children. This paper reviews these models. A systematic review was conducted through searching in PubMed, EMBASE, CINAHL, Scopus, the Cochrane Library, the ACM Digital Library, IEEE Xplore, and OpenGrey up to June 3, 2015. The literature on predictive models for asthma development in children was retrieved, with search results limited to human subjects and children (birth to 18 years). Two independent reviewers screened the literature, performed data extraction, and assessed article quality. The literature search returned 13,101 references in total. After manual review, 32 of these references were determined to be relevant and are discussed in the paper. We identify several limitations of existing predictive models for asthma development in children, and provide preliminary thoughts on how to address these limitations. Existing predictive models for asthma development in children have inadequate accuracy. Efforts to improve these models' performance are needed, but are limited by a lack of a gold standard for asthma development in children.

  7. The role of the basic state in the ENSO-monsoon relationship and implications for predictability

    NASA Astrophysics Data System (ADS)

    Turner, A. G.; Inness, P. M.; Slingo, J. M.

    2005-04-01

    The impact of systematic model errors on a coupled simulation of the Asian summer monsoon and its interannual variability is studied. Although the mean monsoon climate is reasonably well captured, systematic errors in the equatorial Pacific mean that the monsoon-ENSO teleconnection is rather poorly represented in the general-circulation model. A system of ocean-surface heat flux adjustments is implemented in the tropical Pacific and Indian Oceans in order to reduce the systematic biases. In this version of the general-circulation model, the monsoon-ENSO teleconnection is better simulated, particularly the lag-lead relationships in which weak monsoons precede the peak of El Niño. In part this is related to changes in the characteristics of El Niño, which has a more realistic evolution in its developing phase. A stronger ENSO amplitude in the new model version also feeds back to further strengthen the teleconnection. These results have important implications for the use of coupled models for seasonal prediction of systems such as the monsoon, and suggest that some form of flux correction may have significant benefits where model systematic error compromises important teleconnections and modes of interannual variability.

  8. Toward a systematic exploration of nano-bio interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Xue; Liu, Fang; Liu, Yin

    Many studies of nanomaterials make non-systematic alterations of nanoparticle physicochemical properties. Given the immense size of the property space for nanomaterials, such approaches are not very useful in elucidating fundamental relationships between inherent physicochemical properties of these materials and their interactions with, and effects on, biological systems. Data driven artificial intelligence methods such as machine learning algorithms have proven highly effective in generating models with good predictivity and some degree of interpretability. They can provide a viable method of reducing or eliminating animal testing. However, careful experimental design with the modelling of the results in mind is a proven and efficient way of exploring large materials spaces. This approach, coupled with high speed automated experimental synthesis and characterization technologies now appearing, is the fastest route to developing models that regulatory bodies may find useful. We advocate greatly increased focus on systematic modification of physicochemical properties of nanoparticles combined with comprehensive biological evaluation and computational analysis. This is essential to obtain better mechanistic understanding of nano-bio interactions, and to derive quantitatively predictive and robust models for the properties of nanomaterials that have useful domains of applicability. - Highlights: • Nanomaterials studies make non-systematic alterations to nanoparticle properties. • Vast nanomaterials property spaces require systematic studies of nano-bio interactions. • Experimental design and modelling are efficient ways of exploring materials spaces. • We advocate systematic modification and computational analysis to probe nano-bio interactions.

  9. A Comparison of Two Balance Calibration Model Building Methods

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard; Ulbrich, Norbert

    2007-01-01

    Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
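    The second building method (stepwise regression) can be sketched in miniature. The loads, response, and selection threshold below are hypothetical, and this is a generic forward-selection illustration, not NASA's algorithm: candidate math-model terms are added one at a time, and a term is kept only if it reduces the residual sum of squares appreciably.

    ```python
    import numpy as np

    # Hypothetical sketch of forward stepwise term selection for a balance
    # calibration response. The true response uses F1, F2, and F1*F2 only.
    rng = np.random.default_rng(1)
    n = 60
    F1, F2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)   # applied loads
    response = 2.0 * F1 - 0.7 * F2 + 0.3 * F1 * F2 + rng.normal(0, 0.01, n)

    candidates = {"F1": F1, "F2": F2, "F1*F2": F1 * F2,
                  "F1^2": F1 ** 2, "F2^2": F2 ** 2}

    def rss(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)

    selected, X = [], np.ones((n, 1))          # start with intercept only
    while True:
        best, best_rss = None, rss(X, response)
        for name, col in candidates.items():
            if name in selected:
                continue
            r = rss(np.column_stack([X, col]), response)
            if r < best_rss * 0.90:            # require a 10% RSS improvement
                best, best_rss = name, r
        if best is None:
            break
        selected.append(best)
        X = np.column_stack([X, candidates[best]])

    print("selected terms:", selected)
    ```

    The stopping rule here is a crude RSS-improvement threshold; production stepwise procedures instead use F-tests or information criteria, which is one reason the two building methods in the record can disagree in noisy environments.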

  10. Modeling the North American vertical datum of 1988 errors in the conterminous United States

    NASA Astrophysics Data System (ADS)

    Li, X.

    2018-02-01

    A large systematic difference (ranging from -20 cm to +130 cm) was found between NAVD 88 (North American Vertical Datum of 1988) and the pure gravimetric geoid models. This difference not only makes it very difficult to augment the local geoid model by directly using the vast NAVD 88 network with state-of-the-art technologies recently developed in geodesy, but also limits the ability of researchers to effectively demonstrate the geoid model improvements on the NAVD 88 network. Here, both conventional regression analyses based on various predefined basis functions such as polynomials, B-splines, and Legendre functions and Latent Variable Analysis (LVA) such as Factor Analysis (FA) are used to analyze the systematic difference. Beyond giving a mathematical model, the regression results do not reveal a great deal about the physical causes of the large differences in NAVD 88, which may be of interest to various researchers. Furthermore, a significant amount of non-Gaussian signal is left in the residuals of the conventional regression models. In contrast, the FA method not only provides a better fit to the data, but also offers possible explanations of the error sources. Without requiring extra hypothesis tests on the model coefficients, the results from FA are more efficient in terms of capturing the systematic difference. Furthermore, without using a covariance model, a novel interpolation method based on the relationship between the loading matrix and the factor scores is developed for predictive purposes. The prediction error analysis shows that about 3-7 cm precision is expected in NAVD 88 after removing the systematic difference.
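    The "predefined basis function" side of the comparison can be sketched with synthetic data. Everything below is invented for illustration (a smooth east-west tilt plus noise standing in for the real NAVD 88 minus geoid difference); a bilinear trend surface is fit by least squares and the residuals inspected, the step after which the record turns to factor analysis:

    ```python
    import numpy as np

    # Illustrative sketch (synthetic data): a smooth east-west tilt in the
    # NAVD 88 minus gravimetric-geoid difference, modeled with a low-degree
    # polynomial trend surface.
    rng = np.random.default_rng(2)
    lon = rng.uniform(-125, -67, 500)    # conterminous US, degrees
    lat = rng.uniform(25, 49, 500)
    # Synthetic systematic difference: roughly -20 cm west to +130 cm east.
    diff_cm = 55 + 2.6 * (lon + 96) + 0.2 * (lat - 37) + rng.normal(0, 5, 500)

    # Bilinear basis: 1, lon, lat, lon*lat (a predefined basis, as in the record).
    X = np.column_stack([np.ones_like(lon), lon, lat, lon * lat])
    beta, *_ = np.linalg.lstsq(X, diff_cm, rcond=None)
    resid = diff_cm - X @ beta
    print(f"residual RMS: {resid.std():.1f} cm")   # ≈ the injected 5 cm noise
    ```

    With purely Gaussian synthetic noise the residuals are featureless; the record's point is that on the real network they are not, which motivates the latent-variable treatment.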

  11. Reviewing the evidence to inform the population of cost-effectiveness models within health technology assessments.

    PubMed

    Kaltenthaler, Eva; Tappenden, Paul; Paisley, Suzy

    2013-01-01

    Health technology assessments (HTAs) typically require the development of a cost-effectiveness model, which necessitates the identification, selection, and use of other types of information beyond clinical effectiveness evidence to populate the model parameters. The reviewing activity associated with model development should be transparent and reproducible but can result in a tension between being both timely and systematic. Little procedural guidance exists in this area. The purpose of this article was to provide guidance, informed by focus groups, on what might constitute a systematic and transparent approach to reviewing information to populate model parameters. A focus group series was held with HTA experts in the United Kingdom including systematic reviewers, information specialists, and health economic modelers to explore these issues. Framework analysis was used to analyze the qualitative data elicited during focus groups. Suggestions included the use of rapid reviewing methods and the need to consider the trade-off between relevance and quality. The need for transparency in the reporting of review methods was emphasized. It was suggested that additional attention should be given to the reporting of parameters deemed to be more important to the model or where the preferred decision regarding the choice of evidence is equivocal. These recommendations form part of a Technical Support Document produced for the National Institute for Health and Clinical Excellence Decision Support Unit in the United Kingdom. It is intended that these recommendations will help to ensure a more systematic, transparent, and reproducible process for the review of model parameters within HTA. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  12. Qualitative adaptation of child behaviour problem instruments in a developing-country setting.

    PubMed

    Khan, B; Avan, B I

    2014-07-08

    A key barrier to epidemiological research on child behaviour problems in developing countries is the lack of culturally relevant, internationally recognized psychometric instruments. This paper proposes a model for the qualitative adaptation of psychometric instruments in developing-country settings and presents a case study of the adaptation of 3 internationally recognized instruments in Pakistan: the Child Behavior Checklist, the Youth Self-Report and the Teacher's Report Form. This model encompassed a systematic procedure with 6 distinct phases to minimize bias and ensure equivalence with the original instruments: selection, deliberation, alteration, feasibility, testing and formal approval. The process was conducted in collaboration with the instruments' developer. A multidisciplinary working group of experts identified equivalence issues and suggested modifications. Focus group discussions with informants highlighted comprehension issues. Subsequently modified instruments were thoroughly tested. Finally, the instruments' developer approval further validated the qualitative adaptation. The study proposes a rigorous and systematic model to effectively achieve cultural adaptation of psychometric instruments.

  13. Community Dissemination of the Early Start Denver Model: Implications for Science and Practice

    ERIC Educational Resources Information Center

    Vismara, Laurie A.; Young, Gregory S.; Rogers, Sally J.

    2013-01-01

    The growing number of Autism Spectrum Disorder cases exceeds the services available for these children. This increase challenges both researchers and service providers to develop systematic, effective dissemination strategies for transporting university research models to community early intervention (EI) programs. The current study developed an…

  14. Classroom Crisis Intervention through Contracting: A Moral Development Model.

    ERIC Educational Resources Information Center

    Smaby, Marlowe H.; Tamminen, Armas W.

    1981-01-01

    A counselor can arbitrate problem situations using a systematic approach to classroom intervention which includes meetings with the teacher and students. This crisis intervention model based on moral development can be more effective than reliance on guidance activities disconnected from the actual classroom settings where the problems arise.…

  15. How adverse outcome pathways can aid the development and ...

    EPA Pesticide Factsheets

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science, based on direct observation of apical toxicity outcomes in whole-organism toxicity tests, to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work

  16. Operationalizing resilience using state and transition models

    USDA-ARS?s Scientific Manuscript database

    In management, restoration, and policy contexts, the notion of resilience can be confusing. Systematic development of conceptual models of ecological state change (state transition models; STMs) can help overcome semantic confusion and promote a mechanistic understanding of resilience. Drawing on ex...

  17. A Programming Environment Evaluation Methodology for Object-Oriented Systems. Ph.D Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1987-01-01

    The object-oriented design strategy as both a problem decomposition and system development paradigm has made impressive inroads into the various areas of the computing sciences. Substantial development productivity improvements have been demonstrated in areas ranging from artificial intelligence to user interface design. However, there has been very little progress in the formal characterization of these productivity improvements and in the identification of the underlying cognitive mechanisms. The development and validation of models and metrics of this sort require large amounts of systematically-gathered structural and productivity data. There has, however, been a notable lack of systematically-gathered information on these development environments. A large part of this problem is attributable to the lack of a systematic programming environment evaluation methodology that is appropriate to the evaluation of object-oriented systems.

  18. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  19. Neural systems language: a formal modeling language for the systematic description, unambiguous communication, and automated digital curation of neural connectivity.

    PubMed

    Brown, Ramsay A; Swanson, Larry W

    2013-09-01

    Systematic description and the unambiguous communication of findings and models remain among the unresolved fundamental challenges in systems neuroscience. No common descriptive frameworks exist to describe systematically the connective architecture of the nervous system, even at the grossest level of observation. Furthermore, the accelerating volume of novel data generated on neural connectivity outpaces the rate at which this data is curated into neuroinformatics databases to synthesize digitally systems-level insights from disjointed reports and observations. To help address these challenges, we propose the Neural Systems Language (NSyL). NSyL is a modeling language to be used by investigators to encode and communicate systematically reports of neural connectivity from neuroanatomy and brain imaging. NSyL engenders systematic description and communication of connectivity irrespective of the animal taxon described, experimental or observational technique implemented, or nomenclature referenced. As a language, NSyL is internally consistent, concise, and comprehensible to both humans and computers. NSyL is a promising development for systematizing the representation of neural architecture, effectively managing the increasing volume of data on neural connectivity and streamlining systems neuroscience research. Here we present similar precedent systems, how NSyL extends existing frameworks, and the reasoning behind NSyL's development. We explore NSyL's potential for balancing robustness and consistency in representation by encoding previously reported assertions of connectivity from the literature as examples. Finally, we propose and discuss the implications of a framework for how NSyL will be digitally implemented in the future to streamline curation of experimental results and bridge the gaps among anatomists, imagers, and neuroinformatics databases. Copyright © 2013 Wiley Periodicals, Inc.

  20. Modeling and applications in microbial food safety

    USDA-ARS?s Scientific Manuscript database

    Mathematical modeling is a scientific and systematic approach to study and describe the recurrent events or phenomena with successful application track for decades. When models are properly developed and validated, their applications may save costs and time. For the microbial food safety concerns, ...

  1. High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters

    DTIC Science & Technology

    2017-04-22

    signatures which can be used for direct, non-invasive comparison with experimental diagnostics can be produced. This research will be directly... experimental campaign is critical to developing general design philosophies for low-power plasmoid formation, the complexity of non-linear plasma processes... advanced space propulsion. The work consists of numerical method development, physical model development, and systematic studies of the non-linear

  2. Taxonomy and systematics are key to biological information: Arabidopsis, Eutrema (Thellungiella), Noccaea and Schrenkiella (Brassicaceae) as examples

    PubMed Central

    Koch, Marcus A.; German, Dmitry A.

    2013-01-01

    Taxonomy and systematics provide the names and evolutionary framework for any biological study. Without these names there is no access to the biological context of the evolutionary processes which gave rise to a given taxon: close relatives and sister species (hybridization), or more distantly related taxa (ancestral states), for example. This is true not only for the single species a research project focuses on, but also for its relatives, which might be selected for comparative approaches and future research. Nevertheless, taxonomic and systematic knowledge is rarely fully explored and considered across biological disciplines. One would expect the situation to be more developed with model organisms such as Noccaea, Arabidopsis, Schrenkiella and Eutrema (Thellungiella). However, we show the reverse. Using Arabidopsis halleri and Noccaea caerulescens, two model species among metal-accumulating taxa, we summarize and reflect on the past taxonomy and systematics of Arabidopsis and Noccaea and provide a modern synthesis of taxonomic, systematic and evolutionary perspectives. The same is presented for several species of Eutrema s. l. and Schrenkiella, which have recently appeared as models for studying stress tolerance in plants and are widely known under the name Thellungiella. PMID:23914192

  3. How do illness-anxious individuals process health-threatening information? A systematic review of evidence for the cognitive-behavioral model.

    PubMed

    Leonidou, Chrysanthi; Panayiotou, Georgia

    2018-08-01

    According to the cognitive-behavioral model, illness anxiety is developed and maintained through biased processing of health-threatening information and maladaptive responses to such information. This study is a systematic review of research that attempted to validate central tenets of the cognitive-behavioral model regarding etiological and maintenance mechanisms in illness anxiety. Sixty-two studies, including correlational and experimental designs, were identified through a systematic search of databases and were evaluated for their quality. Outcomes were synthesized following a qualitative thematic approach under categories of theoretically driven mechanisms derived from the cognitive-behavioral model: attention, memory and interpretation biases, perceived awareness and inaccuracy in perception of somatic sensations, negativity bias, emotion dysregulation, and behavioral avoidance. Findings partly support the cognitive-behavioral model, but several of its hypothetical mechanisms only receive weak support due to the scarcity of relevant studies. Directions for future research are suggested based on identified gaps in the existing literature. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. MCD Process Model: A Systematic Approach to Curriculum Development in Black Studies.

    ERIC Educational Resources Information Center

    Miller, Howard J.

    1986-01-01

    Holds that Black Studies programs have had problems surviving because of (1) resistance to curriculum change in colleges and universities, (2) their lack of supporters in positions of administrative power, and (3) lack of an organized, conceptual approach to developing and implementing a Black Studies curriculum. Presents a model designed to…

  5. Dynamical properties of the Penna aging model applied to the population of wolves

    NASA Astrophysics Data System (ADS)

    Makowiec, Danuta

    1997-02-01

    The parameters of the Penna bit-string model of aging of biological systems are systematically tested to better understand the model itself, as well as the results arising from applying it to studies of the development of the stationary population of Alaska wolves.
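    The Penna model itself is compact enough to sketch. The implementation below is a generic minimal version with arbitrary parameter values, not the wolves parameterization from the record: each genome is a bit string whose set bits are deleterious mutations expressed at the corresponding age; individuals die when enough mutations are expressed (threshold T) or by a Verhulst crowding factor, and reproduce past age R with M new mutations per birth.

    ```python
    import random

    # Minimal sketch of the Penna bit-string aging model.
    B, T, R, M, NMAX = 32, 3, 8, 1, 5000   # bits, threshold, repro age, mut./birth, capacity

    def step(pop, rng):
        survivors = []
        for age, genome in pop:
            age += 1
            # Count mutations expressed up to the current age.
            bad = bin(genome & ((1 << min(age, B)) - 1)).count("1")
            # Die from accumulated mutations or the Verhulst crowding factor.
            if bad >= T or rng.random() < len(pop) / NMAX:
                continue
            survivors.append((age, genome))
            if age >= R:                       # reproduce: copy genome + M new mutations
                child = genome
                for _ in range(M):
                    child |= 1 << rng.randrange(B)
                survivors.append((0, child))
        return survivors

    rng = random.Random(42)
    pop = [(0, 0) for _ in range(1000)]        # founders: mutation-free genomes
    for _ in range(100):
        pop = step(pop, rng)
    print("population after 100 steps:", len(pop))
    ```

    Systematic sweeps of T, R, M, and NMAX — the kind of testing the record describes — change the stationary population size and its age structure, which is what makes the model a useful caricature of real populations.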

  6. Assessing cost-effectiveness of HPV vaccines with decision analytic models: what are the distinct challenges of low- and middle-income countries? A protocol for a systematic review.

    PubMed

    Ekwunife, Obinna I; Grote, Andreas Gerber; Mosch, Christoph; O'Mahony, James F; Lhachimi, Stefan K

    2015-05-12

    Cervical cancer poses a huge health burden, both to developed and developing nations, making prevention and control strategies necessary. However, the challenges of designing and implementing prevention strategies differ for low- and middle-income countries (LMICs) as compared to countries with fully developed health care systems. Moreover, for many LMICs, much of the data needed for decision analytic modelling, such as prevalence, will most likely only be partly available or measured with much larger uncertainty. Lastly, imperfect implementation of human papillomavirus (HPV) vaccination may influence the effectiveness of cervical cancer prevention in unpredictable ways. This systematic review aims to assess how decision analytic modelling studies of HPV cost-effectiveness in LMICs accounted for the particular challenges faced in such countries. Specifically, the study will assess the following: (1) whether the existing literature on cost-effectiveness modelling of HPV vaccines acknowledges the distinct challenges of LMICs, (2) how these challenges were accommodated in the models, (3) whether certain parameters systematically exhibited large degrees of uncertainty due to lack of data and how influential these parameters were on model-based recommendations, and (4) whether the choice of modelling herd immunity influences model-based recommendations, especially when coverage of an HPV vaccination program is not optimal. We will conduct a systematic review to identify suitable studies from MEDLINE (via PubMed), EMBASE, NHS Economic Evaluation Database (NHS EED), EconLit, Web of Science, and CEA Registry. Searches will be conducted for studies of interest published since 2006. The searches will be supplemented by hand searching of the most relevant papers found in the search. Studies will be critically appraised using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement checklist. We will undertake a descriptive, narrative, and interpretative synthesis of data to address the study objectives. The proposed systematic review will assess how the cost-effectiveness studies of HPV vaccines accounted for the distinct challenges of LMICs. The gaps identified will expose areas for additional research as well as challenges that need to be accounted for in future modelling studies. PROSPERO CRD42015017870.

  7. A Systematic Method of Integrating BIM and Sensor Technology for Sustainable Construction Design

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Deng, Zhiyu

    2017-10-01

    Building Information Modeling (BIM) has received considerable attention in the construction field, and sensor technology has been applied to construction data collection. This paper developed a method to integrate BIM and sensor technology for sustainable construction design. A brief literature review was conducted to clarify the current development of BIM and sensor technology; then a systematic method for integrating BIM and sensor technology to realize sustainable construction design was put forward; finally, a brief discussion and conclusions were given.

  8. Predictors of Major Depression and Posttraumatic Stress Disorder Following Traumatic Brain Injury: A Systematic Review and Meta-Analysis.

    PubMed

    Cnossen, Maryse C; Scholten, Annemieke C; Lingsma, Hester F; Synnot, Anneliese; Haagsma, Juanita; Steyerberg, Ewout W; Polinder, Suzanne

    2017-01-01

    Although major depressive disorder (MDD) and posttraumatic stress disorder (PTSD) are prevalent after traumatic brain injury (TBI), little is known about which patients are at risk for developing them. The authors systematically reviewed the literature on predictors and multivariable models for MDD and PTSD after TBI. The authors included 26 observational studies. MDD was associated with female gender, preinjury depression, postinjury unemployment, and lower brain volume, whereas PTSD was related to shorter posttraumatic amnesia, memory of the traumatic event, and early posttraumatic symptoms. Risk of bias ratings for most studies were acceptable, although studies that developed a multivariable model suffered from methodological shortcomings.

  9. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    PubMed Central

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709
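    The class of models the review surveys can be illustrated at its simplest. The sketch below is a one-compartment pharmacokinetic model with first-order absorption, solved by Euler stepping; all parameter values are hypothetical, and a real PBPK model would add organ compartments, blood flows, and tissue partition coefficients.

    ```python
    # One-compartment PK model with first-order absorption (hypothetical values).
    dose_mg = 100.0
    ka, ke = 1.0, 0.2        # absorption / elimination rate constants (1/h)
    V = 50.0                 # volume of distribution (L)

    gut, central = dose_mg, 0.0
    dt, t, cmax = 0.01, 0.0, 0.0
    while t < 24.0:
        absorbed = ka * gut * dt          # first-order transfer out of the gut
        eliminated = ke * central * dt    # first-order elimination
        gut -= absorbed
        central += absorbed - eliminated
        cmax = max(cmax, central / V)     # track peak plasma concentration
        t += dt
    print(f"Cmax ≈ {cmax:.2f} mg/L")
    ```

    For these rate constants the numerical peak agrees with the closed-form Bateman solution (about 1.3 mg/L near t = 2 h), which is the kind of verification step the review finds applied inconsistently across published models.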

  10. Health literacy and public health: a systematic review and integration of definitions and models.

    PubMed

    Sørensen, Kristine; Van den Broucke, Stephan; Fullam, James; Doyle, Gerardine; Pelikan, Jürgen; Slonska, Zofia; Brand, Helmut

    2012-01-25

    Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.

  11. Demonstration of reduced-order urban scale building energy models

    DOE PAGES

    Heidarinejad, Mohammad; Mattise, Nicholas; Dahlhausen, Matthew; ...

    2017-09-08

    The aim of this study is to demonstrate a developed framework to rapidly create urban scale reduced-order building energy models using a systematic summary of the simplifications required for the representation of building exterior and thermal zones. These urban scale reduced-order models rely on the contribution of influential variables to the internal, external, and system thermal loads. The OpenStudio Application Programming Interface (API) serves as a tool to automate the process of model creation and demonstrate the developed framework. The results of this study show that the accuracy of the developed reduced-order building energy models varies only up to 10% with the selection of different thermal zones. In addition, to assess the complexity of the developed reduced-order building energy models, this study develops a novel framework to quantify the complexity of building energy models. Consequently, this study empowers building energy modelers to quantify their building energy model systematically in order to report the model complexity alongside the building energy model accuracy. An exhaustive analysis of four university campuses suggests that urban neighborhood buildings lend themselves to simplified typical shapes. Specifically, building energy modelers can utilize the developed typical shapes to represent more than 80% of the U.S. buildings documented in the CBECS database. One main benefit of this developed framework is the opportunity for different models, including airflow and solar radiation models, to share the same exterior representation, allowing a unified data exchange. Altogether, the results of this study have implications for large-scale modeling of buildings in support of urban energy consumption analyses or assessment of a large number of alternative solutions in support of retrofit decision-making in the building industry.
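    A reduced-order building energy model of the kind described here is often a lumped resistance-capacitance network. The sketch below is a generic single-zone 1R1C model with hypothetical parameter values (not taken from the study): one lumped envelope resistance to outdoors, one lumped thermal capacitance, and constant internal gains, stepped through a day of sinusoidal outdoor temperature.

    ```python
    import math

    # Generic 1R1C reduced-order thermal model (hypothetical parameters).
    R = 0.005       # lumped envelope resistance, K/W
    C = 4.0e7       # lumped thermal capacitance, J/K
    Q_int = 2000.0  # constant internal gains, W

    dt = 60.0                       # time step, s
    T_zone = 20.0                   # initial zone temperature, degC
    temps = []
    for step in range(24 * 60):     # one day at one-minute resolution
        hour = step * dt / 3600.0
        T_out = 10.0 + 5.0 * math.sin(2 * math.pi * (hour - 9) / 24)
        # Energy balance: envelope conduction plus internal gains into C.
        T_zone += ((T_out - T_zone) / R + Q_int) / C * dt
        temps.append(T_zone)
    print(f"zone temperature range: {min(temps):.1f} to {max(temps):.1f} degC")
    ```

    With a time constant RC of roughly 55 hours, the zone barely tracks the diurnal swing — exactly the kind of behavior that lets a handful of lumped parameters stand in for a detailed multi-zone model at urban scale.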

  13. Capturing security requirements for software systems.

    PubMed

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

    Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before going further in the process, avoiding rework. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We used evaluation criteria to assess the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited with this methodology, in addition to the assistance offered to developers in eliciting security requirements more systematically.
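    The catalog-driven elicitation step described above can be pictured with a small sketch. This is a hypothetical illustration only: the frame names and requirement entries are invented, not taken from the paper's catalog, and the real methodology operates on full problem-frame and abuse-frame diagrams rather than plain strings.

```python
# Hypothetical sketch of a security catalog keyed by problem-frame
# category. Frame names and requirement entries are illustrative only,
# not taken from the paper's actual catalog.
CATALOG = {
    "commanded_behaviour": ["authenticate operator", "log privileged commands"],
    "information_display": ["mask sensitive fields", "encrypt data in transit"],
    "workpieces": ["validate all inputs", "restrict write access"],
}

def elicit(frames):
    """Collect candidate security requirements for the given problem frames."""
    requirements = []
    for frame in frames:
        requirements.extend(CATALOG.get(frame, []))
    return requirements

reqs = elicit(["commanded_behaviour", "workpieces"])
```

    The point of the catalog is reuse of previous security knowledge: a developer who identifies the problem frame gets the associated candidate requirements for free, then refines them against the threats modeled in the abuse frames.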

  15. Systematic optimization of fed-batch simultaneous saccharification and fermentation at high-solid loading based on enzymatic hydrolysis and dynamic metabolic modeling of Saccharomyces cerevisiae.

    PubMed

    Unrean, Pornkamol; Khajeeram, Sutamat; Laoteng, Kobkul

    2016-03-01

    An integrative simultaneous saccharification and fermentation (SSF) model is a useful guiding tool for rapid process optimization to meet the techno-economic requirements of industrial-scale lignocellulosic ethanol production. In this work, we have developed an SSF model comprising a metabolic network of a Saccharomyces cerevisiae cell coupled with fermentation kinetics and an enzymatic hydrolysis model to quantitatively capture the dynamic responses of yeast cell growth and fermentation during SSF. By using model-based design of feeding profiles for substrate and yeast cells in the fed-batch SSF process, efficient ethanol production with a high titer of up to 65 g/L and a high yield of 85% of the theoretical yield was accomplished. The ethanol titer and productivity were increased by 47 and 41%, respectively, in optimized fed-batch SSF compared to the batch process. The developed integrative SSF model is therefore a promising approach for systematic design of economical and sustainable SSF bioprocessing of lignocellulose.
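    The structure of an SSF model of this kind — enzymatic hydrolysis releasing glucose that yeast consumes for growth and ethanol production — can be sketched with a toy dynamic simulation. All kinetic parameters below are illustrative placeholders, not the fitted values of the study, and the metabolic-network component is reduced to simple yield coefficients.

```python
# Minimal sketch of a simultaneous saccharification and fermentation (SSF)
# model: first-order enzymatic hydrolysis releases glucose from cellulose,
# and Monod-type kinetics govern yeast growth and ethanol formation.
# Parameter values are illustrative, not those of the cited study.
def simulate_ssf(hours=72.0, dt=0.01, cellulose=150.0, glucose=0.0,
                 biomass=1.0, ethanol=0.0, k_hyd=0.05, mu_max=0.3,
                 Ks=0.5, Yxs=0.1, Yps=0.45):
    t = 0.0
    while t < hours:
        r_hyd = k_hyd * cellulose                 # glucose release rate
        mu = mu_max * glucose / (Ks + glucose)    # Monod growth rate
        r_growth = mu * biomass
        r_uptake = r_growth / Yxs                 # glucose consumed
        cellulose -= r_hyd * dt
        glucose = max(glucose + (r_hyd - r_uptake) * dt, 0.0)
        biomass += r_growth * dt
        ethanol += Yps * r_uptake * dt            # ethanol yield on glucose
        t += dt
    return cellulose, glucose, biomass, ethanol

cellulose_left, glucose_left, biomass, ethanol = simulate_ssf()
```

    With these placeholder values the toy model ends near the titer range the abstract reports, but that is coincidental; the study's model additionally resolves intracellular fluxes and feeding profiles.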

  16. High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters

    DTIC Science & Technology

    2016-06-01

    This effort consists of studies of the physical characteristics of Field Reversed Configuration (FRC) plasmas for advanced space propulsion, comprising numerical model development, physical model development, and systematic studies of the non-linear plasma dynamics of FRCs for propulsion applications. Two of the most advanced designs are based on the theta-pinch formation and the rotating magnetic field (RMF) formation mechanisms.

  17. Modeled Neutron and Charged-Particle Induced Nuclear Reaction Cross Sections for Radiochemistry in the Region of Yttrium, Zirconium, Niobium, and Molybdenum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, R D; Kelley, K; Dietrich, F S

    2006-06-13

    We have developed a set of modeled nuclear reaction cross sections for use in radiochemical diagnostics. Systematics for the input parameters required by the Hauser-Feshbach statistical model were developed and used to calculate neutron, proton, and deuteron induced nuclear reaction cross sections for targets ranging from strontium (Z = 38) to rhodium (Z = 45).

  18. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    DTIC Science & Technology

    2016-07-27

    Hydrocarbon fuel thermal performance and reliability is a common requirement for aircraft, rockets, and hypersonic vehicles. The Aerospace Fuels Quality Test and Model Development (AFQTMoDev) project was initiated to mature fuel quality assurance practices for rocket-grade kerosene, thereby ensuring operational readiness.

  19. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  20. SU-F-R-41: Regularized PCA Can Model Treatment-Related Changes in Head and Neck Patients Using Daily CBCTs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chetvertkov, M; Henry Ford Health System, Detroit, MI; Siddiqui, F

    2016-06-15

    Purpose: To use daily cone beam CTs (CBCTs) to develop regularized principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients, to guide replanning decisions in adaptive radiation therapy (ART). Methods: Known deformations were applied to planning CT (pCT) images of 10 H&N patients to model several different systematic anatomical changes. A Pinnacle plugin was used to interpolate systematic changes over 35 fractions, generating a set of 35 synthetic CTs for each patient. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CTs, and random fraction-to-fraction changes were superimposed on the DVFs. Standard non-regularized and regularized patient-specific PCA models were built using the DVFs. The ability of PCA to extract the known deformations was quantified. PCA models were also generated from clinical CBCTs, for which the deformations and DVFs were not known. It was hypothesized that the resulting eigenvectors/eigenfunctions with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: As demonstrated with quantitative results in the supporting document, regularized PCA is more successful than standard PCA at capturing systematic changes early in the treatment. Regularized PCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes. To be successful at guiding ART, regularized PCA should be coupled with models of when anatomical changes occur: early, late or throughout the treatment course. Conclusion: The leading eigenvector/eigenfunction from both PCA approaches can tentatively be identified as a major systematic change during the radiotherapy course when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the regularized PCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in the treatment course. This work is supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
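    The core PCA step — extracting the dominant systematic deformation from a series of per-fraction DVFs — can be illustrated with a minimal sketch. The DVFs here are tiny synthetic vectors and the study's regularization is not reproduced; this only shows how the leading eigenvector captures a systematic trend against small fraction-to-fraction noise.

```python
# Minimal sketch of per-fraction PCA on deformation vector fields (DVFs):
# each fraction's DVF is flattened to a vector, the mean is removed, and
# the leading eigenvector of the sample covariance is found by power
# iteration. Data are synthetic; no regularization is applied.
def leading_pc(dvfs, iters=200):
    n, d = len(dvfs), len(dvfs[0])
    mean = [sum(v[j] for v in dvfs) / n for j in range(d)]
    X = [[v[j] - mean[j] for j in range(d)] for v in dvfs]
    w = [1.0] * d                                   # power-iteration start
    for _ in range(iters):
        s = [sum(row[j] * w[j] for j in range(d)) for row in X]     # X w
        w = [sum(X[i][j] * s[i] for i in range(n)) for j in range(d)]  # X^T(X w)
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return w

# Synthetic DVFs over 10 fractions: a systematic drift along the first
# axis plus tiny alternating "fraction-to-fraction" noise on the second.
dvfs = [[0.1 * t, 0.001 * ((-1) ** t), 0.0] for t in range(10)]
pc1 = leading_pc(dvfs)
```

    The leading component aligns with the systematic drift axis; the regularized variant in the study is designed to make this separation robust when the random changes are not so small.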

  1. Insights on the impact of systematic model errors on data assimilation performance in changing catchments

    NASA Astrophysics Data System (ADS)

    Pathiraja, S.; Anghileri, D.; Burlando, P.; Sharma, A.; Marshall, L.; Moradkhani, H.

    2018-03-01

    The global prevalence of rapid and extensive land use change necessitates hydrologic modelling methodologies capable of handling non-stationarity. This is particularly true in the context of Hydrologic Forecasting using Data Assimilation. Data Assimilation has been shown to dramatically improve forecast skill in hydrologic and meteorological applications, although such improvements are conditional on using bias-free observations and model simulations. A hydrologic model calibrated to a particular set of land cover conditions has the potential to produce biased simulations when the catchment is disturbed. This paper sheds new light on the impacts of bias or systematic errors in hydrologic data assimilation, in the context of forecasting in catchments with changing land surface conditions and a model calibrated to pre-change conditions. We posit that in such cases, the impact of systematic model errors on assimilation or forecast quality is dependent on the inherent prediction uncertainty that persists even in pre-change conditions. Through experiments on a range of catchments, we develop a conceptual relationship between total prediction uncertainty and the impacts of land cover changes on the hydrologic regime to demonstrate how forecast quality is affected when using state estimation Data Assimilation with no modifications to account for land cover changes. This work shows that systematic model errors as a result of changing or changed catchment conditions do not always necessitate adjustments to the modelling or assimilation methodology, for instance through re-calibration of the hydrologic model, time varying model parameters or revised offline/online bias estimation.
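    The state-estimation assimilation step discussed above can be sketched with a minimal ensemble Kalman filter update on a scalar hydrologic state carrying a deliberate systematic bias; the numbers are illustrative placeholders, not drawn from the paper's experiments.

```python
# Minimal sketch of a state-estimation ensemble Kalman filter (EnKF)
# update for a scalar state (e.g. a storage variable). A biased model
# forecast ensemble is nudged toward an observation in proportion to
# the ensemble variance. All values are illustrative.
import random

def enkf_update(ensemble, obs, obs_var):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)        # Kalman gain for a direct observation
    # perturbed-observation update applied to each member
    return [x + gain * (obs + random.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

random.seed(0)
# forecast ensemble with a +20-unit systematic bias, standing in for a
# model calibrated to pre-change land cover conditions
prior = [120.0 + random.gauss(0.0, 5.0) for _ in range(100)]
post = enkf_update(prior, obs=100.0, obs_var=4.0)
```

    The update removes much of the bias at each assimilation step, which is one mechanism behind the paper's finding that systematic model errors do not always force a methodological change when prediction uncertainty is already large.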

  2. El Paso's Organizational Development Model

    ERIC Educational Resources Information Center

    de los Santos, Gilberto

    1975-01-01

    The success of El Paso Community College (Texas) is attributed to its early definition of instructional thrusts including: systematizing and individualizing instruction; increasing awareness, sensitivity, and appreciation of the culture of the students; development of staff bilingual capabilities; development of staff teams versed in management by…

  3. Systematizing the Delivery of Local Employment and Training Services. The Job Center Technical Assistance Guide.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Industry, Labor and Human Relations, Madison.

    This technical assistance guide was developed to consolidate a statewide understanding of the effort to systematize the delivery of employment and training programs through the local formation of job centers in Wisconsin, and to provide a compilation, drawn from 20 local models, that explains how the programs are delivered. The guide is organized…

  4. A Systematic Review and Classification of Interventions for Speech-Sound Disorder in Preschool Children

    ERIC Educational Resources Information Center

    Wren, Yvonne; Harding, Sam; Goldbart, Juliet; Roulstone, Sue

    2018-01-01

    Background: Multiple interventions have been developed to address speech sound disorder (SSD) in children. Many of these have been evaluated but the evidence for these has not been considered within a model which categorizes types of intervention. The opportunity to carry out a systematic review of interventions for SSD arose as part of a larger…

  5. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening interventions, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology. Conclusion When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers. PMID:26689908

  6. CALIBRATION OF SUBSURFACE BATCH AND REACTIVE-TRANSPORT MODELS INVOLVING COMPLEX BIOGEOCHEMICAL PROCESSES

    EPA Science Inventory

    In this study, the calibration of subsurface batch and reactive-transport models involving complex biogeochemical processes was systematically evaluated. Two hypothetical nitrate biodegradation scenarios were developed and simulated in numerical experiments to evaluate the perfor...

  7. Models to predict length of stay in the Intensive Care Unit after coronary artery bypass grafting: a systematic review.

    PubMed

    Atashi, Alireza; Verburg, Ilona W; Karim, Hesam; Miri, Mirmohammad; Abu-Hanna, Ameen; de Jonge, Evert; de Keizer, Nicolette F; Eslami, Saeid

    2018-06-01

    Intensive Care Unit (ICU) length of stay (LoS) prediction models are used to compare institutions and surgeons on their performance, and are useful as efficiency indicators for quality control. There is little consensus about which prediction methods are most suitable for predicting ICU length of stay. The aim of this study is to systematically review models for predicting ICU LoS after coronary artery bypass grafting (CABG) and to assess the reporting and methodological quality of these models in order to apply them for benchmarking. A general search was conducted in Medline and Embase up to 31-12-2016. Three authors classified the papers for inclusion by reading their title, abstract and full text. All original papers describing the development and/or validation of a prediction model for LoS in the ICU after CABG surgery were included. We used a checklist developed for critical appraisal and data extraction for systematic reviews of prediction modelling, and extended it to cover the handling of specific patient subgroups. We also defined other items and scores to assess the methodological and reporting quality of the models. Of 5181 uniquely identified articles, fifteen studies were included, of which twelve described the development of new models and three the validation of existing models. All studies used linear or logistic regression for model development, and reported various performance measures based on the difference between predicted and observed ICU LoS. Most used a prospective (46.6%) or retrospective (40%) study design. We found heterogeneity in patient inclusion/exclusion criteria, sample size, reported accuracy rates, and methods of candidate predictor selection. Most (60%) studies did not mention the handling of missing values, and none compared the model outcome measure of survivors with non-survivors. For model development and validation studies respectively, the maximum reporting (methodological) scores were 66/78 and 62/62 (14/22 and 12/22).
There are relatively few models for predicting ICU length of stay after CABG. Several aspects of methodological and reporting quality of studies in this field should be improved. There is a need for standardizing outcome and risk factor definitions in order to develop/validate a multi-institutional and international risk scoring system.
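    Since all reviewed studies used linear or logistic regression, the basic model-development step can be sketched as a one-predictor ordinary least squares fit. The risk scores and LoS values below are synthetic placeholders, not data from any reviewed study.

```python
# Minimal sketch of the linear-regression approach the reviewed studies
# use to predict ICU length of stay (LoS) after CABG: ordinary least
# squares on a single illustrative preoperative risk score. Real models
# use many candidate predictors; all data here are synthetic.
def fit_ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope          # intercept, slope

risk = [1, 2, 3, 4, 5, 6]                  # synthetic preoperative risk scores
los = [1.1, 1.9, 3.2, 3.8, 5.1, 5.9]       # synthetic ICU days
b0, b1 = fit_ols(risk, los)
predicted = b0 + b1 * 3.5                  # predicted LoS at risk score 3.5
```

    Benchmarking then compares predicted against observed LoS per institution, which is why the review stresses standardized outcome and predictor definitions.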

  8. Development of Probabilistic Socio-Economic Emissions Scenarios (2012)

    EPA Pesticide Factsheets

    The purpose of this analysis is to help overcome these limitations through the development of a publicly available library of socio-economic-emissions projections derived from a systematic examination of uncertainty in key underlying model parameters.

  9. Systematic approaches to toxicology in the zebrafish.

    PubMed

    Peterson, Randall T; Macrae, Calum A

    2012-01-01

    As the current paradigms of drug discovery evolve, it has become clear that a more comprehensive understanding of the interactions between small molecules and organismal biology will be vital. The zebrafish is emerging as a complement to existing in vitro technologies and established preclinical in vivo models that can be scaled for high-throughput. In this review, we highlight the current status of zebrafish toxicology studies, identify potential future niches for the model in the drug development pipeline, and define the hurdles that must be overcome as zebrafish technologies are refined for systematic toxicology.

  10. The Discrepancy Evaluation Model: A Systematic Approach for the Evaluation of Career Planning and Placement Programs.

    ERIC Educational Resources Information Center

    Buttram, Joan L.; Covert, Robert W.

    The Discrepancy Evaluation Model (DEM), developed in 1966 by Malcolm Provus, provides information for program assessment and program improvement. Under the DEM, evaluation is defined as the comparison of an actual performance to a desired standard. The DEM embodies five stages of evaluation based upon a program's natural development: program…

  11. Developing an Integrative Play Therapy Group Model for Middle School Male Students to Address Bullying Behaviors

    ERIC Educational Resources Information Center

    Jordan, Jakarla

    2016-01-01

    This research examines the systematic process of developing an integrative play therapy group model for middle school male students, ages 11-15 who participate in bullying behaviors. Play therapy approaches and evidence-based practices are documented as effective measures for addressing bullying behaviors with children and adolescents. This group…

  12. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  13. Validation of a common data model for active safety surveillance research

    PubMed Central

    Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E

    2011-01-01

    Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
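    The translation step described above — mapping each source's idiosyncratic coding into a common data model with a standardized terminology — can be pictured with a toy sketch. The table layout, codes, and field names here are invented for illustration and are not the actual OMOP CDM schema.

```python
# Toy sketch of translating source records into a common data model (CDM):
# each source database uses its own drug coding, and a mapping to one
# standard vocabulary lets the same analytic query run on every database.
# Mappings and field names are invented, not the real OMOP schema.
SOURCE_TO_STANDARD = {
    ("db_a", "X101"): "warfarin",
    ("db_b", "0042"): "warfarin",
    ("db_a", "X205"): "aspirin",
}

def to_cdm(source_name, records):
    """Translate one source's records into rows using the standard vocabulary."""
    return [{"person_id": r["pid"],
             "drug": SOURCE_TO_STANDARD[(source_name, r["code"])]}
            for r in records]

rows_a = to_cdm("db_a", [{"pid": 1, "code": "X101"}, {"pid": 2, "code": "X205"}])
rows_b = to_cdm("db_b", [{"pid": 9, "code": "0042"}])
```

    After translation, both sources expose "warfarin" exposures under one name, which is what makes large-scale systematic analysis across databases feasible.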

  14. Evaluating clinical librarian services: a systematic review.

    PubMed

    Brettle, Alison; Maden-Jenkins, Michelle; Anderson, Lucy; McNally, Rosalind; Pratchett, Tracey; Tancock, Jenny; Thornton, Debra; Webb, Anne

    2011-03-01

    Background: Previous systematic reviews have indicated limited evidence and poor quality evaluations of clinical librarian (CL) services. Rigorous evaluations should demonstrate the value of CL services, but guidance is needed before this can be achieved. Objectives: To undertake a systematic review which examines models of CL services, quality, methods and perspectives of clinical librarian service evaluations. Methods: Systematic review methodology and synthesis of evidence, undertaken collaboratively by a group of 8 librarians to develop research and critical appraisal skills. Results: There are four clear models of clinical library service provision. Clinical librarians are effective in saving health professionals time, providing relevant, useful information and high quality services. Clinical librarians have a positive effect on clinical decision making by contributing to better informed decisions, diagnosis and choice of drug or therapy. The quality of CL studies is improving, but more work is needed on reducing bias and providing evidence of specific impacts on patient care. The Critical Incident Technique as part of a mixed method approach appears to offer a useful approach to demonstrating impact. Conclusions: This systematic review provides practical guidance regarding the evaluation of CL services. It also provides updated evidence regarding the effectiveness and impact of CL services. The approach used was successful in developing research and critical appraisal skills in a group of librarians. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.

  15. A Systematic Planning for Science Laboratory Instruction: Research-Based Evidence

    ERIC Educational Resources Information Center

    Balta, Nuri

    2015-01-01

    The aim of this study is to develop an instructional design model for science laboratory instruction. Well-known ID models were analysed, and the Dick and Carey model was imitated to produce a science laboratory instructional design (SLID) model. In order to validate the usability of the designed model, the views of 34 high school teachers related to…

  16. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    An increasing number of risk prediction models have been developed to estimate breast cancer risk in individual women. However, the performance of these models is questionable. We therefore conducted a study with the aim of systematically reviewing previous risk prediction models. The results of this review help to identify the most reliable model and indicate the strengths and weaknesses of each model, to guide future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies which constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail model and the Rosner and Colditz model were the significant models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistic: 0.53-0.66) and in external validation (concordance statistic: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be because of a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive to improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered to evaluate the improvement in performance of a newly developed model.
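    The two performance measures quoted in the review, the expected/observed calibration ratio and the concordance statistic, can be computed with a short sketch on synthetic data:

```python
# Sketch of the two model-performance measures reported in the review:
# the expected/observed (E/O) calibration ratio and the concordance
# statistic (equivalent to the AUC) for a binary outcome. Data are
# synthetic, not from any reviewed model.
def e_over_o(predicted_risks, outcomes):
    """Ratio of expected (sum of predicted risks) to observed events."""
    return sum(predicted_risks) / sum(outcomes)

def c_statistic(predicted_risks, outcomes):
    """Probability a random case is ranked above a random non-case."""
    pos = [p for p, y in zip(predicted_risks, outcomes) if y == 1]
    neg = [p for p, y in zip(predicted_risks, outcomes) if y == 0]
    pairs = concordant = 0.0
    for p1 in pos:
        for p0 in neg:
            pairs += 1
            if p1 > p0:
                concordant += 1
            elif p1 == p0:
                concordant += 0.5
    return concordant / pairs

risks = [0.9, 0.8, 0.4, 0.35, 0.3, 0.2, 0.15, 0.1]
outcomes = [1, 1, 0, 1, 0, 0, 0, 0]
eo = e_over_o(risks, outcomes)
cs = c_statistic(risks, outcomes)
```

    A well-calibrated model has an E/O ratio near 1, while a c-statistic near 0.5 means the model barely separates cases from non-cases, which is why the review recommends supplementing it with measures such as the net reclassification index.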

  17. Veterans' informal caregivers in the "sandwich generation": a systematic review toward a resilience model.

    PubMed

    Smith-Osborne, Alexa; Felderhoff, Brandi

    2014-01-01

    Social work theory advanced the formulation of the construct of the sandwich generation to apply to the emerging generational cohort of caregivers, most often middle-aged women, who were caring for maturing children and aging parents simultaneously. This systematic review extends that focus by synthesizing the literature on sandwich generation caregivers for the general aging population with dementia and for veterans with dementia and polytrauma. It develops potential protective mechanisms based on empirical literature to support an intervention resilience model for social work practitioners. This theoretical model addresses adaptive coping of sandwich-generation families facing ongoing challenges related to caregiving demands.

  18. A Model-Driven Approach to Teaching Concurrency

    ERIC Educational Resources Information Center

    Carro, Manuel; Herranz, Angel; Marino, Julio

    2013-01-01

    We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…

  19. From Cognitive Science to School Practice: Building the Bridge

    ERIC Educational Resources Information Center

    Singer, Mihaela

    2003-01-01

    The paper is focused on recent researches in neuroscience and developmental psychology regarding mathematical abilities of infants. A model that tries to explain these findings is developed. The model underlies the mental operations that could be systematically trained to generate efficient school learning. The model is built from a cognitive…

  20. Recent development of feature extraction and classification multispectral/hyperspectral images: a systematic literature review

    NASA Astrophysics Data System (ADS)

    Setiyoko, A.; Dharma, I. G. W. S.; Haryanto, T.

    2017-01-01

    Multispectral and hyperspectral data acquired from satellite sensors have the ability to detect various objects on the earth, ranging from low-scale to high-scale modeling. These data are increasingly being used to produce geospatial information for rapid analysis by running feature extraction or classification processes. Applying the model best suited to this data mining is still challenging because there are issues regarding accuracy and computational cost. The aim of this research is to develop a better understanding of object feature extraction and classification applied to satellite images by systematically reviewing related recent research projects. The method used in this research is based on the PRISMA statement. After deriving important points from trusted sources, pixel-based and texture-based feature extraction appear to be the most promising techniques in the recent development of feature extraction and classification.

  1. Normal forms for reduced stochastic climate models

    PubMed Central

    Majda, Andrew J.; Franzke, Christian; Crommelin, Daan

    2009-01-01

    The systematic development of reduced low-dimensional stochastic climate models from observations or comprehensive high-dimensional climate models is an important topic for atmospheric low-frequency variability, climate sensitivity, and improved extended range forecasting. Here techniques from applied mathematics are utilized to systematically derive normal forms for reduced stochastic climate models for low-frequency variables. The use of a few Empirical Orthogonal Functions (EOFs) (also known as Principal Component Analysis, Karhunen–Loève and Proper Orthogonal Decomposition) depending on observational data to span the low-frequency subspace requires the assessment of dyad interactions besides the more familiar triads in the interaction between the low- and high-frequency subspaces of the dynamics. It is shown below that the dyad and multiplicative triad interactions combine with the climatological linear operator interactions to simultaneously produce both strong nonlinear dissipation and Correlated Additive and Multiplicative (CAM) stochastic noise. For a single low-frequency variable the dyad interactions and climatological linear operator alone produce a normal form with CAM noise from advection of the large scales by the small scales and simultaneously strong cubic damping. These normal forms should prove useful for developing systematic strategies for the estimation of stochastic models from climate data. As an illustrative example the one-dimensional normal form is applied below to low-frequency patterns such as the North Atlantic Oscillation (NAO) in a climate model. The results here also illustrate the shortcomings of a recent linear scalar CAM noise model proposed elsewhere for low-frequency variability. PMID:19228943
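    The scalar normal form described above, strong cubic damping plus correlated additive and multiplicative (CAM) noise, can be illustrated with a simple Euler–Maruyama integration. The coefficient values are illustrative placeholders, not values estimated from climate data.

```python
# Euler-Maruyama sketch of a scalar normal form with cubic damping and
# CAM noise: dx = (F + a*x + b*x**2 - c*x**3) dt + (A - B*x) dW1 + s dW2.
# Coefficients are illustrative, not fitted to any climate record.
import random

def simulate_cam(steps=20000, dt=1e-3, F=0.0, a=-0.5, b=0.1, c=1.0,
                 A=0.5, B=0.3, s=0.2, x0=0.0, seed=1):
    rng = random.Random(seed)
    x, path = x0, []
    for _ in range(steps):
        dW1 = rng.gauss(0.0, dt ** 0.5)     # CAM noise increment
        dW2 = rng.gauss(0.0, dt ** 0.5)     # additive noise increment
        drift = F + a * x + b * x * x - c * x ** 3
        x += drift * dt + (A - B * x) * dW1 + s * dW2
        path.append(x)
    return path

path = simulate_cam()
```

    The cubic damping keeps the trajectory bounded even though the multiplicative noise amplitude depends on the state, which is the qualitative behavior the normal form is meant to capture for low-frequency patterns such as the NAO.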

  2. A systematic review of innovative diabetes care models in low- and middle-income countries (LMICs).

    PubMed

    Esterson, Yonah B; Carey, Michelle; Piette, John D; Thomas, Nihal; Hawkins, Meredith

    2014-02-01

    Over 70% of the world's patients with diabetes reside in low- and middle-income countries (LMICs), where adequate infrastructure and resources for diabetes care are often lacking. Therefore, academic institutions, health care organizations, and governments from Western nations and LMICs have worked together to develop a variety of effective diabetes care models for resource-poor settings. A focused search of PubMed was conducted with the goal of identifying reports that addressed the implementation of diabetes care models or initiatives to improve clinical and/or biochemical outcomes in patients with diabetes mellitus. A total of 15 published manuscripts comprising nine diabetes care models in 16 locations in sub-Saharan Africa, Latin America, and Asia identified by the above approach were systematically reviewed. The reviewed models shared a number of principles including collaboration, education, standardization, resource optimization, and technological innovation. The most comprehensive models used a number of these principles, which contributed to their success. Reviewing the principles shared by these successful programs may help guide the development of effective future models for diabetes care in low-income settings.

  3. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    PubMed Central

    Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2017-01-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170

  4. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    PubMed

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  5. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n = 8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers' satisfaction with teaching the curriculum was highly correlated with students' satisfaction (P < .05). Teachers' perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Conclusions and Implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  6. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

    To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
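The Spearman ρ statistic used in both versions of this record is simply the Pearson correlation computed on ranks. A minimal pure-Python sketch follows; the satisfaction scores are hypothetical illustrations, not data from the study:

```python
def ranks(xs):
    """Average ranks (1-based); tied values share their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    out = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j over any run of tied values
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1          # ranks are 1-based
        for k in range(i, j + 1):
            out[order[k]] = mean_rank
        i = j + 1
    return out

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical satisfaction scores for four teachers and their classes
teacher = [4.5, 3.8, 4.9, 4.1]
student = [4.2, 3.5, 4.8, 3.9]
print(spearman_rho(teacher, student))  # concordant ranks -> 1.0
```

Because it operates on ranks, ρ captures any monotone association, which is why it suits small samples of ordinal satisfaction ratings such as the n = 8 teachers here.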

  7. Health literacy and public health: A systematic review and integration of definitions and models

    PubMed Central

    2012-01-01

    Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings. PMID:22276600

  8. Developing and testing a landscape habitat suitability model for the American marten (Martes americana) in the Cascades mountains of California

    Treesearch

    Thomas A. Kirk; William J. Zielinski

    2009-01-01

    We used field surveys and Geographic Information System data to identify landscape-scale habitat associations of American martens (Martes americana) and to develop a model to predict their occurrence in northeastern California. Systematic surveys using primarily enclosed track plates, with 10-km spacing, were conducted across a 27,700 km²…

  9. Modeling How, When, and What Is Learned in a Simple Fault-Finding Task

    ERIC Educational Resources Information Center

    Ritter, Frank E.; Bibby, Peter A.

    2008-01-01

    We have developed a process model that learns in multiple ways while finding faults in a simple control panel device. The model predicts human participants' learning through its own learning. The model's performance was systematically compared to human learning data, including the time course and specific sequence of learned behaviors. These…

  10. The Corporate University Model for Continuous Learning, Training and Development.

    ERIC Educational Resources Information Center

    El-Tannir, Akram A.

    2002-01-01

    Corporate universities typically convey corporate culture and provide systematic curriculum aimed at achieving strategic objectives. Virtual access and company-specific content combine to provide opportunities for continuous and active learning, a model that is becoming pervasive. (Contains 17 references.) (SK)

  11. Practice makes perfect: A systematic review of the expertise development of pharmacist and nurse independent prescribers in the United Kingdom.

    PubMed

    Abuzour, Aseel S; Lewis, Penny J; Tully, Mary P

    2018-01-01

    Prescribing is a complex and error-prone task that demands expertise. McLellan et al.'s theory of expertise development model ("the model"), developed to assess medical literature on prescribing by medical students, proposes that in order to develop, individuals should deliberately engage their knowledge, skills and attitudes within a social context. Its applicability to independent prescribers (IPs) is unknown. A systematic review was conducted to explore whether the model is applicable to non-medical independent prescribing and to assess the factors underpinning expertise development reported in the literature. Six electronic databases (EMBASE, Medline, AMED, CINAHL, IPA and PsychInfo) were searched for articles published between 2006 and 2016 reporting empirical data on pharmacist and nurse IPs' education or practice. Data were extracted using themes from the model and analysed using framework analysis. Thirty-four studies met the inclusion criteria. Knowledge, pre-registration education, experience, support and confidence were some of the intrinsic and extrinsic factors influencing IPs. Difficulty in transferring theory to practice was attributed to the lack of basic pharmacology and bioscience content in pre-registration nursing rather than to the prescribing programme. Students saw interventions using virtual learning or learning in practice as more useful, with long-term benefits; for example, students were able to use their skills in history taking 6 months after a virtual learning programme. All studies demonstrated how engaging knowledge and skills affected individuals' attitudes by, for example, increasing professional dignity. IPs were able to develop their expertise when integrating their competencies in a workplace context with support from colleagues and adherence to guidelines. This is the first study to synthesize data systematically on expertise development from studies on IPs using the model.
The model showed the need for stronger foundations in scientific knowledge amongst some IPs, where continuous workplace practice can improve skills and strengthen attitudes. This could facilitate a smoother transfer of learnt theory to practice, in order for IPs to be experts within their fields and not merely adequately competent. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Use of Decision Models in the Development of Evidence-Based Clinical Preventive Services Recommendations: Methods of the U.S. Preventive Services Task Force.

    PubMed

    Owens, Douglas K; Whitlock, Evelyn P; Henderson, Jillian; Pignone, Michael P; Krist, Alex H; Bibbins-Domingo, Kirsten; Curry, Susan J; Davidson, Karina W; Ebell, Mark; Gillman, Matthew W; Grossman, David C; Kemper, Alex R; Kurth, Ann E; Maciosek, Michael; Siu, Albert L; LeFevre, Michael L

    2016-10-04

    The U.S. Preventive Services Task Force (USPSTF) develops evidence-based recommendations about preventive care based on comprehensive systematic reviews of the best available evidence. Decision models provide a complementary, quantitative approach to support the USPSTF as it deliberates about the evidence and develops recommendations for clinical and policy use. This article describes the rationale for using modeling, an approach to selecting topics for modeling, and how modeling may inform recommendations about clinical preventive services. Decision modeling is useful when clinical questions remain about how to target an empirically established clinical preventive service at the individual or program level or when complex determinations of magnitude of net benefit, overall or among important subpopulations, are required. Before deciding whether to use decision modeling, the USPSTF assesses whether the benefits and harms of the preventive service have been established empirically, assesses whether there are key issues about applicability or implementation that modeling could address, and then defines the decision problem and key questions to address through modeling. Decision analyses conducted for the USPSTF are expected to follow best practices for modeling. For chosen topics, the USPSTF assesses the strengths and limitations of the systematically reviewed evidence and the modeling analyses and integrates the results of each to make preventive service recommendations.

  13. Knowledge brokering on emissions modelling in Strategic Environmental Assessment of Estonian energy policy with special reference to the LEAP model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuldna, Piret, E-mail: piret.kuldna@seit.ee; Peterson, Kaja; Kuhi-Thalfeldt, Reeli

    Strategic Environmental Assessment (SEA) serves as a platform for bringing together researchers, policy developers and other stakeholders to evaluate and communicate significant environmental and socio-economic effects of policies, plans and programmes. Quantitative computer models can facilitate knowledge exchange between various parties that strive to use scientific findings to guide policy-making decisions. The process of facilitating knowledge generation and exchange, i.e. knowledge brokerage, has been increasingly explored, but there is not much evidence in the literature on how knowledge brokerage activities are used in full cycles of SEAs which employ quantitative models. We report on the SEA process of the national energy plan with reflections on where and how the Long-range Energy Alternatives Planning (LEAP) model was used for knowledge brokerage on emissions modelling between researchers and policy developers. Our main suggestion is that applying a quantitative model not only in ex ante, but also ex post scenario modelling and associated impact assessment can facilitate a systematic and inspiring knowledge exchange process on a policy problem and capacity building of participating actors. Highlights: • We examine the knowledge brokering on emissions modelling between researchers and policy developers in a full cycle of SEA. • Knowledge exchange process can evolve at any modelling stage within SEA. • Ex post scenario modelling enables systematic knowledge exchange and learning on a policy problem.

  14. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    NASA Astrophysics Data System (ADS)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.
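The parameter-variation strategy this record describes, rerunning the model with each poorly constrained ingredient perturbed and quoting the spread of predictions as a systematic uncertainty, can be sketched generically. The toy cross-section and parameter names below are hypothetical stand-ins, not QGSJET internals:

```python
def systematic_spread(model, variants):
    """Run `model` once per parameter variant and quote the half-range
    of the predictions as a systematic uncertainty."""
    preds = [model(**v) for v in variants]
    centre = sum(preds) / len(preds)
    half_range = (max(preds) - min(preds)) / 2
    return centre, half_range

# Toy stand-in for a hadronic observable with two uncertain knobs:
# a parton-structure slope and a diffractive fraction (illustrative only).
def toy_cross_section(pdf_slope, diff_frac):
    return 100.0 * (1 + 0.1 * pdf_slope) * (1 - 0.05 * diff_frac)

variants = [
    {"pdf_slope": 0.3, "diff_frac": 0.10},   # baseline
    {"pdf_slope": 0.5, "diff_frac": 0.10},   # harder structure function
    {"pdf_slope": 0.3, "diff_frac": 0.15},   # more diffraction
]
centre, syst = systematic_spread(toy_cross_section, variants)
print(f"{centre:.1f} +/- {syst:.1f} (syst.)")
```

In practice each "variant" would be a full air-shower simulation with one model assumption changed, and the spread over variants is reported alongside the statistical error.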

  15. Predictors of human immunodeficiency virus (HIV) infection in primary care among adults living in developed countries: a systematic review.

    PubMed

    Rumbwere Dube, Benhildah N; Marshall, Tom P; Ryan, Ronan P; Omonijo, Modupe

    2018-06-02

    Early diagnosis of human immunodeficiency virus (HIV) is important because antiretroviral therapies are more effective if infected individuals are diagnosed early. Diagnosis of HIV relies on laboratory testing and determining the demographic and clinical characteristics of undiagnosed HIV-infected patients may be useful in identifying patients for testing. This systematic review aims to identify characteristics of HIV-infected adults prior to diagnosis that could be used in a prediction model for early detection of patients for HIV testing in UK primary care. The population of interest was adults aged ≥ 18 years in developed countries. The exposures were demographic, socio-economic or clinical characteristics associated with the outcome, laboratory confirmed HIV/AIDS infection. Observational studies with a comparator group were included in the systematic review. Electronic searches for articles from January 1995 to April 2016 were conducted on online databases of EMBASE, MEDLINE, The Cochrane Library and grey literature. Two reviewers selected studies for inclusion. A checklist was developed for quality assessment, and a data extraction form was created to collate data from selected studies. Full-text screening of 429 articles identified 17 cohort and case-control studies, from 26,819 retrieved articles. Demographic and socio-economic characteristics associated with HIV infection included age, gender and measures of deprivation. Lifestyle choices identified were drug use, binge-drinking, number of lifetime partners and having a partner with risky behaviour. Eighteen clinical features and comorbid conditions identified in this systematic review are included in the 51 conditions listed in the British HIV Association guidelines. Additional clinical features and comorbid conditions identified but not specified in the guidelines included hyperlipidemia, hypertension, minor trauma and diabetes. 
This systematic review consolidates existing scientific evidence on characteristics of HIV-infected individuals that could be used to inform decision making in prognostic model development. Further exploration of availability of some of the demographic and behavioural predictors of HIV, such as ethnicity, number of lifetime partners and partner characteristics, in primary care records will be required to determine whether they can be applied in the prediction model.
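Prediction models of the kind this review aims to inform are usually logistic risk scores: a weighted sum of patient characteristics passed through the logistic function. A minimal sketch follows; the coefficient names and values are hypothetical illustrations, not estimates from this review or any fitted model:

```python
import math

def predicted_risk(features, coefs, intercept):
    """Logistic risk score: linear predictor -> probability via the
    logistic (sigmoid) function, the usual form of a clinical
    prediction model."""
    score = intercept + sum(coefs[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-score))

# Hypothetical coefficients for illustration only
coefs = {"age_over_35": 0.4, "deprivation_quintile": 0.2, "indicator_condition": 1.1}
patient = {"age_over_35": 1, "deprivation_quintile": 3, "indicator_condition": 1}
risk = predicted_risk(patient, coefs, intercept=-4.0)
```

In deployment the coefficients would be estimated from primary-care records containing the demographic, behavioural and clinical predictors the review identifies, then validated on held-out data.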

  16. Reliable inference of light curve parameters in the presence of systematics

    NASA Astrophysics Data System (ADS)

    Gibson, Neale P.

    2016-10-01

    Time-series photometry and spectroscopy of transiting exoplanets allow us to study their atmospheres. Unfortunately, the required precision to extract atmospheric information surpasses the design specifications of most general purpose instrumentation. This results in instrumental systematics in the light curves that are typically larger than the target precision. Systematics must therefore be modelled, leaving the inference of light-curve parameters conditioned on the subjective choice of systematics models and model-selection criteria. Here, I briefly review the use of systematics models commonly used for transmission and emission spectroscopy, including model selection, marginalisation over models, and stochastic processes. These form a hierarchy of models with increasing degree of objectivity. I argue that marginalisation over many systematics models is a minimal requirement for robust inference. Stochastic models provide even more flexibility and objectivity, and therefore produce the most reliable results. However, no systematics models are perfect, and the best strategy is to compare multiple methods and repeat observations where possible.
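The marginalisation over systematics models that this record advocates can be sketched as evidence-weighted model averaging: each systematics model yields a parameter estimate, an uncertainty, and a log-evidence, and the final answer is the weighted mixture. This is a minimal sketch under the assumption that each model's posterior is summarised as a Gaussian; the numbers are hypothetical:

```python
import math

def marginalise(estimates):
    """Evidence-weighted average over systematics models.

    `estimates` is a list of (mu, sigma, ln_evidence) tuples: each model's
    parameter estimate, its uncertainty, and the model's log-evidence
    (a BIC-based approximation would play the same role).
    """
    ln_e = [e[2] for e in estimates]
    m = max(ln_e)                                   # subtract max for stability
    w = [math.exp(l - m) for l in ln_e]
    total = sum(w)
    w = [x / total for x in w]
    mean = sum(wi * mu for wi, (mu, _s, _e) in zip(w, estimates))
    # mixture variance: within-model scatter plus between-model scatter
    var = sum(wi * (s ** 2 + mu ** 2)
              for wi, (mu, s, _e) in zip(w, estimates)) - mean ** 2
    return mean, math.sqrt(var)

# Hypothetical transit-depth estimates under three systematics models
models = [(0.0102, 0.0003, -10.0), (0.0105, 0.0004, -11.5), (0.0099, 0.0003, -13.0)]
depth, err = marginalise(models)
```

The between-model term in the variance is what inflates the error bar when different systematics models disagree, which is exactly the robustness the record argues a single hand-picked model cannot provide.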

  17. Entropy and Galilean Invariance of Lattice Boltzmann Theories

    NASA Astrophysics Data System (ADS)

    Chikatamarla, Shyam S.; Karlin, Iliya V.

    2006-11-01

    A theory of lattice Boltzmann (LB) models for hydrodynamic simulation is developed upon a novel relation between entropy construction and roots of Hermite polynomials. A systematic procedure is described for constructing numerically stable and complete Galilean invariant LB models. The stability of the new LB models is illustrated with a shock tube simulation.

  18. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Business model framework applications in health care: A systematic review.

    PubMed

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  20. Analytic Methods for Adjusting Subjective Rating Schemes.

    ERIC Educational Resources Information Center

    Cooper, Richard V. L.; Nelson, Gary R.

    Statistical and econometric techniques of correcting for supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…

  1. The Multiple Component Alternative for Gifted Education.

    ERIC Educational Resources Information Center

    Swassing, Ray

    1984-01-01

    The Multiple Component Model (MCM) of gifted education includes instruction which may overlap in literature, history, art, enrichment, languages, science, physics, math, music, and dance. The model rests on multifactored identification and requires systematic development and selection of components with ongoing feedback and evaluation. (CL)

  2. An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys

    NASA Astrophysics Data System (ADS)

    Seljak, Uros

    Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. 
We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we improved the redshift space distortion growth rate measurement precision by a factor of 2.5 using customized clustering statistics in the non-linear regime that were immunized against observational systematics. We look forward to addressing the unique challenges of modeling and empirically characterizing the WFIRST galaxies and observational systematics.

  3. Whither Sex Education? Excellence in Comprehensive Program Development.

    ERIC Educational Resources Information Center

    Southern, Stephen

    A review of recent sex education literature is presented in an attempt to integrate observations and recommendations related to both program development and innovation acceptance. A Developmental Research and Utilization Model is employed to systematically guide planning, implementation, evaluation, advocacy, and institutionalization. Curriculum…

  4. Identification and characterization of outcome measures reported in animal models of epilepsy: Protocol for a systematic review of the literature-A TASK2 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Simonato, Michele; Iyengar, Sloka; Brooks-Kayal, Amy; Collins, Stephen; Depaulis, Antoine; Howells, David W; Jensen, Frances; Liao, Jing; Macleod, Malcolm R; Patel, Manisha; Potschka, Heidrun; Walker, Matthew; Whittemore, Vicky; Sena, Emily S

    2017-11-01

    Current antiseizure therapy is ineffective in approximately one third of people with epilepsy and is often associated with substantial side effects. In addition, most current therapeutic paradigms offer treatment, but not cure, and no therapies are able to modify the underlying disease, that is, to prevent or halt the process of epileptogenesis or to alleviate the cognitive and psychiatric comorbidities. Preclinical research in the field of epilepsy has been extensive, but unfortunately, not all the animal models in use have been validated for their predictive value. The overall goal of TASK2 of the AES/ILAE Translational Task Force is to organize and coordinate systematic reviews on selected topics regarding animal research in epilepsy. Herein we describe our strategy. In the first part of the paper we provide an overview of the usefulness of systematic reviews and meta-analyses for preclinical research and explain the essentials of their conduct. We then describe in detail the protocol for a first systematic review, which will focus on the identification and characterization of outcome measures reported in animal models of epilepsy. The specific goals of this study are to define systematically the phenotypic characteristics of the most commonly used animal models and to compare these effectively with the manifestations of human epilepsy. This will provide epilepsy researchers with detailed information on the strengths and weaknesses of epilepsy models, facilitating their refinement and future research. Ultimately, this could lead to a refined use of relevant models for understanding the mechanism(s) of the epilepsies and developing novel therapies. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  5. Assessing the complexity of interventions within systematic reviews: development, content and use of a new tool (iCAT_SR).

    PubMed

    Lewin, Simon; Hendry, Maggie; Chandler, Jackie; Oxman, Andrew D; Michie, Susan; Shepperd, Sasha; Reeves, Barnaby C; Tugwell, Peter; Hannes, Karin; Rehfuess, Eva A; Welch, Vivien; Mckenzie, Joanne E; Burford, Belinda; Petkovic, Jennifer; Anderson, Laurie M; Harris, Janet; Noyes, Jane

    2017-04-26

    Health interventions fall along a spectrum from simple to more complex. There is wide interest in methods for reviewing 'complex interventions', but few transparent approaches for assessing intervention complexity in systematic reviews. Such assessments may assist review authors in, for example, systematically describing interventions and developing logic models. This paper describes the development and application of the intervention Complexity Assessment Tool for Systematic Reviews (iCAT_SR), a new tool to assess and categorise levels of intervention complexity in systematic reviews. We developed the iCAT_SR by adapting and extending an existing complexity assessment tool for randomized trials. We undertook this adaptation using a consensus approach in which possible complexity dimensions were circulated for feedback to a panel of methodologists with expertise in complex interventions and systematic reviews. Based on these inputs, we developed a draft version of the tool. We then invited a second round of feedback from the panel and a wider group of systematic reviewers. This informed further refinement of the tool. The tool comprises ten dimensions: (1) the number of active components in the intervention; (2) the number of behaviours of recipients to which the intervention is directed; (3) the range and number of organizational levels targeted by the intervention; (4) the degree of tailoring intended or flexibility permitted across sites or individuals in applying or implementing the intervention; (5) the level of skill required by those delivering the intervention; (6) the level of skill required by those receiving the intervention; (7) the degree of interaction between intervention components; (8) the degree to which the effects of the intervention are context dependent; (9) the degree to which the effects of the intervention are changed by recipient or provider factors; and (10) the nature of the causal pathway between intervention and outcome.
Dimensions 1-6 are considered 'core' dimensions. Dimensions 7-10 are optional and may not be useful for all interventions. The iCAT_SR tool facilitates more in-depth, systematic assessment of the complexity of interventions in systematic reviews and can assist in undertaking reviews and interpreting review findings. Further testing of the tool is now needed.

  6. Thirty Years of Improving the NCEP Global Forecast System

    NASA Astrophysics Data System (ADS)

    White, G. H.; Manikin, G.; Yang, F.

    2014-12-01

    Current eight-day forecasts by the NCEP Global Forecast System are as accurate as five-day forecasts were 30 years ago. This revolution in weather forecasting reflects increases in computer power, improvements in the assimilation of observations (especially satellite data), improvements in model physics and in the observations themselves, and international cooperation and competition. One important component has been, and remains, the diagnosis, evaluation, and reduction of systematic errors. The effect of proposed improvements in the GFS on systematic errors is one component of the thorough testing of such improvements by the Global Climate and Weather Modeling Branch. Examples of reductions in systematic errors in zonal mean temperatures and winds and other fields will be presented. One challenge in evaluating systematic errors is uncertainty in what reality is. Model initial states can be regarded as the best overall depiction of the atmosphere, but can be misleading in areas of few observations or for fields not well observed, such as humidity or precipitation over the oceans. Verification of model physics is particularly difficult. The Environmental Modeling Center emphasizes the evaluation of systematic biases against observations. Recently EMC has placed greater emphasis on synoptic evaluation and on precipitation, 2-meter temperatures and dew points, and 10-meter winds. A weekly EMC map discussion reviews the performance of many models over the United States and has helped diagnose and alleviate significant systematic errors in the GFS, including a near-surface summertime evening cold, wet bias over the eastern US and a multi-week period when the GFS persistently developed bogus tropical storms off Central America. The GFS exhibits a wet bias for light rain and a dry bias for moderate to heavy rain over the continental United States. Significant changes to the GFS are scheduled to be implemented in the fall of 2014.
These include higher resolution, improved physics and improvements to the assimilation. These changes significantly improve the tropospheric flow and reduce a tropical upper tropospheric warm bias. One important error remaining is the failure of the GFS to maintain deep convection over Indonesia and in the tropical west Pacific. This and other current systematic errors will be presented.

  7. Supporting adherence and healthy lifestyles in leg ulcer patients: systematic development of the Lively Legs program for dermatology outpatient clinics.

    PubMed

    Heinen, Maud M; Bartholomew, L Kay; Wensing, Michel; van de Kerkhof, Peter; van Achterberg, Theo

    2006-05-01

    The objective of our project was to develop a lifestyle program for leg ulcer patients at outpatient clinics for dermatology. We used the intervention-mapping (IM) framework for systematically developing theory- and evidence-based health promotion programs. We started with a needs assessment. A multidisciplinary project group of health care workers and patients was involved in all five IM steps: formulating proximal program objectives, selecting methods and strategies, producing program components, planning for adoption and implementation, and planning for evaluation. Several systematic literature reviews and original studies were performed to support this process. Social Cognitive Theory was selected as the main theory behind the 'Lively Legs' program and was combined with elements of Goal-Setting Theory, the precaution adoption model and motivational interviewing. The program is delivered through health counseling by dermatology nurses and was successfully pre-tested. Implementation and evaluation plans were also made. Intervention mapping helped us succeed in developing a lifestyle program with clear goals and methods, operational strategies and materials, and clear procedures. Coaching leg ulcer patients towards adherence to compression therapy and healthy lifestyles should be taken on without delay. Systematic development of lifestyle programs for other patient groups should be encouraged.

  8. Optimization of large animal MI models; a systematic analysis of control groups from preclinical studies.

    PubMed

    Zwetsloot, P P; Kouwenberg, L H J A; Sena, E S; Eding, J E; den Ruijter, H M; Sluijter, J P G; Pasterkamp, G; Doevendans, P A; Hoefer, I E; Chamuleau, S A J; van Hout, G P J; Jansen Of Lorkeers, S J

    2017-10-27

    Large animal models are essential for the development of novel therapeutics for myocardial infarction (MI). To optimize translation, we need to assess the effect of experimental design on disease outcome and to model experimental design to resemble the clinical course of MI. The aim of this study is therefore to systematically investigate how experimental decisions affect outcome measurements in large animal MI models. We used control-animal data from two independent meta-analyses of large animal MI models. All variables of interest were pre-defined. We performed univariable and multivariable meta-regression to analyze whether these variables influenced infarct size and ejection fraction. Our analyses incorporated 246 relevant studies. Multivariable meta-regression revealed that infarct size and cardiac function were influenced independently by choice of species, sex, co-medication, occlusion type, occluded vessel, quantification method, ischemia duration and follow-up duration. We provide strong systematic evidence that commonly used endpoints depend significantly on study design and biological variation. This makes direct comparison of different study results difficult and calls for standardized models. Researchers should take this into account when designing large animal studies to most closely mimic the clinical course of MI and enable translational success.
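    Univariable meta-regression of the kind described above amounts to inverse-variance-weighted least squares of a per-study effect estimate on a candidate moderator. A minimal sketch with made-up numbers (illustrative only, not the study's data or code):

```python
import numpy as np

# Toy per-study data: does infarct size depend on ischemia duration?
rng = np.random.default_rng(2)
n = 40
duration = rng.uniform(30, 120, n)             # minutes of occlusion
se = rng.uniform(1.0, 3.0, n)                  # per-study standard errors
infarct = 5.0 + 0.15 * duration + se * rng.standard_normal(n)

# Weighted least squares with inverse-variance weights
X = np.column_stack([np.ones(n), duration])    # intercept + moderator
W = np.diag(1.0 / se**2)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ infarct)
cov = np.linalg.inv(X.T @ W @ X)               # covariance of estimates
z = beta[1] / np.sqrt(cov[1, 1])               # Wald test for the moderator
print(f"slope = {beta[1]:.3f} per minute, z = {z:.1f}")
```

    A significant slope indicates that the endpoint shifts systematically with the design choice, which is exactly why the abstract argues that study results with different ischemia durations are not directly comparable.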

  9. Systematics-insensitive Periodic Signal Search with K2

    NASA Astrophysics Data System (ADS)

    Angus, Ruth; Foreman-Mackey, Daniel; Johnson, John A.

    2016-02-01

    From pulsating stars to transiting exoplanets, the search for periodic signals in data from K2, Kepler’s two-wheeled extension, is relevant to a long list of scientific goals. Systematics affecting K2 light curves due to the decreased spacecraft pointing precision inhibit the easy extraction of periodic signals from the data. Here we develop a method for producing periodograms of K2 light curves that are insensitive to pointing-induced systematics: the Systematics-insensitive Periodogram (SIP). Traditional sine-fitting periodograms use a generative model to find the frequency of a sinusoid that best describes the data. We extend this principle by including in the generative model, alongside a sum of sine and cosine functions over a grid of frequencies, systematic trends based on a set of “eigen light curves,” following Foreman-Mackey et al. Using this method we are able to produce periodograms with vastly reduced systematic features. The quality of the resulting periodograms is such that we can recover acoustic oscillations in giant stars and measure stellar rotation periods without the need for any detrending. The algorithm is also applicable to the detection of other periodic phenomena such as variable stars, eclipsing binaries and short-period exoplanet candidates. The SIP code is available at https://github.com/RuthAngus/SIPK2.
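    The core of the SIP is a linear least-squares fit in which the sinusoid at each trial frequency competes with a systematics basis. A minimal sketch of that idea, using synthetic data and a single hand-made stand-in for an "eigen light curve" (this is an illustration of the principle, not the published SIPK2 code):

```python
import numpy as np

def sip_power(t, y, freqs, basis):
    """At each trial frequency, fit sin + cos terms jointly with a
    linear systematics basis by least squares, and record the
    fraction of variance explained."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        # Design matrix: sinusoid at trial frequency + systematics basis
        A = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t),
                             basis])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coeffs
        power[i] = 1.0 - resid.var() / y.var()
    return power

# Toy light curve: 7-day sinusoid + linear systematic trend + noise
rng = np.random.default_rng(0)
t = np.linspace(0, 80, 2000)
trend = 0.05 * (t - 40)                        # stand-in "eigen light curve"
y = np.sin(2 * np.pi * t / 7.0) + trend + 0.3 * rng.standard_normal(t.size)

freqs = np.linspace(0.01, 0.5, 500)
power = sip_power(t, y, freqs, basis=trend[:, None])
print(1.0 / freqs[np.argmax(power)])           # close to the injected 7-day period
```

    Because the trend lives in the basis, it is absorbed at every trial frequency, so the periodogram peaks at the injected period rather than at the systematic.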

  10. Principal component analysis-based anatomical motion models for use in adaptive radiation therapy of head and neck cancer patients

    NASA Astrophysics Data System (ADS)

    Chetvertkov, Mikhail A.

    Purpose: To develop standard and regularized principal component analysis (PCA) models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients, assess their potential use in adaptive radiation therapy (ART), and to extract quantitative information for treatment response assessment. Methods: Planning CT (pCT) images of H&N patients were artificially deformed to create "digital phantom" images, which modeled systematic anatomical changes during Radiation Therapy (RT). Artificial deformations closely mirrored patients' actual deformations, and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms), and between pCT and clinical CBCTs. Patient-specific standard PCA (SPCA) and regularized PCA (RPCA) models were built from these synthetic and clinical DVF sets. Eigenvectors, or eigenDVFs (EDVFs), having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Modeled anatomies were used to assess the dose deviations with respect to the planned dose distribution. Results: PCA models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade SPCA's ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes, and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. 
    For dose assessment, it was shown that the modeled dose distribution differed from the planned dose for the parotid glands, owing to their shrinkage and shift into higher-dose volumes during the radiotherapy course. Modeled DVHs still underestimated the effect of parotid shrinkage due to the large compression factor (CF) used to acquire DVFs. Conclusion: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical changes during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable than SPCA at capturing systematic changes, enabling dosimetric consequences to be projected to future treatment fractions based on trends established early in a treatment course or, potentially, on population models. This work showed that PCA has potential for identifying the major modes of anatomical change during the radiotherapy course and that subsequent use of this information in future dose predictions is feasible. Use of smaller CF values for DVFs is preferred; otherwise, anatomical motion will be underestimated.
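    The operation underlying both PCA variants above, extracting eigenDVFs from per-fraction deformation fields, can be sketched with plain SVD-based PCA. Synthetic data only: each flattened DVF is one row, a systematic mode grows over 35 fractions against random fraction-to-fraction motion (the regularized variant and the clinical pipeline are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n_fractions, n_voxels = 35, 600
systematic_mode = rng.standard_normal(n_voxels)   # e.g. parotid shrinkage pattern
progress = np.linspace(0, 1, n_fractions)         # grows over the treatment course
dvfs = (np.outer(progress, systematic_mode)       # systematic change
        + 0.2 * rng.standard_normal((n_fractions, n_voxels)))  # random motion

X = dvfs - dvfs.mean(axis=0)                      # center across fractions
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)                   # variance per eigenDVF
edvf1 = Vt[0]                                     # leading "eigenDVF"

# The leading mode should align with the injected systematic change
corr = abs(np.corrcoef(edvf1, systematic_mode)[0, 1])
print(round(explained[0], 2), round(corr, 2))
```

    When the random component is made larger relative to the systematic one, the leading mode degrades, which mirrors the abstract's observation that random changes prevent standard PCA from detecting the underlying systematic change.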

  11. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models

    PubMed Central

    Ataman, Meric

    2017-01-01

    Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks, as they encapsulate all known metabolic capabilities of an organism, from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in practical applications. Although reduced models are commonly used for modeling and for integrating experimental data, they are often inconsistent across different studies and laboratories because of differences in reduction criteria and level of detail, which can compromise the transferability of findings and the integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner, focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to capture key properties of the genome-scale models and to preserve consistency in terms of biomass and by-product yields, flux and concentration variability, and gene essentiality. The development of these “consistently reduced” models will help to clarify and facilitate the integration of different experimental data to draw new understanding that can be directly extended to genome-scale models. PMID:28727725

  12. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models in use around the world was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included. The survey is tutorial in nature, describing the features of the various models in a systematic manner.

  13. Producing Cochrane systematic reviews-a qualitative study of current approaches and opportunities for innovation and improvement.

    PubMed

    Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian

    2017-08-01

    Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in health, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how best to achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with the increasing complexity of review methods; the difficulty of keeping authors on board and on track; and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps, breaking reviews into smaller "chunks", and improving approaches to building capacity of, and sharing information between, authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.

  14. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  15. A Point Kinetics Model for Estimating Neutron Multiplication of Bare Uranium Metal in Tagged Neutron Measurements

    DOE PAGES

    Tweardy, Matthew C.; McConchie, Seth; Hayward, Jason P.

    2017-06-13

    An extension of the point kinetics model is developed in this paper to describe the neutron multiplicity response of a bare uranium object under interrogation by an associated particle imaging deuterium-tritium (D-T) measurement system. This extended model is used to estimate the total neutron multiplication of the uranium. Both MCNPX-PoliMi simulations and data from active interrogation measurements of highly enriched and depleted uranium geometries are used to evaluate the potential of this method and to identify the sources of systematic error. The detection efficiency correction for measured coincidence response is identified as a large source of systematic error. If the detection process is not considered, results suggest that the method can estimate total multiplication to within 13% of the simulated value. Values for multiplicity constants in the point kinetics equations are sensitive to enrichment due to (n, xn) interactions by D-T neutrons and can introduce another significant source of systematic bias. This can theoretically be corrected if isotopic composition is known a priori. Finally, the spatial dependence of multiplication is also suspected of introducing further systematic bias for high multiplication uranium objects.

  16. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology.

    PubMed

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice

    2017-02-01

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.

  17. Teachers' Attitudes toward Reporting Child Sexual Abuse: Problems with Existing Research Leading to New Scale Development

    ERIC Educational Resources Information Center

    Walsh, Kerryann; Rassafiani, Mehdi; Mathews, Ben; Farrell, Ann; Butler, Des

    2010-01-01

    This paper details a systematic literature review identifying problems in extant research relating to teachers' attitudes toward reporting child sexual abuse and offers a model for new attitude scale development and testing. Scale development comprised a five-phase process grounded in contemporary attitude theories, including (a) developing the…

  18. Development of a Systematic Stakeholder Identification System for 3VS Modeling in the Snohomish Basin, Washington, USA

    EPA Science Inventory

    In the Environmental Protection Agency’s Triple Value Simulation (3VS) models, social, economic and environmental indicators are utilized to understand the interrelated impacts of programs and regulations on ecosystems and human communities. Critical to identifying the app...

  19. Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André

    2016-01-01

    Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…

  20. Exploratory and problem-solving consumer behavior across the life span.

    PubMed

    Lesser, J A; Kunkel, S R

    1991-09-01

    Different cognitive functioning, social, and personality changes appear to occur systematically during the adult life span. This article synthesizes research on life span changes in order to develop age-specific models of shopping behavior. The models are tested within a naturalistic field study of shoppers.

  1. A Transformational Bilingual Model for Teacher Education.

    ERIC Educational Resources Information Center

    Moheno, Phil; Pacheco, Richard

    At San Diego State University, the training program for bilingual education teachers was developed to systematically accommodate changing needs in education, particularly the needs to educate students with academic proficiency in both Spanish and English and to have a multicultural perspective. The emerging teacher education model empowers…

  2. Attention Gating in Short-Term Visual Memory.

    ERIC Educational Resources Information Center

    Reeves, Adam; Sperling, George

    1986-01-01

    An experiment is conducted showing that an attention shift to a stream of numerals presented in rapid serial visual presentation mode produces not a total loss, but a systematic distortion of order. An attention gating model (AGM) is developed from a more general attention model. (Author/LMO)

  3. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    PubMed

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. 
For example, random-effects meta-analyses of the performance of the 'Gail 2' model showed the average C statistic was 0.63 (95% CI 0.59-0.67), and the expected/observed ratio of events varied considerably across studies (95% prediction interval for E/O ratio when the model was applied in practice was 0.75-1.19). There is a need for models with better predictive performance but, given the large amount of work already conducted, further improvement of existing models based on conventional risk factors is perhaps unlikely. Research to identify new risk factors with large additionally predictive ability is therefore needed, alongside clearer reporting and continual validation of new models as they develop.
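    A random-effects meta-analysis of a performance statistic such as the C statistic can be pooled with the standard DerSimonian-Laird estimator. A minimal sketch with illustrative values (not the review's data):

```python
import numpy as np

# Hypothetical C statistics and standard errors from validation studies
c = np.array([0.58, 0.66, 0.60, 0.69, 0.61])
se = np.array([0.02, 0.03, 0.025, 0.04, 0.03])

# DerSimonian-Laird random-effects pooling
w = 1.0 / se**2                         # fixed-effect (inverse-variance) weights
c_fe = np.sum(w * c) / np.sum(w)
Q = np.sum(w * (c - c_fe)**2)           # Cochran's Q heterogeneity statistic
df = len(c) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)             # random-effects weights
c_re = np.sum(w_re * c) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
lo, hi = c_re - 1.96 * se_re, c_re + 1.96 * se_re
print(f"pooled C = {c_re:.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.4f}")
```

    A nonzero tau-squared reflects between-study heterogeneity, which is also what widens a prediction interval for the model's expected/observed ratio relative to the confidence interval for the mean.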

  4. Effectiveness of chronic care models: opportunities for improving healthcare practice and health outcomes: a systematic review.

    PubMed

    Davy, Carol; Bleasel, Jonathan; Liu, Hueiming; Tchan, Maria; Ponniah, Sharon; Brown, Alex

    2015-05-10

    The increasing prevalence of chronic disease, and even multiple chronic diseases, faced by both developed and developing countries is of considerable concern. Many of the interventions to address this within primary healthcare settings are based on a chronic care model first developed by the MacColl Institute for Healthcare Innovation at Group Health Cooperative. This systematic literature review aimed to identify and synthesise international evidence on the effectiveness of elements that have been included in a chronic care model for improving healthcare practices and health outcomes within primary healthcare settings. The review broadens the work of other similar reviews by focusing on the effectiveness of healthcare practice as well as health outcomes associated with implementing a chronic care model. In addition, relevant case series and case studies were also included. Of the 77 papers which met the inclusion criteria, all but two reported improvements to healthcare practice or health outcomes for people living with chronic disease. While the most commonly used elements of a chronic care model were self-management support and delivery system design, there were considerable variations between studies regarding which combination of elements was included, as well as the way in which chronic care model elements were implemented. This meant that it was impossible to clearly identify any optimal combination of chronic care model elements that led to the reported improvements. While the main argument for excluding papers reporting case studies and case series from systematic literature reviews is that they are not of sufficient quality or generalizability, we found that they provided a more detailed account of how various chronic care models were developed and implemented.
In particular, these papers suggested that several factors including supporting reflective healthcare practice, sending clear messages about the importance of chronic disease care and ensuring that leaders support the implementation and sustainability of interventions may have been just as important as a chronic care model's elements in contributing to the improvements in healthcare practice or health outcomes for people living with chronic disease.

  5. Assurance of Learning in the MIS Program

    ERIC Educational Resources Information Center

    Harper, Jeffrey S.; Harder, Joseph T.

    2009-01-01

    This article describes the development of a systematic and practical methodology for assessing program effectiveness and monitoring student development in undergraduate decision sciences programs. The model we present is based on a student's progression through learning stages associated with four key competencies: technical, analytical,…

  6. Controlled human infection models for vaccine development: Zika virus debate.

    PubMed

    Gopichandran, Vijayaprasad

    2018-01-01

    An ethics panel, convened by the National Institutes of Health and other research bodies in the USA, disallowed researchers from Johns Hopkins University and the University of Vermont from performing controlled human infection of healthy volunteers to develop a vaccine against Zika virus infection. The members published their ethical analysis and recommendations in February 2017. They have elaborated on the risks posed by human challenge with Zika virus to the volunteers and other uninvolved third parties and have systematically analysed the social value of such a human challenge experiment. They have also posited some mandatory ethical requirements which should be met before allowing the infection of healthy volunteers with the Zika virus. This commentary elaborates on the debate on the ethics of the human challenge model for the development of a Zika virus vaccine and the role of systematic ethical analysis in protecting the interests of research participants. It further analyses the importance of this debate to the development of a Zika vaccine in India.

  7. Real-time PCR machine system modeling and a systematic approach for the robust design of a real-time PCR-on-a-chip system.

    PubMed

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.
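    Accelerated life testing of the kind mentioned in this abstract is commonly based on the Arrhenius acceleration model. The sketch below is a generic illustration of how an acceleration factor converts stress-test hours into equivalent use hours; the activation energy, temperatures, and test duration are hypothetical values chosen for illustration, not figures from the study.

```python
import math

# Boltzmann constant in eV/K, as used in the Arrhenius acceleration model.
BOLTZMANN_EV = 8.617e-5

def acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between use and stress temperatures.

    ea_ev      -- activation energy of the dominant failure mechanism (eV)
    t_use_c    -- use temperature (degrees Celsius)
    t_stress_c -- elevated stress-test temperature (degrees Celsius)
    """
    t_use = t_use_c + 273.15       # convert to kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Hypothetical example: a 500-hour stress test at 85 degC with an assumed
# activation energy of 0.7 eV, for a device used at 25 degC.
af = acceleration_factor(0.7, 25.0, 85.0)
equivalent_use_hours = 500 * af    # hours of field use the test represents
```

With these illustrative numbers the acceleration factor is roughly two orders of magnitude, so a test of a few weeks can stand in for several years of continuous operation, which is the kind of claim the abstract's "over three years" reliability statement rests on.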

  8. Collaborative School Improvement: An Integrated Model for Educational Leaders.

    ERIC Educational Resources Information Center

    Perry, Eleanor A.

    A systematic way to create collaborative school improvement is provided. The currently expanding role of administrators as staff developers is explored; 10 strategies are listed for the principal to use as a key player in staff development. Two specific organization development problem-solving strategies, Situation-Target-Plan (S-T-P) and Force…

  9. A model for the development of university curricula in nanoelectronics

    NASA Astrophysics Data System (ADS)

    Bruun, E.; Nielsen, I.

    2010-12-01

    Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major factors influencing developments in university programmes related to nanoelectronics are discussed, and a model for university programme development is described. The model takes into account that nanotechnology affects not only physics but also electrical engineering and computer engineering because of the advent of new nanoelectronics devices. The model suggests that curriculum development tends to follow one of three major tracks: physics, electrical engineering, or computer engineering. Examples of European curricula following this framework are identified and described. These examples may serve as sources of inspiration for future developments, and the model presented may provide guidelines for a systematic selection of topics in the university programmes.

  10. In-Service Workshop Model. Development Work, Volunteer Service and Project Review. Core Curriculum Resource Materials.

    ERIC Educational Resources Information Center

    Edwards, Dan

    A model is provided for an inservice workshop to provide systematic project review, conduct individual volunteer support and problem solving, and conduct future work planning. Information on model use and general instructions are presented. Materials are provided for 12 sessions covering a 5-day period. The first session on climate setting and…

  11. Measuring maternal satisfaction with maternity care: A systematic integrative review: What is the most appropriate, reliable and valid tool that can be used to measure maternal satisfaction with continuity of maternity care?

    PubMed

    Perriman, Noelyn; Davis, Deborah

    2016-06-01

    The objective of this systematic integrative review is to identify, summarise and communicate the findings of research relating to tools that measure maternal satisfaction with continuity of maternity care models. In so doing the most appropriate, reliable and valid tool that can be used to measure maternal satisfaction with continuity of maternity care will be determined. A systematic integrative review of published and unpublished literature was undertaken using selected databases. Research papers were included if they measured maternal satisfaction in a continuity model of maternity care, were published in English after 1999 and if they included (or made available) the instrument used to measure satisfaction. Six hundred and thirty-two unique papers were identified and, after applying the selection criteria, four papers were included in the review. Three of these originated in Australia and one in Canada. The primary focus of all papers was not on the development of a tool to measure maternal satisfaction but on the comparison of outcomes in different models of care. The instruments developed varied in terms of the degree to which they were tested for validity and reliability. Women's satisfaction with maternity services is an important measure of quality. Most satisfaction surveys in maternity appear to reflect fragmented models of care, though continuity of care models are increasing in line with the evidence demonstrating their effectiveness. It is important that robust tools are developed for this context and that there is some consistency in the way this is measured and reported for the purposes of benchmarking and quality improvement. Copyright © 2016 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  12. Radiometry of liquids: characteristics and systematization of the requirements imposed on it (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isakov, L.M.; El'tsin, G.I.

    1972-01-01

    The requirements imposed on the measurement of the radioactivity of liquids are differentiated as a function of the purpose of the instrument. Five groups of radiometers were examined and for each the individual requirements were characterized. The proposed systematization was oriented toward the ordering of the development of liquid radiometers and a reduction in the number of models without limiting their range of applicability. (tr-auth)

  13. A systematic review and qualitative analysis to inform the development of a new emergency department-based geriatric case management model.

    PubMed

    Sinha, Samir K; Bessman, Edward S; Flomenbaum, Neal; Leff, Bruce

    2011-06-01

    We inform the future development of a new geriatric emergency management practice model. We perform a systematic review of the existing evidence for emergency department (ED)-based case management models designed to improve the health, social, and health service utilization outcomes for noninstitutionalized older patients within the context of an index ED visit. This was a systematic review of English-language articles indexed in MEDLINE and CINAHL (1966 to 2010), describing ED-based case management models for older adults. Bibliographies of the retrieved articles were reviewed to identify additional references. A systematic qualitative case study analytic approach was used to identify the core operational components and outcome measures of the described clinical interventions. The authors of the included studies were also invited to verify our interpretations of their work. The determined patterns of component adherence were then used to postulate the relative importance and effect of the presence or absence of a particular component in influencing the overall effectiveness of their respective interventions. Eighteen of 352 studies (reported in 20 articles) met study criteria. Qualitative analyses identified 28 outcome measures and 8 distinct model characteristic components that included having an evidence-based practice model, nursing clinical involvement or leadership, high-risk screening processes, focused geriatric assessments, the initiation of care and disposition planning in the ED, interprofessional and capacity-building work practices, post-ED discharge follow-up with patients, and evaluation and monitoring processes. Of the 15 positive study results, 6 had all 8 characteristic components and 9 were found to be lacking at least 1 component. Two studies with positive results lacked 2 characteristic components and none lacked more than 2 components. 
Of the 3 studies with negative results demonstrating no positive effects based on any outcome tested, one lacked 2, one lacked 3, and one lacked 4 of the 8 model components. Successful models of ED-based case management models for older adults share certain key characteristics. This study builds on the emerging literature in this area and leverages the differences in these models and their associated outcomes to support the development of an evidence-based normative and effective geriatric emergency management practice model designed to address the special care needs and thereby improve the health and health service utilization outcomes of older patients. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  14. Rigid-flexible coupling dynamic modeling and investigation of a redundantly actuated parallel manipulator with multiple actuation modes

    NASA Astrophysics Data System (ADS)

    Liang, Dong; Song, Yimin; Sun, Tao; Jin, Xueying

    2017-09-01

    A systematic dynamic modeling methodology is presented to develop the rigid-flexible coupling dynamic model (RFDM) of an emerging flexible parallel manipulator with multiple actuation modes. By virtue of the assumed mode method, the general dynamic model of an arbitrary flexible body with any number of lumped parameters is derived in an explicit closed form, which possesses the modular characteristic. Then the complete dynamic model of the system is formulated based on the flexible multi-body dynamics (FMD) theory and the augmented Lagrangian multipliers method. An approach combining the Udwadia-Kalaba formulation with the hybrid TR-BDF2 numerical algorithm is proposed to address the nonlinear RFDM. Two simulation cases are performed to investigate the dynamic performance of the manipulator with different actuation modes. The results indicate that the redundant actuation modes can effectively attenuate vibration and guarantee higher dynamic performance compared to the traditional non-redundant actuation modes. Finally, a virtual prototype model is developed to demonstrate the validity of the presented RFDM. The systematic methodology proposed in this study can be conveniently extended for the dynamic modeling and controller design of other planar flexible parallel manipulators, especially the emerging ones with multiple actuation modes.
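    For readers unfamiliar with the Udwadia-Kalaba formulation named in this abstract, the sketch below (not code from the paper) evaluates the constrained acceleration q_dd = a + M^(-1/2) (A M^(-1/2))^+ (b - A.a), with a = M^(-1) Q, for the special case M = m*I with a single constraint row, where the pseudoinverse reduces to A^T / |A|^2. The pendulum example and all numerical values are illustrative.

```python
def uk_accel(m, Q, A, b):
    """Udwadia-Kalaba constrained acceleration for M = m*I and one
    constraint row satisfying A . q_dd = b.

    In this special case the general formula simplifies to
        q_dd = a + A^T (b - A.a) / |A|^2,   a = Q / m.
    """
    a = [Qi / m for Qi in Q]                    # unconstrained acceleration
    Aa = sum(Ai * ai for Ai, ai in zip(A, a))   # A . a
    norm2 = sum(Ai * Ai for Ai in A)            # |A|^2
    coef = (b - Aa) / norm2                     # scalar constraint correction
    return [ai + Ai * coef for ai, Ai in zip(a, A)]

# Illustrative check: a pendulum of length 1 at its lowest point (x, y) = (0, -1)
# moving with speed v. Differentiating the constraint x^2 + y^2 = 1 twice gives
# x*x_dd + y*y_dd = -v^2, so here A = [0.0, -1.0] and b = -v**2; gravity
# supplies the applied force Q = [0, -m*g].
m, g, v = 2.0, 9.81, 2.0
qdd = uk_accel(m, [0.0, -m * g], [0.0, -1.0], -v * v)
# qdd[1] is the upward centripetal acceleration v^2 / L, independent of g here.
```

The appeal of the formulation, and presumably why the paper adopts it, is that the constrained equations of motion follow directly from M, Q, A and b without introducing and then eliminating Lagrange multipliers.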

  15. A Game-Theoretic Model of Grounding for Referential Communication Tasks

    ERIC Educational Resources Information Center

    Thompson, William

    2009-01-01

    Conversational grounding theory proposes that language use is a form of rational joint action, by which dialog participants systematically and collaboratively add to their common ground of shared knowledge and beliefs. Following recent work applying "game theory" to pragmatics, this thesis develops a game-theoretic model of grounding that…

  16. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  17. The Analysis on Systematic Development of College Microlecture

    ERIC Educational Resources Information Center

    Liu, Xiaohong; Wang, Lisi

    2013-01-01

    In order to apply microlectures to college education successfully and to construct new teaching and learning strategies and teaching models, this paper proposes characteristics of the college microlecture based on the features of college education and constructs a microlecture structure model based on definitions given by experts and scholars. Microlecture's…

  18. What's a Parent to Do? Coping with Crisis.

    ERIC Educational Resources Information Center

    Coleman, Trudy; And Others

    This instructional packet is one of a series of five modules that emphasize a systematic decision-making model for common problematic situations. The steps of the model are identifying the problem, gathering information, developing and assessing alternatives, implementing a solution, and evaluating and modifying the solution. Aimed at adult basic…

  19. Read Naturally. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2007

    2007-01-01

    "Read Naturally" is designed to improve reading fluency using a combination of books, audiotapes, and computer software. According to the developer's web site, this program has three main strategies: repeated reading of text for developing oral reading fluency, teacher modeling of story reading, and systematic monitoring of student…

  20. Core Professionalism Education in Surgery: A Systematic Review.

    PubMed

    Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender

    2018-03-15

    Professionalism education is one of the major elements of surgical residency education. The aim of this study was to evaluate core professionalism education programs in surgical residency education. Systematic review. This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOhost, PubMed, Science Direct, Web of Science, and manual search. Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model. Six articles were based on the Accreditation Council for Graduate Medical Education criteria, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. It is suggested that for a core surgical professionalism education program, the developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable.

  1. New Vistas in Chemical Product and Process Design.

    PubMed

    Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul

    2016-06-07

    Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.

  2. Patent portfolio management: literature review and a proposed model.

    PubMed

    Conegundes De Jesus, Camila Kiyomi; Salerno, Mario Sergio

    2018-05-09

    Patents and patent portfolios have gained attention over recent decades, from the so-called 'pro-patent era' to the recent billion-dollar transactions involving patent portfolios. The field is growing in importance, both theoretically and practically, and despite the substantial literature on new product development portfolio management, we have not found an article relating this theory to patent portfolios. Areas covered: The paper develops a systematic literature review on patent portfolio management to organize the evolution and tendencies of the field, highlighting distinctive features of patent portfolio management. Interviews with the IP managers of three life sciences companies, including a leading multinational group, provided relevant information about patent portfolio management. Expert opinion: Based on the systematic literature review on portfolio management, more specifically on new product development portfolio theory, and on the interviews, the paper proposes a reference model to manage patent portfolios. The model comprises four stages aligned with the three goals of NPD portfolio management: 1 - Linking the strategy of the company's NPD portfolio to the patent portfolio; 2 - Balancing the portfolio in buckets; 3 - Patent valuation (maximizing valuation); 4 - Regularly reviewing the patent portfolio.

  3. Analyzing Science Teaching: A Case Study Based on Three Philosophical Models of Teaching. The Explanatory Modes Project, Background Paper No. 5.

    ERIC Educational Resources Information Center

    Munby, A. Hugh

    The development of a category scheme for the systematic analysis of science classroom discourse is described. Three teaching models are discussed: the Impression Model, which depicts the mind of a student as receiving and storing external impressions; the Insight Model, which denies the possibility that ideas or knowledge can be conveyed by…

  4. The Answer Is in the Question: A Guide for Describing and Investigating the Conceptual Foundations and Statistical Properties of Cognitive Psychometric Models

    ERIC Educational Resources Information Center

    Rupp, Andre A.

    2007-01-01

    One of the most revolutionary advances in psychometric research during the last decades has been the systematic development of statistical models that allow for cognitive psychometric research (CPR) to be conducted. Many of the models currently available for such purposes are extensions of basic latent variable models in item response theory…

  5. Reuse at the Software Productivity Consortium

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1989-01-01

    The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.

  6. A systematic review of the psychological and social benefits of participation in sport for adults: informing development of a conceptual model of health through sport.

    PubMed

    Eime, Rochelle M; Young, Janet A; Harvey, Jack T; Charity, Melanie J; Payne, Warren R

    2013-12-07

    The definition of health incorporates the physical, social and mental domains; however, the Physical Activity (PA) guidelines do not address social health. Furthermore, there is insufficient evidence about the levels or types of PA associated specifically with psychological health. This paper first presents the results of a systematic review of the psychological and social health benefits of participation in sport by adults. Secondly, the information arising from the systematic review has been used to develop a conceptual model of Health through Sport. A systematic review of 14 electronic databases was conducted in June 2012, and studies published since 1990 were considered for inclusion. Studies that addressed mental and/or social health benefits from participation in sport were included. A total of 3668 publications were initially identified, of which 11 met the selection criteria. Many different psychological and social health benefits were reported, the most common being improved wellbeing and reduced distress and stress. Sport may be associated with improved psychosocial health in addition to improvements attributable to participation in PA. Specifically, club-based or team-based sport seems to be associated with improved health outcomes compared to individual activities, due to the social nature of the participation. Notwithstanding this, individuals who prefer to participate in sport by themselves can still derive mental health benefits which can enhance the development of true self-awareness and personal growth, which is essential for social health. A conceptual model, Health through Sport, is proposed. The model depicts the relationship between psychological, psychosocial and social health domains, and their positive associations with sport participation, as reported in the literature.
However, it is acknowledged that the capacity to determine the existence and direction of causal links between participation and health is limited by the cross-sectional nature of studies to date. It is recommended that participation in sport be advocated as a form of leisure-time PA for adults which can produce a range of health benefits. It is also recommended that the causal link between participation in sport and psychosocial health be further investigated and the conceptual model of Health through Sport tested.

  7. Interventions to improve care coordination between primary healthcare and oncology care providers: a systematic review.

    PubMed

    Tomasone, Jennifer R; Brouwers, Melissa C; Vukmirovic, Marija; Grunfeld, Eva; O'Brien, Mary Ann; Urquhart, Robin; Walker, Melanie; Webster, Fiona; Fitch, Margaret

    2016-01-01

    Coordination of patient care between primary care and oncology care providers is vital to care quality and outcomes across the cancer continuum, yet it is known to be challenging. We conducted a systematic review to evaluate current or new models of care and/or interventions aimed at improving coordination between primary care and oncology care providers for adult patients with breast and/or colorectal cancer. MEDLINE, EMBASE, CINAHL, Cochrane Library Database of Systematic Reviews, and the Centre for Reviews and Dissemination were searched for existing English language studies published between January 2000 and 15 May 2015. Systematic reviews, meta-analyses, randomised controlled trials (RCTs) and non-randomised studies were included if they evaluated a specific model/intervention that was designed to improve care coordination between primary care and oncology care providers, for any stage of the cancer continuum, for adult patients with breast and/or colorectal cancer. Two reviewers extracted data and assessed risk of bias. Twenty-two studies (5 systematic reviews, 6 RCTs and 11 non-randomised studies) were included and varied with respect to the targeted phase of the cancer continuum, type of model or intervention tested, and outcome measures. The majority of studies showed no statistically significant changes in any patient, provider or system outcomes. Owing to conceptual and methodological limitations in this field, the review is unable to provide specific conclusions about the most effective or preferred model/intervention to improve care coordination. Imprecise results that lack generalisability and definitiveness provide limited evidence on which to base the development of future interventions and policies. CRD42015025006.

  8. Effectiveness of a systematic approach to promote intersectoral collaboration in comprehensive school health promotion-a multiple-case study using quantitative and qualitative data.

    PubMed

    Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K

    2015-07-05

    We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (57% response rate) at pretest and 69 respondents (52%) at posttest. NVivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.

  9. Development and pilot test of a process to identify research needs from a systematic review.

    PubMed

    Saldanha, Ian J; Wilson, Lisa M; Bennett, Wendy L; Nicholson, Wanda K; Robinson, Karen A

    2013-05-01

    To ensure appropriate allocation of research funds, we need methods for identifying high-priority research needs. We developed and pilot tested a process to identify needs for primary clinical research using a systematic review in gestational diabetes mellitus. We conducted eight steps: abstract research gaps from a systematic review using the Population, Intervention, Comparison, Outcomes, and Settings (PICOS) framework; solicit feedback from the review authors; translate gaps into researchable questions using the PICOS framework; solicit feedback from multidisciplinary stakeholders at our institution; establish consensus among multidisciplinary external stakeholders on the importance of the research questions using the Delphi method; prioritize outcomes; develop conceptual models to highlight research needs; and evaluate the process. We identified 19 research questions. During the Delphi method, external stakeholders established consensus for 16 of these 19 questions (15 with "high" and 1 with "medium" clinical benefit/importance). We pilot tested an eight-step process to identify clinically important research needs. Before wider application of this process, it should be tested using systematic reviews of other diseases. Further evaluation should include assessment of the usefulness of the research needs generated using this process for primary researchers and funders. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Development of a systematic career coaching program for medical students.

    PubMed

    Hur, Yera; Cho, A Ra; Kwon, Mihye

    2018-03-01

    This study aimed to develop a systematic career-coaching program (SCCP) that can be used by medical teaching schools to address a growing need for career coaching. The program objectives were to help students (1) develop a comprehensive self-understanding of their aptitudes, interests, and personality traits; (2) explore possible career choices and decide on a career path; and (3) develop the competencies needed to prepare for their future careers. The SCCP was based on the ADDIE (analysis, design, development, implementation, and evaluation) model and the decision-making questioning model. Medical professionals, medical education and career counseling experts, and students participated in designing the program. The SCCP describes coaching content, tools, operational methods, and appropriate timing, and identifies the professionals and specialists who can offer their expertise in the different coaching phases. It is designed to allow medical schools to offer the program in segments or in its entirety, depending on the curriculum and environment. The SCCP represents a viable career-coaching program for medical students that can be applied in part or in its entirety, depending on a medical school's curriculum and educational environment.

  11. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  12. Transforming Gifts into Talents: The DMGT as a Developmental Theory

    ERIC Educational Resources Information Center

    Gagne, Francoys

    2004-01-01

    The Differentiated Model of Giftedness and Talent (DMGT) presents the talent development process (P) as the transformation of outstanding natural abilities, or gifts (G), into outstanding systematically developed skills which define expertise, or talent (T), in a particular occupational field. This developmental sequence constitutes the heart of…

  13. Enhancing Capacity to Improve Student Learning

    ERIC Educational Resources Information Center

    Mayotte, Gail; Wei, Dan; Lamphier, Sarah; Doyle, Thomas

    2013-01-01

    Professional development provides a means to build capacity among school personnel when it is delivered as part of a systematic, long-term approach to school and teacher improvement. This research examines a sustained, diocesan-wide professional development model, called the ACE Collaborative for Academic Excellence, that aims to build capacity…

  14. IMPROVED VALUATION OF ECOLOGICAL BENEFITS ASSOCIATED WITH AQUATIC LIVING RESOURCES: DEVELOPMENT AND TESTING OF INDICATOR-BASED STATED PREFERENCE VALUATION AND TRANSFER

    EPA Science Inventory

    In addition to development and systematic qualitative/quantitative testing of indicator-based valuation for aquatic living resources, the proposed work will improve interdisciplinary mechanisms to model and communicate aquatic ecosystem change within SP valuation—an area...

  15. Educational Technology and Organizational Development: A Collaborative Approach to Organizational Change.

    ERIC Educational Resources Information Center

    Forbes, Raymond L., Jr.; Nickols, Frederick W.

    The basic similarities between educational technology and organizational development provide a powerful rationale for collaboration. The two disciplines are essentially in the same business, that of systematically changing human behavior. System theory and the system model appear to supply the language and the technology through which such efforts…

  16. Towards a Bernsteinian Language of Description for Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Straehler-Pohl, Hauke; Gellert, Uwe

    2013-01-01

    This article aims at developing an external language of description to investigate the problem of why particular groups of students are systematically not provided access to school mathematical knowledge. Based on Basil Bernstein's conceptualisation of power in classification, we develop a three-dimensional model that operationalises the…

  17. Developmental Growth in Students' Concept of Energy: Analysis of Selected Items from the TIMSS Database

    ERIC Educational Resources Information Center

    Liu, Xiufeng; McKeough, Anne

    2005-01-01

    The aim of this study was to develop a model of students' energy concept development. Applying Case's (1985, 1992) structural theory of cognitive development, we hypothesized that students' concept of energy undergoes a series of transitions, corresponding to systematic increases in working memory capacity. The US national sample from the Third…

  18. Procedure for the systematic orientation of digitised cranial models. Design and validation.

    PubMed

    Bailo, M; Baena, S; Marín, J J; Arredondo, J M; Auría, J M; Sánchez, B; Tardío, E; Falcón, L

    2015-12-01

    Comparison of bony pieces requires that they be oriented systematically to ensure that homologous regions are compared. Few orientation methods are highly accurate; this is particularly true for methods applied to three-dimensional models obtained by surface scanning, a technique whose special features make it a powerful tool in forensic contexts. The aim of this study was to develop and evaluate a systematic, assisted orientation method for aligning three-dimensional cranial models relative to the Frankfurt Plane, one that would produce accurate orientations independent of operator and anthropological expertise. The study sample comprised four crania of known age and sex. All the crania were scanned and reconstructed using an Eva Artec™ portable 3D surface scanner, and subsequently the positions of certain characteristic landmarks were determined by three different operators using the Rhinoceros 3D surface modelling software. Intra-observer analysis showed a tendency for orientation to be more accurate with the assisted method than with conventional manual orientation. Inter-observer analysis showed that experienced evaluators achieved results at least as accurate, if not more accurate, with the assisted method as with manual orientation, while inexperienced evaluators achieved more accurate orientations with the assisted method. The method tested is an innovative system capable of providing very precise, systematic and automated spatial orientations of virtual cranial models relative to standardised anatomical planes, independent of the operator and operator experience. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Wrong Answers on Multiple-Choice Achievement Tests: Blind Guesses or Systematic Choices?

    ERIC Educational Resources Information Center

    Powell, J. C.

    A multi-faceted model for the selection of answers for multiple-choice tests was developed from the findings of a series of exploratory studies. This model implies that answer selection should be curvilinear. A series of models were tested for fit using the chi square procedure. Data were collected from 359 elementary school students ages 9-12.…

  20. An Application of the PMI Model at the Project Level Evaluation of ESEA Title IV-C Projects.

    ERIC Educational Resources Information Center

    McBeath, Marcia

    All of the papers presented as part of a symposium concerned the application of the Planning, Monitoring, and Implementation Model (PMI) to the evaluation of the District of Columbia Public Schools' programs supported by the Elementary Secondary Education Act (ESEA) Title IV-C. PMI was developed to provide a model for systematic evaluation of…

  1. dETECT: A Model for the Evaluation of Instructional Units for Teaching Computing in Middle School

    ERIC Educational Resources Information Center

    von Wangenheim, Christiane G.; Petri, Giani; Zibertti, André W.; Borgatto, Adriano F.; Hauck, Jean C. R.; Pacheco, Fernando S.; Filho, Raul Missfeldt

    2017-01-01

    The objective of this article is to present the development and evaluation of dETECT (Evaluating TEaching CompuTing), a model for the evaluation of the quality of instructional units for teaching computing in middle school based on the students' perception collected through a measurement instrument. The dETECT model was systematically developed…

  2. Lean leadership attributes: a systematic review of the literature.

    PubMed

    Aij, Kjeld Harald; Teunissen, Maurits

    2017-10-09

    Purpose Emphasis on quality and reducing costs has led many health-care organizations to reconfigure their management, process, and quality control infrastructures. Many have adopted lean, a management philosophy with roots in manufacturing industries that emphasizes elimination of waste. Successful lean implementation requires systemic change and strong leadership. Despite the importance of leadership to successful lean implementation, few researchers have probed the question of ideal leadership attributes to achieve lean thinking in health care. The purpose of this paper is to provide insight into applicable attributes for lean leaders in health care. Design/methodology/approach The authors systematically reviewed the literature on principles of leadership and, using Dombrowski and Mielke's (2013) conceptual model of lean leadership, developed a parallel theoretical model for lean leadership in health care. Findings This work contributes to the development of a new framework for describing leadership attributes within lean management of health care. Originality/value The summary of attributes can provide a model for health-care leaders to apply lean in their organizations.

  3. A potential role of anti-poverty programs in health promotion

    PubMed Central

    Silverman, Kenneth; Holtyn, August F.; Jarvis, Brantley

    2016-01-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. PMID:27235603

  4. A potential role of anti-poverty programs in health promotion.

    PubMed

    Silverman, Kenneth; Holtyn, August F; Jarvis, Brantley P

    2016-11-01

    Poverty is one of the most pervasive risk factors underlying poor health, but is rarely targeted to improve health. Research on the effects of anti-poverty interventions on health has been limited, at least in part because funding for that research has been limited. Anti-poverty programs have been applied on a large scale, frequently by governments, but without systematic development and cumulative programmatic experimental studies. Anti-poverty programs that produce lasting effects on poverty have not been developed. Before evaluating the effect of anti-poverty programs on health, programs must be developed that can reduce poverty consistently. Anti-poverty programs require systematic development and cumulative programmatic scientific evaluation. Research on the therapeutic workplace could provide a model for that research and an adaptation of the therapeutic workplace could serve as a foundation of a comprehensive anti-poverty program. Once effective anti-poverty programs are developed, future research could determine if those programs improve health in addition to increasing income. The potential personal, health and economic benefits of effective anti-poverty programs could be substantial, and could justify the major efforts and expenses that would be required to support systematic research to develop such programs. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167
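
    The hybrid decomposition sketched in this abstract (a tabulated systematic component plus an analytic component) can be illustrated in a few lines. This is a minimal sketch under simplifying assumptions: the function names and tabulated potential are invented for illustration, and the actual analytic Gay-Berne term depends on particle orientations, which are omitted here.

```python
import numpy as np

def has_pair_energy(r, r_table, u_table, analytic_term):
    """Hybrid analytic-systematic (HAS) sketch: the total CG pair energy
    is a tabulated, systematically derived MS-CG component (linearly
    interpolated at the pair distance) plus an analytic component.
    The real Gay-Berne analytic term also depends on orientations;
    here it is reduced to a function of distance only."""
    u_systematic = np.interp(r, r_table, u_table)
    return u_systematic + analytic_term(r)

# Illustrative tabulated potential (Lennard-Jones-like) and a soft
# repulsive analytic stand-in
r_table = np.linspace(0.5, 3.0, 50)
u_table = 4.0 * ((1.0 / r_table) ** 12 - (1.0 / r_table) ** 6)
soft_repulsion = lambda r: np.exp(-3.0 * r)

u = has_pair_energy(1.0, r_table, u_table, soft_repulsion)
```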

  6. Systematic review and overview of health economic evaluation models in obesity prevention and therapy.

    PubMed

    Schwander, Bjoern; Hiligsmann, Mickaël; Nuijten, Mark; Evers, Silvia

    2016-10-01

    Given the increasing clinical and economic burden of obesity, it is of major importance to identify cost-effective approaches for obesity management. Areas covered: This study aims to systematically review and compile an overview of published decision models for health economic assessments (HEA) in obesity, in order to summarize and compare their key characteristics as well as to identify, inform and guide future research. Of the 4,293 abstracts identified, 87 papers met our inclusion criteria. A wide range of different methodological approaches have been identified. Of the 87 papers, 69 (79%) applied unique/distinctive modelling approaches. Expert commentary: This wide range of approaches suggests the need to develop recommendations/minimal requirements for model-based HEA of obesity. In order to reach this long-term goal, further research is required. Valuable future research steps would be to investigate the predictiveness, validity and quality of the identified modelling approaches.

  7. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Zonghui; Luijten, Erik, E-mail: luijten@northwestern.edu; Department of Materials Science and Engineering, Northwestern University, Evanston, Illinois 60208

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  8. Comparison of two stochastic techniques for reliable urban runoff prediction by modeling systematic errors

    NASA Astrophysics Data System (ADS)

    Del Giudice, Dario; Löwe, Roland; Madsen, Henrik; Mikkelsen, Peter Steen; Rieckermann, Jörg

    2015-07-01

    In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two approaches which use stochastic processes to describe systematic deviations and to discuss their advantages and drawbacks for urban drainage modeling. The two methodologies are an external bias description (EBD) and an internal noise description (IND, also known as stochastic gray-box modeling). They emerge from different fields and have not yet been compared in environmental modeling. To compare the two approaches, we develop a unifying terminology, evaluate them theoretically, and apply them to conceptual rainfall-runoff modeling in the same drainage system. Our results show that both approaches can provide probabilistic predictions of wastewater discharge in a similarly reliable way, both for periods ranging from a few hours up to more than 1 week ahead of time. The EBD produces more accurate predictions on long horizons but relies on computationally heavy MCMC routines for parameter inferences. These properties make it more suitable for off-line applications. The IND can help in diagnosing the causes of output errors and is computationally inexpensive. It produces best results on short forecast horizons that are typical for online applications.
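
    The external bias description can be illustrated with a minimal sketch: the observed discharge is modeled as the deterministic model output plus an autocorrelated systematic bias plus independent measurement noise. The AR(1) bias process and all parameter values below are illustrative stand-ins, not the authors' calibrated formulation.

```python
import numpy as np

def simulate_ebd(model_output, phi=0.8, sigma_b=0.05, sigma_e=0.02, seed=0):
    """External bias description (EBD) sketch: observations are the
    deterministic model output plus an AR(1) systematic bias plus
    independent measurement noise. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    n = len(model_output)
    bias = np.zeros(n)
    for t in range(1, n):
        bias[t] = phi * bias[t - 1] + rng.normal(0.0, sigma_b)
    noise = rng.normal(0.0, sigma_e, n)
    return model_output + bias + noise

# Example: a synthetic hydrograph with systematic deviations added
q_model = 1.0 + 0.5 * np.sin(np.linspace(0, 6 * np.pi, 200))
q_obs = simulate_ebd(q_model)
```

The systematic bias term is what distinguishes this error description from the usual assumption of independent residuals; inferring phi and the sigmas alongside the model parameters is what makes the resulting predictive uncertainty more reliable.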

  9. Adapting a scenario tree model for freedom from disease as surveillance progresses: the Canadian notifiable avian influenza model.

    PubMed

    Christensen, Jette; El Allaki, Farouk; Vallières, André

    2014-05-01

    Scenario tree models with temporal discounting have been applied in four continents to support claims of freedom from animal disease. Recently, a second (new) model was developed for the same population and disease. This is a natural development because surveillance is a dynamic process that needs to adapt to changing circumstances - the difficulty is the justification for, documentation of, presentation of and the acceptance of the changes. Our objective was to propose a systematic approach to present changes to an existing scenario tree model for freedom from disease. We used the example of how we adapted the deterministic Canadian Notifiable Avian Influenza scenario tree model published in 2011 to a stochastic scenario tree model where the definition of sub-populations and the estimation of probability of introduction of the pathogen were modified. We found that the standardized approach by Vanderstichel et al. (2013) with modifications provided a systematic approach to make and present changes to an existing scenario tree model. We believe that the new 2013 CanNAISS scenario tree model is a better model than the 2011 model because the 2013 model included more surveillance data. In particular, the new data on Notifiable Avian Influenza in Canada from the last 5 years were used to improve input parameters and model structure. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
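
    At the core of such freedom-from-disease models is a standard update: each period of negative surveillance raises confidence of freedom via the scenario-tree system sensitivity, and temporal discounting then lowers it by the probability of disease introduction. A minimal sketch with illustrative numbers (not CanNAISS parameters):

```python
def update_confidence(prior_free, system_sensitivity, p_intro):
    """One surveillance period with negative findings: Bayes update using
    the scenario-tree system sensitivity (probability surveillance detects
    disease at the design prevalence if present), then discount by the
    probability of disease introduction before the next period."""
    posterior = prior_free / (
        prior_free + (1.0 - prior_free) * (1.0 - system_sensitivity)
    )
    return posterior * (1.0 - p_intro)

# Three years of negative surveillance (illustrative inputs)
p_free = 0.5
for _ in range(3):
    p_free = update_confidence(p_free, system_sensitivity=0.9, p_intro=0.01)
```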

  10. Candidate Predictors of Health-Related Quality of Life of Colorectal Cancer Survivors: A Systematic Review

    PubMed Central

    van der Linden, Bernadette W.A.; Winkels, Renate M.; van Duijnhoven, Fränzel J.; Mols, Floortje; van Roekel, Eline H.; Kampman, Ellen; Beijer, Sandra; Weijenberg, Matty P.

    2016-01-01

    The population of colorectal cancer (CRC) survivors is growing and many survivors experience deteriorated health-related quality of life (HRQoL) in both early and late post-treatment phases. Identification of CRC survivors at risk for HRQoL deterioration can be improved by using prediction models. However, such models are currently not available for oncology practice. As a starting point for developing prediction models of HRQoL for CRC survivors, a comprehensive overview of potential candidate HRQoL predictors is necessary. Therefore, a systematic literature review was conducted to identify candidate predictors of HRQoL of CRC survivors. Original research articles on associations of biopsychosocial factors with HRQoL of CRC survivors were searched in PubMed, Embase, and Google Scholar. Two independent reviewers assessed eligibility and selected articles for inclusion (N = 53). Strength of evidence for candidate HRQoL predictors was graded according to predefined methodological criteria. The World Health Organization’s International Classification of Functioning, Disability and Health (ICF) was used to develop a biopsychosocial framework in which identified candidate HRQoL predictors were mapped across the main domains of the ICF: health condition, body structures and functions, activities, participation, and personal and environmental factors. The developed biopsychosocial ICF framework serves as a basis for selecting candidate HRQoL predictors, thereby providing conceptual guidance for developing comprehensive, evidence-based prediction models of HRQoL for CRC survivors. Such models are useful in clinical oncology practice to aid in identifying individual CRC survivors at risk for HRQoL deterioration and could also provide potential targets for a biopsychosocial intervention aimed at safeguarding the HRQoL of at-risk individuals. Implications for Practice: More and more people now survive a diagnosis of colorectal cancer. 
The quality of life of these cancer survivors is threatened by health problems persisting for years after diagnosis and treatment. Early identification of survivors at risk of experiencing low quality of life in the future is thus important for taking preventive measures. Clinical prediction models are tools that can help oncologists identify at-risk individuals. However, such models are currently not available for clinical oncology practice. This systematic review outlines candidate predictors of low quality of life of colorectal cancer survivors, providing a firm conceptual basis for developing prediction models. PMID:26911406

  11. Quasi steady-state aerodynamic model development for race vehicle simulations

    NASA Astrophysics Data System (ADS)

    Mohrfeld-Halterman, J. A.; Uddin, M.

    2016-01-01

    Presented in this paper is a procedure to develop a high fidelity quasi steady-state aerodynamic model for use in race car vehicle dynamic simulations. Developed to fit quasi steady-state wind tunnel data, the aerodynamic model is regressed against three independent variables: front ground clearance, rear ride height, and yaw angle. An initial dual range model is presented and then further refined to reduce the model complexity while maintaining a high level of predictive accuracy. The model complexity reduction decreases the required amount of wind tunnel data thereby reducing wind tunnel testing time and cost. The quasi steady-state aerodynamic model for the pitch moment degree of freedom is systematically developed in this paper. This same procedure can be extended to the other five aerodynamic degrees of freedom to develop a complete six degree of freedom quasi steady-state aerodynamic model for any vehicle.
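
    Regressing an aerodynamic coefficient against the three independent variables can be sketched as an ordinary least-squares fit of a quadratic response surface. The variable ranges and coefficients below are invented for illustration; the paper's actual dual-range model form is not reproduced here.

```python
import numpy as np

def fit_aero_map(fh, rh, yaw, cm):
    """Least-squares fit of a quadratic response surface
    Cm ~ f(front ground clearance, rear ride height, yaw angle)."""
    X = np.column_stack([
        np.ones_like(fh), fh, rh, yaw,
        fh**2, rh**2, yaw**2, fh*rh, fh*yaw, rh*yaw,
    ])
    coef, *_ = np.linalg.lstsq(X, cm, rcond=None)
    return coef

# Synthetic wind tunnel data over illustrative ranges
rng = np.random.default_rng(1)
fh = rng.uniform(20, 60, 100)    # front ground clearance, mm
rh = rng.uniform(40, 90, 100)    # rear ride height, mm
yaw = rng.uniform(-5, 5, 100)    # yaw angle, deg
cm = 0.1 - 0.002*fh + 0.001*rh + 0.0005*yaw**2 + rng.normal(0, 1e-4, 100)
coef = fit_aero_map(fh, rh, yaw, cm)
```

Dropping terms whose fitted coefficients are negligible is one way to reduce model complexity, and with it the amount of wind tunnel data required.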

  12. Solid waste forecasting using modified ANFIS modeling.

    PubMed

    Younes, Mohammad K; Nopiah, Z M; Basri, N E Ahmad; Basri, H; Abushammala, Mohammed F M; K N A, Maulud

    2015-10-01

    Solid waste prediction is crucial for sustainable solid waste management. Accurate waste generation records are usually hard to obtain in developing countries, which complicates the modelling process. Solid waste generation is related to demographic, economic, and social factors. However, these factors vary considerably with population and economic growth. The objective of this research is to determine the most influential demographic and economic factors that affect solid waste generation using a systematic approach, and then to develop a model to forecast solid waste generation using a modified Adaptive Neuro-Fuzzy Inference System (MANFIS). Model evaluation was performed using the root mean square error (RMSE), mean absolute error (MAE) and the coefficient of determination (R²). The results show that the best input variables are the population age groups 0-14, 15-64, and above 65 years, and that the best model structure is 3 triangular fuzzy membership functions and 27 fuzzy rules. The model has been validated using testing data; the resulting training RMSE, MAE and R² were 0.2678, 0.045 and 0.99, respectively, while for the testing phase RMSE = 3.986, MAE = 0.673 and R² = 0.98. To date, few attempts have been made to predict annual solid waste generation in developing countries. This paper presents modelling of annual solid waste generation using a modified ANFIS: a systematic approach to search for the most influential factors and then modify the ANFIS structure to simplify the model. The proposed method can be used to forecast waste generation in developing countries where accurate and reliable data are not always available. Moreover, annual solid waste prediction is essential for sustainable planning.
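
    The evaluation metrics reported above are standard and straightforward to reproduce; a minimal sketch with made-up numbers:

```python
import numpy as np

def rmse(y, yhat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(y - yhat)))

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Illustrative annual waste totals (observed vs. forecast)
y = np.array([100.0, 110.0, 125.0, 140.0])
yhat = np.array([98.0, 112.0, 124.0, 141.0])
```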

  13. Nonstandard working schedules and health: the systematic search for a comprehensive model.

    PubMed

    Merkus, Suzanne L; Holte, Kari Anne; Huysmans, Maaike A; van Mechelen, Willem; van der Beek, Allard J

    2015-10-23

    Theoretical models on shift work fall short of describing relevant health-related pathways associated with the broader concept of nonstandard working schedules. Shift work models neither combine relevant working time characteristics applicable to nonstandard schedules nor include the role of rest periods and recovery in the development of health complaints. Therefore, this paper aimed to develop a comprehensive model on nonstandard working schedules to address these shortcomings. A literature review was conducted using a systematic search and selection process. Two searches were performed: one associating the working time characteristics time-of-day and working time duration with health and one associating recovery after work with health. Data extracted from the models were used to develop a comprehensive model on nonstandard working schedules and health. For models on the working time characteristics, the search strategy yielded 3044 references, of which 26 met the inclusion criteria that contained 22 distinctive models. For models on recovery after work, the search strategy yielded 896 references, of which seven met the inclusion criteria containing seven distinctive models. Of the models on the working time characteristics, three combined time-of-day with working time duration, 18 were on time-of-day (i.e. shift work), and one was on working time duration. The model developed in the paper has a comprehensive approach to working hours and other work-related risk factors and proposes that they should be balanced by positive non-work factors to maintain health. Physiological processes leading to health complaints are circadian disruption, sleep deprivation, and activation that should be counterbalanced by (re-)entrainment, restorative sleep, and recovery, respectively, to maintain health. A comprehensive model on nonstandard working schedules and health was developed. 
The model proposes that work and non-work as well as their associated physiological processes need to be balanced to maintain good health. The model gives researchers a useful overview over the various risk factors and pathways associated with health that should be considered when studying any form of nonstandard working schedule.

  14. Fundamental studies in geodynamics

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.

    1980-01-01

    Progress in modeling instantaneous plate kinematics is reviewed, with emphasis on recently developed models of present day plate motions derived by the systematic inversion of globally distributed data sets. Rivera plate motions, the Caribbean South American boundary, Indian plate deformation, Pacific-North America, seismicity and subduction processes, and the study of slow earthquakes and free oscillations are discussed.

  15. Modeling Agrilus planipennis within-tree colonization patterns and development of a systematic subsampling plan

    USDA-ARS's Scientific Manuscript database

    Emerald ash borer, Agrilus planipennis Fairmaire, an insect native to central Asia, was first detected in southeast Michigan in 2002, and has since killed millions of ash trees, Fraxinus spp., throughout eastern North America. Here, we use generalized linear mixed models to predict the presence or a...

  16. Qualitative Facilities Assessment: Beyond the Condition Audit

    ERIC Educational Resources Information Center

    Kaiser, Harvey H.; Klein, Eva

    2010-01-01

    In APPA's recently published book, "Strategic Capital Development: The New Model for Campus Investment," the authors make a case for substantial change in capital planning for higher education institutions. The new model posed is intended to urge institutions and systems to (1) identify more systematically all capital needs of all types; (2)…

  17. Mathematical Models in Educational Planning. Education and Development, Technical Reports.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This volume contains papers, presented at a 1966 OECD meeting, on the possibilities of applying a number of related techniques such as mathematical model building, simulation, and systematic control theory to the problems of educational planning. The authors and their papers are (1) Richard Stone, "A View of the Conference," (2) Hector…

  18. Dynamic Measurement Modeling: Using Nonlinear Growth Models to Estimate Student Learning Capacity

    ERIC Educational Resources Information Center

    Dumas, Denis G.; McNeish, Daniel M.

    2017-01-01

    Single-timepoint educational measurement practices are capable of assessing student ability at the time of testing but are not designed to be informative of student capacity for developing in any particular academic domain, despite commonly being used in such a manner. For this reason, such measurement practice systematically underestimates the…

  19. Team Design Communication Patterns in e-Learning Design and Development

    ERIC Educational Resources Information Center

    Rapanta, Chrysi; Maina, Marcelo; Lotz, Nicole; Bacchelli, Alberto

    2013-01-01

    Prescriptive stage models have been found insufficient to describe the dynamic aspects of designing, especially in interdisciplinary e-learning design teams. There is a growing need for a systematic empirical analysis of team design processes that offer deeper and more detailed insights into instructional design (ID) than general models can offer.…

  20. Business Management Coaching: Focusing on Entrepreneur's Current Position and Aims

    ERIC Educational Resources Information Center

    Cheah, Kheng T.

    2012-01-01

    One-to-one business coaching over 6 months was provided to nine clients in Hawaii to help them acquire business transition skills. The STARS model was used to determine the individual business situation and to explore suitable leadership strategies to move forward. Systematically, each client developed a business model, business strategies, a…

  1. Measurement and modeling of intrinsic transcription terminators

    PubMed Central

    Cambray, Guillaume; Guimaraes, Joao C.; Mutalik, Vivek K.; Lam, Colin; Mai, Quynh-Anh; Thimmaiah, Tim; Carothers, James M.; Arkin, Adam P.; Endy, Drew

    2013-01-01

    The reliable forward engineering of genetic systems remains limited by the ad hoc reuse of many types of basic genetic elements. Although a few intrinsic prokaryotic transcription terminators are used routinely, termination efficiencies have not been studied systematically. Here, we developed and validated a genetic architecture that enables reliable measurement of termination efficiencies. We then assembled a collection of 61 natural and synthetic terminators that collectively encode termination efficiencies across an ∼800-fold dynamic range within Escherichia coli. We simulated co-transcriptional RNA folding dynamics to identify competing secondary structures that might interfere with terminator folding kinetics or impact termination activity. We found that structures extending beyond the core terminator stem are likely to increase terminator activity. By excluding terminators encoding such context-confounding elements, we were able to develop a linear sequence-function model that can be used to estimate termination efficiencies (r = 0.9, n = 31) better than models trained on all terminators (r = 0.67, n = 54). The resulting systematically measured collection of terminators should improve the engineering of synthetic genetic systems and also advance quantitative modeling of transcription termination. PMID:23511967
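
    A linear sequence-function model of the kind described can be sketched with ordinary least squares. The features below (hairpin stem free energy and U-tract length) and all data are hypothetical stand-ins, not the predictors or measurements from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 31
# Hypothetical terminator features (invented, not the paper's actual predictors):
# hairpin stem free energy (kcal/mol) and U-tract length (nt).
stem_dG = rng.uniform(-20.0, -5.0, n)
u_tract = rng.integers(4, 10, n).astype(float)
# Synthetic "measured" log termination strength with additive noise.
log_ts = -0.15 * stem_dG + 0.4 * u_tract + rng.normal(0.0, 0.3, n)

# Fit a linear sequence-function model by least squares.
X = np.column_stack([np.ones(n), stem_dG, u_tract])
coef, *_ = np.linalg.lstsq(X, log_ts, rcond=None)
pred = X @ coef
r = np.corrcoef(pred, log_ts)[0, 1]  # Pearson r between predicted and observed
print(r)
```

    On this synthetic data the fit recovers a high correlation, mirroring how excluding confounded terminators let the authors reach r = 0.9.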

  2. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents.

  3. A correction method for systematic error in (1)H-NMR time-course data validated through stochastic cell culture simulation.

    PubMed

    Sokolenko, Stanislav; Aucoin, Marc G

    2015-09-04

    The growing ubiquity of metabolomic techniques has facilitated high frequency time-course data collection for an increasing number of applications. While the concentration trends of individual metabolites can be modeled with common curve fitting techniques, a more accurate representation of the data needs to consider effects that act on more than one metabolite in a given sample. To this end, we present a simple algorithm that uses nonparametric smoothing carried out on all observed metabolites at once to identify and correct systematic error from dilution effects. In addition, we develop a simulation of metabolite concentration time-course trends to supplement available data and explore algorithm performance. Although we focus on nuclear magnetic resonance (NMR) analysis in the context of cell culture, a number of possible extensions are discussed. Realistic metabolic data was successfully simulated using a 4-step process. Starting with a set of metabolite concentration time-courses from a metabolomic experiment, each time-course was classified as either increasing, decreasing, concave, or approximately constant. Trend shapes were simulated from generic functions corresponding to each classification. The resulting shapes were then scaled to simulated compound concentrations. Finally, the scaled trends were perturbed using a combination of random and systematic errors. To detect systematic errors, a nonparametric fit was applied to each trend and percent deviations calculated at every timepoint. Systematic errors could be identified at time-points where the median percent deviation exceeded a threshold value, determined by the choice of smoothing model and the number of observed trends. Regardless of model, increasing the number of observations over a time-course resulted in more accurate error estimates, although the improvement was not particularly large between 10 and 20 samples per trend. 
The presented algorithm was able to identify systematic errors as small as 2.5 % under a wide range of conditions. Both the simulation framework and error correction method represent examples of time-course analysis that can be applied to further developments in (1)H-NMR methodology and the more general application of quantitative metabolomics.
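
    The core of the correction idea, flagging timepoints where all metabolite trends deviate together from their smoothed fits, can be sketched as follows. This is a simplified illustration with invented data; a running-mean smoother and an arbitrary threshold stand in for the paper's nonparametric smoothing model:

```python
import numpy as np

def detect_dilution_errors(trends, window=3, threshold=5.0):
    """Flag timepoints where the median percent deviation across all
    metabolite trends exceeds a threshold, indicating a shared
    (systematic) effect such as a dilution error.

    trends: array of shape (n_metabolites, n_timepoints)
    """
    n = trends.shape[1]
    half = window // 2
    smoothed = np.empty_like(trends, dtype=float)
    for t in range(n):
        lo, hi = max(0, t - half), min(n, t + half + 1)
        smoothed[:, t] = trends[:, lo:hi].mean(axis=1)   # running-mean smooth
    pct_dev = 100.0 * (trends - smoothed) / smoothed
    median_dev = np.median(pct_dev, axis=0)   # deviation shared by all trends
    return np.abs(median_dev) > threshold, median_dev

# Synthetic example: 5 increasing trends, all diluted by 10% at timepoint 4.
t = np.arange(8, dtype=float)
trends = np.array([10.0 * (k + 1) + (k + 1) * t for k in range(5)])
trends[:, 4] *= 0.9                      # shared dilution error
flagged, dev = detect_dilution_errors(trends)
print(np.flatnonzero(flagged))  # -> [4]
```

    Because the deviation is computed per metabolite but thresholded on the median across metabolites, random noise in any single trend is ignored while a shared dilution shift is caught.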

  4. [Thinking on the Training of Uniportal Video-assisted Thoracic Surgery].

    PubMed

    Zhu, Yuming; Jiang, Gening

    2018-04-20

    Recently, uniportal video-assisted thoracic surgery (VATS) has developed rapidly and has become a main theme of global surgical development. Specific, standardized and systematic training in this technology has therefore become an important topic. Specific training in the uniportal VATS approach is crucial to ensure safety and radical treatment. Such training, including direct interaction with experienced surgeons in high-volume centers, is crucial and represents an indispensable step. Another form of training that usually occurs after preceptorship is proctorship: an experienced mentor can be invited to a trainee's own center to provide specific on-site tutelage. Videos published online are commonly used as training material. Technology has allowed the use of different models of simulators for training. The most common model is animal wet laboratory training. Other models, however, have been used more recently, such as 3D and VR technology, virtual reality simulators, and completely artificial models of the human thorax with synthetic lung, vessel, airway, and nodal tissues. Short-duration, high-volume clinical immersion training and long-term systematic training in high-volume centers are receiving more and more attention. Adopting a diversified training mode based on the evaluation of trainees' performance, with targeted training for different trainees, helps to improve the training effect. We have done some work on systematic and standardized training of uniportal VATS in a single center. We believe such training is feasible and absolutely necessary.

  5. A systematic review of neonatal treatment intensity scores and their potential application in low-resource setting hospitals for predicting mortality, morbidity and estimating resource use.

    PubMed

    Aluvaala, Jalemba; Collins, Gary S; Maina, Michuki; Berkley, James A; English, Mike

    2017-12-07

    Treatment intensity scores can predict mortality and estimate resource use. They may therefore be of interest for essential neonatal care in low resource settings where neonatal mortality remains high. We sought to systematically review neonatal treatment intensity scores to (1) assess the level of evidence on predictive performance in predicting clinical outcomes and estimating resource utilisation and (2) assess the applicability of the identified models to decision making for neonatal care in low resource settings. We conducted a systematic search of PubMed, EMBASE (OVID), CINAHL, Global Health Library (Global index, WHO) and Google Scholar to identify studies published up until 21 December 2016. Included were all articles that used treatments as predictors in neonatal models. Individual studies were appraised using the CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies (CHARMS). In addition, Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used as a guiding framework to assess certainty in the evidence for predicting outcomes across studies. Three thousand two hundred forty-nine articles were screened, of which ten articles were included in the review. All of the studies were conducted in neonatal intensive care units with sample sizes ranging from 22 to 9978, with a median of 163. Two articles reported model development, while eight reported external application of existing models to new populations. Meta-analysis was not possible due to heterogeneity in the conduct and reporting of the identified studies. Discrimination as assessed by area under the receiver operating characteristic curve was reported for in-hospital mortality, median 0.84 (range 0.75-0.96, three studies), early adverse outcome and late adverse outcome (0.78 and 0.59, respectively, one study). Existing neonatal treatment intensity models show promise in predicting mortality and morbidity. 
There is however low certainty in the evidence on their performance in essential neonatal care in low resource settings as all studies had methodological limitations and were conducted in intensive care. The approach may however be developed further for low resource settings like Kenya because treatment data may be easier to obtain compared to measures of physiological status. PROSPERO CRD42016034205.

  6. Systematic modelling and design evaluation of unperturbed tumour dynamics in xenografts.

    PubMed

    Parra Guillen, Zinnia P Patricia; Mangas Sanjuan, Victor; Garcia-Cremades, Maria; Troconiz, Inaki F; Mo, Gary; Pitou, Celine; Iversen, Philip W; Wallin, Johan E

    2018-04-24

    Xenograft mice are largely used to evaluate the efficacy of oncological drugs during preclinical phases of drug discovery and development. Mathematical models provide a useful tool to quantitatively characterise tumour growth dynamics and also optimise upcoming experiments. To the best of our knowledge, this is the first report where unperturbed growth of a large set of tumour cell lines (n = 28) has been systematically analysed using the model proposed by Simeoni in the context of non-linear mixed effects (NLME) modelling. Exponential growth was identified as the governing mechanism in the majority of the cell lines, with constant rate values ranging from 0.0204 to 0.203 day⁻¹. No common patterns could be observed across tumour types, highlighting the importance of combining information from different cell lines when evaluating drug activity. Overall, typical model parameters were precisely estimated using designs where tumour size measurements were taken every two days. Moreover, reducing the number of measurements to twice per week, or even once per week for cell lines with low growth rates, showed little impact on parameter precision. However, in order to accurately characterise parameter variability (i.e. relative standard errors below 50%), a sample size of at least 50 mice is needed. This work illustrates the feasibility of systematically applying NLME models to characterise tumour growth in drug discovery and development, and constitutes a valuable source of data to optimise experimental designs by providing an a priori sampling window and minimising the number of samples required. The American Society for Pharmacology and Experimental Therapeutics.
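
    As a much-simplified stand-in for a full NLME analysis, a naive two-stage approach (fit each animal's exponential rate by log-linear regression, then summarize across animals) illustrates the sampling design described above. All parameter values here are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
k_typ, omega = 0.1, 0.3            # typical growth rate (1/day), between-animal SD
days = np.arange(0.0, 22.0, 2.0)   # tumour size measured every two days
n_mice = 50                        # sample size suggested for variability estimates

k_true = k_typ * np.exp(rng.normal(0.0, omega, n_mice))  # lognormal random effect
k_hat = np.empty(n_mice)
for m, k in enumerate(k_true):
    # Exponential growth from 100 mm^3 with mild lognormal measurement noise.
    size = 100.0 * np.exp(k * days) * np.exp(rng.normal(0.0, 0.05, days.size))
    # Log-linear regression recovers each animal's exponential rate constant.
    k_hat[m] = np.polyfit(days, np.log(size), 1)[0]

print(k_hat.mean(), k_true.mean())  # per-animal rates are recovered closely
```

    A true NLME fit would estimate the typical rate and its variability jointly across all animals rather than in two stages, but the sketch shows why a dense every-two-days design makes individual rates easy to identify.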

  7. Quality metrics in high-dimensional data visualization: an overview and systematization.

    PubMed

    Bertini, Enrico; Tatu, Andrada; Keim, Daniel

    2011-12-01

    In this paper, we present a systematization of techniques that use quality metrics to help in the visual exploration of meaningful patterns in high-dimensional data. In a number of recent papers, different quality metrics are proposed to automate the demanding search through large spaces of alternative visualizations (e.g., alternative projections or ordering), allowing the user to concentrate on the most promising visualizations suggested by the quality metrics. Over the last decade, this approach has witnessed a remarkable development but few reflections exist on how these methods are related to each other and how the approach can be developed further. For this purpose, we provide an overview of approaches that use quality metrics in high-dimensional data visualization and propose a systematization based on a thorough literature review. We carefully analyze the papers and derive a set of factors for discriminating the quality metrics, visualization techniques, and the process itself. The process is described through a reworked version of the well-known information visualization pipeline. We demonstrate the usefulness of our model by applying it to several existing approaches that use quality metrics, and we provide reflections on implications of our model for future research. © 2010 IEEE

  8. Systematization of a set of closure techniques.

    PubMed

    Hausken, Kjell; Moxnes, John F

    2011-11-01

    Approximations in population dynamics are gaining popularity since stochastic models in large populations are time consuming even on a computer. Stochastic modeling causes an infinite set of ordinary differential equations for the moments. Closure models are useful since they recast this infinite set into a finite set of ordinary differential equations. This paper systematizes a set of closure approximations. We develop a system, which we call a power p closure of n moments, where 0≤p≤n. Keeling's (2000a,b) approximation with third order moments is shown to be an instantiation of this system which we call a power 3 closure of 3 moments. We present an epidemiological example and evaluate the system for third and fourth moments compared with Monte Carlo simulations. Copyright © 2011 Elsevier Inc. All rights reserved.
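
    The idea of closing the moment hierarchy can be illustrated on a simple birth-death process with birth rate λN and death rate μN². The exact equations for the first two moments involve ⟨N³⟩; a normal (cumulant-neglect) closure, one instance of the family of schemes systematized in the paper, replaces it with 3⟨N⟩⟨N²⟩ − 2⟨N⟩³. This sketch uses invented rate constants and plain Euler integration:

```python
import numpy as np

lam, mu = 1.0, 0.01   # birth rate lam*N, death rate mu*N^2 (invented values)

def closed_moments(m1, m2, dt=1e-3, t_end=10.0):
    """Euler-integrate the first two moment equations of the birth-death
    process, closing the hierarchy with the normal (cumulant-neglect)
    closure <N^3> = 3<N><N^2> - 2<N>^3."""
    for _ in range(int(t_end / dt)):
        m3 = 3.0 * m1 * m2 - 2.0 * m1**3          # closure assumption
        dm1 = lam * m1 - mu * m2
        dm2 = lam * (2.0 * m2 + m1) + mu * (m2 - 2.0 * m3)
        m1, m2 = m1 + dt * dm1, m2 + dt * dm2
    return m1, m2

mean, second = closed_moments(m1=10.0, m2=100.0)  # start at N = 10 exactly
var = second - mean**2
print(mean, var)  # mean settles just below the mean-field value lam/mu = 100
```

    Without the closure the ⟨N²⟩ equation would pull in ⟨N³⟩, whose equation pulls in ⟨N⁴⟩, and so on without end; the closure truncates this to a finite, integrable system.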

  9. A Systematic Review of Health Economics Simulation Models of Chronic Obstructive Pulmonary Disease.

    PubMed

    Zafari, Zafar; Bryan, Stirling; Sin, Don D; Conte, Tania; Khakban, Rahman; Sadatsafavi, Mohsen

    2017-01-01

    Many decision-analytic models with varying structures have been developed to inform resource allocation in chronic obstructive pulmonary disease (COPD). To review COPD models for their adherence to the best practice modeling recommendations and their assumptions regarding important aspects of the natural history of COPD. A systematic search of English articles reporting on the development or application of a decision-analytic model in COPD was performed in MEDLINE, Embase, and citations within reviewed articles. Studies were summarized and evaluated on the basis of their adherence to the Consolidated Health Economic Evaluation Reporting Standards. They were also evaluated for the underlying assumptions about disease progression, heterogeneity, comorbidity, and treatment effects. Forty-nine models of COPD were included. Decision trees and Markov models were the most popular techniques (43 studies). Quality of reporting and adherence to the guidelines were generally high, especially in more recent publications. Disease progression was modeled through clinical staging in most studies. Although most studies (n = 43) had incorporated some aspects of COPD heterogeneity, only 8 reported the results across subgroups. Only 2 evaluations explicitly considered the impact of comorbidities. Treatment effect was most often modeled (n = 20) as both reduction in exacerbation rate and improvement in lung function. Many COPD models have been developed, generally with similar structural elements. COPD is highly heterogeneous, and comorbid conditions play an important role in its burden. These important aspects, however, have not been adequately addressed in most of the published models. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  10. Bundle Block Adjustment of Airborne Three-Line Array Imagery Based on Rotation Angles

    PubMed Central

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-01-01

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models. PMID:24811075
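
    The parameterization discussed above, building the exterior-orientation rotation matrix directly from three rotation angles rather than a quaternion, can be sketched as follows. An omega-phi-kappa convention is assumed here; the actual convention and angle values in the paper may differ:

```python
import numpy as np

def rotation_from_angles(omega, phi, kappa):
    """Exterior-orientation rotation matrix composed directly from three
    rotation angles (omega-phi-kappa convention assumed here)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])   # rotation about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # rotation about y
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])   # rotation about z
    return Rz @ Ry @ Rx

# Small systematic tilts plus a heading angle (illustrative values only).
R = rotation_from_angles(0.01, -0.02, 1.5)
print(np.linalg.det(R))  # ~1.0 for a proper rotation
```

    Working with the angles themselves makes it straightforward to attach a separate systematic-error correction term to each angle in the adjustment, which is awkward with a unit quaternion.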

  11. Bundle block adjustment of airborne three-line array imagery based on rotation angles.

    PubMed

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-05-07

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models.

  12. Roles of University Support for International Students in the United States: Analysis of a Systematic Model of University Identification, University Support, and Psychological Well-Being

    ERIC Educational Resources Information Center

    Cho, Jaehee; Yu, Hongsik

    2015-01-01

    Unlike previous research on international students' social support, this current study applied the concept of organizational support to university contexts, examining the effects of university support. Mainly based on the social identity/self-categorization stress model, this study developed and tested a path model composed of four key…

  13. Evaluation of the CONSUME and FOFEM fuel consumption models in pine and mixed hardwood forests of the eastern United States

    Treesearch

    Susan J. Prichard; Eva C. Karau; Roger D. Ottmar; Maureen C. Kennedy; James B. Cronan; Clinton S. Wright; Robert E. Keane

    2014-01-01

    Reliable predictions of fuel consumption are critical in the eastern United States (US), where prescribed burning is frequently applied to forests and air quality is of increasing concern. CONSUME and the First Order Fire Effects Model (FOFEM), predictive models developed to estimate fuel consumption and emissions from wildland fires, have not been systematically...

  14. Systematic Assessment Through Mathematical Model For Sustainability Reporting In Malaysia Context

    NASA Astrophysics Data System (ADS)

    Lanang, Wan Nurul Syahirah Wan; Turan, Faiz Mohd; Johan, Kartina

    2017-08-01

    Sustainability assessment has been studied and is increasingly recognized as a powerful and valuable tool to measure the performance of sustainability in a company or industry. Many tools for sustainable development already exist, though most focus on environmental, economic and social aspects. Using the Green Project Management (GPM) P5 concept, which suggests that firms need to engage not only with the 3P principles of planet, profit and people, but also with product and process, this study introduces a new mathematical model for assessing the level of sustainability practice in a company. Based on multiple case studies, involving in-depth interviews with senior directors, feedback from experts, and previous engineering reports, a systematic approach is taken to develop the collected feedback into a new mathematical model. The methodology comprises several phases: it starts with analysis of the parameters and selection of criteria according to the Malaysian industrial context, continues with data analysis involving regression, and concludes with a normalisation process to determine whether the research objectives were met. This study is expected to provide a clear guideline for any company or organization to assimilate sustainability assessment into its development stage, and to build a better understanding of sustainability assessment so that the process approach can be integrated into a systematic approach for sustainability assessment.

  15. Optimizing construction quality management of pavements using mechanistic performance analysis.

    DOT National Transportation Integrated Search

    2004-08-01

    This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...

  16. Simulation models in population breast cancer screening: A systematic review.

    PubMed

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    2015-08-01

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed which incorporated model type; input parameters; modeling approach; transparency of input data sources/assumptions; sensitivity analyses and risk of bias; validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized control trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except one model) with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) as compared to the 10% MR (95% CI: -2-21%) from optimal RCTs. Only recently were potential harms due to regular breast cancer screening reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematic evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Critical interpretive synthesis of barriers and facilitators to TB treatment in immigrant populations.

    PubMed

    Lin, S; Melendez-Torres, G J

    2017-10-01

    To systematically review studies of TB treatment experiences in immigrant populations, using Critical Interpretive Synthesis (CIS). On 26 October 2014, MEDLINE, CINAHL, Embase, LILACS, and PsycINFO were systematically searched. Grey literature and reference lists were hand-searched. Initial papers included were restricted to studies of immigrant patient perspectives; after a model was developed, a second set of papers was included to test the emerging theory. Of 1761 studies identified in the search, a total of 29 were included in the synthesis. Using those studies, we developed a model that suggested treatment experiences were strongly related to the way both individuals and societies adjusted to immigration ('acculturation strategies'). Relationships with healthcare workers and immigration policies played particularly significant roles in TB treatment. This review emphasised the roles of repatriation policy and healthcare workers in forming experiences of TB treatment in immigrant populations. © 2017 John Wiley & Sons Ltd.

  18. Transparent metal model study of the use of a cellular growth front to form aligned monotectic composite materials

    NASA Technical Reports Server (NTRS)

    Kaukler, William F.

    1988-01-01

    The purpose of this work was to resolve a scientific controversy in the understanding of how second phase particles become aligned during unidirectional growth of a monotectic alloy. A second aspect was to make the first systematic observations of the solidification behavior of a monotectic alloy during cellular growth in-situ. This research provides the first systematic transparent model study of cellular solidification. An interface stability diagram was developed for the planar-to-cellular transition of the succinonitrile-glycerol (SNG) system. A method was developed utilizing Fourier Transform Infrared Spectroscopy which allows quantitative compositional analysis of directionally solidified SNG along the growth axis. To determine the influence of the cellular growth front on alignment for directionally solidified monotectic alloys, the planar and cellular growth morphology was observed in-situ for SNG between 8 and 17 percent glycerol and over a range of more than two orders of magnitude in G/R.

  19. Path integration mediated systematic search: a Bayesian model.

    PubMed

    Vickerstaff, Robert J; Merkle, Tobias

    2012-08-21

    The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches, a method to quantify search efficiency is established, as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ants' searches. Copyright © 2012 Elsevier Ltd. All rights reserved.
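
    A toy version of such a Bayesian search heuristic can be sketched on a one-dimensional grid: the searcher maintains a posterior over the nest position, repeatedly probes the most probable cell, and downweights probed cells by an assumed detection probability. All parameters are invented and the real model is two-dimensional and far more detailed, but the sketch reproduces the qualitative result that greater positional uncertainty yields broader search patterns:

```python
import numpy as np

def bayesian_search(sigma, p_detect=0.5, n_steps=50, half_width=50):
    """Greedy Bayesian search on a 1-D grid.

    sigma: spread of the Gaussian prior over the nest position,
    a stand-in for the accumulated path-integrator error."""
    x = np.arange(-half_width, half_width + 1)
    posterior = np.exp(-0.5 * (x / sigma) ** 2)
    posterior /= posterior.sum()
    visited = []
    for _ in range(n_steps):
        i = int(np.argmax(posterior))      # probe the most probable cell
        visited.append(x[i])
        posterior[i] *= 1.0 - p_detect     # Bayes update: nest not detected there
        posterior /= posterior.sum()
    return np.array(visited)

narrow = bayesian_search(sigma=3.0)
broad = bayesian_search(sigma=10.0)
print(narrow.std(), broad.std())  # larger uncertainty -> broader search pattern
```

    Because probed cells are only downweighted, not eliminated, the searcher keeps returning to the most probable region, producing the characteristic loops back toward the expected nest site.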

  20. Systematic assessment of blood circulation time of functionalized upconversion nanoparticles in the chick embryo

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Liang, Liuen; Grebenik, Ekaterina; Guller, Anna; Lu, Yiqing; Qian, Yi; Goldys, Ewa; Zvyagin, Andrei

    2015-12-01

    Nanoparticle-based delivery of drugs and contrast agents holds great promise in cancer research, because of the increased delivery efficiency compared to 'free' drugs and dyes. A versatile platform to investigate nanotechnology is the chick embryo chorioallantoic membrane tumour model, due to its availability (easy, cheap) and accessibility (interventions, imaging). In our group, we developed this model using several tumour cell lines (e.g. breast cancer, colon cancer). In addition, we have synthesized in-house silica-coated photoluminescent upconversion nanoparticles (UCNPs) with several functional groups (COOH, NH2, PEG). In this work we will present the systematic assessment of their in vivo blood circulation times. To this end, we injected chick embryos grown ex ovo with the functionalized UCNPs and obtained a small amount of blood at several time points after injection to create blood smears. The UCNP signal from the blood smears was quantified using a modified inverted microscope imaging set-up. The results of this systematic study are valuable to optimize biochemistry protocols and guide nanomedicine advancement in the versatile chick embryo tumour model.

  1. Real-time PCR Machine System Modeling and a Systematic Approach for the Robust Design of a Real-time PCR-on-a-Chip System

    PubMed Central

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design. PMID:22315563

  2. Alternative Modes of Evaluation and Their Application to Rural Development.

    ERIC Educational Resources Information Center

    Wetherill, G. Richard; Buttram, Joan L.

    In order to "cut through the jargon of the multifaceted field of evaluative research", 21 evaluation models representing a range of possibilities were identified (via literature review) and compared in terms of purpose and five basic phases applicable to rural development. Evaluation was defined as "the systematic examination of a…

  3. School Effectiveness: Problem-Solving and Managing Conflict.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Education, St. Paul.

    This module discusses the theory and practice of school improvement and outlines a nine-step systematic problem-solving process for developing an action plan addressing school improvement goals. The first section describes a general model for the study of the school as a social system, as developed by Getzels and Thelen (1960). The second section…

  4. A Professional Development Manual for Online Learning at Savannah State University

    ERIC Educational Resources Information Center

    Nyatuame, Patrice C.

    2017-01-01

    This capstone is designed to support instruction, learning, and assessment at Savannah State University. The concepts that frame this capstone include Johnson and Aragon's (2004) pedagogical model that was used to develop the HRE Online Master's Degree Program. The manual provides new faculty members with a systematic getting-started guide…

  5. A Creative Learning Ecosystem, Quality of Education and Innovative Capacity: A Perspective from Higher Education

    ERIC Educational Resources Information Center

    Crosling, Glenda; Nair, Mahendhiran; Vaithilingam, Santha

    2015-01-01

    Globally, governments recognize the importance of creativity and innovation for sustainable socioeconomic development, and many invest resources to develop learning environments that foster these capacities. This paper provides a systematic framework based on Nair's "Innovation Helix" model for studying the factors of a country's…

  6. A systematic grounded approach to the development of complex interventions: the Australian WorkHealth Program--arthritis as a case study.

    PubMed

    Reavley, Nicola; Livingston, Jenni; Buchbinder, Rachelle; Bennell, Kim; Stecki, Chris; Osborne, Richard Harry

    2010-02-01

    Despite demands for evidence-based research and practice, little attention has been given to systematic approaches to the development of complex interventions to tackle workplace health problems. This paper outlines an approach to the initial stages of workplace program development that integrates health promotion and disease management. The approach commences with systematic and genuine processes of obtaining information from key stakeholders with broad experience of these interventions. This information is constructed into a program framework in which practice-based and research-informed elements are both valued. We used this approach to develop a workplace education program to reduce the onset and impact of a common chronic disease - osteoarthritis. To gain information systematically at a national level, a structured concept mapping workshop with 47 participants from across Australia was undertaken. Participants were selected to maximise the whole-of-workplace perspective and included health education providers, academics, clinicians and policymakers. Participants generated statements in response to a seeding statement: Thinking as broadly as possible, what changes in education and support should occur in the workplace to help in the prevention and management of arthritis? Participants grouped the resulting statements into conceptually coherent groups and a computer program was used to generate a 'cluster map' along with a list of statements sorted according to cluster membership. In combination with research-based evidence, the concept map informed the development of a program logic model incorporating the program's guiding principles, possible service providers, services, training modes, program elements and the causal processes by which participants might benefit.
The program logic model components were further validated through research findings from diverse fields, including health education, coaching, organisational learning, workplace interventions, workforce development and osteoarthritis disability prevention. In summary, wide and genuine consultation, concept mapping, and evidence-based program logic development were integrated to develop a whole-of-system complex intervention in which potential effectiveness and assimilation into the workplace were optimised. Copyright 2009 Elsevier Ltd. All rights reserved.
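The concept-mapping step described above (participants sort statements into groups, software derives a 'cluster map' from the pooled sorts) can be sketched in miniature. The study used dedicated concept-mapping software; the pair-counting and threshold grouping below are an illustrative stand-in, not the authors' actual algorithm.

```python
from collections import defaultdict

def cooccurrence(sortings):
    """Count how often each pair of statements was sorted into the same
    group across participants (the raw material of a 'cluster map')."""
    counts = defaultdict(int)
    for groups in sortings:                 # one participant's sort
        for group in groups:                # one pile of statement ids
            g = sorted(group)
            for i in range(len(g)):
                for j in range(i + 1, len(g)):
                    counts[(g[i], g[j])] += 1
    return counts

def cluster(counts, n_statements, n_sorters, threshold=0.5):
    """Union-find grouping: merge statements co-sorted by at least
    `threshold` of participants (a crude stand-in for the real
    multidimensional-scaling step)."""
    parent = list(range(n_statements))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for (a, b), c in counts.items():
        if c / n_sorters >= threshold:
            parent[find(a)] = find(b)
    groups = defaultdict(set)
    for s in range(n_statements):
        groups[find(s)].add(s)
    return sorted(sorted(g) for g in groups.values())
```

For example, if two of three participants sort statements 2 and 3 together, that pair clears a 0.5 threshold and the two statements land in one cluster.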

  7. Economic evaluation in chronic pain: a systematic review and de novo flexible economic model.

    PubMed

    Sullivan, W; Hirst, M; Beard, S; Gladwell, D; Fagnani, F; López Bastida, J; Phillips, C; Dunlop, W C N

    2016-07-01

    There is unmet need in patients suffering from chronic pain, yet innovation may be impeded by the difficulty of justifying economic value in a field beset by data limitations and methodological variability. A systematic review was conducted to identify and summarise the key areas of variability and limitations in modelling approaches in the economic evaluation of treatments for chronic pain. The results of the literature review were then used to support the development of a fully flexible open-source economic model structure, designed to test structural and data assumptions and act as a reference for future modelling practice. The key model design themes identified from the systematic review included: time horizon; titration and stabilisation; number of treatment lines; choice/ordering of treatment; and the impact of parameter uncertainty (given reliance on expert opinion). Exploratory analyses using the model to compare a hypothetical novel therapy versus morphine as first-line treatments showed cost-effectiveness results to be sensitive to structural and data assumptions. Assumptions about the treatment pathway and choice of time horizon were key model drivers. Our results suggest structural model design and data assumptions may have driven previous cost-effectiveness results and ultimately decisions based on economic value. We therefore conclude that it is vital that future economic models in chronic pain are designed to be fully transparent and hope our open-source code is useful in order to aspire to a common approach to modelling pain that includes robust sensitivity analyses to test structural and parameter uncertainty.
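As a toy illustration of why the time horizon and titration assumptions identified in the review can drive cost-effectiveness results, the sketch below compares a hypothetical novel therapy (with a costly titration year) against a comparator over different horizons. Every clinical and cost figure is invented; the authors' open-source model is far richer.

```python
def discounted(values, horizon, rate=0.035):
    """Discounted sum of per-year values over the chosen time horizon."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(values[:horizon]))

def icer(costs_a, costs_b, qalys_a, qalys_b, horizon):
    """Incremental cost per QALY gained for strategy A vs strategy B."""
    d_cost = discounted(costs_a, horizon) - discounted(costs_b, horizon)
    d_qaly = discounted(qalys_a, horizon) - discounted(qalys_b, horizon)
    return d_cost / d_qaly

# Invented inputs: novel therapy with an expensive titration year, then a
# steady annual cost; comparator is cheap with a lower annual utility.
novel_costs    = [4000.0] + [2400.0] * 9   # year 0 includes titration
comp_costs     = [300.0] * 10
novel_qalys    = [0.70] * 10
comp_qalys     = [0.62] * 10

results = {h: icer(novel_costs, comp_costs, novel_qalys, comp_qalys, h)
           for h in (1, 5, 10)}
```

Because the novel therapy's extra cost is front-loaded, the computed ICER falls as the horizon lengthens, which is exactly the kind of structural sensitivity the review flags.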

  8. Evaluation of animal models of neurobehavioral disorders

    PubMed Central

    van der Staay, F Josef; Arndt, Saskia S; Nordquist, Rebecca E

    2009-01-01

    Animal models play a central role in all areas of biomedical research. The process of animal model building, development and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then, the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria, which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, predictive, construct and external validity/generalizability, and relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare and considering whether action is indicated to reduce the discomfort must accompany the scientific evaluation at any stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria, or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia.
Just as animal models can be improved by the procedure expounded in this paper, the development and evaluation procedure itself may be improved by carefully defining the purpose(s) of a model and by defining better evaluation criteria based on the proposed use of the model. PMID:19243583

  9. Computer simulation models of pre-diabetes populations: a systematic review protocol

    PubMed Central

    Khurshid, Waqar; Pagano, Eva; Feenstra, Talitha

    2017-01-01

    Introduction Diabetes is a major public health problem and prediabetes (intermediate hyperglycaemia) is associated with a high risk of developing diabetes. With evidence supporting the use of preventive interventions for prediabetes populations and the discovery of novel biomarkers stratifying the risk of progression, there is a need to evaluate their cost-effectiveness across jurisdictions. In diabetes and prediabetes, it is relevant to inform cost-effectiveness analysis using decision models due to their ability to forecast long-term health outcomes and costs beyond the time frame of clinical trials. To support good implementation and reimbursement decisions of interventions in these populations, models should be clinically credible, based on best available evidence, reproducible and validated against clinical data. Our aim is to identify recent studies on computer simulation models and model-based economic evaluations of populations of individuals with prediabetes, qualify them and discuss the knowledge gaps, challenges and opportunities that need to be addressed for future evaluations. Methods and analysis A systematic review will be conducted in MEDLINE, Embase, EconLit and National Health Service Economic Evaluation Database. We will extract peer-reviewed studies published between 2000 and 2016 that describe computer simulation models of the natural history of individuals with prediabetes and/or decision models to evaluate the impact of interventions, risk stratification and/or screening on these populations. Two reviewers will independently assess each study for inclusion. Data will be extracted using a predefined pro forma developed using best practice. Study quality will be assessed using a modelling checklist. A narrative synthesis of all studies will be presented, focussing on model structure, quality of models and input data, and validation status. 
Ethics and dissemination This systematic review is exempt from ethics approval because the work is carried out on published documents. The findings of the review will be disseminated in a peer-reviewed journal and presented at conferences. Review registration number: CRD42016047228. PMID:28982807

  10. Towards improved and more routine Earth system model evaluation in CMIP

    DOE PAGES

    Eyring, Veronika; Gleckler, Peter J.; Heinze, Christoph; ...

    2016-11-01

    The Coupled Model Intercomparison Project (CMIP) has successfully provided the climate community with a rich collection of simulation output from Earth system models (ESMs) that can be used to understand past climate changes and make projections and uncertainty estimates of the future. Confidence in ESMs can be gained because the models are based on physical principles and reproduce many important aspects of observed climate. More research is required to identify the processes that are most responsible for systematic biases and the magnitude and uncertainty of future projections so that more relevant performance tests can be developed. At the same time, there are many aspects of ESM evaluation that are well established and considered an essential part of systematic evaluation but have been implemented ad hoc with little community coordination. Given the diversity and complexity of ESM analysis, we argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently and consistently. We provide a perspective and viewpoint on how a more systematic, open, and rapid performance assessment of the large and diverse number of models that will participate in current and future phases of CMIP can be achieved, and announce our intention to implement such a system for CMIP6. Accomplishing this could also free up valuable resources as many scientists are frequently "re-inventing the wheel" by re-writing analysis routines for well-established analysis methods. A more systematic approach for the community would be to develop and apply evaluation tools that are based on the latest scientific knowledge and observational reference, are well suited for routine use, and provide a wide range of diagnostics and performance metrics that comprehensively characterize model behaviour as soon as the output is published to the Earth System Grid Federation (ESGF).
The CMIP infrastructure enforces data standards and conventions for model output and documentation accessible via the ESGF, additionally publishing observations (obs4MIPs) and reanalyses (ana4MIPs) for model intercomparison projects using the same data structure and organization as the ESM output. This largely facilitates routine evaluation of the ESMs, but to be able to process the data automatically alongside the ESGF, the infrastructure needs to be extended with processing capabilities at the ESGF data nodes where the evaluation tools can be executed on a routine basis. Efforts are already underway to develop community-based evaluation tools, and we encourage experts to provide additional diagnostic codes that would enhance this capability for CMIP. At the same time, we encourage the community to contribute observations and reanalyses for model evaluation to the obs4MIPs and ana4MIPs archives. The intention is to produce through the ESGF a widely accepted quasi-operational evaluation framework for CMIP6 that would routinely execute a series of standardized evaluation tasks. Over time, as this capability matures, we expect to produce an increasingly systematic characterization of models which, compared with early phases of CMIP, will more quickly and openly identify the strengths and weaknesses of the simulations. This will also reveal whether long-standing model errors remain evident in newer models and will assist modelling groups in improving their models. Finally, this framework will be designed to readily incorporate updates, including new observations and additional diagnostics and metrics as they become available from the research community.

  11. Core Professionalism Education in Surgery: A Systematic Review

    PubMed Central

    Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender

    2018-01-01

    Background: Professionalism education is one of the major elements of surgical residency education. Aims: To evaluate the studies on core professionalism education programs in surgical professionalism education. Study Design: Systematic review. Methods: This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Results: Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criterion, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. Conclusion: It is suggested that for a core surgical professionalism education program, developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable. PMID:29553464

  12. On the engineering design for systematic integration of agent-orientation in industrial automation.

    PubMed

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have great potential for increasing the autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines the advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Modeled Neutron Induced Nuclear Reaction Cross Sections for Radiochemistry in the region of Thulium, Lutetium, and Tantalum I. Results of Built in Spherical Symmetry in a Deformed Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, R. D.

    2013-09-06

    We have developed a set of modeled nuclear reaction cross sections for use in radiochemical diagnostics. Systematics for the input parameters required by the Hauser-Feshbach statistical model were developed and used to calculate neutron induced nuclear reaction cross sections for targets ranging from Terbium (Z = 65) to Rhenium (Z = 75). Of particular interest are the cross sections on Tm, Lu, and Ta including reactions on isomeric targets.

  14. Systematic evaluation of atmospheric chemistry-transport model CHIMERE

    NASA Astrophysics Data System (ADS)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene

    2017-04-01

    Regional-scale atmospheric chemistry-transport models (CTM) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving atmospheric composition. Model performance evaluation with measurement data is critical to understand the models' limits and the degree of confidence in their results. CHIMERE CTM (http://www.lmd.polytechnique.fr/chimere/) is a French national tool for operational forecast and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment (http://www.lmd.polytechnique.fr/chimere/CW-articles.php). This work presents the model evaluation framework applied systematically to new CHIMERE CTM versions in the course of continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It makes it possible to compare overall model performance across subsequent model versions (operational evaluation), to identify specific processes and/or model inputs that could be improved (diagnostic evaluation), and to test the model's sensitivity to changes in air quality, such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are: EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.
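Operational evaluation of this kind rests on standard statistics comparing modelled and observed series. A self-contained sketch with toy numbers (not EMEP data) might look like:

```python
import math

def eval_metrics(model, obs):
    """Operational-evaluation statistics: mean bias, RMSE, Pearson correlation."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    var_m = sum((m - mm) ** 2 for m in model)
    var_o = sum((o - mo) ** 2 for o in obs)
    corr = cov / math.sqrt(var_m * var_o)
    return bias, rmse, corr

# Toy surface-ozone series in ug/m3, purely illustrative.
obs   = [40.0, 55.0, 62.0, 48.0, 35.0]
model = [44.0, 58.0, 60.0, 52.0, 39.0]
bias, rmse, corr = eval_metrics(model, obs)
```

Tracking these three numbers across model versions is the essence of the "operational evaluation" tier; the diagnostic and dynamic tiers dig into why they change.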

  15. An effectiveness analysis of healthcare systems using a systems theoretic approach.

    PubMed

    Chuang, Sheuwen; Inder, Kerry

    2009-10-24

    The use of accreditation and quality measurement and reporting to improve healthcare quality and patient safety has been widespread across many countries. A review of the literature reveals no association between the accreditation system and the quality measurement and reporting systems, even when hospital compliance with these systems is satisfactory. Improvement of health care outcomes needs to be based on an appreciation of the whole system that contributes to those outcomes. The research literature currently lacks an appropriate analysis and is fragmented among activities. This paper aims to propose an integrated research model of these two systems and to demonstrate the usefulness of the resulting model for strategic research planning. To achieve these aims, a systematic integration of the healthcare accreditation and quality measurement/reporting systems is structured hierarchically. A holistic systems relationship model of the administration segment is developed to act as an investigation framework. A literature-based empirical study is used to validate the proposed relationships derived from the model. Australian experiences are used as evidence for the system effectiveness analysis and design base for an adaptive-control study proposal to show the usefulness of the system model for guiding strategic research. Three basic relationships were revealed and validated from the research literature. The systemic weaknesses of the accreditation system and quality measurement/reporting system from a system flow perspective were examined. The approach provides a system thinking structure to assist the design of quality improvement strategies. The proposed model discovers a fourth implicit relationship, a feedback between quality performance reporting components and choice of accreditation components that is likely to play an important role in health care outcomes. 
An example involving accreditation surveyors is developed that provides a systematic search for improving the impact of accreditation on quality of care and hence on the accreditation/performance correlation. There is clear value in developing a theoretical systems approach to achieving quality in health care. The introduction of the systematic surveyor-based search for improvements creates an adaptive-control system to optimize health care quality. It is hoped that these outcomes will stimulate further research in the development of strategic planning using a systems-theoretic approach for the improvement of quality in health care.

  16. MIZMAS: Modeling the Evolution of Ice Thickness and Floe Size Distributions in the Marginal Ice Zone of the Chukchi and Beaufort Seas

    DTIC Science & Technology

    2014-09-30

    ...through downscaling future projection simulations. APPROACH To address the scientific objectives, we plan to develop, implement, and validate a...ITD and FSD at the same time. The development of MIZMAS will be based on systematic model parameterization, calibration, and validation, and data

  17. BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements.

    PubMed

    Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang

    2017-10-27

    This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable to modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a code correction model by estimating the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm.
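The elevation-dependent part of such a correction amounts to piece-wise linear interpolation between estimated bias nodes. The node values below are invented for illustration only; in the paper they are estimated per satellite group from the Fengyun-3C data.

```python
import bisect

# Hypothetical piece-wise linear bias model: node elevations (deg) -> bias (m).
elev_nodes = [0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0]
bias_nodes = [0.00, -0.15, -0.35, -0.55, -0.70, -0.80, -0.85]

def code_bias(elev_deg):
    """Linearly interpolate the code bias between the two bracketing nodes."""
    i = bisect.bisect_right(elev_nodes, elev_deg) - 1
    i = min(max(i, 0), len(elev_nodes) - 2)   # clamp to the last segment
    x0, x1 = elev_nodes[i], elev_nodes[i + 1]
    y0, y1 = bias_nodes[i], bias_nodes[i + 1]
    return y0 + (y1 - y0) * (elev_deg - x0) / (x1 - x0)

def correct_pseudorange(p_obs, elev_deg):
    """Remove the elevation-dependent code bias from a raw pseudorange."""
    return p_obs - code_bias(elev_deg)
```

Applying such a correction to every pseudorange before single-frequency orbit determination mirrors the validation step the study describes.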

  18. BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements

    PubMed Central

    Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang

    2017-01-01

    This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable to modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a code correction model by estimating the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm. PMID:29076998

  19. Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature.

    PubMed

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Emile

    2010-12-01

    This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. © 2010 Milbank Memorial Fund. Published by Wiley Periodicals Inc.
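The citation-threshold step of the snowball sampling (articles cited five or more times, books seven or more) is easy to make concrete. The data structures here are assumptions for illustration, not the authors' tooling.

```python
from collections import Counter

def select_core_documents(bibliographies, article_min=5, book_min=7):
    """Merge bibliographies and keep articles cited >= article_min times
    and books cited >= book_min times (the thresholds used in the review)."""
    counts = Counter()
    kinds = {}
    for bib in bibliographies:
        for ref_id, kind in bib:          # (identifier, "article" | "book")
            counts[ref_id] += 1
            kinds[ref_id] = kind
    keep = []
    for ref_id, n in counts.items():
        threshold = article_min if kinds[ref_id] == "article" else book_min
        if n >= threshold:
            keep.append(ref_id)
    return sorted(keep)
```

Running this over the merged bibliographies of the 102 in-depth documents would yield the retrospective additions that brought the analyzed corpus to 205 documents.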

  20. Knowledge Exchange Processes in Organizations and Policy Arenas: A Narrative Systematic Review of the Literature

    PubMed Central

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Émile

    2010-01-01

    Context: This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Methods: Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. Findings: This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). Conclusions: The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. PMID:21166865

  1. Preparatory studies for the WFIRST supernova cosmology measurements

    NASA Astrophysics Data System (ADS)

    Perlmutter, Saul

    In the context of the WFIRST-AFTA Science Definition Team, we developed a first version of a supernova program, described in the WFIRST-AFTA SDT report. This program uses the imager to discover supernova candidates and an Integral Field Spectrograph (IFS) to obtain spectrophotometric light curves and higher signal-to-noise spectra of the supernovae near peak to better characterize the supernovae and thus minimize systematic errors. While this program was judged a robust one, and the estimates of the sensitivity to the cosmological parameters were felt to be reliable, time limitations meant the analysis was limited in depth on a number of issues. The goal of this proposal is to further develop this program and refine the estimates of the sensitivities to the cosmological parameters using more sophisticated systematic uncertainty models and covariance error matrices that fold in more realistic data concerning observed populations of SNe Ia as well as more realistic instrument models. We propose to develop analysis algorithms and approaches that are needed to build, optimize, and refine the WFIRST instrument and program requirements to accomplish the best supernova cosmology measurements possible. We plan to address the following: a) Use realistic supernova populations, subclasses and population drift. One bothersome uncertainty with the supernova technique is the possibility of population drift with redshift. We are in a unique position to characterize and mitigate such effects using the spectrophotometric time series of real Type Ia supernovae from the Nearby Supernova Factory (SNfactory). Each supernova in this sample has global galaxy measurements as well as additional local environment information derived from the IFS spectroscopy. We plan to develop methods of coping with this issue, e.g., by selecting similar subsamples of supernovae and allowing additional model flexibility, in order to reduce systematic uncertainties.
These studies will allow us to tune details of the WFIRST IFS, such as the wavelength coverage and S/N requirements, to capitalize on these systematic error reduction methods. b) Supernova extraction and host galaxy subtraction. The underlying light of the host galaxy must be subtracted from the supernova images making up the light curves. Using the IFS to provide the light curve points via spectrophotometry requires the subtraction of a reference spectrum of the galaxy taken after the supernova light has faded to a negligible level. We plan to apply the expertise gained from the SNfactory to develop galaxy background subtraction procedures that minimize the systematic errors introduced by this step in the analysis. c) Instrument calibration and ground-to-space cross-calibration. Calibrating the entire supernova sample will be a challenge, as no standard stars exist that span the range of magnitudes and wavelengths relevant to the WFIRST survey. Linking the supernova measurements to the relatively brighter standards will require several intermediate steps. WFIRST will produce the high-redshift sample, but the nearby supernovae that anchor the Hubble diagram will have to come from ground-based observations. Developing algorithms to carry out the cross-calibration of these two samples to the required one-percent level will be an important goal of our proposal. An integral part of this calibration will be to remove all instrumental signatures and to develop unbiased measurement techniques starting at the pixel level. We then plan to pull the above studies together into a synthesis to produce a correlated error matrix. We plan to develop a Fisher-matrix-based model to evaluate the correlated error matrix due to the various systematic errors discussed above.
A realistic error model will allow us to carry out more reliable estimates of the eventual errors on the measurement of the cosmological parameters, as well as serve as a means of optimizing and fine-tuning the requirements for the instruments and survey strategies.
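
    A forecast of this kind can be sketched numerically. The snippet below is a toy illustration, not the proposal's actual pipeline: the redshift bins, error magnitudes, and the flat wCDM distance-modulus model are all assumptions made for the example. It shows how a fully correlated calibration term enters the covariance matrix that the Fisher matrix folds in.

```python
import numpy as np

def mu_model(z, omega_m, w, h=0.7):
    """Distance modulus for a flat wCDM cosmology (toy trapezoidal integral)."""
    c, H0 = 299792.458, 100.0 * h          # km/s and km/s/Mpc
    zz = np.linspace(0.0, z, 512)
    E = np.sqrt(omega_m * (1 + zz) ** 3
                + (1 - omega_m) * (1 + zz) ** (3 * (1 + w)))
    f = 1.0 / E
    d_c = c / H0 * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zz))  # comoving distance, Mpc
    return 5 * np.log10((1 + z) * d_c) + 25   # luminosity distance -> distance modulus

def fisher_matrix(zs, cov, params=(0.3, -1.0), eps=1e-4):
    """F = J^T C^{-1} J with central-difference derivatives of mu w.r.t. (Omega_m, w)."""
    J = np.zeros((len(zs), len(params)))
    for j in range(len(params)):
        hi, lo = list(params), list(params)
        hi[j] += eps
        lo[j] -= eps
        J[:, j] = [(mu_model(z, *hi) - mu_model(z, *lo)) / (2 * eps) for z in zs]
    return J.T @ np.linalg.inv(cov) @ J

zs = np.linspace(0.1, 1.7, 17)        # hypothetical redshift bins
sig_stat = 0.08                       # mag: per-bin statistical scatter (assumed)
sig_sys = 0.02                        # mag: fully correlated calibration error (assumed)
# Diagonal statistical term plus a constant block coupling every pair of bins
cov = np.diag(np.full(len(zs), sig_stat ** 2)) + sig_sys ** 2
F = fisher_matrix(zs, cov)
param_cov = np.linalg.inv(F)          # forecast parameter covariance
```

    Setting sig_sys to zero and re-comparing the diagonal of param_cov shows directly how much a correlated systematic inflates the forecast parameter errors.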

  2. Exploring Measurement Error with Cookies: A Real and Virtual Approach via Interactive Excel

    ERIC Educational Resources Information Center

    Sinex, Scott A.; Gage, Barbara A.; Beck, Peggy J.

    2007-01-01

    A simple, guided-inquiry investigation using stacked sandwich cookies is employed to develop a linear mathematical model and to explore measurement error by incorporating errors into the investigation. Both random and systematic errors are presented. The model and errors are then investigated further by engaging with an interactive…
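
    The distinction between the two error types can be sketched in a simulated version of the stack-measurement experiment: a systematic offset (e.g. a ruler zero error) shifts the fitted intercept, while random error only adds scatter about the line. All numbers below are invented for illustration and are not taken from the activity.

```python
import numpy as np

rng = np.random.default_rng(42)

true_thickness = 9.5   # mm per cookie (hypothetical)
zero_offset = 2.0      # mm systematic error: ruler not read from its true zero
noise_sd = 0.8         # mm random measurement error

n_cookies = np.arange(1, 11)
# Each stack-height measurement carries both error types
heights = (true_thickness * n_cookies + zero_offset
           + rng.normal(0, noise_sd, n_cookies.size))

# Least-squares linear fit: height = slope * n + intercept
slope, intercept = np.polyfit(n_cookies, heights, 1)
residuals = heights - (slope * n_cookies + intercept)
```

    Because the systematic offset is absorbed entirely by the intercept, the fitted slope still recovers the per-cookie thickness; the residual scatter reflects only the random component.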

  3. Improving the Mathematics Preparation of Elementary Teachers, One Lesson at a Time

    ERIC Educational Resources Information Center

    Berk, Dawn; Hiebert, James

    2009-01-01

    In this paper, we describe a model for systematically improving the mathematics preparation of elementary teachers, one lesson at a time. We begin by identifying a serious obstacle for teacher educators: the absence of mechanisms for developing a shareable knowledge base for teacher preparation. We propose our model as a way to address this…

  4. Likelihood Methods for Adaptive Filtering and Smoothing. Technical Report #455.

    ERIC Educational Resources Information Center

    Butler, Ronald W.

    The dynamic linear model or Kalman filtering model provides a useful methodology for predicting the past, present, and future states of a dynamic system, such as an object in motion or an economic or social indicator that is changing systematically with time. Recursive likelihood methods for adaptive Kalman filtering and smoothing are developed.…
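
    The filtering recursion underlying this methodology can be sketched for the simplest scalar state-space (local-level) model. This is a generic textbook Kalman filter, not the report's adaptive method: the noise variances q and r are held fixed here, whereas the report develops recursive likelihood methods for estimating such quantities.

```python
import numpy as np

def kalman_filter(y, q, r, x0=0.0, p0=1.0):
    """Scalar local-level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t.

    q and r are the process and observation noise variances. The recursion
    alternates a predict step (uncertainty grows by q) and an update step
    (the estimate is pulled toward the new observation by the Kalman gain).
    """
    x, p = x0, p0
    states = []
    for obs in y:
        p = p + q                # predict: state uncertainty grows
        k = p / (p + r)          # Kalman gain
        x = x + k * (obs - x)    # update: blend prediction and observation
        p = (1 - k) * p
        states.append(x)
    return np.array(states)

y = np.array([1.0, 1.2, 0.9, 1.1, 1.05])   # toy observation sequence
est = kalman_filter(y, q=0.01, r=0.25)
```

    An adaptive variant would treat q and r as unknowns and choose them to maximize the likelihood of the innovations (obs - x) computed inside the loop.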

  5. Transition to adult services for young people with mental health needs: A systematic review.

    PubMed

    Paul, Moli; Street, Cathy; Wheeler, Nicola; Singh, Swaran P

    2015-07-01

    This review addresses young people's transition from child and adolescent mental health services (CAMHS) to adult mental health services (AMHS). The aim was to systematically review evidence on the effectiveness of different models of CAMHS-AMHS transitional care, service user and staff perspectives, and facilitators of/barriers to effective CAMHS-AMHS transition. The search comprised a systematic search in May 2012 of Medline, PsycINFO, CINAHL, EMBASE, AMED, Health Business Elite, HMIC, Cochrane Database, Web of Science and ASSIA; ancestral searches; and consultation with experts in the field. Eligible studies were qualitative, quantitative and mixed-methods primary research on the CAMHS-AMHS health-care transition of young people (aged 16-21 years) with mental health problems. Two reviewers independently completed a standardised data extraction form and critically evaluated identified documents using a validated appraisal tool for empirical studies with varied methodologies. A total of 19 studies of variable quality were identified. None were randomised or case-controlled trials. Studies incorporating service user/carer perspectives highlighted the need to tackle stigma and provide accessible, age-appropriate services. Parents/carers wanted more involvement with AMHS. Transitional care provision was considered patchy and often not prioritised within mental health services. There was no clear evidence of superior effectiveness of any particular model. High-quality evidence on transitional care models is lacking. Data broadly support the development of programmes that address the broader transitional care needs of 'emerging adults' and their mental health needs, but further evaluation is necessary. Developing robust transitional mental health care will require the policy-practice gap to be addressed and the development of accessible, acceptable, responsive, age-appropriate provision. © The Author(s) 2014.

  6. Understanding the Impact of Interventions to Prevent Antimicrobial Resistant Infections in the Long-Term Care Facility: A Review and Practical Guide to Mathematical Modeling.

    PubMed

    Rosello, Alicia; Horner, Carolyne; Hopkins, Susan; Hayward, Andrew C; Deeny, Sarah R

    2017-02-01

    OBJECTIVES (1) To systematically search for all dynamic mathematical models of infectious disease transmission in long-term care facilities (LTCFs); (2) to critically evaluate models of interventions against antimicrobial resistance (AMR) in this setting; and (3) to develop a checklist for hospital epidemiologists and policy makers by which to distinguish good quality models of AMR in LTCFs. METHODS The CINAHL, EMBASE, Global Health, MEDLINE, and Scopus databases were systematically searched for studies of dynamic mathematical models set in LTCFs. Models of interventions targeting methicillin-resistant Staphylococcus aureus in LTCFs were critically assessed. Using this analysis, we developed a checklist for good quality mathematical models of AMR in LTCFs. RESULTS AND DISCUSSION Overall, 18 papers described mathematical models that characterized the spread of infectious diseases in LTCFs, but no models of AMR in gram-negative bacteria in this setting were described. Future models of AMR in LTCFs require a more robust methodology (ie, formal model fitting to data and validation), greater transparency regarding model assumptions, setting-specific data, realistic and current setting-specific parameters, and inclusion of movement dynamics between LTCFs and hospitals. CONCLUSIONS Mathematical models of AMR in gram-negative bacteria in the LTCF setting, where these bacteria are increasingly becoming prevalent, are needed to help guide infection prevention and control. Improvements are required to develop outputs of sufficient quality to help guide interventions and policy in the future. We suggest a checklist of criteria to be used as a practical guide to determine whether a model is robust enough to test policy. Infect Control Hosp Epidemiol 2017;38:216-225.

  7. Evidence- and practice-informed approach to implementing peer grief support after suicide systematically in the USA.

    PubMed

    Cook, Franklin James; Langford, Linda; Ruocco, Kim

    2017-01-01

    The landmark report, Responding to Grief, Trauma, and Distress After a Suicide: U.S. National Guidelines, identifies the suicide bereaved as an underserved population and recommends systematic development of peer grief support to help meet the needs of survivors of suicide loss. A widespread array of peer grief support after suicide (PGSS) services exists nationally, but only as a decentralized network of autonomous programs. Some research indicates that peer support is generally helpful to the suicide bereaved, a finding that is reinforced by a large body of emerging research showing that peer support is effective in mental illness and substance abuse recovery. The practice, study, growth, and refinement of peer support in those fields have generated viable ideas about the elements and principles of effective peer support-for individual practitioners and for programs and organizations-that could be used to guide the systematic implementation of PGSS. In addition, a comprehensive PGSS program (Tragedy Assistance Program for Survivors) that currently serves a large population-survivors of suicide in the military-could be a model for national PGSS systems development. Finally, there are several frameworks for systems development-zero suicide, consumer-operated services, recovery-oriented systems of care, and the consumer action research model-that could guide the expansion and increased effectiveness of PGSS in keeping with the Guidelines' recommendation.

  8. Toward the First Data Acquisition Standard in Synthetic Biology.

    PubMed

    Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I

    2016-08-19

    This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.

  9. Identifying causes of Western Pacific ITCZ drift in ECMWF System 4 hindcasts

    NASA Astrophysics Data System (ADS)

    Shonk, Jonathan K. P.; Guilyardi, Eric; Toniazzo, Thomas; Woolnough, Steven J.; Stockdale, Tim

    2018-02-01

    The development of systematic biases in climate models used in operational seasonal forecasting adversely affects the quality of forecasts they produce. In this study, we examine the initial evolution of systematic biases in the ECMWF System 4 forecast model, and isolate aspects of the model simulations that lead to the development of these biases. We focus on the tendency of the simulated intertropical convergence zone in the western equatorial Pacific to drift northwards by between 0.5° and 3° of latitude depending on season. Comparing observations with both fully coupled atmosphere-ocean hindcasts and atmosphere-only hindcasts (driven by observed sea-surface temperatures), we show that the northward drift is caused by a cooling of the sea-surface temperature on the Equator. The cooling is associated with anomalous easterly wind stress and excessive evaporation during the first twenty days of hindcast, both of which occur whether air-sea interactions are permitted or not. The easterly wind bias develops immediately after initialisation throughout the lower troposphere; a westerly bias develops in the upper troposphere after about 10 days of hindcast. At this point, the baroclinic structure of the wind bias suggests coupling with errors in convective heating, although the initial wind bias is barotropic in structure and appears to have an alternative origin.

  10. Statistical modeling of interfractional tissue deformation and its application in radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Vile, Douglas J.

    In radiation therapy, interfraction organ motion introduces a level of geometric uncertainty into the planning process. Plans, which are typically based upon a single instance of anatomy, must be robust against daily anatomical variations. For this problem, a model of the magnitude, direction, and likelihood of deformation is useful. In this thesis, principal component analysis (PCA) is used to statistically model the 3D organ motion for 19 prostate cancer patients, each with 8-13 fractional computed tomography (CT) images. Deformable image registration and the resultant displacement vector fields (DVFs) are used to quantify the interfraction systematic and random motion. By applying the PCA technique to the random DVFs, principal modes of random tissue deformation were determined for each patient, and a method for sampling synthetic random DVFs was developed. The PCA model was then extended to describe the principal modes of systematic and random organ motion for the population of patients. A leave-one-out study tested the ability of both the systematic and the random motion models to represent PCA training-set DVFs. The random and systematic DVF PCA models allowed the reconstruction of these data with absolute mean errors of 0.5-0.9 mm and 1-2 mm, respectively. To the best of the author's knowledge, this study is the first successful effort to build a fully 3D statistical PCA model of systematic tissue deformation in a population of patients. By sampling synthetic systematic and random errors, organ occupancy maps were created for bony and prostate-centroid patient setup processes. By thresholding these maps, a PCA-based planning target volume (PTV) was created and tested against conventional margin recipes (van Herk for bony alignment and a 5 mm fixed [3 mm posterior] margin for centroid alignment) in a virtual clinical trial for low-risk prostate cancer. Deformably accumulated delivered dose served as a surrogate for clinical outcome.
For the bony landmark setup subtrial, the PCA PTV significantly (p<0.05) reduced D30, D20, and D5 to bladder and D50 to rectum, while increasing rectal D20 and D5. For the centroid-aligned setup, the PCA PTV significantly reduced all bladder DVH metrics and trended to lower rectal toxicity metrics. All PTVs covered the prostate with the prescription dose.
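
    The PCA machinery described above can be sketched with synthetic data. Everything in the snippet is illustrative: the array sizes are arbitrary and a random matrix stands in for registration-derived DVFs. It shows the decomposition into a mean (systematic) deformation plus principal modes of the random residuals, and the sampling of a synthetic random DVF from the retained modes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 10 fractions, each a DVF of 500 voxels x 3 components,
# flattened to one row per fraction (real data would come from deformable
# image registration).
n_frac, n_dof = 10, 1500
dvfs = rng.normal(0.0, 1.0, (n_frac, n_dof))

mean_dvf = dvfs.mean(axis=0)      # systematic (mean) deformation
residuals = dvfs - mean_dvf       # random, fraction-to-fraction deformation

# PCA via SVD: the rows of vt are the principal modes of random deformation
u, s, vt = np.linalg.svd(residuals, full_matrices=False)
explained_var = s ** 2 / (n_frac - 1)

# Sample a synthetic random DVF by drawing mode coefficients from normal
# distributions with the observed per-mode variances
k = 3                             # number of retained modes
coeffs = rng.normal(0.0, np.sqrt(explained_var[:k]))
synthetic_dvf = mean_dvf + coeffs @ vt[:k]
```

    Repeating the sampling step many times yields an ensemble of plausible anatomies, which is the ingredient needed for the occupancy maps and PCA-based PTVs described in the abstract.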

  11. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations.

    PubMed

    Mueller, Monika; D'Addario, Maddalena; Egger, Matthias; Cevallos, Myriam; Dekkers, Olaf; Mugglin, Catrina; Scott, Pippa

    2018-05-21

    Systematic reviews and meta-analyses of observational studies are frequently performed, but no widely accepted guidance is available at present. We performed a systematic scoping review of published methodological recommendations on how to systematically review and meta-analyse observational studies. We searched online databases and websites and contacted experts in the field to locate potentially eligible articles. We included articles that provided any type of recommendation on how to conduct systematic reviews and meta-analyses of observational studies. We extracted and summarised recommendations on pre-defined key items: protocol development, research question, search strategy, study eligibility, data extraction, dealing with different study designs, risk of bias assessment, publication bias, heterogeneity, statistical analysis. We summarised recommendations by key item, identifying areas of agreement and disagreement as well as areas where recommendations were missing or scarce. The searches identified 2461 articles of which 93 were eligible. Many recommendations for reviews and meta-analyses of observational studies were transferred from guidance developed for reviews and meta-analyses of RCTs. Although there was substantial agreement in some methodological areas there was also considerable disagreement on how evidence synthesis of observational studies should be conducted. Conflicting recommendations were seen on topics such as the inclusion of different study designs in systematic reviews and meta-analyses, the use of quality scales to assess the risk of bias, and the choice of model (e.g. fixed vs. random effects) for meta-analysis. There is a need for sound methodological guidance on how to conduct systematic reviews and meta-analyses of observational studies, which critically considers areas in which there are conflicting recommendations.
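
    The fixed- versus random-effects model choice on which recommendations conflict can be made concrete with a minimal sketch of the DerSimonian-Laird estimator, one common moment-based approach. The study effects and variances below are invented toy values.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Fixed- and random-effects pooled estimates for a set of studies.

    Fixed-effect weights are 1/v_i; random-effects weights add the
    between-study variance tau^2 estimated by the DerSimonian-Laird
    moment method.
    """
    e = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * e) / np.sum(w)
    # Cochran's Q and the DL moment estimate of tau^2
    q = np.sum(w * (e - fixed) ** 2)
    df = len(e) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = 1.0 / (v + tau2)
    pooled_re = np.sum(w_re * e) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return fixed, pooled_re, tau2, se_re

# Toy log odds ratios and variances from five hypothetical studies
fixed_eff, random_eff, tau2, se = dersimonian_laird(
    [0.2, 0.5, -0.1, 0.8, 0.3], [0.04, 0.09, 0.05, 0.16, 0.07])
```

    When tau^2 is estimated as zero the two pooled estimates coincide; otherwise the random-effects weights are more equal across studies and the pooled standard error is larger, which is why the model choice matters for observational evidence with high heterogeneity.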

  12. Development of a systematic career coaching program for medical students

    PubMed Central

    2018-01-01

    Purpose This study aimed to develop a systematic career-coaching program (SCCP) that can be used by medical teaching schools to address a growing need for career-coaching. The program objectives were to help students (1) develop a comprehensive self-understanding of their aptitudes, interests, and personality traits; (2) explore possible career choices and decide on a career path; and (3) develop the competencies needed to prepare for their future careers. Methods The SCCP was based on the ADDIE (analysis, design, development, implementation, and evaluation) model and decision-making questioning model. Medical professionals, medical education and career counseling experts, and students participated in designing the program. Results The SCCP describes coaching content, tools, operational methods, and appropriate timing, and identifies the professionals and specialists who can offer their expertise in the different coaching phases. It is designed to allow medical schools to offer the program in segments or in its entirety, depending on the curriculum and environment. Conclusion The SCCP represents a viable career-coaching program for medical students that can be applied in part or in its entirety, depending on a medical school’s curriculum and educational environment. PMID:29510607

  13. A systematic plan for the continued study of dimensional stability of metallic alloys considered for the fabrication of cryogenic wind tunnel models

    NASA Technical Reports Server (NTRS)

    Wigley, D. A.

    1985-01-01

    Interrelated research and development activities and the phased development of a stepped-specimen program are documented, and a sequence for a specific program of machining, validation and heat-treatment cycles for one material is described. Proposed work for the next phase of dimensional stability research is presented, and further technology development activities are proposed.

  14. A systematic review of the psychological and social benefits of participation in sport for adults: informing development of a conceptual model of health through sport

    PubMed Central

    2013-01-01

    Background The definition of health incorporates the physical, social and mental domains; however, the Physical Activity (PA) guidelines do not address social health. Furthermore, there is insufficient evidence about the levels or types of PA associated specifically with psychological health. This paper first presents the results of a systematic review of the psychological and social health benefits of participation in sport by adults. Secondly, the information arising from the systematic review has been used to develop a conceptual model of Health through Sport. Methods A systematic review of 14 electronic databases was conducted in June 2012, and studies published since 1990 were considered for inclusion. Studies that addressed mental and/or social health benefits from participation in sport were included. Results A total of 3668 publications were initially identified, of which 11 met the selection criteria. Many different psychological and social health benefits were reported, the most common being improved wellbeing and reduced distress and stress. Sport may be associated with improved psychosocial health in addition to improvements attributable to participation in PA. Specifically, club-based or team-based sport seems to be associated with better health outcomes than individual activities, due to the social nature of the participation. Notwithstanding this, individuals who prefer to participate in sport by themselves can still derive mental health benefits, which can enhance the development of true self-awareness and personal growth essential for social health. A conceptual model, Health through Sport, is proposed. The model depicts the relationship between psychological, psychosocial and social health domains, and their positive associations with sport participation, as reported in the literature.
However, it is acknowledged that the capacity to determine the existence and direction of causal links between participation and health is limited by the cross-sectional nature of studies to date. Conclusion It is recommended that participation in sport is advocated as a form of leisure-time PA for adults which can produce a range of health benefits. It is also recommended that the causal link between participation in sport and psycho-social health be further investigated and the conceptual model of Health through Sport tested. PMID:24313992

  15. Summary of Research on the Effectiveness of Math Professional Development Approaches. REL 2014-010

    ERIC Educational Resources Information Center

    Gersten, Russell; Taylor, Mary Jo; Keys, Tran D.; Rolfhus, Eric; Newman-Gonchar, Rebecca

    2014-01-01

    This study used a systematic process modeled after the What Works Clearinghouse (WWC) study review process to answer the question: What does the causal research say are effective math professional development interventions for K-12 teachers aimed at improving student achievement? The study identified and screened 910 research studies in a…

  16. The role of nitrogen availability in land-atmosphere interactions: a systematic evaluation of carbon-nitrogen coupling in a global land surface model using plot-level nitrogen fertilization experiments

    NASA Astrophysics Data System (ADS)

    Thomas, R. Q.; Goodale, C. L.; Bonan, G. B.; Mahowald, N. M.; Ricciuto, D. M.; Thornton, P. E.

    2010-12-01

    Recent research from global land surface models emphasizes the important role of nitrogen cycling in global climate, via its control on the terrestrial carbon balance. Despite the implications of nitrogen cycling for global climate predictions, the research community has not performed a systematic evaluation of nitrogen cycling in global models. Here, we present such an evaluation for one global land model, CLM-CN. In the evaluation we simulated 45 plot-scale nitrogen-fertilization experiments distributed across 33 temperate and boreal forest sites. Model predictions were evaluated against field observations by comparing the vegetation and soil carbon responses to the additional nitrogen. Aggregated across all experiments, the model predicted a larger vegetation carbon response and a smaller soil carbon response than observed; the responses partially offset each other, leading to a slightly larger total ecosystem carbon response than observed. However, the model-observation agreement improved for vegetation carbon when the sites with observed negative carbon responses to nitrogen were excluded, which may be because the model lacks mechanisms whereby nitrogen additions increase tree mortality. Among experiments, the vegetation carbon responses of younger forests and boreal forests were smaller than predicted, while those of mature forests (> 40 years old) were greater than predicted. Specific to the CLM-CN, this study used a systematic evaluation to identify key areas on which to focus model development, especially soil carbon-nitrogen interactions and boreal forest nitrogen cycling. Applicable to the wider modeling community, this study demonstrates a standardized protocol for comparing carbon-nitrogen interactions among global land models.

  17. Implementing a Systematic Process for Consistent Nursing Care in a NICU: A Quality Improvement Project.

    PubMed

    McCarley, Renay Marie; Dowling, Donna A; Dolansky, Mary A; Bieda, Amy

    2018-03-01

    The global aim of this quality improvement project was to develop and implement a systematic process to assign and maintain consistent bedside nurses for infants and families. A systematic process based on a primary care nursing model was implemented to assign consistent care for a 48-bed, single-family-room NICU. Four PDSA cycles were necessary to obtain agreement from the nursing staff as to the best process for assigning primary nurses. Post-intervention data revealed a 9.5 percent decrease in consistent caregivers for infants in the NICU ≤ 28 days and a 2.3 percent increase in consistent caregivers for infants in the NICU ≥ 29 days. Although these findings did not meet the specific aim, a systematic process was created to assign bedside nurses to infants. Further PDSA cycles will be needed to refine the process to reach the aim.

  18. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents. PMID:21955914

  19. DEPOSITION OF SULFATE ACID AEROSOLS IN THE DEVELOPING HUMAN LUNG

    EPA Science Inventory

    Computations of aerosol deposition as affected by (i) aerosol hygroscopicity, (ii) human age, and (iii) respiratory intensity are accomplished using a validated mathematical model. The interactive effects are very complicated but systematic. Few general observations can be made; ra...

  20. Women's maternity care needs and related service models in rural areas: A comprehensive systematic review of qualitative evidence.

    PubMed

    Hoang, Ha; Le, Quynh; Ogden, Kathryn

    2014-12-01

    Understanding the needs of rural women in maternity care and the service models available to them is significant for the development of effective policies and the sustainability of rural communities. Nevertheless, no systematic review of studies addressing these needs has been conducted. The objective was to synthesise the best available evidence on the experiences of women's needs in maternity care and existing service models in rural areas. A literature search of ten electronic databases, digital theses, and reference lists of relevant studies applying inclusion/exclusion criteria was conducted. Selected papers were assessed using standardised critical appraisal instruments from JBI-QARI. Data extracted from these studies were synthesised using thematic synthesis. Twelve studies met the inclusion criteria. Three main themes and several sub-themes were identified. A comprehensive set of the maternity care expectations of rural women was reported in this review, including safety (7), continuity of care (6), quality of care (6), and informed choice (4). In addition, challenges in accessing maternity services also emerged from the literature, such as access (6), the risk of travelling (9) and the associated cost of travel (9). Four models of maternity care examined in the literature were medically led care (5), GP-led care (4), midwifery-led care (7) and home birth (6). The systematic review demonstrates the importance of including well-conducted qualitative studies in informing the development of evidence-based policies to address women's maternity care needs and inform service models. Synthesising the findings from qualitative studies offers important insight for informing effective public health policy. Copyright © 2014 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  1. Multiscale modeling of lithium ion batteries: thermal aspects

    PubMed Central

    Zausch, Jochen

    2015-01-01

    Summary The thermal behavior of lithium ion batteries has a huge impact on their lifetime and on the initiation of degradation processes. The development of hot spots or large local overpotentials leading, e.g., to lithium metal deposition depends on material properties as well as on the nano- and microstructure of the electrodes. In recent years a theoretical framework has emerged that opens the possibility of establishing a systematic modeling strategy from the atomistic to the continuum scale, capturing and coupling the relevant phenomena on each scale. We outline the building blocks for such a systematic approach and discuss in detail a rigorous approach for the continuum scale based on rational thermodynamics and homogenization theories. Our focus is on the development of a systematic, thermodynamically consistent theory for thermal phenomena in batteries at the microstructure scale and at the cell scale. We discuss the importance of carefully defining the continuum fields for being able to compare seemingly different phenomenological theories and for obtaining rules to determine unknown parameters of the theory by experiments or lower-scale theories. The resulting continuum models for the microscopic and the cell scale are numerically solved in full 3D resolution. The complex, highly localized distributions of heat sources in a battery microstructure, and the problems of mapping these localized sources onto an averaged porous electrode model, are discussed by comparing detailed 3D microstructure-resolved simulations of the heat distribution with the result of the upscaled porous electrode model. It is shown that not all heat sources that exist on the microstructure scale are represented in the averaged theory, due to subtle cancellation effects of interface and bulk heat sources. Nevertheless, we find that in special cases the averaged thermal behavior can be captured very well by porous electrode theory. PMID:25977870

  2. Screening and vaccination as determined by the Social Ecological Model and the Theory of Triadic Influence: a systematic review.

    PubMed

    Nyambe, Anayawa; Van Hal, Guido; Kampen, Jarl K

    2016-11-17

Vaccination and screening are forms of primary and secondary prevention methods. These methods are recommended for controlling the spread of a vast number of diseases and conditions. To determine the most effective preventive methods to be used by a society, multi-level models have been shown to be more effective than models that focus solely on individual-level characteristics. The Social Ecological Model (SEM) and the Theory of Triadic Influence (TTI) are such models. The purpose of this systematic review was to identify the main differences and similarities of the SEM and TTI regarding screening and vaccination, in order to prepare potentially successful prevention programs for practice. A systematic review was conducted. Separate literature searches were performed during January and February 2015 using Medline, Ovid, Proquest, PubMed, University of Antwerp Discovery Service and Web of Science, for articles that apply the SEM and TTI. A data extraction form with mostly closed-end questions was developed to assist with data extraction. Aggregate descriptive statistics were utilized to summarize the general characteristics of the SEM and TTI as documented in the scientific literature. A total of 290 potentially relevant articles referencing the SEM were found. As for the TTI, a total of 131 potentially relevant articles were found. After strict evaluation against the inclusion and exclusion criteria, 40 SEM studies and 46 TTI studies were included in the systematic review. The SEM and TTI are theoretical frameworks that share many theoretical concepts and are relevant for several types of health behaviors. However, they differ in the structure of the model and in how the variables are thought to interact with each other, the TTI being a matrix while the SEM has a ring structure. The main difference consists of the division of the TTI into levels of causation (ultimate, distal and proximal), which are not considered within the levels of the SEM.
It was further found that in the articles studied in this systematic review, both models are often considered effective, while the empirical basis of these (and other) conclusions reached by their authors is in many cases unclear or incompletely specified.

  3. New Methodology for Estimating Fuel Economy by Vehicle Class

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated from vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
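The reconciliation step described above can be sketched in miniature: minimize the squared deviations from the preliminary per-class estimates, subject to the constraint that the classes sum to the published control total. The numbers and the equal-weight least-squares objective below are illustrative assumptions, not the actual Highway Statistics formulation.

```python
# Minimal sketch of the reconciliation idea: adjust preliminary per-class
# fuel-consumption estimates so they sum to a published control total while
# deviating as little as possible in the least-squares sense.
# All numbers are made up, not Highway Statistics data.

def reconcile(prelim, total):
    """Minimize sum((x_i - p_i)^2) subject to sum(x_i) == total.

    With an unweighted squared-error objective, the Lagrange-multiplier
    solution spreads the discrepancy equally: x_i = p_i + (total - sum(p)) / n.
    """
    n = len(prelim)
    shift = (total - sum(prelim)) / n
    return [p + shift for p in prelim]

prelim = [50.0, 30.0, 15.0]   # vehicle-stock-model estimates by class
control_total = 98.0          # published total (a Table MF-21 analogue)
print(reconcile(prelim, control_total))  # [51.0, 31.0, 16.0], sums to 98.0
```

A weighted objective would instead spread the discrepancy in proportion to the weights, which is how a formulation could favor adjusting the less reliable classes.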

  4. Risk prediction models for graft failure in kidney transplantation: a systematic review.

    PubMed

    Kaboré, Rémi; Haller, Maria C; Harambat, Jérôme; Heinze, Georg; Leffondré, Karen

    2017-04-01

    Risk prediction models are useful for identifying kidney recipients at high risk of graft failure, thus optimizing clinical care. Our objective was to systematically review the models that have been recently developed and validated to predict graft failure in kidney transplantation recipients. We used PubMed and Scopus to search for English, German and French language articles published in 2005-15. We selected studies that developed and validated a new risk prediction model for graft failure after kidney transplantation, or validated an existing model with or without updating the model. Data on recipient characteristics and predictors, as well as modelling and validation methods were extracted. In total, 39 articles met the inclusion criteria. Of these, 34 developed and validated a new risk prediction model and 5 validated an existing one with or without updating the model. The most frequently predicted outcome was graft failure, defined as dialysis, re-transplantation or death with functioning graft. Most studies used the Cox model. There was substantial variability in predictors used. In total, 25 studies used predictors measured at transplantation only, and 14 studies used predictors also measured after transplantation. Discrimination performance was reported in 87% of studies, while calibration was reported in 56%. Performance indicators were estimated using both internal and external validation in 13 studies, and using external validation only in 6 studies. Several prediction models for kidney graft failure in adults have been published. Our study highlights the need to better account for competing risks when applicable in such studies, and to adequately account for post-transplant measures of predictors in studies aiming at improving monitoring of kidney transplant recipients. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
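The discrimination performance reported in 87% of the reviewed studies is typically summarized, for Cox-type models, by Harrell's concordance index. The sketch below computes a basic version of it on made-up data; real analyses handle tied times and censoring weights more carefully.

```python
# Basic Harrell's C-index: the fraction of usable subject pairs in which
# the subject who fails earlier has the higher predicted risk.
# Data are illustrative, not from any study in the review.

def c_index(times, events, risk_scores):
    """A pair (i, j) is usable when the earlier time is an observed
    event (event flag == 1) and the times are not tied."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so subject a has the earlier time
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if times[a] == times[b] or not events[a]:
                continue  # tied times or earlier subject censored: skip
            usable += 1
            if risk_scores[a] > risk_scores[b]:
                concordant += 1
            elif risk_scores[a] == risk_scores[b]:
                concordant += 0.5  # tied predictions count half
    return concordant / usable

times  = [2.0, 5.0, 1.0, 8.0]   # years to graft failure or censoring
events = [1,   0,   1,   1]     # 1 = failure observed, 0 = censored
risks  = [0.9, 0.3, 0.8, 0.1]   # higher = predicted higher risk
print(c_index(times, events, risks))  # 0.8
```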

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarazona, David; Berz, Martin; Hipple, Robert

The main goal of the Muon g-2 Experiment (g-2) at Fermilab is to measure the muon anomalous magnetic moment to unprecedented precision. This new measurement will allow testing of the completeness of the Standard Model (SM) and validation of other theoretical models beyond the SM. The close interplay between understanding the particle beam dynamics and preparing the beam properties for the experimental measurement is essential to reducing systematic errors in the determination of the muon anomalous magnetic moment. We describe progress in developing detailed calculations and models of the muon beam delivery system in order to obtain a better understanding of spin-orbit correlations, nonlinearities, and other realistic aspects that contribute to the systematic errors of the g-2 measurement. Our simulation is meant to provide, among other things, statistical studies of error effects and quick analyses of running conditions while g-2 is taking beam. We are using COSY, a differential algebra solver developed at Michigan State University, which will also serve as an independent means of comparing results obtained by other simulation teams of the g-2 Collaboration.


  6. Preference heterogeneity in a count data model of demand for off-highway vehicle recreation

    Treesearch

    Thomas P Holmes; Jeffrey E Englin

    2010-01-01

    This paper examines heterogeneity in the preferences for OHV recreation by applying the random parameters Poisson model to a data set of off-highway vehicle (OHV) users at four National Forest sites in North Carolina. The analysis develops estimates of individual consumer surplus and finds that estimates are systematically affected by the random parameter specification...
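For context on how surplus estimates arise in such models: in a semi-log count-data travel-cost model with demand ln E[trips] = a + b·cost and b < 0, consumer surplus per trip is -1/b, so an individual's seasonal surplus is trips/(-b). A random-parameters specification gives each user their own draw of b, which is exactly why surplus estimates vary with the specification. The coefficients below are illustrative, not estimates from the paper.

```python
# Hedged sketch of individual consumer surplus (CS) in a Poisson
# travel-cost model: CS = trips / (-b), where b is the (negative)
# travel-cost coefficient. Numbers are invented for illustration.

def consumer_surplus(trips, travel_cost_coef):
    """Individual seasonal CS = trips / (-b) for b < 0."""
    if travel_cost_coef >= 0:
        raise ValueError("travel-cost coefficient must be negative")
    return trips / (-travel_cost_coef)

# Fixed-coefficient model: every user shares b = -0.05 (per dollar)
print(consumer_surplus(10, -0.05))  # 200.0

# Random-parameters model: person-specific draws of b give a spread of CS
draws = [-0.03, -0.05, -0.08]
print([round(consumer_surplus(10, b), 1) for b in draws])
```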

  7. Systematic Model for Validating Equipment Uses in Selected Marketing and Distribution Education Programs. Final Report, February 1, 1980-June 30, 1981.

    ERIC Educational Resources Information Center

    Gildan, Kate; Buckner, Leroy

Research was conducted to provide a model for selecting the equipment for marketing and distributive education programs that is required for the development of the skills or competencies needed to perform in marketing and distribution occupations. A review of the literature identified both competency statements for three program areas--Fashion…

  8. Generalized IRT Models for Extreme Response Style

    ERIC Educational Resources Information Center

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Extreme response style (ERS) is a systematic tendency for a person to endorse extreme options (e.g., strongly disagree, strongly agree) on Likert-type or rating-scale items. In this study, we develop a new class of item response theory (IRT) models to account for ERS so that the target latent trait is free from the response style and the tendency…
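For background, the standard partial credit model (PCM) is the kind of baseline IRT formulation that ERS-aware extensions such as the one in this article build on. The sketch below computes PCM category probabilities for a five-category rating item; the step parameters are illustrative, and this is not the new model the authors develop.

```python
import math

# Generic partial credit model (PCM) for a rating-scale item:
# P(X = k) is proportional to exp(sum_{j<=k} (theta - b_j)),
# with the k = 0 term fixed at 1. Step parameters are illustrative.

def pcm_probs(theta, steps):
    """Return category probabilities for person ability `theta`
    and item step difficulties `steps` (len(steps)+1 categories)."""
    numerators = [1.0]
    cum = 0.0
    for b in steps:
        cum += theta - b
        numerators.append(math.exp(cum))
    total = sum(numerators)
    return [x / total for x in numerators]

probs = pcm_probs(theta=0.5, steps=[-1.0, 0.0, 1.0, 2.0])  # 5 categories
print([round(p, 3) for p in probs])  # probabilities sum to 1
```

An ERS-style extension would add a person parameter that inflates the probability mass on the extreme categories; the point of the article's approach is to separate that tendency from the target trait theta.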

  9. Transference of Responsibility Model Goals to the School Environment: Exploring the Impact of a Coaching Club Program

    ERIC Educational Resources Information Center

    Walsh, David S.; Ozaeta, Jimmy; Wright, Paul M.

    2010-01-01

    Background: The Teaching Personal and Social Responsibility Model (TPSR) has been used throughout the USA and in several other countries to integrate systematically life skill development within physical activity-based programs. While TPSR is widely used in practice and has a growing empirical base, few studies have examined the degree of…

  10. The ILRS Contribution to ITRF2013

    NASA Astrophysics Data System (ADS)

    Pavlis, Erricos C.; Luceri, Cinzia; Sciarretta, Cecilia; Evans, Keith

    2014-05-01

Satellite Laser Ranging (SLR) data have contributed to the definition of the International Terrestrial Reference Frame (ITRF) over the past three decades. The development of ITRF2005 ushered in a new era with the use of weekly or session contributions, allowing greater flexibility in the editing, relative weighting, and combination of information from the four contributing techniques. The new approach allows each Service to generate a solution based on the rigorous combination of the individual Analysis Centers' contributions, which provides an opportunity to verify intra-technique consistency and to compare internal procedures and adopted models. The intra- and inter-technique comparisons that the time series approach facilitates are an extremely powerful diagnostic that highlights differences and inconsistencies at the single-station level. Over the past year the ILRS Analysis Working Group (AWG) worked on designing an improved ILRS contribution for the development of ITRF2013. The ILRS approach is based on the current IERS Conventions 2010 and our internal ILRS standards, with a few documented deviations. Since the Global Geodetic Observing System (GGOS) identified the ITRF as its key project, the ILRS has taken a two-pronged approach in order to meet its stringent goals: modernizing the engineering components (ground and space segments), and revising the modeling standards to take advantage of recent improvements in system Earth modeling. The main concern in the case of SLR is monitoring systematic errors at individual stations, accounting for undocumented discontinuities, and improving the target signature models. The latter has been addressed with the adoption of mm-level models for all of our targets. As for station systematics, the AWG had already embarked on a major effort to improve the handling of such errors prior to the development of ITRF2008.
The results of that effort formed the foundation for the re-examination of the systematic errors at all sites. The new process benefited extensively from the results of the quality control process that ILRS provides on a daily basis as a feedback to the stations, and the recovery of systematic error corrections from the data themselves through targeted investigations. The present re-analysis extends from 1983 to the end of 2013. The data quality for the early period 1983-1993 is significantly poorer than for the recent years. However, it contributes to the overall stability of the datum definition, especially in terms of its origin and scale and, as the more recent and higher quality data accumulate, the significance of the early data will progressively diminish. As in the case of ITRF2008, station engineers and analysts have worked together to determine the magnitude and cause of systematic errors that were noticed during the analysis, rationalize them based on events at the stations, and develop appropriate corrections whenever possible. This presentation will give an overview of the process and examples from the various steps.

  11. The problem of genesis and systematic of sedimentary units of hydrocarbon reservoirs

    NASA Astrophysics Data System (ADS)

    Zhilina, E. N.; Chernova, O. S.

    2017-12-01

The problem of identifying and ranking sedimentation and facies associations and their constituent parts - lithogenetic types of sedimentary rocks - is considered. As a basis for paleo-sedimentary modelling, the author has developed a classification for terrigenous natural reservoirs that, for the first time, links separate sedimentological units into a single hierarchical system. The hierarchy ranking levels are based on a compilation of global knowledge and experience in sediment geology, sedimentological study and systematization, and data from deep-well cores representing Jurassic hydrocarbon-bearing formations of the southeastern margin of the Western Siberian sedimentary basin.

  12. Transforming the urban food desert from the grassroots up: a model for community change.

    PubMed

    Lewis, LaVonna Blair; Galloway-Gilliam, Lark; Flynn, Gwendolyn; Nomachi, Jonathan; Keener, LaTonya Chavis; Sloane, David C

    2011-01-01

    Confronted by continuing health disparities in vulnerable communities, Community Health Councils (CHC), a nonprofit community-based organization in South Los Angeles, worked with the African Americans Building a Legacy of Health Coalition and research partners to develop a community change model to address the root causes of health disparities within the community's African American population. This article discusses how the CHC Model's development and application led to public policy interventions in a "food desert." The CHC Model provided a systematic approach to engaging impacted communities in support of societal level reforms, with the goal to influence health outcomes.

  13. Nonholonomic Hamiltonian Method for Meso-macroscale Simulations of Reacting Shocks

    NASA Astrophysics Data System (ADS)

    Fahrenthold, Eric; Lee, Sangyup

    2015-06-01

The seamless integration of macroscale, mesoscale, and molecular-scale models of reacting shock physics has been hindered by dramatic differences in the model formulation techniques normally used at different scales. In recent research the authors have developed the first unified discrete Hamiltonian approach to multiscale simulation of reacting shock physics. Unlike previous work, the formulation employs reacting thermomechanical Hamiltonian formulations at all scales, including the continuum, and uses a nonholonomic modeling approach to systematically couple the models developed at each scale. Example applications of the method show meso-macroscale shock-to-detonation simulations in nitromethane and RDX. Research supported by the Defense Threat Reduction Agency.

  14. Microenterprise Development Interventions for Sexual Risk Reduction: A Systematic Review

    PubMed Central

    Lee, Ramon; Thirumurthy, Harsha; Muessig, Kathryn E.; Tucker, Joseph D.

    2013-01-01

    Comprehensive interventions that address both individual and structural determinants associated with HIV/STI risk are gaining increasing attention over the past decade. Microenterprise development offers an appealing model for HIV prevention by addressing poverty and gender equality. This study systematically reviewed the effects of microenterprise development interventions on HIV/STI incidence and sexual risk behaviors. Microenterprise development was defined as developing small business capacity among individuals to alleviate poverty. Seven eligible research studies representing five interventions were identified and included in this review. All of the studies targeted women, and three focused on sex workers. None measured biomarker outcomes. All three sex worker studies showed significant reduction in sexual risk behaviors when compared to the control group. Non-sex worker studies showed limited changes in sexual risk behavior. This review indicates the potential utility of microenterprise development in HIV risk reduction programs. More research is needed to determine how microenterprise development can be effectively incorporated in comprehensive HIV control strategies. PMID:23963497

  15. Microenterprise development interventions for sexual risk reduction: a systematic review.

    PubMed

    Cui, Rosa R; Lee, Ramon; Thirumurthy, Harsha; Muessig, Kathryn E; Tucker, Joseph D

    2013-11-01

    Comprehensive interventions that address both individual and structural determinants associated with HIV/STI risk are gaining increasing attention over the past decade. Microenterprise development offers an appealing model for HIV prevention by addressing poverty and gender equality. This study systematically reviewed the effects of microenterprise development interventions on HIV/STI incidence and sexual risk behaviors. Microenterprise development was defined as developing small business capacity among individuals to alleviate poverty. Seven eligible research studies representing five interventions were identified and included in this review. All of the studies targeted women, and three focused on sex workers. None measured biomarker outcomes. All three sex worker studies showed significant reduction in sexual risk behaviors when compared to the control group. Non-sex worker studies showed limited changes in sexual risk behavior. This review indicates the potential utility of microenterprise development in HIV risk reduction programs. More research is needed to determine how microenterprise development can be effectively incorporated in comprehensive HIV control strategies.

  16. A Decision Model for Supporting Task Allocation Processes in Global Software Development

    NASA Astrophysics Data System (ADS)

    Lamersdorf, Ansgar; Münch, Jürgen; Rombach, Dieter

Today, software-intensive systems are increasingly being developed in a globally distributed way. However, besides its benefits, global development also bears a set of risks and problems. One critical factor for successful project management of distributed software development is the allocation of tasks to sites, as this is assumed to have a major influence on the benefits and risks. We introduce a model that aims at improving management processes in globally distributed projects by giving decision support for task allocation that systematically regards multiple criteria. The criteria and causal relationships were identified in a literature study and refined in a qualitative interview study. The model uses existing approaches from distributed systems and statistical modeling. The article gives an overview of the problem and related work, introduces the empirical and theoretical foundations of the model, and shows the use of the model in an example scenario.
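One minimal way to picture criteria-based decision support for task allocation is a weighted-sum score over candidate sites. The criteria, weights, and site profiles below are hypothetical and far simpler than the causal model the article builds; they only illustrate the idea of systematically regarding multiple criteria.

```python
# Hypothetical multi-criteria scoring for task-to-site allocation.
# Criteria names, weights, and site profiles are invented for illustration.

CRITERIA_WEIGHTS = {"cost": 0.4, "expertise": 0.4, "time_overlap": 0.2}

sites = {
    "SiteA": {"cost": 0.9, "expertise": 0.6, "time_overlap": 0.8},
    "SiteB": {"cost": 0.5, "expertise": 0.9, "time_overlap": 0.4},
}

def score(site_profile):
    """Weighted sum of normalized criterion scores (higher is better)."""
    return sum(CRITERIA_WEIGHTS[c] * site_profile[c] for c in CRITERIA_WEIGHTS)

best = max(sites, key=lambda s: score(sites[s]))
print(best, round(score(sites[best]), 2))  # SiteA 0.76
```

In a richer model the weights would come from the elicited causal relationships rather than being fixed constants, and risks would enter as penalties rather than flat criterion scores.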

  17. Developing core economic outcome sets for asthma studies: a protocol for a systematic review.

    PubMed

    Hounsome, Natalia; Fitzsimmons, Deborah; Phillips, Ceri; Patel, Anita

    2017-08-11

    Core outcome sets are standardised lists of outcomes, which should be measured and reported in all clinical studies of a specific condition. This study aims to develop core outcome sets for economic evaluations in asthma studies. Economic outcomes include items such as costs, resource use or quality-adjusted life years. The starting point in developing core outcome sets will be conducting a systematic literature review to establish a preliminary list of reporting items to be considered for inclusion in the core outcome set. We will conduct literature searches of peer-reviewed studies published from January 1990 to January 2017. These will include any comparative or observational studies (including economic models) and systematic reviews reporting economic outcomes. All identified economic outcomes will be tabulated together with the major study characteristics, such as population, study design, the nature and intensity of the intervention, mode of data collection and instrument(s) used to derive an outcome. We will undertake a 'realist synthesis review' to analyse the identified economic outcomes. The outcomes will be summarised in the context of evaluation perspectives, types of economic evaluation and methodological approaches. Parallel to undertaking a systematic review, we will conduct semistructured interviews with stakeholders (including people with personal experience of asthma, health professionals, researchers and decision makers) in order to explore additional outcomes which have not been considered, or used, in published studies. The list of outcomes generated from the systematic review and interviews with stakeholders will form the basis of a Delphi survey to refine the identified outcomes into a core outcome set. The review will not involve access to individual-level data. 
Findings from our systematic review will be communicated to a broad range of stakeholders including clinical guideline developers, research funders, trial registries, ethics committees and other regulators. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Systematic Dissemination of Research and Development Program Improvement Efforts.

    ERIC Educational Resources Information Center

    Sanders, Carol S.

A systematic approach to dissemination of vocational education research and development program improvement efforts is comprehensive, effective, and efficient. Systematic dissemination is a prerequisite link to assessing the impact of research and development--for program improvement to occur, successful dissemination is crucial. A systematic approach…

  19. A systematic review of advance practice providers in acute care: options for a new model in a burn intensive care unit.

    PubMed

    Edkins, Renee E; Cairns, Bruce A; Hultman, C Scott

    2014-03-01

Accreditation Council for Graduate Medical Education-mandated work-hour restrictions have negatively impacted many areas of clinical care, including management of burn patients, who require intensive monitoring, resuscitation, and procedural interventions. As surgery residents become less available to meet service needs, new models integrating advanced practice providers (APPs) into the burn team must emerge. We performed a systematic review of APPs in critical care, asking how best to use all providers to solve these workforce challenges. We searched PubMed, CINAHL, Ovid, and Google Scholar, from 2002 to 2012, using the key words: nurse practitioner, physician assistant, critical care, and burn care. After applying inclusion/exclusion criteria, 18 relevant articles were selected for review. In addition, throughput and financial models were developed to examine provider staffing patterns. Advanced practice providers in critical care settings function in various models, both with and without residents, reporting to either an intensivist or an attending physician. When APPs participated, patient outcomes were similar or improved across provider models. Several studies reported considerable cost savings due to decreased length of stay, decreased ventilator days, and fewer urinary tract infections when nurse practitioners were included in the provider mix. Restrictions in resident work-hours and changing health care environments require that new provider models be created for acute burn care. This article reviews current utilization of APPs in critical care units and proposes a new provider model for burn centers.

  20. Shoreline Change and Storm-Induced Beach Erosion Modeling: A Collection of Seven Papers

    DTIC Science & Technology

    1990-03-01

reducing, and analyzing the data in a systematic manner. Most physical data needed for evaluating and interpreting shoreline and beach evolution processes...proposed development concepts using both physical and numerical models. b. Analyzed and interpreted model results. c. Provided technical documentation of... interpret study results in the context required for "Confirmation" hearings. The Corps of Engineers, Los Angeles District (SPL), has also begun studies

  1. Towards a Pedagogy for Cooperative Education.

    ERIC Educational Resources Information Center

    Heinemann, Harry N.

    1983-01-01

    Discusses the need to develop an appropriate pedagogy for cooperative education to integrate the educational outcomes from the work and classroom experience. Suggests that the model should structure, guide, and provide for the systematic evaluation of the learning resulting from the cooperative education assignment. (JOW)

  2. Systematic development and optimization of chemically defined medium supporting high cell density growth of Bacillus coagulans.

    PubMed

    Chen, Yu; Dong, Fengqing; Wang, Yonghong

    2016-09-01

With defined components and experimental reproducibility, the chemically defined medium (CDM) and the minimal chemically defined medium (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high cell density growth of Bacillus coagulans, which is a promising producer of lactic acid and other bio-chemicals. In this study, a systematic methodology combining experimental techniques with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single-omission technique and single-addition technique were employed to determine the essential and stimulatory compounds, before the optimization of their concentrations by statistical methods. In addition, to improve growth rationally, in silico omission and addition were performed by FBA based on the construction of a medium-size metabolic model of B. coagulans 36D1. Thus, CDMs were developed that support considerable biomass production in at least five B. coagulans strains, including the two model strains B. coagulans 36D1 and ATCC 7050.
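The single-omission technique mentioned above can be sketched as a simple screen: omit one component at a time from the full defined medium, measure growth relative to the complete medium, and classify the component accordingly. The growth readings and classification cutoffs below are invented for illustration, not values from the study.

```python
# Sketch of a single-omission screen for a chemically defined medium.
# Growth readings and cutoffs are hypothetical stand-ins for measurements.

FULL_MEDIUM_GROWTH = 1.0  # growth on the complete medium (reference)

def classify(relative_growth, essential_cut=0.1, stimulatory_cut=0.8):
    """A component is essential if growth collapses without it,
    stimulatory if growth drops noticeably, otherwise dispensable."""
    if relative_growth < essential_cut:
        return "essential"
    if relative_growth < stimulatory_cut:
        return "stimulatory"
    return "dispensable"

# Hypothetical growth after omitting each component, one at a time
omission_growth = {"glucose": 0.02, "biotin": 0.55, "glycine": 0.95}
screen = {c: classify(g / FULL_MEDIUM_GROWTH)
          for c, g in omission_growth.items()}
print(screen)
```

The single-addition step works the other way around: start from a reduced medium and add candidates back one at a time, keeping those that raise growth. The in silico FBA step plays the same role computationally, deleting or adding exchange reactions in the metabolic model.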

  3. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

Modern research in the Earth sciences is developing from descriptions of individual natural phenomena toward systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here, seismic events); and (2) a compact model of the relative motion of celestial bodies in space-time around the Earth, known as the "Method of a Moving Source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of heavenly bodies. Based on the aggregation of data from the space and Earth sciences, their systematization, and cooperative analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  4. A systematic review of the psychological and social benefits of participation in sport for children and adolescents: informing development of a conceptual model of health through sport.

    PubMed

    Eime, Rochelle M; Young, Janet A; Harvey, Jack T; Charity, Melanie J; Payne, Warren R

    2013-08-15

There are specific guidelines regarding the level of physical activity (PA) required to provide health benefits. However, the research underpinning these PA guidelines does not address the element of social health. Furthermore, there is insufficient evidence about the levels or types of PA associated specifically with psychological health. This paper first presents the results of a systematic review of the psychological and social health benefits of participation in sport by children and adolescents. Secondly, the information arising from the systematic review has been used to develop a conceptual model. A systematic review of 14 electronic databases was conducted in June 2012, and studies published since 1990 were considered for inclusion. Studies that addressed mental and/or social health benefits from participation in sport were included. A total of 3668 publications were initially identified, of which 30 met the selection criteria. Many different psychological and social health benefits were reported, the most common being improved self-esteem and social interaction, followed by fewer depressive symptoms. Sport may be associated with improved psychosocial health above and beyond improvements attributable to participation in PA alone. Specifically, team sport seems to be associated with improved health outcomes compared to individual activities, due to the social nature of the participation. A conceptual model, Health through Sport, is proposed. The model depicts the relationship between psychological, psychosocial and social health domains, and their positive associations with sport participation, as reported in the literature. However, it is acknowledged that the capacity to determine the existence and direction of causal links between participation and health is limited by the fact that the majority of studies identified (n=21) were cross-sectional.
It is recommended that community sport participation is advocated as a form of leisure time PA for children and adolescents, in an effort to not only improve physical health in relation to such matters as the obesity crisis, but also to enhance psychological and social health outcomes. It is also recommended that the causal link between participation in sport and psychosocial health be further investigated and the conceptual model of Health through Sport tested.

  5. A whole school approach: collaborative development of school health policies, processes, and practices.

    PubMed

    Hunt, Pete; Barrios, Lisa; Telljohann, Susan K; Mazyck, Donna

    2015-11-01

The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. The existing literature, including scientific articles, programmatic guidance, and publications by national agencies and organizations, was reviewed and synthesized to provide an overview of the interrelatedness of learning and health and the 10 components of the WSCC model. The literature suggests potential benefits of applying the WSCC model at the district and school level. However, the model lacks specific guidance as to how this might be made actionable. A collaborative approach to health and learning is suggested, including a 10-step systematic process to help schools and districts develop an action plan for improving health and education outcomes. Essential preliminary actions are suggested to minimize the impact of the challenges that commonly derail systematic planning processes and program implementation, such as lack of readiness, personnel shortages, insufficient resources, and competing priorities. All new models require testing and evidence to confirm their value. Districts and schools will need to test this model and put plans into action to show that significant, substantial, and sustainable health and academic outcomes can be achieved. © 2015 The Authors. Journal of School Health published by Wiley Periodicals, Inc. on behalf of American School Health Association.

  6. Factors supporting good partnership working between generalist and specialist palliative care services: a systematic review.

    PubMed

    Gardiner, Clare; Gott, Merryn; Ingleton, Christine

    2012-05-01

    The care that most people receive at the end of their lives is provided not by specialist palliative care professionals but by generalists such as GPs, district nurses and others who have not undertaken specialist training in palliative care. A key focus of recent UK policy is improving partnership working across the spectrum of palliative care provision. However, there is little evidence on the factors that support collaborative working between specialist and generalist palliative care providers. To explore factors that support partnership working between specialist and generalist palliative care providers. Systematic review. A systematic review of studies relating to partnership working between specialist and generalist palliative care providers was undertaken. Six electronic databases were searched for papers published up until January 2011. Of the 159 articles initially identified, 22 papers met the criteria for inclusion. Factors supporting good partnership working included: good communication between providers; clear definition of roles and responsibilities; opportunities for shared learning and education; appropriate and timely access to specialist palliative care services; and coordinated care. Multiple examples exist of good partnership working between specialist and generalist providers; however, there is little consistency regarding how models of collaborative working are developed, and which models are most effective. Little is known about the direct impact of collaborative working on patient outcomes. Further research is required to gain the direct perspectives of health professionals and patients regarding collaborative working in palliative care, and to develop appropriate and cost-effective models for partnership working.

  7. Guidelines for performing systematic reviews in the development of toxicity factors.

    PubMed

    Schaefer, Heather R; Myers, Jessica L

    2017-12-01

    The Texas Commission on Environmental Quality (TCEQ) developed guidance on conducting systematic reviews during the development of chemical-specific toxicity factors. Using elements from publicly available frameworks, the TCEQ systematic review process was developed in order to supplement the existing TCEQ Guidelines for developing toxicity factors (TCEQ Regulatory Guidance 442). The TCEQ systematic review process includes six steps: 1) Problem Formulation; 2) Systematic Literature Review and Study Selection; 3) Data Extraction; 4) Study Quality and Risk of Bias Assessment; 5) Evidence Integration and Endpoint Determination; and 6) Confidence Rating. This document provides guidance on conducting a systematic literature review and integrating evidence from different data streams when developing chemical-specific reference values (ReVs) and unit risk factors (URFs). However, this process can also be modified or expanded to address other questions that would benefit from systematic review practices. The systematic review and evidence integration framework can improve regulatory decision-making processes, increase transparency, minimize bias, improve consistency between different risk assessments, and further improve confidence in toxicity factor development. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  8. A systematic approach to engineering ethics education.

    PubMed

    Li, Jessica; Fu, Shengli

    2012-06-01

    Engineering ethics education is a complex field characterized by dynamic topics and diverse students, which results in significant challenges for engineering ethics educators. The purpose of this paper is to introduce a systematic approach to determine what to teach and how to teach in an ethics curriculum. This is a topic that has not been adequately addressed in the engineering ethics literature. This systematic approach provides a method to: (1) develop a context-specific engineering ethics curriculum using the Delphi technique, a process-driven research method; and (2) identify appropriate delivery strategies and instructional strategies using an instructional design model. This approach considers the context-specific needs of different engineering disciplines in ethics education and leverages the collaboration of engineering professors, practicing engineers, engineering graduate students, ethics scholars, and instructional design experts. The proposed approach is most suitable for a department, a discipline/field or a professional society. The approach helps to enhance learning outcomes and to facilitate ethics education curriculum development as part of the regular engineering curriculum.
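    The Delphi technique mentioned above aggregates expert ratings over successive rounds until consensus is reached. As an illustrative sketch only (the 1-9 rating scale, the IQR threshold, and the agreement cutoff below are common Delphi conventions, not details taken from this study), one round for a single candidate topic might be summarized as:

```python
from statistics import quantiles

def delphi_consensus(ratings, iqr_threshold=1.0, agree_cutoff=7):
    """Summarize one Delphi round for a single candidate topic.

    ratings: panelists' importance ratings on a 1-9 scale.
    Consensus is declared when the interquartile range is small
    and the median falls in the 'agree' region of the scale.
    """
    q1, q2, q3 = quantiles(ratings, n=4)  # quartiles of the panel ratings
    iqr = q3 - q1
    return {
        "median": q2,
        "iqr": iqr,
        "consensus": iqr <= iqr_threshold and q2 >= agree_cutoff,
    }

# Eight hypothetical panelists rating one proposed curriculum topic
round_summary = delphi_consensus([8, 9, 8, 7, 8, 9, 8, 7])
```

    Topics that fail the consensus test would be fed back to the panel, with the group summary, for re-rating in the next round.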

  9. Protocol: a systematic review of studies developing and/or evaluating search strategies to identify prognosis studies.

    PubMed

    Corp, Nadia; Jordan, Joanne L; Hayden, Jill A; Irvin, Emma; Parker, Robin; Smith, Andrea; van der Windt, Danielle A

    2017-04-20

    Prognosis research is on the rise, its importance recognised because chronic health conditions and diseases are increasingly common and costly. Prognosis systematic reviews are needed to collate and synthesise these research findings, especially to help inform effective clinical decision-making and healthcare policy. A detailed, comprehensive search strategy is central to any systematic review. However, within prognosis research, this is challenging due to poor reporting and inconsistent use of available indexing terms in electronic databases. Whilst many published search filters exist for finding clinical trials, this is not the case for prognosis studies. This systematic review aims to identify and compare existing methodological filters developed and evaluated to identify prognosis studies of any of the three main types: overall prognosis, prognostic factors, and prognostic [risk prediction] models. Primary studies reporting the development and/or evaluation of methodological search filters to retrieve any type of prognosis study will be included in this systematic review. Multiple electronic bibliographic databases will be searched, grey literature will be sought from relevant organisations and websites, experts will be contacted, and citation tracking of key papers and reference list checking of all included papers will be undertaken. Titles will be screened by one person, and abstracts and full articles will be reviewed for inclusion independently by two reviewers. Data extraction and quality assessment will also be undertaken independently by two reviewers with disagreements resolved by discussion or by a third reviewer if necessary. Filters' characteristics and performance metrics reported in the included studies will be extracted and tabulated. To enable comparisons, filters will be grouped according to database, platform, type of prognosis study, and type of filter for which it was intended. 
This systematic review will identify all existing validated prognosis search filters and synthesise evidence about their applicability and performance. These findings will indicate whether current filters provide a proficient means of searching electronic bibliographic databases or whether further prognosis filters are needed and can feasibly be developed for systematic searches of prognosis studies.
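    The performance of such search filters is conventionally reported with sensitivity (recall), precision, specificity, and the number needed to read. A minimal sketch of these calculations against a gold-standard reference set (the record IDs and counts below are invented for illustration):

```python
def filter_performance(retrieved, relevant, total_records):
    """Standard diagnostic metrics for a bibliographic search filter.

    retrieved:     set of record IDs returned by the filter
    relevant:      set of record IDs in the gold-standard reference set
    total_records: size of the database segment searched
    """
    tp = len(retrieved & relevant)   # relevant records the filter found
    fp = len(retrieved - relevant)   # irrelevant records it returned
    fn = len(relevant - retrieved)   # relevant records it missed
    tn = total_records - tp - fp - fn
    sensitivity = tp / len(relevant) if relevant else 0.0
    precision = tp / len(retrieved) if retrieved else 0.0
    return {
        "sensitivity": sensitivity,  # a.k.a. recall
        "precision": precision,
        "specificity": tn / (tn + fp) if (tn + fp) else 0.0,
        "nnr": 1 / precision if precision else float("inf"),  # number needed to read
    }

stats = filter_performance(retrieved={1, 2, 3, 4}, relevant={2, 3, 5},
                           total_records=100)
```

    Tabulating these metrics per database and platform, as the protocol proposes, is what makes filters comparable across studies.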

  10. Web-Based Virtual Patients in Nursing Education: Development and Validation of Theory-Anchored Design and Activity Models

    PubMed Central

    2014-01-01

    Background Research has shown that nursing students find it difficult to translate and apply their theoretical knowledge in a clinical context. Virtual patients (VPs) have been proposed as a learning activity that can support nursing students in their learning of scientific knowledge and help them integrate theory and practice. Although VPs are increasingly used in health care education, they still lack a systematic consistency that would allow their reuse outside of their original context. There is therefore a need to develop a model for the development and implementation of VPs in nursing education. Objective The aim of this study was to develop and evaluate a virtual patient model optimized to the learning and assessment needs in nursing education. Methods The process of modeling started by reviewing theoretical frameworks reported in the literature and used by practitioners when designing learning and assessment activities. The Outcome-Present State Test (OPT) model was chosen as the theoretical framework. The model was then, in an iterative manner, developed and optimized to the affordances of virtual patients. Content validation was performed with faculty, in terms of both the relevance of the chosen theories and their applicability in nursing education. The virtual patient nursing model was then instantiated in two VPs. The students’ perceived usefulness of the VPs was investigated using a questionnaire. The results were analyzed using descriptive statistics. Results A virtual patient Nursing Design Model (vpNDM) composed of three layers was developed. Layer 1 contains the patient story and ways of interacting with the data, Layer 2 includes aspects of the iterative process of clinical reasoning, and finally Layer 3 includes measurable outcomes. A virtual patient Nursing Activity Model (vpNAM) was also developed as a guide when creating VP-centric learning activities. 
The students perceived the global linear VPs as a relevant learning activity for the integration of theory and practice. Conclusions Virtual patients that are adapted to the nursing paradigm can support nursing students’ development of clinical reasoning skills. The proposed virtual patient nursing design and activity models will allow the systematic development of different types of virtual patients from a common model and thereby create opportunities for sharing pedagogical designs across technical solutions. PMID:24727709

  11. Fostering Child Development by Improving Care Quality: A Systematic Review of the Effectiveness of Structural Interventions and Caregiver Trainings in Institutional Care.

    PubMed

    Hermenau, Katharin; Goessmann, Katharina; Rygaard, Niels Peter; Landolt, Markus A; Hecker, Tobias

    2017-12-01

    Quality of child care has been shown to have a crucial impact on children's development and psychological adjustment, particularly for orphans with a history of maltreatment and trauma. However, adequate care for orphans is often impacted by unfavorable caregiver-child ratios and poorly trained, overburdened personnel, especially in institutional care in countries with limited resources and large numbers of orphans. This systematic review investigated the effects of structural interventions and caregiver trainings on child development in institutional environments. The 24 intervention studies included in this systematic review reported beneficial effects on the children's emotional, social, and cognitive development. Yet, few studies focused on effects of interventions on the child-caregiver relationship or the general institutional environment. Moreover, our review revealed that interventions aimed at improving institutional care settings have largely neglected violence and abuse prevention. Unfortunately, our findings are partially limited by constraints of study design and methodology. In sum, this systematic review sheds light on obstacles and possibilities for the improvement in institutional care. There must be greater efforts at preventing violence, abuse, and neglect of children living in institutional care. Therefore, we advocate for combining attachment theory-based models with maltreatment prevention approaches and then testing them using rigorous scientific standards. By using approaches grounded in the evidence, it could be possible to enable more children to grow up in supportive and nonviolent environments.

  12. Improved water resource management using three dimensional groundwater modelling for a highly complex environment

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Affolter, Annette; Radny, Dirk; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario

    2017-04-01

    Proper allocation and management of groundwater is an important and critical challenge under the rising water demands of various environmental sectors, but good groundwater quality is often limited because of urbanization and contamination of aquifers. Given the predictive capability of groundwater models, they are often the only viable means of providing input to water management decisions. However, modelling flow and transport processes can be difficult due to unknown subsurface heterogeneity and the typically unknown distribution of contaminants. As a result, water resource management tasks are based on uncertain assumptions about contaminant patterns, and this uncertainty is typically not incorporated into the assessment of risks associated with different proposed management scenarios. A three-dimensional groundwater model was used to improve water resource management for a study area where drinking water production is close to former landfills and industrial areas. To avoid drinking water contamination, artificial groundwater recharge with surface water into the gravel aquifer is used to create a hydraulic barrier between contaminated sites and drinking water extraction wells. The model was used to simulate existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction and magnitude between existing observation points was carried out for a large number of scenarios using a newly developed three-point estimation method. From the numerous observation points, 32 triangles (three-point sets) were created, covering the entire area around the Hardwald. We demonstrated that systematically applying our methodology helps to identify important locations that are sensitive to changing boundary conditions and where additional protection is required, without computationally demanding transport modelling. 
    The presented integrated approach using the flow direction between observation points can easily be transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
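    The abstract describes its three-point estimation method only at a high level; the classic three-point problem it builds on can be sketched as a plane fit through three observation wells (the coordinates and heads below are hypothetical, not values from the Hardwald study):

```python
import numpy as np

def three_point_gradient(points):
    """Estimate groundwater flow direction and gradient magnitude from
    three observation wells (the classic three-point problem).

    points: [(x, y, head), ...] for exactly three wells.
    Fits the plane h = a*x + b*y + c; flow runs down-gradient,
    i.e. along -(a, b).
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(3)])
    a, b, _ = np.linalg.solve(A, pts[:, 2])
    magnitude = np.hypot(a, b)                  # hydraulic gradient [-]
    direction = np.degrees(np.arctan2(-b, -a))  # flow azimuth (math convention)
    return magnitude, direction

# Heads in m a.s.l. at three wells forming one triangle (hypothetical)
mag, azi = three_point_gradient([(0, 0, 100.0),
                                 (1000, 0, 99.0),
                                 (0, 1000, 99.5)])
```

    Applying this to each of the 32 triangles per scenario yields the flow-direction changes the study uses to flag sensitive locations, without running a transport model.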

  13. Astrostatistics in X-ray Astronomy: Systematics and Calibration

    NASA Astrophysics Data System (ADS)

    Siemiginowska, Aneta; Kashyap, Vinay; CHASC

    2014-01-01

    Astrostatistics has been emerging as a new field in X-ray and gamma-ray astronomy, driven by the analysis challenges arising from data collected by high performance missions since the beginning of this century. The development and implementation of new analysis methods and techniques requires a close collaboration between astronomers and statisticians, and requires support from a reliable and continuous funding source. The NASA AISR program was one such source and played a crucial part in our work. Our group (CHASC; http://heawww.harvard.edu/AstroStat/), composed of a mixture of high energy astrophysicists and statisticians, was formed ~15 years ago to address specific issues related to Chandra X-ray Observatory data (Siemiginowska et al. 1997) and was initially fully supported by Chandra. We have developed several statistical methods that have laid the foundation for extensive application of Bayesian methodologies to Poisson data in high-energy astrophysics. I will describe one such project, on dealing with systematic uncertainties (Lee et al. 2011, ApJ), and present the implementation of the method in Sherpa, the CIAO modeling and fitting application. This algorithm propagates systematic uncertainties in instrumental responses (e.g., ARFs) through the Sherpa spectral modeling chain to obtain realistic error bars on model parameters when the data quality is high. Recent developments include the ability to narrow the space of allowed calibrations and obtain better parameter estimates as well as tighter error bars. Acknowledgements: This research is funded in part by NASA contract NAS8-03060. References: Lee, H., Kashyap, V.L., van Dyk, D.A., et al. 2011, ApJ, 731, 126 Siemiginowska, A., Elvis, M., Connors, A., et al. 1997, Statistical Challenges in Modern Astronomy II, 241
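    The actual algorithm is implemented in Sherpa; purely to illustrate the underlying idea (this is a toy model, not the Sherpa API or the Lee et al. method itself), a Monte Carlo that refits a trivial spectrum against an ensemble of perturbed effective-area curves might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy spectrum: counts = norm * effective_area, with Poisson noise.
true_norm = 5.0
nominal_arf = np.full(64, 100.0)          # flat effective area (cm^2), toy
counts = rng.poisson(true_norm * nominal_arf)

def fit_norm(arf):
    """Maximum-likelihood normalization for this toy model (mean ratio)."""
    return counts.sum() / arf.sum()

# Statistical-only fit uses the nominal calibration product...
norm_stat = fit_norm(nominal_arf)

# ...while the systematic term comes from refitting against an ensemble of
# plausible ARF realizations (here: +/-3% multiplicative calibration wobble,
# an invented figure for illustration).
norms = [fit_norm(nominal_arf * rng.normal(1.0, 0.03)) for _ in range(1000)]
norm_mean, norm_sys = np.mean(norms), np.std(norms)
```

    The spread of best-fit values across the calibration ensemble is what widens the quoted error bars beyond the purely statistical uncertainty.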

  14. Efficacy of a Systematic Process for Developing Function-Based Treatment for Young Children with Disabilities

    ERIC Educational Resources Information Center

    Aldosari, Mubarak S.

    2016-01-01

    This study conducted an in-depth analysis of the efficacy of the Decision Model in the development of function-based treatments for disruptive behaviors in four toddlers with disabilities aged from 26 to 34 months in inclusive toddler classrooms. The research was conducted in three parts. In Part 1, a functional behavioral assessment was conducted…

  15. Born in Zanzibar, Computerized in Provo, Utah: A Systematic Instructional Design Approach for Swahili CALL

    ERIC Educational Resources Information Center

    Bush, Michael D.

    2010-01-01

    The development of online learning materials is a complex and expensive process that can benefit from the application of consistent and organized principles of instructional design. This article discusses the development at Brigham Young University of the online portion of a one-semester course in Swahili using the ADDIE Model (Analysis, Design,…

  16. Effects of Self-Regulated Strategy Development on the Writing Skills of School-Age Children with Attention-Deficit/Hyperactivity Disorder. EBP Briefs. Volume 12, Issue 4

    ERIC Educational Resources Information Center

    Roitsch, Jane; Murphy, Kimberly; Michalek, Anne M. P.

    2017-01-01

    Clinical Question: Does the self-regulated strategy development (SRSD) intervention model improve the writing skills of school-age children with attention-deficit/hyperactivity disorder (ADHD)? Method: Systematic Review. Study Sources: ASHA, ASHAWire, Google Scholar, Academic Search Complete, Education Full Text, Education Research Complete,…

  17. A Model for Facilitating Curriculum Development in Higher Education: A Faculty-Driven, Data-Informed, and Educational Developer-Supported Approach

    ERIC Educational Resources Information Center

    Wolf, Peter

    2007-01-01

    In the fall of 2003, Teaching Support Services (TSS), a department at the University of Guelph, was approached by a faculty member in the department of food sciences. Professor Art Hill was interested in seeking support in systematically assessing the department's undergraduate curriculum and using that assessment to trigger further improvement of…

  18. Sport Education as a Pedagogical Application for Ethical Development in Physical Education and Youth Sport

    ERIC Educational Resources Information Center

    Harvey, Stephen; Kirk, David; O'Donovan, Toni M.

    2014-01-01

    The purpose of this paper is to consider four pedagogical applications within the Sport Education model to examine the ways in which a young person can become a literate sports person and develop ethical behaviour through engagement in physical education and youth sport. Through a systematic review of the Sport Education research literature we…

  19. An Investigation of University Student and K-12 Teacher Reasoning about Key Ideas in the Development of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Robertson, Amy D.

    2011-01-01

    This dissertation describes a systematic investigation of university student and K-12 teacher reasoning about key ideas relevant to the development of a particulate model for matter. Written assessments and individual demonstration interviews have been used to study the reasoning of introductory and sophomore-level physics students, introductory…

  20. Twelve tips for applying change models to curriculum design, development and delivery.

    PubMed

    McKimm, Judy; Jones, Paul Kneath

    2017-10-25

    Drawing primarily from business and management literature and the authors' experience, these 12 tips provide guidance to organizations, teams, and individuals involved in curriculum or program development at undergraduate, postgraduate, and continuing education levels. The tips are based around change models and approaches and can help underpin successful curriculum review, development, and delivery, as well as fostering appropriate educational innovation. A range of tools exist to support systematic program development and review, but even relatively simple changes need to take account of many factors, including the complexity of the environment, stakeholder engagement, cultural and psychological aspects, and the importance of followers.

  1. The Need for Systematic Reviews of Reasons

    PubMed Central

    Sofaer, Neema; Strech, Daniel

    2012-01-01

    There are many ethical decisions in the practice of health research and care, and in the creation of policy and guidelines. We argue that those charged with making such decisions need a new genre of review. The new genre is an application of the systematic review, which was developed over decades to inform medical decision-makers about what the totality of studies that investigate links between smoking and cancer, for example, implies about whether smoking causes cancer. We argue that there is a need for similarly inclusive and rigorous reviews of reason-based bioethics, which uses reasoning to address ethical questions. After presenting a brief history of the systematic review, we reject the only existing model for writing a systematic review of reason-based bioethics, which holds that such a review should address an ethical question. We argue that such a systematic review may mislead decision-makers when a literature is incomplete, or when there are mutually incompatible but individually reasonable answers to the ethical question. Furthermore, such a review can be written without identifying all the reasons given when the ethical questions are discussed, their alleged implications for the ethical question, and the attitudes taken to the reasons. The reviews we propose address instead the empirical question of which reasons have been given when addressing a specified ethical question, and present such detailed information on the reasons. We argue that this information is likely to improve decision-making, both directly and indirectly, and also the academic literature. We explain the limitations of our alternative model for systematic reviews. PMID:21521251

  2. Process-level improvements in CMIP5 models and their impact on tropical variability, the Southern Ocean, and monsoons

    NASA Astrophysics Data System (ADS)

    Lauer, Axel; Jones, Colin; Eyring, Veronika; Evaldsson, Martin; Hagemann, Stefan; Mäkelä, Jarmo; Martin, Gill; Roehrig, Romain; Wang, Shiyu

    2018-01-01

    The performance of updated versions of the four earth system models (ESMs) CNRM, EC-Earth, HadGEM, and MPI-ESM is assessed in comparison to their predecessor versions used in Phase 5 of the Coupled Model Intercomparison Project. The Earth System Model Evaluation Tool (ESMValTool) is applied to evaluate selected climate phenomena in the models against observations. This is the first systematic application of the ESMValTool to assess and document the progress made during an extensive model development and improvement project. This study focuses on the South Asian monsoon (SAM) and the West African monsoon (WAM), the coupled equatorial climate, and Southern Ocean clouds and radiation, which are known to exhibit systematic biases in present-day ESMs. The analysis shows that the tropical precipitation in three out of four models is clearly improved. Two of three updated coupled models show an improved representation of tropical sea surface temperatures with one coupled model not exhibiting a double Intertropical Convergence Zone (ITCZ). Simulated cloud amounts and cloud-radiation interactions are improved over the Southern Ocean. Improvements are also seen in the simulation of the SAM and WAM, although systematic biases remain in regional details and the timing of monsoon rainfall. Analysis of simulations with EC-Earth at different horizontal resolutions from T159 up to T1279 shows that the synoptic-scale variability in precipitation over the SAM and WAM regions improves with higher model resolution. The results suggest that the reasonably good agreement of modeled and observed mean WAM and SAM rainfall in lower-resolution models may be a result of unrealistic intensity distributions.
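    Quantitative model-versus-observation comparison of the kind ESMValTool automates typically rests on area-weighted metrics. A minimal latitude-weighted RMSE is sketched below (the cosine-of-latitude weighting is a standard convention for regular grids; the field values are invented):

```python
import numpy as np

def area_weighted_rmse(model, obs, lats):
    """Latitude-weighted RMSE between 2-D (lat, lon) fields on a regular
    grid; grid-cell area scales with cos(latitude)."""
    weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(model)
    return float(np.sqrt(np.average((model - obs) ** 2, weights=weights)))

# Hypothetical 3 x 4 lat-lon fields: model is uniformly 1 K too warm.
lats = np.array([-30.0, 0.0, 30.0])
model_field = np.full((3, 4), 288.0)
obs_field = np.full((3, 4), 287.0)
rmse = area_weighted_rmse(model_field, obs_field, lats)
```

    Computing the same metric for predecessor and updated model versions is what allows the kind of systematic progress assessment described above.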

  3. A Systematic Error Correction Method for TOVS Radiances

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Rokke, Laurie; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Treatment of systematic errors is crucial for the successful use of satellite data in a data assimilation system. Systematic errors in TOVS radiance measurements and radiative transfer calculations can be as large or larger than random instrument errors. The usual assumption in data assimilation is that observational errors are unbiased. If biases are not effectively removed prior to assimilation, the impact of satellite data will be lessened and can even be detrimental. Treatment of systematic errors is important for short-term forecast skill as well as the creation of climate data sets. A systematic error correction algorithm has been developed as part of a 1D radiance assimilation. This scheme corrects for spectroscopic errors, errors in the instrument response function, and other biases in the forward radiance calculation for TOVS. Such algorithms are often referred to as tuning of the radiances. The scheme is able to account for the complex, air-mass dependent biases that are seen in the differences between TOVS radiance observations and forward model calculations. We will show results of systematic error correction applied to the NOAA 15 Advanced TOVS as well as its predecessors. We will also discuss the ramifications of inter-instrument bias with a focus on stratospheric measurements.
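    Air-mass-dependent bias ("tuning") schemes of this kind are commonly fit by regressing observed-minus-computed radiance departures on a small set of predictors. A sketch with synthetic data follows; the specific predictor (a constant plus a layer thickness) and the coefficients are illustrative assumptions, not details of the TOVS algorithm:

```python
import numpy as np

def fit_bias_correction(predictors, omb):
    """Least-squares bias coefficients beta so that omb ~= predictors @ beta.

    predictors: (n_obs, n_pred) air-mass predictors, e.g. a constant term
                and layer thicknesses (hypothetical choices).
    omb:        observed-minus-background radiance departures.
    """
    beta, *_ = np.linalg.lstsq(predictors, omb, rcond=None)
    return beta

def correct(radiance, predictors, beta):
    """Remove the predicted systematic component before assimilation."""
    return radiance - predictors @ beta

# Synthetic departures with a known air-mass-dependent bias:
# bias = 0.5 + 0.02 * thickness, plus random instrument noise.
rng = np.random.default_rng(0)
thickness = rng.uniform(100.0, 200.0, size=500)
X = np.column_stack([np.ones(500), thickness])
omb = 0.5 + 0.02 * thickness + rng.normal(0.0, 0.05, 500)
beta = fit_bias_correction(X, omb)
```

    Removing the fitted component leaves (ideally) unbiased departures, matching the assimilation system's assumption of unbiased observational errors.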

  4. Nature versus nurture: A systematic approach to elucidate gene-environment interactions in the development of myopic refractive errors.

    PubMed

    Miraldi Utz, Virginia

    2017-01-01

    Myopia is the most common eye disorder and major cause of visual impairment worldwide. As the incidence of myopia continues to rise, the need to further understand the complex roles of molecular and environmental factors controlling variation in refractive error is of increasing importance. Tkatchenko and colleagues applied a systematic approach using a combination of gene set enrichment analysis, genome-wide association studies, and functional analysis of a murine model to identify a myopia susceptibility gene, APLP2. Differential expression of refractive error was associated with time spent reading for those with low frequency variants in this gene. This provides support for the longstanding hypothesis of gene-environment interactions in refractive error development.

  5. A systematic review of health economic models and utility estimation methods in schizophrenia.

    PubMed

    Németh, Bertalan; Fasseeh, Ahmad; Molnár, Anett; Bitter, István; Horváth, Margit; Kóczián, Kristóf; Götze, Árpád; Nagy, Balázs

    2018-06-01

    There is a growing need for economic evaluations describing the disease course, as well as the costs and clinical outcomes related to the treatment of schizophrenia. Areas covered: A systematic review on studies describing health economic models in schizophrenia and a targeted literature review on utility mapping algorithms in schizophrenia were carried out. Models found in the review were collated and assessed in detail according to their type and various other attributes. Fifty-nine studies were included in the review. Modeling techniques varied from simple decision trees to complex simulation models. The models used various clinical endpoints as value drivers: 47% of the models used quality-adjusted life years, and 8% used disability-adjusted life years to measure benefits, while others applied various clinical outcomes. Most models considered patients switching between therapies, and therapeutic adherence, compliance or persistence. The targeted literature review identified four main approaches to map PANSS scores to utility values. Expert commentary: Health economic models developed for schizophrenia showed great variability, with simulation models becoming more frequently used in the last decade. Using PANSS scores as the basis of utility estimations is justifiable.

  6. [Relationship between water supply, sanitation, public health, and environment: elements for the formulation of a sanitary infrastructure planning model].

    PubMed

    Soares, Sérgio R A; Bernardes, Ricardo S; Netto, Oscar de M Cordeiro

    2002-01-01

    The understanding of sanitation infrastructure, public health, and environmental relations is a fundamental assumption for planning sanitation infrastructure in urban areas. This article thus suggests elements for developing a planning model for sanitation infrastructure. The authors performed a historical survey of environmental and public health issues related to the sector, an analysis of the conceptual frameworks involving public health and sanitation systems, and a systematization of the various effects that water supply and sanitation have on public health and the environment. Evaluation of these effects should guarantee the correct analysis of possible alternatives, deal with environmental and public health objectives (the main purpose of sanitation infrastructure), and provide the most reasonable indication of actions. The suggested systematization of the sanitation systems' effects in each step of their implementation is an advance considering the association between the fundamental elements for formulating a planning model for sanitation infrastructure.

  7. The Systematics of Strong Lens Modeling Quantified: The Effects of Constraint Selection and Redshift Information on Magnification, Mass, and Multiple Image Predictability

    NASA Astrophysics Data System (ADS)

    Johnson, Traci L.; Sharon, Keren

    2016-11-01

    Until now, systematic errors in strong gravitational lens modeling have been acknowledged but have never been fully quantified. Here, we launch an investigation into the systematics induced by constraint selection. We model the simulated cluster Ares 362 times using random selections of image systems with and without spectroscopic redshifts and quantify the systematics using several diagnostics: image predictability, accuracy of model-predicted redshifts, enclosed mass, and magnification. We find that for models with >15 image systems, the image plane rms does not decrease significantly when more systems are added; however, the rms values quoted in the literature may be misleading as to the ability of a model to predict new multiple images. The mass is well constrained near the Einstein radius in all cases, and systematic error drops to <2% for models using >10 image systems. Magnification errors are smallest along the straight portions of the critical curve, and the value of the magnification is systematically lower near curved portions. For >15 systems, the systematic error on magnification is ∼2%. We report no trend in magnification error with the fraction of spectroscopic image systems when selecting constraints at random; however, when using the same selection of constraints, increasing this fraction up to ∼0.5 will increase model accuracy. The results suggest that the selection of constraints, rather than quantity alone, determines the accuracy of the magnification. We note that spectroscopic follow-up of at least a few image systems is crucial because models without any spectroscopic redshifts are inaccurate across all of our diagnostics.
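    Two of the diagnostics used here, the image-plane rms and the magnification scatter across model realizations, are straightforward to compute once an ensemble of models fit to different constraint selections exists. A minimal sketch with invented positions and magnifications:

```python
import numpy as np

def image_plane_rms(observed, predicted):
    """Root-mean-square offset between observed and model-predicted
    multiple-image positions (same angular units as the inputs)."""
    d = np.asarray(observed, float) - np.asarray(predicted, float)
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

def magnification_systematic(mags):
    """Fractional systematic scatter of one magnification estimate across
    an ensemble of models fit to different constraint selections."""
    mags = np.asarray(mags, float)
    return float(np.std(mags) / np.mean(mags))

# Hypothetical values: two image positions and five model realizations
rms = image_plane_rms([(1.0, 2.0), (3.0, 4.0)],
                      [(1.1, 2.0), (3.0, 3.9)])
scatter = magnification_systematic([10.2, 9.8, 10.1, 10.4, 9.9])
```

    The study's central point is visible even in this sketch: the ensemble scatter, not the single-model rms, is what quantifies the systematic error on magnification.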

  8. Development of advanced grid stiffened (AGS) fiber reinforced polymer (FRP) tube-encased concrete columns.

    DOT National Transportation Integrated Search

    2013-03-01

    In this project, a new type of confining device, a latticework of interlacing fiber reinforced polymer (FRP) ribs that are jacketed by a FRP skin, is proposed, manufactured, tested, and modeled to encase concrete cylinders. This systematic study incl...

  9. Response to the Lazarus and Beutler Article "On Technical Eclecticism."

    ERIC Educational Resources Information Center

    Gilliland, Burl E.; And Others

    1994-01-01

    Responds to the 1993 Lazarus and Beutler article "On Technical Eclecticism." States that eclectic counseling is not a nondescript, disorganized, nonsystematic modality. Suggests that counselor eclecticism is a valuable skill that must be developed to be utilized appropriately. Outlines a six-step systematic, eclectic intervention model. (CRR)

  10. 75 FR 12753 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... effective at improving health care quality. While evidence-based approaches for decisionmaking have become standard in healthcare, this has been limited in laboratory medicine. No single- evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...

  11. Instructional Programming. (SCAT Project, Title VI-G).

    ERIC Educational Resources Information Center

    Shoemaker, Sue

    Developed by the SCAT (Support, Competency-Assistance and Training) Project staff, the document deals with the fourth step, instructional programming, of a systematic instruction model for use with exceptional children. Purposes of the paper are noted to include providing guidelines for establishing and implementing individualized instructional…

  12. Hidden asymmetry and forward-backward correlations

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Zalewski, K.

    2010-09-01

    A model-independent method of studying the forward-backward correlations in symmetric high-energy processes is developed. The method allows a systematic study of the properties of various particle sources and allows one to uncover asymmetric structures hidden in symmetric hadron-hadron and nucleus-nucleus inelastic reactions.

  13. Systematic development of reduced reaction mechanisms for dynamic modeling

    NASA Technical Reports Server (NTRS)

    Frenklach, M.; Kailasanath, K.; Oran, E. S.

    1986-01-01

    A method for systematically developing a reduced chemical reaction mechanism for dynamic modeling of chemically reactive flows is presented. The method is based on the postulate that if a reduced reaction mechanism faithfully describes the time evolution of both thermal and chain reaction processes characteristic of a more complete mechanism, then the reduced mechanism will describe the chemical processes in a chemically reacting flow with approximately the same degree of accuracy. Here this postulate is tested by producing a series of mechanisms of reduced accuracy, which are derived from a full detailed mechanism for methane-oxygen combustion. These mechanisms were then tested in a series of reactive flow calculations in which a large-amplitude sinusoidal perturbation is applied to a system that is initially quiescent and whose temperature is high enough to start ignition processes. Comparison of the results for systems with and without convective flow show that this approach produces reduced mechanisms that are useful for calculations of explosions and detonations. Extensions and applicability to flames are discussed.

  14. Health literacy in childhood and youth: a systematic review of definitions and models.

    PubMed

    Bröder, Janine; Okan, Orkan; Bauer, Ullrich; Bruland, Dirk; Schlupp, Sandra; Bollweg, Torsten M; Saboga-Nunes, Luis; Bond, Emma; Sørensen, Kristine; Bitzer, Eva-Maria; Jordan, Susanne; Domanska, Olga; Firnges, Christiane; Carvalho, Graça S; Bittlingmayer, Uwe H; Levin-Zamir, Diane; Pelikan, Jürgen; Sahrai, Diana; Lenz, Albert; Wahl, Patricia; Thomas, Malcolm; Kessl, Fabian; Pinheiro, Paulo

    2017-04-26

    Children and young people constitute a core target group for health literacy research and practice: during childhood and youth, fundamental cognitive, physical and emotional development processes take place and health-related behaviours and skills develop. However, there is limited knowledge and academic consensus regarding the abilities and knowledge a child or young person should possess for making sound health decisions. The research presented in this review addresses this gap by providing an overview and synthesis of current understandings of health literacy in childhood and youth. Furthermore, the authors aim to understand to what extent available models capture the unique needs and characteristics of children and young people. Six databases were systematically searched with relevant search terms in English and German. Of the n = 1492 publications identified, n = 1021 entered the abstract screening and n = 340 full texts were screened for eligibility. A total of 30 articles, which defined or conceptualized generic health literacy for a target population of 18 years or younger, were selected for a four-step inductive content analysis. The systematic review of the literature identified 12 definitions and 21 models that have been specifically developed for children and young people. In the literature, health literacy in children and young people is described as comprising variable sets of key dimensions, each appearing as a cluster of related abilities, skills, commitments, and knowledge that enable a person to approach health information competently and effectively and to arrive at health-promoting decisions and actions. The identified definitions and models are very heterogeneous, depicting health literacy as a multidimensional, complex construct. Moreover, health literacy is conceptualized as an action competence, with a strong focus on personal attributes, while also recognising its interrelatedness with social and contextual determinants.
Life phase specificities are mainly considered from a cognitive and developmental perspective, leaving children's and young people's specific needs, vulnerabilities, and social structures poorly incorporated within most models. While a critical number of definitions and models were identified for youth or secondary school students, similar findings are lacking for children under the age of ten or within a primary school context.

  15. Integrating science and business models of sustainability for environmentally-challenging industries such as secondary lead smelters: a systematic review and analysis of findings.

    PubMed

    Genaidy, A M; Sequeira, R; Tolaymat, T; Kohler, J; Wallace, S; Rinder, M

    2010-09-01

    Secondary lead smelters (SLS) represent an environmentally-challenging industry, as they deal with toxic substances posing potential threats to both human and environmental health; consequently, they operate under strict government regulations. Such challenges have resulted in a significant reduction of SLS plants in the last three decades. In addition, the domestic recycling of lead has been on a steep decline in the past 10 years, as the amount of lead recovered has remained virtually unchanged while consumption has increased. Therefore, one may wonder whether sustainable development can be achieved among SLS. The primary objective of this study was to determine whether a roadmap for sustainable development can be established for SLS. The following aims were established in support of the study objective: (1) to conduct a systematic review and analysis of models of sustainable systems with a particular emphasis on SLS; (2) to document the challenges for the U.S. secondary lead smelting industry; and (3) to explore practices and concepts which act as vehicles for SLS on the road to sustainable development. An evidence-based methodology was adopted to achieve the study objective. A comprehensive electronic search was conducted to implement the aforementioned specific aims. Inclusion criteria were established to filter out irrelevant scientific papers and reports. The relevant articles were closely scrutinized and appraised to extract the information and data required for the possible development of a sustainable roadmap. The search process yielded a number of research articles which were utilized in the systematic review. Two types of models emerged: management/business models and science/mathematical models. While the management/business models explored actions to achieve sustainable growth in the industrial enterprise, the science/mathematical models attempted to explain sustainable behaviors and properties, aiming predominantly at ecosystem management. 
As such, there are major disconnects between the science/mathematical and management/business models in terms of aims and goals. Therefore, there is an urgent need to integrate science and business models of sustainability for industrial enterprises at large and environmentally-challenging industrial sectors in particular. In this paper, we offer examples of practices and concepts which can be used in charting a path towards sustainable development for secondary lead smelters, particularly given that the waste generated outside the industrial enterprise is much greater than that generated inside. An environmentally-challenging industry such as secondary lead smelting requires a fresh look to chart a path towards sustainable development (i.e., survivability and purposive needs) for all stakeholders (i.e., the industrial enterprise, individual stakeholders, and social/ecological systems). Such a path should deal with issues beyond pollution prevention, product stewardship and clean technologies.

  16. External validation of type 2 diabetes computer simulation models: definitions, approaches, implications and room for improvement-a protocol for a systematic review.

    PubMed

    Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea

    2017-12-29

    Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument for assessing and controlling the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objective of this systematic review is to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of the quality of these models with regard to external validation, based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of simulation models of T2DM incidence and progression in humans. PROSPERO CRD42017069983.

  17. Publishing web-based guidelines using interactive decision models.

    PubMed

    Sanders, G D; Nease, R F; Owens, D K

    2001-05-01

    Commonly used methods for guideline development and dissemination do not enable developers to tailor guidelines systematically to specific patient populations and update guidelines easily. We developed a web-based system, ALCHEMIST, that uses decision models and automatically creates evidence-based guidelines that can be disseminated, tailored and updated over the web. Our objective was to demonstrate the use of this system with clinical scenarios that provide challenges for guideline development. We used the ALCHEMIST system to develop guidelines for three clinical scenarios: (1) Chlamydia screening for adolescent women, (2) antiarrhythmic therapy for the prevention of sudden cardiac death; and (3) genetic testing for the BRCA breast-cancer mutation. ALCHEMIST uses information extracted directly from the decision model, combined with the additional information from the author of the decision model, to generate global guidelines. ALCHEMIST generated electronic web-based guidelines for each of the three scenarios. Using ALCHEMIST, we demonstrate that tailoring a guideline for a population at high-risk for Chlamydia changes the recommended policy for control of Chlamydia from contact tracing of reported cases to a population-based screening programme. We used ALCHEMIST to incorporate new evidence about the effectiveness of implantable cardioverter defibrillators (ICD) and demonstrate that the cost-effectiveness of use of ICDs improves from $74 400 per quality-adjusted life year (QALY) gained to $34 500 per QALY gained. Finally, we demonstrate how a clinician could use ALCHEMIST to incorporate a woman's utilities for relevant health states and thereby develop patient-specific recommendations for BRCA testing; the patient-specific recommendation improved quality-adjusted life expectancy by 37 days. The ALCHEMIST system enables guideline developers to publish both a guideline and an interactive decision model on the web. 
This web-based tool enables guideline developers to tailor guidelines systematically, to update guidelines easily, and to make the underlying evidence and analysis transparent for users.

  18. Systematic Review of Economic Models Used to Compare Techniques for Detecting Peripheral Arterial Disease.

    PubMed

    Moloney, Eoin; O'Connor, Joanne; Craig, Dawn; Robalino, Shannon; Chrysos, Alexandros; Javanbakht, Mehdi; Sims, Andrew; Stansby, Gerard; Wilkes, Scott; Allen, John

    2018-04-23

    Peripheral arterial disease (PAD) is a common condition, in which atherosclerotic narrowing in the arteries restricts blood supply to the leg muscles. In order to support future model-based economic evaluations comparing methods of diagnosis in this area, a systematic review of economic modelling studies was conducted. A systematic literature review was performed in June 2017 to identify model-based economic evaluations of diagnostic tests to detect PAD, with six individual databases searched. The review was conducted in accordance with the methods outlined in the Centre for Reviews and Dissemination's guidance for undertaking reviews in healthcare, and appropriate inclusion criteria were applied. Relevant data were extracted, and studies were quality assessed. Seven studies were included in the final review, all of which were published between 1995 and 2014. There was wide variation in the types of diagnostic test compared. The majority of the studies (six of seven) referenced the sources used to develop their model, and all studies stated and justified the structural assumptions. Reporting of the data within the included studies could have been improved. Only one identified study focused on the cost-effectiveness of a test typically used in primary care. This review brings together all applied modelling methods for tests used in the diagnosis of PAD, which could be used to support future model-based economic evaluations in this field. The limited modelling work available on tests typically used for the detection of PAD in primary care, in particular, highlights the importance of future work in this area.

  19. A Psychological Model for Aggregating Judgments of Magnitude

    NASA Astrophysics Data System (ADS)

    Merkle, Edgar C.; Steyvers, Mark

    In this paper, we develop and illustrate a psychologically-motivated model for aggregating judgments of magnitude across experts. The model assumes that experts' judgments are perturbed from the truth by both systematic biases and random error, and it provides aggregated estimates that are implicitly based on the application of nonlinear weights to individual judgments. The model is also easily extended to situations where experts report multiple quantile judgments. We apply the model to expert judgments concerning flange leaks in a chemical plant, illustrating its use and comparing it to baseline measures.
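
    The model's core assumption — each expert's judgment equals the truth plus a systematic bias and random error — can be illustrated with a minimal simulation. This is a hedged sketch with made-up parameter values, not the authors' actual model (which derives nonlinear weights and handles quantile judgments):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    truth = 100.0                        # the unknown quantity being judged
    bias = rng.normal(0, 10, size=20)    # each expert's persistent systematic bias
    noise = rng.normal(0, 5, size=20)    # random error in a single judgment
    judgments = truth + bias + noise     # one judgment per expert

    # A robust aggregate such as the median implicitly down-weights extreme
    # judgments, loosely analogous to the nonlinear weighting the model induces.
    estimate = float(np.median(judgments))
    ```

    A richer treatment would estimate each expert's bias from repeated judgments of known quantities and subtract it before aggregating.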

  20. Biofidelic Human Activity Modeling and Simulation with Large Variability

    DTIC Science & Technology

    2014-11-25

    A systematic approach was developed for biofidelic human activity modeling and simulation by using body scan data and motion capture data to...replicate a human activity in 3D space. Since technologies for simultaneously capturing human motion and dynamic shapes are not yet ready for practical use, a...that can replicate a human activity in 3D space with the true shape and true motion of a human. Using this approach, a model library was built to

  1. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.

  2. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358

  3. Validation of Community Models: 2. Development of a Baseline, Using the Wang-Sheeley-Arge Model

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter

    2009-01-01

    This paper is the second in a series providing independent validation of community models of the outer corona and inner heliosphere. Here I present a comprehensive validation of the Wang-Sheeley-Arge (WSA) model. These results will serve as a baseline against which to compare the next generation of comparable forecasting models. The WSA model is used by a number of agencies to predict Solar wind conditions at Earth up to 4 days into the future. Given its importance to both the research and forecasting communities, it is essential that its performance be measured systematically and independently. I offer just such an independent and systematic validation. I report skill scores for the model's predictions of wind speed and interplanetary magnetic field (IMF) polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests synoptic line of sight magnetograms. For this study I generated model results for monthly magnetograms from multiple observatories, spanning the Carrington rotation range from 1650 to 2074. I compare the influence of the different magnetogram sources and performance at quiet and active times. I also consider the ability of the WSA model to forecast both sharp transitions in wind speed from slow to fast wind and reversals in the polarity of the radial component of the IMF. These results will serve as a baseline against which to compare future versions of the model as well as the current and future generation of magnetohydrodynamic models under development for forecasting use.

  4. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. To guide the formulation of newer models so that they can be accepted by practitioners, there is a need for clear discrimination of the existing models based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers published between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models, we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community: it helps quality assessment model developers in formulating newer models and practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  5. Economic evaluation of medical tests at the early phases of development: a systematic review of empirical studies.

    PubMed

    Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham

    2018-02-01

    There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas covered: A systematic review was conducted to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) analysis to inform the value of future research are often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances, but at the early development stage rather than the concept stage. Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is a need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.

  6. HZETRN radiation transport validation using balloon-based experimental data

    NASA Astrophysics Data System (ADS)

    Warner, James E.; Norman, Ryan B.; Blattnig, Steve R.

    2018-05-01

    The deterministic radiation transport code HZETRN (High charge (Z) and Energy TRaNsport) was developed by NASA to study the effects of cosmic radiation on astronauts and instrumentation shielded by various materials. This work presents an analysis of computed differential flux from HZETRN compared with measurement data from three balloon-based experiments over a range of atmospheric depths, particle types, and energies. Model uncertainties were quantified using an interval-based validation metric that takes into account measurement uncertainty both in the flux and the energy at which it was measured. Average uncertainty metrics were computed for the entire dataset as well as subsets of the measurements (by experiment, particle type, energy, etc.) to reveal any specific trends of systematic over- or under-prediction by HZETRN. The distribution of individual model uncertainties was also investigated to study the range and dispersion of errors beyond just single scalar and interval metrics. The differential fluxes from HZETRN were generally well-correlated with balloon-based measurements; the median relative model difference across the entire dataset was determined to be 30%. The distribution of model uncertainties, however, revealed that the range of errors was relatively broad, with approximately 30% of the uncertainties exceeding ± 40%. The distribution also indicated that HZETRN systematically under-predicts the measurement dataset as a whole, with approximately 80% of the relative uncertainties having negative values. Instances of systematic bias for subsets of the data were also observed, including a significant underestimation of alpha particles and protons for energies below 2.5 GeV/u. Muons were found to be systematically over-predicted at atmospheric depths deeper than 50 g/cm2 but under-predicted for shallower depths. 
Furthermore, a systematic under-prediction of alpha particles and protons was observed below the geomagnetic cutoff, suggesting that improvements to the light ion production cross sections in HZETRN should be investigated.
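
    The headline diagnostics quoted above (median relative model difference, fraction of differences beyond ±40%, fraction negative, i.e. under-predicted) follow directly from paired model/measurement fluxes. A minimal sketch with toy values, not the balloon datasets or HZETRN's interval-based metric:

    ```python
    import numpy as np

    # Hypothetical paired differential fluxes at matching energy bins.
    model = np.array([0.8, 1.1, 0.5, 0.9, 1.3])
    measured = np.array([1.0, 1.0, 1.0, 1.0, 1.0])

    rel_diff = (model - measured) / measured      # negative => under-prediction
    median_abs = float(np.median(np.abs(rel_diff)))
    frac_large = float(np.mean(np.abs(rel_diff) > 0.4))  # share beyond +/-40%
    frac_under = float(np.mean(rel_diff < 0))            # share under-predicted
    ```

    The paper's interval-based metric additionally folds measurement uncertainty in both flux and energy into each comparison, rather than treating the measured values as exact.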

  7. Compositional characteristics of some Apollo 14 clastic materials.

    NASA Technical Reports Server (NTRS)

    Lindstrom, M. M.; Duncan, A. R.; Fruchter, J. S.; Mckay, S. M.; Stoeser, J. W.; Goles, G. G.; Lindstrom, D. J.

    1972-01-01

    Eighty-two subsamples of Apollo 14 materials have been analyzed by instrumental neutron activation analysis techniques for as many as 25 elements. In many cases, it was necessary to develop new procedures to allow analyses of small specimens. Compositional relationships among Apollo 14 materials indicate that there are small but systematic differences between regolith from the valley terrain and that from Cone Crater ejecta. Fragments from 1-2 mm size fractions of regolith samples may be divided into compositional classes, and the 'soil breccias' among them are very similar to valley soils. Multicomponent linear mixing models have been used as interpretive tools in dealing with data on regolith fractions and subsamples from breccia 14321. These mixing models show systematic compositional variations with inferred age for Apollo 14 clastic materials.
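
    A multicomponent linear mixing model treats each sample composition as a weighted sum of end-member compositions and solves for the mixing proportions. A sketch with two hypothetical end-members and three elements (the study used measured lunar component compositions and many more elements):

    ```python
    import numpy as np

    # Columns are hypothetical end-member compositions (element abundances);
    # rows are elements. 3 elements x 2 end-members.
    E = np.array([[10.0, 2.0],
                  [ 5.0, 8.0],
                  [ 1.0, 4.0]])
    sample = np.array([6.0, 6.5, 2.5])   # observed sample composition

    # Least-squares mixing proportions x such that E @ x ~= sample.
    x, residuals, rank, _ = np.linalg.lstsq(E, sample, rcond=None)
    ```

    In practice the proportions are usually constrained to be non-negative and to sum to one, and the fit residuals indicate whether the chosen end-members can actually account for the sample.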

  8. Prevention and assessment of infectious diseases among children and adult migrants arriving to the European Union/European Economic Association: a protocol for a suite of systematic reviews for public health and health systems

    PubMed Central

    Mayhew, Alain D; Morton, Rachael L; Greenaway, Christina; Akl, Elie A; Rahman, Prinon; Zenner, Dominik; Pareek, Manish; Tugwell, Peter; Welch, Vivian; Meerpohl, Joerg; Alonso-Coello, Pablo; Hui, Charles; Biggs, Beverley-Ann; Requena-Méndez, Ana; Agbata, Eric; Noori, Teymur; Schünemann, Holger J

    2017-01-01

    Introduction The European Centre for Disease Prevention and Control is developing evidence-based guidance for voluntary screening, treatment and vaccine prevention of infectious diseases for newly arriving migrants to the European Union/European Economic Area. The objective of this systematic review protocol is to guide the identification, appraisal and synthesis of the best available evidence on prevention and assessment of the following priority infectious diseases: tuberculosis, HIV, hepatitis B, hepatitis C, measles, mumps, rubella, diphtheria, tetanus, pertussis, poliomyelitis (polio), Haemophilus influenzae disease, strongyloidiasis and schistosomiasis. Methods and analysis The search strategy will identify evidence from existing systematic reviews and then update the effectiveness and cost-effectiveness evidence using prospective trials, economic evaluations and/or recently published systematic reviews. Interdisciplinary teams have designed logic models to help define study inclusion and exclusion criteria, guide the search strategy and identify relevant outcomes. We will assess the certainty of evidence using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. Ethics and dissemination There are no ethical or safety issues. We anticipate disseminating the findings through open-access publications, conference abstracts and presentations. We plan to publish technical syntheses as GRADEpro evidence summaries and the systematic reviews as part of a special edition open-access publication on refugee health. We are following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Protocols reporting guideline. This protocol is registered in PROSPERO: CRD42016045798. PMID:28893741

  9. MO-C-17A-03: A GPU-Based Method for Validating Deformable Image Registration in Head and Neck Radiotherapy Using Biomechanical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neylon, J; Min, Y; Qi, S

    2014-06-15

    Purpose: Deformable image registration (DIR) plays a pivotal role in head and neck adaptive radiotherapy, but a systematic validation of DIR algorithms has been limited by a lack of quantitative high-resolution ground truth. We address this limitation by developing a GPU-based framework that provides a systematic DIR validation by generating (a) model-guided synthetic CTs representing posture and physiological changes, and (b) model-guided landmark-based validation. Method: The GPU-based framework was developed to generate massive mass-spring biomechanical models from patient simulation CTs and contoured structures. The biomechanical model represented soft tissue deformations for known rigid skeletal motion. Posture changes were simulated by articulating skeletal anatomy, which subsequently applied elastic corrective forces upon the soft tissue. Physiological changes such as tumor regression and weight loss were simulated in a biomechanically precise manner. Synthetic CT data was then generated from the deformed anatomy. The initial and final positions for one hundred randomly-chosen mass elements inside each of the internal contoured structures were recorded as ground truth data. The process was automated to create 45 synthetic CT datasets for a given patient CT. For instance, the head rotation was varied between +/− 4 degrees along each axis, and tumor volumes were systematically reduced up to 30%. Finally, the original CT and deformed synthetic CT were registered using an optical flow based DIR. Results: Each synthetic data creation took approximately 28 seconds of computation time. The number of landmarks per data set varied between two and three thousand. The validation method is able to perform sub-voxel analysis of the DIR, and report the results by structure, giving a much more in-depth investigation of the error. Conclusions: We presented a GPU-based high-resolution biomechanical head and neck model to validate DIR algorithms by generating CT-equivalent 3D volumes with simulated posture changes and physiological regression.
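    The mass-spring update underlying such a biomechanical model can be sketched on the CPU with NumPy; all names, array shapes, and the stiffness value are illustrative, not the authors' GPU implementation.

```python
import numpy as np

def spring_forces(pos, springs, rest_len, k):
    """Hooke's-law forces for a mass-spring mesh (CPU sketch of the
    per-element update a GPU kernel would parallelize).

    pos      : (N, 3) node positions
    springs  : (M, 2) index pairs of connected nodes
    rest_len : (M,) spring rest lengths
    k        : spring stiffness (illustrative scalar)
    """
    i, j = springs[:, 0], springs[:, 1]
    d = pos[j] - pos[i]                           # spring vectors
    length = np.linalg.norm(d, axis=1)
    stretch = (length - rest_len) / np.maximum(length, 1e-12)
    f = k * stretch[:, None] * d                  # force on node i, toward j
    forces = np.zeros_like(pos)
    np.add.at(forces, i, f)                       # accumulate at shared nodes
    np.add.at(forces, j, -f)                      # equal and opposite
    return forces
```

    Using `np.add.at` rather than plain fancy-index assignment matters here: several springs can meet at one node, and their contributions must accumulate rather than overwrite.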

  10. Organizational Performance and Customer Value

    ERIC Educational Resources Information Center

    Tosti, Donald; Herbst, Scott A.

    2009-01-01

    While behavior systems analysts have recognized the importance of the consumer of organizational products (i.e., receiving system) in developing models of organizational change, few have offered a systematic assessment of the relationship between consumer and organizational practices. In this article we will discuss how a behavior systems approach…

  11. Making a Difference.

    ERIC Educational Resources Information Center

    Panza, Carol M.

    2001-01-01

    Suggests that human performance technologists need to have an analysis approach to support the development of an appropriate set of improvement recommendations for clients and then move to an action plan to help them see results. Presents a performance improvement model and a systematic approach that considers organizational context, ownership,…

  12. Reliability of the Suicide Opinion Questionnaire.

    ERIC Educational Resources Information Center

    Rogers, James R.; DeShon, Richard P.

    The lack of systematic psychometric information on the Suicide Opinion Questionnaire (SOQ) was addressed by investigating the factor structure and reliability of the eight-factor clinical scale model (mental illness, cry for help, right to die, religion, impulsivity, normality, aggression, and moral evil), developed for interpreting responses to…

  13. 78 FR 9698 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... effective at improving health care quality. While evidence-based approaches for decision-making have become standard in healthcare, this has been limited in laboratory medicine. No single-evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...

  14. Individual- vs. Culture-Level Dimensions of Individualism and Collectivism: Effects on Preferred Conversational Styles.

    ERIC Educational Resources Information Center

    Kim, Min-Sun; And Others

    1996-01-01

    Develops and uses a mediation model to investigate the links between culture, individual values (independent and interdependent construals of self), and perceptions of conversational constraints. Finds culture-level individualism and collectivism systematically related to individual-level cultural orientations (independent and interdependent…

  15. CAMCE: An Environment to Support Multimedia Courseware Projects.

    ERIC Educational Resources Information Center

    Barrese, R. M.; And Others

    1992-01-01

    Presents results of CAMCE (Computer-Aided Multimedia Courseware Engineering) project research concerned with definition of a methodology to describe a systematic approach for multimedia courseware development. Discussion covers the CAMCE methodology, requirements of an advanced authoring environment, use of an object-based model in the CAMCE…

  16. Psychological first aid: a consensus-derived, empirically supported, competency-based training model.

    PubMed

    McCabe, O Lee; Everly, George S; Brown, Lisa M; Wendelboe, Aaron M; Abd Hamid, Nor Hashidah; Tallchief, Vicki L; Links, Jonathan M

    2014-04-01

    Surges in demand for professional mental health services occasioned by disasters represent a major public health challenge. To build response capacity, numerous psychological first aid (PFA) training models for professional and lay audiences have been developed that, although often concurring on broad intervention aims, have not systematically addressed pedagogical elements necessary for optimal learning or teaching. We describe a competency-based model of PFA training developed under the auspices of the Centers for Disease Control and Prevention and the Association of Schools of Public Health. We explain the approach used for developing and refining the competency set and summarize the observable knowledge, skills, and attitudes underlying the 6 core competency domains. We discuss the strategies for model dissemination, validation, and adoption in professional and lay communities.

  17. Psychological First Aid: A Consensus-Derived, Empirically Supported, Competency-Based Training Model

    PubMed Central

    Everly, George S.; Brown, Lisa M.; Wendelboe, Aaron M.; Abd Hamid, Nor Hashidah; Tallchief, Vicki L.; Links, Jonathan M.

    2014-01-01

    Surges in demand for professional mental health services occasioned by disasters represent a major public health challenge. To build response capacity, numerous psychological first aid (PFA) training models for professional and lay audiences have been developed that, although often concurring on broad intervention aims, have not systematically addressed pedagogical elements necessary for optimal learning or teaching. We describe a competency-based model of PFA training developed under the auspices of the Centers for Disease Control and Prevention and the Association of Schools of Public Health. We explain the approach used for developing and refining the competency set and summarize the observable knowledge, skills, and attitudes underlying the 6 core competency domains. We discuss the strategies for model dissemination, validation, and adoption in professional and lay communities. PMID:23865656

  18. Modeling of pathogen survival during simulated gastric digestion.

    PubMed

    Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

    2011-02-01

    The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fit under static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens.
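    A much-simplified sketch of the modeling approach: a square-root (Ratkowsky-type) model gives the pH-dependent inactivation rate, which is then integrated over a time-varying pH history. Plain first-order inactivation stands in for the paper's modified logistic model, and every parameter value below is illustrative, not a fitted estimate from the study.

```python
import math

def sqrt_rate(pH, b=0.3, pH_max=4.0):
    """Square-root model: sqrt(k) is linear in pH below pH_max, so
    inactivation occurs only at acidic pH. b and pH_max are illustrative."""
    return (b * max(pH_max - pH, 0.0)) ** 2

def log10_survivors(t_end, pH_of_t, log_n0=7.0, dt=0.01):
    """Euler integration of first-order inactivation dN/dt = -k(pH(t)) * N,
    tracked as log10 survivors (a simplification of the paper's modified
    logistic differential equation)."""
    log_n = log_n0
    steps = int(round(t_end / dt))
    for i in range(steps):
        k = sqrt_rate(pH_of_t(i * dt))           # rate at current gastric pH
        log_n -= k * math.log10(math.e) * dt     # convert ln-rate to log10
    return log_n
```

    With a neutral pH history the survivor count is unchanged; as the simulated stomach acidifies, the rate rises quadratically with distance below `pH_max`.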

  19. Modeling of Pathogen Survival during Simulated Gastric Digestion

    PubMed Central

    Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

    2011-01-01

    The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fit under static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens. PMID:21131530

  20. The Systematic Training Model: Corn Circles in Search of a Spaceship?

    ERIC Educational Resources Information Center

    Taylor, Harry

    1991-01-01

    Examines how training practice diverges from the prevailing Systematic Training Model, in terms of a rehabilitative and a radical critique. Outlines an alternative transitional model while suggesting criteria with which to assess any updated model. (SK)

  1. THE SYSTEMATICS OF STRONG LENS MODELING QUANTIFIED: THE EFFECTS OF CONSTRAINT SELECTION AND REDSHIFT INFORMATION ON MAGNIFICATION, MASS, AND MULTIPLE IMAGE PREDICTABILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Traci L.; Sharon, Keren, E-mail: tljohn@umich.edu

    Until now, systematic errors in strong gravitational lens modeling have been acknowledged but have never been fully quantified. Here, we launch an investigation into the systematics induced by constraint selection. We model the simulated cluster Ares 362 times using random selections of image systems with and without spectroscopic redshifts and quantify the systematics using several diagnostics: image predictability, accuracy of model-predicted redshifts, enclosed mass, and magnification. We find that for models with >15 image systems, the image plane rms does not decrease significantly when more systems are added; however, the rms values quoted in the literature may be misleading as to the ability of a model to predict new multiple images. The mass is well constrained near the Einstein radius in all cases, and systematic error drops to <2% for models using >10 image systems. Magnification errors are smallest along the straight portions of the critical curve, and the value of the magnification is systematically lower near curved portions. For >15 systems, the systematic error on magnification is ∼2%. We report no trend in magnification error with the fraction of spectroscopic image systems when selecting constraints at random; however, when using the same selection of constraints, increasing this fraction up to ∼0.5 will increase model accuracy. The results suggest that the selection of constraints, rather than quantity alone, determines the accuracy of the magnification. We note that spectroscopic follow-up of at least a few image systems is crucial because models without any spectroscopic redshifts are inaccurate across all of our diagnostics.

  2. Summary of long-baseline systematics session at CETUP*2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherdack, Daniel; Worcester, Elizabeth

    2015-10-15

    A session studying systematics in long-baseline neutrino oscillation physics was held July 14-18, 2014 as part of CETUP* 2014. Systematic effects from flux normalization and modeling, modeling of cross sections and nuclear interactions, and far detector effects were addressed. Experts presented the capabilities of existing and planned tools. A program of study to determine estimates of and requirements for the size of these effects was designed. This document summarizes the results of the CETUP* systematics workshop and the current status of systematic uncertainty studies in long-baseline neutrino oscillation measurements.

  3. Formalism Challenges of the Cougaar Model Driven Architecture

    NASA Technical Reports Server (NTRS)

    Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.

    2004-01-01

    The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.

  4. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been established. VALUE is a research network with participants currently from 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods.
The results of this exercise will directly provide end users with important information about the uncertainty of regional climate scenarios, and will furthermore provide the basis for further developing downscaling methods. This presentation will provide background information on VALUE and discuss the identified characteristics and the validation framework.

  5. Cross-cultural perspectives on physician and lay models of the common cold.

    PubMed

    Baer, Roberta D; Weller, Susan C; de Alba García, Javier García; Rocha, Ana L Salcedo

    2008-06-01

    We compare physicians and laypeople within and across cultures, focusing on similarities and differences across samples, to determine whether cultural differences or lay-professional differences have a greater effect on explanatory models of the common cold. Data on explanatory models for the common cold were collected from physicians and laypeople in South Texas and Guadalajara, Mexico. Structured interview materials were developed on the basis of open-ended interviews with samples of lay informants at each locale. A structured questionnaire was used to collect information from each sample on causes, symptoms, and treatments for the common cold. Consensus analysis was used to estimate the cultural beliefs for each sample. Instead of systematic differences between samples based on nationality or level of professional training, all four samples largely shared a single explanatory model of the common cold, with some differences on subthemes, such as the role of hot and cold forces in the etiology of the common cold. An evaluation of our findings indicates that, although there has been conjecture about whether cultural or lay-professional differences are of greater importance in understanding variation in explanatory models of disease and illness, systematic data collected on community and professional beliefs indicate that such differences may be a function of the specific illness. Further generalizations about lay-professional differences need to be based on detailed data for a variety of illnesses, to discern patterns that may be present. Finally, a systematic approach indicates that agreement across individual explanatory models is sufficient to allow for a community-level explanatory model of the common cold.

  6. Improving effectiveness of systematic conservation planning with density data.

    PubMed

    Veloz, Samuel; Salas, Leonardo; Altman, Bob; Alexander, John; Jongsomjit, Dennis; Elliott, Nathan; Ballard, Grant

    2015-08-01

    Systematic conservation planning aims to design networks of protected areas that meet conservation goals across large landscapes. The optimal design of these conservation networks is most frequently based on the modeled habitat suitability or probability of occurrence of species, despite evidence that model predictions may not be highly correlated with species density. We hypothesized that conservation networks designed using species density distributions more efficiently conserve populations of all species considered than networks designed using probability of occurrence models. To test this hypothesis, we used the Zonation conservation prioritization algorithm to evaluate conservation network designs based on probability of occurrence versus density models for 26 land bird species in the U.S. Pacific Northwest. We assessed the efficacy of each conservation network based on predicted species densities and predicted species diversity. High-density model Zonation rankings protected more individuals per species when networks protected the highest priority 10-40% of the landscape. Compared with density-based models, the occurrence-based models protected more individuals in the lowest 50% priority areas of the landscape. The 2 approaches conserved species diversity in similar ways: predicted diversity was higher in higher priority locations in both conservation networks. We conclude that both density and probability of occurrence models can be useful for setting conservation priorities but that density-based models are best suited for identifying the highest priority areas. Developing methods to aggregate species count data from unrelated monitoring efforts and making these data widely available through ecoinformatics portals such as the Avian Knowledge Network will enable species count data to be more widely incorporated into systematic conservation planning efforts. © 2015, Society for Conservation Biology.
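    The density-based prioritization idea can be illustrated with a toy ranking rule. This is not the Zonation algorithm itself; the minimum-loss removal rule and the data layout are illustrative stand-ins for its core intuition of discarding the least valuable cell first.

```python
def priority_ranking(cells):
    """Toy density-based cell prioritization in the spirit of Zonation:
    repeatedly remove the cell whose loss costs the least, where cost is
    the largest fraction of any species' remaining density total held by
    that cell. The removal order, reversed, is the priority ranking.

    cells: {cell_id: [density of species 0, density of species 1, ...]}
    """
    remaining = dict(cells)
    n_species = len(next(iter(cells.values())))
    totals = [sum(d[s] for d in cells.values()) for s in range(n_species)]
    removal_order = []
    while remaining:
        def cost(cid):
            return max((remaining[cid][s] / totals[s]) if totals[s] else 0.0
                       for s in range(n_species))
        cheapest = min(remaining, key=cost)       # least valuable cell
        for s in range(n_species):
            totals[s] -= remaining[cheapest][s]   # update remaining totals
        removal_order.append(cheapest)
        del remaining[cheapest]
    return removal_order[::-1]                    # highest priority first
```

    A cell holding most of one species' population is expensive to remove even if its summed density is modest, which is why per-species fractions rather than raw totals drive the rule.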

  7. Multibody Kinematics Optimization for the Estimation of Upper and Lower Limb Human Joint Kinematics: A Systematized Methodological Review.

    PubMed

    Begon, Mickaël; Andersen, Michael Skipper; Dumas, Raphaël

    2018-03-01

    Multibody kinematics optimization (MKO) aims to reduce soft tissue artefact (STA) and is a key step in musculoskeletal modeling. The objective of this review was to identify the numerical methods, their validation and performance for the estimation of the human joint kinematics using MKO. Seventy-four papers were extracted from a systematized search in five databases and cross-referencing. Model-derived kinematics were obtained using either constrained optimization or Kalman filtering to minimize the difference between measured (i.e., by skin markers, electromagnetic or inertial sensors) and model-derived positions and/or orientations. While hinge, universal, and spherical joints prevail, advanced models (e.g., parallel and four-bar mechanisms, elastic joint) have been introduced, mainly for the knee and shoulder joints. Models and methods were evaluated using: (i) simulated data based, however, on oversimplified STA and joint models; (ii) reconstruction residual errors, ranging from 4 mm to 40 mm; (iii) sensitivity analyses which highlighted the effect (up to 36 deg and 12 mm) of model geometrical parameters, joint models, and computational methods; (iv) comparison with other approaches (i.e., single body kinematics optimization and nonoptimized kinematics); (v) repeatability studies that showed low intra- and inter-observer variability; and (vi) validation against ground-truth bone kinematics (with errors between 1 deg and 22 deg for tibiofemoral rotations and between 3 deg and 10 deg for glenohumeral rotations). Moreover, MKO was applied to various movements (e.g., walking, running, arm elevation). Additional validations, especially for the upper limb, should be undertaken and we recommend a more systematic approach for the evaluation of MKO. In addition, further model development, scaling, and personalization methods are required to better estimate the secondary degrees-of-freedom (DoF).
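    A toy single-degree-of-freedom example of the MKO idea: markers are only allowed to move as a hinge joint permits, and the joint angle is chosen to minimize the residual to the measured marker positions. A grid search stands in for the constrained optimizers and Kalman filters used in practice, and all coordinates are illustrative.

```python
import math

def fit_hinge_angle(measured, local, origin=(0.0, 0.0), n_grid=3600):
    """Find the hinge angle whose rigid 2D rotation of the segment's
    local marker coordinates best matches the measured (noisy) marker
    positions, so markers can only move as the joint model allows."""
    def residual(theta):
        c, s = math.cos(theta), math.sin(theta)
        err = 0.0
        for (lx, ly), (mx, my) in zip(local, measured):
            px = origin[0] + c * lx - s * ly      # model-derived position
            py = origin[1] + s * lx + c * ly
            err += (px - mx) ** 2 + (py - my) ** 2
        return err
    # exhaustive grid search over [0, 2*pi) stands in for constrained optimization
    return min((2.0 * math.pi * k / n_grid for k in range(n_grid)), key=residual)
```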

  8. Systematic review and retrospective validation of prediction models for weight loss after bariatric surgery.

    PubMed

    Sharples, Alistair J; Mahawar, Kamal; Cheruvu, Chandra V N

    2017-11-01

    Patients often have less than realistic expectations of the weight loss they are likely to achieve after bariatric surgery. It would be useful to have a well-validated prediction tool that could give patients a realistic estimate of their expected weight loss. To perform a systematic review of the literature to identify existing prediction models and attempt to validate these models. University hospital, United Kingdom. A systematic review was performed. All English language studies were included if they used data to create a prediction model for postoperative weight loss after bariatric surgery. These models were then tested on patients undergoing bariatric surgery between January 1, 2013 and December 31, 2014 within our unit. An initial literature search produced 446 results, of which only 4 were included in the final review. Our study population included 317 patients. Mean preoperative body mass index was 46.1 ± 7.1. For 257 (81.1%) patients, 12-month follow-up was available, and mean body mass index and percentage excess weight loss at 12 months were 33.0 ± 6.7 and 66.1% ± 23.7%, respectively. All 4 of the prediction models significantly overestimated the amount of weight loss achieved by patients. The best performing prediction model in our series produced a correlation coefficient (R²) of 0.61 and an area under the curve of 0.71 on receiver operating characteristic curve analysis. All prediction models overestimated weight loss after bariatric surgery in our cohort. There is a need to develop better procedures and patient-specific models for better patient counselling. Copyright © 2017 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.
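    The two validation statistics reported above can be computed from first principles. A minimal pure-Python sketch; the function names are ours and no attempt is made to reproduce the study's data.

```python
def r_squared(actual, predicted):
    """Coefficient of determination between observed and predicted values."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random positive outranks a random negative,
    counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```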

  9. Evidence used in model-based economic evaluations for evaluating pharmacogenetic and pharmacogenomic tests: a systematic review protocol

    PubMed Central

    Peters, Jaime L; Cooper, Chris; Buchanan, James

    2015-01-01

    Introduction Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. Methods and analysis We will search electronic databases including MEDLINE, EMBASE and NHS EEDs to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by 2 reviewers, with screening of full texts and data extraction conducted by 1 reviewer, and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported evidence sources for the test-related evidence used, describe the study design and how the evidence was identified. A checklist developed specifically for decision analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. Dissemination The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences. PMID:26560056

  10. Stochastic differential equations in NONMEM: implementation, application, and comparison with ordinary differential equations.

    PubMed

    Tornøe, Christoffer W; Overgaard, Rune V; Agersø, Henrik; Nielsen, Henrik A; Madsen, Henrik; Jonsson, E Niclas

    2005-08-01

    The objective of the present analysis was to explore the use of stochastic differential equations (SDEs) in population pharmacokinetic/pharmacodynamic (PK/PD) modeling. The intra-individual variability in nonlinear mixed-effects models based on SDEs is decomposed into two types of noise: a measurement and a system noise term. The measurement noise represents uncorrelated error due to, for example, assay error while the system noise accounts for structural misspecifications, approximations of the dynamical model, and true random physiological fluctuations. Since the system noise accounts for model misspecifications, the SDEs provide a diagnostic tool for model appropriateness. The focus of the article is on the implementation of the Extended Kalman Filter (EKF) in NONMEM for parameter estimation in SDE models. Various applications of SDEs in population PK/PD modeling are illustrated through a systematic model development example using clinical PK data of the gonadotropin releasing hormone (GnRH) antagonist degarelix. The dynamic noise estimates were used to track variations in model parameters and systematically build an absorption model for subcutaneously administered degarelix. The EKF-based algorithm was successfully implemented in NONMEM for parameter estimation in population PK/PD models described by systems of SDEs. The example indicated that it was possible to pinpoint structural model deficiencies, and that valuable information may be obtained by tracking unexplained variations in parameters.
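    An SDE of the kind described can be simulated with the Euler-Maruyama scheme. The one-compartment elimination model and all parameter values below are illustrative, not the degarelix estimates, and this sketch does not reproduce NONMEM's EKF-based estimation.

```python
import math
import random

def euler_maruyama_pk(ke=0.1, sigma=0.05, c0=10.0, t_end=24.0, dt=0.01, seed=1):
    """Euler-Maruyama simulation of a one-compartment elimination model
    with multiplicative system noise: dC = -ke*C dt + sigma*C dW.
    Returns the concentration path sampled every dt."""
    rng = random.Random(seed)
    n_steps = int(round(t_end / dt))
    c = c0
    path = [c]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))    # Wiener increment
        c += -ke * c * dt + sigma * c * dw    # drift + system noise
        path.append(c)
    return path
```

    Setting `sigma=0.0` recovers the deterministic ODE, which is a convenient sanity check: the path then tracks `c0 * exp(-ke * t)` to within the Euler discretization error.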

  11. Synoptic scale forecast skill and systematic errors in the MASS 2.0 model. [Mesoscale Atmospheric Simulation System

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.

    1985-01-01

    The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.

  12. An analysis of the least-squares problem for the DSN systematic pointing error model

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.

    1991-01-01

    A systematic pointing error model is used to calibrate antennas in the Deep Space Network. The least-squares problem is described and analyzed along with the solution methods used to determine the model's parameters. Specifically studied are the rank degeneracy problems resulting from beam pointing error measurement sets that incorporate inadequate sky coverage. A least-squares parameter subset selection method is described and its applicability to the systematic error modeling process is demonstrated on a Voyager 2 measurement distribution.
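    A minimal sketch of how rank degeneracy shows up in such a least-squares fit (NumPy). The toy design matrices are ours, not a real DSN pointing model; the point is that the reported rank flags a degenerate measurement set before the fitted parameters are trusted.

```python
import numpy as np

def fit_pointing_model(A, y):
    """Least-squares fit of a linear pointing-error model y ≈ A @ p.
    Returning the design-matrix rank makes degeneracy caused by
    inadequate sky coverage visible alongside the fit."""
    params, _residuals, rank, _sv = np.linalg.lstsq(A, y, rcond=None)
    return params, rank
```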

  13. Comparing productive vocabulary measures from the CDI and a systematic diary study.

    PubMed

    Robinson, B F; Mervis, C B

    1999-02-01

    Expressive vocabulary data gathered during a systematic diary study of one male child's early language development are compared to data that would have resulted from longitudinal administration of the MacArthur Communicative Development Inventories spoken vocabulary checklist (CDI). Comparisons are made for (1) the number of words at monthly intervals (9; 10.15 to 2; 0.15), (2) the proportion of words by lexical class (i.e. noun, predicate, closed class, 'other'), and (3) growth curves. The CDI underestimates the number of words in the diary study, with the underestimation increasing as vocabulary size increases. The proportion of diary study words appearing on the CDI differed as a function of lexical class. Finally, despite the differences in vocabulary size, logistic curves proved to be the best-fitting model of vocabulary development as measured by both the diary study and the CDI. Implications for the longitudinal use of the CDI are discussed.
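    The logistic growth curves the comparison relies on can be sketched as follows. With the asymptote K held fixed, the logistic linearises through a logit transform, so ordinary least squares recovers the midpoint and time scale; the vocabulary ceiling and ages below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical sketch: fitting a logistic growth curve to monthly
# vocabulary counts via logit-linearisation.

def logistic(t, K, t0, s):
    return K / (1.0 + np.exp(-(t - t0) / s))

K = 600.0                        # assumed vocabulary ceiling (illustrative)
months = np.arange(10, 25)       # ages ~10 to 24 months
counts = logistic(months, K, t0=18.0, s=2.5)   # noiseless synthetic data

z = np.log(counts / (K - counts))   # logit transform: z = (t - t0) / s
slope, intercept = np.polyfit(months, z, 1)
s_hat = 1.0 / slope                 # recovered time scale
t0_hat = -intercept / slope         # recovered midpoint (age of fastest growth)
```

    With noisy counts one would instead fit K, t0 and s jointly by nonlinear least squares; the linearised version above is just the minimal demonstration of why a logistic shape is identifiable from monthly totals.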

  14. The Evaluation of Hospital Performance in Iran: A Systematic Review Article

    PubMed Central

    BAHADORI, Mohammadkarim; IZADI, Ahmad Reza; GHARDASHI, Fatemeh; RAVANGARD, Ramin; HOSSEINI, Seyed Mojtaba

    2016-01-01

    Background: This research aimed to systematically study and outline the methods of hospital performance evaluation used in Iran. Methods: In this systematic review, all Persian- and English-language articles published in Iranian and non-Iranian scientific journals indexed from Sep 2004 to Sep 2014 were studied. To find the related articles, the researchers searched the Iranian electronic databases, including SID, IranMedex, IranDoc, and Magiran, as well as the non-Iranian electronic databases, including Medline, Embase, Scopus, and Google Scholar. A data extraction form developed by the researchers was used to review the selected articles. Results: The review process led to the selection of 51 articles. The publication of articles on hospital performance evaluation in Iran has increased considerably in recent years. Among these 51 articles, 38 (74.51%) had been published in Persian and 13 (25.49%) in English. Eight models were identified as evaluation models for Iranian hospitals. In total, 15 studies used the data envelopment analysis model to evaluate hospital performance. Conclusion: Using a combination of models to integrate indicators in the hospital evaluation process is inevitable. Therefore, the Ministry of Health and Medical Education should use a set of indicators, such as the balanced scorecard, in the process of hospital evaluation and accreditation and encourage hospital managers to use them. PMID:27516991
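    The data envelopment analysis (DEA) model mentioned above reduces, in the single-input single-output constant-returns-to-scale case, to a best-ratio comparison against the frontier unit. A minimal sketch with invented hospital figures (not data from the review):

```python
import numpy as np

# Illustrative only: with one input and one output, CCR DEA efficiency is
# each unit's output/input ratio scaled by the best ratio on the frontier.

inputs  = np.array([100.0, 150.0, 120.0, 200.0])   # e.g. staffed beds
outputs = np.array([ 80.0, 100.0, 110.0, 120.0])   # e.g. treated patients

ratio = outputs / inputs
efficiency = ratio / ratio.max()   # the frontier hospital scores 1.0
```

    The multi-input, multi-output case requires solving one small linear program per hospital; the ratio form above is just the degenerate case that makes the frontier idea visible.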

  16. Read Naturally. Revised. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2007

    2007-01-01

    "Read Naturally" is designed to improve reading fluency using a combination of books, audio-tapes, and computer software. This program includes three main strategies: repeated reading of English text for oral reading fluency development, teacher modeling of story reading, and systematic monitoring of student progress by teachers.…

  17. Read Naturally. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2006

    2006-01-01

    "Read Naturally" is designed to improve reading fluency using a combination of books, audio-tapes, and computer software. This program includes three main strategies: (1) repeated reading of English text for oral reading fluency development; (2) teacher modeling of story reading; and (3) systematic monitoring of student progress by…

  18. Coordinating Educational Assessment Across College Centers.

    ERIC Educational Resources Information Center

    Churchill, Ruth; And Others

    An operational model developed as a result of a systematic analysis of three distinctly different Antioch centers--Juarez Lincoln University, Philadelphia Graduate Center, and Antioch-New England (the Keene Center)--is presented. Juarez Lincoln offers a 15-month program leading to the Master of Education degree. Many of the students are Mexican…

  19. Growing Food for Thought: A New Model of Site-Specific Research from Bolivia.

    ERIC Educational Resources Information Center

    Ruddell, Edward

    1995-01-01

    A severe drought precipitated systematic documentation of farmers' field trials in the farmer-to-farmer extension service in Bolivia. Successful agricultural experiments and seminars on agronomic practices and on data recording and analysis increased farmer self-confidence; developed awareness of literacy and numeracy educational needs; and…

  20. Exporting U.S. CTE Goods and Services

    ERIC Educational Resources Information Center

    Fretwell, David

    2009-01-01

    The United States needs to develop and implement systematic programs to market Technical Vocational Education and Training (TVET) models internationally--the way other nations have. The short- and long-term social, political and economic implications for the United States are considerable, including the acquisition of new knowledge to improve…

  1. Teachers' Perceptions of Examining Students' Thinking: Changing Mathematics Instructional Practice

    ERIC Educational Resources Information Center

    Anderson-Pence, Katie L.

    2015-01-01

    This paper seeks to illuminate teachers' perceptions of the challenges and benefits of systematically examining students' thinking as part of a professional development program in elementary mathematics education. Using a framework of models of conceptual change and principles of discomfort, three elementary teachers' perceptions of their…

  2. An Application of Markov Chains and a Monte-Carlo Simulation to Decision-Making Behavior of an Educational Administrator

    ERIC Educational Resources Information Center

    Yoda, Koji

    1973-01-01

    Develops models to systematically forecast the tendency of an educational administrator in charge of personnel selection processes to shift from one decision strategy to another under generally stable environmental conditions. Urges further research on these processes by educational planners. (JF)
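    The combination of a Markov chain for strategy shifts with a Monte-Carlo simulation can be sketched as follows; the transition probabilities are invented for illustration, and the long-run strategy mix from simulation is checked against the exact stationary distribution:

```python
import numpy as np

# Hypothetical sketch: strategy shifts as a Markov chain, with the long-run
# mix of decision strategies estimated by Monte-Carlo simulation.

P = np.array([[0.7, 0.2, 0.1],     # rows: current strategy, cols: next
              [0.3, 0.6, 0.1],
              [0.2, 0.3, 0.5]])

rng = np.random.default_rng(3)
state, counts = 0, np.zeros(3)
for _ in range(20000):
    state = rng.choice(3, p=P[state])
    counts[state] += 1
empirical = counts / counts.sum()

# Exact stationary distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```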

  3. Ethics: A Bridge for Studying the Social Contexts of Professional Communication.

    ERIC Educational Resources Information Center

    Speck, Bruce W.

    1989-01-01

    Describes a method for helping students evaluate ethical issues in a systematic way, based on Lawrence Kohlberg's stages of moral development. Recommends the case-study approach for creating social constructs in which students face ethical dilemmas, and outlines a case-study ethics unit using Kohlberg's model. (MM)

  4. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  5. Career Development in Rural Education.

    ERIC Educational Resources Information Center

    Baskerville, Roger A.

    The Lohrville Career Education Model (LCEM) was instituted as a systematic attempt at exploring careers in Iowa and inducing Iowa youth to seek careers closer to home following high school graduation or post-secondary education training; a major purpose of the Toward Community Growth project was to teach positive attitudes about living and working…

  6. MODIA: Vol. 4. The Resource Utilization Model. A Project AIR FORCE Report.

    ERIC Educational Resources Information Center

    Gallegos, Margaret

    MODIA (Method of Designing Instructional Alternatives) was developed to help the Air Force manage resources for formal training by systematically and explicitly relating quantitative requirements for training resources to the details of course design and course operation during the planning stage. This report describes the Resource Utilization…

  7. Do antibody responses to the influenza vaccine persist year-round in the elderly? A systematic review and meta-analysis.

    PubMed

    Young, Barnaby; Zhao, Xiahong; Cook, Alex R; Parry, Christopher M; Wilder-Smith, Annelies; I-Cheng, Mark Chen

    2017-01-05

    The influenza vaccine is less immunogenic in older than in younger adults, and the duration of protection is unclear. Determining whether protection persists beyond a typical seasonal epidemic is important for climates where influenza virus activity is year-round. A systematic review protocol was developed and registered with PROSPERO [CRD42015023847]. Electronic databases were searched systematically for studies reporting haemagglutination-inhibition (HI) titres 180-360 days following vaccination with inactivated trivalent seasonal influenza vaccine in adults aged ⩾65 years. Geometric mean titre (GMT) and seroprotection (HI titre ⩾1:40) at each time point were extracted. A Bayesian model of titre trajectories from pre-vaccination to Day 360 was developed. In the meta-analysis, studies were aggregated using a random-effects model to compare pre-vaccination with post-vaccination HI titres at Day 21-42 ('seroconversion'), Day 180 and Day 360. Potential sources of bias were systematically assessed, and heterogeneity explored. 2864 articles were identified in the literature search, of which nineteen met the study inclusion/exclusion criteria. Sixteen studies contained analysable data from 2565 subjects. In the Bayesian model, the proportion of subjects seroprotected increased from 41-51% pre-vaccination to 75-78% at seroconversion. Seroprotection subsequently fell below 60% for all serotypes by Day 360: A/H1 42% (95% CI 38-46), A/H3 59% (54-63), B 47% (42-52). The Bayesian model of GMT trajectories revealed a similar pattern. By Day 360, titres were similar to pre-vaccination levels. In the meta-analysis, no significant difference by Day 360 compared with pre-vaccination was identified in the proportion of subjects seroprotected, 0 (-0.11, 0.11), or in log2 GMT, 0.30 (-0.02, 0.63). The quality of this evidence was limited to moderate on account of significant participant dropout.
    The review found consistent evidence that HI antibody responses following influenza vaccination do not reliably persist year-round in older adults. Alternative vaccination strategies could provide clinical benefits in regions where year-round protection is important. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
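    The random-effects aggregation used in such a meta-analysis can be sketched with the DerSimonian-Laird estimator; the per-study effects and variances below are invented for illustration, not values from the review:

```python
import numpy as np

# Hedged sketch of DerSimonian-Laird random-effects pooling, the kind of
# model used to aggregate per-study effects (e.g. log2 GMT differences).

def dersimonian_laird(y, v):
    """y: per-study effect estimates; v: their within-study variances."""
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

y = np.array([0.10, 0.45, 0.30, 0.60])   # invented study effects
v = np.array([0.02, 0.03, 0.02, 0.05])   # invented within-study variances
pooled, se, tau2 = dersimonian_laird(y, v)
```

    A positive `tau2` widens the pooled confidence interval relative to a fixed-effect analysis, which is why heterogeneity assessment matters when, as here, study populations and assays differ.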

  8. Screening strategies for atrial fibrillation: a systematic review and cost-effectiveness analysis.

    PubMed

    Welton, Nicky J; McAleenan, Alexandra; Thom, Howard Hz; Davies, Philippa; Hollingworth, Will; Higgins, Julian Pt; Okoli, George; Sterne, Jonathan Ac; Feder, Gene; Eaton, Diane; Hingorani, Aroon; Fawsitt, Christopher; Lobban, Trudie; Bryden, Peter; Richards, Alison; Sofat, Reecha

    2017-05-01

    Atrial fibrillation (AF) is a common cardiac arrhythmia that increases the risk of thromboembolic events. Anticoagulation therapy to prevent AF-related stroke has been shown to be cost-effective. A national screening programme for AF may prevent AF-related events, but would involve a substantial investment of NHS resources. To conduct a systematic review of the diagnostic test accuracy (DTA) of screening tests for AF, update a systematic review of comparative studies evaluating screening strategies for AF, develop an economic model to compare the cost-effectiveness of different screening strategies and review observational studies of AF screening to provide inputs to the model. Systematic review, meta-analysis and cost-effectiveness analysis. Primary care. Adults. Screening strategies, defined by screening test, age at initial and final screens, screening interval and format of screening {systematic opportunistic screening [individuals offered screening if they consult with their general practitioner (GP)] or systematic population screening (when all eligible individuals are invited to screening)}. Sensitivity, specificity and diagnostic odds ratios; the odds ratio of detecting new AF cases compared with no screening; and the mean incremental net benefit compared with no screening. Two reviewers screened the search results, extracted data and assessed the risk of bias. A DTA meta-analysis was performed, and a decision tree and Markov model were used to evaluate the cost-effectiveness of the screening strategies. Diagnostic test accuracy depended on the screening test and how it was interpreted. In general, the screening tests identified in our review had high sensitivity (> 0.9). Systematic population and systematic opportunistic screening strategies were found to be similarly effective, with an estimated 170 individuals needed to be screened to detect one additional AF case compared with no screening.
    Systematic opportunistic screening was more likely to be cost-effective than systematic population screening, as long as the uptake of opportunistic screening observed in randomised controlled trials translates to practice. Modified blood pressure monitors, photoplethysmography or nurse pulse palpation were more likely to be cost-effective than other screening tests. A screening strategy with an initial screening age of 65 years and repeated screens every 5 years until age 80 years was likely to be cost-effective, provided that compliance with treatment does not decline with increasing age. A national screening programme for AF is likely to represent a cost-effective use of resources. Systematic opportunistic screening is more likely to be cost-effective than systematic population screening. Nurse pulse palpation or modified blood pressure monitors would be appropriate screening tests, with confirmation by diagnostic 12-lead electrocardiography interpreted by a trained GP, with referral to a specialist in the case of an unclear diagnosis. Implementation strategies to operationalise uptake of systematic opportunistic screening in primary care should accompany any screening recommendations. Many inputs for the economic model relied on a single trial [the Screening for Atrial Fibrillation in the Elderly (SAFE) study], and the DTA results were based on a few studies at high risk of bias or of low applicability. Further comparative studies measuring long-term outcomes of screening strategies are needed, as are DTA studies of new, emerging technologies and studies replicating the results for photoplethysmography and GP interpretation of 12-lead electrocardiography in a screening population. This study is registered as PROSPERO CRD42014013739. Funding: the National Institute for Health Research Health Technology Assessment programme.
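    The decision tree and Markov model combination used for such cost-effectiveness analyses can be sketched in miniature. The three-state cohort model below is purely illustrative (states, probabilities, costs and utilities are invented, not inputs from the SAFE-based model); it shows how an incremental net benefit at a given willingness-to-pay threshold falls out of two cohort runs:

```python
import numpy as np

# Simplified, illustrative Markov cohort sketch (states: Well, Stroke, Dead).
# Screening is assumed to lower the annual stroke probability; all numbers
# are hypothetical, not taken from the review.

def run_cohort(p_stroke, cycles=15,
               cost_cycle=(50.0, 5000.0, 0.0),    # cost per cycle by state
               qaly_cycle=(0.8, 0.5, 0.0)):       # utility per cycle by state
    P = np.array([[1 - p_stroke - 0.02, p_stroke, 0.02],
                  [0.0,                 0.90,     0.10],
                  [0.0,                 0.00,     1.00]])
    state = np.array([1.0, 0.0, 0.0])             # cohort starts well
    cost = qaly = 0.0
    for _ in range(cycles):
        state = state @ P                         # advance one cycle
        cost += state @ np.array(cost_cycle)
        qaly += state @ np.array(qaly_cycle)
    return cost, qaly

cost_no, qaly_no = run_cohort(p_stroke=0.04)      # no screening
cost_scr, qaly_scr = run_cohort(p_stroke=0.02)    # screening arm
cost_scr += 30.0                                  # one-off screening cost
wtp = 20000.0                                     # willingness to pay per QALY
inb = wtp * (qaly_scr - qaly_no) - (cost_scr - cost_no)
```

    A real model would add discounting, age-dependent transitions and probabilistic sensitivity analysis over the input distributions; the incremental-net-benefit arithmetic stays the same.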

  9. Clinical Prediction Models for Patients With Nontraumatic Knee Pain in Primary Care: A Systematic Review and Internal Validation Study.

    PubMed

    Panken, Guus; Verhagen, Arianne P; Terwee, Caroline B; Heymans, Martijn W

    2017-08-01

    Study Design: Systematic review and validation study. Background: Many prognostic models of knee pain outcomes have been developed for use in primary care. Variability among published studies with regard to patient population, outcome measures, and relevant prognostic factors hampers the generalizability and implementation of these models. Objectives: To summarize existing prognostic models in patients with knee pain in a primary care setting and to develop and internally validate new summary prognostic models. Methods: After a sensitive search strategy, 2 reviewers independently selected prognostic models for patients with nontraumatic knee pain and assessed the methodological quality of the included studies. All predictors of the included studies were evaluated, summarized, and classified. The predictors assessed in multiple studies of sufficient quality are presented in this review. Using data from the Musculoskeletal System Study (BAS) cohort of patients with a new episode of knee pain, recruited consecutively by Dutch general medical practitioners (n = 372), we used predictors with a strong level of evidence to develop new prognostic models for each outcome measure and internally validated these models. Results: Sixteen studies were eligible for inclusion. We considered 11 studies to be of sufficient quality. None of these studies validated their models. Five predictors with strong evidence were related to function and 6 to recovery, and these were used to compose 2 prognostic models for patients with knee pain at 1 year. Running these new models in another data set showed explained variances (R²) of 0.36 (function) and 0.33 (recovery). The area under the curve of the recovery model was 0.79. After internal validation, the adjusted R² values of the models were 0.30 (function) and 0.20 (recovery), and the area under the curve was 0.73.
    Conclusion: We developed 2 valid prognostic models for function and recovery for patients with nontraumatic knee pain, based on predictors with strong evidence. A longer duration of complaints predicted poorer function but did not adequately predict the chance of recovery. Level of Evidence: Prognosis, levels 1a and 1b. J Orthop Sports Phys Ther 2017;47(8):518-529. Epub 16 Jun 2017. doi:10.2519/jospt.2017.7142.
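    The shrinkage from apparent to internally validated performance reported above (e.g. an apparent R² of 0.36 falling to an adjusted 0.30) is typically estimated by bootstrap optimism correction. The sketch below, on synthetic data with an ordinary least-squares model rather than the study's actual prognostic models, shows the mechanics:

```python
import numpy as np

# Hedged sketch of bootstrap internal validation: the "optimism" of the
# apparent R^2 is estimated on bootstrap resamples and subtracted.

def r2(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def fit_predict(Xtr, ytr, X):
    beta, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
    return X @ beta

rng = np.random.default_rng(2)
n, p = 120, 6
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, p))])
y = X @ rng.normal(size=p + 1) + rng.normal(scale=2.0, size=n)

apparent = r2(y, fit_predict(X, y, X))       # model judged on its own data
optimism, B = 0.0, 200
for _ in range(B):
    idx = rng.integers(0, n, n)              # bootstrap resample
    boot = r2(y[idx], fit_predict(X[idx], y[idx], X[idx]))
    test = r2(y, fit_predict(X[idx], y[idx], X))   # resample model on full data
    optimism += (boot - test) / B
adjusted = apparent - optimism               # optimism-corrected R^2
```

    The same procedure applies to the area under the curve of a logistic recovery model; only the performance metric changes.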

  10. ILRS Activities in Monitoring Systematic Errors in SLR Data

    NASA Astrophysics Data System (ADS)

    Pavlis, E. C.; Luceri, V.; Kuzmicz-Cieslak, M.; Bianco, G.

    2017-12-01

    The International Laser Ranging Service (ILRS) contributes to ITRF development unique information to which only Satellite Laser Ranging (SLR) is sensitive: the definition of the origin and, in equal parts with VLBI, the scale of the model. For the development of ITRF2014, the ILRS analysts adopted a revision of the internal standards and procedures used in generating our contribution from the eight ILRS Analysis Centers. The improved results for the ILRS components were reflected in the resulting new time series of the ITRF origin and scale, which show insignificant trends and tighter scatter. This effort was extended after the release of ITRF2014 with a Pilot Project (PP) in the 2016-2017 timeframe that demonstrated the robust estimation of persistent systematic errors at the millimeter level. The ILRS ASC is now turning this into an operational tool to monitor station performance and to generate a history of systematics at each station, to be used with each re-analysis for future ITRF model developments. This is part of a broader ILRS effort to improve the quality control of the data collection process as well as that of our products. To this end, the ILRS has established a Quality Control Board (QCB) that comprises members from the analysis and engineering groups, the Central Bureau, and user groups with special interests. The QCB meets by telecon monthly, oversees the various ongoing projects, and develops ideas for new tools and future products. This presentation will give an update on the results so far, the schedule for the near future and the operational implementation, along with a brief description of upcoming new ILRS products.

  11. A Systematic Review of the Impact of Multi-Strategy Nutrition Education Programs on Health and Nutrition of Adolescents.

    PubMed

    Meiklejohn, Sarah; Ryan, Lisa; Palermo, Claire

    2016-10-01

    To update evidence on the impact of multi-strategy nutrition education interventions on adolescents' health and nutrition outcomes and behaviors. Systematic review of randomized controlled studies of multi-strategy interventions encompassing nutrition education published from 2000 to 2014 guided by the Preferred Reported Items for Systematic Reviews and Meta-analyses statement. Secondary schools in developed countries. Adolescents aged 10-18 years. Anthropometric and dietary intake. Systematic search of 7,009 unduplicated articles and review of 11 studies (13 articles) meeting inclusion criteria using qualitative comparison. Four studies reported significant changes in anthropometric measures and 9 showed significant changes in dietary intake. Type of nutrition education varied. Components of the interventions that showed statistically significant changes in anthropometric and dietary intake included facilitation of the programs by school staff and teachers, parental involvement, and using theoretical models to guide the intervention's development. Changes in canteens, food supply, and vending machines were associated with significant changes in dietary intake. Multi-strategy interventions can have significant impacts on nutrition of adolescents when the nutrition education is theoretically based and facilitated by school staff in conjunction with parents and families, and includes changes to the school food environment. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  12. Implementation of a WRF-CMAQ Air Quality Modeling System in Bogotá, Colombia

    NASA Astrophysics Data System (ADS)

    Nedbor-Gross, R.; Henderson, B. H.; Pachon, J. E.; Davis, J. R.; Baublitz, C. B.; Rincón, A.

    2014-12-01

    Due to continuous economic growth, Bogotá, Colombia has experienced air pollution issues in recent years. The local environmental authority has implemented several strategies to curb air pollution that have resulted in decreasing PM10 concentrations since 2010. However, further measures are necessary for the city to meet international air quality standards. The University of Florida Air Quality and Climate group is collaborating with the Universidad de La Salle to prioritize regulatory strategies for Bogotá using air pollution simulations. To simulate pollution, we developed a modeling platform that combines the Weather Research and Forecasting model (WRF), local emissions, and the Community Multiscale Air Quality model (CMAQ). This platform is the first of its kind to be implemented in the megacity of Bogotá. The presentation will discuss the development and evaluation of the air quality modeling system, highlight initial results characterizing photochemical conditions in Bogotá, and characterize air pollution under proposed regulatory strategies. The WRF model has been configured and applied to Bogotá, which lies in a tropical climate with complex mountainous topography. Developing the configuration included incorporation of local topography and land-use data, a physics sensitivity analysis, review, and systematic evaluation. The performance threshold, however, was set based on a synthesis of model performance under less mountainous conditions; we will evaluate how much differences in autocorrelation contribute to the non-ideal performance. Air pollution predictions are currently under way. CMAQ has been configured with WRF meteorology, global boundary conditions from GEOS-Chem, and a locally produced emission inventory. Preliminary results from simulations show promising performance of CMAQ in Bogotá.
Anticipated results include a systematic performance evaluation of ozone and PM10, characterization of photochemical sensitivity, and air quality predictions under proposed regulatory scenarios.
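    A systematic performance evaluation of the kind anticipated above typically reports a small set of standard statistics. The sketch below computes a representative set on synthetic observation/model pairs (the values are invented, and the metric set is a common convention rather than the specific protocol of this study):

```python
import numpy as np

# Sketch of systematic performance statistics for modelled ozone/PM10
# against observations (synthetic values).

def performance(obs, mod):
    bias = np.mean(mod - obs)                  # mean bias
    rmse = np.sqrt(np.mean((mod - obs) ** 2))  # root-mean-square error
    nmb = bias / np.mean(obs)                  # normalised mean bias
    r = np.corrcoef(obs, mod)[0, 1]            # correlation coefficient
    return bias, rmse, nmb, r

obs = np.array([30.0, 45.0, 60.0, 52.0, 38.0])   # e.g. hourly ozone, ppb
mod = np.array([28.0, 50.0, 55.0, 58.0, 35.0])
bias, rmse, nmb, r = performance(obs, mod)
```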

  13. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example in making weather predictions based on climate models and in computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges, along with their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider the special challenges posed by scientific software, such as oracle problems, when developing testing techniques.
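    One concrete response to the oracle problem mentioned above is metamorphic testing: when the exact expected output is unknown, one instead checks relations that must hold between runs. A small sketch for a numerical integrator (the integrator and relations here are a generic illustration, not a method from the survey):

```python
import math

# Hedged sketch of metamorphic testing for code with no practical oracle:
# we cannot know the integral of exp(-x^2) exactly, but we can test
# relations that any correct integrator must satisfy.

def simpson(f, a, b, n=1000):
    """Composite Simpson integration (n must be even)."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

f = lambda x: math.exp(-x * x)

# Metamorphic relation 1: additivity over subintervals
whole = simpson(f, 0.0, 2.0)
parts = simpson(f, 0.0, 1.0) + simpson(f, 1.0, 2.0)

# Metamorphic relation 2: reversing the limits flips the sign
reversed_ = simpson(f, 2.0, 0.0)
```

    Neither relation needs the true value of the integral, which is exactly what makes the approach useful when the model and the code are hard to separate.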

  14. [Creativity and bipolar disorder].

    PubMed

    Maçkalı, Zeynep; Gülöksüz, Sinan; Oral, Timuçin

    2014-01-01

    The relationship between creativity and bipolar disorder has been an intriguing topic since ancient times. Early studies focused on describing the characteristics of creative people. From the last quarter of the twentieth century, researchers began to focus on the relationship between mood disorders and creativity. Initially, the studies were based on biographical texts, and the results indicated a relationship between the two concepts. The limitations of these retrospective studies led researchers to develop systematic investigations in this area. The systematic studies that focused on artistic creativity examined both the prevalence of mood disorders and the creative process. In addition, a group of researchers addressed the relationship in terms of affective temperaments. Toward the end of the 1990s, the scope of creativity was widened and the notion of everyday creativity was proposed. The emergence of this notion led researchers to investigate the associations of the creative process in ordinary (non-artist) individuals. In this review, descriptions of creativity and the creative process are presented, and the creative process is discussed with regard to bipolar disorder. The relationship between creativity and bipolar disorder is then evaluated in terms of the aforementioned studies (biographical, systematic, psychobiographical, and affective-temperament studies). In addition, a new model, the "Shared Vulnerability Model", developed to explain the relationship between creativity and psychopathology, is introduced. Finally, methodological limitations and suggestions for resolving them are discussed.

  15. Prospective systematic review registration: perspective from the Guidelines International Network (G-I-N).

    PubMed

    Van der Wees, Philip; Qaseem, Amir; Kaila, Minna; Ollenschlaeger, Guenter; Rosenfeld, Richard

    2012-02-09

    Clinical practice and public health guidelines are important tools for translating research findings into practice with the aim of assisting health practitioners as well as patients and consumers in health behavior and healthcare decision-making. Numerous programs for guideline development exist around the world, with growing international collaboration to improve their quality. One of the key features in developing trustworthy guidelines is that recommendations should be based on high-quality systematic reviews of the best available evidence. The review process used by guideline developers to identify and grade relevant evidence for developing recommendations should be systematic, transparent and unbiased. In this paper, we provide an overview of current international developments in the field of practice guidelines and methods to develop guidelines, with a specific focus on the role of systematic reviews. The Guidelines International Network (G-I-N) aims to stimulate collaboration between guideline developers and systematic reviewers to optimize the use of available evidence in guideline development and to increase efficiency in the guideline development process. Considering the significant benefit of systematic reviews for the guideline community, the G-I-N Board of Trustees supports the international prospective register of systematic reviews (PROSPERO) initiative. G-I-N also recently launched a Data Extraction Resource (GINDER) to present and share data extracted from individual studies in a standardized template. PROSPERO and GINDER are complementary tools to enhance collaboration between guideline developers and systematic reviewers to allow for alignment of activities and a reduction in duplication of effort.

  16. A strategy for understanding noise-induced annoyance

    NASA Astrophysics Data System (ADS)

    Fidell, S.; Green, D. M.; Schultz, T. J.; Pearsons, K. S.

    1988-08-01

    This report provides a rationale for development of a systematic approach to understanding noise-induced annoyance. Two quantitative models are developed to explain: (1) the prevalence of annoyance due to residential exposure to community noise sources; and (2) the intrusiveness of individual noise events. Both models deal explicitly with the probabilistic nature of annoyance, and assign clear roles to acoustic and nonacoustic determinants of annoyance. The former model provides a theoretical foundation for empirical dosage-effect relationships between noise exposure and community response, while the latter model differentiates between the direct and immediate annoyance of noise intrusions and response bias factors that influence the reporting of annoyance. The assumptions of both models are identified, and the nature of the experimentation necessary to test hypotheses derived from the models is described.
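    The probabilistic framing described above, an acoustic dosage-effect curve plus a nonacoustic response-bias term, can be sketched with a logistic dosage-response function. All coefficients below are hypothetical, chosen only to show how a bias term shifts reported annoyance at a fixed exposure:

```python
import math

# Illustrative-only sketch: prevalence of high annoyance as a logistic
# function of exposure (e.g. day-night average sound level), with a
# nonacoustic `bias` term shifting the reported response.

def prob_highly_annoyed(dnl_db, bias=0.0, slope=0.3, midpoint=72.0):
    """Logistic dosage-effect curve; `bias` models response-bias factors."""
    z = slope * (dnl_db - midpoint) + bias
    return 1.0 / (1.0 + math.exp(-z))

p55 = prob_highly_annoyed(55.0)               # quiet residential exposure
p75 = prob_highly_annoyed(75.0)               # high exposure
p75_biased = prob_highly_annoyed(75.0, bias=1.0)   # same dose, biased report
```

    The separation the report argues for is visible here: `slope` and `midpoint` are acoustic determinants, while `bias` captures the reporting effects that an empirical dosage-effect relationship would otherwise fold into the acoustic terms.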

  17. The presence of accessory cusps in chimpanzee lower molars is consistent with a patterning cascade model of development

    PubMed Central

    Skinner, Matthew M; Gunz, Philipp

    2010-01-01

    Tooth crown morphology is of primary importance in fossil primate systematics and understanding the developmental basis of its variation facilitates phenotypic analyses of fossil teeth. Lower molars of species in the chimp/human clade (including fossil hominins) possess between four and seven cusps and this variability has been implicated in alpha taxonomy and phylogenetic systematics. What is known about the developmental basis of variation in cusp number – based primarily on experimental studies of rodent molars – suggests that cusps form under a morphodynamic, patterning cascade model involving the iterative formation of enamel knots. In this study we test whether variation in cusp 6 (C6) presence in common chimpanzee and bonobo lower molars (n = 55) is consistent with predictions derived from the patterning cascade model. Using microcomputed tomography we imaged the enamel-dentine junction of lower molars and used geometric morphometrics to examine shape variation in the molar crown correlated with variation in C6 presence (in particular the size and spacing of the dentine horns). Results indicate that C6 presence is consistent with predictions of a patterning cascade model, with larger molars exhibiting a higher frequency of C6 and with the location and size of later-forming cusps correlated with C6 variation. These results demonstrate that a patterning cascade model is appropriate for interpreting cusp variation in Pan and have implications for cusp nomenclature and the use of accessory cusp morphology in primate systematics. PMID:20629983

  18. Calculation of the detection limit in radiation measurements with systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, J. M.; Russ, W.; Venkataraman, R.; Young, B. M.

    2015-06-01

    The detection limit (LD) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches, as pioneered by Currie, rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties, where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions, which accurately represents statistical and systematic uncertainties. Instead of a closed-form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.
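    The traditional closed-form result the abstract refers to can be written down directly. A minimal sketch of Currie's Gaussian approximation for a paired-blank measurement (counts domain); note this is exactly the formula that, per the paper, neglects systematic calibration uncertainty:

```python
import math

def currie_detection_limit(background_counts, k=1.645):
    """Currie's closed-form detection limit (in counts) for a paired
    blank measurement, assuming Gaussian statistics and 5% false-positive
    and false-negative rates (k = 1.645): L_D = k^2 + 2*k*sqrt(2*B)."""
    return k * k + 2.0 * k * math.sqrt(2.0 * background_counts)

# With the conventional k = 1.645 this reduces to the familiar
# L_D ≈ 2.71 + 4.65 * sqrt(B).
print(currie_detection_limit(100.0))
```

Converting from counts to activity divides by efficiency, yield, and count time; it is in those calibration factors that the systematic uncertainties discussed in the paper enter.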

  19. Omens of coupled model biases in the CMIP5 AMIP simulations

    NASA Astrophysics Data System (ADS)

    Găinuşă-Bogdan, Alina; Hourdin, Frédéric; Traore, Abdoul Khadre; Braconnot, Pascale

    2018-02-01

    Despite decades of efforts and improvements in the representation of processes as well as in model resolution, current global climate models still suffer from a set of important, systematic biases in sea surface temperature (SST), not much different from the previous generation of climate models. Many studies have looked at errors in the wind field, cloud representation or oceanic upwelling in coupled models to explain the SST errors. In this paper we highlight the relationship between latent heat flux (LH) biases in forced atmospheric simulations and the SST biases models develop in coupled mode, at the scale of the entire intertropical domain. By analyzing 22 pairs of forced atmospheric and coupled ocean-atmosphere simulations from the CMIP5 database, we show a systematic, negative correlation between the spatial patterns of these two biases. This link between forced and coupled bias patterns is also confirmed by two sets of dedicated sensitivity experiments with the IPSL-CM5A-LR model. The analysis of the sources of the atmospheric LH bias pattern reveals that the near-surface wind speed bias dominates the zonal structure of the LH bias and that the near-surface relative humidity dominates the east-west contrasts.
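    The "systematic, negative correlation between the spatial patterns of these two biases" reduces, operationally, to a pattern (Pearson) correlation of the two bias fields over the intertropical grid. A minimal pure-Python sketch with toy values (real analyses would area-weight the grid points):

```python
import math

def pattern_correlation(field_a, field_b):
    """Pearson correlation between two spatial bias patterns,
    each given as a flat list of grid-point values."""
    n = len(field_a)
    ma = sum(field_a) / n
    mb = sum(field_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(field_a, field_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in field_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in field_b))
    return cov / (sa * sb)

# Toy example: an LH bias pattern (W/m2) and an anticorrelated
# SST bias pattern (K), standing in for one forced/coupled model pair.
lh_bias  = [10.0, -5.0, 20.0, -15.0, 0.0]
sst_bias = [-0.5,  0.3, -1.1,   0.8, 0.1]
print(pattern_correlation(lh_bias, sst_bias))
```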

  20. Elevation Control on Vegetation Organization in a Semiarid Ecosystem in Central New Mexico

    NASA Astrophysics Data System (ADS)

    Nudurupati, S. S.; Istanbulluoglu, E.; Adams, J. M.; Hobley, D. E. J.; Gasparini, N. M.; Tucker, G. E.; Hutton, E. W. H.

    2015-12-01

    Many semiarid and desert ecosystems are characterized by patchy and dynamic vegetation. Topography plays a commanding role in vegetation patterns. It is observed that plant biomes and biodiversity vary systematically with slope and aspect, from shrublands at low desert elevations, to mixed grass/shrublands at mid elevations, to forests at high elevations. In this study, we investigate the role of elevation-dependent climatology on vegetation organization in a semiarid New Mexico catchment where elevation and hillslope aspect play a defining role in plant types. An ecohydrologic cellular automaton model developed within Landlab (a component-based modeling framework) is used. The model couples local vegetation dynamics (which simulate biomass production based on local soil moisture and potential evapotranspiration) with plant establishment and mortality based on competition for resources and space. This model is driven by elevation-dependent rainfall pulses and solar radiation. The domain is initialized with randomly assigned plant types, and the model parameters that couple plant response with soil moisture are systematically changed. Climate perturbation experiments are conducted to examine spatial vegetation organization and associated timescales. Model results reproduce elevation and aspect controls on observed vegetation patterns, indicating that the model captures necessary and sufficient conditions to explain these observed ecohydrological patterns.
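    The establishment/mortality logic of such a cellular automaton can be sketched in a few lines. This toy version is not the Landlab component itself: the fixed probabilities below are placeholders for the soil-moisture and evapotranspiration coupling the actual model resolves.

```python
import random

def step(grid, p_establish=0.3, p_mortality=0.1, rng=random):
    """One update of a toy vegetation cellular automaton: bare cells
    ('.') may be colonized by a random occupied 4-neighbor, and
    occupied cells die with a fixed probability."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == '.':
                neighbors = [grid[r + dr][c + dc]
                             for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                             if 0 <= r + dr < rows and 0 <= c + dc < cols
                             and grid[r + dr][c + dc] != '.']
                if neighbors and rng.random() < p_establish:
                    new[r][c] = rng.choice(neighbors)
            elif rng.random() < p_mortality:
                new[r][c] = '.'
    return new

rng = random.Random(42)
grid = [['.'] * 10 for _ in range(10)]
grid[5][5] = 'G'   # seed a single grass cell
for _ in range(50):
    grid = step(grid, rng=rng)
print('\n'.join(''.join(row) for row in grid))
```

In the study's model, elevation-dependent rainfall and radiation would modulate `p_establish` and `p_mortality` per cell and per plant type, which is how the aspect and elevation controls emerge.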

  1. Assessing pesticide risks to threatened and endangered species using population models: Findings and recommendations from a CropLife America Science Forum.

    PubMed

    Forbes, V E; Brain, R; Edwards, D; Galic, N; Hall, T; Honegger, J; Meyer, C; Moore, D R J; Nacci, D; Pastorok, R; Preuss, T G; Railsback, S F; Salice, C; Sibly, R M; Tenhumberg, B; Thorbek, P; Wang, M

    2015-07-01

    This brief communication reports on the main findings and recommendations from the 2014 Science Forum organized by CropLife America. The aim of the Forum was to gain a better understanding of the current status of population models and how they could be used in ecological risk assessments for threatened and endangered species potentially exposed to pesticides in the United States. The Forum panelists' recommendations are intended to assist the relevant government agencies with implementation of population modeling in future endangered species risk assessments for pesticides. The Forum included keynote presentations that provided an overview of current practices, highlighted the findings of a recent National Academy of Sciences report and its implications, reviewed the main categories of existing population models and the types of risk expressions that can be produced as model outputs, and provided examples of how population models are currently being used in different legislative contexts. The panel concluded that models developed for listed species assessments should provide quantitative risk estimates, incorporate realistic variability in environmental and demographic factors, integrate complex patterns of exposure and effects, and use baseline conditions that include present factors that have caused the species to be listed (e.g., habitat loss, invasive species) or have resulted in positive management action. 
Furthermore, the panel advocates for the formation of a multipartite advisory committee to provide best available knowledge and guidance related to model implementation and use, to address such needs as more systematic collection, digitization, and dissemination of data for listed species; consideration of the newest developments in good modeling practice; comprehensive review of existing population models and their applicability for listed species assessments; and development of case studies using a few well-tested models for particular species to demonstrate proof of concept. To advance our common goals, the panel recommends the following as important areas for further research and development: quantitative analysis of the causes of species listings to guide model development; systematic assessment of the relative role of toxicity versus other factors in driving pesticide risk; additional study of how interactions between density dependence and pesticides influence risk; and development of pragmatic approaches to assessing indirect effects of pesticides on listed species. © 2015 SETAC.
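    The stage-structured (matrix) population models at the center of these recommendations can be illustrated compactly. A minimal sketch of a three-stage projection; the vital rates below are hypothetical, chosen only to show the mechanics, and a pesticide effect would enter as a perturbation to survival or fecundity:

```python
def project(stages, years):
    """Project stage abundances forward with a 3-stage matrix model.
    Stages: [juvenile, subadult, adult]; rates are illustrative."""
    f = [0.0, 1.2, 3.0]            # per-capita fecundity by stage
    s12, s23, s33 = 0.4, 0.7, 0.8  # stage transitions / adult survival
    for _ in range(years):
        stages = [sum(fi * ni for fi, ni in zip(f, stages)),  # births
                  s12 * stages[0],                            # 1 -> 2
                  s23 * stages[1] + s33 * stages[2]]          # 2 -> 3, stay 3
    return stages

pop = project([100.0, 20.0, 5.0], 20)
print(pop, sum(pop))
```

Risk expressions of the kind the panel discusses (e.g., change in growth rate or quasi-extinction probability) come from comparing such projections with and without exposure effects, ideally with demographic and environmental stochasticity added.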

  2. Predictors of human immunodeficiency virus (HIV) infection in primary care: a systematic review protocol.

    PubMed

    Rumbwere Dube, Benhildah N; Marshall, Tom P; Ryan, Ronan P

    2016-09-20

    Antiretroviral therapies for human immunodeficiency virus are more effective if infected individuals are diagnosed early, before they have irreversible immunologic damage. A large proportion of patients who are diagnosed with HIV in the United Kingdom will have seen a general practitioner (GP) within the previous year. Determining the demographic and clinical characteristics of HIV-infected patients prior to diagnosis of HIV may be useful in identifying patients likely to be HIV positive in primary care. This could help inform a strategy of early HIV testing in primary care. This systematic review aims to identify characteristics of HIV-infected adults prior to diagnosis that could be used in a prediction model for early detection of HIV in primary care. The systematic review will search for literature, mainly observational (cohort and case-control) studies, with human participants aged 18 years and over. The exposures are demographic, socio-economic or clinical risk factors or characteristics associated with HIV infection. The comparison group will be patients with no risk factors, or no comparison group. The outcome is laboratory-confirmed HIV/AIDS infection. Evidence will be identified from electronic searches of online databases of EMBASE, MEDLINE, The Cochrane Library and grey literature search engines of Open Grey, Web of Science Conference Proceedings Citation Index and examination of reference lists from selected studies (reference searching). Two reviewers will be involved in quality assessment and data extraction of the review. A data extraction form will be developed to collate data from selected studies. A checklist for quality assessment will be adapted from the Scottish Intercollegiate Guidelines Network (SIGN). This systematic review will identify and consolidate existing scientific evidence on characteristics of HIV-infected individuals that could be used to inform decision-making in prognostic model development. PROSPERO CRD42016042427.

  3. A systematic narrative review of consumer-directed care for older people: implications for model development.

    PubMed

    Ottmann, Goetz; Allen, Jacqui; Feldman, Peter

    2013-11-01

    Consumer-directed care is increasingly becoming a mainstream option in community-based aged care. However, a systematic review describing how the current evaluation research translates into practice has not been published to date. This review aimed to systematically establish an evidence base of user preferences for and satisfaction with services associated with consumer-directed care programmes for older people. Twelve databases were searched, including MedLine, BioMed Central, Cinahl, Expanded Academic ASAP, PsychInfo, ProQuest, Age Line, Science Direct, Social Citation Index, Sociological Abstracts, Web of Science and the Cochrane Library. Google Scholar and Google were also searched. Eligible studies were those reporting on choice, user preferences and service satisfaction outcomes regarding a programme or model of home-based care in the United States or United Kingdom. This systematic narrative review retrieved literature published from January 1992 to August 2011. A total of 277 references were identified. Of these, 17 met the selection criteria and were reviewed. Findings indicate that older people report varying preferences for consumer-directed care, with some demonstrating limited interest. Clients and carers reported good service satisfaction. However, research comparing user preferences across countries or investigating how ecological factors shape user preferences has received limited attention. Policy-makers and practitioners need to carefully consider the diverse contexts, needs and preferences of older adults in adopting consumer-directed care approaches in community aged care. The review calls for the development of consumer-directed care programmes offering a broad range of options that allow for personalisation and greater control over services without necessarily transferring administrative responsibilities to service users.
Review findings suggest that consumer-directed care approaches have the potential to empower older people. © 2013 Blackwell Publishing Ltd.

  4. Assessing the performance of community-available global MHD models using key system parameters and empirical relationships

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.

    2015-12-01

    Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and predictions. There are several advanced and still developing global MHD (GMHD) models that are publicly available via Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows the users to simulate the magnetospheric response to different solar wind conditions including extraordinary events, like geomagnetic storms. Systematic validation of GMHD models against observations still continues to be a challenge, as well as comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored/tested, which are produced by (ii) specially designed set of computer runs to simulate realistic statistical distributions of critical solar wind parameters and are compared to (iii) observation-based empirical relationships for these parameters. Being tested in approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) which are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which the MHD is supposed to be a valid approach. At the same time, the models have systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after the north-south interplanetary magnetic field turning. According to validation results, none of the models emerges as an absolute leader. 
    The new approach suggested for evaluating the models' performance against reality may be used by model users when planning their investigations, as well as by model developers and those interested in quantitatively evaluating progress in magnetospheric modeling.
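    A concrete instance of an observation-based empirical relationship used in such validations is the Shue et al. (1998) magnetopause subsolar standoff distance, which depends only on solar wind dynamic pressure and IMF Bz. A sketch of scoring a (hypothetical) GMHD model value against it:

```python
import math

def shue_standoff_re(bz_nt, pdyn_npa):
    """Empirical magnetopause subsolar standoff distance (Earth radii)
    from the Shue et al. (1998) relationship; inputs are IMF Bz (nT)
    and solar wind dynamic pressure (nPa)."""
    r0 = 10.22 + 1.29 * math.tanh(0.184 * (bz_nt + 8.14))
    return r0 * pdyn_npa ** (-1.0 / 6.6)

# Compare a hypothetical model output against the empirical value
# for nominal solar wind conditions.
model_standoff_re = 10.8   # placeholder model result, not from any CCMC run
empirical = shue_standoff_re(bz_nt=0.0, pdyn_npa=2.0)
print(f"empirical: {empirical:.2f} Re, model bias: "
      f"{model_standoff_re - empirical:+.2f} Re")
```

Repeating such comparisons over a statistically representative set of driving conditions, rather than a single event, is the essence of the benchmarking approach described in the abstract.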

  5. A systematic methodology for the robust quantification of energy efficiency at wastewater treatment plants featuring Data Envelopment Analysis.

    PubMed

    Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M

    2018-05-10

    This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessments of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function …), which limits the correct application of DEA. This paper proposes and describes, in its various stages, the Robust Energy Efficiency DEA (REED), a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guide to the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
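    In the simplest DEA setting, constant returns to scale with a single input (energy consumed) and a single output (wastewater treated), each plant's efficiency score reduces to its output/input ratio normalized by the best ratio in the sample. This sketch covers only that degenerate case; the multi-input, multi-output DEA used in REED requires solving one linear program per plant, and the plant figures below are made up:

```python
def dea_ccr_efficiency(inputs, outputs):
    """CRS (CCR) efficiency scores for the single-input, single-output
    case: each unit's productivity relative to the best performer.
    Scores lie in (0, 1], with 1 marking the efficiency frontier."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Toy WWTPs: energy consumed (kWh/day) vs. wastewater treated (m3/day).
energy = [1200.0, 800.0, 1500.0]
treated = [6000.0, 5200.0, 6000.0]
print(dea_ccr_efficiency(energy, treated))
```

Handling exogenous variables such as climate, which is REED's focus, is precisely what this bare-bones ratio form cannot do.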

  6. The Systematic Development of an Internet-Based Smoking Cessation Intervention for Adults.

    PubMed

    Dalum, Peter; Brandt, Caroline Lyng; Skov-Ettrup, Lise; Tolstrup, Janne; Kok, Gerjo

    2016-07-01

    Objectives The objective of this project was to determine whether intervention mapping is a suitable strategy for developing an Internet- and text message-based smoking cessation intervention. Method We used the Intervention Mapping framework for planning health promotion programs. After a needs assessment, we identified important changeable determinants of cessation behavior, specified objectives for the intervention, selected theoretical methods for meeting our objectives, and operationalized change methods into practical intervention strategies. Results We found that "social cognitive theory," the "transtheoretical model/stages of change," "self-regulation theory," and "appreciative inquiry" were relevant theories for smoking cessation interventions. From these theories, we selected modeling/behavioral journalism, feedback, planning coping responses/if-then statements, gain frame/positive imaging, consciousness-raising, helping relationships, stimulus control, and goal-setting as suitable methods for an Internet- and text-based adult smoking cessation program. Furthermore, we identified computer tailoring as a useful strategy for adapting the intervention to individual users. Conclusion The Intervention Mapping method, with a clear link between behavioral goals, theoretical methods, and practical strategies and materials, proved useful for systematic development of a digital smoking cessation intervention for adults. © 2016 Society for Public Health Education.

  7. The clinical effectiveness and cost-effectiveness of testing for cytochrome P450 polymorphisms in patients with schizophrenia treated with antipsychotics: a systematic review and economic evaluation.

    PubMed

    Fleeman, N; McLeod, C; Bagust, A; Beale, S; Boland, A; Dundar, Y; Jorgensen, A; Payne, K; Pirmohamed, M; Pushpakom, S; Walley, T; de Warren-Penny, P; Dickson, R

    2010-01-01

    To determine whether testing for cytochrome P450 (CYP) polymorphisms in adults entering antipsychotic treatment for schizophrenia leads to improvement in outcomes, is useful in medical, personal or public health decision-making, and is a cost-effective use of health-care resources. The following electronic databases were searched for relevant published literature: Cochrane Controlled Trials Register, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effectiveness, EMBASE, Health Technology Assessment database, ISI Web of Knowledge, MEDLINE, PsycINFO, NHS Economic Evaluation Database, Health Economic Evaluation Database, Cost-effectiveness Analysis (CEA) Registry and the Centre for Health Economics website. In addition, publicly available information on various genotyping tests was sought from the internet and advisory panel members. A systematic review of analytical validity, clinical validity and clinical utility of CYP testing was undertaken. Data were extracted into structured tables and narratively discussed, and meta-analysis was undertaken when possible. A review of economic evaluations of CYP testing in psychiatry and a review of economic models related to schizophrenia were also carried out. For analytical validity, 46 studies of a range of different genotyping tests for 11 different CYP polymorphisms (most commonly CYP2D6) were included. Sensitivity and specificity were high (99-100%). For clinical validity, 51 studies were found. In patients tested for CYP2D6, an association between genotype and tardive dyskinesia (including Abnormal Involuntary Movement Scale scores) was found. The only other significant finding linked the CYP2D6 genotype to parkinsonism. One small unpublished study met the inclusion criteria for clinical utility. 
One economic evaluation assessing the costs and benefits of CYP testing for prescribing antidepressants and 28 economic models of schizophrenia were identified; none was suitable for developing a model to examine the cost-effectiveness of CYP testing. Tests for determining genotypes appear to be accurate although not all aspects of analytical validity were reported. Given the absence of convincing evidence from clinical validity studies, the lack of clinical utility and economic studies, and the unsuitability of published schizophrenia models, no model was developed; instead key features and data requirements for economic modelling are presented. Recommendations for future research cover both aspects of research quality and data that will be required to inform the development of future economic models.

  8. A Systematic Review of Conceptual Frameworks of Medical Complexity and New Model Development.

    PubMed

    Zullig, Leah L; Whitson, Heather E; Hastings, Susan N; Beadles, Chris; Kravchenko, Julia; Akushevich, Igor; Maciejewski, Matthew L

    2016-03-01

    Patient complexity is often operationalized by counting multiple chronic conditions (MCC) without considering contextual factors that can affect patient risk for adverse outcomes. Our objective was to develop a conceptual model of complexity addressing gaps identified in a review of published conceptual models. We searched for English-language MEDLINE papers published between 1 January 2004 and 16 January 2014. Two reviewers independently evaluated abstracts and all authors contributed to the development of the conceptual model in an iterative process. From 1606 identified abstracts, six conceptual models were selected. One additional model was identified through reference review. Each model had strengths, but several constructs were not fully considered: 1) contextual factors; 2) dynamics of complexity; 3) patients' preferences; 4) acute health shocks; and 5) resilience. Our Cycle of Complexity model illustrates relationships between acute shocks and medical events, healthcare access and utilization, workload and capacity, and patient preferences in the context of interpersonal, organizational, and community factors. This model may inform studies on the etiology of and changes in complexity, the relationship between complexity and patient outcomes, and intervention development to improve modifiable elements of complex patients.

  9. Knowledge discovery in cardiology: A systematic literature review.

    PubMed

    Kadi, I; Idri, A; Fernandez-Aleman, J L

    2017-01-01

    Data mining (DM) provides the methodology and technology needed to transform huge amounts of data into useful information for decision making. It is a powerful process employed to extract knowledge and discover new patterns embedded in large data sets. Data mining has been increasingly used in medicine, particularly in cardiology. In fact, DM applications can greatly benefit all those involved in cardiology, such as patients, cardiologists and nurses. The purpose of this paper is to review papers concerning the application of DM techniques in cardiology so as to summarize and analyze evidence regarding: (1) the DM techniques most frequently used in cardiology; (2) the performance of DM models in cardiology; (3) comparisons of the performance of different DM models in cardiology. We performed a systematic literature review of empirical studies on the application of DM techniques in cardiology published in the period between 1 January 2000 and 31 December 2015. A total of 149 articles published between 2000 and 2015 were selected, studied and analyzed according to the following criteria: DM techniques and performance of the approaches developed. The results obtained showed that a significant number of the studies selected used classification and prediction techniques when developing DM models. Neural networks, decision trees and support vector machines were identified as being the techniques most frequently employed when developing DM models in cardiology. Moreover, neural networks and support vector machines achieved the highest accuracy rates and were proved to be more efficient than other techniques. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
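    As a toy illustration of the classification techniques the review surveys (the reviewed studies chiefly use neural networks, decision trees, and support vector machines), here is a self-contained nearest-neighbour classifier on made-up feature vectors; the features and labels are hypothetical:

```python
import math

def nearest_neighbor_predict(train_X, train_y, x):
    """Classify x with the label of its closest training point (1-NN)."""
    best = min(range(len(train_X)), key=lambda i: math.dist(train_X[i], x))
    return train_y[best]

# Hypothetical two-feature cardiology records: [age, cholesterol] -> risk.
X = [[45, 180], [62, 260], [50, 200], [70, 300]]
y = ["low", "high", "low", "high"]
print(nearest_neighbor_predict(X, y, [65, 270]))
```

In practice, features would be scaled before distance computation and model accuracy estimated by cross-validation, which is how the performance comparisons summarized in the review are made.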

  10. Systematic iteration between model and methodology: A proposed approach to evaluating unintended consequences.

    PubMed

    Morell, Jonathan A

    2018-06-01

    This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. This is followed by a discussion of various issues relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Systematic Eclectic Models.

    ERIC Educational Resources Information Center

    Mahalik, James R.

    1990-01-01

    Presents and evaluates four systematic eclectic models of psychotherapy: Beutler's eclectic psychotherapy; Howard, Nance, and Myers' adaptive counseling and therapy; Lazarus' multimodal therapy; and Prochaska and DiClemente's transtheoretical approach. Examines support for these models and makes conceptual and empirical recommendations.…

  12. MAVEN-SA: Model-Based Automated Visualization for Enhanced Situation Awareness

    DTIC Science & Technology

    2005-11-01

    methods. But historically, as arts evolve, these how-to methods become systematized and codified (e.g., the development and refinement of color theory) ... schema (as necessary); 3. Draw inferences from new knowledge to support the decision-making process ... Visual language theory suggests that humans process ... informed by theories of learning. Over the years, many types of software have been developed to support student learning. The various types of

  13. Developing Learning Objectives for a Model Course to Prepare Adults for the Assessment of Prior, Non-Sponsored Learning by Portfolio Evaluation.

    ERIC Educational Resources Information Center

    Stevens, Mary A.

    A study was conducted in order to develop a systematic method for the evaluation of students' prior, non-sponsored learning for the award of college credit at Blackhawk College (Illinois). It was determined that a course designed to prepare the student for assessment of prior learning was the best way for the institution to provide assistance to…

  14. Drug and alcohol treatment providers' views about the disease model of addiction and its impact on clinical practice: A systematic review.

    PubMed

    Barnett, Anthony I; Hall, Wayne; Fry, Craig L; Dilkes-Frayne, Ella; Carter, Adrian

    2017-12-14

    Addiction treatment providers' views about the disease model of addiction (DMA), and their contemporary views about the brain disease model of addiction (BDMA), remain an understudied area. We systematically reviewed treatment providers' attitudes about the DMA/BDMA, examined factors associated with positive or negative attitudes and assessed their views on the potential clinical impact of both models. Pubmed, EMBASE, PsycINFO, CINAHL Plus and Sociological Abstracts were systematically searched. Original papers on treatment providers' views about the DMA/BDMA and its clinical impact were included. Studies focussing on tobacco, behavioural addictions or non-Western populations were excluded. The 34 included studies were predominantly quantitative and conducted in the USA. Among mixed findings of treatment providers' support for the DMA, strong validity studies indicated treatment providers supported the disease concept and moral, free-will or social models simultaneously. Support for the DMA was positively associated with treatment providers' age, year of qualification, certification status, religious beliefs, being in recovery and Alcoholics Anonymous attendance. Greater education was negatively associated with DMA support. Treatment providers identified potential positive (e.g. reduced stigma) and negative (e.g. increased sense of helplessness) impacts of the DMA on client behaviour. The review suggests treatment providers may endorse disease and other models while strategically deploying the DMA for presumed therapeutic benefits. Varying DMA support across workforces indicated service users may experience multiple and potentially contradictory explanations of addiction. Future policy development will benefit by considering how treatment providers adopt disease concepts in practice. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  15. Developing parenting programs to prevent child health risk behaviors: a practice model

    PubMed Central

    Jackson, Christine; Dickinson, Denise M.

    2009-01-01

    Research indicates that developing public health programs to modify parenting behaviors could lead to multiple beneficial health outcomes for children. Developing feasible effective parenting programs requires an approach that applies a theory-based model of parenting to a specific domain of child health and engages participant representatives in intervention development. This article describes this approach to intervention development in detail. Our presentation emphasizes three points that provide key insights into the goals and procedures of parenting program development. These are a generalized theoretical model of parenting derived from the child development literature, an established eight-step parenting intervention development process and an approach to integrating experiential learning methods into interventions for parents and children. By disseminating this framework for a systematic theory-based approach to developing parenting programs, we aim to support the program development efforts of public health researchers and practitioners who recognize the potential of parenting programs to achieve primary prevention of health risk behaviors in children. PMID:19661165

  16. Propellant injection systems and processes

    NASA Technical Reports Server (NTRS)

    Ito, Jackson I.

    1995-01-01

    The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and ultimately analytical modeling based upon basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is to be able to rank candidate design concepts for relative probability of success or technical risk in all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed to predict satisfying all requirements simultaneously, a series of risk mitigation key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design discriminating test plans can be developed based on the physical insight provided by these analyses.

  17. Anorexia nervosa and cancer: a protocol for a systematic review and meta-analysis of observational studies.

    PubMed

    Catalá-López, Ferrán; Hutton, Brian; Driver, Jane A; Ridao, Manuel; Valderas, José M; Gènova-Maleras, Ricard; Forés-Martos, Jaume; Alonso-Arroyo, Adolfo; Saint-Gerons, Diego Macías; Vieta, Eduard; Valencia, Alfonso; Tabarés-Seisdedos, Rafael

    2017-07-11

    Anorexia nervosa is characterized by a severe restriction of caloric intake, low body weight, fear of gaining weight or of becoming fat, and disturbance of body image. Pathogenesis of the disorder may include genetic predisposition, hormonal changes and a combination of environmental, psychosocial, and cultural factors. Cancer is the second leading cause of death worldwide. At present, no systematic reviews and meta-analyses have evaluated the risk of cancer in people with anorexia nervosa. The objective of this study will be to evaluate the association between anorexia nervosa and the risk of developing or dying from cancer. This study protocol is part of a systematic collection and assessment of multiple systematic reviews and meta-analyses (umbrella review) evaluating the association of cancer and multiple central nervous system disorders. We designed a specific protocol for a new systematic review and meta-analysis of observational studies of anorexia nervosa with risk of developing or dying from any cancer. Data sources will be PubMed, Embase, Scopus, Web of Science, and manual screening of references. Observational studies (case-control and cohort) in humans that examined the association between anorexia nervosa and risk of developing or dying from cancer will be sought. The primary outcomes will be cancer incidence and cancer mortality in association with anorexia nervosa. Secondary outcomes will be site-specific cancer incidence and mortality, respectively. Screening of abstracts and full texts, and data abstraction will be performed by two team members independently. Conflicts at all levels of screening and abstraction will be resolved through discussion. The quality of studies will be assessed by using the Newcastle-Ottawa scale by two team members independently. Random-effects models will be fitted where appropriate. Subgroup and additional analyses will be conducted to explore the potential sources of heterogeneity.
The World Cancer Research Fund (WCRF)/American Institute for Cancer Research (AICR) criteria and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach will be used for determining the quality of evidence for cancer outcomes. Findings from this systematic review will inform an ongoing umbrella review on cancer and central nervous system disorders. Our systematic review and meta-analysis of observational studies will establish the extent of the epidemiological evidence underlying the association between anorexia nervosa and cancer. PROSPERO CRD42017067462.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youkhana, Adel H.; Ogoshi, Richard M.; Kiniry, James R.

    Biomass is a promising renewable energy option that provides a more environmentally sustainable alternative to fossil resources by reducing the net flux of greenhouse gases to the atmosphere. Yet, allometric models that allow the non-destructive prediction of aboveground biomass (AGB) and biomass carbon (C) stock have not yet been developed for tropical perennial C4 grasses currently under consideration as potential bioenergy feedstock in Hawaii and other subtropical and tropical locations. The objectives of this study were to develop optimal allometric relationships and site-specific models to predict AGB and biomass C stock of napiergrass, energycane, and sugarcane under cultivation practices for renewable energy, and to validate these site-specific models against independent data sets generated from sites with widely different environments. Several allometric models were developed for each species from data at a low elevation field on the island of Maui, Hawaii. A simple power model with stalk diameter (D) was best related to AGB and biomass C stock for napiergrass, energycane, and sugarcane (R2 = 0.98, 0.96, and 0.97, respectively). The models were then tested against data collected from independent fields across an environmental gradient. For all crops, the models over-predicted AGB in plants with lower stalk D, but AGB was under-predicted in plants with higher stalk D. The models using stalk D were better for biomass prediction than dewlap height (H, measured from the base cut to the most recently exposed leaf dewlap) models, which showed weak validation performance. Although the stalk D model performed better, its systematic component of mean square error (MSE) ranged from 23 to 43% of total MSE for all crops. A strong relationship existed between the model coefficient and rainfall, even though these were irrigated systems, suggesting a simple site-specific rainfall modulator for the coefficient to reduce systematic errors in water-limited areas.
These allometric equations provide a tool for farmers in the tropics to estimate perennial C4 grass biomass and C stock during decision-making for land management and as an environmental sustainability indicator within a renewable energy system.
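
    The simple power model reported above (AGB as a power function of stalk diameter D) is linear in log-log space, so it can be fitted by ordinary least squares. A minimal sketch with made-up diameter/biomass pairs; the data and coefficients here are illustrative, not the study's Maui calibration:

```python
import numpy as np

# Hypothetical stalk-diameter (cm) and aboveground-biomass (kg) pairs;
# real coefficients would come from destructive field sampling.
D = np.array([1.2, 1.8, 2.3, 2.9, 3.5])
agb = np.array([0.35, 0.90, 1.70, 2.95, 4.60])

# The power model AGB = a * D**b becomes linear after a log transform:
# ln(AGB) = ln(a) + b * ln(D).
b, ln_a = np.polyfit(np.log(D), np.log(agb), 1)
a = np.exp(ln_a)

def predict_agb(diameter):
    """Predict aboveground biomass (kg) from stalk diameter (cm)."""
    return a * diameter ** b

# Coefficient of determination in the original (untransformed) units.
ss_res = np.sum((agb - predict_agb(D)) ** 2)
ss_tot = np.sum((agb - agb.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

    A site-specific rainfall modulator, as the authors suggest, would amount to scaling the fitted coefficient per site.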

  19. AAA gunner model based on observer theory [predicting a gunner's tracking response]

    NASA Technical Reports Server (NTRS)

    Kou, R. S.; Glass, B. C.; Day, C. N.; Vikmanis, M. M.

    1978-01-01

    The Luenberger observer theory is used to develop a predictive model of a gunner's tracking response in antiaircraft artillery systems. This model is composed of an observer, a feedback controller and a remnant element. An important feature of the model is that the structure is simple, hence a computer simulation requires only a short execution time. A parameter identification program based on the least squares curve fitting method and the Gauss-Newton gradient algorithm is developed to determine the parameter values of the gunner model. Thus, a systematic procedure exists for identifying model parameters for a given antiaircraft tracking task. Model predictions of tracking errors are compared with human tracking data obtained from manned simulation experiments. Model predictions are in excellent agreement with the empirical data for several flyby and maneuvering target trajectories.
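
    The least-squares/Gauss-Newton identification step described above can be sketched on a toy problem; the first-order response form y = K(1 - exp(-t/tau)) and all values are illustrative stand-ins, not the gunner model itself:

```python
import numpy as np

# Synthetic "tracking" data from a toy first-order response model.
t = np.linspace(0.1, 5.0, 40)
K_true, tau_true = 2.0, 0.8
y = K_true * (1 - np.exp(-t / tau_true))

def residuals(p):
    K, tau = p
    return y - K * (1 - np.exp(-t / tau))

def jacobian(p):
    """Partial derivatives of the residuals w.r.t. (K, tau)."""
    K, tau = p
    e = np.exp(-t / tau)
    return np.column_stack([-(1 - e), K * e * t / tau**2])

# Gauss-Newton iteration: repeatedly solve the linearized least-squares
# problem J @ step = r and update the parameter estimate.
p = np.array([1.8, 1.0])  # initial guess
for _ in range(20):
    J, r = jacobian(p), residuals(p)
    step = np.linalg.lstsq(J, r, rcond=None)[0]
    p = p - step
```

    On noiseless data with a reasonable starting guess, the iteration recovers (K, tau) to high precision in a handful of steps.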

  20. Systematic assignment of thermodynamic constraints in metabolic network models

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    Background The availability of genome sequences for many organisms enabled the reconstruction of several genome-scale metabolic network models. Currently, significant efforts are put into the automated reconstruction of such models. For this, several computational tools have been developed that particularly assist in identifying and compiling the organism-specific lists of metabolic reactions. In contrast, the last step of the model reconstruction process, which is the definition of the thermodynamic constraints in terms of reaction directionalities, still needs to be done manually. No computational method exists that allows for an automated and systematic assignment of reaction directions in genome-scale models. Results We present an algorithm that – based on thermodynamics, network topology and heuristic rules – automatically assigns reaction directions in metabolic models such that the reaction network is thermodynamically feasible with respect to the production of energy equivalents. It first exploits all available experimentally derived Gibbs energies of formation to identify irreversible reactions. As these thermodynamic data are not available for all metabolites, in a next step, further reaction directions are assigned on the basis of network topology considerations and thermodynamics-based heuristic rules. Briefly, the algorithm identifies reaction subsets from the metabolic network that are able to convert low-energy co-substrates into their high-energy counterparts and thus net produce energy. Our algorithm aims at disabling such thermodynamically infeasible cyclic operation of reaction subnetworks by assigning reaction directions based on a set of thermodynamics-derived heuristic rules. We demonstrate our algorithm on a genome-scale metabolic model of E. coli. 
The introduced systematic direction assignment yielded 130 irreversible reactions (out of 920 total reactions), which corresponds to about 70% of all irreversible reactions that are required to disable thermodynamically infeasible energy production. Conclusion Although not fully comprehensive, our algorithm for systematic reaction direction assignment could define a significant number of irreversible reactions automatically with low computational effort. We envision that the presented algorithm is a valuable part of a computational framework that assists the automated reconstruction of genome-scale metabolic models. PMID:17123434
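
    The algorithm's first step (flagging a reaction as irreversible when its standard Gibbs energy change, computed from formation energies, is clearly negative) can be sketched as follows; the formation energies and threshold are illustrative placeholders, not curated thermodynamic data:

```python
# Standard Gibbs energies of formation, kJ/mol (illustrative values only).
dGf = {"glucose": -917.2, "atp": -2292.6, "adp": -1906.1, "g6p": -1320.1}

# Margin (kJ/mol) guarding against uncertainty in the formation energies.
THRESHOLD = -10.0

def reaction_dG(stoich):
    """Gibbs energy change of a reaction; stoich maps metabolite ->
    coefficient (negative for substrates, positive for products)."""
    return sum(coeff * dGf[m] for m, coeff in stoich.items())

# Hexokinase: glucose + ATP -> G6P + ADP
hexokinase = {"glucose": -1, "atp": -1, "g6p": 1, "adp": 1}
dG = reaction_dG(hexokinase)
irreversible = dG < THRESHOLD
```

    Reactions left unassigned by this step would then fall through to the topology-based heuristic rules described in the abstract.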

  1. Prevention and assessment of infectious diseases among children and adult migrants arriving to the European Union/European Economic Area: a protocol for a suite of systematic reviews for public health and health systems.

    PubMed

    Pottie, Kevin; Mayhew, Alain D; Morton, Rachael L; Greenaway, Christina; Akl, Elie A; Rahman, Prinon; Zenner, Dominik; Pareek, Manish; Tugwell, Peter; Welch, Vivian; Meerpohl, Joerg; Alonso-Coello, Pablo; Hui, Charles; Biggs, Beverley-Ann; Requena-Méndez, Ana; Agbata, Eric; Noori, Teymur; Schünemann, Holger J

    2017-09-11

    The European Centre for Disease Prevention and Control is developing evidence-based guidance for voluntary screening, treatment and vaccine prevention of infectious diseases for newly arriving migrants to the European Union/European Economic Area. The objective of this systematic review protocol is to guide the identification, appraisal and synthesis of the best available evidence on prevention and assessment of the following priority infectious diseases: tuberculosis, HIV, hepatitis B, hepatitis C, measles, mumps, rubella, diphtheria, tetanus, pertussis, poliomyelitis (polio), Haemophilus influenzae disease, strongyloidiasis and schistosomiasis. The search strategy will identify evidence from existing systematic reviews and then update the effectiveness and cost-effectiveness evidence using prospective trials, economic evaluations and/or recently published systematic reviews. Interdisciplinary teams have designed logic models to help define study inclusion and exclusion criteria, guiding the search strategy and identifying relevant outcomes. We will assess the certainty of evidence using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. There are no ethical or safety issues. We anticipate disseminating the findings through open-access publications, conference abstracts and presentations. We plan to publish technical syntheses as GRADEpro evidence summaries and the systematic reviews as part of a special edition open-access publication on refugee health. We are following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Protocols reporting guideline. This protocol is registered in PROSPERO: CRD42016045798. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Collaborative Practice Model: Improving the Delivery of Bad News.

    PubMed

    Bowman, Pamela N; Slusser, Kim; Allen, Deborah

    2018-02-01

    Ideal bad news delivery requires skilled communication and team support. The literature has primarily focused on patient preferences, impact on care decisions, healthcare roles, and communication styles, without addressing systematic implementation. This article describes how an interdisciplinary team, led by advanced practice nurses, developed and implemented a collaborative practice model to deliver bad news on a unit that had struggled with inconsistencies. Using evidence-based practices, the authors explored current processes, role perceptions and expectations, and perceived barriers to developing the model, which is now the standard of care and an example of interprofessional team collaboration across the healthcare system. This model for delivering bad news can be easily adapted to meet the needs of other clinical units.

  3. Comparing models of helper behavior to actual practice in telephone crisis intervention: a Silent Monitoring Study of Calls to the U.S. 1-800-SUICIDE Network.

    PubMed

    Mishara, Brian L; Chagnon, François; Daigle, Marc; Balan, Bogdan; Raymond, Sylvaine; Marcoux, Isabelle; Bardon, Cécile; Campbell, Julie K; Berman, Alan

    2007-06-01

    Models of telephone crisis intervention in suicide prevention and best practices were developed from a literature review and surveys of crisis centers. We monitored 2,611 calls to 14 centers using reliable behavioral ratings to compare actual interventions with the models. Active listening and collaborative problem-solving models describe help provided. Centers vary greatly in the nature of interventions and their quality according to predetermined criteria. Helpers do not systematically assess suicide risk. Some lives may have been saved but occasionally unacceptable responses occur. Recommendations include the need for quality assurance, development of standardized practices and research relating intervention processes to outcomes.

  4. A systematic review of the psychological and social benefits of participation in sport for children and adolescents: informing development of a conceptual model of health through sport

    PubMed Central

    2013-01-01

    Background There are specific guidelines regarding the level of physical activity (PA) required to provide health benefits. However, the research underpinning these PA guidelines does not address the element of social health. Furthermore, there is insufficient evidence about the levels or types of PA associated specifically with psychological health. This paper first presents the results of a systematic review of the psychological and social health benefits of participation in sport by children and adolescents. Secondly, the information arising from the systematic review has been used to develop a conceptual model. Methods A systematic review of 14 electronic databases was conducted in June 2012, and studies published since 1990 were considered for inclusion. Studies that addressed mental and/or social health benefits from participation in sport were included. Results A total of 3668 publications were initially identified, of which 30 met the selection criteria. Many different psychological and social health benefits were reported, the most common being improved self-esteem and social interaction, followed by fewer depressive symptoms. Sport may be associated with improved psychosocial health above and beyond improvements attributable to participation in PA. Specifically, team sport seems to be associated with improved health outcomes compared to individual activities, due to the social nature of the participation. A conceptual model, Health through Sport, is proposed. The model depicts the relationship between psychological, psychosocial and social health domains, and their positive associations with sport participation, as reported in the literature. However, it is acknowledged that the capacity to determine the existence and direction of causal links between participation and health is limited by the fact that the majority of studies identified (n=21) were cross-sectional.
Conclusion It is recommended that community sport participation is advocated as a form of leisure time PA for children and adolescents, in an effort to not only improve physical health in relation to such matters as the obesity crisis, but also to enhance psychological and social health outcomes. It is also recommended that the causal link between participation in sport and psychosocial health be further investigated and the conceptual model of Health through Sport tested. PMID:23945179

  5. A systematic multiscale modeling and experimental approach to protect grain boundaries in magnesium alloys from corrosion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horstemeyer, Mark R.; Chaudhuri, Santanu

    2015-09-30

    A multiscale modeling Internal State Variable (ISV) constitutive model was developed that captures the fundamental structure-property relationships. The macroscale ISV model used lower length scale simulations (Butler-Volmer and electronic structure results) in order to inform the ISVs at the macroscale. The chemomechanical ISV model was calibrated and validated from experiments with magnesium (Mg) alloys that were investigated under corrosive environments coupled with experimental electrochemical studies. Because the ISV chemomechanical model is physically based, it can be used for other material systems to predict corrosion behavior. As such, others can use the chemomechanical model for analyzing corrosion effects on their designs.

  6. Development of the Rice Convection Model as a Space Weather Tool

    DTIC Science & Technology

    2015-05-31

    ...coupled to the ionosphere that is suitable both for scientific studies and as a prediction tool. We are able to run the model faster than “real... of work by finding ways to fund a more systematic effort in making the RCM a space weather prediction tool for magnetospheric and ionospheric studies... Keywords: convection electric field, total electron content (TEC), ionospheric convection, plasmasphere.

  7. Using Instructional Design, Analyze, Design, Develop, Implement, and Evaluate, to Develop e-Learning Modules to Disseminate Supported Employment for Community Behavioral Health Treatment Programs in New York State.

    PubMed

    Patel, Sapana R; Margolies, Paul J; Covell, Nancy H; Lipscomb, Cristine; Dixon, Lisa B

    2018-01-01

    Implementation science lacks a systematic approach to the development of learning strategies for online training in evidence-based practices (EBPs) that takes the context of real-world practice into account. The field of instructional design offers ecologically valid and systematic processes to develop learning strategies for workforce development and performance support. This report describes the application of an instructional design framework-Analyze, Design, Develop, Implement, and Evaluate (ADDIE) model-in the development and evaluation of e-learning modules as one strategy among a multifaceted approach to the implementation of individual placement and support (IPS), a model of supported employment for community behavioral health treatment programs, in New York State. We applied quantitative and qualitative methods to develop and evaluate three IPS e-learning modules. Throughout the ADDIE process, we conducted formative and summative evaluations and identified determinants of implementation using the Consolidated Framework for Implementation Research (CFIR). Formative evaluations consisted of qualitative feedback received from recipients and providers during early pilot work. The summative evaluation consisted of levels 1 and 2 (reaction to the training, self-reported knowledge, and practice change) quantitative and qualitative data and was guided by the Kirkpatrick model for training evaluation. Formative evaluation with key stakeholders identified a range of learning needs that informed the development of a pilot training program in IPS. Feedback on this pilot training program informed the design document of three e-learning modules on IPS: Introduction to IPS, IPS Job Development, and Using the IPS Employment Resource Book. Each module was developed iteratively and provided an assessment of learning needs that informed successive modules. All modules were disseminated and evaluated through a learning management system.
Summative evaluation revealed that learners rated the modules positively, and self-report of knowledge acquisition was high (mean range: 4.4-4.6 out of 5). About half of learners indicated that they would change their practice after watching the modules (range: 48-51%). All learners who completed the level 1 evaluation demonstrated 80% or better mastery of knowledge on the level 2 evaluation embedded in each module. The CFIR was used to identify implementation barriers and facilitators among the evaluation data which facilitated planning for subsequent implementation support activities in the IPS initiative. Instructional design approaches such as ADDIE may offer implementation scientists and practitioners a flexible and systematic approach for the development of e-learning modules as a single component or one strategy in a multifaceted approach for training in EBPs.

  8. Applicability and feasibility of systematic review for performing evidence-based risk assessment in food and feed safety.

    PubMed

    Aiassa, E; Higgins, J P T; Frampton, G K; Greiner, M; Afonso, A; Amzal, B; Deeks, J; Dorne, J-L; Glanville, J; Lövei, G L; Nienstedt, K; O'connor, A M; Pullin, A S; Rajić, A; Verloo, D

    2015-01-01

    Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare, and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimize biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and use of meta-analysis to estimate all parameters required by a risk model may not always be feasible. This paper describes novel approaches for determining question suitability and for prioritizing questions for systematic review in this area. Risk assessment questions that aim to estimate a parameter are likely to be suitable for systematic review. Such questions can be structured by their "key elements" [e.g., for intervention questions, the population(s), intervention(s), comparator(s), and outcome(s)]. Prioritization of questions to be addressed by systematic review relies on the likely impact and related uncertainty of individual parameters in the risk model. This approach to planning and prioritizing systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.

  9. A competency framework for librarians involved in systematic reviews.

    PubMed

    Townsend, Whitney A; Anderson, Patricia F; Ginier, Emily C; MacEachern, Mark P; Saylor, Kate M; Shipman, Barbara L; Smith, Judith E

    2017-07-01

    The project identified a set of core competencies for librarians who are involved in systematic reviews. A team of seven informationists with broad systematic review experience examined existing systematic review standards, conducted a literature search, and used their own expertise to identify core competencies and skills that are necessary to undertake various roles in systematic review projects. The team identified a total of six competencies for librarian involvement in systematic reviews: "Systematic review foundations," "Process management and communication," "Research methodology," "Comprehensive searching," "Data management," and "Reporting." Within each competency are the associated skills and knowledge pieces (indicators). Competence can be measured using an adaptation of Miller's Pyramid for Clinical Assessment, either through self-assessment or identification of formal assessment instruments. The Systematic Review Competencies Framework provides a standards-based, flexible way for librarians and organizations to identify areas of competence and areas in need of development to build capacity for systematic review integration. The framework can be used to identify or develop appropriate assessment tools and to target skill development opportunities.

  10. A Data Model for Teleconsultation in Managing High-Risk Pregnancies: Design and Preliminary Evaluation

    PubMed Central

    Deldar, Kolsoum

    2017-01-01

    Background Teleconsultation allows clinical professors to virtually supervise the clinical decisions made by medical residents in teaching hospitals. Type, format, volume, and quality of exchanged information have a great influence on the quality of remote clinical decisions or tele-decisions. Thus, it is necessary to develop a reliable and standard model for these clinical relationships. Objective The goal of this study was to design and evaluate a data model for teleconsultation in the management of high-risk pregnancies. Methods This study was implemented in three phases. In the first phase, a systematic review, a qualitative study, and a Delphi approach were conducted in selected teaching hospitals. Systematic extraction and localization of diagnostic items to develop the tele-decision clinical archetypes were performed as the second phase. Finally, the developed model was evaluated using predefined consultation scenarios. Results Our review study showed that present medical consultations have no specific structure or template for patient information exchange. Furthermore, there are many challenges in the remote medical decision-making process, and some of them are related to the lack of the mentioned structure. The evaluation phase of our research showed that data quality (P<.001), adequacy (P<.001), organization (P<.001), confidence (P<.001), and convenience (P<.001) scored higher in archetype-based consultation scenarios than in routine-based ones. Conclusions Our archetype-based model achieved higher scores on the data quality, adequacy, organization, confidence, and convenience dimensions than the routine scenarios. It is probable that the suggested archetype-based teleconsultation model may improve the quality of physician-physician remote medical consultations. PMID:29242181

  11. Linear Models for Systematics and Nuisances

    NASA Astrophysics Data System (ADS)

    Luger, Rodrigo; Foreman-Mackey, Daniel; Hogg, David W.

    2017-12-01

    The target of many astronomical studies is the recovery of tiny astrophysical signals living in a sea of uninteresting (but usually dominant) noise. In many contexts (i.e., stellar time-series, or high-contrast imaging, or stellar spectroscopy), there are structured components in this noise caused by systematic effects in the astronomical source, the atmosphere, the telescope, or the detector. More often than not, evaluation of the true physical model for these nuisances is computationally intractable and dependent on too many (unknown) parameters to allow rigorous probabilistic inference. Sometimes, housekeeping data---and often the science data themselves---can be used as predictors of the systematic noise. Linear combinations of simple functions of these predictors are often used as computationally tractable models that can capture the nuisances. These models can be used to fit and subtract systematics prior to investigation of the signals of interest, or they can be used in a simultaneous fit of the systematics and the signals. In this Note, we show that if a Gaussian prior is placed on the weights of the linear components, the weights can be marginalized out with an operation in pure linear algebra, which can (often) be made fast. We illustrate this model by demonstrating the applicability of a linear model for the non-linear systematics in K2 time-series data, where the dominant noise source for many stars is spacecraft motion and variability.
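
    With a Gaussian prior on the weights, the maximum a posteriori fit of such a linear nuisance model reduces to a ridge-regression solve in pure linear algebra. A sketch on synthetic data, where the pointing predictors are a toy stand-in for real K2 housekeeping data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
time = np.linspace(0.0, 10.0, n)

# Toy "housekeeping" predictors: noisy spacecraft pointing offsets.
x_pos = np.sin(0.7 * time) + 0.05 * rng.standard_normal(n)
y_pos = np.cos(0.9 * time) + 0.05 * rng.standard_normal(n)

# Tiny astrophysical signal buried in motion-driven systematics.
signal = 1e-3 * np.sin(2 * np.pi * time / 3.0)
flux = signal + 0.01 * x_pos - 0.02 * y_pos + 1e-4 * rng.standard_normal(n)

# Design matrix of simple functions of the predictors.
A = np.column_stack([np.ones(n), x_pos, y_pos, x_pos * y_pos])

# MAP weights under a zero-mean Gaussian prior (ridge regression),
# then subtract the fitted systematics from the light curve.
lam = 1e-6
w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ flux)
detrended = flux - A @ w
```

    As the Note points out, fit-and-subtract can absorb part of the signal of interest; a simultaneous fit, or marginalizing the weights analytically, avoids that bias.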

  12. Ross, Macdonald, and a theory for the dynamics and control of mosquito-transmitted pathogens.

    PubMed

    Smith, David L; Battle, Katherine E; Hay, Simon I; Barker, Christopher M; Scott, Thomas W; McKenzie, F Ellis

    2012-01-01

    Ronald Ross and George Macdonald are credited with developing a mathematical model of mosquito-borne pathogen transmission. A systematic historical review suggests that several mathematicians and scientists contributed to development of the Ross-Macdonald model over a period of 70 years. Ross developed two different mathematical models, Macdonald a third, and various "Ross-Macdonald" mathematical models exist. Ross-Macdonald models are best defined by a consensus set of assumptions. The mathematical model is just one part of a theory for the dynamics and control of mosquito-transmitted pathogens that also includes epidemiological and entomological concepts and metrics for measuring transmission. All the basic elements of the theory had fallen into place by the end of the Global Malaria Eradication Programme (GMEP, 1955-1969) with the concept of vectorial capacity, methods for measuring key components of transmission by mosquitoes, and a quantitative theory of vector control. The Ross-Macdonald theory has since played a central role in development of research on mosquito-borne pathogen transmission and the development of strategies for mosquito-borne disease prevention.
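
    The consensus quantities of the theory (vectorial capacity and the basic reproduction number of the classic model) follow from a handful of entomological parameters. A minimal sketch with illustrative parameter values, not estimates from any particular setting:

```python
import math

m = 10.0     # female mosquitoes per human
a = 0.3      # human bites per mosquito per day
b = 0.5      # mosquito-to-human transmission efficiency per bite
c = 0.5      # human-to-mosquito transmission efficiency per bite
g = 0.1      # mosquito death rate per day (expected lifespan 1/g)
n = 10.0     # extrinsic incubation period in days
r = 1 / 200  # human recovery rate per day

# Vectorial capacity: potentially infectious bites eventually arising,
# per day, from the mosquitoes biting one perfectly infectious person.
C = m * a**2 * math.exp(-g * n) / g

# Basic reproduction number of the classic Ross-Macdonald model.
R0 = b * c * C / r
```

    The term exp(-g*n) is the probability that a mosquito survives the extrinsic incubation period, which is why adult mosquito survival dominates in the quantitative theory of vector control.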

  13. Ross, Macdonald, and a Theory for the Dynamics and Control of Mosquito-Transmitted Pathogens

    PubMed Central

    Smith, David L.; Battle, Katherine E.; Hay, Simon I.; Barker, Christopher M.; Scott, Thomas W.; McKenzie, F. Ellis

    2012-01-01

    Ronald Ross and George Macdonald are credited with developing a mathematical model of mosquito-borne pathogen transmission. A systematic historical review suggests that several mathematicians and scientists contributed to development of the Ross-Macdonald model over a period of 70 years. Ross developed two different mathematical models, Macdonald a third, and various “Ross-Macdonald” mathematical models exist. Ross-Macdonald models are best defined by a consensus set of assumptions. The mathematical model is just one part of a theory for the dynamics and control of mosquito-transmitted pathogens that also includes epidemiological and entomological concepts and metrics for measuring transmission. All the basic elements of the theory had fallen into place by the end of the Global Malaria Eradication Programme (GMEP, 1955–1969) with the concept of vectorial capacity, methods for measuring key components of transmission by mosquitoes, and a quantitative theory of vector control. The Ross-Macdonald theory has since played a central role in development of research on mosquito-borne pathogen transmission and the development of strategies for mosquito-borne disease prevention. PMID:22496640

  14. Creating, documenting and sharing network models.

    PubMed

    Crook, Sharon M; Bednar, James A; Berger, Sandra; Cannon, Robert; Davison, Andrew P; Djurfeldt, Mikael; Eppler, Jochen; Kriener, Birgit; Furber, Steve; Graham, Bruce; Plesser, Hans E; Schwabe, Lars; Smith, Leslie; Steuber, Volker; van Albada, Sacha

    2012-01-01

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  15. [Implementing models of cross-sectoral mental health care (integrated health care, regional psychiatry budget) in Germany: systematic literature review].

    PubMed

    Schmid, Petra; Steinert, Tilman; Borbé, Raoul

    2013-11-01

    Cross-sectoral integrated health-care and the regional psychiatry budget are two models of cross-sectoral health care (comprising in-patient and out-patient care) in Germany. Both models of financing were created in order to overcome the so-called fragmentation in German health care. The regional psychiatry budget is a specific solution for psychiatric services whereas integrated health care models can be developed for all areas of health care. The purpose of this overview is to elucidate both the current state of implementation of these models and the results of evaluation research. Systematic literature review, additional manual search. 28 journal articles and 38 websites referring to 21 projects were identified. The projects are highly heterogeneous in terms of size, included populations and services, aims, and steering-function (concerning the different pathways of care). The projects yield innovative models of mental health care capable of competing with the co-existing traditional financing systems of in-patient and out-patient services. The future of mental health care organisation in Germany is currently open and under political discussion. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Systematic Reviews of Animal Models: Methodology versus Epistemology

    PubMed Central

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions. PMID:23372426

  17. Metasynthesis findings: potential versus reality.

    PubMed

    Finfgeld-Connett, Deborah

    2014-11-01

    Early on, qualitative researchers predicted that metasynthesis research had the potential to significantly push knowledge development forward. More recently, scholars have questioned whether this is actually occurring. To examine this concern, a randomly selected sample of metasynthesis articles was systematically reviewed to identify the types of findings that have been produced. Based on this systematic examination, it appears that findings from metasynthesis investigations might not be reaching their full potential. Metasynthesis investigations frequently result in isolated findings rather than findings in relationship, and opportunities to generate research hypotheses and theoretical models are not always fully realized. With this in mind, methods for moving metasynthesis findings into relationship are discussed. © The Author(s) 2014.

  18. Towards the Prediction of Decadal to Centennial Climate Processes in the Coupled Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhengyu; Kutzbach, J.; Jacob, R.

    2011-12-05

    In this project, we made major advances in the understanding of decadal and long-term climate variability. (a) We performed a systematic study of multidecadal climate variability in FOAM-LPJ and CCSM-T31, and began exploring decadal variability in the IPCC AR4 models. (b) We developed several novel methods for the assessment of climate feedbacks in observations. (c) We developed a new initialization scheme, DAI (Dynamical Analogue Initialization), for ensemble decadal prediction. (d) We studied climate-vegetation feedback in observations and models. (e) Finally, we started a pilot program using an Ensemble Kalman Filter in a CGCM for decadal climate prediction.

  19. Polarization reconstruction algorithm for a Compton polarimeter

    NASA Astrophysics Data System (ADS)

    Vockert, M.; Weber, G.; Spillmann, U.; Krings, T.; Stöhlker, Th

    2018-05-01

    We present the technique of Compton polarimetry using X-ray detectors based on double-sided segmented semiconductor crystals that were developed within the SPARC collaboration. In addition, we discuss the polarization reconstruction algorithm with particular emphasis on systematic deviations between the observed detector response and our model function for the Compton scattering distribution inside the detector.

  20. I CAN Physical Education Curriculum Resource Materials: Primary through Secondary.

    ERIC Educational Resources Information Center

    Wessel, Janet A.; And Others

    The I CAN primary and secondary physical education curriculum resource materials were developed, field tested, and published between 1971 and 1979. The Achievement Based Curriculum Model, a systematic training process designed to assist teachers in using the I CAN database resource materials to improve the quality of teaching and instruction, was developed…

  1. Improvements in safety testing of lithium cells

    NASA Astrophysics Data System (ADS)

    Stinebring, R. C.; Krehl, P.

    1985-07-01

    A systematic approach was developed for evaluating the basic safety parameters of high power lithium soluble cathode cells. This approach consists of performing a series of tests on each cell model during the design, prototype and production phases. Abusive testing is performed in a facility where maximum protection is given to test personnel.

  2. Improvements in safety testing of lithium cells

    NASA Technical Reports Server (NTRS)

    Stinebring, R. C.; Krehl, P.

    1985-01-01

    A systematic approach was developed for evaluating the basic safety parameters of high power lithium soluble cathode cells. This approach consists of performing a series of tests on each cell model during the design, prototype and production phases. Abusive testing is performed in a facility where maximum protection is given to test personnel.

  3. Increasing Job Satisfaction. Symposium 22. [Concurrent Symposium Session at AHRD Annual Conference, 2000].

    ERIC Educational Resources Information Center

    2000

    This document contains three papers from a symposium on increasing job satisfaction that was conducted as part of a conference on human resource development (HRD). "A Systematic Model of Job Design by Examining the Organizational Factors Affecting Satisfaction" (Zhichao Cheng, Danyang Yang, Fenglou Liu) reports on a project in which…

  4. Interpersonal Behaviour in One-to-One Instrumental Lessons: An Observational Analysis

    ERIC Educational Resources Information Center

    Creech, Andrea

    2012-01-01

    This paper explores patterns of interpersonal behaviour amongst teachers and pupils during one-to-one instrumental lessons. It was hypothesised that these patterns might differ in systematic ways, according to an existing model of six interaction "types" developed within a systems theory perspective and based on measures of interpersonal control…

  5. Application of IATA - A case study in evaluating the global and local performance of a Bayesian Network model for Skin Sensitization

    EPA Science Inventory

    Since the publication of the Adverse Outcome Pathway (AOP) for skin sensitization, there have been many efforts to develop systematic approaches to integrate the information generated from different key events for decision making. The types of information characterizing key event...

  6. Development of Systematic Approaches for Calibration of Subsurface Transport Models Using Hard and Soft Data on System Characteristics and Behavior

    DTIC Science & Technology

    2011-02-02

    who graduated during this period and will receive scholarships or fellowships for further studies in science, mathematics, engineering or technology... nature or are collected at discrete points or localized areas in the system. The qualitative data includes geology, large-scale stratigraphy and

  7. Short Term Objectives. (SCAT Project, Title VI-G).

    ERIC Educational Resources Information Center

    Archer, Anita

    Developed by the staff of the SCAT (Support, Competency-Assistance and Training) Project, the document deals with the third step of the systematic instructional model--sequencing short term objectives for exceptional students. The manual focuses on reviewing long term goals established by the child study team, converting these goals into long term…

  8. Introducing Pocket PCs in Schools: Attitudes and Beliefs in the First Year

    ERIC Educational Resources Information Center

    Ng, Wan; Nicholas, Howard

    2009-01-01

    As more schools adopt the use of handheld computers in their classrooms, research that systematically tracks their introduction is essential in order to develop a model for successful implementation leading to improved classroom teaching. This research report seeks to explore the realities of introducing and integrating handheld computers into…

  9. A Conceptual Model for the Design and Delivery of Explicit Thinking Skills Instruction

    ERIC Educational Resources Information Center

    Kassem, Cherrie L.

    2005-01-01

    Developing student thinking skills is an important goal for most educators. However, due to time constraints and weighty content standards, thinking skills instruction is often embedded in subject matter, implicit and incidental. For best results, thinking skills instruction requires a systematic design and explicit teaching strategies. The…

  10. Do Parents Model Gestures Differently When Children's Gestures Differ?

    ERIC Educational Resources Information Center

    Özçaliskan, Seyda; Adamson, Lauren B.; Dimitrova, Nevena; Baumann, Stephanie

    2018-01-01

    Children with autism spectrum disorder (ASD) or with Down syndrome (DS) show diagnosis-specific differences from typically developing (TD) children in gesture production. We asked whether these differences reflect the differences in parental gesture input. Our systematic observations of 23 children with ASD and 23 with DS (M[subscript…

  11. Investigating a Systematic Process to Develop Teacher Expertise: A Comparative Case Study

    ERIC Educational Resources Information Center

    Mielke, Paul George

    2012-01-01

    There is little evidence that traditional clinical supervision models improve teaching practice (Donaldson, 2009; Schmoker, 1992). However, the use of video (Brophy, 2004; King, 2011; Marshall, 2002; Sherin and Van Es 2009) and reflective peer observation (Cosh, 1999) coupled with a research based teaching framework (Danielson, 1996; Marzano,…

  12. Usability in product design--the importance and need for systematic assessment models in product development--Usa-Design Model (U-D) ©.

    PubMed

    Merino, Giselle Schmidt Alves Díaz; Teixeira, Clarissa Stefani; Schoenardie, Rodrigo Petry; Merino, Eugenio Andrés Diáz; Gontijo, Leila Amaral

    2012-01-01

    In product design, human factors are considered an element of differentiation, given that today's consumer demands are increasing. Safety, wellbeing, satisfaction, health, effectiveness, efficiency, and other aspects must be effectively incorporated into the product development process. This work proposes a usability assessment model that can be incorporated as an assessment tool. The methodological approach is organized in two stages. First, a literature review focuses specifically on usability and the development of user-centred products. A usability model named Usa-Design (U-D©) is then presented. It consists of four phases: understanding the context of use; preliminary usability assessment (efficiency/effectiveness/satisfaction); assessment of usability principles; and results. U-D© is modular and flexible, allowing the principles used in Phase 3 to be changed according to the needs and scenario of each situation. With qualitative/quantitative measurement scales that are easy to understand and apply, the model is viable and applicable throughout the product development process.

  13. Flux analysis and metabolomics for systematic metabolic engineering of microorganisms.

    PubMed

    Toya, Yoshihiro; Shimizu, Hiroshi

    2013-11-01

    Rational engineering of metabolism is important for bio-production using microorganisms. Metabolic design based on in silico simulations and experimental validation of the metabolic state in the engineered strain helps in accomplishing systematic metabolic engineering. Flux balance analysis (FBA) is a method for the prediction of metabolic phenotype, and many applications have been developed using FBA to design metabolic networks. Elementary mode analysis (EMA) and ensemble modeling techniques are also useful tools for in silico strain design. The metabolome and flux distribution of the metabolic pathways enable us to evaluate the metabolic state and provide useful clues to improve target productivity. Here, we reviewed several computational applications for metabolic engineering by using genome-scale metabolic models of microorganisms. We also discussed the recent progress made in the field of metabolomics and (13)C-metabolic flux analysis techniques, and reviewed these applications pertaining to bio-production development. Because these in silico or experimental approaches have their respective advantages and disadvantages, the combined usage of these methods is complementary and effective for metabolic engineering. Copyright © 2013 Elsevier Inc. All rights reserved.
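As a hedged illustration of the flux balance analysis (FBA) mentioned in this record: FBA maximizes an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds, which is a linear program. The three-reaction toy network below is invented for illustration (it is not from the review) and uses SciPy's `linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network: R1 imports metabolite A (uptake capped at 10),
# R2 converts A -> B, R3 drains B as "biomass".  Rows of the stoichiometric
# matrix S are metabolites A and B; columns are reactions R1..R3.
S = np.array([[1.0, -1.0,  0.0],   # A: produced by R1, consumed by R2
              [0.0,  1.0, -1.0]])  # B: produced by R2, consumed by R3

c = [0.0, 0.0, -1.0]               # linprog minimizes, so negate biomass flux
bounds = [(0, 10), (0, None), (0, None)]

# Steady-state constraint S v = 0 with the flux bounds above.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
# At the optimum the whole linear pathway runs at the uptake limit.
```

Genome-scale models work the same way, just with thousands of reactions and a biomass objective assembled from measured cell composition.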

  14. Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.

    PubMed

    Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis

    2016-07-01

    Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description on the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.

  15. A Systematic Review of Literature on Culturally Adapted Obesity Prevention Interventions for African American Youth.

    PubMed

    Lofton, Saria; Julion, Wrenetha A; McNaughton, Diane B; Bergren, Martha Dewey; Keim, Kathryn S

    2016-02-01

    Obesity and overweight prevalence in African American (AA) youth continues to be one of the highest of all major ethnic groups, which has led researchers to pursue culturally based approaches as a means to improve obesity prevention interventions. The purpose of this systematic review was to evaluate culturally adapted obesity prevention interventions targeting AA youth. A search of electronic databases, limited to multicomponent culturally adapted obesity prevention controlled trials from 2003 to 2013, was conducted for key terms. Eleven studies met inclusion criteria. We used the PEN-3 model to evaluate the strengths and weaknesses of interventions as well as to identify cultural adaptation strategies. The PEN-3 model highlighted the value of designing joint parent-youth interventions, building a relationship between AA mentors and youth, and emphasizing healthful activities that the youth preferred. The PEN-3 model shows promise as an overarching framework to develop culturally adapted obesity interventions. © The Author(s) 2015.

  16. Bayesian Immunological Model Development from the Literature: Example Investigation of Recent Thymic Emigrants†

    PubMed Central

    Holmes, Tyson H.; Lewis, David B.

    2014-01-01

    Bayesian estimation techniques offer a systematic and quantitative approach for synthesizing data drawn from the literature to model immunological systems. As detailed here, the practitioner begins with a theoretical model and then sequentially draws information from source data sets and/or published findings to inform estimation of model parameters. Options are available to weigh these various sources of information differentially per objective measures of their corresponding scientific strengths. This approach is illustrated in depth through a carefully worked example for a model of decline in T-cell receptor excision circle content of peripheral T cells during development and aging. Estimates from this model indicate that 21 years of age is plausible for the developmental timing of mean age of onset of decline in T-cell receptor excision circle content of peripheral T cells. PMID:25179832

  17. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  18. Does Bariatric Surgery Affect the Incidence of Endometrial Cancer Development? A Systematic Review.

    PubMed

    Winder, Alec A; Kularatna, Malsha; MacCormick, Andrew D

    2018-05-01

    Obesity has been linked to an increased prevalence in multiple cancers. Studies have suggested a reduction in the overall risk of cancer after bariatric surgery. We reviewed the evidence for bariatric surgery reducing the risk of endometrial cancer. Data were extracted from PubMed, EMBASE, and Medline to perform a systematic review. Thirty-one full text articles were identified from 265 abstracts. Nine observational studies were relevant to endometrial cancer. In the five controlled studies, 462 of 113,032 (0.4%) patients receiving bariatric surgery versus 11,997 of 848,864 (1.4%) controls developed endometrial cancer, an odds ratio of 0.317 (95% CI 0.161 to 0.627) using a random-effects model (P < 0.001). Bariatric surgery seems to reduce the risk of endometrial cancer; however, more research is required.
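The headline counts in this record can be sanity-checked with a crude (unadjusted) pooled odds ratio. Note that the abstract's OR of 0.317 comes from a random-effects meta-analysis, which weights each study separately, so the naive pooled calculation below is expected to differ slightly:

```python
from math import exp, log, sqrt

# Pooled counts quoted in the abstract: 462/113,032 surgical patients vs.
# 11,997/848,864 controls developed endometrial cancer.
a, n_surg = 462, 113_032          # cases / total, bariatric surgery arm
c, n_ctrl = 11_997, 848_864       # cases / total, control arm
b, d = n_surg - a, n_ctrl - c     # non-cases in each arm

odds_ratio = (a * d) / (b * c)    # crude OR, about 0.29

# Approximate 95% CI on the log-odds scale (Woolf method).
se = sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = (exp(log(odds_ratio) + z * se) for z in (-1.96, 1.96))
```

The crude OR of roughly 0.29 lies comfortably inside the reported random-effects interval, consistent with the review's conclusion.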

  19. [Psychological aspects of the preparation and performance of endoscopies in children and adolescents].

    PubMed

    Stier, R; Schultz-Brauns, B; Riedesser, P; Zeisel, H J

    1983-01-01

    Individual observations led to the realisation that during endoscopy children may develop such a degree of seemingly unexplainable anxiety that the performance of the examination is considerably prejudiced. We therefore examined 39 children systematically, evaluating them according to fearsome products of their imagination on the one hand and real or warranted anxiety on the other. Adjusted to age, the children were tested using drawings, projectional tests and role-playing in addition to interviewing, sometimes of their parents as well. The most prominent expressions of anxiety in conjunction with endoscopy were fear of suffocation; fear of damage to internal organs and, in girls, fear of lesions to a "baby inside". In adolescents problems with prudery became evident. On the basis of our experience we developed a systematic model of psychological preparation for endoscopic examinations.

  20. Evaluation of wave runup predictions from numerical and parametric models

    USGS Publications Warehouse

    Stockdon, Hilary F.; Thompson, David M.; Plant, Nathaniel G.; Long, Joseph W.

    2014-01-01

    Wave runup during storms is a primary driver of coastal evolution, including shoreline and dune erosion and barrier island overwash. Runup and its components, setup and swash, can be predicted from a parameterized model that was developed by comparing runup observations to offshore wave height, wave period, and local beach slope. Because observations during extreme storms are often unavailable, a numerical model is used to simulate the storm-driven runup to compare to the parameterized model and then develop an approach to improve the accuracy of the parameterization. Numerically simulated and parameterized runup were compared to observations to evaluate model accuracies. The analysis demonstrated that setup was accurately predicted by both the parameterized model and numerical simulations. Infragravity swash heights were most accurately predicted by the parameterized model. The numerical model suffered from bias and gain errors that depended on whether a one-dimensional or two-dimensional spatial domain was used. Nonetheless, all of the predictions were significantly correlated to the observations, implying that the systematic errors can be corrected. The numerical simulations did not resolve the incident-band swash motions, as expected, and the parameterized model performed best at predicting incident-band swash heights. An assimilated prediction using a weighted average of the parameterized model and the numerical simulations resulted in a reduction in prediction error variance. Finally, the numerical simulations were extended to include storm conditions that have not been previously observed. These results indicated that the parameterized predictions of setup may need modification for extreme conditions; numerical simulations can be used to extend the validity of the parameterized predictions of infragravity swash; and numerical simulations systematically underpredict incident swash, which is relatively unimportant under extreme conditions.
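The assimilated prediction described in this record (a weighted average of the parameterized model and the numerical simulation) can be sketched as inverse-variance weighting. The weighting scheme and numbers here are assumptions for illustration, not the authors' actual procedure:

```python
def assimilate(pred_a, pred_b, var_a, var_b):
    """Combine two predictions by inverse-variance weighting.

    With independent errors, the combined error variance
    1 / (1/var_a + 1/var_b) never exceeds the smaller input variance,
    which is why the assimilated runup reduces prediction error variance.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    return w_a * pred_a + (1.0 - w_a) * pred_b, w_a

# Hypothetical example: a parameterized-model runup of 1.0 m (error
# variance 0.04 m^2) combined with a simulated runup of 2.0 m (error
# variance 0.12 m^2) gives a combined estimate pulled toward the more
# precise predictor.
combined, w = assimilate(1.0, 2.0, 0.04, 0.12)
```

The error variances would in practice be estimated by comparing each predictor against the runup observations, as the study does before forming the weighted average.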

  1. Which learning activities enhance physiotherapy practice? A systematic review protocol of quantitative and qualitative studies.

    PubMed

    Leahy, Edmund; Chipchase, Lucy; Blackstock, Felicity

    2017-04-17

    Learning activities are fundamental for the development of expertise in physiotherapy practice. Continuing professional development (CPD) encompasses formal and informal learning activities undertaken by physiotherapists. Identifying the most efficient and effective learning activities is essential to enable the profession to assimilate research findings and improve clinical skills to ensure the most efficacious care for clients. To date, systematic reviews on the effectiveness of CPD provide limited guidance on the most efficacious models of professional development for physiotherapists. The aim of this systematic review is to evaluate which learning activities enhance physiotherapy practice. A search of Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO (Psychological Abstracts), PEDro, Cochrane Library, AMED and Educational Resources and Information Center (ERIC) will be completed. Citation searching and reference list searching will be undertaken to locate additional studies. Quantitative and qualitative studies will be included if they examine the impact of learning activities on clinicians' behaviour, attitude, knowledge, beliefs, skills, self-efficacy, work satisfaction and patient outcomes. Risk of bias will be assessed by two independent researchers. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) and Confidence in the Evidence from Reviews of Qualitative research (CERQual) will be used to synthesise results where a meta-analysis is possible. Where a meta-analysis is not possible, a narrative synthesis will be conducted. PROSPERO CRD42016050157.

  2. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.

  3. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  4. Development of an oral care guide for patients undergoing autologous stem cell transplantation.

    PubMed

    Salvador, Prisco T

    2006-01-01

    Nurses identified oral mucositis as a recurring issue in clinical practice. To meet this challenge, a group of nurses took a leadership role in developing an oral care guide. The University Health Network Nursing Research Utilization Model and the Neuman Systems Model served as conceptual frameworks. A flowchart was developed to ensure a coordinated and continuous provision of oral care. Educational presentations were conducted to familiarize nurses and members of the multidisciplinary team with the practice changes. The introduction of the oral care regimen as primary prevention, plus systematic oral assessment and monitoring, had the potential to reduce the occurrence and severity of oral mucositis in patients undergoing autologous stem cell transplantation.

  5. The Content Validation and Resource Development For a Course in Materials and Processes of Industry Through the Use of NASA Experts at Norfolk State College. Final Report.

    ERIC Educational Resources Information Center

    Jacobs, James A.

    In an effort to develop a course in materials and processes of industry at Norfolk State College using Barton Herrscher's model of systematic instruction, a group of 12 NASA-Langley Research Center's (NASA-LRC) research engineers and technicians were recruited. The group acted as consultants in validating the content of the course and aided in…

  6. Effects of an Interdisciplinary Science Professional Development Program on Teacher Pedagogical Content Knowledge, Science Inquiry Instruction, and Student Understanding of Science Crosscutting Concepts in Twelve Public Schools: A Multi-Level Modeling Study

    ERIC Educational Resources Information Center

    Yang, Yang

    2017-01-01

    Systematic studies on effectiveness of in-service teacher professional development (PD) are important for science education research and practice. Previous studies mostly focus on one certain aspect of the entire program, for example, effectiveness of PD on improvement of teachers' knowledge or students' learning outcomes. This study, however,…

  7. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators.
The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multi-nominal logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
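    The pre-processing step described above (drop variables absent in more than half the records, then mean-based imputation of the rest) can be sketched in a few lines. The records and threshold below are hypothetical, not the study's actual dataset:

```python
def preprocess(records, missing_threshold=0.5):
    """Drop variables missing in more than `missing_threshold` of records,
    then fill remaining gaps with the per-variable mean (mean-based imputation)."""
    n = len(records)
    variables = set().union(*(r.keys() for r in records))
    # Keep a variable only if it is present (non-missing) in at least half the records
    kept = [v for v in sorted(variables)
            if sum(v in r and r[v] is not None for r in records) >= (1 - missing_threshold) * n]
    means = {}
    for v in kept:
        vals = [r[v] for r in records if r.get(v) is not None]
        means[v] = sum(vals) / len(vals)
    return [{v: r[v] if r.get(v) is not None else means[v] for v in kept} for r in records]

# Hypothetical CHF records: "bnp" is missing in 3 of 4 rows, so it is dropped;
# the missing "age" in the last row is imputed with the mean of the others.
records = [
    {"age": 70, "bnp": 900},
    {"age": 80},
    {"age": 60},
    {"age": None, "bnp": None},
]
clean = preprocess(records)
print(clean)
```

    The study's actual pipeline additionally compared complete-case analysis against imputation and iterated over classifiers, feature selectors, and discretizers; this sketch covers only the first, mechanical step.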

  8. Models and applications for measuring the impact of health research: update of a systematic review for the Health Technology Assessment programme.

    PubMed

    Raftery, James; Hanney, Steve; Greenhalgh, Trish; Glover, Matthew; Blatch-Jones, Amanda

    2016-10-01

    This report reviews approaches and tools for measuring the impact of research programmes, building on, and extending, a 2007 review. (1) To identify the range of theoretical models and empirical approaches for measuring the impact of health research programmes; (2) to develop a taxonomy of models and approaches; (3) to summarise the evidence on the application and use of these models; and (4) to evaluate the different options for the Health Technology Assessment (HTA) programme. We searched databases including Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature and The Cochrane Library from January 2005 to August 2014. This narrative systematic literature review comprised an update, extension and analysis/discussion. We systematically searched eight databases, supplemented by personal knowledge, in August 2014 through to March 2015. The literature on impact assessment has much expanded. The Payback Framework, with adaptations, remains the most widely used approach. It draws on different philosophical traditions, enhancing an underlying logic model with an interpretative case study element and attention to context. Besides the logic model, other ideal type approaches included constructionist, realist, critical and performative. Most models in practice drew pragmatically on elements of several ideal types. Monetisation of impact, an increasingly popular approach, shows a high return from research but relies heavily on assumptions about the extent to which health gains depend on research. Despite usually requiring systematic reviews before funding trials, the HTA programme does not routinely examine the impact of those trials on subsequent systematic reviews. The York/Patient-Centered Outcomes Research Institute and the Grading of Recommendations Assessment, Development and Evaluation toolkits provide ways of assessing such impact, but need to be evaluated. 
    The literature, as reviewed here, provides very few instances of a randomised trial playing a major role in stopping the use of a new technology. The few trials funded by the HTA programme that may have played such a role were outliers. The findings of this review support the continued use of the Payback Framework by the HTA programme. Changes in the structure of the NHS, the development of NHS England and changes in the National Institute for Health and Care Excellence's remit pose new challenges for identifying and meeting current and future research needs. Future assessments of the impact of the HTA programme will have to take account of wider changes, especially as the Research Excellence Framework (REF), which assesses the quality of universities' research, seems likely to continue to rely on case studies to measure impact. The HTA programme should consider how the format and selection of case studies might be improved to aid more systematic assessment. The selection of case studies, such as in the REF, but also more generally, tends to be biased towards high-impact rather than low-impact stories. Experience from other industries indicates that much can be learnt from the latter. The adoption of researchfish® (researchfish Ltd, Cambridge, UK) by most major UK research funders has implications for future assessments of impact. Although the routine capture of indexed research publications has merit, the degree to which researchfish will succeed in collecting other, non-indexed outputs and activities remains to be established. There were limitations in how far we could address challenges that faced us as we extended the focus beyond that of the 2007 review, and well beyond a narrow focus just on the HTA programme. Research funders can benefit from continuing to monitor and evaluate the impacts of the studies they fund. They should also review the contribution of case studies and expand work on linking trials to meta-analyses and to guidelines.
The National Institute for Health Research HTA programme.

  9. Twelve recommendations for integrating existing systematic reviews into new reviews: EPC guidance.

    PubMed

    Robinson, Karen A; Chou, Roger; Berkman, Nancy D; Newberry, Sydne J; Fu, Rongwei; Hartling, Lisa; Dryden, Donna; Butler, Mary; Foisy, Michelle; Anderson, Johanna; Motu'apuaka, Makalapua; Relevo, Rose; Guise, Jeanne-Marie; Chang, Stephanie

    2016-02-01

    As time and cost constraints in the conduct of systematic reviews increase, the need to consider the use of existing systematic reviews also increases. We developed guidance on the integration of systematic reviews into new reviews. A workgroup of methodologists from Evidence-based Practice Centers developed consensus-based recommendations. Discussions were informed by a literature scan and by interviews with organizations that conduct systematic reviews. Twelve recommendations were developed addressing the selection of reviews, assessment of risk of bias, qualitative and quantitative synthesis, and summarizing and assessing the body of evidence. We provide preliminary guidance for an efficient and unbiased approach to integrating existing systematic reviews with primary studies in a new review. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. [Counseling interventions for smoking cessation: systematic review].

    PubMed

    Alba, Luz Helena; Murillo, Raúl; Castillo, Juan Sebastián

    2013-04-01

    A systematic review on the efficacy and safety of smoking cessation counseling was developed. The ADAPTE methodology was used with a search of Clinical Practice Guidelines (CPG) in Medline, EMBASE, CINAHL, LILACS, and Cochrane. DELBI was used to select CPGs with scores over 60 in methodological rigor and applicability to the Colombian health system. Smoking cessation rates at 6 months were assessed according to counseling provider, model, and format. In total, 5 CPGs out of 925 references were selected, comprising 44 systematic reviews and meta-analyses. Physician brief counseling and trained health professionals' intensive counseling (individual, group, proactive telephone) are effective, with abstinence rates between 2.1% and 17.4%. Only practical counseling and motivational interviewing were found to be effective intensive interventions. The clinical effect of smoking cessation counseling is low, and long-term cessation rates are uncertain. Cost-effectiveness analyses are recommended for the implementation of counseling in public health programs.

  11. Design and analysis of a sub-aperture scanning machine for the transmittance measurements of large-aperture optical system

    NASA Astrophysics Data System (ADS)

    He, Yingwei; Li, Ping; Feng, Guojin; Cheng, Li; Wang, Yu; Wu, Houping; Liu, Zilong; Zheng, Chundi; Sha, Dingguo

    2010-11-01

    For measuring the transmittance of large-aperture optical systems, a novel sub-aperture scanning machine with double-rotating arms (SSMDA) was designed to obtain a sub-aperture beam spot. Full-aperture transmittance measurements of an optical system can be achieved by applying sub-aperture beam-spot scanning technology. A mathematical model of the SSMDA based on a homogeneous coordinate transformation matrix is established to develop a detailed methodology for analyzing the beam-spot scanning errors. The error analysis methodology considers two fundamental sources of scanning errors, namely (1) length systematic errors and (2) rotational systematic errors. With the systematic errors of the parameters given beforehand, the computed scanning errors lie between -0.007 and 0.028 mm for scanning radii no larger than 400.000 mm. The results offer a theoretical and data basis for research on the transmission characteristics of large optical systems.
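    The homogeneous-coordinate model of a double-rotating-arm scanner, and the way a length systematic error propagates to beam-spot position, can be sketched as follows. The arm lengths, angles, and the 0.01 mm perturbation are illustrative values, not the paper's parameters:

```python
import math

def rot(theta):
    # 2D rotation as a 3x3 homogeneous matrix
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def trans(dx, dy=0.0):
    # 2D translation as a 3x3 homogeneous matrix
    return [[1, 0, dx], [0, 1, dy], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def spot(theta1, l1, theta2, l2):
    """Beam-spot position of a double-rotating-arm scanner: rotate the first
    arm, translate along it, then rotate and translate along the second arm."""
    m = matmul(matmul(matmul(rot(theta1), trans(l1)), rot(theta2)), trans(l2))
    return m[0][2], m[1][2]

# Nominal geometry (hypothetical) versus a 0.01 mm length error on the first arm
nominal = spot(math.radians(30), 200.0, math.radians(45), 200.0)
perturbed = spot(math.radians(30), 200.0 + 0.01, math.radians(45), 200.0)
error = math.hypot(perturbed[0] - nominal[0], perturbed[1] - nominal[1])
print(error)  # a length systematic error maps directly onto spot-position error
```

    Because translation along an arm is linear in the arm length, a length error of δ moves the spot by exactly δ; rotational systematic errors would be analyzed analogously by perturbing the angles instead.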

  12. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?

    PubMed

    Booth, Andrew; Carroll, Christopher

    2015-09-01

    In recognising the potential value of theory in understanding how interventions work comes a challenge - how to make identification of theory less haphazard? To explore the feasibility of systematic identification of theory. We searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh - Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory. The authors tested the template within two systematic reviews. Of 34 systematic reviews, only 12 reviews (35%) reported a method for identifying theory. Nineteen did not specify how they identified studies containing theory. Data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context. The review team faces an additional concern in lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offers a feasible and useful approach for identification of theory. Applications include realist synthesis, framework synthesis or review of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.
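    A BeHEMoTh-structured search can be assembled mechanically from its facets. A sketch, with hypothetical behaviour/context/exclusion terms; the candidate theory terms are the ones the review reports:

```python
def behemoth_query(behaviour, health_context, exclusions, theory_terms=None):
    """Assemble a Boolean search string from the BeHEMoTh facets:
    Behaviour of interest AND Health context AND Models/Theories, NOT Exclusions."""
    if theory_terms is None:
        # Candidate theory terms identified in the review
        theory_terms = ["concept*", "framework*", "model*", "theor*"]
    clause = lambda terms: "(" + " OR ".join(terms) + ")"
    query = f"{clause(behaviour)} AND {clause(health_context)} AND {clause(theory_terms)}"
    if exclusions:
        query += f" NOT {clause(exclusions)}"
    return query

# Hypothetical facets for a review of smoking-cessation theory
q = behemoth_query(
    behaviour=["smoking cessation", "quit*"],
    health_context=["primary care"],
    exclusions=["animal model*"],
)
print(q)
```

    The resulting string can be pasted into a database search interface; in practice each facet would also carry controlled-vocabulary terms and field tags appropriate to the database.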

  13. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming

    2010-05-01

    Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data is available in the form of polarization curves or local distribution of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on using a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.
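    A centered parameter study of the kind DAKOTA automates can be illustrated with a toy stand-in for the PEMFC model; the response function and nominal parameter values below are invented for illustration, not taken from the coupled Penn State/Sandia model:

```python
def polarization_voltage(params):
    """Toy stand-in for a PEMFC model response (cell voltage at fixed current):
    open-circuit voltage minus activation and ohmic losses."""
    return params["ocv"] - params["act_loss"] - params["current"] * params["resistance"]

def one_at_a_time_sensitivity(model, nominal, rel_step=0.05):
    """Centered parameter study: perturb each parameter by +/- rel_step
    around its nominal value and estimate the response derivative."""
    base = model(nominal)
    sensitivity = {}
    for name, value in nominal.items():
        hi = dict(nominal, **{name: value * (1 + rel_step)})
        lo = dict(nominal, **{name: value * (1 - rel_step)})
        sensitivity[name] = (model(hi) - model(lo)) / (2 * rel_step * value)
    return base, sensitivity

nominal = {"ocv": 1.0, "act_loss": 0.3, "current": 1.2, "resistance": 0.1}
base, sens = one_at_a_time_sensitivity(polarization_voltage, nominal)
# The largest |sensitivity| flags the most critical parameter at this operating point
critical = max(sens, key=lambda k: abs(sens[k]))
print(critical, sens[critical])
```

    DAKOTA generalizes this pattern to vector-valued responses, gradient-based optimization, and sampling-based uncertainty quantification, driving the external model through the same parameter-to-response mapping.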

  14. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    ERIC Educational Resources Information Center

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  15. The economics of dentistry: a neglected concern.

    PubMed

    Shariati, Batoul; MacEntee, Michael I; Yazdizadeh, Maryam

    2013-10-01

    Demand for economic evaluations in health care is growing with expectations that they will help to develop regional and national policies on health and social programmes. We present here the scope, quality and content of systematic reviews and meta-analyses relating to the economics of dentistry published over the last 15 years. To review the quality and outcome of systematic reviews and meta-analyses relating to the economics of dental treatments, preventions and services. A systematic search was conducted in 14 electronic databases for systematic reviews and meta-analyses published between January 1997 and July 2011 on the economics of oral disorders and oral health care. Review papers were extracted by two independent investigators to identify the characteristics, results and quality of the reviews and to highlight gaps in knowledge about the economics of dentistry. From 3150 unique references, we found 73 systematic reviews or meta-analyses of dental economics as primary or secondary outcomes. The focus of 12 of them was on the cost or cost-effectiveness of dental prevention, 54 on treatment, five on prevention and treatment and two on delivery of dental services. However, only 12 of the systematic reviews drew conclusions from economic data, and four of them constructed an economic model from synthesized data. Overall, the quality was good in the 12 systematic reviews but poor in the original studies. There is very little helpful data published on the economics of dentistry. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Retrieve the Bethe states of quantum integrable models solved via the off-diagonal Bethe Ansatz

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Li, Yuan-Yuan; Cao, Junpeng; Yang, Wen-Li; Shi, Kangjie; Wang, Yupeng

    2015-05-01

    Based on the inhomogeneous T-Q relation constructed via the off-diagonal Bethe Ansatz, a systematic method for retrieving the Bethe-type eigenstates of integrable models without an obvious reference state is developed by employing a certain orthogonal basis of the Hilbert space. With the XXZ spin torus model and the open XXX spin-1/2 chain as examples, we show that for a given inhomogeneous T-Q relation and the associated Bethe Ansatz equations, the constructed Bethe-type eigenstate has a well-defined homogeneous limit.

  17. Quantitative Prediction of Drug–Drug Interactions Involving Inhibitory Metabolites in Drug Development: How Can Physiologically Based Pharmacokinetic Modeling Help?

    PubMed Central

    Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M

    2016-01-01

    This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087

  18. The first Fermi LAT supernova remnant catalog

    DOE PAGES

    Acero, F.

    2016-05-16

    To uniformly determine the properties of supernova remnants (SNRs) at high energies, we have developed the first systematic survey at energies from 1 to 100 GeV using data from the Fermi Large Area Telescope. Based on the spatial overlap of sources detected at GeV energies with SNRs known from radio surveys, we classify 30 sources as likely GeV SNRs. We also report 14 marginal associations and 245 flux upper limits. A mock catalog in which the positions of known remnants are scrambled in Galactic longitude allows us to determine an upper limit of 22% on the number of GeV candidates falsely identified as SNRs. We have also developed a method to estimate spectral and spatial systematic errors arising from the diffuse interstellar emission model, a key component of all Galactic Fermi LAT analyses. By studying remnants uniformly in aggregate, we measure the GeV properties common to these objects and provide a crucial context for the detailed modeling of individual SNRs. Combining our GeV results with multiwavelength (MW) data, including radio, X-ray, and TeV, demonstrates the need for improvements to previously sufficient, simple models describing the GeV and radio emission from these objects. As a result, we model the GeV and MW emission from SNRs in aggregate to constrain their maximal contribution to observed Galactic cosmic rays.

  19. THE FIRST FERMI LAT SUPERNOVA REMNANT CATALOG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acero, F.; Ballet, J.; Ackermann, M.

    2016-05-01

    To uniformly determine the properties of supernova remnants (SNRs) at high energies, we have developed the first systematic survey at energies from 1 to 100 GeV using data from the Fermi Large Area Telescope (LAT). Based on the spatial overlap of sources detected at GeV energies with SNRs known from radio surveys, we classify 30 sources as likely GeV SNRs. We also report 14 marginal associations and 245 flux upper limits. A mock catalog in which the positions of known remnants are scrambled in Galactic longitude allows us to determine an upper limit of 22% on the number of GeV candidates falsely identified as SNRs. We have also developed a method to estimate spectral and spatial systematic errors arising from the diffuse interstellar emission model, a key component of all Galactic Fermi LAT analyses. By studying remnants uniformly in aggregate, we measure the GeV properties common to these objects and provide a crucial context for the detailed modeling of individual SNRs. Combining our GeV results with multiwavelength (MW) data, including radio, X-ray, and TeV, we demonstrate the need for improvements to previously sufficient, simple models describing the GeV and radio emission from these objects. We model the GeV and MW emission from SNRs in aggregate to constrain their maximal contribution to observed Galactic cosmic rays.

  20. Initial Systematic Investigations of the Weakly Coupled Free Fermionic Heterotic String Landscape Statistics

    NASA Astrophysics Data System (ADS)

    Renner, Timothy

    2011-12-01

    A C++ framework was constructed with the explicit purpose of systematically generating string models using the Weakly Coupled Free Fermionic Heterotic String (WCFFHS) method. The software, optimized for speed, generality, and ease of use, has been used to conduct preliminary systematic investigations of WCFFHS vacua. Documentation for this framework is provided in the Appendix. After an introduction to theoretical and computational aspects of WCFFHS model building, a study of ten-dimensional WCFFHS models is presented. Degeneracies among equivalent expressions of each of the known models are investigated and classified. A study of more phenomenologically realistic four-dimensional models based on the well known "NAHE" set is then presented, with statistics being reported on gauge content, matter representations, and space-time supersymmetries. The final study is a parallel to the NAHE study in which a variation of the NAHE set is systematically extended and examined statistically. Special attention is paid to models with "mirroring"---identical observable and hidden sector gauge groups and matter representations.

  1. How Methodologic Differences Affect Results of Economic Analyses: A Systematic Review of Interferon Gamma Release Assays for the Diagnosis of LTBI

    PubMed Central

    Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick

    2013-01-01

    Introduction: Cost-effectiveness analyses (CEA) can provide useful information on how to invest limited funds; however, they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods: A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences in model inputs taken from the studies identified. Results: Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA versus TST test strategies were compared using our common decision analysis model, predicted effectiveness largely overlapped. Implications: Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412

  2. [The use of systematic review to develop a self-management program for CKD].

    PubMed

    Lee, Yu-Chin; Wu, Shu-Fang Vivienne; Lee, Mei-Chen; Chen, Fu-An; Yao, Yen-Hong; Wang, Chin-Ling

    2014-12-01

    Chronic kidney disease (CKD) has become a public health issue of international concern due to its high prevalence. The concept of self-management has been comprehensively applied in education programs that address chronic diseases. In recent years, many studies have used self-management programs in CKD interventions and have investigated the pre- and post-intervention physiological and psychological effectiveness of this approach. However, a complete clinical application program in the self-management model has yet to be developed for use in clinical renal care settings. A systematic review was used to develop a self-management program for CKD. Three implementation steps were used in this study: (1) A systematic literature search and review using databases including CEPS (Chinese Electronic Periodical Services) of Airiti, National Digital Library of Theses and Dissertations in Taiwan, CINAHL, PubMed, Medline, Cochrane Library, and Joanna Briggs Institute. A total of 22 studies were identified as valid and submitted to rigorous analysis. Of these, 4 were systematic literature reviews, 10 were randomized experimental studies, and 8 were non-randomized experimental studies. (2) Empirical evidence was then used to draft relevant guidelines on clinical application. (3) Finally, expert panels tested the validity of the draft to ensure the final version was valid for application in practice. This study designed a self-management program for CKD based on the findings of empirical studies. The content of this program included design principles, categories, elements, and the intervention measures used in the self-management program. This program was then assessed using the content validity index (CVI) and a four-point Likert scale. The content validity score was .98. The guidelines for the CKD self-management program were thus developed. This study developed a self-management program applicable to local care of CKD.
It is hoped that the guidelines developed in this study offer a reference for clinical caregivers to improve their healthcare practices.
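    The content validity assessment described above, in which expert panelists rate items on a 4-point scale and the ratings are summarized as a CVI, can be sketched as follows. The panel size and ratings are hypothetical; the study's actual CVI was .98:

```python
def content_validity_index(ratings):
    """Scale-level CVI (averaging approach): each item is rated 1-4 by a
    panel of experts; an item's I-CVI is the share of experts rating it
    3 or 4 (relevant), and the scale CVI averages the item scores."""
    item_cvis = [sum(r >= 3 for r in item) / len(item) for item in ratings]
    return sum(item_cvis) / len(item_cvis), item_cvis

# Hypothetical panel of five experts rating four program items on a 4-point scale
ratings = [
    [4, 3, 4, 2, 4],  # one expert rates this item not relevant
    [4, 3, 4, 4, 4],
    [4, 4, 3, 4, 4],
    [4, 4, 4, 4, 3],
]
scale_cvi, item_cvis = content_validity_index(ratings)
print(round(scale_cvi, 2))
```

    Items whose I-CVI falls below a preset cutoff (commonly .78 for panels of this size) would be revised or dropped before the final guideline is issued.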

  3. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    NASA Astrophysics Data System (ADS)

    Fuchs, J. T.; Dunlap, B. H.; Clemens, J. C.; Meza, J. A.; Dennihy, E.; Koester, D.

    2017-03-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the log g–Teff plane are consistent for these three systematics.

  4. 3D Bioprinting of Tissue/Organ Models.

    PubMed

    Pati, Falguni; Gantelius, Jesper; Svahn, Helene Andersson

    2016-04-04

    In vitro tissue/organ models are useful platforms that can facilitate systematic, repetitive, and quantitative investigations of drugs/chemicals. The primary objective when developing tissue/organ models is to reproduce physiologically relevant functions that typically require complex culture systems. Bioprinting offers exciting prospects for constructing 3D tissue/organ models, as it enables the reproducible, automated production of complex living tissues. Bioprinted tissues/organs may prove useful for screening novel compounds or predicting toxicity, as the spatial and chemical complexity inherent to native tissues/organs can be recreated. In this Review, we highlight the importance of developing 3D in vitro tissue/organ models by 3D bioprinting techniques, characterization of these models for evaluating their resemblance to native tissue, and their application in the prioritization of lead candidates, toxicity testing, and as disease/tumor models. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The University of British Columbia model of interprofessional education.

    PubMed

    Charles, Grant; Bainbridge, Lesley; Gilbert, John

    2010-01-01

    The College of Health Disciplines at the University of British Columbia (UBC) has a long history of developing interprofessional learning opportunities for students and practitioners. Historically, many of the courses and programmes were developed because they intuitively made sense or because certain streams of funding were available at particular times. While each of them fits generally within our understanding of interprofessional education in health and human service education programs, they were not systematically developed within an educational or theoretical framework. This paper discusses the model we have subsequently developed at the College for conceptualizing the various types of interprofessional experiences offered at UBC. It has been developed so that we can offer the broadest range of courses and most effective learning experiences for our students. Our model is based on the premise that there are optimal learning times for health and human services students (and practitioners) depending upon their stage of development as professionals in their respective disciplines and their readiness to learn and develop new perspectives on professional interaction.

  6. Theoretical approaches of online social network interventions and implications for behavioral change: a systematic review.

    PubMed

    Arguel, Amaël; Perez-Concha, Oscar; Li, Simon Y W; Lau, Annie Y S

    2018-02-01

    The aim of this review was to identify general theoretical frameworks used in online social network interventions for behavioral change. To address this research question, a PRISMA-compliant systematic review (PROSPERO registration number CRD42014007555) was conducted using 3 electronic databases (PsycINFO, Pubmed, and Embase). Four reviewers screened 1788 abstracts, and 15 studies were selected according to the eligibility criteria. Randomized controlled trials and controlled studies were assessed using the Cochrane Collaboration's "risk-of-bias" tool, and findings were combined through narrative synthesis. Five eligible articles used the social cognitive theory as a framework to develop interventions targeting behavioral change. Other theoretical frameworks were related to the dynamics of social networks, intention models, and community engagement theories. Only one of the studies selected in the review mentioned a well-known theory from the field of health psychology. We conclude that guidelines are lacking for the design of online social network interventions for behavioral change. Existing theories and models from health psychology that are traditionally used for in situ behavioral change should be considered when designing online social network interventions in a health care setting. © 2016 John Wiley & Sons, Ltd.

  7. Role of conceptual models in a physical therapy curriculum: application of an integrated model of theory, research, and clinical practice.

    PubMed

    Darrah, Johanna; Loomis, Joan; Manns, Patricia; Norton, Barbara; May, Laura

    2006-11-01

    The Department of Physical Therapy, University of Alberta, Edmonton, Alberta, Canada, recently implemented a Master of Physical Therapy (MPT) entry-level degree program. As part of the curriculum design, two models were developed, a Model of Best Practice and the Clinical Decision-Making Model. Both models incorporate four key concepts of the new curriculum: 1) the concept that theory, research, and clinical practice are interdependent and inform each other; 2) the importance of client-centered practice; 3) the terminology and philosophical framework of the World Health Organization's International Classification of Functioning, Disability, and Health; and 4) the importance of evidence-based practice. In this article the general purposes of models for learning are described; the two models developed for the MPT program are described; and examples of their use with curriculum design and teaching are provided. Our experiences with both the development and use of models of practice have been positive. The models have provided both faculty and students with a simple, systematic, and structured framework to organize teaching and learning in the MPT program.

  8. Parental engagement in preventive parenting programs for child mental health: a systematic review of predictors and strategies to increase engagement

    PubMed Central

    Finan, Samantha J.; Swierzbiolek, Brooke; Priest, Naomi; Warren, Narelle

    2018-01-01

    Background Child mental health problems are now recognised as a key public health concern. Parenting programs have been developed as one solution to reduce children’s risk of developing mental health problems. However, their potential for widespread dissemination is hindered by low parental engagement, which includes intent to enrol, enrolment, and attendance. To increase parental engagement in preventive parenting programs, we need a better understanding of the predictors of engagement, and the strategies that can be used to enhance engagement. Method Employing a PRISMA method, we conducted a systematic review of the predictors of parent engagement and engagement enhancement strategies in preventive parenting programs. Key inclusion criteria included: (1) the intervention is directed primarily at the parent, (2) parent age >18 years, (3) the article is written in English, and (4) the article was published between 2004 and 2016. Stouffer’s method of combining p-values was used to determine whether associations between variables were reliable. Results Twenty-three articles reported a variety of predictors of parental engagement and engagement enhancement strategies. Only one of eleven predictors (child mental health symptoms) demonstrated a reliable association with enrolment (Stouffer’s p < .01). Discussion There was a lack of consistent evidence for predictors of parental engagement. Nonetheless, preliminary evidence suggests that engagement enhancement strategies modelled on theories, such as the Health Belief Model and Theory of Planned Behaviour, may increase parents’ engagement. Systematic review registration PROSPERO CRD42014013664. PMID:29719737
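Stouffer's method, which the review above uses to decide whether associations replicate across studies, is straightforward to sketch. The following is a generic illustration of the method (not the authors' code), with hypothetical p-values; in practice `weights` would typically reflect sample sizes.

```python
from math import sqrt
from statistics import NormalDist

def stouffer_combined_p(p_values, weights=None):
    """Combine one-sided p-values across studies with Stouffer's Z method.

    Each p-value is mapped to a z-score; the (optionally weighted) z-scores
    are summed, renormalised, and mapped back to a combined p-value.
    """
    nd = NormalDist()
    if weights is None:
        weights = [1.0] * len(p_values)
    z_scores = [nd.inv_cdf(1.0 - p) for p in p_values]
    z = sum(w * s for w, s in zip(weights, z_scores)) / sqrt(sum(w * w for w in weights))
    return 1.0 - nd.cdf(z)

# Three hypothetical studies reporting the same direction of association:
combined = stouffer_combined_p([0.04, 0.03, 0.20])  # below the .01 threshold
```

Note that the combined p-value can cross a significance threshold even when one contributing study does not, which is why the review reports pooled rather than individual significance.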

  9. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context Scientific software plays an important role in critical decision making, for example, making weather predictions based on climate models and computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798
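One widely discussed response to the oracle problem mentioned above is metamorphic testing: rather than asserting an exact expected output (which may be unknowable), the test asserts a relation that must hold between two runs of the code. A minimal sketch, using a toy random-walk simulation as the code under test; both the toy model and the scaling relation are illustrative, not drawn from the review.

```python
import math
import random

def mean_displacement(n_steps, step_size, seed):
    """Toy code under test: mean |displacement| of a 1-D random walk,
    averaged over 200 trials."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(200):
        x = 0.0
        for _ in range(n_steps):
            x += step_size if rng.random() < 0.5 else -step_size
        total += abs(x)
    return total / 200

# Metamorphic relation: with the same seed, scaling the step size by k
# must scale the mean displacement by exactly k. No exact oracle value
# for the displacement itself is ever needed.
base = mean_displacement(100, 1.0, seed=42)
scaled = mean_displacement(100, 2.0, seed=42)
relation_holds = math.isclose(scaled, 2.0 * base)
```

The same pattern (run twice, check an invariant) applies to far larger simulations where no analytical answer exists.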

  10. Model representations of kerogen structures: An insight from density functional theory calculations and spectroscopic measurements.

    PubMed

    Weck, Philippe F; Kim, Eunja; Wang, Yifeng; Kruichak, Jessica N; Mills, Melissa M; Matteo, Edward N; Pellenq, Roland J-M

    2017-08-01

    Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for the six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from the Mancos, Woodford, and Marcellus formations, representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.

  11. Asteroseismic modelling of solar-type stars: internal systematics from input physics and surface correction methods

    NASA Astrophysics Data System (ADS)

    Nsamba, B.; Campante, T. L.; Monteiro, M. J. P. F. G.; Cunha, M. S.; Rendle, B. M.; Reese, D. R.; Verma, K.

    2018-04-01

    Asteroseismic forward modelling techniques are being used to determine fundamental properties (e.g. mass, radius, and age) of solar-type stars. The need to take into account all possible sources of error is of paramount importance towards a robust determination of stellar properties. We present a study of 34 solar-type stars for which high signal-to-noise asteroseismic data is available from multi-year Kepler photometry. We explore the internal systematics of the stellar properties, that is, those associated with the uncertainty in the input physics used to construct the stellar models. In particular, we explore the systematics arising from: (i) the inclusion of the diffusion of helium and heavy elements; and (ii) the uncertainty in solar metallicity mixture. We also assess the systematics arising from (iii) different surface correction methods used in optimisation/fitting procedures. The systematics arising from comparing results of models with and without diffusion are found to be 0.5%, 0.8%, 2.1%, and 16% in mean density, radius, mass, and age, respectively. The internal systematics in age are significantly larger than the statistical uncertainties. We find the internal systematics resulting from the uncertainty in solar metallicity mixture to be 0.7% in mean density, 0.5% in radius, 1.4% in mass, and 6.7% in age. The surface correction method by Sonoi et al. and Ball & Gizon's two-term correction produce the lowest internal systematics among the different correction methods, namely, ˜1%, ˜1%, ˜2%, and ˜8% in mean density, radius, mass, and age, respectively. Stellar masses obtained using the surface correction methods by Kjeldsen et al. and Ball & Gizon's one-term correction are systematically higher than those obtained using frequency ratios.
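The internal systematics quoted above are, in essence, mean fractional offsets between the properties a pipeline infers for the same stars from two model grids. A toy illustration with invented numbers (not the paper's data):

```python
def fractional_systematic(values_a, values_b):
    """Mean absolute fractional difference between the properties
    inferred for the same stars from two stellar-model grids."""
    diffs = [abs(a - b) / ((a + b) / 2.0) for a, b in zip(values_a, values_b)]
    return sum(diffs) / len(diffs)

# Invented masses (solar units) for three stars, inferred from grids
# computed with and without diffusion:
mass_with = [1.02, 0.98, 1.10]
mass_without = [1.00, 1.00, 1.08]
sys_mass = fractional_systematic(mass_with, mass_without)  # ~2 per cent
```

A systematic of this size in mass propagates into a much larger age systematic, since stellar age is a steep function of mass.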

  12. Estimating the Spatial Extent of Unsaturated Zones in Heterogeneous River-Aquifer Systems

    NASA Astrophysics Data System (ADS)

    Schilling, Oliver S.; Irvine, Dylan J.; Hendricks Franssen, Harrie-Jan; Brunner, Philip

    2017-12-01

    The presence of unsaturated zones at the river-aquifer interface has large implications on numerous hydraulic and chemical processes. However, the hydrological and geological controls that influence the development of unsaturated zones have so far only been analyzed with simplified conceptualizations of flow processes, or homogeneous conceptualizations of the hydraulic conductivity in either the aquifer or the riverbed. We systematically investigated the influence of heterogeneous structures in both the riverbed and the aquifer on the development of unsaturated zones. A stochastic 1-D criterion that takes both riverbed and aquifer heterogeneity into account was developed using a Monte Carlo sampling technique. The approach allows the reliable estimation of the upper bound of the spatial extent of unsaturated areas underneath a riverbed. Through systematic numerical modeling experiments, we furthermore show that horizontal capillary forces can reduce the spatial extent of unsaturated zones under clogged areas. This analysis shows how the spatial structure of clogging layers and aquifers influences the propensity for unsaturated zones to develop: In riverbeds where clogged areas are made up of many small, spatially disconnected patches with a diameter in the order of 1 m, unsaturated areas are less likely to develop compared to riverbeds where large clogged areas exist adjacent to unclogged areas. A combination of the stochastic 1-D criterion with an analysis of the spatial structure of the clogging layers and the potential for resaturation can help develop an appropriate conceptual model and inform the choice of a suitable numerical simulator for river-aquifer systems.
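The stochastic 1-D criterion described above can be caricatured as a Monte Carlo estimate over lognormally distributed riverbed and aquifer conductivities. The disconnection threshold used here (riverbed K an order of magnitude below aquifer K) and all parameter values are illustrative placeholders, not the published criterion.

```python
import math
import random

def fraction_unsaturated(mu_bed, sigma_bed, mu_aq, sigma_aq,
                         n=10_000, seed=1):
    """Monte Carlo estimate of the probability that a simple 1-D
    disconnection criterion is met, with riverbed (clogging-layer) and
    aquifer hydraulic conductivities drawn from lognormal distributions.

    The criterion (riverbed K at least an order of magnitude below
    aquifer K) is an illustrative placeholder, not the published one.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        k_bed = math.exp(rng.gauss(mu_bed, sigma_bed))  # clogging-layer K
        k_aq = math.exp(rng.gauss(mu_aq, sigma_aq))     # aquifer K
        if k_bed < 0.1 * k_aq:
            hits += 1
    return hits / n

# Strongly clogged riverbed (ln K, K in m/s): unsaturated zones likely.
p_unsat = fraction_unsaturated(mu_bed=-14.0, sigma_bed=1.0,
                               mu_aq=-11.0, sigma_aq=1.0)
```

Because the criterion only needs the two conductivity distributions, this kind of sampling yields an upper-bound probability without running a full variably saturated flow model.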

  13. Pain management: a review of organisation models with integrated processes for the management of pain in adult cancer patients.

    PubMed

    Brink-Huis, Anita; van Achterberg, Theo; Schoonhoven, Lisette

    2008-08-01

    This paper reports a review of the literature conducted to identify organisation models in cancer pain management that contain integrated care processes and describe their effectiveness. Pain is experienced by 30-50% of cancer patients receiving treatment and by 70-90% of those with advanced disease. Efforts to improve pain management have been made through the development and dissemination of clinical guidelines. Early improvements in pain management were focussed on just one or two single processes such as pain assessment and patient education. Little is known about organisational models with multiple integrated processes throughout the course of the disease trajectory and concerning all stages of the care process. Systematic review. The review involved a systematic search of the literature published between 1986 and 2006. Subject-specific keywords used to describe patients, disease, pain management interventions, and integrated care processes relevant to this review were selected using the thesaurus of the databases. Institutional models, clinical pathways and consultation services are three alternative models for the integration of care processes in cancer pain management. A clinical pathway is a comprehensive institutionalisation model, whereas a pain consultation service is a 'stand-alone' model that can be integrated in a clinical pathway. Positive patient and process outcomes have been described for all three models, although the level of evidence is generally low. Evaluation of the quality of pain management must involve standardised measurements of both patient and process outcomes. We recommend the development of policies for referrals to a pain consultation service. These policies can be integrated within a clinical pathway. To evaluate the effectiveness of pain management models standardised outcome measures are needed.

  14. The Application of Adaptive Behaviour Models: A Systematic Review

    PubMed Central

    Price, Jessica A.; Morris, Zoe A.; Costello, Shane

    2018-01-01

    Adaptive behaviour has been viewed broadly as an individual’s ability to meet the standards of social responsibilities and independence; however, this definition has been a source of debate amongst researchers and clinicians. Based on the rich history and the importance of the construct of adaptive behaviour, the current study aimed to provide a comprehensive overview of the application of adaptive behaviour models to assessment tools, through a systematic review. A plethora of assessment measures for adaptive behaviour have been developed in order to adequately assess the construct; however, it appears that the only definition on which authors seem to agree is that adaptive behaviour is what adaptive behaviour scales measure. The importance of the construct for diagnosis, intervention and planning has been highlighted throughout the literature. It is recommended that researchers and clinicians critically review what measures of adaptive behaviour they are utilising, and it is suggested that the definition and theory are revisited. PMID:29342927

  15. Direct-to-consumer advertising of prescription medicines: a theoretical approach to understanding.

    PubMed

    Harker, Michael; Harker, Debra

    2007-01-01

    The pharmaceutical industry is a leader in research and development investment. New treatments need to be communicated to the market, and consumers are increasingly interested in learning about new drugs. Direct to consumer advertising of prescription medicines (DTCA) is a controversial practice where many of the arguments for and against are not supported by strong evidence. This paper aims to contribute to a research agenda that is forming in this area. The paper reports on a systematic review that was conducted and applies accepted theoretical models to the DTCA context. The systematic review methodology is widely accepted in the medical sector and is successfully applied here in the marketing field. The hierarchy of effects model is specifically applied to DTCA with a clear emphasis on consumer rights, empowerment, protection and knowledge. This paper provides healthcare practitioners with insight into how consumers process DTCA messages and provides guidance into how to assist in this message processing.

  16. A review of unmanned aircraft system ground risk models

    NASA Astrophysics Data System (ADS)

    Washington, Achim; Clothier, Reece A.; Silva, Jose

    2017-11-01

    There is much effort being directed towards the development of safety regulations for unmanned aircraft systems (UAS). National airworthiness authorities have advocated the adoption of a risk-based approach, whereby regulations are driven by the outcomes of a systematic process to assess and manage identified safety risks. Subsequently, models characterising the primary hazards associated with UAS operations have now become critical to the development of regulations and in turn, to the future of the industry. Key to the development of airworthiness regulations for UAS is a comprehensive understanding of the risks UAS operations pose to people and property on the ground. A comprehensive review of the literature identified 33 different models (and component sub models) used to estimate ground risk posed by UAS. These models comprise failure, impact location, recovery, stress, exposure, incident stress and harm sub-models. The underlying assumptions and treatment of uncertainties in each of these sub-models differ significantly between models, which can have a significant impact on the development of regulations. This paper reviews the state-of-the-art in research into UAS ground risk modelling, discusses how the various sub-models relate to the different components of the regulation, and explores how model-uncertainties potentially impact the development of regulations for UAS.
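The chain of sub-models identified above (failure, impact footprint, exposure, harm) is commonly combined multiplicatively into an expected casualty rate. A minimal sketch with entirely hypothetical numbers; the function name and parameter values are illustrative, not drawn from any of the 33 reviewed models.

```python
def expected_casualty_rate(failure_rate_per_fh, impact_area_m2,
                           population_density_per_m2, shelter_factor,
                           p_fatal_given_exposed):
    """Expected ground fatalities per flight hour from a multiplicative
    chain of sub-models: failure -> impact footprint -> exposure -> harm.

    shelter_factor is the fraction of exposed people NOT protected by
    buildings or vehicles (1.0 = no sheltering).
    """
    exposed = impact_area_m2 * population_density_per_m2 * shelter_factor
    return failure_rate_per_fh * exposed * p_fatal_given_exposed

# Hypothetical small-UAS example over a suburban area:
rate = expected_casualty_rate(
    failure_rate_per_fh=1e-4,        # catastrophic failures per flight hour
    impact_area_m2=10.0,             # lethal impact footprint
    population_density_per_m2=1e-3,  # ~1000 people per km^2
    shelter_factor=0.3,
    p_fatal_given_exposed=0.5,
)
```

Because the chain is a product, a factor-of-ten uncertainty in any single sub-model shifts the final risk estimate by a factor of ten, which is why the paper's point about sub-model assumptions matters for regulation.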

  17. Constraints and Opportunities in GCM Model Development

    NASA Technical Reports Server (NTRS)

    Schmidt, Gavin; Clune, Thomas

    2010-01-01

    Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.

  18. PARAGON: A Systematic, Integrated Approach to Aerosol Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Diner, David J.; Kahn, Ralph A.; Braverman, Amy J.; Davies, Roger; Martonchik, John V.; Menzies, Robert T.; Ackerman, Thomas P.; Seinfeld, John H.; Anderson, Theodore L.; Charlson, Robert J.; hide

    2004-01-01

    Aerosols are generated and transformed by myriad processes operating across many spatial and temporal scales. Evaluation of climate models and their sensitivity to changes, such as in greenhouse gas abundances, requires quantifying natural and anthropogenic aerosol forcings and accounting for other critical factors, such as cloud feedbacks. High accuracy is required to provide sufficient sensitivity to perturbations, separate anthropogenic from natural influences, and develop confidence in inputs used to support policy decisions. Although many relevant data sources exist, the aerosol research community does not currently have the means to combine these diverse inputs into an integrated data set for maximum scientific benefit. Bridging observational gaps, adapting to evolving measurements, and establishing rigorous protocols for evaluating models are necessary, while simultaneously maintaining consistent, well understood accuracies. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept represents a systematic, integrated approach to global aerosol characterization, bringing together modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies to provide the machinery necessary for achieving a comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system. We outline a framework for integrating and interpreting observations and models and establishing an accurate, consistent and cohesive long-term data record.

  19. Automated Systematic Generation and Exploration of Flat Direction Phenomenology in Free Fermionic Heterotic String Theory

    NASA Astrophysics Data System (ADS)

    Greenwald, Jared

    Any good physical theory must resolve current experimental data as well as offer predictions for potential searches in the future. The Standard Model of particle physics, Grand Unified Theories, Minimal Supersymmetric Models and Supergravity are all attempts to provide such a framework. However, they all lack the ability to predict many of the parameters that each of the theories utilize. String theory may yield a solution to this naturalness (or self-predictiveness) problem as well as offer a unified theory of gravity. Studies in particle physics phenomenology based on perturbative low energy analysis of various string theories can help determine the candidacy of such models. After a review of principles and problems leading up to our current understanding of the universe, we will discuss some of the best particle physics model building techniques that have been developed using string theory. This will culminate in the introduction of a novel approach to a computational, systematic analysis of the various physical phenomena that arise from these string models. We focus on the necessary assumptions, complexity and open questions that arise while building a fully automated flat direction analysis program.

  20. Contributions of Dynamic Systems Theory to Cognitive Development

    PubMed Central

    Spencer, John P.; Austin, Andrew; Schutte, Anne R.

    2015-01-01

    This paper examines the contributions of dynamic systems theory to the field of cognitive development, focusing on modeling using dynamic neural fields. A brief overview highlights the contributions of dynamic systems theory and the central concepts of dynamic field theory (DFT). We then probe empirical predictions and findings generated by DFT around two examples—the DFT of infant perseverative reaching that explains the Piagetian A-not-B error, and the DFT of spatial memory that explains changes in spatial cognition in early development. A systematic review of the literature around these examples reveals that computational modeling is having an impact on empirical research in cognitive development; however, this impact does not extend to neural and clinical research. Moreover, there is a tendency for researchers to interpret models narrowly, anchoring them to specific tasks. We conclude on an optimistic note, encouraging both theoreticians and experimentalists to work toward a more theory-driven future. PMID:26052181

  1. Development and validation of risk models and molecular diagnostics to permit personalized management of cancer.

    PubMed

    Pu, Xia; Ye, Yuanqing; Wu, Xifeng

    2014-01-01

    Despite the advances made in cancer management over the past few decades, cancer diagnosis and prognosis remain imperfect, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment from diagnosis to treatment and clinical outcomes based on the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review using breast cancer as an example and provide general guidelines for risk prediction models and molecular diagnostic tools, including development, assessment, and validation. © 2013 American Cancer Society.

  2. Evidence-Based Guidelines for Fatigue Risk Management in EMS: Formulating Research Questions and Selecting Outcomes.

    PubMed

    Patterson, P Daniel; Higgins, J Stephen; Lang, Eddy S; Runyon, Michael S; Barger, Laura K; Studnek, Jonathan R; Moore, Charity G; Robinson, Kathy; Gainor, Dia; Infinger, Allison; Weiss, Patricia M; Sequeira, Denisse J; Martin-Gill, Christian

    2017-01-01

    Greater than half of Emergency Medical Services (EMS) personnel report work-related fatigue, yet there are no guidelines for the management of fatigue in EMS. A novel process has been established for evidence-based guideline (EBG) development germane to clinical EMS questions. This process has not yet been applied to operational EMS questions like fatigue risk management. The objective of this study was to develop content valid research questions in the Population, Intervention, Comparison, and Outcome (PICO) framework, and select outcomes to guide systematic reviews and development of EBGs for EMS fatigue risk management. We adopted the National Prehospital EBG Model Process and Grading of Recommendations Assessment, Development, and Evaluation (GRADE) framework for developing, implementing, and evaluating EBGs in the prehospital care setting. In accordance with steps one and two of the Model Process, we searched for existing EBGs, developed a multi-disciplinary expert panel and received external input. Panelists completed an iterative process to formulate research questions. We used the Content Validity Index (CVI) to score relevance and clarity of candidate PICO questions. The panel completed multiple rounds of question editing and used a CVI benchmark of ≥0.78 to indicate acceptable levels of clarity and relevance. Outcomes for each PICO question were rated from 1 = less important to 9 = critical. Panelists formulated 13 candidate PICO questions, of which 6 were eliminated or merged with other questions. Panelists reached consensus on seven PICO questions (n = 1 diagnosis and n = 6 intervention). Final CVI scores of relevance ranged from 0.81 to 1.00. Final CVI scores of clarity ranged from 0.88 to 1.00. The mean number of outcomes rated as critical, important, and less important by PICO question was 0.7 (SD 0.7), 5.4 (SD 1.4), and 3.6 (SD 1.9), respectively. Patient and personnel safety were rated as critical for most PICO questions. PICO questions and outcomes were registered with PROSPERO, an international database of prospectively registered systematic reviews. We describe formulating and refining research questions and selection of outcomes to guide systematic reviews germane to EMS fatigue risk management. We outline a protocol for applying the Model Process and GRADE framework to create evidence-based guidelines.
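The Content Validity Index used above is simply the proportion of panelists rating an item as relevant (3 or 4 on a 4-point scale), compared against the 0.78 benchmark. A minimal sketch with hypothetical ratings:

```python
def content_validity_index(ratings, relevant_min=3):
    """Item-level CVI: the proportion of expert panelists who rate the
    item as relevant (3 or 4 on a 4-point scale)."""
    return sum(1 for r in ratings if r >= relevant_min) / len(ratings)

def meets_benchmark(ratings, benchmark=0.78):
    """Apply the commonly used 0.78 acceptability cut-off."""
    return content_validity_index(ratings) >= benchmark

# Nine hypothetical panelists rate one candidate PICO question:
ratings = [4, 4, 3, 4, 3, 4, 2, 4, 3]
cvi = content_validity_index(ratings)  # 8/9, above the benchmark
```

In the study above, questions falling below the benchmark were edited and re-rated over multiple rounds rather than discarded outright.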

  3. Early warning systems for the management of chronic heart failure: a systematic literature review of cost-effectiveness models.

    PubMed

    Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L

    2018-04-01

    Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients, and assessing their methodological quality, is expected to provide concise and useful insight to inform the future development of such models in heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regard to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely the consideration and discussion of any competing theories regarding model structure and disease progression, the identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.
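Decision-analytical models in this area are frequently cohort Markov models. A deliberately simplified three-state sketch (stable heart failure, hospitalised, dead) is given below; the transition probabilities, costs, and utilities are invented for illustration, and discounting and half-cycle correction are omitted.

```python
def markov_costs_qalys(trans, costs, utilities, start, cycles):
    """Run a cohort Markov model; returns (total cost, total QALYs)
    per patient over `cycles` yearly cycles (no discounting, no
    half-cycle correction)."""
    state = list(start)
    total_cost = 0.0
    total_qaly = 0.0
    for _ in range(cycles):
        total_cost += sum(p * c for p, c in zip(state, costs))
        total_qaly += sum(p * u for p, u in zip(state, utilities))
        # Advance the cohort one cycle through the transition matrix.
        state = [sum(state[i] * trans[i][j] for i in range(len(state)))
                 for j in range(len(state))]
    return total_cost, total_qaly

# States: stable HF, hospitalised, dead. All numbers are invented.
usual_care = [[0.85, 0.10, 0.05], [0.60, 0.20, 0.20], [0.0, 0.0, 1.0]]
early_warn = [[0.90, 0.06, 0.04], [0.65, 0.17, 0.18], [0.0, 0.0, 1.0]]
utilities = [0.75, 0.50, 0.0]
c0, q0 = markov_costs_qalys(usual_care, [2000.0, 12000.0, 0.0],
                            utilities, [1.0, 0.0, 0.0], cycles=10)
c1, q1 = markov_costs_qalys(early_warn, [2500.0, 12000.0, 0.0],
                            utilities, [1.0, 0.0, 0.0], cycles=10)
icer = (c1 - c0) / (q1 - q0)  # negative if the warning system is cost-saving
```

The review's point about competing model structures and uncertainty analyses amounts to asking how conclusions such as this ICER change when the state structure or the invented parameters above are varied.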

  4. Understanding system dynamics of an adaptive enzyme network from globally profiled kinetic parameters.

    PubMed

    Chiang, Austin W T; Liu, Wei-Chung; Charusanti, Pep; Hwang, Ming-Jing

    2014-01-15

    A major challenge in mathematical modeling of biological systems is to determine how model parameters contribute to systems dynamics. As biological processes are often complex in nature, it is desirable to address this issue using a systematic approach. Here, we propose a simple methodology that first performs an enrichment test to find patterns in the values of globally profiled kinetic parameters with which a model can produce the required system dynamics; this is then followed by a statistical test to elucidate the association between individual parameters and different parts of the system's dynamics. We demonstrate our methodology on a prototype biological system of perfect adaptation dynamics, namely the chemotaxis model for Escherichia coli. Our results agreed well with those derived from experimental data and theoretical studies in the literature. Using this model system, we showed that there are motifs in kinetic parameters and that these motifs are governed by constraints of the specified system dynamics. A systematic approach based on enrichment statistical tests has been developed to elucidate the relationships between model parameters and the roles they play in affecting system dynamics of a prototype biological network. The proposed approach is generally applicable and therefore can find wide use in systems biology modeling research.
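The enrichment idea described above can be sketched with a one-sided hypergeometric test: are parameter sets that produce the target dynamics over-represented in a given bin of a parameter's sampled range? All counts below are hypothetical, and the test statistic is a generic choice, not necessarily the authors' exact procedure.

```python
from math import comb

def hypergeom_enrichment_p(total, total_hits, drawn, drawn_hits):
    """One-sided hypergeometric p-value: the probability of seeing at
    least `drawn_hits` dynamics-producing parameter sets among the
    `drawn` sets in one bin, given `total_hits` successes among
    `total` globally sampled parameter sets."""
    denom = comb(total, drawn)
    p = 0.0
    for k in range(drawn_hits, min(drawn, total_hits) + 1):
        p += comb(total_hits, k) * comb(total - total_hits, drawn - k) / denom
    return p

# 1000 globally sampled parameter sets, of which 100 yield perfect
# adaptation; of the 50 sets in the top bin of one kinetic parameter,
# 20 adapt (far above the ~5 expected by chance):
p_enrich = hypergeom_enrichment_p(1000, 100, 50, 20)
```

A small p-value for a bin suggests that the corresponding region of parameter space is a "motif" associated with the required dynamics, which is then followed up with the per-parameter statistical tests the abstract mentions.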

  5. Building Sustainable Professional Development Programs: Applying Strategies From Implementation Science to Translate Evidence Into Practice.

    PubMed

    Baldwin, Constance D; Chandran, Latha; Gusic, Maryellen E

    2017-01-01

    Multisite and national professional development (PD) programs for educators are challenging to establish. Use of implementation science (IS) frameworks designed to convert evidence-based intervention methods into effective health care practice may help PD developers translate proven educational methods and models into successful, well-run programs. Implementation of the national Educational Scholars Program (ESP) is used to illustrate the value of the IS model. Four adaptable elements of IS are described: (1) replication of an evidence-based model, (2) systematic stages of implementation, (3) management of implementation using three implementation drivers, and (4) demonstration of program success through measures of fidelity to proven models and sustainability. Implementation of the ESP was grounded on five established principles and methods for successful PD. The process was conducted in four IS stages over 10 years: Exploration, Installation, Initial Implementation, and Full Implementation. To ensure effective and efficient processes, attention to IS implementation drivers helped to manage organizational relationships, build competence in faculty and scholars, and address leadership challenges. We describe the ESP's fidelity to evidence-based structures and methods, and offer three examples of sustainability efforts that enabled achievement of targeted program outcomes, including academic productivity, strong networking, and career advancement of scholars. Application of IS frameworks to program implementation may help other PD programs to translate evidence-based methods into interventions with enhanced impact. A PD program can follow systematic developmental stages and be operationalized by practical implementation drivers, thereby creating successful and sustainable interventions that promote the academic vitality of health professions educators.

  6. Strategies employed by older people to manage loneliness: systematic review of qualitative studies and model development.

    PubMed

    Kharicha, Kalpa; Manthorpe, Jill; Iliffe, Steve; Davies, Nathan; Walters, Kate

    2018-05-25

    Objectives: To (i) systematically identify and review strategies employed by community dwelling lonely older people to manage their loneliness and (ii) develop a model for managing loneliness. A narrative synthesis review of English-language qualitative evidence, following Economic and Social Research Council guidance. Seven electronic databases were searched (1990-January 2017). The narrative synthesis included tabulation, thematic analysis, and conceptual model development. All co-authors assessed eligibility of final papers and reached a consensus on analytic themes. From 3,043 records, 11 studies were eligible including a total of 502 older people. Strategies employed to manage loneliness can be described by a model with two overarching dimensions, one related to the context of coping (alone or with/in reference to others), the other related to strategy type (prevention/action or acceptance/endurance of loneliness). The dynamic and subjective nature of loneliness is reflected in the variety of coping mechanisms, drawing on individual coping styles and highlighting considerable efforts in managing time, contacting others, and keeping loneliness hidden. Cognitive strategies were used to re-frame negative feelings, to make them more manageable or to shift the focus from the present or themselves. Few unsuccessful strategies were described. Strategies to manage loneliness vary from prevention/action through to acceptance and endurance. There are distinct preferences to cope alone or involve others; only those in the latter category are likely to engage with services and social activities. Older people who deal with their loneliness privately may find it difficult to articulate an inability to cope.

  7. Leaders' experiences and perceptions implementing activity-based funding and pay-for-performance hospital funding models: A systematic review.

    PubMed

    Baxter, Pamela E; Hewko, Sarah J; Pfaff, Kathryn A; Cleghorn, Laura; Cunningham, Barbara J; Elston, Dawn; Cummings, Greta G

    2015-08-01

    Providing cost-effective, accessible, high quality patient care is a challenge to governments and health care delivery systems across the globe. In response to this challenge, two types of hospital funding models have been widely implemented: (1) activity-based funding (ABF) and (2) pay-for-performance (P4P). Although health care leaders play a critical role in the implementation of these funding models, to date their perspectives have not been systematically examined. The purpose of this systematic review was to gain a better understanding of the experiences of health care leaders implementing hospital funding reforms within Organisation for Economic Cooperation and Development countries. We searched literature from 1982 to 2013 using: Medline, EMBASE, CINAHL, Academic Search Complete, Academic Search Elite, and Business Source Complete. Two independent reviewers screened titles, abstracts and full texts using predefined criteria. We included 2 mixed methods and 12 qualitative studies. Thematic analysis was used in synthesizing results. Five common themes and multiple subthemes emerged. Themes include: pre-requisites for success, perceived benefits, barriers/challenges, unintended consequences, and leader recommendations. Irrespective of which type of hospital funding reform was implemented, health care leaders described a complex process requiring the following: organizational commitment; adequate infrastructure; human, financial and information technology resources; change champions and a personal commitment to quality care. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine performance tests readily accessible will help advance a more transparent model evaluation process.
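
Routine benchmarking of the kind described above amounts to scoring each model version against a fixed reference with standard summary statistics. The following is a hedged sketch of that idea (not PCMDI's actual tools): area-mean bias and RMSE against a reference field, so a new model version can be compared with its predecessor as soon as output is available.

```python
import math

def benchmark(field, reference):
    """Mean bias and RMSE between a model field and a reference,
    both given as equal-length flat lists of grid-point values."""
    diffs = [m - r for m, r in zip(field, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"bias": bias, "rmse": rmse}

reference = [288.0, 290.5, 295.2, 287.1]  # e.g. observed temperatures (K)
versions = {
    "model-v1": [289.0, 291.0, 296.5, 288.0],
    "model-v2": [288.2, 290.4, 295.6, 287.3],
}
scores = {name: benchmark(out, reference) for name, out in versions.items()}
# A routine check: did the newer version reduce RMSE against the reference?
improved = scores["model-v2"]["rmse"] < scores["model-v1"]["rmse"]
```

In practice such metrics would be computed per variable, season, and region with proper area weighting; the point is that the comparison is mechanical and repeatable, requiring no new publication per model version.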

  9. Model-data integration to improve the LPJmL dynamic global vegetation model

    NASA Astrophysics Data System (ADS)

    Forkel, Matthias; Thonicke, Kirsten; Schaphoff, Sibyll; Thurner, Martin; von Bloh, Werner; Dorigo, Wouter; Carvalhais, Nuno

    2017-04-01

    Dynamic global vegetation models show large uncertainties regarding the development of the land carbon balance under future climate change conditions. This uncertainty is partly caused by differences in how vegetation carbon turnover is represented in global vegetation models. Model-data integration approaches might help to systematically assess and improve model performances and thus to potentially reduce the uncertainty in terrestrial vegetation responses under future climate change. Here we present several applications of model-data integration with the LPJmL (Lund-Potsdam-Jena managed Lands) dynamic global vegetation model to systematically improve the representation of processes or to estimate model parameters. In a first application, we used global satellite-derived datasets of FAPAR (fraction of absorbed photosynthetically active radiation), albedo, and gross primary production to estimate phenology- and productivity-related model parameters using a genetic optimization algorithm. Thereby we identified major limitations of the phenology module and implemented an alternative empirical phenology model. The new phenology module and optimized model parameters resulted in a better performance of LPJmL in representing global spatial patterns of biomass, tree cover, and the temporal dynamic of atmospheric CO2. In a second application, we therefore additionally used global datasets of biomass and land cover to estimate model parameters that control vegetation establishment and mortality. The results demonstrate the ability to improve simulations of vegetation dynamics but also highlight the need to improve the representation of mortality processes in dynamic global vegetation models. In a third application, we used multiple site-level observations of ecosystem carbon and water exchange, biomass, and soil organic carbon to jointly estimate various model parameters that control ecosystem dynamics. This exercise demonstrated the strong influence of individual data streams on the simulated ecosystem dynamics, which in turn changed the development of ecosystem carbon stocks and fluxes under future climate and CO2 change. In summary, our results demonstrate both the challenges and the potential of using model-data integration approaches to improve a dynamic global vegetation model.
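
The parameter-estimation step described above can be illustrated with a toy example: fitting a logistic green-up curve (a hypothetical FAPAR-like phenology response to temperature) to synthetic "observations" with a minimal genetic algorithm. The model, data, and GA settings are illustrative stand-ins, not LPJmL's actual phenology scheme or optimizer.

```python
import math
import random

def phenology(temp, t50, slope):
    """Toy FAPAR-like response: logistic green-up around temperature t50."""
    return 1.0 / (1.0 + math.exp(-slope * (temp - t50)))

def cost(params, data):
    """Sum of squared errors between model and observations."""
    t50, slope = params
    return sum((phenology(t, t50, slope) - obs) ** 2 for t, obs in data)

random.seed(1)
true_t50, true_slope = 8.0, 0.7
data = [(t, phenology(t, true_t50, true_slope)) for t in range(-5, 30)]

# (mu + lambda) genetic search: keep the fittest, mutate them to refill the pool.
pop = [(random.uniform(0, 20), random.uniform(0.1, 2.0)) for _ in range(30)]
for generation in range(60):
    pop.sort(key=lambda p: cost(p, data))
    parents = pop[:10]
    pop = parents + [(p[0] + random.gauss(0, 0.5),
                      max(0.05, p[1] + random.gauss(0, 0.1)))
                     for p in random.choices(parents, k=20)]
best = min(pop, key=lambda p: cost(p, data))
```

The real application replaces the toy curve with LPJmL's phenology module and the synthetic data with satellite retrievals, but the optimization loop has the same shape: propose parameter vectors, score them against observations, and select.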

  10. Rethinking School Effectiveness and Improvement: A Question of Paradigms

    ERIC Educational Resources Information Center

    Wrigley, Terry

    2013-01-01

    The purpose of this article is to contribute to progressive school change by developing a more systematic critique of school effectiveness (SE) and school improvement (SI) as paradigms. Diverse examples of paradigms and paradigm change in non-educational fields are used to create a model of paradigms for application to SE and SI, and to explore…

  11. LeaD-In: A Cultural Change Model for Peer Review of Teaching in Higher Education

    ERIC Educational Resources Information Center

    Barnard, A.; Nash, R.; McEvoy, K.; Shannon, S.; Waters, C.; Rochester, S.; Bolt, S.

    2015-01-01

    Peer review of teaching is recognized increasingly as one strategy for academic development even though historically peer review of teaching is often unsupported by policy, action and culture in many Australian universities. Higher education leaders report that academics generally do not engage with peer review of teaching in a systematic or…

  12. Developing a Contextual Consciousness: Learning to Address Gender, Societal Power, and Culture in Clinical Practice

    ERIC Educational Resources Information Center

    Esmiol, Elisabeth E.; Knudson-Martin, Carmen; Delgado, Sarah

    2012-01-01

    Despite the growing number of culturally sensitive training models and considerable literature on the importance of training clinicians in larger contextual issues, research examining how students learn to apply these issues is limited. In this participatory action research project, we systematically studied our own process as marriage and family…

  13. "Black Magic" and "Gold Dust": The Epistemic and Political Uses of Evidence Tools in Public Health Policy Making

    ERIC Educational Resources Information Center

    Stewart, Ellen; Smith, Katherine E.

    2015-01-01

    Concerns about the limited influence of research on decision making have prompted the development of tools intended to mediate evidence for policy audiences. This article focuses on three examples, prominent in public health: impact assessments; systematic reviews; and economic decision-making tools (cost-benefit analysis and scenario modelling).…

  14. Some Essential Environmental Ingredients for Sex Offender Reintegration

    ERIC Educational Resources Information Center

    Boer, Douglas P.

    2013-01-01

    Until the systematic work on the Good Lives Model (GLM) produced by Tony Ward, not a great deal of conceptual structure existed to provide sex offender treatment specialists with a theoretical underpinning for their work in helping offenders develop a better life as a way to prevent reoffending. However, the work of Ward and colleagues initially…

  15. Metrics, The Measure of Your Future: Evaluation Report, 1977.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Development.

    The primary goal of the Metric Education Project was the systematic development of a replicable educational model to facilitate the system-wide conversion to the metric system during the next five to ten years. This document is an evaluation of that project. Three sets of statistical evidence exist to support the fact that the project has been…

  16. Sustainable Design Re-Examined: Integrated Approach to Knowledge Creation for Sustainable Interior Design

    ERIC Educational Resources Information Center

    Lee, Young S.

    2014-01-01

    The article focuses on a systematic approach to the instructional framework to incorporate three aspects of sustainable design. It also aims to provide an instruction model for sustainable design stressing a collective effort to advance knowledge creation as a community. It develops a framework conjoining the concept of integrated process in…

  17. Effective Methodology for Teaching Beginning Reading in English to Bilingual Adults.

    ERIC Educational Resources Information Center

    Sainz, Jo-Ann; Biggins, Maria Goretti

    A systematic model for accelerating the process of developing the word decoding skills and building the vocabularies of bilingual adults was used among prison populations in Rockland County, Dutchess County, Suffolk County, and Essex County, New York, as well as in work-study programs in community centers in New York City. Literacy levels of the…

  18. Weaknesses of South African Education in the Mirror Image of International Educational Development

    ERIC Educational Resources Information Center

    Wolhuter, C. C.

    2014-01-01

    The aim of this article is to present a systematic, holistic evaluation of the South African education system, using international benchmarks as the yardstick. A theoretical model for the evaluation of a national education project is constructed. This consists of three dimensions, namely: a quantitative dimension, a qualitative dimension, and an…

  19. M-Readiness Assessment Model Development and Validation: Investigation of Readiness Index and Factors Affecting Readiness

    ERIC Educational Resources Information Center

    Bakhsh, Muhammad; Mahmood, Amjad; Sangi, Nazir Ahmed

    2018-01-01

    It is important for distance learning institutions to be well prepared before designing and implementing any new technology based learning system to justify the investment and minimize failure risk. It can be achieved by systematically assessing the readiness of all stakeholders. This paper first proposes an m-readiness assessment process and…

  20. An Analysis of the Learning Center in Community Colleges.

    ERIC Educational Resources Information Center

    Peterson, Gary T.

    A study was made to relate: (1) the concepts of a library of materials and (2) newer concepts such as instructional development activities which initiate a more scientific, systematic approach to the improvement and individualization of learning experiences. The major output of the study was to be a definitive model so that the fields of library…
