Science.gov

Sample records for systematic model development

  1. Systematic development of reduced reaction mechanisms for dynamic modeling

    NASA Technical Reports Server (NTRS)

    Frenklach, M.; Kailasanath, K.; Oran, E. S.

    1986-01-01

A method for systematically developing a reduced chemical reaction mechanism for dynamic modeling of chemically reactive flows is presented. The method is based on the postulate that if a reduced reaction mechanism faithfully describes the time evolution of both the thermal and chain reaction processes characteristic of a more complete mechanism, then the reduced mechanism will describe the chemical processes in a chemically reacting flow with approximately the same degree of accuracy. Here this postulate is tested by producing a series of reduced mechanisms of varying accuracy, derived from a full detailed mechanism for methane-oxygen combustion. These mechanisms were then tested in a series of reactive flow calculations in which a large-amplitude sinusoidal perturbation is applied to a system that is initially quiescent and whose temperature is high enough to start ignition processes. Comparison of the results for systems with and without convective flow shows that this approach produces reduced mechanisms that are useful for calculations of explosions and detonations. Extensions and applicability to flames are discussed.
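    The postulate above can be illustrated with a deliberately tiny numerical sketch: if a one-step "reduced" rate law reproduces the thermal runaway of a two-step "full" rate law, the predicted ignition delays should agree closely. All rate parameters below are hypothetical and chosen only to make the toy ODE ignite quickly; this is not the paper's methane-oxygen mechanism.

```python
import math

def ignition_delay(rate_params, t0=1200.0, t_ign=1800.0, dt=1e-4, t_end=1.0):
    """Forward-Euler integration of a toy thermal-ignition ODE,
    dT/dt = sum_i A_i * exp(-Ea_i / T), until T crosses t_ign.
    Returns the ignition delay time, or None if no ignition by t_end."""
    T, t = t0, 0.0
    while t < t_end:
        T += sum(A * math.exp(-Ea / T) for A, Ea in rate_params) * dt
        t += dt
        if T >= t_ign:
            return t
    return None

# "Full" mechanism: two parallel heat-release paths (hypothetical A, Ea pairs).
full_mech = [(4.0e9, 2.0e4), (1.0e8, 1.5e4)]
# "Reduced" mechanism: a single lumped step tuned to mimic the full one.
reduced_mech = [(1.0e10, 2.0e4)]

tau_full = ignition_delay(full_mech)
tau_reduced = ignition_delay(reduced_mech)
```

    Under these made-up parameters the two ignition delays agree to within tens of percent, which is the kind of agreement the reduction procedure aims to achieve systematically for the real mechanism.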

  2. Development of a Curriculum Integrated Library-Use Instructional Model: A Systematic Approach.

    ERIC Educational Resources Information Center

    Southwick, Neal S.

    A project was undertaken at Ricks College to develop a systematic library-use instructional model to be integrated into the existing genealogy curriculum. In addition to surveying relevant literature and making appropriate contacts with two- and four-year academic libraries, a needs assessment was conducted, instructional objectives were written,…

  3. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of the methods used to develop risk prediction models for postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, that described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on how each prediction model was developed, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies, each developing one risk prediction model, were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended for building a multivariable model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could undermine model development, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the estimated probability of postoperative pancreatic fistula. PMID:27303124
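    The events-per-variable (EPV) check criticized in this review is straightforward to compute. The sketch below uses made-up study summaries (not the seven reviewed models) to flag studies that fall below the conventional threshold of 10 events per candidate predictor.

```python
# Hypothetical study summaries; numbers are illustrative, not from the review.
studies = [
    {"name": "Study A", "events": 95,  "candidate_predictors": 9},
    {"name": "Study B", "events": 60,  "candidate_predictors": 32},
    {"name": "Study C", "events": 210, "candidate_predictors": 15},
]

def events_per_variable(events, n_predictors):
    """Events per candidate variable (EPV); EPV < 10 is the usual red flag."""
    return events / n_predictors

# Flag studies whose EPV falls below the conventional threshold of 10.
flagged = [s["name"] for s in studies
           if events_per_variable(s["events"], s["candidate_predictors"]) < 10]
```

    With these invented numbers only "Study B" (60 events against 32 candidate predictors) is flagged, illustrating how a large candidate-predictor list can quietly violate the EPV guideline even in a moderately sized cohort.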

  4. Current Developments in Dementia Risk Prediction Modelling: An Updated Systematic Review

    PubMed Central

    Tang, Eugene Y. H.; Harrison, Stephanie L.; Errington, Linda; Gordon, Mark F.; Visser, Pieter Jelle; Novak, Gerald; Dufouil, Carole; Brayne, Carol; Robinson, Louise; Launer, Lenore J.; Stephan, Blossom C. M.

    2015-01-01

Background Accurate identification of individuals at high risk of dementia influences clinical care, inclusion criteria for clinical trials and development of preventative strategies. Numerous models have been developed for predicting dementia. To evaluate these models we undertook a systematic review in 2010 and updated this in 2014 due to the increase in research published in this area. Here we include a critique of the variables selected for inclusion and an assessment of model prognostic performance. Methods Our previous systematic review was updated with a search from January 2009 to March 2014 in electronic databases (MEDLINE, Embase, Scopus, Web of Science). Articles examining risk of dementia in non-demented individuals and including measures of sensitivity, specificity or the area under the curve (AUC) or c-statistic were included. Findings In total, 1,234 articles were identified from the search; 21 articles met inclusion criteria. New developments in dementia risk prediction include the testing of non-APOE genes, use of non-traditional dementia risk factors, incorporation of diet, physical function and ethnicity, and model development in specific subgroups of the population including individuals with diabetes and those with different educational levels. Four models have been externally validated. Three studies considered time or cost implications of computing the model. Interpretation There is no one model that is recommended for dementia risk prediction in population-based settings. Further, it is unlikely that one model will fit all. Consideration of the optimal features of new models should focus on methodology (setting/sample, model development and testing in a replication cohort) and the acceptability and cost of attaining the risk variables included in the prediction score. Further work is required to validate existing models or develop new ones in different populations, as well as to determine the ethical implications of dementia risk prediction.
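    The c-statistic reported by these models is simply the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case, with ties counted as one half. A minimal sketch, with hypothetical scores and outcomes:

```python
def c_statistic(scores, outcomes):
    """Concordance probability (equivalent to the ROC AUC): the chance that
    a randomly chosen case (outcome 1) scores higher than a randomly chosen
    non-case (outcome 0); tied scores count as 0.5."""
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    noncases = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = [1.0 if c > n else 0.5 if c == n else 0.0
             for c in cases for n in noncases]
    return sum(pairs) / len(pairs)

# Hypothetical risk scores and dementia outcomes (1 = developed dementia).
auc = c_statistic([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0])  # 3 of 4 pairs concordant
```

    Here three of the four case/non-case pairs are concordant, giving a c-statistic of 0.75; a value of 0.5 would indicate discrimination no better than chance.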

  5. A Systematic Approach for Developing Conceptual Models of Contaminant Transport at the Hanford Site

    NASA Astrophysics Data System (ADS)

    Murray, C. J.; Last, G. V.; Rohay, V. J.; Schelling, F. J.; Hildebrand, R. D.; Morse, J. G.

    2004-12-01

    The U.S. Department of Energy (DOE) faces many decisions regarding future remedial actions and waste disposal at the Hanford Site in southeast Washington State. To support these decisions, DOE recognized the need for a comprehensive and systematic approach to developing and documenting complete, consistent, and defensible conceptual models of contaminant release and migration. After reviewing existing conceptual model development methodologies that might be applicable to environmental assessments at the Hanford Site, DOE initiated efforts to adapt and implement the Features, Events, and Processes (FEP) methodology developed for use in performance assessments of nuclear waste disposal systems by NIREX. In adapting this methodology for use in the environmental assessments at Hanford, the international list of FEPs, compiled from nuclear waste disposal programs, was evaluated to develop a list of potentially relevant Hanford-specific FEPs. The international nuclear waste programs focus on deep geologic disposal while waste disposal at the Hanford Site involves burial in shallow unconsolidated geologic deposits. Thus, a graphical tool called the Process Relationship Diagram (PRD) was created to assist in identifying the international FEPs and additional factors that are relevant to Hanford, and to illustrate the relationships among these factors. The PRD is similar in form and function to the Master Directed Diagram used by NIREX to provide a visual and systematic structure for the FEP methodology. Adaptation of this approach is showing promise in facilitating the development of conceptual models and selection of relevant factors to be incorporated into environmental uncertainty assessments for the Hanford Site.

  6. FOCAL: an experimental design tool for systematizing metabolic discoveries and model development

    PubMed Central

    2012-01-01

    Current computational tools can generate and improve genome-scale models based on existing data; however, for many organisms, the data needed to test and refine such models are not available. To facilitate model development, we created the forced coupling algorithm, FOCAL, to identify genetic and environmental conditions such that a reaction becomes essential for an experimentally measurable phenotype. This reaction's conditional essentiality can then be tested experimentally to evaluate whether network connections occur or to create strains with desirable phenotypes. FOCAL allows network connections to be queried, which improves our understanding of metabolism and accuracy of developed models. PMID:23236964

  7. Development of prognostic models for patients with traumatic brain injury: a systematic review

    PubMed Central

    Gao, Jinxi; Zheng, Zhaocong

    2015-01-01

Outcome prediction following traumatic brain injury (TBI) is a widely investigated field of research. Several outcome prediction models have been developed for prognosis after TBI. There are two main prognostic models: the International Mission for Prognosis and Clinical Trials in Traumatic Brain Injury (IMPACT) prognosis calculator and the Corticosteroid Randomization after Significant Head Injury (CRASH) prognosis calculator. The prognostic models have three or four levels: (1) model A included age, motor GCS, and pupil reactivity; (2) model B included the predictors from model A plus CT characteristics; and (3) model C included the predictors from model B plus laboratory parameters. In consideration of the fact that interventions after admission, such as ICP management, also have prognostic value and may improve the models' performance, Yuan F et al. developed another prediction model (model D) which includes ICP. With the development of molecular biology, a number of brain injury biomarkers have been reported that may improve the predictive power of prognostic models, including neuron-specific enolase (NSE), glial fibrillary acid protein (GFAP), S-100β protein, tumour necrosis factor-alpha (TNF-α), interleukin-6 (IL-6), myelin basic protein (MBP), cleaved tau protein (C-tau), spectrin breakdown products (SBDPs), ubiquitin C-terminal hydrolase-L1 (UCH-L1), and sex hormones. A total of 40 manuscripts reporting 11 biomarkers were identified in the literature. Many substances have been implicated as potential biomarkers for TBI; however, no single biomarker has shown the necessary sensitivity and specificity for predicting outcome. The limited number of publications in this field underscores the need for further investigation. Through fluid biomarker analysis, the advent of multi-analyte profiling technology has enabled substantial advances in the diagnosis and treatment of a variety of conditions. Application of this technology to create a bio

  8. Modeling Agrilus planipennis within-tree colonization patterns and development of a systematic subsampling plan

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Emerald ash borer, Agrilus planipennis Fairmaire, an insect native to central Asia, was first detected in southeast Michigan in 2002, and has since killed millions of ash trees, Fraxinus spp., throughout eastern North America. Here, we use generalized linear mixed models to predict the presence or a...

  9. Systematic development of technical textiles

    NASA Astrophysics Data System (ADS)

    Beer, M.; Schrank, V.; Gloy, Y.-S.; Gries, T.

    2016-07-01

Technical textiles are used in various fields of application, ranging from small-scale (e.g. medical) to large-scale products (e.g. aerospace). The development of new products is often complex and time consuming due to multiple interacting parameters, which relate to the production process as well as to the textile structure and the material used. Many iteration steps are needed to adjust the process parameters and finalize the new fabric structure. A design method is developed to support the systematic development of technical textiles and to reduce the number of iteration steps. The design method is subdivided into six steps, starting with the identification of the requirements. The fabric characteristics vary depending on the field of application. Where possible, benchmarks are tested. A suitable fabric production technology then needs to be selected. The aim of the method is to support a development team during technology selection without restricting the textile developer. After a suitable technology is selected, the transformation and correlation between input and output parameters follows. This generates the information for the production of the structure. Afterwards, the first prototype can be produced and tested, and the resulting characteristics are compared with the initial product requirements.

  10. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews

    PubMed Central

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

Background Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to ‘think’ conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. Methods and Findings In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, which were almost universally used solely as pictorial depictions of how the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Conclusions Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles for the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions. PMID:26575182

  11. A systematic review of the existing models of disordered eating: Do they inform the development of effective interventions?

    PubMed

    Pennesi, Jamie-Lee; Wade, Tracey D

    2016-02-01

Despite significant advances in the development of prevention and treatment interventions for eating disorders and disordered eating over the last decade, there remains a pressing need to develop more effective interventions. In line with the 2008 Medical Research Council (MRC) evaluation framework from the United Kingdom for the development and evaluation of complex interventions to improve health, the development of sound theory is a necessary precursor to the development of effective interventions. The aim of the current review was to identify the existing models of disordered eating and to identify those models which have helped inform the development of interventions for disordered eating. In addition, we examine the variables that most commonly appear across these models, in terms of future implications for the development of interventions for disordered eating. While an extensive range of theoretical models of the development of disordered eating was identified (N=54), only ten (18.5%) had progressed beyond mere description to the development of interventions that have been evaluated. It is recommended that future work examines whether interventions for eating disorders increase in efficacy when developed in line with theoretical considerations, that initiation of new models gives way to further development of existing models, and that there be greater utilisation of intervention studies to inform the development of theory. PMID:26781985

  12. Towards Systematic Benchmarking of Climate Model Performance

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2014-12-01

The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research.
Making the results from routine

  13. A Comprehensive and Systematic Approach to Developing and Documenting Conceptual Models of Contaminant Release and Migration at the Hanford Site

    SciTech Connect

    Last, George V.; Rohay, Virginia J.; Schelling, F J.; Bunn, Amoret L.; Delamare, Michael A.; Dirkes, Roger L.; Hildebrand, R D.; Morse, John G.; Napier, Bruce A.; Riley, Robert G.; Soler, Luis; Thorne, Paul D.

    2004-04-01

The U.S. Department of Energy's Richland Operations Office has initiated efforts to adapt and implement the Features, Events, and Processes (FEPs) methodology (used in scenario development for nuclear waste disposal programs) for the environmental management and remediation problems facing the Hanford Site. These efforts have shown that modifying the FEPs methodology to incorporate Process Relationship Diagrams (PRDs) is effective in facilitating the development of conceptual models and the selection of potentially relevant factors to be incorporated into a specific performance assessment. In developing this methodology for Hanford, a master PRD was created to provide an organizational structure for identifying the potentially relevant factors (i.e., FEPs) and for illustrating the relationships between these factors. This organizational framework was developed to match the organization of current Hanford site-wide performance assessment activities and to facilitate screening of the FEPs relevant to the problems (and conceptual models) that need to be addressed at the site. However, the link between Hanford-specific FEPs and the international list of FEPs was maintained to demonstrate completeness and perhaps to expand the usefulness of the international list for other environmental programs.

  14. Systematic errors in temperature estimates from MODIS data covering the western Palearctic and their impact on a parasite development model.

    PubMed

    Alonso-Carné, Jorge; García-Martín, Alberto; Estrada-Peña, Agustin

    2013-11-01

The modelling of habitat suitability for parasites is a growing area of research due to its association with climate change and ensuing shifts in the distribution of infectious diseases. Such models depend on remote sensing data and require accurate, high-resolution temperature measurements. Temperature is critical for accurate estimation of development rates and potential habitat ranges for a given parasite. The MODIS sensors aboard the Aqua and Terra satellites provide high-resolution temperature data for remote sensing applications. This paper describes a comparative analysis of MODIS-derived temperatures relative to ground records of surface temperature in the western Palaearctic. The results show that MODIS overestimated maximum temperatures and underestimated minimum temperatures by up to 5-6 °C. The combined use of the Aqua and Terra datasets provided the most accurate temperature estimates around latitude 35-44° N, with an overestimation during the spring-summer months and an underestimation in autumn-winter. Errors in temperature estimation were associated with specific ecological regions within the target area as well as with technical limitations in the temporal and orbital coverage of the satellites (e.g. sensor limitations and satellite transit times). We estimated the propagation of temperature uncertainties into parasite habitat suitability models by comparing the outcomes of published models; depending on the model used, estimated errors reached 36% of the respective annual measurements. Our analysis demonstrates the importance of adequate image processing and points out the limitations of MODIS temperature data as inputs to predictive models of parasite life cycles. PMID:24258878
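    The headline comparison in this abstract reduces to a mean signed difference between satellite-derived and station-measured temperatures. A minimal sketch with invented numbers (not actual MODIS or station records):

```python
def mean_bias(estimates, ground_truth):
    """Mean signed difference (satellite minus ground), in degrees C.
    Positive values indicate overestimation, negative underestimation."""
    return sum(e - g for e, g in zip(estimates, ground_truth)) / len(ground_truth)

# Hypothetical daily values (degrees C) for one station.
modis_tmax  = [30.0, 32.0, 28.0]   # satellite-derived daily maxima
ground_tmax = [26.0, 27.0, 25.0]   # station-measured daily maxima
modis_tmin  = [ 4.0,  5.0,  3.0]   # satellite-derived daily minima
ground_tmin = [ 8.0,  9.0,  8.5]   # station-measured daily minima

bias_max = mean_bias(modis_tmax, ground_tmax)   # positive: Tmax overestimated
bias_min = mean_bias(modis_tmin, ground_tmin)   # negative: Tmin underestimated
```

    With these made-up series the maxima are overestimated by 4.0 °C on average and the minima underestimated by 4.5 °C, mirroring the sign pattern (though not the magnitudes) reported in the study.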

  15. Developing a Systematic Patent Search Training Program

    ERIC Educational Resources Information Center

    Zhang, Li

    2009-01-01

    This study aims to develop a systematic patent training program using patent analysis and citation analysis techniques applied to patents held by the University of Saskatchewan. The results indicate that the target audience will be researchers in life sciences, and aggregated patent database searching and advanced search techniques should be…

  16. Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity

    PubMed Central

    Hunt, C Anthony; Kennedy, Ryan C; Kim, Sean H J; Ropella, Glen E P

    2013-01-01

    A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues declining as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer various suggestions for both the expansion and broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to access and open the door, and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential, key roles for agent-based and other methods. We assemble a list of requirements necessary for M&S to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that to realize its full potential, M&S should be actualized within a larger information technology framework—a dynamic knowledge repository—wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline. © 2013 Wiley Periodicals, Inc. PMID:23737142

  17. Systematic Error Modeling and Bias Estimation

    PubMed Central

    Zhang, Feihu; Knoll, Alois

    2016-01-01

This paper analyzes the statistical properties of the systematic error in terms of range and bearing during the transformation process. Furthermore, we rely on a weighted nonlinear least squares method to calculate the biases based on the proposed models. The results show the high performance of the proposed approach for error modeling and bias estimation. PMID:27213386

  18. Systematic Error Modeling and Bias Estimation.

    PubMed

    Zhang, Feihu; Knoll, Alois

    2016-01-01

This paper analyzes the statistical properties of the systematic error in terms of range and bearing during the transformation process. Furthermore, we rely on a weighted nonlinear least squares method to calculate the biases based on the proposed models. The results show the high performance of the proposed approach for error modeling and bias estimation. PMID:27213386
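    As a rough sketch of the bias-estimation idea, one can pose a known landmark observed with biased range/bearing measurements and minimize a weighted sum of squared Cartesian residuals over the two bias parameters. The landmark position, biases, perturbations and weights below are all hypothetical, and a coarse grid search stands in for the paper's weighted nonlinear least-squares solver:

```python
import math

# Known landmark position (sensor at the origin); metres and radians.
PX, PY = 100.0, 50.0
TRUE_R = math.hypot(PX, PY)
TRUE_A = math.atan2(PY, PX)

# Hypothetical measurements: truth plus a fixed bias (b_r = 2.0 m,
# b_a = 0.05 rad) and small deterministic perturbations.
ranges   = [TRUE_R + 2.0 + d for d in (0.1, -0.1, 0.0)]
bearings = [TRUE_A + 0.05 + d for d in (0.002, -0.002, 0.0)]
weights  = [1.0, 1.0, 2.0]   # e.g. inverse measurement variances

def cost(b_r, b_a):
    """Weighted sum of squared Cartesian residuals after bias correction.
    Nonlinear in (b_r, b_a) through the polar-to-Cartesian transform."""
    total = 0.0
    for r, a, w in zip(ranges, bearings, weights):
        x = (r - b_r) * math.cos(a - b_a)
        y = (r - b_r) * math.sin(a - b_a)
        total += w * ((x - PX) ** 2 + (y - PY) ** 2)
    return total

# Coarse grid search over the two bias parameters (a sketch, not the
# authors' solver): b_r in [-5, 5] m, b_a in [-0.1, 0.1] rad.
candidates = [(-5.0 + 0.1 * i, -0.1 + 0.001 * j)
              for i in range(101) for j in range(201)]
br_hat, ba_hat = min(candidates, key=lambda p: cost(*p))
```

    With these synthetic data the recovered biases land on the grid points nearest the true values of 2.0 m and 0.05 rad; a Gauss-Newton or Levenberg-Marquardt iteration would refine them further.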

  19. Development of a Systematic Stakeholder Identification System for 3VS Modeling in the Snohomish Basin, Washington, USA

    EPA Science Inventory

    In the Environmental Protection Agency’s Triple Value Simulation (3VS) models, social, economic and environmental indicators are utilized to understand the interrelated impacts of programs and regulations on ecosystems and human communities. Critical to identifying the app...

  20. Career Exploration Program: A Composite Systematic Functional Objective Model.

    ERIC Educational Resources Information Center

    Mohamed, Othman

    The composite systematic functional objective career exploration program model integrates various career development theoretical approaches. These approaches emphasize self-concept, life values, personality, the environment, and academic achievement and training as separate functions in explaining career development. Current social development in…

  1. Interruptions in the wild: Development of a sociotechnical systems model of interruptions in the emergency department through a systematic review.

    PubMed

    Werner, Nicole E; Holden, Richard J

    2015-11-01

Interruptions are unavoidable in the "interrupt driven" Emergency Department (ED). A critical review and synthesis of the literature on interruptions in the ED can offer insight into the nature of interruptions in complex real-world environments. Fifteen empirical articles on interruptions in the ED were identified through database searches. Articles were reviewed, critiqued, and synthesized. There was little agreement and several gaps in conceptualizing sociotechnical system factors, process characteristics, and interruption outcomes. While multiple outcomes of interruptions were mentioned, few were measured, and the relationship between multiple outcomes was rarely assessed. Synthesizing the literature and drawing on ergonomic concepts, we present a sociotechnical model of interruptions in complex settings that motivates new directions in research and design. The model conceptualizes interruptions as a process, not a single event, that occurs within and is shaped by an interacting sociotechnical system and that results in a variety of interrelated outcomes. PMID:26154223

  2. Pharmacokinetic models of morphine and its metabolites in neonates: Systematic comparisons of models from the literature, and development of a new meta-model.

    PubMed

    Knøsgaard, Katrine Rørbæk; Foster, David John Richard; Kreilgaard, Mads; Sverrisdóttir, Eva; Upton, Richard Neil; van den Anker, Johannes N

    2016-09-20

Morphine is commonly used for pain management in preterm neonates. The aims of this study were to compare published models of neonatal pharmacokinetics of morphine and its metabolites with a new dataset, and to combine the characteristics of the best predictive models to design a meta-model for morphine and its metabolites in preterm neonates. Moreover, the concentration-analgesia relationship for morphine in this clinical setting was investigated. A population of 30 preterm neonates (gestational age: 23-32 weeks) received a loading dose of morphine (50-100 μg/kg), followed by a continuous infusion (5-10 μg/kg/h) until analgesia was no longer required. Pain was assessed using the Premature Infant Pain Profile. Five published population models were compared using numerical and graphical tests of goodness-of-fit and predictive performance. Population modelling was conducted using NONMEM® and the $PRIOR subroutine to describe the time-course of plasma concentrations of morphine, morphine-3-glucuronide, and morphine-6-glucuronide, and the concentration-analgesia relationship for morphine. No published model adequately described morphine concentrations in this new dataset. Previously published population pharmacokinetic models of morphine, morphine-3-glucuronide, and morphine-6-glucuronide were combined into a meta-model. The meta-model provided an adequate description of the time-course of morphine and the concentrations of its metabolites in preterm neonates. Allometric weight scaling was applied to all clearance and volume terms. Maturation of morphine clearance was described as a function of postmenstrual age, while maturation of metabolite elimination was described as a function of postnatal age. A clear relationship between morphine concentrations and pain score was not established. PMID:27373670
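    The scaling conventions named in the abstract (allometric weight scaling of clearance to a 70 kg standard with a 0.75 exponent, plus a sigmoid maturation function of postmenstrual age) are standard population-pharmacokinetic constructions and can be sketched directly. The parameter values below (cl_std, tm50, hill) are hypothetical placeholders, not the meta-model's estimates:

```python
def morphine_clearance(cl_std, weight_kg, pma_weeks, tm50=58.0, hill=3.0):
    """Typical-individual clearance: allometric size scaling relative to a
    70 kg standard adult (exponent 0.75), multiplied by a sigmoid (Hill)
    maturation function of postmenstrual age (PMA). cl_std is the standard
    adult clearance; tm50 is the PMA at half-mature clearance."""
    size = (weight_kg / 70.0) ** 0.75
    maturation = pma_weeks ** hill / (pma_weeks ** hill + tm50 ** hill)
    return cl_std * size * maturation

# A 1.2 kg neonate at 28 weeks PMA vs. the same weight at 36 weeks PMA:
cl_28 = morphine_clearance(80.0, 1.2, 28.0)
cl_36 = morphine_clearance(80.0, 1.2, 36.0)
```

    Under this construction clearance rises with postmenstrual age at fixed weight, which is the qualitative behaviour the maturation function is meant to capture.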

  3. Systematic Development of Intelligent Systems for Public Road Transport.

    PubMed

    García, Carmelo R; Quesada-Arencibia, Alexis; Cristóbal, Teresa; Padrón, Gabino; Alayón, Francisco

    2016-01-01

    This paper presents an architecture model for the development of intelligent systems for public passenger transport by road. The main objective of our proposal is to provide a framework for the systematic development and deployment of telematics systems to improve various aspects of this type of transport, such as efficiency, accessibility and safety. The architecture model presented herein is based on international standards on intelligent transport system architectures, ubiquitous computing and service-oriented architecture for distributed systems. To illustrate the utility of the model, we also present a use case of a monitoring system for stops on a public passenger road transport network. PMID:27438836

  5. Systematic Independent Validation of Inner Heliospheric Models

    NASA Technical Reports Server (NTRS)

    MacNeice, P. J.; Taktakishvili, Aleksandre

    2008-01-01

    This presentation is the first in a series which will provide independent validation of community models of the outer corona and inner heliosphere. In this work we establish a set of measures to be used in validating this group of models. We use these procedures to generate a comprehensive set of results from the Wang-Sheeley-Arge (WSA) model which will be used as a baseline, or reference, against which to compare all other models. We also run a test of the validation procedures by applying them to a small set of results produced by the ENLIL Magnetohydrodynamic (MHD) model. In future presentations we will validate other models currently hosted by the Community Coordinated Modeling Center (CCMC), including a comprehensive validation of the ENLIL model. The Wang-Sheeley-Arge (WSA) model is widely used to model the solar wind, and is used by a number of agencies to predict solar wind conditions at Earth as much as four days into the future. Because it is so important to both the research and space weather forecasting communities, it is essential that its performance be measured systematically and independently. In this paper we offer just such an independent and systematic validation. We report skill scores for the model's predictions of wind speed and IMF polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests line-of-sight magnetograms. For this study we generated model results for monthly magnetograms from the National Solar Observatory (SOLIS), Mount Wilson Observatory and the GONG network, spanning the Carrington rotation range from 1650 to 2068. We compare the influence of the different magnetogram sources, performance at quiet and active times, and estimate the effect of different empirical wind speed tunings. We also consider the ability of the WSA model to identify sharp transitions in wind speed from slow to fast wind. These results will serve as a baseline against which to compare future
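The abstract reports skill scores for wind-speed predictions without defining the metric; a common mean-squared-error definition, shown here purely as an assumption about what such a score looks like, is:

```python
def skill_score(predicted, observed, baseline):
    """MSE skill score: 1 - MSE(model)/MSE(baseline). A value of 1.0 is a
    perfect forecast, 0.0 matches the baseline (e.g. a persistence or
    climatology forecast), and negative values are worse than the baseline."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 - mse(predicted, observed) / mse(baseline, observed)
```

The choice of baseline matters as much as the model: scoring against climatology and against persistence can rank the same forecasts differently.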

  6. Systematic errors in strong lens modeling

    NASA Astrophysics Data System (ADS)

    Johnson, Traci Lin; Sharon, Keren; Bayliss, Matthew B.

    2015-08-01

    The lensing community has made great strides in quantifying the statistical errors associated with strong lens modeling. However, we are just now beginning to understand the systematic errors. Quantifying these errors is pertinent to Frontier Fields science, as number counts and luminosity functions are highly sensitive to the value of the magnifications of background sources across the entire field of view. We are aware that models can be very different when modelers change their assumptions about the parameterization of the lensing potential (i.e., parametric vs. non-parametric models). However, models built while utilizing a single methodology can lead to inconsistent outcomes for different quantities, distributions, and qualities of redshift information regarding the multiple images used as constraints in the lens model. We investigate how varying the number of multiple image constraints and available redshift information of those constraints (e.g., spectroscopic vs. photometric vs. no redshift) can influence the outputs of our parametric strong lens models, specifically, the mass distribution and magnifications of background sources. We make use of the simulated clusters by M. Meneghetti et al. and the first two Frontier Fields clusters, which have a high number of multiply imaged galaxies with spectroscopically-measured redshifts (or input redshifts, in the case of simulated clusters). This work will inform not only Frontier Fields science, but also work on the growing collection of strong lensing galaxy clusters, most of which are less massive, are capable of lensing a handful of galaxies, and are more prone to these systematic errors.

  7. Antenna pointing systematic error model derivations

    NASA Technical Reports Server (NTRS)

    Guiar, C. N.; Lansing, F. L.; Riggs, R.

    1987-01-01

    The pointing model used to represent and correct systematic errors for the Deep Space Network (DSN) antennas is presented. Analytical expressions are given in both azimuth-elevation (az-el) and hour angle-declination (ha-dec) mounts for RF axis collimation error, encoder offset, nonorthogonality of axes, axis plane tilt, and structural flexure due to gravity loading. While the residual pointing errors (rms) after correction appear to be within ten percent of the half-power beamwidth criterion commonly set for good pointing accuracy, the DSN has embarked on an extensive pointing improvement and modeling program aiming toward an order of magnitude higher pointing precision.
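Error terms of the kind listed above combine linearly in a classic az-el pointing model. The sketch below uses TPOINT-style coefficient names as an illustrative convention (sign conventions and the exact term set vary between implementations, and this is not the DSN's specific formulation):

```python
import math

def pointing_errors(az_deg, el_deg, c):
    """Sketch of a classic az-el systematic pointing model. Coefficients in
    `c`: IA/IE encoder index offsets, CA collimation, NPAE axis
    non-orthogonality, AN/AW azimuth-axis tilt (north/west), TF gravitational
    flexure. Returns (d_az, d_el) in the same angular units as the
    coefficients; signs here are one common convention, not universal."""
    az, el = math.radians(az_deg), math.radians(el_deg)
    d_az = (c["IA"]
            + c["CA"] / math.cos(el)                  # collimation
            + c["NPAE"] * math.tan(el)                # axis non-orthogonality
            + c["AN"] * math.sin(az) * math.tan(el)   # tilt, north component
            + c["AW"] * math.cos(az) * math.tan(el))  # tilt, west component
    d_el = (c["IE"]
            + c["TF"] * math.cos(el)                  # gravity flexure
            + c["AN"] * math.cos(az)
            - c["AW"] * math.sin(az))
    return d_az, d_el
```

Because the model is linear in the coefficients, the coefficients themselves are typically obtained by least-squares fitting to pointing offsets measured on a grid of calibration sources.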

  8. Using data assimilation for systematic model improvement

    NASA Astrophysics Data System (ADS)

    Lang, Matthew S.; van Leeuwen, Peter Jan; Browne, Phil

    2016-04-01

    In Numerical Weather Prediction parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model as parameter values used in these parameterisations cannot be measured directly and hence are often not well known, and the parameterisations themselves are approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation, such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential data assimilation methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to predetermined functional forms of missing physics or parameterisations that are based upon prior information. The method picks out the functional form, or the combination of functional forms, that best fits the error structure. The prior information typically takes the form of expert knowledge. We applied the method to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. It is also demonstrated that state augmentation is not successful. The results indicate that this new method is a powerful tool in systematic model improvement.
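The fitting step, regressing per-point model-error estimates onto candidate functional forms and keeping the best fit, can be sketched as follows. The one-parameter candidates here are hypothetical; in the paper the functional forms come from expert knowledge about the missing physics.

```python
def fit_error_structure(x, errors, candidates):
    """Fit each candidate form by one-parameter least squares, e ≈ a*f(x),
    and return (name, coefficient, residual sum of squares) of the best
    candidate. A sketch of the selection idea, not the authors' full scheme."""
    best = None
    for name, f in candidates.items():
        fx = [f(xi) for xi in x]
        denom = sum(v * v for v in fx)
        a = sum(e * v for e, v in zip(errors, fx)) / denom if denom else 0.0
        rss = sum((e - a * v) ** 2 for e, v in zip(errors, fx))
        if best is None or rss < best[2]:
            best = (name, a, rss)
    return best
```

With errors generated by one of the candidate forms, the fit recovers both the form and its amplitude; in practice the data-assimilation increments are noisy, so the residuals also yield the "consistent error estimates" the abstract mentions.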

  9. Improved Systematic Pointing Error Model for the DSN Antennas

    NASA Technical Reports Server (NTRS)

    Rochblatt, David J.; Withington, Philip M.; Richter, Paul H.

    2011-01-01

    New pointing models have been developed for large reflector antennas whose construction is founded on an elevation-over-azimuth mount. At JPL, the new models were applied to the Deep Space Network (DSN) 34-meter antenna subnet for corrections of their systematic pointing errors; they achieved significant improvement in performance at Ka-band (32 GHz) and X-band (8.4 GHz). The new models provide pointing improvements relative to the traditional models by a factor of two to three, which translates to approximately 3-dB performance improvement at Ka-band. For radio science experiments where blind pointing performance is critical, the new innovation provides a new enabling technology. The model extends the traditional physical models with higher-order mathematical terms, thereby increasing the resolution of the model for a better fit to the underlying systematic imperfections that are the cause of antenna pointing errors. The philosophy of the traditional model was that all mathematical terms in the model must be traced to a physical phenomenon causing antenna pointing errors. The traditional physical terms are: antenna axis tilts, gravitational flexure, azimuth collimation, azimuth encoder fixed offset, azimuth and elevation skew, elevation encoder fixed offset, residual refraction, azimuth encoder scale error, and antenna pointing de-rotation terms for beam waveguide (BWG) antennas. Besides the addition of spherical harmonics terms, the new models differ from the traditional ones in that the coefficients for the cross-elevation and elevation corrections are completely independent and may be different, while in the traditional model, some of the terms are identical. In addition, the new software allows for all-sky or mission-specific model development, and can utilize the previously used model as an a priori estimate for the development of the updated models.

  10. Systematic comparison of model polymer nanocomposite mechanics.

    PubMed

    Xiao, Senbo; Peter, Christine; Kremer, Kurt

    2016-01-01

    Polymer nanocomposites render a range of outstanding materials from natural products such as silk, sea shells and bones, to synthesized nanoclay or carbon nanotube reinforced polymer systems. In contrast to the fast expanding interest in this type of material, the fundamental mechanisms of their mixing, phase behavior and reinforcement, especially for higher nanoparticle content as relevant for bio-inorganic composites, are still not fully understood. Although polymer nanocomposites exhibit diverse morphologies, qualitatively their mechanical properties are believed to be governed by a few parameters, namely their internal polymer network topology, nanoparticle volume fraction, particle surface properties and so on. Relating material mechanics to such elementary parameters is the purpose of this work. By taking a coarse-grained molecular modeling approach, we study a range of different polymer nanocomposites. We vary polymer nanoparticle connectivity, surface geometry and volume fraction to systematically study rheological/mechanical properties. Our models cover different materials, and reproduce key characteristics of real nanocomposites, such as phase separation and mechanical reinforcement. The results shed light on establishing elementary structure-property-function relationships of polymer nanocomposites. PMID:27623170

  11. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystems. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, one satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data are available.
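ID3, the rule-induction half of the ANOVA/ID3 method above, selects splitting attributes by information gain over the training cases. A minimal sketch of that criterion (generic toy data, not the NSM Expert's actual attributes):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    """ID3's splitting criterion: reduction in label entropy after
    partitioning the rows by the value of one attribute."""
    n = len(rows)
    parts = {}
    for r, y in zip(rows, labels):
        parts.setdefault(r[attr_index], []).append(y)
    remainder = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - remainder
```

ID3 builds the tree greedily: at each node it splits on the attribute with the highest gain and recurses until the leaves are pure or no attributes remain.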

  12. Systematic review and validation of prognostic models in liver transplantation.

    PubMed

    Jacob, Matthew; Lewsey, James D; Sharpin, Carlos; Gimson, Alexander; Rela, Mohammed; van der Meulen, Jan H P

    2005-07-01

    A model that can accurately predict post-liver transplant mortality would be useful for clinical decision making, would help to provide patients with prognostic information, and would facilitate fair comparisons of surgical performance between transplant units. A systematic review of the literature was carried out to assess the quality of the studies that developed and validated prognostic models for mortality after liver transplantation and to validate existing models in a large data set of patients transplanted in the United Kingdom (UK) and Ireland between March 1994 and September 2003. Five prognostic model papers were identified. The quality of the development and validation of all prognostic models was suboptimal according to an explicit assessment tool of the internal, external, and statistical validity, model evaluation, and practicality. The discriminatory ability of the identified models in the UK and Ireland data set was poor (area under the receiver operating characteristic curve always smaller than 0.7 for adult populations). Due to the poor quality of the reporting, the methodology used for the development of the model could not always be determined. In conclusion, these findings demonstrate that currently available prognostic models of mortality after liver transplantation can have only a limited role in clinical practice, audit, and research. PMID:15973726
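Discrimination in the review above is judged by the area under the ROC curve (AUC below 0.7 being poor). The rank (Mann-Whitney) formulation of AUC can be sketched as follows; the scores and labels are hypothetical model outputs and observed outcomes, not data from the review.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank formulation: the probability
    that a randomly chosen positive case is scored higher than a randomly
    chosen negative case, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 means the model discriminates no better than chance, which is why values below 0.7 in an external validation set are read as poor discriminatory ability.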

  13. Microenterprise Development Interventions for Sexual Risk Reduction: A Systematic Review

    PubMed Central

    Lee, Ramon; Thirumurthy, Harsha; Muessig, Kathryn E.; Tucker, Joseph D.

    2013-01-01

    Comprehensive interventions that address both individual and structural determinants associated with HIV/STI risk are gaining increasing attention over the past decade. Microenterprise development offers an appealing model for HIV prevention by addressing poverty and gender equality. This study systematically reviewed the effects of microenterprise development interventions on HIV/STI incidence and sexual risk behaviors. Microenterprise development was defined as developing small business capacity among individuals to alleviate poverty. Seven eligible research studies representing five interventions were identified and included in this review. All of the studies targeted women, and three focused on sex workers. None measured biomarker outcomes. All three sex worker studies showed significant reduction in sexual risk behaviors when compared to the control group. Non-sex worker studies showed limited changes in sexual risk behavior. This review indicates the potential utility of microenterprise development in HIV risk reduction programs. More research is needed to determine how microenterprise development can be effectively incorporated in comprehensive HIV control strategies. PMID:23963497

  15. A Systematic Approach to Leadership Development.

    ERIC Educational Resources Information Center

    Boyce, V. Milton

    The 4-H program is dependent upon adult volunteer leaders to carry out its work. During the decade of the 1970s, this program hopes to double its educational effort. In order to do this, the number of volunteer leaders will have to double also. To accomplish this, a leadership development process to be used in helping 4-H agents effectively…

  16. Systematic approach for modeling tetrachloroethene biodegradation

    SciTech Connect

    Bagley, D.M.

    1998-11-01

    The anaerobic biodegradation of tetrachloroethene (PCE) is a reasonably well understood process. Specific organisms capable of using PCE as an electron acceptor for growth require the addition of an electron donor to remove PCE from contaminated ground waters. However, competition from other anaerobic microorganisms for added electron donor will influence the rate and completeness of PCE degradation. The approach developed here allows for the explicit modeling of PCE and byproduct biodegradation as a function of electron donor and byproduct concentrations, and the microbiological ecology of the system. The approach is general and can be easily modified for ready use with in situ ground-water models or ex situ reactor models. Simulations conducted with models developed from this approach show the sensitivity of PCE biodegradation to input parameter values, in particular initial biomass concentrations. Additionally, the dechlorination rate will be strongly influenced by the microbial ecology of the system. Finally, comparison with experimental acclimation results indicates that existing kinetic constants may not be generally applicable. Better techniques for measuring the biomass of specific organisms groups in mixed systems are required.
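The sensitivity to initial biomass noted above can be illustrated with a minimal donor-limited Monod model integrated by forward Euler. All rate constants here are illustrative placeholders, not the paper's fitted values, and the single-population sketch omits the competition between organism groups that the full approach models explicitly.

```python
def simulate_dechlorination(s0, x0, params, dt=0.01, t_end=5.0):
    """Monod-type sketch: dS/dt = -k*X*S/(Ks+S), dX/dt = Y*k*X*S/(Ks+S) - b*X,
    where S is substrate (e.g. PCE) and X is dechlorinator biomass.
    Returns (S, X) at t_end; parameter values are illustrative."""
    k, Ks, Y, b = params["k"], params["Ks"], params["Y"], params["b"]
    s, x, t = s0, x0, 0.0
    while t < t_end:
        rate = k * x * s / (Ks + s)          # Monod uptake rate
        s = max(s - rate * dt, 0.0)
        x = max(x + (Y * rate - b * x) * dt, 0.0)
        t += dt
    return s, x
```

Running the model with two different initial biomass values shows the effect the abstract highlights: a larger inoculum leaves less residual substrate over the same time window.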

  17. Developing medical professionalism in future doctors: a systematic review

    PubMed Central

    Doug, Manjo; Peile, Ed; Thistlethwaite, Jill; Johnson, Neil

    2010-01-01

    Objectives: There are currently no guidelines on the most effective ways of supporting medical students to develop high standards of medical professionalism. The aim of this review is to summarise the evidence currently available on methods used by medical schools to promote medical professionalism. Methods: We performed a systematic search of electronic databases (Medline, PsychInfo, British Education Index, Educational Resources Information Centre, Sociological Abstracts and Topics in Medical Education) from January 1998 to October 2008. Outcomes studied were methods used to support and promote the development of professionalism in medical students. Results: We identified 134 papers and five main themes for supporting the development of professionalism in medical students: curriculum design, student selection, teaching and learning methods, role modelling and assessment methods. However, the level of empirical evidence supporting each of these methods is limited. Conclusions: Identification of these five areas helps medical schools to focus the emphasis of their approaches to developing professionalism and identifies future research areas. This review offers a preliminary guide to future discovery and progress in the area of medical professionalism.

  18. Systematic, Systemic and Motivating: The K-12 Career Development Process

    ERIC Educational Resources Information Center

    Snyder, Deborah; Jackson, Sherry

    2006-01-01

    In Butler County, Ohio, Butler Technology and Career Development Schools (Butler Tech) firmly believes that systematic delivery of career development theory and practice integrated with academic content standards will enable students to do all of the above. Because of this, Butler Tech's Career Initiatives division delivers a countywide career…

  19. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15™), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15™ system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. PMID:26317495

  20. SYSTEMATIC SENSITIVITY ANALYSIS OF AIR QUALITY SIMULATION MODELS

    EPA Science Inventory

    This report reviews and assesses systematic sensitivity and uncertainty analysis methods for applications to air quality simulation models. The discussion of the candidate methods presents their basic variables, mathematical foundations, user motivations and preferences, computer...

  1. Systematic effects in CALOR simulation code to model experimental configurations

    SciTech Connect

    Job, P.K.; Proudfoot, J. ); Handler, T. . Dept. of Physics and Astronomy); Gabriel, T.A. )

    1991-03-27

    The CALOR89 code system is being used to simulate test beam results and the design parameters of several calorimeter configurations. It has been benchmarked against the ZEUS, D0, and HELIOS data. This study identifies the systematic effects in CALOR simulation to model the experimental configurations. Five major systematic effects are identified. These are the choice of high energy nuclear collision model, material composition, scintillator saturation, shower integration time, and the shower containment. Quantitative estimates of these systematic effects are presented. 23 refs., 6 figs., 7 tabs.

  2. Systematic multiscale models for deep convection on mesoscales

    NASA Astrophysics Data System (ADS)

    Klein, Rupert; Majda, Andrew J.

    2006-11-01

    This paper builds on recent developments of a unified asymptotic approach to meteorological modeling [ZAMM, 80: 765-777, 2000; SIAM Proc. Appl. Math. 116: 227-289, 2004], which was used successfully in the development of systematic multiscale models for the tropics in Majda and Klein [J. Atmosph. Sci. 60: 393-408, 2003], Majda and Biello [PNAS, 101: 4736-4741, 2004], and Biello and Majda [J. Atmosph. Sci. 62: 1694-1720, 2005]. Here we account for typical bulk microphysics parameterizations of moist processes within this framework. The key steps are careful nondimensionalization of the bulk microphysics equations and the choice of appropriate distinguished limits for the various nondimensional small parameters that appear. We are then in a position to study scale interactions in the atmosphere involving moist physics. We demonstrate this by developing two systematic multiscale models that are motivated by our interest in mesoscale organized convection. The emphasis here is on multiple length scales but common time scales. The first of these models describes the short-time evolution of slender, deep convective hot towers with horizontal scale ~1 km interacting with the linearized momentum balance on length and time scales of (10 km / 3 min). We expect this model to describe how convective inhibition may be overcome near the surface, how the onset of deep convection triggers convective-scale gravity waves, and that it will also yield new insight into how such local convective events may conspire to create larger-scale strong storms. The second model addresses the next larger range of length and time scales (10 km, 100 km, and 20 min) and exhibits mathematical features that are strongly reminiscent of mesoscale organized convection. In both cases, the asymptotic analysis reveals how the stiffness of condensation/evaporation processes induces highly nonlinear dynamics. Besides providing new theoretical insights, the derived models may also serve as theoretical devices

  3. A Comprehensive and Systematic Model of User Evaluation of Web Search Engines: I. Theory and Background.

    ERIC Educational Resources Information Center

    Su, Louise T.

    2003-01-01

    Reports on a project that proposes and tests a comprehensive and systematic model of user evaluation of Web search engines. This article describes the model, including a set of criteria and measures and a method for implementation. A literature review portrays settings for developing the model and places applications of the model in contemporary…

  4. Development of two shortened systematic review formats for clinicians

    PubMed Central

    2013-01-01

    Background Systematic reviews provide evidence for clinical questions, however the literature suggests they are not used regularly by physicians for decision-making. A shortened systematic review format is proposed as one possible solution to address barriers, such as lack of time, experienced by busy clinicians. The purpose of this paper is to describe the development process of two shortened formats for a systematic review intended for use by primary care physicians as an information tool for clinical decision-making. Methods We developed prototypes for two formats (case-based and evidence-expertise) that represent a summary of a full-length systematic review before seeking input from end-users. The process was composed of the following four phases: 1) selection of a systematic review and creation of initial prototypes that represent a shortened version of the systematic review; 2) a mapping exercise to identify obstacles described by clinicians in using clinical evidence in decision-making; 3) a heuristic evaluation (a usability inspection method); and 4) a review of the clinical content in the prototypes. Results After the initial prototypes were created (Phase 1), the mapping exercise (Phase 2) identified components that prompted modifications. Similarly, the heuristic evaluation and the clinical content review (Phase 3 and Phase 4) uncovered necessary changes. Revisions were made to the prototypes based on the results. Conclusions Documentation of the processes for developing products or tools provides essential information about how they are tailored for the intended user. One step has been described that we hope will increase usability and uptake of these documents to end-users. PMID:23767771

  5. Systematic Characterization of Cyclogenesis in High Resolution Climate Model Simulations

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Rao, P.; Kashinath, K.; Prabhat, M.; O'Brien, T. A.

    2015-12-01

    In this study we develop a systematic methodology to analyze cyclogenesis in high resolution climate model simulations. The motivation for this study is to understand how cyclones develop in simulations with the objective of improving the theoretical foundations of cyclogenesis. We use the toolkit for extreme climate analysis (TECA) [Prabhat et al., ICCS 2012] to detect and track tropical cyclones (TCs) in recent high resolution simulations (25 km) of current day and climate change scenarios [Wehner et al., J Climate 2015], as well as reanalyses. We systematically adjust the tracking criteria to identify developing and non-developing TCs. The detection and tracking criteria are based on (i) the local relative vorticity maximum being above a certain value, (ii) the colocation of the vorticity maximum, surface pressure minimum and warm core temperature maximum, (iii) the surface pressure gradient around the storm center being above a certain value, and (iv) the temperature gradient around the warm core center being above a certain value. To identify non-developing TCs, we systematically characterize the sensitivity of cyclone detection to these criteria using a principal component analysis on the criteria. First, we composite vorticity, pressure and temperature fields around the start of each cyclone's trajectory. Second, we find the covariance of pairs of thresholded variables, for example, vorticity and pressure gradient. Finally, we construct a cross-correlation matrix with these covariances and find the eigenvectors. The eigenvector corresponding to the largest eigenvalue describes the direction of maximum sensitivity. We simultaneously lower thresholds along the direction of maximum sensitivity, which results in an increase in the number of TC-like systems and trajectory lengths compared to the baseline case. We contrast the behavior of developing and non-developing TCs by constructing multivariate joint PDFs of various environmental conditions along their trajectories. We also compute
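The eigenvector step, finding the direction of maximum sensitivity from the covariance of the detection criteria, can be sketched with power iteration on a sample covariance matrix. This is a generic illustration of the PCA idea, not TECA's implementation.

```python
def max_sensitivity_direction(samples, iters=200):
    """Power iteration on the sample covariance of the detection criteria
    (rows = candidate events, columns = criteria values) to find the leading
    eigenvector: the direction along which the criteria vary most jointly,
    i.e. where threshold choices are most sensitive."""
    n, d = len(samples), len(samples[0])
    means = [sum(col) / n for col in zip(*samples)]
    cov = [[sum((r[i] - means[i]) * (r[j] - means[j]) for r in samples) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d                                    # initial guess
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Lowering all thresholds simultaneously along this direction, as the abstract describes, relaxes the criteria where they are most strongly correlated rather than one at a time.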

  6. Rationale for Systematic Vocabulary Development: Antidote for State Mandates

    ERIC Educational Resources Information Center

    Manzo, Anthony V.; Manzo, Ula C.; Thomas, Matthew M.

    2006-01-01

    The authors assert that vocabulary development is one of the most important things teachers can promote for students--cognitively, culturally, socially, and in preparation for standardized tests. A broad-based review of the literature reveals solid reasons for using systematic vocabulary instruction, which is especially helpful with youngsters…

  7. Developing a systematic approach to ranking residues of veterinary medicines.

    PubMed

    2015-12-12

    This is the last in an occasional series of articles produced for Veterinary Record by the Veterinary Residues Committee. It describes a matrix ranking system developed by the committee to provide a systematic approach to ranking residues of veterinary medicines, and some prohibited substances, based on the risk they pose to consumers. PMID:26667431

  8. MODEL DEVELOPMENT - DOSE MODELS

    EPA Science Inventory

    Model Development

    Humans are exposed to mixtures of chemicals from multiple pathways and routes. These exposures may result from a single event or may accumulate over time if multiple exposure events occur. The traditional approach of assessing risk from a single chemica...

  9. Systematic Reviews of Animal Models: Methodology versus Epistemology

    PubMed Central

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions. PMID:23372426

  10. A systematic review of strong gravitational lens modeling software

    NASA Astrophysics Data System (ADS)

    Lefor, Alan T.; Futamase, Toshifumi; Akhlaghi, Mohammad

    2013-07-01

    Despite expanding research activity in gravitational lens modeling, there is no particular software which is considered a standard. Much of the gravitational lens modeling software is written by individual investigators for their own use. Some gravitational lens modeling software is freely available for download but is widely variable with regard to ease of use and quality of documentation. This review of 13 software packages was undertaken to provide a single source of information. Gravitational lens models are classified as parametric models or non-parametric models, and can be further divided into research and educational software. Software used in research includes the GRAVLENS package (with both gravlens and lensmodel), Lenstool, LensPerfect, glafic, PixeLens, SimpLens, Lensview, and GRALE. In this review, GravLensHD, G-Lens, Gravitational Lensing, lens and MOWGLI are categorized as educational programs that are useful for demonstrating various aspects of lensing. Each of the 13 software packages is reviewed with regard to software features (installation, documentation, files provided, etc.) and lensing features (type of model, input data, output data, etc.) as well as a brief review of studies where they have been used. Recent studies have demonstrated the utility of strong gravitational lensing data for mass mapping, and suggest increased use of these techniques in the future. Coupled with the advent of greatly improved imaging, new approaches to modeling of strong gravitational lens systems are needed. This is the first systematic review of strong gravitational lens modeling software, providing investigators with a starting point for future software development to further advance gravitational lens modeling research. http://www.ephysics.org/mowgli/

  11. Systematic Task Allocation Evaluation in Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Münch, Jürgen; Lamersdorf, Ansgar

    Systematic task allocation to different development sites in global software development projects can open up business and engineering opportunities and help to reduce the risks and problems inherent in distributed development. Relying on a single evaluation criterion such as development cost when distributing tasks to development sites has been shown to be very risky and often does not lead to successful solutions in the long run. Task allocation in global software projects is challenging due to a multitude of impact factors and constraints. Systematic allocation decisions require the ability to evaluate and compare task allocation alternatives and to effectively establish customized task allocation practices in an organization. In this article, we present a customizable process for task allocation evaluation that is based on results from a systematic interview study with practitioners. In this process, the relevant criteria for evaluating task allocation alternatives are derived by applying principles from goal-oriented measurement. In addition, the customization of the process is demonstrated, related work and limitations are sketched, and an outlook on future work is given.
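
    The weighted, criteria-based comparison of allocation alternatives described above can be sketched as follows. The criteria, weights, and ratings below are invented for illustration and are not taken from the interview study:

```python
# Hypothetical multi-criteria evaluation of task allocation alternatives.
# Criteria, weights, and ratings are illustrative, not from the article.

def score_alternative(ratings, weights):
    """Weighted sum of per-criterion ratings (0-10 scale)."""
    return sum(weights[c] * ratings[c] for c in weights)

weights = {"cost": 0.4, "expertise": 0.3, "time_overlap": 0.2, "turnover_risk": 0.1}

alternatives = {
    "all_at_site_A": {"cost": 4, "expertise": 9, "time_overlap": 10, "turnover_risk": 6},
    "split_A_B": {"cost": 7, "expertise": 7, "time_overlap": 5, "turnover_risk": 7},
}

best = max(alternatives, key=lambda a: score_alternative(alternatives[a], weights))
print(best, round(score_alternative(alternatives[best], weights), 2))
```

    A single-criterion decision (e.g., cost alone) would pick the other alternative; the weighted evaluation makes the trade-off explicit.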

  12. Risk models and scores for type 2 diabetes: systematic review

    PubMed Central

    Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-01-01

    Objective To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Design Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria Papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Data sources Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data extraction Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. Results 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as “simple” or “easily implemented,” although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an
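
    Discrimination, one of the statistical properties tabulated in the review, is typically summarized by the c-statistic (area under the ROC curve). A minimal sketch of its computation, with invented risk scores:

```python
from itertools import product

def c_statistic(case_scores, control_scores):
    """Concordance: fraction of case/control pairs ranked correctly by the
    score (ties count one half); equivalent to the area under the ROC curve."""
    pairs = list(product(case_scores, control_scores))
    wins = sum(1.0 if s1 > s0 else 0.5 if s1 == s0 else 0.0 for s1, s0 in pairs)
    return wins / len(pairs)

# Invented risk scores for people who did (cases) and did not (controls)
# go on to develop type 2 diabetes.
cases = [0.8, 0.6, 0.55]
controls = [0.4, 0.5, 0.6]
print(round(c_statistic(cases, controls), 3))  # 0.5 = chance, 1.0 = perfect
```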

  13. Systematic approach to verification and validation: High explosive burn models

    SciTech Connect

    Menikoff, Ralph; Scovel, Christina A.

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code
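
    The automation idea, reading key experimental parameters from a standard meta-data header and generating a simulation input file, can be sketched as below. The header keys, the data-file layout, and the input-deck format are all hypothetical; the actual HED format is not specified here:

```python
import io

def parse_header(lines, marker="#"):
    """Collect 'key = value' pairs from comment lines at the top of a file."""
    meta = {}
    for line in lines:
        if not line.startswith(marker):
            break  # header ends where the numeric data begins
        key, sep, value = line.lstrip(marker).partition("=")
        if sep:
            meta[key.strip()] = value.strip()
    return meta

# A hypothetical HED-style data file: meta-data header, then gauge data.
sample = io.StringIO(
    "# explosive = PBX-9502\n"
    "# impact_velocity_km_s = 1.2\n"
    "0.0 0.0\n"
)
meta = parse_header(sample)
# Emit an (invented) input-deck fragment for a hydro code.
deck = f"material {meta['explosive']}\npiston_velocity {meta['impact_velocity_km_s']}\n"
print(deck)
```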

  14. PARAGON: A Systematic, Integrated Approach to Aerosol Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Diner, David J.; Kahn, Ralph A.; Braverman, Amy J.; Davies, Roger; Martonchik, John V.; Menzies, Robert T.; Ackerman, Thomas P.; Seinfeld, John H.; Anderson, Theodore L.; Charlson, Robert J.; Bosenberg, Jens; Collins, William D.; Rasch, Philip J.; Holben, Brent N.; Hostetler, Chris A.; Wielicki, Bruce A.; Miller, Mark A.; Schwartz, Stephen E.; Ogren, John A.; Penner, Joyce E.; Stephens, Graeme L.; Torres, Omar; Travis, Larry D.; Yu, Bin

    2004-01-01

    Aerosols are generated and transformed by myriad processes operating across many spatial and temporal scales. Evaluation of climate models and their sensitivity to changes, such as in greenhouse gas abundances, requires quantifying natural and anthropogenic aerosol forcings and accounting for other critical factors, such as cloud feedbacks. High accuracy is required to provide sufficient sensitivity to perturbations, separate anthropogenic from natural influences, and develop confidence in inputs used to support policy decisions. Although many relevant data sources exist, the aerosol research community does not currently have the means to combine these diverse inputs into an integrated data set for maximum scientific benefit. Bridging observational gaps, adapting to evolving measurements, and establishing rigorous protocols for evaluating models are necessary, while simultaneously maintaining consistent, well understood accuracies. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept represents a systematic, integrated approach to global aerosol characterization, bringing together modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies to provide the machinery necessary for achieving a comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system. We outline a framework for integrating and interpreting observations and models and establishing an accurate, consistent and cohesive long-term data record.

  15. Systematic Improvement of a Classical Molecular Model of Water

    PubMed Central

    Wang, Lee-Ping; Head-Gordon, Teresa; Ponder, Jay W.; Ren, Pengyu; Chodera, John D.; Eastman, Peter K.; Martinez, Todd J.; Pande, Vijay S.

    2013-01-01

    We report the iAMOEBA (i.e. “inexpensive AMOEBA”) classical polarizable water model. iAMOEBA uses a direct approximation to describe electronic polarizability, which reduces the computational cost relative to a fully polarizable model such as AMOEBA. The model is parameterized using ForceBalance, a systematic optimization method that simultaneously utilizes training data from experimental measurements and high-level ab initio calculations. We show that iAMOEBA is a highly accurate model for water in the solid, liquid, and gas phases, with the ability to fully capture the effects of electronic polarization and predict a comprehensive set of water properties beyond the training data set including the phase diagram. The increased accuracy of iAMOEBA over the fully polarizable AMOEBA model demonstrates ForceBalance as a method that allows the researcher to systematically improve empirical models by optimally utilizing the available data. PMID:23750713
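
    The core of a ForceBalance-style fit, a single objective that sums weighted squared residuals against heterogeneous reference data, can be sketched in miniature. The two one-parameter "property models" and their targets below are invented; real force-field optimization involves many parameters and expensive simulations:

```python
def objective(p, targets):
    """Sum of weighted squared residuals; targets are (model_fn, reference, weight)."""
    return sum(w * (f(p) - ref) ** 2 for f, ref, w in targets)

# Two hypothetical one-parameter property models with reference values:
# one "experimental" target and one "ab initio" target, combined in a
# single objective, mimicking the mixed training data described above.
targets = [
    (lambda p: 2.0 * p, 3.0, 1.0),
    (lambda p: p + 0.4, 1.9, 0.5),
]

# A crude grid search stands in for the real gradient-based optimizer.
grid = [i / 1000 for i in range(3001)]
best_p = min(grid, key=lambda p: objective(p, targets))
print(best_p)  # here both invented targets happen to agree at p = 1.5
```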

  16. Discovering a Gold Mine of Strategies for At-Risk Students through Systematic Staff Development.

    ERIC Educational Resources Information Center

    Bernal, Jesse R.; Villarreal, Diana

    This paper discusses an effective model of systematic staff development focusing on prevention and intervention strategies used with at-risk students. The following are key elements: (1) matching of the purposes of training to the goals of the school districts; (2) multiple and integrated activities; (3) participants' thorough orientation to the…

  17. A systematic approach for model verification: application on seven published activated sludge models.

    PubMed

    Hauduc, H; Rieger, L; Takács, I; Héduit, A; Vanrolleghem, P A; Gillot, S

    2010-01-01

    The quality of simulation results can be significantly affected by errors in the published model (typing, inconsistencies, gaps or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to point out the typing errors, inconsistencies and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verify models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Then, stoichiometry and kinetic rate expressions are checked for each model and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks. PMID:20182061
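
    The continuity check recommended above can be illustrated with a toy Gujer matrix: for each process, the stoichiometric coefficients weighted by a conservation factor (e.g., g COD per g of component, with oxygen counted negatively as in ASM conventions) must sum to zero. The two-process example is invented:

```python
def continuity_gaps(matrix, factors, tol=1e-9):
    """Indices of processes whose weighted stoichiometric sum is not zero."""
    bad = []
    for i, row in enumerate(matrix):
        residual = sum(nu * f for nu, f in zip(row, factors))
        if abs(residual) > tol:
            bad.append(i)
    return bad

# Components: [substrate, biomass, oxygen]; COD conservation factors
# (oxygen carries -1, as in ASM conventions).
factors = [1.0, 1.0, -1.0]
Y = 0.67                               # invented heterotrophic yield
growth = [-1 / Y, 1.0, -(1 - Y) / Y]   # balanced growth process
typo = [-1 / Y, 1.0, -(1 - 0.6) / Y]   # deliberate "typing error"
print(continuity_gaps([growth, typo], factors))  # flags the faulty process
```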

  18. A Demonstration of a Systematic Item-Reduction Approach Using Structural Equation Modeling

    ERIC Educational Resources Information Center

    Larwin, Karen; Harvey, Milton

    2012-01-01

    Establishing model parsimony is an important component of structural equation modeling (SEM). Unfortunately, little attention has been given to developing systematic procedures to accomplish this goal. To this end, the current study introduces an innovative application of the jackknife approach first presented in Rensvold and Cheung (1999). Unlike…

  19. Defining "innovativeness" in drug development: a systematic review.

    PubMed

    Kesselheim, A S; Wang, B; Avorn, J

    2013-09-01

    Some observers of drug development argue that the pace of pharmaceutical innovation is declining, but others deny that contention. This controversy may be due to different methods of defining and assessing innovation. We conducted a systematic review of the literature to develop a taxonomy of methods for measuring innovation in drug development. The 42 studies fell into four main categories: counts of new drugs approved, assessments of therapeutic value, economic outcomes, and patents issued. The definition determined whether a study found a positive or negative trend in innovative drug development. Of 21 studies that relied on counts, 9 (43%) concluded that the trend for drug discovery was favorable, 11 (52%) concluded that the trend was not favorable, and 1 reached no conclusion. By contrast, of 21 studies that used other measures of innovation, 0 concluded that the trend was favorable, 8 (47%) concluded that the trend was not favorable, and 13 reached no conclusion (P = 0.03). PMID:23722626

  20. A Systematic Review on Interventions Supporting Preceptor Development.

    PubMed

    Windey, Maryann; Lawrence, Carol; Guthrie, Kimberly; Weeks, Debra; Sullo, Elaine; Chapa, Deborah W

    2015-01-01

    Increases in newly licensed nurses and experienced nurses changing specialties create a challenge for nursing professional development specialists (NPDS). The NPDS must use the best available evidence in designing programs. A systematic review of interventions for developing preceptors is needed to inform the NPDS in best practice. A search was conducted for full-text, quantitative, and mixed-methods articles published after the year 2000. Over 4000 titles were initially identified, which yielded 12 research studies for evaluation and syntheses. Results identified a limited body of evidence reflecting a need for NPDS to increase efforts in measuring the effectiveness of preceptor development initiatives. (See CE Video, Supplemental Digital Content 1, http://links.lww.com/JNPD/A9). PMID:26580462

  1. A Systematic Review, Meta-Analysis, and Modeling Project

    PubMed Central

    Simeone, Regina M.; Devine, Owen J.; Marcinkevage, Jessica A.; Gilboa, Suzanne M.; Razzaghi, Hilda; Bardenheier, Barbara H.; Sharma, Andrea J.; Honein, Margaret A.

    2015-01-01

    Context Maternal pregestational diabetes (PGDM) is a risk factor for development of congenital heart defects (CHDs). Glycemic control before pregnancy reduces the risk of CHDs. A meta-analysis was used to estimate summary ORs and mathematical modeling was used to estimate population attributable fractions (PAFs) and the annual number of CHDs in the U.S. potentially preventable by establishing glycemic control before pregnancy. Evidence acquisition A systematic search of the literature through December 2012 was conducted in 2012 and 2013. Case–control or cohort studies were included. Data were abstracted from 12 studies for a meta-analysis of all CHDs. Evidence synthesis Summary estimates of the association between PGDM and CHDs and 95% credible intervals (95% CrIs) were developed using Bayesian random-effects meta-analyses for all CHDs and specific CHD subtypes. Posterior estimates of this association were combined with estimates of CHD prevalence to produce estimates of PAFs and annual prevented cases. Ninety-five percent uncertainty intervals (95% UIs) for estimates of the annual number of preventable cases were developed using Monte Carlo simulation. Analyses were conducted in 2013. The summary OR estimate for the association between PGDM and CHDs was 3.8 (95% CrI=3.0, 4.9). Approximately 2670 (95% UI=1795, 3795) cases of CHDs could potentially be prevented annually if all women in the U.S. with PGDM achieved glycemic control before pregnancy. Conclusions Estimates from this analysis suggest that preconception care of women with PGDM could have a measurable impact by reducing the number of infants born with CHDs. PMID:25326416
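
    The attributable-fraction arithmetic can be sketched with Levin's formula and a simple Monte Carlo propagation of the OR uncertainty. Only the summary OR of 3.8 comes from the abstract; the PGDM prevalence and the lognormal spread are assumed for illustration:

```python
import random

def paf(prevalence, rr):
    """Levin's population attributable fraction."""
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

p_pgdm = 0.01             # assumed prevalence of PGDM among pregnancies
point = paf(p_pgdm, 3.8)  # 3.8 is the summary OR from the abstract

# Monte Carlo propagation of OR uncertainty (lognormal spread assumed;
# ln(3.8) is approximately 1.335).
rng = random.Random(0)
draws = sorted(paf(p_pgdm, rng.lognormvariate(1.335, 0.12)) for _ in range(10_000))
low, high = draws[250], draws[-251]  # rough 95% uncertainty interval
print(round(point, 4), round(low, 4), round(high, 4))
```

    Multiplying the PAF by the annual number of CHD cases gives the potentially preventable count, as in the abstract.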

  2. Systematic discovery of nonobvious human disease models through orthologous phenotypes.

    PubMed

    McGary, Kriston L; Park, Tae Joo; Woods, John O; Cha, Hye Ji; Wallingford, John B; Marcotte, Edward M

    2010-04-01

    Biologists have long used model organisms to study human diseases, particularly when the model bears a close resemblance to the disease. We present a method that quantitatively and systematically identifies nonobvious equivalences between mutant phenotypes in different species, based on overlapping sets of orthologous genes from human, mouse, yeast, worm, and plant (212,542 gene-phenotype associations). These orthologous phenotypes, or phenologs, predict unique genes associated with diseases. Our method suggests a yeast model for angiogenesis defects, a worm model for breast cancer, mouse models of autism, and a plant model for the neural crest defects associated with Waardenburg syndrome, among others. Using these models, we show that SOX13 regulates angiogenesis, and that SEC23IP is a likely Waardenburg gene. Phenologs reveal functionally coherent, evolutionarily conserved gene networks-many predating the plant-animal divergence-capable of identifying candidate disease genes. PMID:20308572
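
    The statistical core of the phenolog approach, scoring the overlap between two phenotypes' ortholog gene sets with a hypergeometric tail probability, can be sketched as follows. The gene counts are toy values, not from the study:

```python
from math import comb

def hypergeom_sf(k, N, K, n):
    """P(overlap >= k) when n genes are drawn from N of which K are 'disease' genes."""
    return sum(comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)) / comb(N, n)

# Toy numbers: 10 of 30 human disease genes appear in a 40-gene worm
# phenotype set, out of 1000 shared orthologs (expected overlap: 1.2).
p = hypergeom_sf(10, 1000, 30, 40)
print(f"{p:.2e}")  # a tiny p suggests a nonobvious phenolog pair
```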

  3. Flipping the classroom to teach systematic reviews: the development of a continuing education course for librarians*

    PubMed Central

    Conte, Marisa L.; MacEachern, Mark P.; Mani, Nandita S.; Townsend, Whitney A.; Smith, Judith E.; Masters, Chase; Kelley, Caitlin

    2015-01-01

    Objective: The researchers used the flipped classroom model to develop and conduct a systematic review course for librarians. Setting: The research took place at an academic health sciences library. Method: A team of informationists developed and conducted a pilot course. Assessment informed changes to both course components; a second course addressed gaps in the pilot. Main Results: Both the pilot and subsequent course received positive reviews. Changes based on assessment data will inform future iterations. Conclusion: The flipped classroom model can be successful in developing and implementing a course that is well rated by students. PMID:25918484

  4. Analysis and Correction of Systematic Height Model Errors

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.

    2016-06-01

    The geometry of digital height models (DHM) determined with optical satellite stereo combinations depends upon the image orientation, which is influenced by the satellite camera, the system calibration, and the attitude registration. As is now standard, the image orientation is available in the form of rational polynomial coefficients (RPC). Usually a bias correction of the RPC based on ground control points is required. In most cases the bias correction requires an affine transformation, sometimes only shifts, in image or object space. For some satellites and some cases, such as those caused by small base length, such an image orientation does not achieve the possible accuracy of height models. As reported e.g. by Yong-hua et al. 2015 and Zhang et al. 2015, the Chinese stereo satellite ZiYuan-3 (ZY-3) in particular has a limited calibration accuracy and an attitude recording of only 4 Hz, which may not be satisfactory. Zhang et al. 2015 tried to improve the attitude based on the color sensor bands of ZY-3, but the color images are not always available, nor is detailed satellite orientation information. There is a tendency toward systematic deformation in a Pléiades tri-stereo combination with small base length; the small base length magnifies small systematic errors in object space. Systematic height model errors have also been detected in some other satellite stereo combinations. The largest influence is the unsatisfactory leveling of height models, but low-frequency height deformations can also be seen. In theory, a tilt of the DHM can be eliminated by ground control points (GCP), but often the GCP accuracy and distribution are not optimal, preventing correct leveling of the height model. In addition, model deformation at GCP locations may lead to suboptimal DHM leveling. Supported by reference height models, better accuracy has been reached. As reference height model, the Shuttle Radar Topography Mission (SRTM) digital surface model (DSM) or the new AW3D30 DSM, based on ALOS PRISM images, are
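
    The leveling step described above, removing a tilt by fitting a plane to the height differences between the DHM and a reference DSM such as SRTM, can be sketched as a least-squares fit. The sample points are synthetic:

```python
def plane_correction(samples):
    """Least-squares plane dz = a*x + b*y + c fitted to (x, y, dz) samples,
    solved with Cramer's rule on the 3x3 normal equations."""
    n = len(samples)
    Sx = sum(x for x, y, d in samples); Sy = sum(y for x, y, d in samples)
    Sxx = sum(x * x for x, y, d in samples); Syy = sum(y * y for x, y, d in samples)
    Sxy = sum(x * y for x, y, d in samples)
    Sd = sum(d for x, y, d in samples)
    Sxd = sum(x * d for x, y, d in samples); Syd = sum(y * d for x, y, d in samples)
    A = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    rhs = [Sxd, Syd, Sd]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(A)
    coeffs = []
    for col in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = rhs[r]
        coeffs.append(det3(M) / D)
    return coeffs  # a (x-tilt), b (y-tilt), c (vertical shift)

# Synthetic DHM-minus-reference differences lying exactly on a tilted plane.
samples = [(x, y, 0.002 * x - 0.001 * y + 1.5)
           for x in (0, 100, 200) for y in (0, 100, 200)]
a, b, c = plane_correction(samples)
print(a, b, c)  # the recovered tilt plane is subtracted from the DHM to level it
```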

  5. Systematic review of health-related quality of life models

    PubMed Central

    2012-01-01

    Background A systematic literature review was conducted to (a) identify the most frequently used health-related quality of life (HRQOL) models and (b) critique those models. Methods Online search engines were queried using pre-determined inclusion and exclusion criteria. We reviewed titles, abstracts, and then full-text articles for their relevance to this review. Then the most commonly used models were identified, reviewed in tables, and critiqued using published criteria. Results Of 1,602 titles identified, 100 articles from 21 countries met the inclusion criteria. The most frequently used HRQOL models were: Wilson and Cleary (16%), Ferrans and colleagues (4%), or World Health Organization (WHO) (5%). Ferrans and colleagues’ model was a revision of Wilson and Cleary’s model and appeared to have the greatest potential to guide future HRQOL research and practice. Conclusions Recommendations are for researchers to use one of the three common HRQOL models unless there are compelling and clearly delineated reasons for creating new models. Disease-specific models can be derived from one of the three commonly used HRQOL models. We recommend Ferrans and colleagues’ model because they added individual and environmental characteristics to the popular Wilson and Cleary model to better explain HRQOL. Using a common HRQOL model across studies will promote a coherent body of evidence that will more quickly advance the science in the area of HRQOL. PMID:23158687

  6. Systematic Review of Traumatic Brain Injury Animal Models.

    PubMed

    Phipps, Helen W

    2016-01-01

    The goals of this chapter are to provide an introduction into the variety of animal models available for studying traumatic brain injury (TBI) and to provide a concise systematic review of the general materials and methods involved in each model. Materials and methods were obtained from a literature search of relevant peer-reviewed articles. Strengths and weaknesses of each animal choice were presented to include relative cost, anatomical and physiological features, and mechanism of injury desired. Further, a variety of homologous, isomorphic/induced, and predictive animal models were defined, described, and compared with respect to their relative ease of use, characteristics, range, adjustability (e.g., amplitude, duration, mass/size, velocity, and pressure), and rough order of magnitude cost. Just as the primary mechanism of action of TBI is limitless, so are the animal models available to study TBI. With such a wide variety of available animals, types of injury models, along with the research needs, there exists no single "gold standard" model of TBI rendering cross-comparison of data extremely difficult. Therefore, this chapter reflects a representative sampling of the TBI animal models available and is not an exhaustive comparison of every possible model and associated parameters. Throughout this chapter, special considerations for animal choice and TBI animal model classification are discussed. Criteria central to choosing appropriate animal models of TBI include ethics, funding, complexity (ease of use, safety, and controlled access requirements), type of model, model characteristics, and range of control (scope). PMID:27604713

  7. Systematic review of character development and childhood chronic illness

    PubMed Central

    Maslow, Gary R; Hill, Sherika N

    2016-01-01

    AIM: To review empirical evidence on character development among youth with chronic illnesses. METHODS: A systematic literature review was conducted using PubMed and PSYCHINFO from inception until November 2013 to find quantitative studies that measured character strengths among youth with chronic illnesses. Inclusion criteria were limited to English language studies examining constructs of character development among adolescents or young adults aged 13-24 years with a childhood-onset chronic medical condition. A librarian at Duke University Medical Center Library assisted with the development of the MeSH search term. Two researchers independently reviewed relevant titles (n = 549), then abstracts (n = 45), and finally manuscripts (n = 3). RESULTS: There is a lack of empirical research on character development and childhood-onset chronic medical conditions. Three studies were identified that used different measures of character based on moral themes. One study examined moral reasoning among deaf adolescents using Kohlberg’s Moral Judgement Instrument; another investigated moral values of adolescent cancer survivors with the Values In Action Classification of Strengths. A third study evaluated moral behavior among young adult survivors of burn injury utilizing the Tennessee Self-Concept, 2nd edition. The studies observed that youth with chronic conditions reasoned at less advanced stages and had a lower moral self-concept compared to referent populations, but that they did not differ on character virtues and strengths when matched with healthy peers for age, sex, and race/ethnicity. Yet, generalizations could not be drawn regarding character development of youth with chronic medical conditions because the studies were too divergent from each other and biased by study design limitations. CONCLUSION: Future empirical studies should learn from the strengths and weaknesses of the existing literature on character development among youth with chronic medical conditions

  8. Models Predicting Success of Infertility Treatment: A Systematic Review

    PubMed Central

    Zarinara, Alireza; Zeraati, Hojjat; Kamali, Koorosh; Mohammad, Kazem; Shahnazari, Parisa; Akhondi, Mohammad Mehdi

    2016-01-01

    Background: Infertile couples face problems that affect their marital life. Infertility treatment is expensive, time consuming, and sometimes simply not possible. Prediction models for infertility treatment have been proposed, and prediction of treatment success is a new field in infertility treatment. Because predicting treatment success is a new need for infertile couples, this paper reviewed previous studies to form a general picture of the applicability of these models. Methods: This study was conducted as a systematic review at Avicenna Research Institute in 2015. Six databases were searched based on WHO definitions and MeSH keywords. Papers about prediction models in infertility were evaluated. Results: Eighty-one papers were eligible for the study. Papers covered the years after 1986, and studies were designed both retrospectively and prospectively. IVF prediction models accounted for the largest share of the papers. The most common predictors were age, duration of infertility, and ovarian and tubal problems. Conclusion: A prediction model can be applied clinically if it can be statistically evaluated and has good validation for treatment success. To achieve better results, the physician's and the couple's estimates of the treatment success rate should be based on history, examination, and clinical tests. Models must be checked for their theoretical approach and appropriately validated. The advantages of applying prediction models are reduced cost and time, avoidance of painful treatments, assessment of the treatment approach for physicians, and support for decision making by health managers. Careful selection of the approach for designing and using these models is essential. PMID:27141461

  9. Systematic Reconstruction of Molecular Cascades Regulating GP Development Using Single-Cell RNA-Seq.

    PubMed

    Li, Junxiang; Luo, Haofei; Wang, Rui; Lang, Jidong; Zhu, Siyu; Zhang, Zhenming; Fang, Jianhuo; Qu, Keke; Lin, Yuting; Long, Haizhou; Yao, Yi; Tian, Geng; Wu, Qiong

    2016-05-17

    The growth plate (GP) comprising sequentially differentiated cell layers is a critical structure for bone elongation and regeneration. Although several key regulators in GP development have been identified using genetic perturbation, systematic understanding is still limited. Here, we used single-cell RNA-sequencing (RNA-seq) to determine the gene expression profiles of 217 single cells from GPs and developed a bioinformatics pipeline named Sinova to de novo reconstruct physiological GP development in both temporal and spatial high resolution. Our unsupervised model not only confirmed prior knowledge, but also enabled the systematic discovery of genes, potential signal pathways, and surface markers CD9/CD200 to precisely depict development. Sinova further identified the effective combination of transcriptional factors (TFs) that regulates GP maturation, and the result was validated using an in vitro EGFP-Col10a screening system. Our case systematically reconstructed molecular cascades in GP development through single-cell profiling, and the bioinformatics pipeline is applicable to other developmental processes. VIDEO ABSTRACT. PMID:27160914

  10. Agent-Based Modeling of Noncommunicable Diseases: A Systematic Review

    PubMed Central

    Arah, Onyebuchi A.

    2015-01-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application. PMID:25602871
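
    A minimal agent-based model in the spirit of those reviewed, with agents adjusting a physical-activity habit toward the average of their social contacts plus noise, might look like this. The network, rates, and noise level are invented:

```python
import random

def step(activity, neighbors, rate=0.2, noise=0.05, rng=random):
    """One synchronous update: each agent moves toward its peer average, plus noise."""
    new = {}
    for agent, level in activity.items():
        peers = neighbors[agent]
        peer_mean = sum(activity[p] for p in peers) / len(peers)
        new[agent] = level + rate * (peer_mean - level) + rng.uniform(-noise, noise)
    return new

rng = random.Random(42)
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # a tiny fully connected network
activity = {0: 0.1, 1: 0.9, 2: 0.5}            # initial activity levels
for _ in range(50):
    activity = step(activity, neighbors, rng=rng)
spread = max(activity.values()) - min(activity.values())
print(round(spread, 3))  # social influence narrows the initial 0.8 spread
```

    Even this toy model has the ingredients the review checks for: explicit agents, an interaction network, and stochastic dynamics amenable to sensitivity analysis.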

  11. Obesity and socioeconomic status in developing countries: a systematic review

    PubMed Central

    Dinsa, GD; Goryakin, Y; Fumagalli, E; Suhrcke, M

    2012-01-01

    Summary: We undertook a systematic review of studies assessing the association between socioeconomic status (SES) and measured obesity in low- and middle-income countries (defined by the World Bank as countries with per capita income up to US$12,275) among children, men and women. The evidence on the subject has grown significantly since an earlier influential review was published in 2004. We find that in low-income countries or in countries with low human development index (HDI), the association between SES and obesity appears to be positive for both men and women: the more affluent and/or those with higher educational attainment tend to be more likely to be obese. However, in middle-income countries or in countries with medium HDI, the association becomes largely mixed for men and mainly negative for women. This particular shift appears to occur at an even lower level of per capita income than suggested by an influential earlier review. By contrast, obesity in children appears to be predominantly a problem of the rich in low- and middle-income countries. PMID:22764734

  12. A systematic review of animal models for experimental neuroma.

    PubMed

    Toia, Francesca; Giesen, Thomas; Giovanoli, Pietro; Calcagni, Maurizio

    2015-10-01

    Peripheral neuromas can result in an unbearable neuropathic pain and functional impairment. Their treatment is still challenging, and their optimal management is to be defined. Experimental research still plays a major role, but - although numerous neuroma models have been proposed on different animals - there is still no single model recognised as being the reference. Several models show advantages over the others in specific aspects of neuroma physiopathology, prevention or treatment, making it unlikely that a single model could be of reference. A reproducible and standardised model of peripheral neuroma would allow better comparison of results from different studies. We present a systematic review of the literature on experimental in vivo models, analysing advantages and disadvantages, specific features and indications, with the goal of providing suggestions to help their standardisation. Published models greatly differ in the animal and the nerve employed, the mechanisms of nerve injury and the evaluation methods. Specific experimental models exist for terminal neuromas and neuromas in continuity (NIC). The rat is the most widely employed animal, the rabbit being the second most popular model. NIC models are more actively researched, but it is more difficult to generate such studies in a reproducible manner. Nerve transection is considered the best method to cause terminal neuromas, whereas partial transection is the best method to cause NIC. Traditional histomorphology is the historical gold-standard evaluation method, but immunolabelling, reverse transcriptase-polymerase chain reaction (RT-PCR) and proteomics are gaining increasing popularity. Computerised gait analysis is the gold standard for motor-recovery evaluation, whereas mechanical testing of allodynia and hyperalgesia reproducibly assesses sensory recovery. This review summarises current knowledge on experimental neuroma models, and it provides a useful tool for defining experimental protocols

  13. SCID: Model for Effective Instructional Development.

    ERIC Educational Resources Information Center

    Norton, Robert E.

    The Systematic Curriculum and Instructional Development (SCID) model provides a tested procedure for developing high-quality, low-cost competency-based education and tech prep curriculum and instructional materials. It consists of 5 phases--analysis, design, development, implementation, and evaluation--and 23 components. The analysis phase…

  14. A systematic review of animal models for Staphylococcus aureus osteomyelitis

    PubMed Central

    Reizner, W.; Hunter, J.G.; O’Malley, N.T.; Southgate, R.D.; Schwarz, E.M.; Kates, S.L.

    2015-01-01

    Staphylococcus aureus (S. aureus) osteomyelitis is a significant complication for orthopaedic patients undergoing surgery, particularly with fracture fixation and arthroplasty. Given the difficulty in studying S. aureus infections in human subjects, animal models serve an integral role in exploring the pathogenesis of osteomyelitis, and aid in determining the efficacy of prophylactic and therapeutic treatments. Animal models should mimic the clinical scenarios seen in patients as closely as possible to permit the experimental results to be translated to the corresponding clinical care. To help understand existing animal models of S. aureus, we conducted a systematic search of PubMed & Ovid MEDLINE to identify in vivo animal experiments that have investigated the management of S. aureus osteomyelitis in the context of fractures and metallic implants. In this review, experimental studies are categorized by animal species and are further classified by the setting of the infection. Study methods are summarized and the relevant advantages and disadvantages of each species and model are discussed. While no ideal animal model exists, the understanding of a model’s strengths and limitations should assist clinicians and researchers to appropriately select an animal model to translate the conclusions to the clinical setting. PMID:24668594

  15. Background model systematics for the Fermi GeV excess

    NASA Astrophysics Data System (ADS)

    Calore, Francesca; Cholis, Ilias; Weniger, Christoph

    2015-03-01

    The possible gamma-ray excess in the inner Galaxy and the Galactic center (GC) suggested by Fermi-LAT observations has triggered a large number of studies. It has been interpreted as a variety of different phenomena such as a signal from WIMP dark matter annihilation, gamma-ray emission from a population of millisecond pulsars, or emission from cosmic rays injected in a sequence of burst-like events or continuously at the GC. We present the first comprehensive study of model systematics coming from the Galactic diffuse emission in the inner part of our Galaxy and their impact on the inferred properties of the excess emission at Galactic latitudes 2° < |b| < 20° and 300 MeV to 500 GeV. We study both theoretical and empirical model systematics, which we deduce from a large range of Galactic diffuse emission models and a principal component analysis of residuals in numerous test regions along the Galactic plane. We show that the hypothesis of an extended spherical excess emission with a uniform energy spectrum is compatible with the Fermi-LAT data in our region of interest at 95% CL. Assuming that this excess is the extended counterpart of the one seen in the inner few degrees of the Galaxy, we derive a lower limit of 10.0° (95% CL) on its extension away from the GC. We show that, in light of the large correlated uncertainties that affect the subtraction of the Galactic diffuse emission in the relevant regions, the energy spectrum of the excess is equally compatible with both a simple broken power-law of break energy E_break = 2.1 ± 0.2 GeV, and with spectra predicted by the self-annihilation of dark matter, implying in the case of b̄b final states a dark matter mass of m_χ = 49 (+6.4/-5.4) GeV.
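    The broken power-law fit quoted above is simple enough to write out. In the sketch below, only the break energy E_break = 2.1 GeV comes from the abstract; the photon indices and normalisation are hypothetical placeholders.

```python
# Broken power-law photon spectrum dN/dE for the GeV excess fit.
# Only E_break = 2.1 GeV is taken from the text; gamma_low, gamma_high
# and N0 are hypothetical placeholder values.

def broken_power_law(E, E_break=2.1, gamma_low=1.4, gamma_high=2.6, N0=1.0):
    """Return dN/dE in arbitrary units; E and E_break in GeV."""
    index = gamma_low if E <= E_break else gamma_high
    return N0 * (E / E_break) ** (-index)
```

    By construction the spectrum is continuous at the break and steepens above it.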

  16. Communication for Development Interventions in Fragile States: A Systematic Review

    PubMed Central

    Skuse, Andrew; Rodger, Dianne; Power, Gerry; Mbus, Domenic Friguglietti; Brimacombe, Tait

    2013-01-01

    Executive summary. Background: A wide range of contextual and programmatic factors frame, affect and constrain communication for development (C4D) interventions undertaken in fragile or conflict affected states. For the purposes of this review, contextual factors include culture, poverty, different stages of conflict (such as latent, open or post-conflict scenarios), policy, legislation and so on, while programmatic factors include the type of intervention, formative and summative evaluation, project design and management, human and financial resources and so on. Understanding the various factors that influence C4D interventions in fragile states is important to improving practice, implementation and evaluation, as well as to the future development of methodologies and frameworks that can be utilised in conflict or crisis situations. Objective: The objective of this review is to assess the contextual and programmatic factors that influence communication for development interventions in fragile states. Types of participants: Persons, regardless of age, gender and ethnicity, living in fragile states. Phenomena of interest: The contextual and programmatic factors that influence communication for development (C4D) interventions in fragile states. Types of studies: Qualitative peer reviewed studies, expert opinion, discussion papers, project reports, policy papers, position papers and other text. Search strategy: Searches were conducted for published and unpublished material (between January 2001 and September 2011), including grey literature, in the English language. Databases searched were: Academic Search Premier; African Women's Bibliographic Database; Anthropology Plus; Bibliography of Asian Studies; Educational Resources Information Centre; Ingenta Connect; JSTOR; Scopus; and Sociological Abstracts; Communication for Social Change Consortium; DevComm (World Bank); Eldis; Search for Common Ground; The Communication Initiative; United Nations Development Programme

  17. Using Laser Scanners to Augment the Systematic Error Pointing Model

    NASA Astrophysics Data System (ADS)

    Wernicke, D. R.

    2016-08-01

    The antennas of the Deep Space Network (DSN) rely on precise pointing algorithms to communicate with spacecraft that are billions of miles away. Although the existing systematic error pointing model is effective at reducing blind pointing errors due to static misalignments, several of its terms have a strong dependence on seasonal and even daily thermal variation and are thus not easily modeled. Changes in the thermal state of the structure create a separation from the model and introduce a varying pointing offset. Compensating for this varying offset is possible by augmenting the pointing model with laser scanners. In this approach, laser scanners mounted to the alidade measure structural displacements while a series of transformations generate correction angles. Two sets of experiments were conducted in August 2015 using commercially available laser scanners. When compared with historical monopulse corrections under similar conditions, the computed corrections are within 3 mdeg of the mean. However, although the results show promise, several key challenges relating to the sensitivity of the optical equipment to sunlight render an implementation of this approach impractical. Other measurement devices such as inclinometers may be implementable at a significantly lower cost.

  18. Systematic Errors of the FSU Global Spectral Model

    NASA Astrophysics Data System (ADS)

    Surgi, Naomi

    Three 20 day winter forecasts have been carried out using the Florida State University Global Spectral Model to examine the systematic errors of the model. Most GCMs and global forecast models exhibit the same kind of error patterns even though the model formulations vary somewhat between them. Some of the dominant errors are a breakdown of the trade winds in the low latitudes, an over-prediction of the subtropical jets accompanied by an upward and poleward shift of the jets, an error in the mean sea-level pressure with over-intensification of the quasi-stationary oceanic lows and continental highs and a warming of the tropical mid and upper troposphere. In this study, a number of sensitivity experiments have been performed for which orography, model physics and initialization are considered as possible causes of these errors. A parameterization of the vertical distribution of momentum due to the sub-grid scale orography has been implemented in the model to address the model deficiencies associated with orographic forcing. This scheme incorporates the effects of moisture on the wave induced stress. The parameterization of gravity wave drag is shown to substantially reduce the large-scale wind and height errors in regions of direct forcing and well downstream of the mountainous regions. Also, a parameterization of the heat and moisture transport associated with shallow convection is found to have a positive impact on the errors, particularly in the tropics. This is accomplished by the increase of moisture supply from the subtropics into the deep tropics and a subsequent enhancement of the secondary circulations. A dynamic relaxation was carried out to examine the impact of the long wave errors on the shorter waves. By constraining the long wave error, improvement is shown for wavenumbers 5-7 on medium to extended range time intervals. Thus, improved predictability of the transient flow is expected by applying this initialization procedure.

  19. Prevalence of Gastrointestinal Pathogens In Developed and Developing Countries: Systematic Review and Meta-Analysis

    PubMed Central

    Fletcher, Stephanie M.; McLaws, Mary-Louise; Ellis, John T.

    2013-01-01

    Diarrhoeal illness is a leading cause of child mortality and morbidity worldwide. There are no precise or current estimates of the types and prevalence of pathogens associated with diarrheal illnesses in developed and developing settings. This systematic review assessed data from 60 studies published in the English language from five developing regions and developed countries worldwide to provide regional estimates of enteric pathogens affecting children. The random-effect method was used to establish the weighted average prevalence of pathogens in adults and children for each region. Significantly more pathogens were reported by studies from developing regions compared with Organisation for Economic Co-operation and Development countries (P=0.016). The identification rates of pathogens from community based and hospital based studies were similar (58.5% and 58.1% respectively, P=0.619). The overall detection of enteric pathogens in developing countries was higher in adults (74.8%; 95% CI 63.1-83.8%) compared with children (56.7%; 95% CI 53.0-60.4%) (P<0.001). Rotavirus was the most frequently detected pathogen in all regions with the highest rate, 24.8% (95% CI 18.0-33.1%), detected in the developed countries. This systematic review is the first to provide an estimate of the prevalence of enteric pathogens associated with diarrhoeal illnesses in adults and children in developed and developing settings. While pathogen detection rate is greater in developing regions, the consistently high prevalence of rotavirus in both developed and developing settings underscores the urgent need for access to rotavirus vaccines. Increased travel between developing and developed countries increases disease risk, and hence developed countries have a vested interest in supporting vaccine accessibility in developing settings. PMID:25170480
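    The random-effect pooling mentioned above can be sketched with the standard DerSimonian-Laird estimator. The study figures below are made-up illustrations, not data from the review.

```python
# Random-effects pooling of study prevalences (DerSimonian-Laird).
# The three studies at the bottom are hypothetical examples.

def dersimonian_laird(estimates, variances):
    """Return (pooled_estimate, tau_squared) for a random-effects model."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q heterogeneity statistic and the DL between-study variance
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (len(estimates) - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    return pooled, tau2

# Three hypothetical studies reporting a pathogen prevalence
pooled, tau2 = dersimonian_laird([0.25, 0.20, 0.30], [0.002, 0.001, 0.003])
```

    When tau2 > 0, the random-effects weights are more even than the fixed-effect weights, so no single large study dominates the pooled prevalence.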

  20. Technology Development Roadmaps - a Systematic Approach to Maturing Needed Technologies

    SciTech Connect

    John W. Collins; Layne Pincock

    2010-07-01

    Planning and decision making represent important challenges for all projects. This paper presents the steps needed to assess technical readiness and determine the path forward to mature the technologies required for the Next Generation Nuclear Plant. A Technology Readiness Assessment is used to evaluate the required systems, subsystems, and components (SSC) comprising the desired plant architecture and assess the SSCs against established Technology Readiness Levels (TRLs). A validated TRL baseline is then established for the proposed physical design. Technology Development Roadmaps are generated to define the path forward and focus project research and development and engineering tasks on advancing the technologies to increasing levels of maturity. Tasks include modeling, testing, bench-scale demonstrations, pilot-scale demonstrations, and fully integrated prototype demonstrations. The roadmaps identify precise project objectives and requirements; create a consensus vision of project needs; provide a structured, defensible, decision-based project plan; and minimize project costs and schedules.

  1. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    PubMed Central

    Speybroeck, Niko; Van Malderen, Carine; Harper, Sam; Müller, Birgit; Devleesschauwer, Brecht

    2013-01-01

    Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks. PMID:24192788
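    A toy version of the agent-based illustration described above takes only a few lines. Every rule and parameter here is a hypothetical assumption, chosen solely to show how an SES-dependent risk plus a peer effect can generate a group-level gap.

```python
import random

# Minimal agent-based sketch: low-SES agents carry extra per-step risk of
# a health outcome, and everyone is nudged by current overall prevalence
# (a crude reciprocal/indirect effect). All parameters are hypothetical.

random.seed(42)

class Agent:
    def __init__(self, low_ses):
        self.low_ses = low_ses
        self.affected = False

def step(agents, base_risk=0.01, ses_penalty=0.02, peer_weight=0.1):
    prevalence = sum(a.affected for a in agents) / len(agents)
    for a in agents:
        risk = base_risk + (ses_penalty if a.low_ses else 0.0)
        risk += peer_weight * prevalence  # peer/contextual influence
        if not a.affected and random.random() < risk:
            a.affected = True

agents = [Agent(low_ses=(i % 2 == 0)) for i in range(1000)]
for _ in range(50):
    step(agents)

low_prev = sum(a.affected for a in agents if a.low_ses) / 500
high_prev = sum(a.affected for a in agents if not a.low_ses) / 500
```

    After 50 steps the low-SES group should show a visibly higher prevalence, an inequality that emerges from the interaction rules rather than being coded in directly.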

  2. Background model systematics for the Fermi GeV excess

    SciTech Connect

    Calore, Francesca; Cholis, Ilias; Weniger, Christoph

    2015-03-01

    The possible gamma-ray excess in the inner Galaxy and the Galactic center (GC) suggested by Fermi-LAT observations has triggered a large number of studies. It has been interpreted as a variety of different phenomena such as a signal from WIMP dark matter annihilation, gamma-ray emission from a population of millisecond pulsars, or emission from cosmic rays injected in a sequence of burst-like events or continuously at the GC. We present the first comprehensive study of model systematics coming from the Galactic diffuse emission in the inner part of our Galaxy and their impact on the inferred properties of the excess emission at Galactic latitudes 2° < |b| < 20° and 300 MeV to 500 GeV. We study both theoretical and empirical model systematics, which we deduce from a large range of Galactic diffuse emission models and a principal component analysis of residuals in numerous test regions along the Galactic plane. We show that the hypothesis of an extended spherical excess emission with a uniform energy spectrum is compatible with the Fermi-LAT data in our region of interest at 95% CL. Assuming that this excess is the extended counterpart of the one seen in the inner few degrees of the Galaxy, we derive a lower limit of 10.0° (95% CL) on its extension away from the GC. We show that, in light of the large correlated uncertainties that affect the subtraction of the Galactic diffuse emission in the relevant regions, the energy spectrum of the excess is equally compatible with both a simple broken power-law of break energy E_break = 2.1 ± 0.2 GeV, and with spectra predicted by the self-annihilation of dark matter, implying in the case of b̄b final states a dark matter mass of m_χ = 49 (+6.4/-5.4) GeV.

  3. Procedure for the systematic orientation of digitised cranial models. Design and validation.

    PubMed

    Bailo, M; Baena, S; Marín, J J; Arredondo, J M; Auría, J M; Sánchez, B; Tardío, E; Falcón, L

    2015-12-01

    Comparison of bony pieces requires that they are oriented systematically to ensure that homologous regions are compared. Few orientation methods are highly accurate; this is particularly true for methods applied to three-dimensional models obtained by surface scanning, a technique whose special features make it a powerful tool in forensic contexts. The aim of this study was to develop and evaluate a systematic, assisted orientation method for aligning three-dimensional cranial models relative to the Frankfurt Plane, which would produce accurate orientations independent of operator and anthropological expertise. The study sample comprised four crania of known age and sex. All the crania were scanned and reconstructed using an Eva Artec™ portable 3D surface scanner and subsequently, the positions of certain characteristic landmarks were determined by three different operators using the Rhinoceros 3D surface modelling software. Intra-observer analysis showed a tendency for orientation to be more accurate when using the assisted method than when using conventional manual orientation. Inter-observer analysis showed that experienced evaluators achieve results at least as accurate, if not more accurate, using the assisted method than those obtained using manual orientation, while inexperienced evaluators achieved more accurate orientation using the assisted method. The method tested is an innovative system capable of providing very precise, systematic and automatised spatial orientations of virtual cranial models relative to standardised anatomical planes, independent of the operator and operator experience. PMID:26481346

  4. Decoding β-decay systematics: A global statistical model for β⁻ half-lives

    SciTech Connect

    Costiris, N. J.; Mavrommatis, E.; Gernoth, K. A.; Clark, J. W.

    2009-10-15

    Statistical modeling of nuclear data provides a novel approach to nuclear systematics complementary to established theoretical and phenomenological approaches based on quantum theory. Continuing previous studies in which global statistical modeling is pursued within the general framework of machine learning theory, we implement advances in training algorithms designed to improve generalization, in application to the problem of reproducing and predicting the half-lives of nuclear ground states that decay 100% by the β⁻ mode. More specifically, fully connected, multilayer feed-forward artificial neural network models are developed using the Levenberg-Marquardt optimization algorithm together with Bayesian regularization and cross-validation. The predictive performance of models emerging from extensive computer experiments is compared with that of traditional microscopic and phenomenological models as well as with the performance of other learning systems, including earlier neural network models as well as the support vector machines recently applied to the same problem. In discussing the results, emphasis is placed on predictions for nuclei that are far from the stability line, and especially those involved in r-process nucleosynthesis. It is found that the new statistical models can match or even surpass the predictive performance of conventional models for β-decay systematics and accordingly should provide a valuable additional tool for exploring the expanding nuclear landscape.

  5. Decoding β-decay systematics: A global statistical model for β- half-lives

    NASA Astrophysics Data System (ADS)

    Costiris, N. J.; Mavrommatis, E.; Gernoth, K. A.; Clark, J. W.

    2009-10-01

    Statistical modeling of nuclear data provides a novel approach to nuclear systematics complementary to established theoretical and phenomenological approaches based on quantum theory. Continuing previous studies in which global statistical modeling is pursued within the general framework of machine learning theory, we implement advances in training algorithms designed to improve generalization, in application to the problem of reproducing and predicting the half-lives of nuclear ground states that decay 100% by the β- mode. More specifically, fully connected, multilayer feed-forward artificial neural network models are developed using the Levenberg-Marquardt optimization algorithm together with Bayesian regularization and cross-validation. The predictive performance of models emerging from extensive computer experiments is compared with that of traditional microscopic and phenomenological models as well as with the performance of other learning systems, including earlier neural network models as well as the support vector machines recently applied to the same problem. In discussing the results, emphasis is placed on predictions for nuclei that are far from the stability line, and especially those involved in r-process nucleosynthesis. It is found that the new statistical models can match or even surpass the predictive performance of conventional models for β-decay systematics and accordingly should provide a valuable additional tool for exploring the expanding nuclear landscape.
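    At toy scale, the modelling approach above can be illustrated as follows. This sketch trains a one-hidden-layer feed-forward network by stochastic gradient descent with an L2 weight penalty, a deliberately simplified stand-in for the paper's Levenberg-Marquardt optimization with Bayesian regularization; the data are a synthetic curve, not nuclear half-lives.

```python
import math
import random

# One-hidden-layer feed-forward regressor with weight decay, fit to a
# synthetic target. A simplified stand-in for Levenberg-Marquardt
# training with Bayesian regularization; all settings are illustrative.

random.seed(0)
train = [(x / 10.0, math.sin(x / 10.0)) for x in range(-30, 31)]

H = 8  # hidden units
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def mean_squared_error():
    return sum((forward(x)[0] - y) ** 2 for x, y in train) / len(train)

mse_before = mean_squared_error()

lr, decay = 0.02, 1e-4  # learning rate and L2 penalty strength
for _ in range(2000):
    for x, y in train:
        yhat, h = forward(x)
        err = yhat - y
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)  # uses pre-update w2
            w2[j] -= lr * (err * h[j] + decay * w2[j])
            w1[j] -= lr * (grad_h * x + decay * w1[j])
            b1[j] -= lr * grad_h
        b2 -= lr * err

mse_after = mean_squared_error()
```

    The decay term shrinks the weights a little on every update, the same qualitative role the Bayesian regularization term plays in the original models.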

  6. Develop a Model Component

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop users guide; send model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose. The component I was assigned, specifically, was a

  7. Prediction models for cardiovascular disease risk in the general population: systematic review

    PubMed Central

    Hooft, Lotty; Schuit, Ewoud; Debray, Thomas P A; Collins, Gary S; Tzoulaki, Ioanna; Lassale, Camille M; Siontis, George C M; Chiocchia, Virginia; Roberts, Corran; Schlüssel, Michael Maia; Gerry, Stephen; Black, James A; Heus, Pauline; van der Schouw, Yvonne T; Peelen, Linda M; Moons, Karel G M

    2016-01-01

    Objective: To provide an overview of prediction models for risk of cardiovascular disease (CVD) in the general population. Design: Systematic review. Data sources: Medline and Embase until June 2013. Eligibility criteria for study selection: Studies describing the development or external validation of a multivariable model for predicting CVD risk in the general population. Results: 9965 references were screened, of which 212 articles were included in the review, describing the development of 363 prediction models and 473 external validations. Most models were developed in Europe (n=167, 46%), predicted risk of fatal or non-fatal coronary heart disease (n=118, 33%) over a 10 year period (n=209, 58%). The most common predictors were smoking (n=325, 90%) and age (n=321, 88%), and most models were sex specific (n=250, 69%). Substantial heterogeneity in predictor and outcome definitions was observed between models, and important clinical and methodological information was often missing. The prediction horizon was not specified for 49 models (13%), and for 92 (25%) crucial information was missing to enable the model to be used for individual risk prediction. Only 132 developed models (36%) were externally validated and only 70 (19%) by independent investigators. Model performance was heterogeneous and measures such as discrimination and calibration were reported for only 65% and 58% of the external validations, respectively. Conclusions: There is an excess of models predicting incident CVD in the general population. The usefulness of most of the models remains unclear owing to methodological shortcomings, incomplete presentation, and lack of external validation and model impact studies. Rather than developing yet another similar CVD risk prediction model, in this era of large datasets, future research should focus on externally validating and comparing head-to-head promising CVD risk models that already exist, on tailoring or even combining these models to local

  8. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background: Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods: A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results: Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions: We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  9. What should be considered if you decide to build your own mathematical model for predicting the development of bacterial resistance? Recommendations based on a systematic review of the literature

    PubMed Central

    Arepeva, Maria; Kolbin, Alexey; Kurylev, Alexey; Balykina, Julia; Sidorenko, Sergey

    2015-01-01

    Acquired bacterial resistance is one of the causes of mortality and morbidity from infectious diseases. Mathematical modeling allows us to predict the spread of resistance and to some extent to control its dynamics. The purpose of this review was to examine existing mathematical models in order to understand the pros and cons of currently used approaches and to build our own model. During the analysis, seven articles on mathematical approaches to studying resistance that satisfied the inclusion/exclusion criteria were selected. All models were classified according to the approach used to study resistance in the presence of an antibiotic and were analyzed in terms of our research. Some models require modifications due to the specifics of the research. The plan for further work on model building is as follows: modify some models, according to our research, check all obtained models against our data, and select the optimal model or models with the best quality of prediction. After that we would be able to build a model for the development of resistance using the obtained results. PMID:25972847

  10. UCNA Systematic Uncertainties: Developments in Analysis and Method

    NASA Astrophysics Data System (ADS)

    Zeck, Bryan

    2012-10-01

    The UCNA experiment is an effort to measure the beta-decay asymmetry parameter A of the correlation between the electron momentum and the neutron spin, using bottled polarized ultracold neutrons in a homogenous 1 T magnetic field. Continued improvements in both analysis and method are helping to push the measurement uncertainty to the limits of the current statistical sensitivity (less than 0.4%). The implementation of thinner decay trap windows will be discussed, as will the use of a tagged beta particle calibration source to measure angle-dependent scattering effects and energy loss. Additionally, improvements in position reconstruction and polarization measurements using a new shutter system will be introduced. A full accounting of the current systematic uncertainties will be given.

  11. Systematic Model Building Based on Quark-Lepton Complementarity Assumptions

    NASA Astrophysics Data System (ADS)

    Winter, Walter

    2008-02-01

    In this talk, we present a procedure to systematically generate a large number of valid mass matrix textures from very generic assumptions. Compared to plain anarchy arguments, we postulate some structure for the theory, such as a possible connection between quarks and leptons, and a mechanism to generate flavor structure. We illustrate how this parameter space can be used to test the exclusion power of future experiments, and we point out that one can systematically generate embeddings in ZN product flavor symmetry groups.

  12. Model-Driven Design: Systematically Building Integrated Blended Learning Experiences

    ERIC Educational Resources Information Center

    Laster, Stephen

    2010-01-01

    Developing and delivering curricula that are integrated and that use blended learning techniques requires a highly orchestrated design. While institutions have demonstrated the ability to design complex curricula on an ad-hoc basis, these projects are generally successful at a great human and capital cost. Model-driven design provides a…

  13. The Analysis on Systematic Development of College Microlecture

    ERIC Educational Resources Information Center

    Liu, Xiaohong; Wang, Lisi

    2013-01-01

    In order to apply microlectures to college education successfully and to construct new teaching and learning strategies and a teaching model, this paper proposes characteristics of the college microlecture based on college education features and constructs a microlecture structure model based on the definitions given by experts and scholars. Microlecture's…

  14. Systematic Model Building Based on Quark-Lepton Complementarity Assumptions

    SciTech Connect

    Winter, Walter

    2008-02-21

    In this talk, we present a procedure to systematically generate a large number of valid mass matrix textures from very generic assumptions. Compared to plain anarchy arguments, we postulate some structure for the theory, such as a possible connection between quarks and leptons, and a mechanism to generate flavor structure. We illustrate how this parameter space can be used to test the exclusion power of future experiments, and we point out that one can systematically generate embeddings in Z{sub N} product flavor symmetry groups.

  15. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    NASA Astrophysics Data System (ADS)

    Wei, Zonghui; Luijten, Erik

    2015-12-01

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  16. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    SciTech Connect

    Wei, Zonghui; Luijten, Erik

    2015-12-28

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  17. Systematic Development of Special Educators as Facilitators of Mainstreaming.

    ERIC Educational Resources Information Center

    Peryon, Charleen Dolphin

    1982-01-01

    A discussion is presented on the knowledge and skills helpful to special educators in their role as consulting teachers in mainstreaming. A parallel is drawn between adult development phases and career development. Sources of resistance to mainstreaming are cited. Three modes of consulting (provision, prescriptive, and mediation) are described.…

  18. ATMOSPHERIC MODEL DEVELOPMENT

    EPA Science Inventory

    This task provides credible state of the art air quality models and guidance for use in implementation of National Ambient Air Quality Standards for ozone and PM. This research effort is to develop and improve air quality models, such as the Community Multiscale Air Quality (CMA...

  19. Development of an integrated surface stimulation device for systematic evaluation of wound electrotherapy.

    PubMed

    Howe, D S; Dunning, J; Zorman, C; Garverick, S L; Bogie, K M

    2015-02-01

    Ideally, all chronic wounds would be prevented as they can become life-threatening complications. The observation that a wound produces a 'current of injury' due to the discontinuity in the electrical field of intact skin provides the basis for the concept that electrical stimulation (ES) may provide an effective treatment for chronic wounds. The optimal stimulation waveform parameters are unknown, limiting the reliability of achieving a successful clinical therapeutic outcome. In order to gain a more thorough understanding of ES for chronic wound therapy, systematic evaluation using a valid in vivo model is required. The focus of the current paper is the development of the flexible modular surface stimulation (MSS) device by our group. This device can be programmed to deliver a variety of clinically relevant stimulation paradigms and is essential to facilitate systematic in vivo studies. The MSS version 2.0 for small animal use provides all components of a single-channel, programmable current-controlled ES system within a lightweight, flexible, independently-powered portable device. Benchtop testing and validation indicates that custom electronics and control algorithms support the generation of high-voltage, low duty-cycle current pulses in a power-efficient manner, extending battery life and allowing ES therapy to be delivered for up to 7 days without needing to replace or disturb the wound dressing. PMID:25274162

  20. Development of a systematic methodology for evaluation of soil vapor extraction at Rocky Mountain Arsenal

    SciTech Connect

    Aamodt, E.C.; Gilmore, J.E.; Weaver, J.D.; Dahm, M.A.; Riese, A.C.; Tortoso, A.

    1994-12-31

    A systematic methodology was developed to evaluate the feasibility of using soil vapor extraction (SVE) to treat South Plants and former Basin F soil media in support of the ongoing Onpost Operable Unit Feasibility Study (FS) at Rocky Mountain Arsenal. The methodology used in situ air permeability testing, chemical and physical property characterization, and computer modeling to evaluate the potential for using SVE to treat soil contaminated with volatile organic compounds (VOCs), semivolatile organic compounds (SVOCs), and pesticide manufacturing process wastes, including potential odor-causing compounds. In situ air permeability tests were performed to measure air permeabilities and extracted vapor flow rates. Soil samples were collected at each test location and were analyzed for VOCs, low molecular weight SVOCs, potential odor-causing compounds, and physical property characteristics. In situ air permeability test and chemical and physical property characterization results were used during computer modeling evaluations to develop SVE conceptual designs, estimate remediation timeframes, select appropriate treatment technologies, and develop preliminary cost estimates for full-scale implementation. The methodology developed provides information necessary to evaluate SVE at the FS stage and provides a sound technical basis for design of full-scale SVE systems.

  1. Reference Model Development

    SciTech Connect

    Jepsen, Richard

    2011-11-02

    Presentation from the 2011 Water Peer Review in which principal investigator discusses project progress to develop a representative set of Reference Models (RM) for the MHK industry to develop baseline cost of energy (COE) and evaluate key cost component/system reduction pathways.

  2. Predicting perioperative mortality after oesophagectomy: a systematic review of performance and methods of multivariate models.

    PubMed

    Warnell, I; Chincholkar, M; Eccles, M

    2015-01-01

    Predicting risk of perioperative mortality after oesophagectomy for cancer may assist patients to make treatment choices and allow balanced comparison of providers. The aim of this systematic review of multivariate prediction models is to report their performance in new patients, and compare study methods against current recommendations. We used PRISMA guidelines and searched Medline, Embase, and standard texts from 1990 to 2012. Inclusion criteria were English language articles reporting development and validation of prediction models of perioperative mortality after open oesophagectomy. Two reviewers screened articles and extracted data for methods, results, and potential biases. We identified 11 development, 10 external validation, and two clinical impact studies. Overestimation of predicted mortality was common (5-200% error), discrimination was poor to moderate (areas under receiver operating characteristic curves ranged from 0.58 to 0.78), and reporting of potential bias was poor. There were potentially important case mix differences between modelling and validation samples, and sample sizes were considerably smaller than is currently recommended. Steyerberg and colleagues' model used the most 'transportable' predictors and was validated in the largest sample. Most models have not been adequately validated and reported performance has been unsatisfactory. There is a need to clarify definition, effect size, and selection of currently available candidate predictors for inclusion in prediction models, and to identify new ones strongly associated with outcome. Adoption of prediction models into practice requires further development and validation in well-designed large sample prospective studies. PMID:25231768

  3. Economic evaluation in chronic pain: a systematic review and de novo flexible economic model.

    PubMed

    Sullivan, W; Hirst, M; Beard, S; Gladwell, D; Fagnani, F; López Bastida, J; Phillips, C; Dunlop, W C N

    2016-07-01

    There is unmet need in patients suffering from chronic pain, yet innovation may be impeded by the difficulty of justifying economic value in a field beset by data limitations and methodological variability. A systematic review was conducted to identify and summarise the key areas of variability and limitations in modelling approaches in the economic evaluation of treatments for chronic pain. The results of the literature review were then used to support the development of a fully flexible open-source economic model structure, designed to test structural and data assumptions and act as a reference for future modelling practice. The key model design themes identified from the systematic review included: time horizon; titration and stabilisation; number of treatment lines; choice/ordering of treatment; and the impact of parameter uncertainty (given reliance on expert opinion). Exploratory analyses using the model to compare a hypothetical novel therapy versus morphine as first-line treatments showed cost-effectiveness results to be sensitive to structural and data assumptions. Assumptions about the treatment pathway and choice of time horizon were key model drivers. Our results suggest structural model design and data assumptions may have driven previous cost-effectiveness results and ultimately decisions based on economic value. We therefore conclude that it is vital that future economic models in chronic pain are designed to be fully transparent, and we hope our open-source code supports a common approach to modelling pain that includes robust sensitivity analyses to test structural and parameter uncertainty. PMID:26377997
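
    The sensitivity to assumptions described above can be illustrated with an incremental cost-effectiveness ratio (ICER), the standard summary measure in such models. The sketch below compares a hypothetical novel therapy against morphine and varies one assumption at a time; all cost and QALY figures are invented for illustration and are not taken from the paper.

```python
def icer(cost_new, qaly_new, cost_cmp, qaly_cmp):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    by the new therapy relative to the comparator."""
    return (cost_new - cost_cmp) / (qaly_new - qaly_cmp)

# Hypothetical base case: novel first-line therapy vs morphine.
base = icer(12000, 4.2, 8000, 3.9)
print(base)  # cost per QALY under the base-case assumptions

# One-way sensitivity analysis: vary the assumed QALY gain (e.g. a
# structural assumption such as time horizon changes effectiveness).
for qaly_new in (4.0, 4.2, 4.5):
    print(qaly_new, round(icer(12000, qaly_new, 8000, 3.9)))
```

Even this toy example shows how strongly the result moves with a single effectiveness assumption, which is why the review calls for robust sensitivity analyses.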

  4. Development of a Systematic Automotive Education Program. Final Report.

    ERIC Educational Resources Information Center

    Wylie, Peter B.; And Others

    As part of a project to provide techniques for developing and implementing realistic vocational training and placement programs for prisoner rehabilitation, an Automotive Trades Council (ATC) was established to test the concept of using a citizens' council to function as a bridge between correctional and training personnel and the using society.…

  5. An analysis of the least-squares problem for the DSN systematic pointing error model

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.

    1991-01-01

    A systematic pointing error model is used to calibrate antennas in the Deep Space Network. The least-squares problem is described and analyzed along with the solution methods used to determine the model's parameters. Specifically studied are the rank degeneracy problems resulting from beam pointing error measurement sets that incorporate inadequate sky coverage. A least-squares parameter subset selection method is described and its applicability to the systematic error modeling process is demonstrated on a Voyager 2 measurement distribution.
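
    The rank degeneracy issue described above can be sketched with a toy four-term alt-az pointing model; the basis terms below are illustrative assumptions, not the actual DSN model. When observations span the sky the design matrix has full rank, but observations at a single elevation make two columns collinear and the corresponding parameters unidentifiable.

```python
import numpy as np

def design_matrix(az, el):
    # Hypothetical basis terms for a simple alt-az pointing model:
    # encoder offset, gravity-like flexure, and two axis-tilt terms.
    return np.column_stack([
        np.ones_like(az),          # elevation encoder offset
        np.cos(el),                # flexure term (varies with elevation)
        np.sin(az) * np.sin(el),   # tilt about one axis
        np.cos(az) * np.sin(el),   # tilt about the other axis
    ])

def fit_pointing_model(az, el, resid):
    A = design_matrix(az, el)
    coeffs, _, rank, sv = np.linalg.lstsq(A, resid, rcond=None)
    return coeffs, rank, sv

# Dense sky coverage: full-rank problem, all parameters identifiable.
rng = np.random.default_rng(0)
az = rng.uniform(0, 2 * np.pi, 200)
el = rng.uniform(0.2, 1.4, 200)
true = np.array([0.01, 0.005, -0.002, 0.003])
resid = design_matrix(az, el) @ true + rng.normal(0, 1e-4, az.size)
coeffs, rank, _ = fit_pointing_model(az, el, resid)
print(rank)  # 4: all four parameters are identifiable

# Inadequate coverage: tracking at one elevation makes the offset and
# flexure columns proportional, so the problem is rank deficient.
A_deg = design_matrix(az, np.full(200, 0.8))
print(np.linalg.matrix_rank(A_deg))  # 3
```

A subset-selection method as in the abstract would respond to the rank-3 case by fitting only an identifiable subset of the parameters.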

  6. Towards a controlled sensitivity analysis of model development decisions

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Nijssen, Bart

    2016-04-01

    The current generation of hydrologic models have followed a myriad of different development paths, making it difficult for the community to test underlying hypotheses and identify a clear path to model improvement. Model comparison studies have been undertaken to explore model differences, but these studies have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than a systematic analysis of model shortcomings. This presentation will discuss a unified approach to process-based hydrologic modeling to enable controlled and systematic analysis of multiple model representations (hypotheses) of hydrologic processes and scaling behavior. Our approach, which we term the Structure for Unifying Multiple Modeling Alternatives (SUMMA), formulates a general set of conservation equations, providing the flexibility to experiment with different spatial representations, different flux parameterizations, different model parameter values, and different time stepping schemes. We will discuss the use of SUMMA to systematically analyze different model development decisions, focusing on both analysis of simulations for intensively instrumented research watersheds as well as simulations across a global dataset of FLUXNET sites. The intent of the presentation is to demonstrate how the systematic analysis of model shortcomings can help identify model weaknesses and inform future model development priorities.
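
    The "multiple working hypotheses" idea behind SUMMA can be caricatured with a toy bucket model in which each process decision (evaporation, drainage) has interchangeable parameterizations, and every combination is run on the same forcing so differences in output are attributable to a single swapped component. All functions and parameter values below are invented for illustration; this is not SUMMA's formulation.

```python
import itertools

# Two hypothetical alternatives for each process decision.
def evap_linear(storage, capacity, pet):
    return pet * min(1.0, storage / capacity)

def evap_threshold(storage, capacity, pet):
    return pet if storage > 0.5 * capacity else 0.0

def drain_linear(storage, capacity, k=0.1):
    return k * storage

def drain_power(storage, capacity, k=0.1, n=2.0):
    return k * capacity * (storage / capacity) ** n

def run_bucket(evap, drain, precip, pet, capacity=100.0, storage=50.0):
    """Run one model configuration and return total drainage."""
    runoff_total = 0.0
    for p in precip:
        storage = min(capacity, storage + p)
        e = min(storage, evap(storage, capacity, pet))
        storage -= e
        q = min(storage, drain(storage, capacity))
        storage -= q
        runoff_total += q
    return runoff_total

# Controlled experiment: all combinations of the two decisions.
precip = [0.0, 10.0, 0.0, 5.0, 20.0, 0.0, 0.0, 15.0]
for ev, dr in itertools.product((evap_linear, evap_threshold),
                                (drain_linear, drain_power)):
    total = run_bucket(ev, dr, precip, 2.0)
    print(f"{ev.__name__:15s} + {dr.__name__:12s} -> runoff {total:.1f}")
```

Because only one component changes between runs, differences in simulated runoff can be attributed to that specific modeling decision, which is the controlled attribution that ad-hoc model intercomparisons cannot provide.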

  7. Validation of Ultrafilter Performance Model Based on Systematic Simulant Evaluation

    SciTech Connect

    Russell, Renee L.; Billing, Justin M.; Smith, Harry D.; Peterson, Reid A.

    2009-11-18

    Because of limited availability of test data with actual Hanford tank waste samples, a method was developed to estimate expected filtration performance based on physical characterization data for the Hanford Tank Waste Treatment and Immobilization Plant. A test with simulated waste was analyzed to demonstrate that filtration of this class of waste is consistent with a concentration polarization model. Subsequently, filtration data from actual waste samples were analyzed to demonstrate that centrifuged solids concentrations provide a reasonable estimate of the limiting concentration for filtration.
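
    A concentration polarization (gel layer) model of the kind referenced predicts permeate flux falling logarithmically as the bulk solids concentration approaches a limiting value, here taken as the centrifuged solids concentration. The sketch below is a generic gel-polarization flux law with made-up parameter values, not the report's actual fit.

```python
import math

def permeate_flux(k, c_limit, c_bulk):
    """Concentration-polarization flux model: J = k * ln(c_limit / c_bulk).
    Flux falls to zero as the bulk solids concentration approaches the
    limiting concentration (e.g. the centrifuged solids concentration)."""
    if c_bulk >= c_limit:
        return 0.0
    return k * math.log(c_limit / c_bulk)

k = 0.05        # mass-transfer coefficient (illustrative units)
c_limit = 20.0  # limiting solids concentration (illustrative, wt%)
for c in (2.0, 5.0, 10.0, 20.0):
    print(f"{c:5.1f} wt% -> J = {permeate_flux(k, c_limit, c):.4f}")
```

The qualitative behavior, flux dropping to zero at the limiting concentration, is what lets a centrifuged solids measurement serve as an estimate of the endpoint of filtration.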

  8. Systematic spectral analysis of GX 339-4: Influence of Galactic background and reflection models

    NASA Astrophysics Data System (ADS)

    Clavel, M.; Rodriguez, J.; Corbel, S.; Coriat, M.

    2016-05-01

    Black hole X-ray binaries display large outbursts, during which their properties are strongly variable. We develop a systematic spectral analysis of the 3-40 keV {RXTE}/PCA data in order to study the evolution of these systems and apply it to GX 339-4. Using the low count rate observations, we provide a precise model of the Galactic background at GX 339-4's location and discuss its possible impact on the source spectral parameters. At higher fluxes, the use of a Gaussian line to model the reflection component can lead to the detection of a high-temperature disk, in particular in the high-hard state. We demonstrate that this component is an artifact arising from an incomplete modeling of the reflection spectrum.

  9. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given. The scenarios are inbound surface-to-air-missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.

  10. Arms race modeling: systematic analysis and synthesis. (Volumes I and II)

    SciTech Connect

    Anderton, C.H.

    1986-01-01

    In recent years there has been a significant proliferation of arms race models in journals and books across many disciplines. Numerous factors have been put forth as relevant by arms race modelers and diverse analytical and explanatory approaches have been employed. This indicates that arms race modeling can be advanced by a systematic analysis (review) and synthesis; this is the purpose of this thesis. Part I discusses the perspective, purposes, and organization of this thesis, definitions of arms race, and mathematical modeling as a tool of arms race research. Parts II-VI review in detail over 125 arms race models and numerous other defense determination models. Part VII synthesizes arms race modeling factors and approaches from the perspective of economics. Each side in an arms race is viewed as facing an economic choice problem of how to allocate scarce resources between defense and nondefense goods. Part VIII undertakes policy analysis using the models developed in Part VII. The three issues investigated are (1) nuclear-conventional substitutability in the European theater, (2) the Strategic Defense Initiative, and (3) the impact of arms races on outputs and factor rewards. Part IX discusses future research avenues related to arms race modeling.

  11. Participatory operations model for cost-efficient monitoring and modeling of river basins--A systematic approach.

    PubMed

    Malve, Olli; Hjerppe, Turo; Tattari, Sirkka; Väisänen, Sari; Huttunen, Inese; Kotamäki, Niina; Kallio, Kari; Taskinen, Antti; Kauppila, Pirkko

    2016-01-01

    The worldwide economic downturn and climate change at the beginning of the 21st century have stressed the need for a cost-efficient and systematic operations model for the monitoring and management of surface waters. However, these processes are still all too fragmented and incapable of responding to these challenges. For example in Finland, the estimation of the costs and benefits of planned management measures is insufficient. On this account, we present a new operations model to streamline these processes and to ensure the lucid decision making and the coherent implementation which facilitate the participation of public and all the involved stakeholders. The model was demonstrated in the real-world management of a lake. The benefits, pitfalls and development needs were identified. After the demonstration, the operations model was put into operation and has been actively used in several other management projects throughout Finland. PMID:26184863

  12. Development of Practical Supported Ionic Liquid Membranes: A Systematic Approach

    SciTech Connect

    Luebke, D.R.; Ilconich, J.B.; Myers, C.R.; Pennline, H.W.

    2007-11-01

    Supported liquid membranes (SLMs) are a class of materials that allow the researcher to utilize the wealth of knowledge available on liquid properties to optimize membrane performance. These membranes also have the advantage of liquid phase diffusivities, which are higher than those observed in polymers and grant proportionally greater permeabilities. The primary shortcoming of the supported liquid membranes demonstrated in past research has been the lack of stability caused by volatilization of the transport liquid. Ionic liquids, which may possess high CO2 solubility relative to light gases such as H2, are excellent candidates for this type of membrane since they are stable at elevated temperatures and have negligible vapor pressure. A study has been conducted evaluating the use of a variety of ionic liquids in supported ionic liquid membranes for the capture of CO2 from streams containing H2. In a joint project, researchers at the University of Notre Dame synthesized and characterized ionic liquids, and researchers at the National Energy Technology Laboratory incorporated candidate ionic liquids into supports and evaluated membrane performance for the resulting materials. Several steps have been taken in the development of practical supported ionic liquid membranes. Proof-of-concept was established by showing that ionic liquids could be used as the transport media in SLMs. Results showed that ionic liquids are suitable media for gas transport, but the preferred polymeric supports were not stable at temperatures above 135 °C. The use of cross-linked nylon66 supports was found to produce membranes mechanically stable at temperatures exceeding 300 °C but CO2/H2 selectivity was poor. An ionic liquid whose selectivity does not decrease with increasing temperature was needed, and a functionalized ionic liquid that complexes with CO2 was used. An increase in CO2/H2 selectivity with increasing temperature over the range of 37 to 85 °C was observed and the dominance of a

  13. Product development public-private partnerships for public health: a systematic review using qualitative data.

    PubMed

    De Pinho Campos, Katia; Norman, Cameron D; Jadad, Alejandro R

    2011-10-01

    Almost a decade ago, public health initiated a number of innovative ventures to attract investments from multinational drug companies for the development of new drugs and vaccines to tackle neglected diseases (NDs). These ventures - known as product development public-private partnerships (PD PPPs) - represent the participation of the public and private actors toward the discovery and development of essential medicines to reduce the suffering of over one billion people worldwide living with NDs. This systematic review aimed to identify empirical-based descriptive articles to understand critical elements in the partnership process, and propose a framework to shed light on future guidelines to support better planning, design and management of existing and new forms of PPPs for public health. Ten articles met the inclusion criteria and were analyzed and synthesized using qualitative content analysis. The findings show that the development stage of PD PPPs requires a careful initiation and planning process including discussion on values and shared goals, agreement on mutual interests & equality of power relation, exchange of expertise & resources, stakeholder engagement, and assessment of the local health capacity. The management stage of PD PPPs entails transparency, extensive communication and participatory decision-making among partner organizations. This review illustrates the difficulties, challenges and effective responses during the partnering process. This model of collaboration may offer a way to advance population health at present, while creating streams of innovation that can yield future social and financial dividends in enhancing the public's health more widely. PMID:21839562

  14. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    PubMed Central

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
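
    The two performance measures the review highlights, discrimination and calibration, can be sketched in a few lines. The predictions and outcomes below are invented; the c-statistic is the standard rank-based estimate and calibration-in-the-large is the observed/expected event ratio.

```python
def c_statistic(probs, outcomes):
    """Discrimination: probability that a randomly chosen event received a
    higher predicted risk than a randomly chosen non-event (ties = 1/2)."""
    pos = [p for p, y in zip(probs, outcomes) if y == 1]
    neg = [p for p, y in zip(probs, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def calibration_in_the_large(probs, outcomes):
    """Observed/expected event ratio; < 1 means the model over-predicts risk."""
    return sum(outcomes) / sum(probs)

# Hypothetical external validation data.
probs    = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2, 0.1, 0.05]
outcomes = [1,   1,   0,   1,   0,   0,   0,   0]
print(round(c_statistic(probs, outcomes), 3))
print(round(calibration_in_the_large(probs, outcomes), 3))
```

With these made-up numbers the model discriminates well (c ≈ 0.93) yet over-predicts risk (O/E ≈ 0.87), illustrating why reporting discrimination alone, as most reviewed studies did, gives an incomplete picture of model performance.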

  15. Systematic literature review for clinical practice guideline development.

    PubMed

    O'Day, D M; Steinberg, E P; Dickersin, K

    1993-01-01

    The purpose of this paper was to evaluate the quality and scope of the published literature on functional impairment due to cataract in adults as reviewed for the Agency for Health Care Policy and Research Clinical Practice Guideline. We examined the methods of literature retrieval and analysis performed in the course of development of literature-based recommendations for the guideline panel. To collect data, we reviewed the process of literature acquisition and identification and the quality assessments made by reviewers of 14 individual topics composed of 77 issues related to the guideline. We collated this information to provide an assessment of the quality and scope of the relevant literature. Less than 4% (310) of the approximately 8,000 articles initially identified as potentially relevant to the guideline were ultimately used. The majority covered three topics (surgery and complication, 100; Nd:YAG capsulotomy, 77; and potential vision testing, 40). Three other topics--indications for surgery, preoperative medical evaluation, and rehabilitation--were devoid of articles meeting inclusion criteria. For 43 issues, there was no identifiable relevant literature. With few exceptions, the quality of the literature was rated fair to poor owing to major flaws in experimental design. Case series (256 reports) of one type or another accounted for the majority of the included literature. There were 17 randomized controlled trials. This review revealed a sparse and generally low-quality literature relevant to the management of functional impairment due to cataract, despite a relatively large data base in reputable peer-reviewed journals. PMID:8140702

  16. Using a Systematic Approach to Develop a Chemistry Course Introducing Students to Instrumental Analysis

    ERIC Educational Resources Information Center

    Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher

    2013-01-01

    A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…

  17. Barriers to the Uptake of Eye Care Services in Developing Countries: A Systematic Review of Interventions

    ERIC Educational Resources Information Center

    Abdullah, Khadija Nowaira; Al-Sharqi, Omar Zayan; Abdullah, Muhammad Tanweer

    2013-01-01

    Objective: This research identifies effective and ineffective interventions for reducing barriers to the uptake of eye care services in developing countries. Design: Systematic literature review. Setting: Only research studies done in developing countries were included. Method: The review is restricted to English-language articles published…

  18. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  19. How parents choose to use CAM: a systematic review of theoretical models

    PubMed Central

    Lorenc, Ava; Ilan-Clarke, Yael; Robinson, Nicola; Blair, Mitch

    2009-01-01

    Background Complementary and Alternative Medicine (CAM) is widely used throughout the UK and the Western world. CAM is commonly used for children and the decision-making process to use CAM is affected by numerous factors. Most research on CAM use lacks a theoretical framework and is largely based on bivariate statistics. The aim of this review was to identify a conceptual model which could be used to explain the decision-making process in parental choice of CAM. Methods A systematic search of the literature was carried out. A two-stage selection process with predetermined inclusion/exclusion criteria identified studies using a theoretical framework depicting the interaction of psychological factors involved in the CAM decision process. Papers were critically appraised and findings summarised. Results Twenty-two studies using a theoretical model to predict CAM use were included in the final review; only one examined child use. Seven different models were identified. The most commonly used and successful model was Andersen's Sociobehavioural Model (SBM). Two papers proposed modifications to the SBM for CAM use. Six qualitative studies developed their own model. Conclusion The SBM modified for CAM use, which incorporates both psychological and pragmatic determinants, was identified as the best conceptual model of CAM use. This model provides a valuable framework for future research, and could be used to explain child CAM use. An understanding of the decision making process is crucial in promoting shared decision making between healthcare practitioners and parents and could inform service delivery, guidance and policy. PMID:19386106

  20. Systematic development and optimization of chemically defined medium supporting high cell density growth of Bacillus coagulans.

    PubMed

    Chen, Yu; Dong, Fengqing; Wang, Yonghong

    2016-09-01

    With determined components and experimental reproducibility, the chemically defined medium (CDM) and the minimal chemically defined medium (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high cell density growth of Bacillus coagulans, which is a promising producer of lactic acid and other bio-chemicals. In this study, a systematic methodology combining experimental techniques with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single-omission and single-addition techniques were employed to determine the essential and stimulatory compounds, before the optimization of their concentrations by statistical methods. In addition, to improve growth rationally, in silico omission and addition were performed by FBA based on the construction of a medium-size metabolic model of B. coagulans 36D1. Thus, CDMs were developed that support considerable biomass production in at least five B. coagulans strains, including the two model strains B. coagulans 36D1 and ATCC 7050. PMID:27262567
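
    Flux balance analysis of the kind used in this study can be posed as a linear program: maximize a biomass (growth) flux subject to steady-state mass balance S·v = 0 and flux bounds. The following is a minimal sketch on a hypothetical three-reaction toy network, not the B. coagulans 36D1 model itself; the uptake bound of 10 is an assumed value.

```python
# Flux balance analysis (FBA) as a linear program: maximize biomass flux
# subject to steady-state mass balance S @ v = 0 and flux bounds.
# Toy network (hypothetical, for illustration only):
#   v1: uptake  -> A
#   v2: A       -> B
#   v3: B       -> biomass
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: internal metabolites A, B; cols: v1..v3)
S = [[1, -1, 0],   # A: produced by v1, consumed by v2
     [0, 1, -1]]   # B: produced by v2, consumed by v3
b = [0, 0]         # steady state: no net accumulation

# linprog minimizes, so negate the biomass flux v3 to maximize it
c = [0, 0, -1]
bounds = [(0, 10),      # uptake capped at 10 (assumed units: mmol/gDW/h)
          (0, None),
          (0, None)]

res = linprog(c, A_eq=S, b_eq=b, bounds=bounds)
print(res.x)  # optimal flux vector; biomass flux is limited by the uptake cap
```

    In silico single omission can then be simulated by forcing one flux's bounds to (0, 0) and checking whether the optimal biomass flux collapses to zero.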

  1. The Systematic Development of an Internet-Based Smoking Cessation Intervention for Adults.

    PubMed

    Dalum, Peter; Brandt, Caroline Lyng; Skov-Ettrup, Lise; Tolstrup, Janne; Kok, Gerjo

    2016-07-01

    Objectives The objective of this project was to determine whether intervention mapping is a suitable strategy for developing an Internet- and text message-based smoking cessation intervention. Method We used the Intervention Mapping framework for planning health promotion programs. After a needs assessment, we identified important changeable determinants of cessation behavior, specified objectives for the intervention, selected theoretical methods for meeting our objectives, and operationalized change methods into practical intervention strategies. Results We found that "social cognitive theory," the "transtheoretical model/stages of change," "self-regulation theory," and "appreciative inquiry" were relevant theories for smoking cessation interventions. From these theories, we selected modeling/behavioral journalism, feedback, planning coping responses/if-then statements, gain frame/positive imaging, consciousness-raising, helping relationships, stimulus control, and goal-setting as suitable methods for an Internet- and text-based adult smoking cessation program. Furthermore, we identified computer tailoring as a useful strategy for adapting the intervention to individual users. Conclusion The Intervention Mapping method, with a clear link between behavioral goals, theoretical methods, and practical strategies and materials, proved useful for systematic development of a digital smoking cessation intervention for adults. PMID:27101996

  2. Development of a systematic procedure for analyzing bus service cutback programs

    SciTech Connect

    Tadi, R.R.

    1984-01-01

    In this era of shrinking public resources and mounting operating costs, it is becoming increasingly important to operate public transit as efficiently as possible. The primary purpose of this thesis is to develop a systematic procedure (planning tool) that will enable transit operators to quickly and efficiently evaluate transit routes in order to minimize the total operating deficit while responding to service cutbacks or modifications. As a part of this research, a computer model called the Transit Route Evaluation Model (TREM) is developed that can evaluate each transit route both at the system level and at the route level based upon a set of pre-specified criteria such as revenue/operating cost ratio, transit demand, population density, subsidy, and income. The unproductive routes thus identified are then reviewed to determine whether service cutbacks or modifications along those routes can minimize the total operating deficit. The alternative which yields the minimum operating deficit is considered the optimal alternative. A case study example (Flint, Michigan) consisting of 12 routes is used to demonstrate the applicability of the research methodology. A sensitivity analysis is also conducted by varying the input parameters in order to judge the viability of the methodology.
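
    The route-level screening step described above can be sketched as follows: rank routes by revenue/operating-cost ratio and flag those below a cutoff as cutback candidates. The route data and the 0.3 threshold are illustrative assumptions, not TREM's actual criteria.

```python
# Sketch of route-level screening: flag routes whose revenue/operating-cost
# ratio falls below a threshold as candidates for service cutback.
# Route data and the 0.3 cutoff are illustrative assumptions only.
routes = {
    "Route 1": {"revenue": 12_000, "operating_cost": 30_000},
    "Route 2": {"revenue": 45_000, "operating_cost": 50_000},
    "Route 3": {"revenue": 6_000,  "operating_cost": 28_000},
}

CUTOFF = 0.3  # minimum acceptable revenue/cost ratio (assumed)

def screen_routes(routes, cutoff):
    """Return routes sorted by ratio, with cutback candidates flagged."""
    results = []
    for name, r in routes.items():
        ratio = r["revenue"] / r["operating_cost"]
        results.append((name, round(ratio, 2), ratio < cutoff))
    return sorted(results, key=lambda t: t[1])

for name, ratio, flagged in screen_routes(routes, CUTOFF):
    print(name, ratio, "CUTBACK CANDIDATE" if flagged else "ok")
```

    A full TREM-style analysis would additionally weigh demand, density, subsidy, and income before recommending a cutback.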

  3. Mathematical modelling in tobacco control research: protocol for a systematic review

    PubMed Central

    Feirman, Shari; Donaldson, Elisabeth; Pearson, Jennifer; Zawistowski, Grace; Glasser, Allison; Villanti, Andrea C

    2015-01-01

    Introduction Tobacco control researchers have recently become more interested in systems science methods and mathematical modelling techniques as a means to understand how complex inter-relationships among various factors translate into population-level summaries of tobacco use prevalence and its associated medical and social costs. However, there is currently no resource that provides an overview of how mathematical modelling has been used in tobacco control research. This review will provide a summary of studies that employ modelling techniques to predict tobacco-related outcomes. It will also propose a conceptual framework for grouping existing modelling studies by their objectives. Methods and analysis We will conduct a systematic review that is informed by Cochrane procedures, as well as guidelines developed for reviews that are specifically intended to inform policy and programme decision-making. We will search 5 electronic databases to identify studies that use a mathematical model to project a tobacco-related outcome. An online data extraction form will be developed based on the ISPOR-SMDM Modeling Good Research Practices. We will perform a qualitative synthesis of included studies. Ethics and dissemination Ethical approval is not required for this study. An initial paper, published in a peer-reviewed journal, will provide an overview of our findings. Subsequent papers will provide greater detail on results within each study objective category and an assessment of the risk of bias of these grouped studies. PMID:25877276

  4. Systematic Land-Surface-Model Performance Evaluation on different time scales

    NASA Astrophysics Data System (ADS)

    Mahecha, M. D.; Jung, M.; Reichstein, M.; Beer, C.; Braakhekke, M.; Carvalhais, N.; Lange, H.; Lasslop, G.; Le Maire, G.; Seneviratne, S. I.; Vetter, M.

    2008-12-01

    Keeping track of the space-time evolution of CO2 and H2O fluxes between the terrestrial biosphere and atmosphere is essential to our understanding of current climate. Monitoring fluxes at site level is one option for characterizing the temporal development of ecosystem-atmosphere interactions. Nevertheless, many aspects of ecosystem-atmosphere fluxes become meaningful only when interpreted over time across larger geographical regions. Empirical and process-based models play a key role in spatial and temporal upscaling exercises. In this context, comparative model performance evaluations at site level are indispensable. We present a model evaluation scheme which investigates the model-data agreement separately on different time scales. Observed and modeled time series were decomposed by essentially nonparametric techniques into subsignals (time scales) of characteristic fluctuations. By evaluating the extracted subsignals of observed and modeled C-fluxes (gross and net ecosystem exchange, GEE and NEE, and terrestrial ecosystem respiration, TER) separately, we obtain scale-dependent performances for the different evaluation measures. Our diagnostic model comparison allows uncovering time scales of model-data agreement and fundamental mismatch. We focus on the systematic evaluation of three land-surface models: Biome-BGC, ORCHIDEE, and LPJ. For the first time, all models were driven by consistent site meteorology and compared to the respective eddy-covariance flux observations. The results show that correct net C-fluxes may result from systematic (simultaneous) biases in TER and GEE on specific time scales of variation. We localize significant model-data mismatches of the annual-seasonal cycles in time and illustrate the recurrence characteristics of such problems. For example, LPJ underestimates GEE during winter months and overestimates it in early summer at specific sites. In contrast, ORCHIDEE overestimates the flux from July to September at these sites. Finally…
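
    The scale-separated evaluation described here can be sketched with a simple additive decomposition: successive moving averages split each series into subsignals (details plus a residual trend) that sum exactly to the original, and model-data correlation is then computed per scale. This toy version uses plain moving averages as a stand-in for the paper's nonparametric decomposition techniques; the window lengths and synthetic flux series are assumptions.

```python
# Decompose a series into scale-dependent subsignals via successive moving
# averages, then compare modeled and observed series scale by scale.
# A toy stand-in for the nonparametric decomposition used in the paper.
import numpy as np

def moving_avg(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def decompose(x, windows=(5, 30)):
    """Split x into [fast detail, slower detail, ..., residual trend].
    The components sum exactly to x by construction."""
    parts, rest = [], x
    for w in windows:
        smooth = moving_avg(rest, w)
        parts.append(rest - smooth)  # detail at this scale
        rest = smooth
    parts.append(rest)               # residual trend
    return parts

rng = np.random.default_rng(0)
t = np.arange(365)
obs = np.sin(2 * np.pi * t / 365) + 0.3 * rng.standard_normal(365)
mod = np.sin(2 * np.pi * t / 365 + 0.2) + 0.3 * rng.standard_normal(365)

for i, (o, m) in enumerate(zip(decompose(obs), decompose(mod))):
    r = np.corrcoef(o, m)[0, 1]
    print(f"scale {i}: r = {r:.2f}")  # model-data agreement per scale
```

    A model can thus score well on the seasonal scale while failing on fast synoptic fluctuations, which is exactly the kind of mismatch the scale-wise view exposes.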

  5. Systematic Assessment of Terrestrial Biogeochemistry in Coupled Climate-Carbon Models

    SciTech Connect

    Randerson, Jim; Hoffman, Forrest M; Thornton, Peter E; Mahowald, Natalie; Lindsay, Keith; Lee, Jeff; Nevison, Cynthia; Doney, Scott C.; Bonan, Gordon; Stockli, Reto; Covey, Curtis; Running, Steven; Fung, Inez

    2009-01-01

    With representation of the global carbon cycle becoming increasingly complex in climate models, it is important to develop ways to quantitatively evaluate model performance against in situ and remote sensing observations. Here we present a systematic framework, the Carbon-LAnd Model Intercomparison Project (C-LAMP), for assessing terrestrial biogeochemistry models coupled to climate models using observations that span a wide range of temporal and spatial scales. As an example of the value of such comparisons, we used this framework to evaluate two biogeochemistry models that are integrated within the Community Climate System Model (CCSM) - Carnegie-Ames-Stanford Approach (CASA) and carbon-nitrogen (CN). Both models underestimated the magnitude of net carbon uptake during the growing season in temperate and boreal forest ecosystems, based on comparison with atmospheric CO2 measurements and eddy covariance measurements of net ecosystem exchange. Comparison with MODerate Resolution Imaging Spectroradiometer (MODIS) measurements shows that this low bias in model fluxes was caused, at least in part, by 1-3 month delays in the timing of maximum leaf area. In the tropics, the models overestimated carbon storage in woody biomass based on comparison with datasets from the Amazon. Reducing this model bias will probably weaken the sensitivity of terrestrial carbon fluxes to both atmospheric CO2 and climate. Global carbon sinks during the 1990s differed by a factor of two (2.4 Pg C yr^-1 for CASA vs. 1.2 Pg C yr^-1 for CN), with fluxes from both models compatible with the atmospheric budget given uncertainties in other terms. The models captured some of the timing of interannual global terrestrial carbon exchange during 1988-2004 based on comparison with atmospheric inversion results from TRANSCOM (r=0.66 for CASA and r=0.73 for CN). Adding (CASA) or improving (CN) the representation of deforestation fires may further increase agreement with the…

  6. Biomechanical factors associated with the development of tibiofemoral knee osteoarthritis: protocol for a systematic review and meta-analysis

    PubMed Central

    van Tunen, Joyce A C; Dell'Isola, Andrea; Juhl, Carsten; Dekker, Joost; Steultjens, Martijn; Lund, Hans

    2016-01-01

    Introduction Altered biomechanics, increased joint loading and tissue damage might be related in a vicious cycle within the development of knee osteoarthritis (KOA). We have defined biomechanical factors as joint-related factors that interact with the forces, moments and kinematics in and around a synovial joint. Although a number of studies and systematic reviews have been performed to assess the association of various factors with the development of KOA, a comprehensive overview focusing on biomechanical factors that are associated with the development of KOA is not available. The aim of this review is (1) to identify biomechanical factors that are associated with (the development of) KOA and (2) to identify the impact of other relevant risk factors on this association. Methods and analysis Cohort, cross-sectional and case–control studies investigating the association of a biomechanical factor with (the development of) KOA will be included. MEDLINE, EMBASE, CINAHL and SPORTDiscus will be searched from their inception until August 2015. Two reviewers will independently screen articles obtained by the search for eligibility, extract data and score risk of bias. Quality of evidence will be evaluated. Meta-analysis using a random effects model will be applied to each of the biomechanical factors, if possible. Ethics and dissemination This systematic review and meta-analysis does not require ethical approval. The results of this systematic review and meta-analysis will be disseminated through publications in peer-reviewed journals and presentations at (inter)national conferences. Trial registration number CRD42015025092. PMID:27311908

  7. Obesity--a neuropsychological disease? Systematic review and neuropsychological model.

    PubMed

    Jauch-Chara, Kamila; Oltmanns, Kerstin M

    2014-03-01

    Obesity is a global epidemic associated with a series of secondary complications and comorbid diseases such as diabetes mellitus, cardiovascular disease, sleep-breathing disorders, and certain forms of cancer. On the surface, it seems that obesity is simply the phenotypic manifestation of deliberately flawed food intake behavior with the consequence of dysbalanced energy uptake and expenditure and can easily be reversed by caloric restriction and exercise. Notwithstanding this assumption, the disappointing outcomes of long-term clinical studies based on this assumption show that the problem is much more complex. Obviously, recent studies render that specific neurocircuits involved in appetite regulation are etiologically integrated in the pathomechanism, suggesting obesity should be regarded as a neurobiological disease rather than the consequence of detrimental food intake habits. Moreover, apart from the physical manifestation of overeating, a growing body of evidence suggests a close relationship with psychological components comprising mood disturbances, altered reward perception and motivation, or addictive behavior. Given that current dietary and pharmacological strategies to overcome the burgeoning threat of the obesity problem are of limited efficacy, bear the risk of adverse side-effects, and in most cases are not curative, new concepts integratively focusing on the fundamental neurobiological and psychological mechanisms underlying overeating are urgently required. This new approach to develop preventive and therapeutic strategies would justify assigning obesity to the spectrum of neuropsychological diseases. Our objective is to give an overview on the current literature that argues for this view and, on the basis of this knowledge, to deduce an integrative model for the development of obesity originating from disturbed neuropsychological functioning. PMID:24394671

  8. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only confirm the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358
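
    A graph-based representation of the kind ComSDM builds on can be sketched as a directed graph of services, from which simple design-quality indicators such as coupling (fan-in plus fan-out) fall out directly. The smart-home services and the metric below are illustrative assumptions, not ComSDM's actual formalism.

```python
# Sketch: a composite-service design as a directed dependency graph, with a
# simple coupling metric per service. Illustrative only; not ComSDM itself.
from collections import defaultdict

# edges: (caller service, callee service) - hypothetical smart-home services
edges = [
    ("SmartHome", "Lighting"), ("SmartHome", "Climate"),
    ("Climate", "Sensors"), ("Lighting", "Sensors"),
    ("SmartHome", "Security"), ("Security", "Sensors"),
]

fan_out = defaultdict(int)  # number of services this one depends on
fan_in = defaultdict(int)   # number of services depending on this one
for caller, callee in edges:
    fan_out[caller] += 1
    fan_in[callee] += 1

services = {s for e in edges for s in e}
for s in sorted(services):
    # coupling = fan-in + fan-out; high values suggest lower reusability
    print(s, "coupling =", fan_in[s] + fan_out[s])
```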

  9. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only confirm the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358

  10. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    ERIC Educational Resources Information Center

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  11. Development of Children in Iran: A Systematic Review and Meta-Analysis

    PubMed Central

    Sajedi, Firoozeh; Doulabi, Mahbobeh ahmadi; Vameghi, Roshanak; Baghban, Alireza Akbarzadeh; Mazaheri, Mohammad Ali; Mahmodi, Zohreh; Ghasemi, Erfan

    2016-01-01

    Background: In order to gain a better perspective of the developmental status of children in different regions of Iran, this study was carried out to determine the prevalence of developmental disorders and the factors impacting child development in Iranian studies. Materials and Methods: Articles published in Iranian and international journals indexed in the SID, PubMed, Scopus and Magiran databases from 2001-2015 were systematically reviewed using standard and sensitive keywords. After evaluating the quality of 155 articles in the initial search, 26 articles were analyzed according to the inclusion criteria. After investigation, meta-analysis was performed for six studies; the results were combined using a random-effects model, and the heterogeneity of studies was evaluated using the I2 index. Data analysis was performed using STATA version 11.2. Results: Egger's and Begg's tests (p = 0.273 and p = 0.260, respectively) did not indicate publication bias in the data, but heterogeneity across studies was confirmed (p < 0.001). On this basis, the pooled prevalence of developmental disorders based on the random-effects model was calculated to be 0.146 (95% CI: 0.107-0.184). The prevalence of developmental disorders in children in the studies reviewed ranged from 7 to 22.4%. The most important risk factors were in the SES (socioeconomic status) and prenatal, perinatal, neonatal and child groups. Conclusion: More extensive studies and early intervention with respect to causes of developmental delay in children seem necessary.
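
    The random-effects pooling and I2 heterogeneity index reported above can be sketched with the standard DerSimonian-Laird estimator. The prevalence values and variances below are illustrative assumptions, not the six studies actually pooled in this review.

```python
# DerSimonian-Laird random-effects meta-analysis with I^2 heterogeneity.
# Effect sizes and variances below are illustrative, not the review's data.
import numpy as np

y = np.array([0.07, 0.12, 0.15, 0.18, 0.22, 0.10])        # study prevalences
v = np.array([0.001, 0.002, 0.001, 0.003, 0.002, 0.001])  # their variances

w = 1.0 / v                               # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)        # Cochran's Q statistic
k = len(y)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)        # between-study variance
w_re = 1.0 / (v + tau2)                   # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
I2 = max(0.0, (Q - (k - 1)) / Q)          # fraction of variance from heterogeneity

print(f"pooled = {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f}-{pooled + 1.96 * se:.3f}), I^2 = {I2:.0%}")
```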

  12. Developing a Model Component

    NASA Technical Reports Server (NTRS)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station Requirements necessary for the MPLM Modules. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch All replaceable core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's Equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
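
    The Bernoulli-plus-continuity calculation described (the pressure and velocity differential across the filter-dryer) can be sketched as follows. The fluid properties, areas, and inlet conditions are assumed values for illustration, not the actual UCTS coolant parameters.

```python
# Pressure/velocity differential across a restriction (e.g. a filter-dryer)
# from continuity and Bernoulli's equation (ideal, lossless flow assumed).
# rho, areas, and inlet conditions are assumed values for illustration.
rho = 1100.0        # coolant density, kg/m^3 (assumed)
A1 = 1.0e-3         # upstream flow area, m^2 (assumed)
A2 = 0.4e-3         # restricted area inside the dryer, m^2 (assumed)
v1 = 2.0            # upstream velocity, m/s (assumed)
p1 = 300_000.0      # upstream pressure, Pa (assumed)

v2 = v1 * A1 / A2                       # continuity: A1*v1 = A2*v2
p2 = p1 + 0.5 * rho * (v1**2 - v2**2)   # Bernoulli along a streamline

print(f"v2 = {v2:.1f} m/s, dp = {p1 - p2:.0f} Pa")
```

    A real filter-dryer also has frictional losses, so a practical model would subtract an empirical pressure-drop term rather than rely on lossless Bernoulli flow alone.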

  13. Developing a Model Component

    NASA Technical Reports Server (NTRS)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) is a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The purpose of the UCTS is to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at KSC, my assignment was to develop a model component for the UCTS. I was given a fluid component (drier) to model in Matlab. The drier was a Catch All replaceable core type filter-drier. The filter-drier provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-drier also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. I completed training for UNIX and Simulink to help aid in my assignment. The filter-drier was modeled by determining the effects it has on the pressure, velocity and temperature of the system. I used Bernoulli's Equation to calculate the pressure and velocity differential through the drier. I created my filter-drier model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements.

  14. Development of the Spanish version of the Systematized Nomenclature of Medicine: methodology and main issues.

    PubMed Central

    Reynoso, G. A.; March, A. D.; Berra, C. M.; Strobietto, R. P.; Barani, M.; Iubatti, M.; Chiaradio, M. P.; Serebrisky, D.; Kahn, A.; Vaccarezza, O. A.; Leguiza, J. L.; Ceitlin, M.; Luna, D. A.; Bernaldo de Quirós, F. G.; Otegui, M. I.; Puga, M. C.; Vallejos, M.

    2000-01-01

    This presentation features linguistic and terminology management issues related to the development of the Spanish version of the Systematized Nomenclature of Medicine (SNOMED). It aims at describing the aspects of translating and the difficulties encountered in delivering a natural and consistent medical nomenclature. Bunge's three-layered model is referenced to analyze the sequence of symbolic concept representations. It further explains how a communicative translation based on a concept-to-concept approach was used to achieve the highest level of flawlessness and naturalness for the Spanish rendition of SNOMED. Translation procedures and techniques are described and exemplified. Both the computer-aided and human translation methods are portrayed. The scientific and translation team tasks are detailed, with focus on Newmark's four-level principle for the translation process, extended with a fifth further level relevant to the ontology to control the consistency of the typology of concepts. Finally the convenience for a common methodology to develop non-English versions of SNOMED is suggested. PMID:11079973

  15. Systematic Review of Cognitive Development across Childhood in Down Syndrome: Implications for Treatment Interventions

    ERIC Educational Resources Information Center

    Patterson, T.; Rapsey, C. M.; Glue, P.

    2013-01-01

    Background: There is conjecture regarding the profile of cognitive development over time in children with Down syndrome (DS). Characterising this profile would be valuable for the planning and assessment of intervention studies. Method: A systematic search of the literature from 1990 to the present was conducted to identify longitudinal data on…

  16. Toward a Systematic and Intentional Approach to Leadership Development for the Early Childhood Profession

    ERIC Educational Resources Information Center

    Sturges, Lisa Ann

    2011-01-01

    An examination of the literature indicated that the field of early childhood would benefit from a more systematic and intentional approach to developing leadership for professionals at all levels, including those with a range of training/education across a diversity of program types and professional positions. The intent of the present study was…

  17. "Learning to Play with New Friends": Systematic Quality Development Work in a Leisure-Time Centre

    ERIC Educational Resources Information Center

    Lager, Karin

    2016-01-01

    This article explores the recontextualisation of systematic quality development work (Sqdw) in a leisure-time centre. Two teachers' processes of planning, organisation, documentation and evaluation were investigated, the aim being to explore the recontextualisation of Sqdw in practice. The study is thus a case study of these teachers' practice…

  18. A Systematic Approach to the Management of Program Development in Teacher Education.

    ERIC Educational Resources Information Center

    Sybouts, Ward; And Others

    Research was conducted in four schools of education to determine if there was any relationship between the degree of faculty involvement in program development and a systematic approach to change that involves a concern for the whole organizational structure rather than its constituent parts. The following internal factors were found in…

  19. Rumination and postnatal depression: A systematic review and a cognitive model.

    PubMed

    DeJong, Hannah; Fox, Elaine; Stein, Alan

    2016-07-01

    Postnatal depression (PND) confers risk for a range of negative child developmental outcomes, at least in part through its impact on parenting behaviour. Whilst the behavioural effects of depression on parenting are well established, the cognitive mechanisms that may mediate this effect are less well understood. The current paper proposes that rumination may be a key cognitive mechanism through which parenting is affected in PND, and provides a systematic review of the existing literature on rumination in the context of perinatal depression. The review identifies ten relevant papers. Eight are questionnaire-based studies examining the role of rumination in predicting future depression and/or mother-infant relationship outcomes, such as bonding. Two are experimental studies examining the effects of induced rumination on parenting behaviours. The results of the review are discussed, and remaining questions highlighted. We then present a new theoretical model, developed specifically for the perinatal context, and informed by existing models of rumination and worry. Our cognitive model emphasises the relationship between rumination, cognitive biases and cognitive control, and the impact of these variables on infant cue processing and subsequent parenting responses. The model provides a potential framework for future work in this area, and to guide the development of treatment interventions. PMID:27203622

  20. IMPACT fragmentation model developments

    NASA Astrophysics Data System (ADS)

    Sorge, Marlon E.; Mains, Deanna L.

    2016-09-01

    The IMPACT fragmentation model has been used by The Aerospace Corporation for more than 25 years to analyze orbital altitude explosions and hypervelocity collisions. The model is semi-empirical, combining mass, energy and momentum conservation laws with empirically derived relationships for fragment characteristics such as number, mass, area-to-mass ratio, and spreading velocity as well as event energy distribution. Model results are used for several types of analysis including assessment of short-term risks to satellites from orbital altitude fragmentations, prediction of the long-term evolution of the orbital debris environment and forensic assessments of breakup events. A new version of IMPACT, version 6, has been completed and incorporates a number of advancements enabled by a multi-year effort to characterize more than 11,000 debris fragments from more than three dozen historical on-orbit breakup events. These events involved a wide range of causes, energies, and fragmenting objects. Special focus was placed on the explosion model, as the majority of events examined were explosions. Revisions were made to the mass distribution used for explosion events, increasing the number of smaller fragments generated. The algorithm for modeling upper stage large fragment generation was updated. A momentum-conserving asymmetric spreading velocity distribution algorithm was implemented to better represent sub-catastrophic events. An approach was developed for modeling sub-catastrophic explosions, those where the majority of the parent object remains intact, based on estimated event energy. Finally, significant modifications were made to the area-to-mass ratio distribution to incorporate the tendencies of different materials to fragment into different shapes. This ability enabled better matches between the observed area-to-mass ratios and those generated by the model. It also opened up additional possibilities for post-event analysis of breakups. The paper will discuss …
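
    As a rough illustration of the semi-empirical approach described above (empirical fragment distributions constrained by conservation laws), the sketch below draws power-law fragment masses and isotropic spreading velocities, then rescales them so that total mass and net momentum are conserved. The exponent, velocity scale, and distributions are hypothetical placeholders, not the actual IMPACT relations:

```python
import math
import random

def fragment_event(parent_mass_kg, n_fragments, alpha=1.6, dv_scale=50.0, seed=0):
    """Toy breakup model: draw fragment masses from a power law, rescale so
    total mass is conserved, and assign isotropic spreading velocities whose
    net momentum is forced to zero (momentum conservation).

    alpha and dv_scale are illustrative placeholders, not IMPACT parameters.
    """
    rng = random.Random(seed)
    # Power-law mass draws via inverse-transform sampling: p(m) ~ m^-alpha
    raw = [rng.random() ** (-1.0 / (alpha - 1.0)) for _ in range(n_fragments)]
    scale = parent_mass_kg / sum(raw)
    masses = [m * scale for m in raw]          # enforce mass conservation

    # Isotropic unit directions with exponentially distributed speeds
    vels = []
    for _ in range(n_fragments):
        u, phi = 2.0 * rng.random() - 1.0, 2.0 * math.pi * rng.random()
        s = math.sqrt(1.0 - u * u)
        speed = dv_scale * rng.expovariate(1.0)
        vels.append([speed * s * math.cos(phi), speed * s * math.sin(phi), speed * u])

    # Subtract the mass-weighted mean velocity so net momentum is zero
    # (sum of masses equals parent_mass_kg by construction)
    ptot = [sum(m * v[i] for m, v in zip(masses, vels)) for i in range(3)]
    for m, v in zip(masses, vels):
        for i in range(3):
            v[i] -= ptot[i] / parent_mass_kg
    return masses, vels
```

    A real model would additionally condition the distributions on event energy and material, as the abstract describes.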

  1. A Systematic Investigation of Computation Models for Predicting Adverse Drug Reactions (ADRs)

    PubMed Central

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Background Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computation models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. Principal Findings In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, and to implement and evaluate additional algorithms previously used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to linear form; based on this finding, we propose a new algorithm, the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Conclusion Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms. PMID:25180585
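
    The Jaccard coefficient highlighted in these findings is straightforward to sketch as a similarity-based scorer for drug-ADR associations. The drugs, reactions, and averaging scheme below are toy illustrations, not the paper's actual general weighted profile method:

```python
def jaccard(a, b):
    """Jaccard coefficient between two sets: |a & b| / |a | b|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def score_adr(query_drug, known_adrs, candidate_adr):
    """Toy weighted-profile score: average the Jaccard similarity between
    the query drug's ADR profile and each drug already linked to the
    candidate ADR. Illustrates similarity-based drug-ADR inference."""
    linked = [d for d, adrs in known_adrs.items()
              if candidate_adr in adrs and d != query_drug]
    if not linked:
        return 0.0
    prof = known_adrs[query_drug]
    return sum(jaccard(prof, known_adrs[d]) for d in linked) / len(linked)

# Toy ADR profiles (hypothetical drugs and reactions)
profiles = {
    "drugA": {"nausea", "headache", "rash"},
    "drugB": {"nausea", "headache", "dizziness"},
    "drugC": {"insomnia"},
}
```

    Here drugA scores highly for "dizziness" because its profile overlaps strongly with drugB's, and scores zero for "insomnia" because it shares nothing with drugC.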

  2. Systematic derivation of a generalized t-J model

    NASA Astrophysics Data System (ADS)

    Aligia, A. A.; Simón, M. E.; Batista, C. D.

    1994-05-01

    We perform a consistent low-energy reduction of the three-band Hubbard model H3b to an effective one-band model H1b, using as an intermediate step a renormalized effective spin-fermion model Hsf in which Cu+ and Cu3+ configurations are contained as virtual states. We map P Hsf P into H1b using two projectors P onto different Zhang-Rice singlets: (1) orthogonal singlets constructed using O Wannier functions, (2) nonorthogonal singlets. In case 1 (2) the mapping is almost exact (exact as shown before by Zhang) if direct O-O hopping and the Cu3+ (Cu+) configuration can be neglected.

  3. P300 Development across the Lifespan: A Systematic Review and Meta-Analysis

    PubMed Central

    van Dinteren, Rik; Arns, Martijn; Jongsma, Marijtje L. A.; Kessels, Roy P. C.

    2014-01-01

    Background The P300 component of the event-related potential is a large positive waveform that can be extracted from the ongoing electroencephalogram using a two-stimulus oddball paradigm, and has been associated with cognitive information processing (e.g. memory, attention, executive function). This paper reviews the development of the auditory P300 across the lifespan. Methodology/Principal Findings A systematic review and meta-analysis on the P300 was performed including 75 studies (n = 2,811). Scopus was searched for studies that used healthy subjects and reported mean P300 latency and amplitude measured at Pz, together with mean age. These findings were validated in an independent, existing cross-sectional dataset including 1,572 participants from ages 6–87. Curve-fitting procedures were applied to obtain a model of P300 development across the lifespan. In both studies logarithmic Gaussian models fitted the latency and amplitude data best. The P300 latency and amplitude follow a maturational path from childhood to adolescence, resulting in a period that marks a plateau, after which degenerative effects begin. We were able to determine the ages that mark a maximum (in P300 amplitude) or trough (in P300 latency), segregating maturational from degenerative stages, and found that these points of deflection occurred at different ages. Conclusions/Significance It is hypothesized that latency and amplitude index different aspects of brain maturation. The P300 latency possibly indexes neural speed or brain efficiency. The P300 amplitude might index neural power or cognitive resources, which increase with maturation. PMID:24551055
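
    A logarithmic Gaussian (lognormal-shaped) fit of the kind the study describes can be sketched with a crude grid search. The synthetic data and parameter ranges below are illustrative only, not the study's data or fitting procedure:

```python
import math

def log_gaussian(age, a, b, c):
    """Lognormal-shaped curve: peaks at age = exp(b), with height a and width c."""
    return a * math.exp(-((math.log(age) - b) ** 2) / (2.0 * c ** 2))

def fit_log_gaussian(ages, values):
    """Crude grid-search fit minimizing squared error (a stand-in for the
    nonlinear least-squares a real curve-fitting package would use)."""
    best, best_err = None, float("inf")
    for a in [v / 10.0 for v in range(50, 151, 5)]:       # height
        for b in [v / 10.0 for v in range(20, 41)]:       # log of peak age
            for c in [v / 10.0 for v in range(5, 21)]:    # width
                err = sum((log_gaussian(t, a, b, c) - y) ** 2
                          for t, y in zip(ages, values))
                if err < best_err:
                    best, best_err = (a, b, c), err
    return best
```

    With a fitted b in hand, exp(b) gives the age of the amplitude maximum (or latency trough), the "point of deflection" the study reports. With scipy available, `scipy.optimize.curve_fit` would replace the grid search.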

  4. Scaling up depot medroxyprogesterone acetate (DMPA): a systematic literature review illustrating the AIDED model

    PubMed Central

    2013-01-01

    Background Use of depot medroxyprogesterone acetate (DMPA), often known by the brand name Depo-Provera, has increased globally, particularly in multiple low- and middle-income countries (LMICs). As a reproductive health technology that has scaled up in diverse contexts, DMPA is an exemplar product innovation with which to illustrate the utility of the AIDED model for scaling up family health innovations. Methods We conducted a systematic review of the enabling factors and barriers to scaling up DMPA use in LMICs. We searched 11 electronic databases for academic literature published through January 2013 (n = 284 articles), and grey literature from major health organizations. We applied exclusion criteria to identify relevant articles from peer-reviewed (n = 10) and grey literature (n = 9), extracting data on scale up of DMPA in 13 countries. We then mapped the resulting factors to the five AIDED model components: ASSESS, INNOVATE, DEVELOP, ENGAGE, and DEVOLVE. Results The final sample of sources included studies representing variation in geographies and methodologies. We identified 15 enabling factors and 10 barriers to dissemination, diffusion, scale up, and/or sustainability of DMPA use. The greatest number of factors were mapped to the ASSESS, DEVELOP, and ENGAGE components. Conclusions Findings offer early empirical support for the AIDED model, and provide insights into scale up of DMPA that may be relevant for other family planning product innovations. PMID:23915274

  5. A Systematic Ecological Model for Adapting Physical Activities: Theoretical Foundations and Practical Examples

    ERIC Educational Resources Information Center

    Hutzler, Yeshayahu

    2007-01-01

    This article proposes a theory- and practice-based model for adapting physical activities. The ecological frame of reference includes Dynamic and Action System Theory, World Health Organization International Classification of Function and Disability, and Adaptation Theory. A systematic model is presented addressing (a) the task objective, (b) task…

  6. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    ERIC Educational Resources Information Center

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  7. Devices for In situ Development of Non-disturbed Oral Biofilm. A Systematic Review

    PubMed Central

    Prada-López, Isabel; Quintas, Víctor; Vilaboa, Carlos; Suárez-Quintanilla, David; Tomás, Inmaculada

    2016-01-01

    Objective: The aim of this review was to assess the types of devices used for in situ development of oral biofilm analyzed microbiologically. Materials and Methods: A systematic search of the literature was conducted to identify all in situ studies of oral biofilm that used an oral device; the Ovid MEDLINE and EMBASE databases complemented with a manual search were used. Specific devices used to microbiologically analyze oral biofilm in adults were included. After reading the selected full texts, devices were identified and classified according to the oral cavity zone and manufacturing material. The “ideal” characteristics were analyzed in every group. Results: The search provided 787 abstracts, of which 111 papers were included. The devices used in these studies were classified as palatal, lingual or buccal. The last group was sub-classified into six groups based on the material of the device. Considering the analyzed characteristics, the thermoplastic devices and the Intraoral Device of Overlaid Disk-holding Splints (IDODS) presented more advantages than limitations. Conclusions: Buccal devices were the most commonly used for the study of in situ biofilm. The majority of buccal devices seemed to slightly affect the volunteer's comfort, the IDODS being the closest to the “ideal” model. Clinical Relevance: New devices for in situ oral biofilm microbiological studies should take into account the possible effect of their design on the volunteer's comfort and biofilm formation. PMID:27486437

  8. Systematical bifurcation analysis of an intracellular calcium oscillation model.

    PubMed

    Liu, Xijun; Li, Xiang

    2016-07-01

    As a very important second messenger, Ca(2+) plays the role of adjusting various cellular physiological processes through calcium oscillations. In this paper, a further theoretical study is conducted to explore the kinetic behavior of the calcium signals based on a mathematical model. At first, the causes behind the appearance and disappearance of calcium oscillations are strictly verified from the theoretical level and a comparative analysis between the improved model and the original model is also made. Then, it is found that with the increase of relaxation time, the second bifurcation point of the system moves towards the increasing direction of the stimulus intensity and the oscillation interval displays gradual increase. It is also found that under given stimulus intensity, with the relaxation time getting longer, both the peak value and the period of the calcium oscillations display significant increase. Combining the results from the comparative analysis between the improved model and the original model with the results from the analysis of the relaxation time, it shows that the calcium pump activity exerts a direct impact on the calcium oscillation interval. Finally, the calcium leakage item is introduced into the improved model and it is found that as the calcium leakage increases, the two Hopf bifurcation points of the system both move towards the decreasing direction of the stimulus intensity and the oscillation interval gradually narrows down. The study also shows that under given stimulus intensity, as the calcium leakage increases, the peak value of the calcium oscillations displays slow increase and the oscillation period displays gradual decline. PMID:27172874
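
    Numerical bifurcation scanning of the kind described (tracking where oscillations appear and disappear as stimulus intensity varies) can be sketched by integrating a model and measuring post-transient amplitude. Because the abstract's calcium model is not specified, the sketch below uses the FitzHugh-Nagumo equations as a generic stand-in excitable system with a stimulus parameter:

```python
def oscillation_amplitude(stimulus, dt=0.05, t_end=400.0, transient=200.0):
    """Integrate the FitzHugh-Nagumo model (a generic excitable system, used
    here as a stand-in for the unspecified calcium model) with 4th-order
    Runge-Kutta and return the post-transient peak-to-trough amplitude of v.
    Near-zero amplitude indicates a stable fixed point; a large amplitude
    indicates a limit cycle, i.e. the system has crossed a Hopf bifurcation."""
    def f(v, w):
        dv = v - v ** 3 / 3.0 - w + stimulus
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        return dv, dw

    v, w, t = -1.0, -0.5, 0.0
    vmin, vmax = float("inf"), float("-inf")
    while t < t_end:
        k1v, k1w = f(v, w)
        k2v, k2w = f(v + 0.5 * dt * k1v, w + 0.5 * dt * k1w)
        k3v, k3w = f(v + 0.5 * dt * k2v, w + 0.5 * dt * k2w)
        k4v, k4w = f(v + dt * k3v, w + dt * k3w)
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        w += dt * (k1w + 2 * k2w + 2 * k3w + k4w) / 6.0
        t += dt
        if t >= transient:          # ignore the transient before measuring
            vmin, vmax = min(vmin, v), max(vmax, v)
    return vmax - vmin
```

    Scanning `stimulus` over a grid and locating where the amplitude jumps from near zero to a large value brackets the bifurcation points, which is the essence of the oscillation-interval analysis the abstract describes.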

  9. Postpartum Depression among Rural Women from Developed and Developing Countries: A Systematic Review

    ERIC Educational Resources Information Center

    Villegas, Laura; McKay, Katherine; Dennis, Cindy-Lee; Ross, Lori E.

    2011-01-01

    Purpose: Postpartum depression (PPD) is a significant public health problem, with significant consequences for the mother, infant, and family. Available research has not adequately examined the potential impact of sociodemographic characteristics, such as place of residence, on risk for PPD. Therefore, this systematic review and meta-analysis…

  10. Two models of multiple family therapy in the treatment of adolescent anorexia nervosa: a systematic review.

    PubMed

    Gelin, Zoé; Cook-Darzens, Solange; Simon, Yves; Hendrick, Stéphan

    2016-03-01

    Multiple family therapy (MFT) is a therapeutic method that brings together several families affected by the same pathology. Although from an ideological and conceptual point of view, MFT is often linked to family therapy and group therapy, it is difficult to define it with precision, a weakness which may in turn hinder research on therapeutic effectiveness. This is most notable in the field of eating disorders (ED) where, in spite of MFT's great popularity, research evidence remains limited. Within the context of a systematic review of the literature on MFT in the treatment of anorexia nervosa, the purpose of this article is to provide a theoretical and clinical framework for describing two MFT models, in an attempt to explore their common and distinct concepts, principles, techniques, and factors of change. The first program is a day treatment adaptation of the Maudsley family-based MFT approach, developed in Belgium at the Therapeutic Centre for Adolescents suffering from Eating Disorders: it focuses on the management of ED symptoms, using a strong cognitive behavioral orientation. The second is an integrated systemic MFT outpatient and inpatient program carried out on the ED unit of a pediatric hospital in Paris, France: it emphasizes intra- and inter-family relationships within a systemic framework. Our effort to describe and compare these two models constitutes a first step toward determining the relative value of different models of MFT. Indeed, each model presents specific characteristics that may make it best suited for specific ED populations and/or types of families. PMID:26223191

  11. Quantifying properties of hot and dense QCD matter through systematic model-to-data comparison

    NASA Astrophysics Data System (ADS)

    Bernhard, Jonah E.; Marcy, Peter W.; Coleman-Smith, Christopher E.; Huzurbazar, Snehalata; Wolpert, Robert L.; Bass, Steffen A.

    2015-05-01

    We systematically compare an event-by-event heavy-ion collision model to data from the CERN Large Hadron Collider. Using a general Bayesian method, we probe multiple model parameters including fundamental quark-gluon plasma properties such as the specific shear viscosity η /s , calibrate the model to optimally reproduce experimental data, and extract quantitative constraints for all parameters simultaneously. The method is universal and easily extensible to other data and collision models.
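
    The Bayesian workflow summarized here (evaluate the model across parameter values, weight each by agreement with data, report posterior constraints) reduces, in a one-parameter toy form, to a grid posterior. The toy model and numbers below are invented for illustration; the actual analysis uses Gaussian-process emulation and MCMC over many parameters:

```python
import math

def calibrate(model, observed, sigma, grid):
    """Grid-based Bayesian calibration: score each candidate parameter value
    against observed data with a Gaussian likelihood, then normalize to a
    posterior under a flat prior."""
    logl = []
    for theta in grid:
        pred = model(theta)
        ll = sum(-0.5 * ((p - o) / sigma) ** 2 for p, o in zip(pred, observed))
        logl.append(ll)
    m = max(logl)                       # subtract max for numerical stability
    w = [math.exp(l - m) for l in logl]
    z = sum(w)
    return [wi / z for wi in w]

# Toy stand-in "collision model": each observable responds linearly to theta
def toy_model(theta):
    return [theta * x for x in (1.0, 2.0, 3.0)]
```

    The posterior's spread around its peak is what yields the "quantitative constraints" on parameters such as the specific shear viscosity.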

  12. Quantifying properties of hot and dense QCD matter through systematic model-to-data comparison

    SciTech Connect

    Bernhard, Jonah E.; Marcy, Peter W.; Coleman-Smith, Christopher E.; Huzurbazar, Snehalata; Wolpert, Robert L.; Bass, Steffen A.

    2015-05-22

    We systematically compare an event-by-event heavy-ion collision model to data from the CERN Large Hadron Collider. Using a general Bayesian method, we probe multiple model parameters including fundamental quark-gluon plasma properties such as the specific shear viscosity η/s, calibrate the model to optimally reproduce experimental data, and extract quantitative constraints for all parameters simultaneously. Furthermore, the method is universal and easily extensible to other data and collision models.

  13. Cardiovascular Disease Risk Models and Longitudinal Changes in Cognition: A Systematic Review

    PubMed Central

    Harrison, Stephanie L.; Ding, Jie; Tang, Eugene Y. H.; Siervo, Mario; Robinson, Louise; Jagger, Carol; Stephan, Blossom C. M.

    2014-01-01

    Background Cardiovascular disease and its risk factors have consistently been associated with poor cognitive function and incident dementia. Whether cardiovascular disease prediction models, developed to predict an individual's risk of future cardiovascular disease or stroke, are also informative for predicting risk of cognitive decline and dementia is not known. Objective The objective of this systematic review was to compare cohort studies examining the association between cardiovascular disease risk models and longitudinal changes in cognitive function or risk of incident cognitive impairment or dementia. Materials and Methods Medline, PsycINFO, and Embase were searched from inception to March 28, 2014. From 3,413 records initially screened, 21 were included. Results The association between numerous different cardiovascular disease risk models and cognitive outcomes has been tested, including Framingham and non-Framingham risk models. Five studies examined dementia as an outcome; fourteen studies examined cognitive decline or incident cognitive impairment as an outcome; and two studies examined both dementia and cognitive changes as outcomes. In all studies, higher cardiovascular disease risk scores were associated with cognitive changes or risk of dementia. Only four studies reported model prognostic performance indices, such as Area Under the Curve (AUC), for predicting incident dementia or cognitive impairment, and these studies all examined non-Framingham risk models (AUC range: 0.74 to 0.78). Conclusions Cardiovascular risk prediction models are associated with cognitive changes over time and risk of dementia. Such models are easily obtainable in clinical and research settings and may be useful for identifying individuals at high risk of future cognitive decline and dementia. PMID:25478916
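
    The AUC performance index reported by a minority of these studies has a simple rank-based form (the Mann-Whitney statistic). The risk scores and outcomes below are invented for illustration:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation: the
    probability that a randomly chosen positive case receives a higher risk
    score than a randomly chosen negative case (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        raise ValueError("need both outcome classes")
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.5 means the risk model ranks cases no better than chance; the 0.74 to 0.78 range cited above indicates moderately good discrimination.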

  14. SCID: A Competency-Based Curriculum Development Model.

    ERIC Educational Resources Information Center

    Norton, Robert E.

    To provide structure for developing curriculum for Competency Based Education (CBE), an effective and efficient model, Systematic Curriculum and Instructional Development (SCID), has been devised. SCID has five phases: analysis, design, development, implementation, and evaluation. Each of 23 components involves several steps, some optional. Phase…

  15. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    SciTech Connect

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming; Carnes, Brian; Chen, Ken Shuang

    2010-05-01

    Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data is available in the form of polarization curves or local distribution of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on using a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.
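
    A miniature version of the sensitivity studies such a toolkit automates: perturb one parameter at a time and rank parameters by normalized response change. The response function and parameter names below are hypothetical stand-ins, not the PEMFC model or the DAKOTA interface:

```python
def one_at_a_time_sensitivity(response, baseline, rel_step=0.05):
    """Perturb each parameter individually by +/- rel_step about the baseline
    and rank parameters by the magnitude of the normalized response change
    (a central-difference estimate of the elasticity d(ln y)/d(ln x))."""
    y0 = response(baseline)
    sens = {}
    for name, value in baseline.items():
        hi = dict(baseline); hi[name] = value * (1 + rel_step)
        lo = dict(baseline); lo[name] = value * (1 - rel_step)
        sens[name] = (response(hi) - response(lo)) / (2 * rel_step * y0)
    # Most influential parameters first
    return dict(sorted(sens.items(), key=lambda kv: -abs(kv[1])))

# Hypothetical cell-voltage response: strongly set by one parameter,
# weakly by another (illustrative only, not a fuel-cell model)
def cell_voltage(p):
    return 0.7 * p["membrane_conductivity"] ** 0.8 * p["gdl_porosity"] ** 0.1
```

    For a power-law response the estimated elasticity recovers the exponent, so the ranking correctly flags the dominant parameter; DAKOTA additionally automates optimization and uncertainty quantification over the same parameter-to-response mapping.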

  16. Systematic process development towards high performance transferred thin silicon solar cells based on epitaxially grown absorbers

    NASA Astrophysics Data System (ADS)

    Murcia Salazar, Clara Paola

    … First principles modeling, however, predicts that efficiencies of 20+% are achievable with less than 20 μm of c-Si. In addition to a high voltage design, this work reports state-of-the-art epitaxial c-Si solar cell performance and a path towards 20+%-efficient transferred epitaxial solar cells. The design and fabrication approach is based on high open circuit voltage first, high short circuit current second. A first design is a thin solar cell grown on a conductive silicon wafer. This structure allows developing processes to increase bulk lifetime and reduce surface recombination. Important processes that can be used for a transferred solar cell such as increased fill factor (FF) are developed at this stage. A second design is based on the use of a separation layer prior to the solar cell growth. We achieve a comparable performance with the second design. A third design includes the transfer of the solar cell to a secondary substrate. Initial processing development is reported for the transferred solar cells. Improvements in solar cell critical parameters have been characterized with a combination of predictive modeling and solar cell diagnostic tools such as quantum efficiency and voltage measurements. Fabrication processes have been developed to improve solar cell performance. The combination of process development, test structures, systematic fabrication, testing and analysis concludes with a path to high voltage, transferred thin c-Si solar cells towards 20+% efficiencies.

  17. Evidence synthesis in international development: a critique of systematic reviews and a pragmatist alternative

    PubMed Central

    Cornish, Flora

    2015-01-01

    Systematic reviews are an instrument of Evidence-Based Policy designed to produce comprehensive, unbiased, transparent and clear assessments of interventions’ effectiveness. From their origins in medical fields, systematic reviews have recently been promoted as offering important advances in a range of applied social science fields, including international development. Drawing on a case study of a systematic review of the effectiveness of community mobilisation as an intervention to tackle HIV/AIDS, this article problematises the use of systematic reviews to summarise complex and context-specific bodies of evidence. Social development interventions, such as ‘community mobilisation’ often take different forms in different interventions; are made successful by their situation in particular contexts, rather than being successful or unsuccessful universally; and have a rhetorical value that leads to the over-application of positively valued terms (e.g. ‘community mobilisation’), invalidating the keyword search process of a systematic review. The article suggests that the policy interest in definitive summary statements of ‘the evidence’ is at odds with academic assessments that evidence takes multiple, contradictory and complex forms, and with practitioner experience of the variability of practice in context. A pragmatist philosophy of evidence is explored as an alternative. Taking this approach implies expanding the definition of forms of research considered to be ‘useful evidence’ for evidence-based policy-making; decentralising decisions about ‘what works’ to allow for the use of local practical wisdom; and prioritising the establishment of good processes for the critical use of evidence, rather than producing context-insensitive summaries of ‘the evidence’. PMID:26426502

  18. Evidence synthesis in international development: a critique of systematic reviews and a pragmatist alternative.

    PubMed

    Cornish, Flora

    2015-12-01

    Systematic reviews are an instrument of Evidence-Based Policy designed to produce comprehensive, unbiased, transparent and clear assessments of interventions' effectiveness. From their origins in medical fields, systematic reviews have recently been promoted as offering important advances in a range of applied social science fields, including international development. Drawing on a case study of a systematic review of the effectiveness of community mobilisation as an intervention to tackle HIV/AIDS, this article problematises the use of systematic reviews to summarise complex and context-specific bodies of evidence. Social development interventions, such as 'community mobilisation' often take different forms in different interventions; are made successful by their situation in particular contexts, rather than being successful or unsuccessful universally; and have a rhetorical value that leads to the over-application of positively valued terms (e.g. 'community mobilisation'), invalidating the keyword search process of a systematic review. The article suggests that the policy interest in definitive summary statements of 'the evidence' is at odds with academic assessments that evidence takes multiple, contradictory and complex forms, and with practitioner experience of the variability of practice in context. A pragmatist philosophy of evidence is explored as an alternative. Taking this approach implies expanding the definition of forms of research considered to be 'useful evidence' for evidence-based policy-making; decentralising decisions about 'what works' to allow for the use of local practical wisdom; and prioritising the establishment of good processes for the critical use of evidence, rather than producing context-insensitive summaries of 'the evidence'. PMID:26426502

  19. A New Framework for Systematically Characterizing and Improving Extreme Weather Phenomena in Climate Models

    NASA Astrophysics Data System (ADS)

    O'Brien, T. A.; Kashinath, K.; Collins, W.

    2014-12-01

    Extreme weather phenomena remain a significant challenge for climate models due in part to the relatively small space and time scales at which such events occur. Accordingly, robust simulation of extreme events requires models with high fidelity at these relatively small scales. However, numerous recent studies have shown evidence that current climate models exhibit non-convergent changes in extreme weather statistics as spatial and temporal resolution increase. These studies also provide evidence that such non-convergence originates in the subgrid parameterization suites (e.g., micro/macrophysics and convection). In order to provide a framework for identifying parameterization characteristics that cause non-convergent behavior and for testing parameterization improvements, we have developed a hindcast-based system characterizing the fidelity of extremes as a function of spatial and temporal resolution. The use of hindcasts as a model evaluation tool allows us to identify modes of failure (e.g., false hits and misses) that systematically vary as a function of resolution. We have implemented this framework for the Community Earth System Model, and we have created a dataset of hindcast ensembles at multiple horizontal resolutions. Preliminary analysis of this multi-resolution set of hindcasts shows that in some regions, (1) the tail of the precipitation probability density function (PDF) grows as resolution increases (in accord with recent studies), and that (2) a large portion of this increase in the PDF tail comes from increases in Type I model errors: simulated extreme events that do not occur in observations. We explore possible causes of this inconsistent model behavior.
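
    The failure-mode bookkeeping this framework relies on (hits, Type I false hits, Type II misses) is a contingency count of threshold exceedances in paired hindcast and observation series. The series below are synthetic, for illustration only:

```python
def extreme_event_contingency(simulated, observed, threshold):
    """Count hits, false alarms (Type I: simulated extreme with no observed
    counterpart), and misses (Type II: observed extreme the model did not
    produce) by comparing paired hindcast and observation values against an
    extreme-event threshold."""
    hits = false_alarms = misses = correct_negatives = 0
    for s, o in zip(simulated, observed):
        s_ext, o_ext = s >= threshold, o >= threshold
        if s_ext and o_ext:
            hits += 1
        elif s_ext:
            false_alarms += 1
        elif o_ext:
            misses += 1
        else:
            correct_negatives += 1
    return {"hits": hits, "false_alarms": false_alarms,
            "misses": misses, "correct_negatives": correct_negatives}
```

    Repeating this count at each model resolution is what lets the framework attribute a growing PDF tail to false alarms rather than to genuine extremes.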

  20. A Systematic Literature Search on Psychological First Aid: Lack of Evidence to Develop Guidelines

    PubMed Central

    Dieltjens, Tessa; Moonens, Inge; Van Praet, Koen; De Buck, Emmy; Vandekerckhove, Philippe

    2014-01-01

    Background Providing psychological first aid (PFA) is generally considered to be an important element in preliminary care of disaster victims. Using the best available scientific basis for courses and educational materials, the Belgian Red Cross-Flanders wants to ensure that its volunteers are trained in the best way possible. Objective To identify effective PFA practices, by systematically reviewing the evidence in existing guidelines, systematic reviews and individual studies. Methods Systematic literature searches in five bibliographic databases (MEDLINE, PsycINFO, The Cochrane Library, PILOTS and G-I-N) were conducted from inception to July 2013. Results Five practice guidelines were included, which were found to vary in the development process (AGREE II score 20–53%) and evidence base used. None of them provides solid evidence concerning the effectiveness of PFA practices. Additionally, two systematic reviews of PFA were found, both noting a lack of studies on PFA. A complementary search for individual studies, using a more sensitive search strategy, identified 11 237 references, of which 102 were included for further full-text examination; none of these ultimately provides solid evidence concerning the effectiveness of PFA practices. Conclusion The scientific literature on psychological first aid available to date does not provide any evidence about the effectiveness of PFA interventions. Currently it is impossible to develop evidence-based guidelines about which practices in psychosocial support are most effective in helping disaster and trauma victims. PMID:25503520

  1. Modeling of Novel Diagnostic Strategies for Active Tuberculosis – A Systematic Review: Current Practices and Recommendations

    PubMed Central

    Zwerling, Alice; White, Richard G.; Vassall, Anna; Cohen, Ted; Dowdy, David W.; Houben, Rein M. G. J.

    2014-01-01

    Introduction The field of diagnostics for active tuberculosis (TB) is rapidly developing. TB diagnostic modeling can help to inform policy makers and support complicated decisions on diagnostic strategy, with important budgetary implications. Demand for TB diagnostic modeling is likely to increase, and an evaluation of current practice is important. We aimed to systematically review all studies employing mathematical modeling to evaluate the cost-effectiveness or epidemiological impact of novel diagnostic strategies for active TB. Methods Pubmed, personal libraries and reference lists were searched to identify eligible papers. We extracted data on a wide variety of model structures, parameter choices, sensitivity analyses and study conclusions, which were discussed during a meeting of content experts. Results & Discussion From 5619 records a total of 36 papers were included in the analysis. Sixteen papers included population impact/transmission modeling, five were health systems models, and 24 included estimates of cost-effectiveness. Transmission and health systems models included specific structure to explore the importance of the diagnostic pathway (n = 4), key determinants of diagnostic delay (n = 5), operational context (n = 5), and the pre-diagnostic infectious period (n = 1). The majority of models implemented sensitivity analysis, although only 18 studies described multi-way sensitivity analysis of more than 2 parameters simultaneously. Among the models used to make cost-effectiveness estimates, the diagnostic assays most frequently studied included Xpert MTB/RIF (n = 7) and alternative nucleic acid amplification tests (NAATs) (n = 4). Most (n = 16) of the cost-effectiveness models compared new assays to an existing baseline and generated an incremental cost-effectiveness ratio (ICER). Conclusion Although models have addressed a small number of important issues, many decisions regarding implementation of TB diagnostics are being made without …
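
    The ICER these models report is a simple incremental quotient. The costs and effect sizes below are hypothetical, not drawn from any reviewed study:

```python
def icer(cost_new, effect_new, cost_base, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health effect (e.g. per DALY averted) of a new diagnostic strategy
    relative to the existing baseline strategy."""
    d_effect = effect_new - effect_base
    if d_effect == 0:
        raise ValueError("no incremental effect: ICER undefined")
    return (cost_new - cost_base) / d_effect

# Hypothetical example: a new assay costs 30,000 more and averts 25 more
# DALYs than the baseline, giving 1200 cost units per DALY averted.
```

    The resulting ratio is then compared against a willingness-to-pay threshold to judge whether the new assay is cost-effective.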

  2. Effectiveness of Peer Education Interventions for HIV Prevention in Developing Countries: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Medley, Amy; Kennedy, Caitlin; O'Reilly, Kevin; Sweat, Michael

    2009-01-01

    Peer education for HIV prevention has been widely implemented in developing countries, yet the effectiveness of this intervention has not been systematically evaluated. We conducted a systematic review and meta-analysis of peer education interventions in developing countries published between January 1990 and November 2006. Standardized methods of…

  3. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    NASA Astrophysics Data System (ADS)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  4. Guidelines 2.0: systematic development of a comprehensive checklist for a successful guideline enterprise

    PubMed Central

    Schünemann, Holger J.; Wiercioch, Wojtek; Etxeandia, Itziar; Falavigna, Maicon; Santesso, Nancy; Mustafa, Reem; Ventresca, Matthew; Brignardello-Petersen, Romina; Laisaar, Kaja-Triin; Kowalski, Sérgio; Baldeh, Tejan; Zhang, Yuan; Raid, Ulla; Neumann, Ignacio; Norris, Susan L.; Thornton, Judith; Harbour, Robin; Treweek, Shaun; Guyatt, Gordon; Alonso-Coello, Pablo; Reinap, Marge; Brožek, Jan; Oxman, Andrew; Akl, Elie A.

    2014-01-01

    Background: Although several tools to evaluate the credibility of health care guidelines exist, guidance on practical steps for developing guidelines is lacking. We systematically compiled a comprehensive checklist of items linked to relevant resources and tools that guideline developers could consider, without the expectation that every guideline would address each item. Methods: We searched data sources, including manuals of international guideline developers, literature on guidelines for guidelines (with a focus on methodology reports from international and national agencies, and professional societies) and recent articles providing systematic guidance. We reviewed these sources in duplicate, extracted items for the checklist using a sensitive approach and developed overarching topics relevant to guidelines. In an iterative process, we reviewed items for duplication and omissions and involved experts in guideline development for revisions and suggestions for items to be added. Results: We developed a checklist with 18 topics and 146 items and a webpage to facilitate its use by guideline developers. The topics and included items cover all stages of the guideline enterprise, from the planning and formulation of guidelines, to their implementation and evaluation. The final checklist includes links to training materials as well as resources with suggested methodology for applying the items. Interpretation: The checklist will serve as a resource for guideline developers. Consideration of items on the checklist will support the development, implementation and evaluation of guidelines. We will use crowdsourcing to revise the checklist and keep it up to date. PMID:24344144

  5. ROBIS: A new tool to assess risk of bias in systematic reviews was developed

    PubMed Central

    Whiting, Penny; Savović, Jelena; Higgins, Julian P.T.; Caldwell, Deborah M.; Reeves, Barnaby C.; Shea, Beverley; Davies, Philippa; Kleijnen, Jos; Churchill, Rachel

    2016-01-01

    Objective To develop ROBIS, a new tool for assessing the risk of bias in systematic reviews (rather than in primary studies). Study Design and Setting We used a four-stage approach to develop ROBIS: define the scope, review the evidence base, hold a face-to-face meeting, and refine the tool through piloting. Results ROBIS is currently aimed at four broad categories of reviews mainly within health care settings: interventions, diagnosis, prognosis, and etiology. The target audience of ROBIS is primarily guideline developers, authors of overviews of systematic reviews (“reviews of reviews”), and review authors who might want to assess or avoid risk of bias in their reviews. The tool is completed in three phases: (1) assess relevance (optional), (2) identify concerns with the review process, and (3) judge risk of bias. Phase 2 covers four domains through which bias may be introduced into a systematic review: study eligibility criteria; identification and selection of studies; data collection and study appraisal; and synthesis and findings. Phase 3 assesses the overall risk of bias in the interpretation of review findings and whether this interpretation considered limitations identified in any of the phase 2 domains. Signaling questions are included to help judge concerns with the review process (phase 2) and the overall risk of bias in the review (phase 3); these questions flag aspects of review design related to the potential for bias and aim to help assessors judge risk of bias in the review process, results, and conclusions. Conclusions ROBIS is the first rigorously developed tool designed specifically to assess the risk of bias in systematic reviews. PMID:26092286

  6. A systematic review of the association between fish oil supplementation and the development of asthma exacerbations

    PubMed Central

    Hardy, M Scott; Kekic, Adrijana; Graybill, Nicole L; Lancaster, Zachary R

    2016-01-01

    A systematic review was conducted to examine the association between fish oil supplementation and the development of asthma exacerbations. Comprehensive literature reviews of recent fish oil studies were performed to evaluate alterations in asthma surrogate markers. Additionally, the relative compositions of the fish oils used in each study were analyzed. The results of the review were inconclusive, but provide a basis for future research methods.

  7. A systematic strain selection approach for halotolerant and halophilic bioprocess development: a review.

    PubMed

    Uratani, Joao M; Kumaraswamy, Rajkumari; Rodríguez, Jorge

    2014-07-01

    Halotolerant and halophilic microorganisms have potential applications in a number of very relevant environmental and industrial bioprocesses, from wastewater treatment to production of value-added chemicals. While numerous microbial strains have been identified and studied in the literature, the number of those successfully used in industrial applications is comparatively small. Literature is abundant in terms of characterisation of specific strains under a microbiology perspective; however, there is a need for studies tackling the selection of strains for bioprocess applications. This review presents a database of over 200 halophilic and halotolerant prokaryote strains compiled from taxonomic microbiological resources and classified by trophic groups as well as by their salinity, pH and temperature tolerance and optimum ranges, all under a process development perspective. In addition to this database, complementary systematic approaches for the selection of suitable strains for a given trophic activity and environmental conditions are also presented. Both the database and the proposed selection approaches together constitute a general tool for process development that allows researchers to systematically search for strains capable of specific substrate degradations under specific conditions (pH, T, salinity). Many existing established halotolerant and halophilic environmental and industrial bioprocesses appear to have been developed following strategies in line with the systematic approaches proposed here. PMID:24913901

  8. The development of systematic quality control method using laboratory information system and unity program.

    PubMed

    Min, Won-Ki; Lee, Woochang; Park, Hyosoon

    2002-01-01

    Quality control (QC) processes are performed to detect and correct errors in the laboratory; systematic errors recur and affect all subsequent laboratory processes, which makes it necessary for every laboratory to detect and correct them effectively and efficiently. We developed an on-line quality assurance system for the detection and correction of systematic error and linked it to Unity Plus/Pro (Bio-Rad Laboratories, Irvine, USA), a commercially available quality management system. The laboratory information system, based on the client-server paradigm, was developed using an NCR3600 (NCR, West Columbia, USA) as the server; the server database was Oracle 7.2 (Oracle, Belmont, USA) and the development tool was PowerBuilder (Powersoft, Burlington, UK). Each QC material is registered, receives its own identification number, and is tested the same way as a patient sample. The resulting QC data are entered into the Unity Plus/Pro program by an in-house data-entry program or by manual input. With the implementation of the in-house laboratory information system (LIS) and its link to Unity Plus/Pro, we could apply Westgard's multi-rule for a higher error detection rate, resulting in more systematic and precise quality assurance of laboratory output, complementary to conventional external quality assessment. PMID:12755272
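Westgard's multi-rule procedure, cited above, flags a QC run when control results violate rules expressed in standard-deviation units. A minimal sketch of four common rules (1-3s, 2-2s, 4-1s, 10-x) operating on z-scores of consecutive QC results; the thresholds follow the standard Westgard definitions, while the function names and example data are our own:

```python
def zscores(values, mean, sd):
    """Express QC results in SD units relative to the control target."""
    return [(v - mean) / sd for v in values]

def rule_13s(z):
    """One control beyond mean +/- 3 SD (flags random error)."""
    return abs(z[-1]) > 3

def rule_22s(z):
    """Two consecutive controls beyond 2 SD on the same side of the mean."""
    if len(z) < 2:
        return False
    a, b = z[-2], z[-1]
    return (a > 2 and b > 2) or (a < -2 and b < -2)

def rule_41s(z):
    """Four consecutive controls beyond 1 SD on the same side (flags bias)."""
    if len(z) < 4:
        return False
    w = z[-4:]
    return all(v > 1 for v in w) or all(v < -1 for v in w)

def rule_10x(z):
    """Ten consecutive controls on the same side of the mean (flags drift)."""
    if len(z) < 10:
        return False
    w = z[-10:]
    return all(v > 0 for v in w) or all(v < 0 for v in w)

def reject_run(values, mean, sd):
    """Apply the rule set to a series of QC results for one control level."""
    z = zscores(values, mean, sd)
    return rule_13s(z) or rule_22s(z) or rule_41s(z) or rule_10x(z)

# Hypothetical QC target: mean 100, SD 2. Two consecutive results above
# +2 SD trip the 2-2s rule and reject the run.
print(reject_run([101, 104.5, 104.8], mean=100, sd=2))  # True
```

In practice the full procedure also applies the 1-2s warning rule as a trigger for checking the other rules, and the R-4s range rule across control levels within a run; both are omitted here for brevity.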

  9. Redox systematics in model glass compositions from West Valley

    SciTech Connect

    Schreiber, H.D.; Schreiber, C.W.; Ward, C.C.

    1993-12-31

    At a processing temperature of 1150°C for the model West Valley glass composition, the prescribed range of oxygen fugacities needed to achieve an [Fe²⁺]/[Fe³⁺] of 0.1 to 0.5 is 10⁻⁴ to 10⁻⁷ atm. Establishment of the Fe²⁺-Fe⁰ equilibrium, resulting in metal precipitation from the melt, occurs at oxygen fugacities lower than 10⁻¹¹ atm at this temperature. The target processing range as defined by the iron redox ratio is equally valid at both lower and higher temperatures (±100°C). Elevations of the concentrations of redox-active components to 1 wt% Cr₂O₃, 1 wt% NiO, 1 wt% CeO₂, and 4 wt% Mn₂O₃ in the waste glass will not affect the redox limits as established by the iron redox ratio of 0.1 to 0.5; these limits provide sufficiently large margins of safety to assure no stabilization of reduced or oxidized forms of these elements.
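The quoted endpoints (an iron redox ratio of 0.1 at an oxygen fugacity of about 10⁻⁴ atm and 0.5 at about 10⁻⁷ atm) are consistent with the familiar quarter-power dependence of the melt iron redox ratio on oxygen fugacity, since the equilibrium 4 Fe³⁺ + 2 O²⁻ ⇌ 4 Fe²⁺ + O₂ implies a slope of roughly -1/4 in log-log space. A quick consistency check using only the numbers from the abstract (the theoretical slope is a textbook value, not a result from this paper):

```python
from math import log10

# Endpoints quoted in the abstract for the model West Valley glass at 1150 C:
# [Fe2+]/[Fe3+] = 0.1 at fO2 = 1e-4 atm, and 0.5 at fO2 = 1e-7 atm.
slope = (log10(0.5) - log10(0.1)) / (log10(1e-7) - log10(1e-4))
print(round(slope, 3))  # -0.233, close to the theoretical -0.25
```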

  10. Community Disaster Resilience: a Systematic Review on Assessment Models and Tools

    PubMed Central

    Ostadtaghizadeh, Abbas; Ardalan, Ali; Paton, Douglas; Jabbari, Hossain; Khankeh, Hamid Reza

    2015-01-01

    We summarize the models identified in the literature and suggest, as a starting point for the systematic operationalization of CDR, that existing indicators of community disaster resilience be classified into five domains: social, economic, institutional, physical and natural. A need to use appropriate and effective methods to quantify and weight these indicators with regard to their relative contributions to resilience is identified, as is a need to consider how the domains interrelate to influence resilience. Although assessment of disaster resilience, especially at the community level, will inform disaster risk reduction strategies, attempts to carry it out systematically are at a preliminary stage. Further empirical investigation is needed to develop an operational and measurable CDR model. PMID:25905026

  11. Construct and criterion-related validation of nutrient profiling models: A systematic review of the literature.

    PubMed

    Cooper, Sheri L; Pelly, Fiona E; Lowe, John B

    2016-05-01

    Nutrient profiling (NP) is defined as the science of ranking foods according to their nutritional composition for the purpose of preventing disease or promoting health. The application of NP is ultimately to assist consumers to make healthier food choices, and thus provide a cost-effective public health strategy to reduce the incidence of diet-related chronic disease. To our knowledge, no review has assessed the evidence to confirm the validity of NP models. We conducted a systematic review to investigate the construct and criterion-related validity of NP models in ranking food according to its nutritional composition for the purpose of preventing disease and promoting health. We searched peer-reviewed research published up to 30 June 2015, using the PubMed, Global Health (CABI), and SCOPUS databases. Within-study bias was assessed using an adapted version of the QUADAS-2 (Quality Assessment of Diagnostic Accuracy Studies-2) tool for all diagnostic studies and the Cochrane Collaboration's Risk of Bias tool for all non-diagnostic studies. The GRADE (Grades of Recommendation, Assessment, Development, and Evaluation) approach was used to guide our judgement of the quality of the body of evidence for each outcome measure. From a total of 83 studies, 69 confirmed the construct validity of NP models; however, most of these studies contained methodological weaknesses. Six studies used objective external measures to confirm the criterion-related validity of NP models, which inherently improved quality. The overall quality of evidence on the accuracy of NP models was judged to be very low to moderate using the GRADE approach. Many carefully designed studies to establish both construct and criterion-related validity are necessary to authenticate the application of NP models and provide the evidence to support the current definition of NP. PMID:26850312

  12. Systematic evaluation of a novel model for cardiac ischemic preconditioning in mice.

    PubMed

    Eckle, Tobias; Grenz, Almut; Köhler, David; Redel, Andreas; Falk, Melanie; Rolauffs, Bernd; Osswald, Hartmut; Kehl, Franz; Eltzschig, Holger K

    2006-11-01

    Cardioprotection by ischemic preconditioning (IP) remains an area of intense investigation. To further elucidate its molecular basis, the use of transgenic mice seems critical. Due to technical difficulty associated with performing cardiac IP in mice, we developed an in situ model for cardiac IP using a hanging-weight system for coronary artery occlusion. This technique has the major advantage of eliminating the necessity of intermittently occluding the coronary artery with a knotted suture. To systematically evaluate this model, we first demonstrated correlation of ischemia times (10-60 min) with infarct sizes [3.5 +/- 1.3 to 42 +/- 5.2% area at risk (AAR), Evan's blue/triphenyltetrazolium chloride staining]. IP (4 x 5 min) and cold ischemia (27 degrees C) reduced infarct size by 69 +/- 6.7% and 84 +/- 4.2%, respectively (n = 6, P < 0.01). In contrast, lower numbers of IP cycles did not alter infarct size. However, infarct sizes were distinctively different in mice from different genetic backgrounds. In addition to infarct staining, we tested cardiac troponin I (cTnI) as marker of myocardial infarction in this model. In fact, plasma levels of cTnI were significantly lower in IP-treated mice and closely correlated with infarct sizes (R(2) = 0.8). To demonstrate transcriptional consequences of cardiac IP, we isolated total RNA from the AAR and showed repression of the equilibrative nucleoside transporters 1-4 by IP in this model. Taken together, this study demonstrates highly reproducible infarct sizes and cardiac protection by IP, thus minimizing the variability associated with knot-based coronary occlusion models. Further studies on cardiac IP using transgenic mice may consider this technique. PMID:16766632

  13. Developing Behavioral Theory With the Systematic Integration of Community Social Capital Concepts

    PubMed Central

    Samuel, Laura J.; Commodore-Mensah, Yvonne; Dennison Himmelfarb, Cheryl R.

    2014-01-01

    Health behavior theories state that social environments influence health behaviors, but theories of how this occurs are relatively underdeveloped. This article systematically surveys community social capital concepts in health behavior literature and proposes a conceptual framework that integrates these concepts into existing behavioral theory. Fifty-three studies tested associations between community social capital concepts and physical activity (38 studies), smoking (19 studies), and diet (2 studies). Trustworthiness of community members was consistently associated with more health-promoting and less disease-promoting behaviors in 19 studies. Neighborly reciprocity showed mixed results in 10 studies. Reporting a good sense of community was associated with more physical activity in only 5 of 16 studies. Neighborhood collective efficacy, which includes social cohesion and informal social control, was inconsistently associated with behaviors in 22 studies. Behavioral social norms were associated with smoking and physical activity in 2 of 6 studies, and neighborhood modeling of physical activity was associated with increased activity in 12 of 17 studies, with 1 opposing result. This review identifies several community social capital–related concepts that are, at times, associated with both health-promoting and disease-promoting behaviors and often have no associations. Theory explains these findings by describing the relationships and interactions among these concepts. Using these findings, this article proposes a conceptual framework that integrates community social capital concepts into existing behavioral theory. Iterative empirically based theory development is needed to address these concepts, which affect behaviors. These results can also inform theoretically based community-based and socially tailored interventions. PMID:24092886

  14. A Neurobiological Model of Borderline Personality Disorder: Systematic and Integrative Review.

    PubMed

    Ruocco, Anthony C; Carcone, Dean

    2016-01-01

    Borderline personality disorder (BPD) is a severe mental disorder with a multifactorial etiology. The development and maintenance of BPD is sustained by diverse neurobiological factors that contribute to the disorder's complex clinical phenotype. These factors may be identified using a range of techniques to probe alterations in brain systems that underlie BPD. We systematically searched the scientific literature for empirical studies on the neurobiology of BPD, identifying 146 articles in three broad research areas: neuroendocrinology and biological specimens; structural neuroimaging; and functional neuroimaging. We consolidate the results of these studies and provide an integrative model that attempts to incorporate the heterogeneous findings. The model specifies interactions among endogenous stress hormones, neurometabolism, and brain structures and circuits involved in emotion and cognition. The role of the amygdala in BPD is expanded to consider its functions in coordinating the brain's dynamic evaluation of the relevance of emotional stimuli in the context of an individual's goals and motivations. Future directions for neurobiological research on BPD are discussed, including implications for the Research Domain Criteria framework, accelerating genetics research by incorporating endophenotypes and gene × environment interactions, and exploring novel applications of neuroscience findings to treatment research. PMID:27603741

  15. Systematic modeling versus the learning cycle: Comparative effects on integrated science process skill achievement

    NASA Astrophysics Data System (ADS)

    Rubin, Rochelle L.; Norman, John T.

    This study assessed the effectiveness of the systematic modeling teaching strategy on integrated science process skills and formal reasoning ability. Urban middle school students received a three-month process skill intervention from teachers trained in either systematic modeling or the learning-cycle model; a third, control group received traditional science instruction. Analysis of the data revealed that (a) students receiving modeled instruction demonstrated significantly higher process skill achievement than either of the other groups; (b) students taught by teachers who had received special process skill and strategy training demonstrated significantly higher process skill achievement than the control group; and (c) students at different cognitive reasoning levels demonstrated significantly different process skill ability.

  16. Study of child language development and disorders in Iran: A systematic review of the literature

    PubMed Central

    Kazemi, Yalda; Stringer, Helen; Klee, Thomas

    2015-01-01

    Child language development and disorders in Iran have been the focus of research by different professions, the most prominent among them being psychologists and speech therapists. Epidemiological studies indicate that between 8% and 12% of children show noticeable signs of language impairment in the preschool years; however, research on child language in Iran is not extensive compared with studies in English-speaking countries, which are currently the basis of clinical decision-making in Iran. Consequently, there is no information about the prevalence of child language disorders in the Iranian population. This review summarizes Iranian studies on child language development and disorders in the preschool years and aims to systematically identify the most studied topics in the fields of normal development and the assessment and diagnosis of language impairments, as well as exploring current gaps in the body of literature. Three main Iranian academic databases of indexed articles, along with four non-Iranian databases, were scrutinized for all relevant articles according to the inclusion criteria: Iranian studies within the field of Persian language development and disorders in preschool children published up to December 2013. Studies were classified according to the hierarchy of evidence and weighed against critical appraisal criteria for each study type. As this is a non-intervention systematic review, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) approach was modified to be compatible with the designs of the eligible studies, including descriptive, test-development and diagnostic studies. Several limitations made searching and retrieval problematic, e.g., the lack of unified keywords and the incompatibility of Persian typing conventions with Iranian search engines. Overall, eligible studies met the criteria only up to the third level of the hierarchy of evidence, which shows the necessity of conducting studies with higher levels of…

  17. Links of Adolescents Identity Development and Relationship with Peers: A Systematic Literature Review

    PubMed Central

    Ragelienė, Tija

    2016-01-01

    Objective: According to Erik Erikson, the main task of adolescents is to solve the crisis of identity versus role confusion. Research has shown that a stable and strong sense of identity is associated with better mental health of adolescents. Good relationships with peers are also linked with better emotional and psychological well-being of adolescents. However, there is a lack of reviews of studies in the scientific literature examining the relationship between adolescents’ identity development and relationships with peers. The aims of this article were to analyze links between adolescent identity development and relationships with peers identified from a literature review, summarize the results, and discuss the theoretical factors that may predict these relationships. Method: A systematic literature review. Results: Analysis of findings from the systematic literature review revealed that a good relationship with peers is positively related to adolescent identity development, but empirical research in this area is extremely limited. Conclusions: The links between adolescents’ identity development and their relationships with peers are not completely clear. The possible intermediate factors that could determine the relationship between adolescent identity development and their relationships with peers are discussed. Further empirical research is needed in this area. PMID:27274745

  18. Constraints and Opportunities in GCM Model Development

    NASA Astrophysics Data System (ADS)

    Schmidt, G. A.; Clune, T.

    2010-12-01

    Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from the bottom of the ocean to the mesopause and are used on seasonal to multi-million-year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever-evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by model developers to incrementally and systematically improve underlying implementations and reverse the long-term trend of increasing development overhead. However, these methodologies cannot be applied blindly; rather, they must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.

  19. Effectiveness of Peer Education Interventions for HIV Prevention in Developing Countries: A Systematic Review and Meta-Analysis

    PubMed Central

    Medley, Amy; Kennedy, Caitlin; O’Reilly, Kevin; Sweat, Michael

    2014-01-01

    Background Peer education for HIV prevention has been widely implemented in developing countries, yet the effectiveness of this intervention has not been systematically evaluated. Methods We conducted a systematic review and meta-analysis of peer education interventions in developing countries published between January 1990 and November 2006. Standardized methods of searching and data abstraction were utilized. Merged effect sizes were calculated using random effects models. Results Thirty studies were identified. In meta-analysis, peer education interventions were significantly associated with increased HIV knowledge (OR:2.28; 95% CI:1.88, 2.75), reduced equipment sharing among injection drug users (OR:0.37; 95% CI:0.20, 0.67), and increased condom use (OR:1.92; 95% CI:1.59, 2.33). Peer education programs had a non-significant effect on STI infection (OR: 1.22; 95% CI:0.88, 1.71). Conclusions Meta-analysis indicates that peer education programs in developing countries are moderately effective at improving behavioral outcomes, but show no significant impact on biological outcomes. Further research is needed to determine factors that maximize the likelihood of program success. PMID:19519235
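The "merged effect sizes ... using random effects models" step can be sketched with the standard DerSimonian-Laird estimator: pool log odds ratios with inverse-variance weights, estimate the between-study variance τ² from Cochran's Q, then re-weight. The study data below are invented for illustration and are not the review's studies:

```python
from math import exp, sqrt

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled OR via the DerSimonian-Laird tau^2 estimator."""
    k = len(log_ors)
    w = [1 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    y_fe = sum(wi * yi for wi, yi in zip(w, log_ors)) / sw
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, log_ors))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_re = [1 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_ors)) / sum(w_re)
    se = sqrt(1 / sum(w_re))
    ci = (exp(pooled - 1.96 * se), exp(pooled + 1.96 * se))
    return exp(pooled), ci, tau2

# Three hypothetical studies, each contributing a log OR and its variance:
or_hat, ci, tau2 = dersimonian_laird([0.8, 0.5, 0.7], [0.05, 0.08, 0.04])
```

When the studies are homogeneous (Q below its degrees of freedom), τ² truncates to zero and the estimate collapses to the fixed-effect inverse-variance pool.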

  20. Developing New Models for Collection Development.

    ERIC Educational Resources Information Center

    Stoffle, Carla J.; Fore, Janet; Allen, Barbara

    1999-01-01

    Discusses the need to develop new models for collection development in academic libraries, based on experiences at the University of Arizona. Highlights include changes in the organizational chart; focusing on users' information goals and needs; integrative services; shared resources; interlibrary loans; digital technology; and funding. (LRW)

  1. Utility of models to predict 28-day or 30-day unplanned hospital readmissions: an updated systematic review

    PubMed Central

    Zhou, Huaqiong; Della, Phillip R; Roberts, Pamela; Goh, Louise; Dhaliwal, Satvinder S

    2016-01-01

    Objective To update a previous systematic review of predictive models for 28-day or 30-day unplanned hospital readmissions. Design Systematic review. Setting/data source CINAHL, Embase and MEDLINE from 2011 to 2015. Participants All studies of 28-day and 30-day readmission predictive models. Outcome measures Characteristics of the included studies, performance of the identified predictive models and key predictive variables included in the models. Results Of 7310 records, a total of 60 studies with 73 unique predictive models met the inclusion criteria. The utilisation outcomes of the models included all-cause readmissions and readmissions related to cardiovascular disease (including pneumonia), medical conditions, surgical conditions and mental health conditions. Overall, a wide range of C-statistics (0.21–0.88) was reported in 56 of 60 studies. 11 of 13 predictive models for medical condition-related readmissions were found to have consistent moderate discriminative ability (C-statistic ≥0.7). Only two models were designed for potentially preventable/avoidable readmissions, and these had C-statistics >0.8. The variables ‘comorbidities’, ‘length of stay’ and ‘previous admissions’ were frequently cited across the 73 models. The variables ‘laboratory tests’ and ‘medication’ had more weight in the models for cardiovascular disease and medical condition-related readmissions. Conclusions The predictive models which focused on general medical condition-related unplanned hospital readmissions reported moderate discriminative ability. Two models for potentially preventable/avoidable readmissions showed high discriminative ability. This updated systematic review, however, found inconsistent performance across the 73 unique risk predictive models. It is critical to clearly define the utilisation outcome and the type of accessible data source before selecting a predictive model. Rigorous validation of the predictive models with moderate-to-high discriminative…
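The C-statistic used throughout this review is the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted patient (equivalent to the area under the ROC curve). A minimal pairwise-comparison sketch with invented risk scores:

```python
def c_statistic(event_scores, nonevent_scores):
    """Concordance: fraction of (event, non-event) pairs ranked correctly; ties count 1/2."""
    concordant = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                concordant += 1.0
            elif e == n:
                concordant += 0.5
    return concordant / (len(event_scores) * len(nonevent_scores))

# Hypothetical predicted readmission risks: readmitted vs not readmitted.
print(c_statistic([0.8, 0.4], [0.5, 0.3]))  # 0.75
```

A value of 0.5 corresponds to a model no better than chance, which is why the review treats C-statistic ≥ 0.7 as moderate and > 0.8 as high discriminative ability.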

  2. Design of roundness measurement model with multi-systematic error for cylindrical components with large radius

    NASA Astrophysics Data System (ADS)

    Sun, Chuanzhi; Wang, Lei; Tan, Jiubin; Zhao, Bo; Tang, Yangchao

    2016-02-01

    The paper designs a roundness measurement model with multi-systematic error, which takes eccentricity, probe offset, radius of tip head of probe, and tilt error into account for roundness measurement of cylindrical components. The effects of the systematic errors and radius of components are analysed in the roundness measurement. The proposed method is built on the instrument with a high precision rotating spindle. The effectiveness of the proposed method is verified by experiment with the standard cylindrical component, which is measured on a roundness measuring machine. Compared to the traditional limacon measurement model, the accuracy of roundness measurement can be increased by about 2.2 μm using the proposed roundness measurement model for the object with a large radius of around 37 mm. The proposed method can improve the accuracy of roundness measurement and can be used for error separation, calibration, and comparison, especially for cylindrical components with a large radius.
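The limacon model referred to above approximates a measured radial profile as r(θ) ≈ R + a·cosθ + b·sinθ, where (a, b) absorbs the eccentricity of the part relative to the spindle axis; fitting it by least squares and inspecting the residuals yields the roundness deviation. A minimal sketch on synthetic data (the paper's fuller multi-systematic-error model also carries probe offset, tip radius and tilt terms, which are omitted here):

```python
from math import cos, sin, pi

def fit_limacon(thetas, radii):
    """Least-squares fit of r(theta) = R + a*cos(theta) + b*sin(theta).

    Solves the 3x3 normal equations directly by Gaussian elimination.
    """
    rows = [[1.0, cos(t), sin(t)] for t in thetas]
    # Normal equations: (X^T X) x = X^T r
    ata = [[sum(ri[i] * ri[j] for ri in rows) for j in range(3)] for i in range(3)]
    atb = [sum(ri[i] * r for ri, r in zip(rows, radii)) for i in range(3)]
    m = [row + [b] for row, b in zip(ata, atb)]
    for col in range(3):                       # elimination with partial pivoting
        piv = max(range(col, 3), key=lambda k: abs(m[k][col]))
        m[col], m[piv] = m[piv], m[col]
        for k in range(col + 1, 3):
            f = m[k][col] / m[col][col]
            m[k] = [mk - f * mc for mk, mc in zip(m[k], m[col])]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                        # back substitution
        x[i] = (m[i][3] - sum(m[i][j] * x[j] for j in range(i + 1, 3))) / m[i][i]
    R, a, b = x
    residuals = [r - (R + a * cos(t) + b * sin(t)) for t, r in zip(thetas, radii)]
    roundness = max(residuals) - min(residuals)  # peak-to-valley after eccentricity removal
    return R, a, b, roundness

# Synthetic profile: 37 mm part with 5 um eccentricity along x, no form error.
thetas = [2 * pi * i / 360 for i in range(360)]
radii = [37.0 + 0.005 * cos(t) for t in thetas]
R, a, b, rn = fit_limacon(thetas, radii)
```

For this ideal profile the fit recovers R = 37 mm and a = 5 μm, and the residual roundness is essentially zero; on real data the residuals would contain the form error that the paper's additional systematic-error terms are designed not to contaminate.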

  4. Effect of health belief model and health promotion model on breast cancer early diagnosis behavior: a systematic review.

    PubMed

    Ersin, Fatma; Bahar, Zuhal

    2011-01-01

    Breast cancer is an important public health problem because it is both common and often fatal. The objective of this systematic review is to indicate the effects of interventions performed by nurses using the Health Belief Model (HBM) and Health Promotion Model (HPM) on breast cancer early diagnosis behaviors and on the components of the two models. The review was conducted in line with the 2009 guidance of the Centre for Reviews and Dissemination (CRD) at the University of York, using the PUBMED, OVID, EBSCO and COCHRANE databases. Six hundred seventy-eight studies (PUBMED: 236, OVID: 162, EBSCO: 175, COCHRANE: 105) were found in total. Abstracts and full texts of these studies were evaluated against the inclusion and exclusion criteria, and 9 studies were determined to meet them. Sample sizes varied between 94 and 1,655. The studies found that theory-based education was effective in improving breast cancer early diagnosis behaviors. An examination of the literature shows that experimental studies conducted by nurses which compare the concepts of the HBM and HPM before and after an intervention and show the effect of these concepts on education are limited in number. Randomized controlled studies which compare HBM and HPM concepts before and after an intervention could be useful in evaluating the efficiency of the interventions. PMID:22320955

  5. Development of a Comprehensive Hospital-Based Elder Abuse Intervention: An Initial Systematic Scoping Review

    PubMed Central

    Du Mont, Janice; Macdonald, Sheila; Kosa, Daisy; Elliot, Shannon; Spencer, Charmaine; Yaffe, Mark

    2015-01-01

    Introduction Elder abuse, a universal human rights problem, is associated with many negative consequences. In most jurisdictions, however, there are no comprehensive hospital-based interventions for elder abuse that address the totality of needs of abused older adults: psychological, physical, legal, and social. As the first step towards the development of such an intervention, we undertook a systematic scoping review. Objectives Our primary objective was to systematically extract and synthesize actionable and applicable recommendations for components of a multidisciplinary intersectoral hospital-based elder abuse intervention. A secondary objective was to summarize the characteristics of the responses reviewed, including methods of development and validation. Methods The grey and scholarly literatures were systematically searched, with two independent reviewers conducting the title, abstract and full text screening. Documents were considered eligible for inclusion if they: 1) addressed a response (e.g., an intervention) to elder abuse, 2) contained recommendations for responding to abused older adults with potential relevance to a multidisciplinary and intersectoral hospital-based elder abuse intervention; and 3) were available in English. Analysis The extracted recommendations for care were collated, coded, categorized into themes, and further reviewed for relevancy to a comprehensive hospital-based response. Characteristics of the responses were summarized using descriptive statistics. Results 649 recommendations were extracted from 68 distinct elder abuse responses, 149 of which were deemed relevant and were categorized into 5 themes: Initial contact; Capacity and consent; Interview with older adult, caregiver, collateral contacts, and/or suspected abuser; Assessment: physical/forensic, mental, psychosocial, and environmental/functional; and care plan. Only 6 responses had been evaluated, suggesting a significant gap between development and implementation of

  6. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology. Conclusion When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimens and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
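To illustrate the kind of static (non-transmission) decision-analytic model the review assesses, here is a minimal Markov cohort sketch. All states, transition probabilities, and costs are hypothetical; a dynamic model would additionally let transition probabilities depend on prevalence:

```python
import numpy as np

def cohort_model(trans, costs, utilities, cycles=20, start=0):
    """Static Markov cohort model: propagate state occupancy and
    accumulate per-cycle costs and QALYs (no discounting, for brevity)."""
    occ = np.zeros(len(costs))
    occ[start] = 1.0
    total_cost = total_qaly = 0.0
    for _ in range(cycles):
        total_cost += occ @ costs
        total_qaly += occ @ utilities
        occ = occ @ trans
    return total_cost, total_qaly

# Hypothetical 3-state disease model: Well, Chronic, Dead
no_screen = np.array([[0.90, 0.08, 0.02],
                      [0.00, 0.85, 0.15],
                      [0.00, 0.00, 1.00]])
# Screening (hypothetically) detects and treats early, halving progression
screen = np.array([[0.94, 0.04, 0.02],
                   [0.00, 0.85, 0.15],
                   [0.00, 0.00, 1.00]])
costs = np.array([100.0, 2000.0, 0.0])      # cost per cycle in each state
utilities = np.array([0.95, 0.60, 0.00])    # QALY weight per cycle
cost_ns, qaly_ns = cohort_model(no_screen, costs, utilities)
cost_s, qaly_s = cohort_model(screen, costs, utilities)
cost_s += 50.0 * 20                         # add a per-cycle screening cost
icer = (cost_s - cost_ns) / (qaly_s - qaly_ns)
```

The incremental cost-effectiveness ratio (ICER) is the extra cost per QALY gained by screening; a dynamic transmission model could lower it further by crediting averted secondary infections, which is the static/dynamic distinction the authors flag.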

  7. SSME structural dynamic model development

    NASA Technical Reports Server (NTRS)

    Foley, M. J.; Tilley, D. M.; Welch, C. T.

    1983-01-01

    A mathematical model of the Space Shuttle Main Engine (SSME) as a complete assembly is developed, with detailed emphasis on the LOX and high-pressure fuel turbopumps. The advantages of both complete engine dynamics and high-fidelity modeling are incorporated. Development of this model, some results, and projected applications are discussed.

  8. Systematic review of educational programs and strategies for developing students' and nurses' writing skills.

    PubMed

    Oermann, Marilyn H; Leonardelli, Adrianne K; Turner, Kathleen M; Hawks, Sharon J; Derouin, Anne L; Hueckel, Rémi M

    2015-01-01

    The purpose of this article is to describe the outcomes of a systematic review of educational programs and strategies for developing the writing skills of nursing students and nurses. Of 728 screened citations, 80 articles were included in the review. Writing assignments in nursing courses were the most common, followed by strategies for writing across the curriculum and specific courses to improve the writing skills of nursing students. To improve nurses' writing skills, workshops were used most frequently. Only 28 (35%) of the articles were data based, and most articles described the writing program, strategy, or assignment but did not evaluate its effectiveness. PMID:25535756

  9. The psychoneuroimmunological effects of music: a systematic review and a new model.

    PubMed

    Fancourt, Daisy; Ockelford, Adam; Belai, Abi

    2014-02-01

    There has been a growing interest over the past decade in the health benefits of music, in particular examining its psychological and neurological effects. Yet this is the first attempt to systematically review publications on the psychoneuroimmunology of music. Across the selected sixty-three studies published over the past 22 years, a range of effects of music on neurotransmitters, hormones, cytokines, lymphocytes, vital signs and immunoglobulins, as well as psychological assessments, is cataloged. Research so far points to the pivotal role of stress pathways in linking music to an immune response. However, several challenges to this research are noted: (1) there is very little discussion of the possible mechanisms by which music is achieving its neurological and immunological impact; (2) the studies tend to examine biomarkers in isolation, without taking into consideration the interaction of the biomarkers in question with other physiological or metabolic activities of the body, leading to an unclear understanding of the impact that music may be having; (3) terms are not being defined clearly enough, such as distinctions not being made between different kinds of stress and 'music' being used to encompass a broad spectrum of activities without determining which aspects of musical engagement are responsible for alterations in biomarkers. In light of this, a new model is presented which provides a framework for developing a taxonomy of musical and stress-related variables in research design, and tracing the broad pathways that are involved in music's influence on the body. PMID:24157429

  10. Multi-Period Many-Objective Groundwater Monitoring Design Given Systematic Model Errors and Uncertainty

    NASA Astrophysics Data System (ADS)

    Kollat, J. B.; Reed, P. M.

    2011-12-01

    This study demonstrates how many-objective long-term groundwater monitoring (LTGM) network design tradeoffs evolve across multiple management periods given systematic model errors (i.e., predictive bias), groundwater flow-and-transport forecasting uncertainties, and contaminant observation uncertainties. Our analysis utilizes the Adaptive Strategies for Sampling in Space and Time (ASSIST) framework, which is composed of three primary components: (1) bias-aware Ensemble Kalman Filtering, (2) many-objective hierarchical Bayesian optimization, and (3) interactive visual analytics for understanding spatiotemporal network design tradeoffs. A physical aquifer experiment is utilized to develop a severely challenging multi-period observation system simulation experiment (OSSE) that reflects the challenges and decisions faced in monitoring contaminated groundwater systems. The experimental aquifer OSSE shows both the influence and consequences of plume dynamics as well as alternative cost-savings strategies in shaping how LTGM many-objective tradeoffs evolve. Our findings highlight the need to move beyond least-cost, purely statistical monitoring frameworks to consider many-objective evaluations of LTGM tradeoffs. The ASSIST framework provides a highly flexible approach for measuring the value of observables that simultaneously improves how the data are used to inform decisions.
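The bias-aware Ensemble Kalman Filtering cited above extends the plain EnKF analysis step, which can be sketched as follows (this sketch omits the bias augmentation; the state, observation operator, and noise levels are illustrative, not from the study):

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_std, rng):
    """One EnKF analysis step with perturbed observations.
    ensemble: (n_members, n_state) array; H: (n_obs, n_state) linear
    observation operator; obs_err_std: observation noise std dev."""
    X = ensemble
    n = X.shape[0]
    A = X - X.mean(axis=0)                      # state anomalies
    HX = X @ H.T                                # ensemble in observation space
    HA = HX - HX.mean(axis=0)
    P_hh = HA.T @ HA / (n - 1) + (obs_err_std ** 2) * np.eye(len(obs))
    P_xh = A.T @ HA / (n - 1)                   # state-observation covariance
    K = P_xh @ np.linalg.inv(P_hh)              # Kalman gain
    y = obs + rng.normal(0.0, obs_err_std, size=(n, len(obs)))  # perturbed obs
    return X + (y - HX) @ K.T

rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0])                   # toy 2-component "plume" state
H = np.array([[1.0, 0.0]])                      # we observe only component 0
ens = truth + rng.normal(0.0, 1.0, size=(50, 2))  # prior ensemble
obs = np.array([truth[0] + rng.normal(0.0, 0.1)])
post = enkf_update(ens, obs, H, 0.1, rng)       # spread in component 0 shrinks
```

A bias-aware variant would augment the state vector with persistent bias terms so that systematic model error is estimated alongside the physical state rather than being absorbed into it.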

  11. Prognostic models in acute pulmonary embolism: a systematic review and meta-analysis

    PubMed Central

    Elias, Antoine; Mallett, Susan; Daoud-Elias, Marie; Poggi, Jean-Noël; Clarke, Mike

    2016-01-01

    Objective To review the evidence for existing prognostic models in acute pulmonary embolism (PE) and determine how valid and useful they are for predicting patient outcomes. Design Systematic review and meta-analysis. Data sources OVID MEDLINE and EMBASE, and The Cochrane Library from inception to July 2014, and sources of grey literature. Eligibility criteria Studies aiming at constructing, validating, updating or studying the impact of prognostic models to predict all-cause death, PE-related death or venous thromboembolic events up to a 3-month follow-up in patients with an acute symptomatic PE. Data extraction Study characteristics and study quality using prognostic criteria. Studies were selected and data extracted by 2 reviewers. Data analysis Summary estimates (95% CI) for proportion of risk groups and event rates within risk groups, and accuracy. Results We included 71 studies (44 298 patients). Among them, 17 were model construction studies specific to PE prognosis. The most validated models were the PE Severity Index (PESI) and its simplified version (sPESI). The overall 30-day mortality rate was 2.3% (1.7% to 2.9%) in the low-risk group and 11.4% (9.9% to 13.1%) in the high-risk group for PESI (9 studies), and 1.5% (0.9% to 2.5%) in the low-risk group and 10.7% (8.8% to 12.9%) in the high-risk group for sPESI (11 studies). PESI has proved clinically useful in an impact study. Shifting the cut-off or using novel and updated models specifically developed for normotensive PE improves the ability to identify patients at lower risk for early death or adverse outcome (0.5–1%) and those at higher risk (up to 20–29% event rate). Conclusions We provide evidence-based information about the validity and utility of the existing prognostic models in acute PE that may be helpful for identifying patients at low risk. Novel models seem attractive for high-risk normotensive PE but need to be externally validated and then assessed in impact studies. PMID
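As commonly published in the PE literature (not stated in this abstract), the sPESI assigns one point to each of six criteria, with 0 points defining the low-risk group. A sketch, to be verified against the primary sPESI literature before any clinical use:

```python
def spesi(age, cancer, chronic_cardiopulmonary, heart_rate,
          systolic_bp, o2_sat):
    """Simplified PE Severity Index: one point per criterion;
    0 points = low risk. Criteria as commonly published; verify
    against the primary literature before clinical use."""
    score = sum([
        age > 80,
        cancer,                        # history of cancer
        chronic_cardiopulmonary,       # chronic heart failure or lung disease
        heart_rate >= 110,             # beats per minute
        systolic_bp < 100,             # mmHg
        o2_sat < 90,                   # arterial oxyhaemoglobin saturation, %
    ])
    return score, ("low" if score == 0 else "high")

print(spesi(age=72, cancer=False, chronic_cardiopulmonary=False,
            heart_rate=88, systolic_bp=125, o2_sat=96))  # (0, 'low')
```

The review's pooled 30-day mortality of 1.5% in the sPESI low-risk group versus 10.7% in the high-risk group illustrates the discrimination such a binary cut-off achieves.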

  12. Methodological developments in searching for studies for systematic reviews: past, present and future?

    PubMed

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-01-01

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. Initiatives across The Cochrane Collaboration to improve the quality inter alia of information retrieval include: the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting including searching, the development of Standard Training

  14. Association between Prenatal and Postnatal Psychological Distress and Toddler Cognitive Development: A Systematic Review

    PubMed Central

    2015-01-01

    Purpose Maternal psychological distress is one of the most common perinatal complications, affecting up to 25% of pregnant and postpartum women. Research exploring the association between prenatal and postnatal distress and toddler cognitive development has not been systematically compiled. The objective of this systematic review was to determine the association between prenatal and postnatal psychological distress and toddler cognitive development. Methods Articles were included if: a) they were observational studies published in English; b) the exposure was prenatal or postnatal psychological distress; c) cognitive development was assessed from 13 to 36 months; d) the sample was recruited in developed countries; and e) exposed and unexposed women were included. A university-based librarian conducted a search of electronic databases (Embase, CINAHL, Eric, PsycInfo, Medline) (January, 1990-March, 2014). We searched gray literature, reference lists, and relevant journals. Two reviewers independently evaluated titles/abstracts for inclusion, and quality using the Scottish Intercollegiate Guideline Network appraisal tool for observational studies. One reviewer extracted data using a standardized form. Results Thirteen of 2448 studies were included. There is evidence of an association between prenatal and postnatal distress and cognitive development. While variable effect sizes were reported for postnatal associations, most studies reported medium effect sizes for the association between prenatal psychological distress and cognitive development. Too few studies were available to determine the influence of the timing of prenatal exposure on cognitive outcomes. Conclusion Findings support the need for early identification and treatment of perinatal mental health problems as a potential strategy for optimizing toddler cognitive development. PMID:25996151
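The "medium effect sizes" reported by most included studies conventionally refer to standardized mean differences around 0.5 (Cohen's d). A sketch with entirely hypothetical group statistics:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d with pooled standard deviation; by convention
    ~0.2 is small, ~0.5 medium, ~0.8 large."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                       / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# Hypothetical cognitive scores: unexposed vs distress-exposed toddlers
print(round(cohens_d(102.0, 15.0, 80, 94.5, 15.0, 80), 2))  # 0.5
```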

  15. Physiological water model development

    NASA Technical Reports Server (NTRS)

    Doty, Susan

    1993-01-01

    The water of the human body can be categorized as existing in two main compartments: intracellular water and extracellular water. The intracellular water consists of all the water within the cells and constitutes over half of the total body water. Since red blood cells are surrounded by plasma, and all other cells are surrounded by interstitial fluid, the intracellular compartment has been subdivided to represent these two cell types. The extracellular water, which includes all of the fluid outside of the cells, can be further subdivided into compartments which represent the interstitial fluid, circulating blood plasma, lymph, and transcellular water. The interstitial fluid surrounds cells outside of the vascular system, whereas plasma is contained within the blood vessels. Avascular tissues such as dense connective tissue and cartilage contain interstitial water which slowly equilibrates with tracers used to determine extracellular fluid volume. For this reason, additional compartments are sometimes used to represent these avascular tissues. The average size of each compartment, in terms of percent body weight, has been determined for adult males and females. These compartments and the forces which cause flow between them are presented. The kidneys, a main compartment, receive about 25 percent of the cardiac output and filter out a fluid similar to plasma. The composition of this filtered fluid changes as it flows through the kidney tubules since compounds are continually being secreted and reabsorbed. Through this mechanism, the kidneys eliminate wastes while conserving body water, electrolytes, and metabolites. Since sodium accounts for over 90 percent of the cations in the extracellular fluid, and the number of cations is balanced by the number of anions, considering only the renal handling of sodium and water should sufficiently describe the relationship between the plasma compartment and the kidneys. A kidney function model is presented which has been adapted from a
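A toy two-compartment sketch of the osmotic water exchange such compartment models describe (the compartment volumes, osmolalities, and rate constant below are illustrative, not taken from the report):

```python
def simulate_fluid_shift(v_icw, v_ecw, osm_icw, osm_ecw,
                         k=0.1, dt=0.01, steps=2000):
    """Toy two-compartment model: water moves between intracellular (ICW)
    and extracellular (ECW) water down the osmotic gradient until
    osmolality equalises. Solute content of each compartment is fixed.
    Simple forward-Euler integration; illustrative only."""
    s_icw = osm_icw * v_icw          # total intracellular solute (mOsm)
    s_ecw = osm_ecw * v_ecw          # total extracellular solute (mOsm)
    for _ in range(steps):
        gradient = s_icw / v_icw - s_ecw / v_ecw   # mOsm/L difference
        flow = k * gradient * dt                   # water into ICW (L)
        v_icw += flow
        v_ecw -= flow
    return v_icw, v_ecw

# Roughly a 70 kg adult: ~25 L ICW, ~17 L ECW; ECW diluted by a water load
v_i, v_e = simulate_fluid_shift(25.0, 17.0, osm_icw=285.0, osm_ecw=275.0)
# Water shifts into the cells until the two osmolalities match
```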

  16. The Impact of Official Development Aid on Maternal and Reproductive Health Outcomes: A Systematic Review

    PubMed Central

    Taylor, Emma Michelle; Hayman, Rachel; Crawford, Fay; Jeffery, Patricia; Smith, James

    2013-01-01

    Background Progress toward meeting Millennium Development Goal 5, which aims to improve maternal and reproductive health outcomes, is behind schedule. This is despite ever increasing volumes of official development aid targeting the goal, calling into question the distribution and efficacy of aid. The 2005 Paris Declaration on Aid Effectiveness represented a global commitment to reform aid practices in order to improve development outcomes, encouraging a shift toward collaborative aid arrangements which support the national plans of aid recipient countries (and discouraging unaligned donor projects). Methods and Findings We conducted a systematic review to summarise the evidence of the impact on MDG 5 outcomes of official development aid delivered in line with Paris aid effectiveness principles and to compare this with the impact of aid in general on MDG 5 outcomes. Searches of electronic databases identified 30 studies reporting aid-funded interventions designed to improve maternal and reproductive health outcomes. Aid interventions appear to be associated with small improvements in the MDG indicators, although it is not clear whether changes are happening because of the manner in which aid is delivered. The data do not allow for a meaningful comparison between Paris style and general aid. The review identified discernible gaps in the evidence base on aid interventions targeting MDG 5, notably on indicators MDG 5.4 (adolescent birth rate) and 5.6 (unmet need for family planning). Discussion This review presents the first systematic review of the impact of official development aid delivered according to the Paris principles and aid delivered outside this framework on MDG 5 outcomes. Its findings point to major gaps in the evidence base and should be used to inform new approaches and methodologies aimed at measuring the impact of official development aid. PMID:23468860

  17. Systematic Assessment of Neutron and Gamma Backgrounds Relevant to Operational Modeling and Detection Technology Implementation

    SciTech Connect

    Archer, Daniel E.; Hornback, Donald Eric; Johnson, Jeffrey O.; Nicholson, Andrew D.; Patton, Bruce W.; Peplow, Douglas E.; Miller, Thomas Martin; Ayaz-Maierhafer, Birsen

    2015-01-01

    This report summarizes the findings of a two-year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first-year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second-year effort focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.

  18. Patient neglect in healthcare institutions: a systematic review and conceptual model

    PubMed Central

    2013-01-01

    Background Patient neglect is an issue of increasing public concern in Europe and North America, yet remains poorly understood. This is the first systematic review on the nature, frequency and causes of patient neglect as distinct from patient safety topics such as medical error. Method The Pubmed, Science Direct, and Medline databases were searched in order to identify research studies investigating patient neglect. Ten articles and four government reports met the inclusion criteria of reporting primary data on the occurrence or causes of patient neglect. Qualitative and quantitative data extraction investigated (1) the definition of patient neglect, (2) the forms of behaviour associated with neglect, (3) the reported frequency of neglect, and (4) the causes of neglect. Results Patient neglect is found to have two aspects. First, procedure neglect, which refers to failures of healthcare staff to achieve objective standards of care. Second, caring neglect, which refers to behaviours that lead patients and observers to believe that staff have uncaring attitudes. The perceived frequency of neglectful behaviour varies by observer. Patients and their family members are more likely to report neglect than healthcare staff, and nurses are more likely to report on the neglectful behaviours of other nurses than on their own behaviour. The causes of patient neglect frequently relate to organisational factors (e.g. high workloads that constrain the behaviours of healthcare staff, burnout), and the relationship between carers and patients. Conclusion A social psychology-based conceptual model is developed to explain the occurrence and nature of patient neglect. This model will facilitate investigations of i) differences between patients and healthcare staff in how they perceive neglect, ii) the association between patient neglect and health outcomes, iii) the relative importance of system and organisational factors in causing neglect, and iv) the design of interventions and

  19. Physical Education in Further Education. The Need for a Systematic Approach to Curriculum Development. An FEU Occasional Paper.

    ERIC Educational Resources Information Center

    Jackson, Elizabeth; And Others

    This occasional paper describes the development of physical education (PE) in further education (FE) in Great Britain since 1945, and suggests the need for a more systematic approach to curriculum development in this area. Section I reviews the development of PE in FE and identifies major issues and possible future developments. The document…

  20. Classroom Crisis Intervention through Contracting: A Moral Development Model.

    ERIC Educational Resources Information Center

    Smaby, Marlowe H.; Tamminen, Armas W.

    1981-01-01

    A counselor can arbitrate problem situations using a systematic approach to classroom intervention which includes meetings with the teacher and students. This crisis intervention model based on moral development can be more effective than reliance on guidance activities disconnected from the actual classroom settings where the problems arise.…

  1. Computational Models of Relational Processes in Cognitive Development

    ERIC Educational Resources Information Center

    Halford, Graeme S.; Andrews, Glenda; Wilson, William H.; Phillips, Steven

    2012-01-01

    Acquisition of relational knowledge is a core process in cognitive development. Relational knowledge is dynamic and flexible, entails structure-consistent mappings between representations, has properties of compositionality and systematicity, and depends on binding in working memory. We review three types of computational models relevant to…

  2. Development of Cardiovascular Indices of Acute Pain Responding in Infants: A Systematic Review

    PubMed Central

    Waxman, Jordana A.; Pillai Riddell, Rebecca R.; Tablon, Paula; Schmidt, Louis A.; Pinhasov, Angelina

    2016-01-01

    Background. Cardiovascular indices of pain are pervasive in the hospital setting. However, no prospective research has examined the development of cardiac responses to acutely painful procedures in the first year of life. Objectives. Our main goal was to synthesize existing evidence regarding the development of cardiovascular responses to acutely painful medical procedures over the first year of life in preterm and term born infants. Methods. A systematic search retrieved 6994 articles to review against inclusion criteria. A total of 41 studies were included in the review. Results. In response to acutely painful procedures, most infants had an increase in mean heart rate (HR) that varied in magnitude both across and within gestational and postnatal ages. Research in the area of HR variability has been inconsistent, limiting conclusions. Conclusions. Longitudinal research is needed to further understand the inherent variability of cardiovascular pain responses across and within gestational and postnatal ages and the causes for the variability. PMID:27445630

  3. Strategies for developing competency models.

    PubMed

    Marrelli, Anne F; Tondora, Janis; Hoge, Michael A

    2005-01-01

    There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry. PMID:16082796

  4. Social inequalities in early childhood health and development: a European-wide systematic review.

    PubMed

    Pillas, Demetris; Marmot, Michael; Naicker, Kiyuri; Goldblatt, Peter; Morrison, Joana; Pikhart, Hynek

    2014-11-01

    The evidence examining the relationship between specific social factors and early childhood health and developmental outcomes has never been systematically collated or synthesized. This review aims to identify the key social factors operating at the household, neighborhood, and country levels that drive inequalities in child health and development. Medline and CHICOS (a European child-cohort inventory) were systematically searched to identify all European studies published within the past 10 y. 13,270 Medline articles and 77 European child cohorts were searched, identifying 201 studies from 32 European countries. Neighborhood deprivation, lower parental income/wealth, educational attainment, and occupational social class, higher parental job strain, parental unemployment, lack of housing tenure, and household material deprivation were identified as the key social factors associated with a wide range of adverse child health and developmental outcomes. Similar association trends were observed across most European countries, with only minor country-level differences. Multiple adverse social factors operating at both the household and neighborhood levels are independently associated with a range of adverse health and developmental outcomes throughout early childhood. The social gradient in health and developmental outcomes observed throughout the remaining life course may be partly explained by gradients initiated in early childhood. PMID:25122581

  5. Microscopic model versus systematic low-energy effective field theory for a doped quantum ferromagnet

    SciTech Connect

    Gerber, U.; Wiese, U.-J.; Hofmann, C. P.; Kaempfer, F.

    2010-02-01

    We consider a microscopic model for a doped quantum ferromagnet as a test case for the systematic low-energy effective field theory for magnons and holes, which is constructed in complete analogy to the case of quantum antiferromagnets. In contrast to antiferromagnets, for which the effective field theory approach can be tested only numerically, in the ferromagnetic case, both the microscopic and the effective theory can be solved analytically. In this way, the low-energy parameters of the effective theory are determined exactly by matching to the underlying microscopic model. The low-energy behavior at half-filling as well as in the single- and two-hole sectors is described exactly by the systematic low-energy effective field theory. In particular, for weakly bound two-hole states the effective field theory even works beyond perturbation theory. This lends strong support to the quantitative success of the systematic low-energy effective field theory method not only in the ferromagnetic but also in the physically most interesting antiferromagnetic case.

  6. A systematic evaluation of a multidisciplinary social work-lawyer elder mistreatment intervention model.

    PubMed

    Rizzo, Victoria M; Burnes, David; Chalfy, Amy

    2015-01-01

    This study introduces a conceptually based, systematic evaluation process employing multivariate techniques to evaluate a multidisciplinary social work-lawyer intervention model (JASA-LEAP). Logistic regression analyses were used with a random sample of case records (n = 250) from three intervention sites. Client retention, program fidelity, and exposure to multidisciplinary services were significantly related to reduction in mistreatment risk at case closure. Female gender, married status, and living with perpetrator significantly predicted unfavorable outcomes. This study extends the elder mistreatment program evaluation literature beyond descriptive/bivariate evaluation strategies. Findings suggest that a multidisciplinary social work-lawyer elder mistreatment intervention model is a successful approach. PMID:24965802
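The multivariate evaluation described relies on logistic regression of case outcomes on program variables. As a rough illustration of that kind of analysis (not the study's actual model or data), a minimal NumPy sketch with synthetic predictors:

```python
import numpy as np

# Minimal logistic-regression sketch (gradient ascent on the log-likelihood).
# Predictors and outcome are synthetic stand-ins for variables such as client
# retention, program fidelity, and risk reduction at case closure.
rng = np.random.default_rng(0)
X = rng.normal(size=(250, 3))                      # 250 cases, 3 predictors
true_w = np.array([1.5, -1.0, 0.5])                # invented coefficients
y = (1 / (1 + np.exp(-X @ true_w)) > rng.random(250)).astype(float)

w = np.zeros(3)
for _ in range(2000):                              # plain gradient ascent
    p = 1 / (1 + np.exp(-X @ w))                   # predicted probabilities
    w += 0.1 * X.T @ (y - p) / len(y)              # log-likelihood gradient
# The sign of each fitted coefficient indicates whether the predictor raises
# or lowers the odds of the outcome.
```

In practice a fitted statistics package would also supply standard errors and odds ratios; this sketch only recovers the coefficient signs and magnitudes.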

  7. Online communities of practice and their role in educational development: a systematic appraisal.

    PubMed

    Swift, Lynn

    2014-04-01

Practice teachers and academics have a role in developing knowledge and promoting evidence-based practice with their students in a supportive and creative learning environment. Recent advances in technology are enabling 'communities of practice' (CoPs) to be developed online and may present a valuable opportunity to form greater connections between educators. To explore this idea, the author conducted a systematic appraisal of published evidence relating to the impact of using an online CoP (OCoP) to develop knowledge among healthcare educators. Three academic databases were targeted for articles, and the search retrieved nine articles that were analysed for quality. The findings identified that an OCoP offers a 'polycontextual' environment that can enhance knowledge development, strengthen social ties and build social capital. Communities that support tacit knowledge development, information sharing and problem solving are most valued, and existing information and communication technology (ICT) tools can be used to promote usability and accessibility. Recognising the value of tacit knowledge and using ICT for educational development within workload hours will require a shift in cultural thinking at both an individual and organisational level. PMID:24791455

  8. Synoptic scale forecast skill and systematic errors in the MASS 2.0 model. [Mesoscale Atmospheric Simulation System

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Skillman, W. C.; Kocin, P. J.; Wetzel, P. J.; Brill, K. F.

    1985-01-01

    The synoptic scale performance characteristics of MASS 2.0 are determined by comparing filtered 12-24 hr model forecasts to same-case forecasts made by the National Meteorological Center's synoptic-scale Limited-area Fine Mesh model. Characteristics of the two systems are contrasted, and the analysis methodology used to determine statistical skill scores and systematic errors is described. The overall relative performance of the two models in the sample is documented, and important systematic errors uncovered are presented.
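The verification statistics mentioned above (skill scores and systematic errors) can be sketched generically; the function name and grid values below are illustrative, not taken from the study:

```python
import numpy as np

def systematic_error_and_rmse(forecast, verification):
    """Mean (systematic) error and RMSE between a forecast field and its
    verifying analysis, both given as 2-D grids."""
    diff = np.asarray(forecast, float) - np.asarray(verification, float)
    bias = diff.mean()                   # systematic (mean) error
    rmse = np.sqrt((diff ** 2).mean())   # root-mean-square error
    return bias, rmse

# Toy 500-hPa height grids (metres); values are illustrative only.
fcst = np.array([[5702.0, 5710.0], [5695.0, 5701.0]])
anal = np.array([[5700.0, 5705.0], [5698.0, 5700.0]])
bias, rmse = systematic_error_and_rmse(fcst, anal)
```

A nonzero mean error that persists across cases is what the abstract calls a systematic error; RMSE folds in both that bias and random error.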

  9. The air forces on a systematic series of biplane and triplane cellule models

    NASA Technical Reports Server (NTRS)

    Munk, Max M

    1927-01-01

The air forces on a systematic series of biplane and triplane cellule models are the subject of this report. The tests consist of the determination of the lift, drag, and moment of each individual airfoil in each cellule, mostly with the same wing section. The magnitude of the gap and of the stagger is systematically varied; not, however, the decalage, which is zero throughout the tests. Certain check tests with a second wing section make the tests more complete and the conclusions more convincing. The results give evidence that the present army and navy specifications for the relative lifts of biplanes are good. They furnish material for improving such specifications for the relative lifts of triplanes. A larger number of factors can now be prescribed to take care of different cases.

  10. Coordinating the Provision of Health Services in Humanitarian Crises: a Systematic Review of Suggested Models

    PubMed Central

    Lotfi, Tamara; Bou-Karroum, Lama; Darzi, Andrea; Hajjar, Rayan; El Rahyel, Ahmed; El Eid, Jamale; Itani, Mira; Brax, Hneine; Akik, Chaza; Osman, Mona; Hassan, Ghayda; El-Jardali, Fadi; Akl, Elie

    2016-01-01

Background: Our objective was to identify published models of coordination between entities funding or delivering health services in humanitarian crises, whether the coordination took place during or after the crises. Methods: We included reports describing models of coordination in sufficient detail to allow reproducibility. We also included reports describing implementation of identified models, as case studies. We searched Medline, PubMed, EMBASE, Cochrane Central Register of Controlled Trials, CINAHL, PsycINFO, and the WHO Global Health Library. We also searched websites of relevant organizations. We followed standard systematic review methodology. Results: Our search captured 14,309 citations. The screening process identified 34 eligible papers describing five models of coordination of delivering health services: the “Cluster Approach” (with 16 case studies), the 4Ws “Who is Where, When, doing What” mapping tool (with four case studies), the “Sphere Project” (with two case studies), the “5x5” model (with one case study), and the “model of information coordination” (with one case study). The 4Ws and the 5x5 focus on coordination of services for mental health; the remaining models do not focus on a specific health topic. The Cluster approach appears to be the most widely used. One case study was a mixed implementation of the Cluster approach and the Sphere model. We identified no model of coordination for funding of health services. Conclusion: This systematic review identified five proposed coordination models that have been implemented by entities funding or delivering health services in humanitarian crises. There is a need to compare the effect of these different models on outcomes such as availability of and access to health services. PMID:27617167

  11. Systematic temporal patterns in the relationship between housing development and forest bird biodiversity.

    PubMed

    Pidgeon, Anna M; Flather, Curtis H; Radeloff, Volker C; Lepczyk, Christopher A; Keuler, Nicholas S; Wood, Eric M; Stewart, Susan I; Hammer, Roger B

    2014-10-01

As people encroach increasingly on natural areas, one question is how this affects avian biodiversity. The answer to this is partly scale-dependent. At broad scales, human populations and biodiversity concentrate in the same areas and are positively associated, but at local scales people and biodiversity are negatively associated. We investigated whether there is also a systematic temporal trend in the relationship between bird biodiversity and housing development. We used linear regression to examine associations between forest bird species richness and housing growth in the conterminous United States over 30 years. Our data sources were the North American Breeding Bird Survey and the 2000 decennial U.S. Census. In the 9 largest forested ecoregions, housing density increased continually over time. Across the conterminous United States, the association between bird species richness and housing density was positive for virtually all guilds except ground nesting birds. We found a systematic trajectory of declining bird species richness as housing increased through time. In more recently developed ecoregions, where housing density was still low, the association with bird species richness was neutral or positive. In ecoregions that were developed earlier and where housing density was highest, the association of housing density with bird species richness for most guilds was negative and grew stronger with advancing decades. We propose that in general the relationship between human settlement and biodiversity over time unfolds as a 2-phase process. The first phase is apparently innocuous; associations are positive due to coincidence of low-density housing with high biodiversity. The second phase is highly detrimental to biodiversity, and increases in housing density are associated with biodiversity losses. The long-term effect on biodiversity depends on the final housing density.
This general pattern can help unify our understanding of the relationship
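The core association test described above amounts to an ordinary least-squares fit of species richness against housing density; a minimal sketch with invented numbers (not the study's data):

```python
import numpy as np

# Hypothetical data: housing density (units/km^2) and forest-bird species
# richness for a set of survey routes; values are illustrative only.
housing = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
richness = np.array([24.0, 25.0, 23.0, 20.0, 16.0, 11.0])

# Ordinary least-squares fit: richness = a + b * housing.
# np.polyfit returns coefficients highest degree first, so slope before intercept.
b, a = np.polyfit(housing, richness, 1)
# A negative slope b corresponds to declining richness as housing density rises.
```

Fitting this separately by decade, as the study does, is what reveals whether the slope grows more negative over time.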

  12. A systematic approach to identify the sources of tropical SST errors in coupled models using the adjustment of initialised experiments

    NASA Astrophysics Data System (ADS)

    Vannière, Benoît; Guilyardi, Eric; Toniazzo, Thomas; Madec, Gurvan; Woolnough, Steve

    2014-10-01

Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that the identification and the correction of short-term climate model errors have the potential to improve the modeled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal to decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model long-term pervasive SST errors. A protocol is designed to attribute the SST biases to the source processes. It includes five steps: (1) identify and describe biases in a coupled stabilized simulation, (2) determine the time scale of the advent of the bias and its propagation, (3) find the geographical origin of the bias, (4) evaluate the degree of coupling in the development of the bias, (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations and exploring various degrees of coupling. In particular, hindcasts give the time scale of the advent of biases, regionally restored experiments show the geographical origin, and ocean-only simulations isolate the field responsible for the bias and evaluate the degree of coupling in the bias development. This strategy is applied to four prominent SST biases of the IPSLCM5A-LR coupled model in the tropical Pacific, that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears in a few months and is caused by a lack of upwelling due to too weak meridional coastal winds off Peru. The cold equatorial bias, which

  13. Pilipino American Identity Development Model

    ERIC Educational Resources Information Center

    Nadal, Kevin L.

    2004-01-01

    This article examines the identity development of F/Pilipino Americans. Because of a distinct history and culture that differentiates them from other Asian groups, F/Pilipino Americans may experience a different ethnic identity development than other Asian Americans. A nonlinear 6-stage ethnic identity development model is proposed to promote…

  14. The Status of Faculty Development Programmes in Iran after the Medical Education Reform: A Systematic and Comprehensive Approach

    ERIC Educational Resources Information Center

    Ahmady, Soleiman; Changiz, Tahereh; Brommels, Mats; Gaffney, Andrew F.; Masiello, Italo

    2009-01-01

    Modern universities achieve institutional goals when faculty members are able to fulfil diverse roles. Faculty development must therefore employ pedagogical principles while guided by institutional needs. Systematic evaluation of such programmes has not been done in developing countries. This paper examines faculty development in Iran, where…

  15. Systematic Approach to the Development, Evolution, and Effectiveness of Integrated Product Development Teams (IPDTs)

    SciTech Connect

    Margie Jeffs; R. Douglas Hamelin

    2011-06-01

Integrated Product Development Teams (IPDTs) are a key component of any systems engineering (SE) application, but since they are formed primarily from technical considerations, many IPDTs are far less productive than they otherwise could be. By recognizing specific personality types and skill sets, a random group of 'technical' individuals can be structured to become a highly effective team capable of delivering much more than the sum of its members.

  16. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

Background: Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods: We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results: From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions: Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  17. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics

    PubMed Central

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-01-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  18. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics.

    PubMed

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-02-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  19. Development of a structured observational method for the systematic assessment of school food-choice architecture.

    PubMed

    Ozturk, Orgul D; McInnes, Melayne M; Blake, Christine E; Frongillo, Edward A; Jones, Sonya J

    2016-01-01

    The objective of this study is to develop a structured observational method for the systematic assessment of the food-choice architecture that can be used to identify key points for behavioral economic intervention intended to improve the health quality of children's diets. We use an ethnographic approach with observations at twelve elementary schools to construct our survey instrument. Elements of the structured observational method include decision environment, salience, accessibility/convenience, defaults/verbal prompts, number of choices, serving ware/method/packaging, and social/physical eating environment. Our survey reveals important "nudgeable" components of the elementary school food-choice architecture, including precommitment and default options on the lunch line. PMID:26654767

  20. A Systematic Approach for Model-Based Aircraft Engine Performance Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Garg, Sanjay

    2010-01-01

    A requirement for effective aircraft engine performance estimation is the ability to account for engine degradation, generally described in terms of unmeasurable health parameters such as efficiencies and flow capacities related to each major engine module. This paper presents a linear point design methodology for minimizing the degradation-induced error in model-based aircraft engine performance estimation applications. The technique specifically focuses on the underdetermined estimation problem, where there are more unknown health parameters than available sensor measurements. A condition for Kalman filter-based estimation is that the number of health parameters estimated cannot exceed the number of sensed measurements. In this paper, the estimated health parameter vector will be replaced by a reduced order tuner vector whose dimension is equivalent to the sensed measurement vector. The reduced order tuner vector is systematically selected to minimize the theoretical mean squared estimation error of a maximum a posteriori estimator formulation. This paper derives theoretical estimation errors at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the estimation accuracy achieved through conventional maximum a posteriori and Kalman filter estimation approaches. Maximum a posteriori estimation results demonstrate that reduced order tuning parameter vectors can be found that approximate the accuracy of estimating all health parameters directly. Kalman filter estimation results based on the same reduced order tuning parameter vectors demonstrate that significantly improved estimation accuracy can be achieved over the conventional approach of selecting a subset of health parameters to serve as the tuner vector. 
However, additional development is necessary to fully extend the methodology to Kalman filter
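The shapes involved in the underdetermined estimation problem above can be sketched in a few lines. This is not the paper's tuner-selection routine (which minimizes a theoretical mean squared estimation error); the SVD basis below is only a stand-in illustration, and all matrices are invented:

```python
import numpy as np

# Toy underdetermined setup: 5 health parameters, 3 sensed measurements.
rng = np.random.default_rng(1)
H = rng.normal(size=(3, 5))     # sensitivity of measurements to health params
P = np.eye(5)                   # prior covariance of health parameters
R = 0.01 * np.eye(3)            # measurement-noise covariance

# Maximum a posteriori estimate of all 5 parameters from 3 measurements:
h_true = rng.normal(size=5)
y = H @ h_true + rng.multivariate_normal(np.zeros(3), R)
G = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # MAP gain
h_map = G @ y

# A reduced-order "tuner" vector q with dimension 3 (= number of sensors),
# built here from the leading right-singular vectors of H; estimating q
# instead of h keeps the estimation problem square, as required for a
# Kalman filter formulation.
V = np.linalg.svd(H)[2][:3]     # rows span the observable subspace
q_map = V @ h_map
```

The paper's contribution is precisely the systematic choice of the transformation (here the placeholder `V`) so that the induced estimation error is minimized.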

  1. Placental Pathology, Perinatal Death, Neonatal Outcome, and Neurological Development: A Systematic Review

    PubMed Central

    Roescher, Annemiek M.; Timmer, Albert; Erwich, Jan Jaap H. M.; Bos, Arend F.

    2014-01-01

Background: The placenta plays a crucial role during pregnancy for growth and development of the fetus. Less than optimal placental performance may result in morbidity or even mortality of both mother and fetus. Awareness among pediatricians, however, of the benefit of placental findings for neonatal care, is limited. Objectives: To provide a systematic overview of the relation between placental lesions and neonatal outcome. Data sources: PubMed database, reference lists of selected publications and important research groups in the field. Study appraisal and synthesis methods: We systematically searched the PubMed database for literature on the relation between placental lesions and fetal and neonatal mortality, neonatal morbidity and neurological outcome. We conducted three separate searches starting with a search for placental pathology and fetal and neonatal mortality, followed by placental pathology and neonatal morbidity, and finally placental pathology and neurological development. We limited our search to full-text articles published in English from January 1995 to October 2013. We refined our search results by selecting the appropriate articles from the ones found during the initial searches. The first selection was based on the title, the second on the abstract, and the third on the full article. The quality of the selected articles was determined by using the Newcastle-Ottawa Quality Assessment Scale. Results: Placental lesions are one of the main causes of fetal death, where placental lesions consistent with maternal vascular underperfusion are most important. Several neonatal problems are also associated with placental lesions, whereby ascending intrauterine infection (with a fetal component) and fetal thrombotic vasculopathy constitute the greatest problem. Conclusions: The placenta plays a key role in fetal and neonatal mortality, morbidity, and outcome. Pediatricians should make an effort to obtain the results of placental examinations. PMID:24586764

  2. Systematic review of the birth prevalence of congenital cytomegalovirus infection in developing countries

    PubMed Central

    Lanzieri, Tatiana M.; Dollard, Sheila C.; Bialek, Stephanie R.; Grosse, Scott D.

    2016-01-01

Background: Congenital cytomegalovirus (CMV) infection is the leading infectious cause of congenital hearing loss and neurodevelopmental disability in developed countries. Information on congenital CMV infection in developing countries appears to be lacking. Methods: We conducted a systematic literature review to identify studies from developing countries with population-based samples of at least 300 infants that used laboratory methods established as reliable for the diagnosis of congenital CMV infection. Results: Most studies were excluded due to biased samples or inadequate diagnostic methods; consequently the search identified just 11 studies that were from Africa, Asia, and Latin America. The number of newborns tested ranged from 317 to 12,195. Maternal CMV seroprevalence ranged from 84% to 100%. CMV birth prevalence varied from 0.6% to 6.1%. CMV-associated impairments were not documented in most studies. Conclusions: Birth prevalence ranges were higher than for Europe and North America, as expected based on the higher maternal CMV seroprevalence. With very limited data available on sequelae, the disease burden of congenital CMV in developing countries remains largely unknown at this time. PMID:24631522
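Birth prevalence figures like those above are simple proportions of screened newborns. A small sketch of the Wilson score interval, one common way to put a confidence interval on such an estimate (the counts below are invented for illustration, not taken from any of the reviewed studies):

```python
from math import sqrt

def wilson_interval(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n, e.g. the number of
    congenitally infected newborns among n screened."""
    p = k / n
    centre = (p + z * z / (2 * n)) / (1 + z * z / n)
    half = (z / (1 + z * z / n)) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Illustrative only: 30 infections among 3000 screened newborns gives a
# birth prevalence of 1.0%, with the interval quantifying sampling error.
lo, hi = wilson_interval(30, 3000)
```

With sample sizes in the range reported (317 to 12,195 newborns), the width of such intervals varies considerably, which is one reason prevalence ranges across studies should be compared cautiously.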

  3. How to Grow Project Scientists: A Systematic Approach to Developing Project Scientists

    NASA Technical Reports Server (NTRS)

    Kea, Howard

    2011-01-01

The Project Manager is one of the key individuals who can determine the success or failure of a project. NASA is fully committed to the training and development of Project Managers across the agency to ensure that highly capable individuals are equipped with the competencies and experience to successfully lead a project. An equally critical position is that of the Project Scientist. The Project Scientist provides the scientific leadership necessary for the scientific success of a project by ensuring that the mission meets or exceeds the scientific requirements. Traditionally, NASA Goddard project scientists were appointed and approved by the Center Science Director based on their knowledge, experience, and other qualifications. However, the process to obtain the necessary knowledge, skills, and abilities was not documented or done in a systematic way. NASA Goddard's current Science Director, Nicholas White, saw the need to create a pipeline for developing new project scientists, and appointed a team to develop a process for training potential project scientists. The team members were Dr. Harley Thronson, Chair, Dr. Howard Kea, Mr. Mark Goldman, DACUM facilitator, and the late Dr. Michael VanSteenberg. The DACUM process, an occupational analysis and evaluation system, was used to produce a picture of the project scientist's duties, tasks, knowledge, and skills. The output resulted in a 3-day introductory course detailing all the required knowledge, skills, and abilities a scientist must develop over time to be qualified for selection as a Project Scientist.

  4. Factors Influencing Development of Professional Values Among Nursing Students and Instructors: A Systematic Review

    PubMed Central

    Parandeh, Akram; Khaghanizade, Morteza; Mohammadi, Eesa; Nouri, Jamileh Mokhtari

    2015-01-01

Introduction: Professional values are standards of behavior that provide a framework for appraising the beliefs and attitudes that influence behavior. The development of professional values is a continuous, long-term process influenced by many factors. The aim of this study is to assess the factors influencing the development of professional values among nursing students and instructors. Method: In this systematic review, a broad search was performed for articles in Persian and English databases: PubMed, ProQuest, Elsevier, SID, Google Scholar, Ovid, and IranDoc; nursing student, instructors, ethics, professional value, ethical value, and educators were used as the key words. Of the 3205 retrieved articles, after duplicates were eliminated, 22 articles from the period 1995–2013 were assessed. Data from the articles were summarized, categorized, and analyzed with respect to the research question. Results: Four main themes were extracted: “education and achieving professional experiences”, “students' and instructors' perspectives on professional values”, “the role of culture in considering and developing professional special values”, and “the effect of learners' individual characteristics”. Conclusion: Given the effect of educational, cultural, and individual factors on the development of nurses' professional values, educational and health centers are advised to provide value-based care for patients in clinical environments, in addition to basing the content of students' curricula on ethical values. PMID:25716397

  5. Advanced Mirror & Modelling Technology Development

    NASA Technical Reports Server (NTRS)

    Effinger, Michael; Stahl, H. Philip; Abplanalp, Laura; Maffett, Steven; Egerman, Robert; Eng, Ron; Arnold, William; Mosier, Gary; Blaurock, Carl

    2014-01-01

    The 2020 Decadal technology survey is starting in 2018. Technology on the shelf at that time will help guide selection to future low risk and low cost missions. The Advanced Mirror Technology Development (AMTD) team has identified development priorities based on science goals and engineering requirements for Ultraviolet Optical near-Infrared (UVOIR) missions in order to contribute to the selection process. One key development identified was lightweight mirror fabrication and testing. A monolithic, stacked, deep core mirror was fused and replicated twice to achieve the desired radius of curvature. It was subsequently successfully polished and tested. A recently awarded second phase to the AMTD project will develop larger mirrors to demonstrate the lateral scaling of the deep core mirror technology. Another key development was rapid modeling for the mirror. One model focused on generating optical and structural model results in minutes instead of months. Many variables could be accounted for regarding the core, face plate and back structure details. A portion of a spacecraft model was also developed. The spacecraft model incorporated direct integration to transform optical path difference to Point Spread Function (PSF) and between PSF to modulation transfer function. The second phase to the project will take the results of the rapid mirror modeler and integrate them into the rapid spacecraft modeler.

  6. Systematic vertical error in UAV-derived topographic models: Origins and solutions

    NASA Astrophysics Data System (ADS)

    James, Mike R.; Robson, Stuart

    2014-05-01

    Unmanned aerial vehicles (UAVs) equipped with consumer cameras are increasingly being used to produce high resolution digital elevation models (DEMs). However, although such DEMs may achieve centimetric detail, they can also display broad-scale systematic deformation (usually a vertical 'doming') that restricts their wider use. This effect can be particularly apparent in DEMs derived by structure-from-motion (SfM) processing, especially when control point data have not been incorporated in the bundle adjustment process. We illustrate that doming error results from a combination of inaccurate description of radial lens distortion and the use of imagery captured in near-parallel viewing directions. With such imagery, enabling camera self-calibration within the processing inherently leads to erroneous radial distortion values and associated DEM error. Using a simulation approach, we illustrate how existing understanding of systematic DEM error in stereo-pairs (from unaccounted radial distortion) up-scales in typical multiple-image blocks of UAV surveys. For image sets with dominantly parallel viewing directions, self-calibrating bundle adjustment (as normally used with images taken using consumer cameras) will not be able to derive radial lens distortion accurately, and will give associated systematic 'doming' DEM deformation. In the presence of image measurement noise (at levels characteristic of SfM software), and in the absence of control measurements, our simulations display domed deformation with amplitude of ~2 m over horizontal distances of ~100 m. We illustrate the sensitivity of this effect to variations in camera angle and flight height. Deformation will be reduced if suitable control points can be included within the bundle adjustment, but residual systematic vertical error may remain, accommodated by the estimated precision of the control measurements. Doming bias can be minimised by the inclusion of inclined images within the image set, for example
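    A back-of-the-envelope illustration of the mechanism described above can be written with the first radial term of the Brown distortion model. This is a hedged sketch (invented K1 values, a single image row), not the authors' simulation code:

```python
import numpy as np

# First radial term of the Brown distortion model, x_d = x*(1 + K1*r^2).
# K1 values are invented for illustration; coordinates are normalized.
def radial_distort(x, y, k1):
    r2 = x ** 2 + y ** 2
    return x * (1 + k1 * r2), y * (1 + k1 * r2)

x = np.linspace(-0.5, 0.5, 11)               # one row of image points
y = np.zeros_like(x)

xd_true, _ = radial_distort(x, y, k1=-0.10)  # assumed "true" lens
xd_est, _ = radial_distort(x, y, k1=-0.08)   # assumed biased self-calibration

# The uncorrected residual grows with r^3: zero at the image centre, largest
# at the edges. With near-parallel viewing directions this smooth, broad-scale
# residual maps into a domed vertical error in the reconstructed DEM.
residual = xd_true - xd_est
print(residual[5], residual[0], residual[-1])   # centre ~0, edges ~+/-0.0025
```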

  7. Globalization of Continuing Professional Development by Journal Clubs via Microblogging: A Systematic Review

    PubMed Central

    Perera, Marlon; Lawrentschuk, Nathan; Romanic, Diana; Papa, Nathan; Bolton, Damien

    2015-01-01

    Background Journal clubs are an essential tool in promoting clinical evidence-based medical education to all medical and allied health professionals. Twitter represents a public microblogging forum that can facilitate traditional journal club requirements while also reaching a global audience and enabling participation and discussion with study authors and colleagues. Objective The aim of the current study was to evaluate the current state of social media–facilitated journal clubs, specifically on Twitter, as an example of continuing professional development. Methods A systematic review of literature databases (Medline, Embase, CINAHL, Web of Science, ERIC via ProQuest) was performed according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A systematic search of Twitter, the followers of identified journal clubs, and Symplur was also performed. Demographic and monthly tweet data were extracted from Twitter and Symplur. All manuscripts related to Twitter-based journal clubs were included. Statistical analyses were performed in MS Excel and STATA. Results From a total of 469 citations, 11 manuscripts were included and referred to five Twitter-based journal clubs (#ALiEMJC, #BlueJC, #ebnjc, #urojc, #meded). A Twitter-based journal club search yielded 34 potential hashtags/accounts, of which 24 were included in the final analysis. The median duration of activity was 11.75 (interquartile range [IQR] 19.9, SD 10.9) months, with 7 now inactive. The median number of followers and participants was 374 (IQR 574) and 157 (IQR 272), respectively. An overall increase in the establishment of active Twitter-based journal clubs was observed, resulting in an exponential increase in total cumulative tweets (R²=.98) and tweets per month (R²=.72). Cumulative tweets for specific journal clubs increased linearly, with @ADC_JC, @EBNursingBMJ, @igsjc, @iurojc, and @NephJC showing the greatest rates of change, as well as total impressions per month since

  8. Diagnosing the Systematic Effects of Parametrized Physical Processes in a Numerical Weather Prediction Model

    NASA Astrophysics Data System (ADS)

    Saffin, Leo; Methven, John; Gray, Sue

    2016-04-01

    Numerical models of the atmosphere combine a dynamical core, which approximates solutions to the adiabatic and frictionless governing equations, with the tendencies arising from the parametrization of physical processes. Tracers of potential vorticity (PV) can be used to accumulate the tendencies of parametrized physical processes and diagnose their impacts on the large-scale dynamics. This is due to two key properties of PV: conservation following an air mass, and invertibility, which relates the PV distribution to the balanced dynamics of the atmosphere. Applying the PV tracers to many short forecasts allows for a systematic investigation of the behaviour of parametrized physical processes. The forecasts are 2.5-day lead-time forecasts run using the Met Office Unified Model (MetUM), initialised at 0Z for each day in November/December/January 2013/14. The analysis of the PV tracers has been focussed on regions where diabatic processes can be important (tropopause ridges and troughs, frontal regions, and the boundary layer top). The tropopause can be described as a surface of constant PV with a sharp PV gradient. Previous work using the PV tracers in individual case studies has shown that parametrized physical processes act to enhance the tropopause PV contrast, which can affect the Rossby wave phase speed. The short forecasts show results consistent with a systematic enhancement of tropopause PV contrast by diabatic processes, and show systematically different behaviour between ridges and troughs. The implication of this work is that a failure to correctly represent the effects of diabatic processes on the tropopause in models can lead to poor Rossby wave evolution and potentially to downstream forecast busts.

  9. Space Flight Cable Model Development

    NASA Technical Reports Server (NTRS)

    Spak, Kaitlin

    2013-01-01

    This work continues the modeling efforts presented in last year's VSGC conference paper, "Model Development for Cable-Harnessed Beams." The focus is narrowed to modeling of space-flight cables only, as a reliable damped cable model is not yet readily available and is necessary to continue modeling cable-harnessed space structures. New experimental data are presented, eliminating the low-frequency noise that plagued the first year's efforts. The distributed transfer function method is applied to a single section of space flight cable for Euler-Bernoulli and shear beams. The work presented here will be developed into a damped cable model that can be incorporated into an interconnected beam-cable system. The overall goal of this work is to accurately predict natural frequencies and modal damping ratios for cabled space structures.

  10. Antiretroviral Therapy and Pregnancy Outcomes in Developing Countries: A Systematic Review

    PubMed Central

    Alemu, Fekadu Mazengia; Yalew, Alemayehu Worku; Fantahun, Mesganaw; Ashu, Eta Ebasi

    2015-01-01

    Background: Despite significant efforts to understand adverse pregnancy outcomes in women receiving Antiretroviral Therapy (ART), ART-related adverse birth outcomes are still poorly understood. We systematically review ART-related adverse birth outcomes among HIV-infected pregnant women, as well as the covariates associated with adverse birth outcomes in this group. Methods: The main sources for our systematic review were electronic bibliographic databases: MEDLINE, PubMed, EMBASE and AIDSLINE were searched. Furthermore, search engines such as Google and Google Scholar were searched specifically for gray literature. The methodological quality of the available literature was assessed using the Newcastle-Ottawa Quality Assessment Scale and the M. Hewitt guideline. We examined a total of 1,124 papers and reviewed the studies using the PICOT criteria, which stand for Patient (population), Intervention (or "Exposure"), Comparison, Outcome and Type of study. Finally, 32 methodologically fit studies were retained and included in our review. Results: Frequently observed adverse birth outcomes included low birth weight (LBW), Preterm Birth (PB) and Small for Gestational Age (SGA), while stillbirth and congenital anomalies were infrequent. The type of regimen, such as Protease Inhibitor (PI)-based regimens, and the timing of ART initiation are among the factors associated with adverse pregnancy outcomes. Covariates principally included malnutrition and other co-morbidities such as malaria and HIV. Conclusions and Public Health Implications: There is growing evidence in the published literature suggesting that ART might be causing adverse birth outcomes among pregnant women in developing countries. There is a need to consider regimen types for HIV-infected pregnant women, and a need to design large cohort studies.

  11. A study for systematic errors of the GLA forecast model in tropical regions

    NASA Technical Reports Server (NTRS)

    Chen, Tsing-Chang; Baker, Wayman E.; Pfaendtner, James; Corrigan, Martin

    1988-01-01

    From the sensitivity studies performed with the Goddard Laboratory for Atmospheres (GLA) analysis/forecast system, it was revealed that the forecast errors in the tropics affect the ability to forecast midlatitude weather in some cases. Apparently, the forecast errors occurring in the tropics can propagate to midlatitudes. Therefore, the systematic error analysis of the GLA forecast system becomes a necessary step in improving the model's forecast performance. The major effort of this study is to examine the possible impact of the hydrological-cycle forecast error on dynamical fields in the GLA forecast system.

  12. Systematic analysis of dynamic miRNA-target interactions during C. elegans development.

    PubMed

    Zhang, Liang; Hammell, Molly; Kudlow, Brian A; Ambros, Victor; Han, Min

    2009-09-01

    Although microRNA (miRNA)-mediated functions have been implicated in many aspects of animal development, the majority of miRNA::mRNA regulatory interactions remain to be characterized experimentally. We used an AIN/GW182 protein immunoprecipitation approach to systematically analyze miRNA::mRNA interactions during C. elegans development. We characterized the composition of miRNAs in functional miRNA-induced silencing complexes (miRISCs) at each developmental stage and identified three sets of miRNAs with distinct stage-specificity of function. We then identified thousands of miRNA targets in each developmental stage, including a significant portion that is subject to differential miRNA regulation during development. By identifying thousands of miRNA family-mRNA pairs with temporally correlated patterns of AIN-2 association, we gained valuable information on the principles of physiological miRNA::target recognition and predicted 1589 high-confidence miRNA family::mRNA interactions. Our data support the idea that miRNAs preferentially target genes involved in signaling processes and avoid genes with housekeeping functions, and that miRNAs orchestrate temporal developmental programs by coordinately targeting or avoiding genes involved in particular biological functions. PMID:19675127

  13. The Impact of Maternal Vitamin D Status on Offspring Brain Development and Function: a Systematic Review.

    PubMed

    Pet, Milou A; Brouwer-Brolsma, Elske M

    2016-07-01

    Various studies have examined associations between maternal vitamin D (VD) deficiency and offspring health, including offspring brain health. The purpose of this review was to summarize current evidence concerning the impact of maternal VD deficiency on brain development and function in offspring. A systematic search was conducted within Medline (on Ovid) for studies published through 7 May 2015. Animal and human studies that examined associations between maternal VD status or developmental VD deficiency and offspring brain development and function were included. A total of 26 animal studies and 10 human studies met the inclusion criteria. Several animal studies confirmed the hypothesis that low prenatal VD status may affect brain morphology and physiology as well as behavioral outcomes. In humans, subtle cognitive and psychological impairments in offspring of VD-deficient mothers were observed. However, data obtained from animal and human studies provide inconclusive evidence, and results seem to depend on strain or race and age of offspring. To conclude, prenatal VD status is thought to play an important role in brain development, cognitive function, and psychological function. However, results are inconclusive; validation of these findings and investigation of underlying mechanisms are required. Thus, more investigation is needed before recommending supplementation of VD during pregnancy to promote brain health of offspring. PMID:27422502

  14. Systematic Error in UAV-derived Topographic Models: The Importance of Control

    NASA Astrophysics Data System (ADS)

    James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.

    2014-12-01

    UAVs equipped with consumer cameras are increasingly being used to produce high-resolution digital elevation models (DEMs) for a wide variety of geoscience applications. Image processing and DEM generation are being facilitated by parallel increases in the use of software based on 'structure from motion' algorithms. However, recent work [1] has demonstrated that image networks from UAVs, for which camera pointing directions are generally near-parallel, are susceptible to producing systematic error in the resulting topographic surfaces (a vertical 'doming'). This issue primarily reflects error in the camera lens distortion model, which is dominated by the radial K1 term. Common data processing scenarios, in which self-calibration is used to refine the camera model within the bundle adjustment, can inherently result in such systematic error via poor K1 estimates. Incorporating oblique imagery into such data sets can mitigate error by enabling more accurate calculation of camera parameters [1]. Here, using a combination of simulated image networks and real imagery collected from a fixed-wing UAV, we explore the additional roles of external ground control and the precision of image measurements. We illustrate similarities and differences between a variety of structure from motion software packages, and underscore the importance of well distributed and suitably accurate control for projects where a demonstrated high accuracy is required. [1] James & Robson (2014) Earth Surf. Proc. Landforms, 39, 1413-1420, doi: 10.1002/esp.3609

  15. Peak Vertical Ground Reaction Force during Two-Leg Landing: A Systematic Review and Mathematical Modeling

    PubMed Central

    Feng, Tienan; Zhang, Ming

    2014-01-01

    Objectives. (1) To systematically review peak vertical ground reaction force (PvGRF) during two-leg drop landing from specific drop heights (DH), (2) to construct a mathematical model describing the correlation between PvGRF and DH, and (3) to analyze the effects of selected factors on the pooled PvGRF regardless of DH. Methods. A computerized bibliographical search was conducted to extract PvGRF data on a single foot when participants landed with both feet from various DHs. An innovative mathematical model was constructed to analyze the effects of gender, landing type, shoes, ankle stabilizers, surface stiffness and sample frequency on PvGRF based on the pooled data. Results. Pooled PvGRF and DH data from 26 articles showed that a square-root function fits their relationship well. An experimental validation was also performed on the regression equation for the medium sample frequency. The PvGRF was not significantly affected by surface stiffness, but was significantly higher in men than in women, in platform than in suspended landings, in the barefoot than in the shod condition, with ankle stabilizers than in the control condition, and at higher than at lower sample frequencies. Conclusions. PvGRF and the square root of DH showed a linear relationship. The mathematical modeling method combined with systematic review is helpful for analyzing factors influencing landing movement without considering DH. PMID:25243113
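    The square-root model described above can be fitted by ordinary least squares after linearizing on sqrt(DH). A minimal sketch with invented data points (not those of the review):

```python
import numpy as np

# Invented (hypothetical) pooled data: drop height DH in metres and peak
# vertical GRF in body weights. These are NOT the review's data points.
dh = np.array([0.15, 0.30, 0.45, 0.60, 0.90])
pvgrf = np.array([2.1, 2.9, 3.5, 4.0, 4.8])

# Linearize PvGRF = a + b*sqrt(DH) and solve by ordinary least squares.
X = np.column_stack([np.ones_like(dh), np.sqrt(dh)])
(a, b), *_ = np.linalg.lstsq(X, pvgrf, rcond=None)

pred = a + b * np.sqrt(dh)
r2 = 1 - np.sum((pvgrf - pred) ** 2) / np.sum((pvgrf - pvgrf.mean()) ** 2)
print(b > 0, r2 > 0.99)   # the sqrt model fits the fabricated data closely
```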

  16. Systematic problems with using dark matter simulations to model stellar halos

    SciTech Connect

    Bailin, Jeremy; Bell, Eric F.; Valluri, Monica; Stinson, Greg S.; Debattista, Victor P.; Couchman, H. M. P.; Wadsley, James

    2014-03-10

    The limits of available computing power have forced models for the structure of stellar halos to adopt one or both of the following simplifying assumptions: (1) stellar mass can be 'painted' onto dark matter (DM) particles in progenitor satellites; (2) pure DM simulations that do not form a luminous galaxy can be used. We estimate the magnitude of the systematic errors introduced by these assumptions using a controlled set of stellar halo models where we independently vary whether we look at star particles or painted DM particles, and whether we use a simulation in which a baryonic disk galaxy forms or a matching pure DM simulation that does not form a baryonic disk. We find that the 'painting' simplification reduces the halo concentration and internal structure, predominantly because painted DM particles have different kinematics from star particles even when both are buried deep in the potential well of the satellite. The simplification of using pure DM simulations reduces the concentration further, but increases the internal structure, and results in a more prolate stellar halo. These differences can be a factor of 1.5-7 in concentration (as measured by the half-mass radius) and 2-7 in internal density structure. Given this level of systematic uncertainty, one should be wary of overinterpreting differences between observations and the current generation of stellar halo models based on DM-only simulations when such differences are less than an order of magnitude.

  17. Systematic Evaluation of Key L-Carnitine Homeostasis Mechanisms during Postnatal Development in Rat

    PubMed Central

    2012-01-01

    Background The conditionally essential nutrient, L-carnitine, plays a critical role in a number of physiological processes vital to normal neonatal growth and development. We conducted a systematic evaluation of the developmental changes in key L-carnitine homeostasis mechanisms in the postnatal rat to better understand the interrelationship between these pathways and their correlation to ontogenic changes in L-carnitine levels during postnatal development. Methods mRNA expression of heart, kidney and intestinal L-carnitine transporters, liver γ-butyrobetaine hydroxylase (Bbh) and trimethyllysine hydroxylase (Tmlh), and heart carnitine palmitoyltransferase (Cpt) were measured using quantitative RT-PCR. L-Carnitine levels were determined by HPLC-UV. Cpt and Bbh activity were measured by a spectrophotometric method and HPLC, respectively. Results Serum and heart L-carnitine levels increased with postnatal development. Increases in serum L-carnitine correlated significantly with postnatal increases in renal organic cation/carnitine transporter 2 (Octn2) expression, and was further matched by postnatal increases in intestinal Octn1 expression and hepatic γ-Bbh activity. Postnatal increases in heart L-carnitine levels were significantly correlated to postnatal increases in heart Octn2 expression. Although cardiac high energy phosphate substrate levels remained constant through postnatal development, creatine showed developmental increases with advancing neonatal age. mRNA levels of Cpt1b and Cpt2 significantly increased at postnatal day 20, which was not accompanied by a similar increase in activity. Conclusions Several L-carnitine homeostasis pathways underwent significant ontogenesis during postnatal development in the rat. This information will facilitate future studies on factors affecting the developmental maturation of L-carnitine homeostasis mechanisms and how such factors might affect growth and development. PMID:22805277

  18. Systematic large-scale secondary circulations in a regional climate model

    NASA Astrophysics Data System (ADS)

    Becker, Nico; Ulbrich, Uwe; Klein, Rupert

    2015-05-01

    Regional climate models (RCMs) are used to add the effects of nonresolved scales to coarser resolved model simulations by using a finer grid within a limited domain. We identify large-scale secondary circulations (SCs) relative to the driving global climate model (GCM) in an RCM simulation over Europe. By applying a clustering technique, we find that the SC depends on the large-scale flow prescribed by the driving GCM data. Evidence is presented that the SC is caused by the different representations of orographic effects in the RCM and the GCM. Flow modifications in the RCM caused by the Alps lead to large-scale vortices in the SC fields. These vortices are limited by the RCM boundaries, causing artificial boundary-parallel flows. The SC is associated with geopotential height and temperature anomalies between RCM and GCM and has the potential to produce systematic large-scale biases in RCMs.

  19. Systematic, theoretically-grounded development and feasibility testing of an innovative, preventive web-based game for children exposed to acute trauma

    PubMed Central

    Marsac, Meghan L.; Winston, Flaura K.; Hildenbrand, Aimee K.; Kohser, Kristen L.; March, Sonja; Kenardy, Justin; Kassam-Adams, Nancy

    2015-01-01

    Background Millions of children are affected by acute medical events annually, creating a need for resources to promote recovery. While web-based interventions promise wide reach and low cost for users, their development can be time- and cost-intensive. A systematic approach to intervention development can help minimize costs and increase the likelihood of effectiveness. Using such an approach, our team integrated evidence on the etiology of traumatic stress, an explicit program theory, and a user-centered design process into intervention development. Objective To describe the evidence and program theory model applied to the Coping Coach intervention and to present pilot data evaluating intervention feasibility and acceptability. Method Informed by empirical evidence on traumatic stress prevention, an overarching program theory model was articulated to delineate pathways from (a) specific intervention content to (b) program targets and proximal outcomes to (c) key longer-term health outcomes. Systematic user-testing with children ages 8–12 (N = 42) exposed to an acute medical event and their parents was conducted throughout intervention development. Results Functionality challenges in early prototypes necessitated revisions. Child engagement remained positive throughout revisions to the Coping Coach intervention. Final pilot-testing demonstrated promising feasibility and high user engagement and satisfaction. Conclusion Applying a systematic approach to the development of Coping Coach led to the creation of a functional intervention that is accepted by children and parents. Development of new e-health interventions may benefit from a similar approach. Future research should evaluate the efficacy of Coping Coach in achieving the targeted outcomes of reduced trauma symptoms and improved health-related quality of life. PMID:25844276

  20. An online model correction method based on an inverse problem: Part II—systematic model error correction

    NASA Astrophysics Data System (ADS)

    Xue, Haile; Shen, Xueshun; Chou, Jifan

    2015-11-01

    An online systematic error correction is presented and examined as a technique to improve the accuracy of real-time numerical weather prediction, based on a dataset of model errors (MEs) from past intervals. Given the analyses, the ME in each interval (6 h) between two analyses can be iteratively obtained by introducing an unknown tendency term into the prediction equation, as shown in Part I of this two-paper series. In this part, after analyzing the 5-year (2001-2005) GRAPES-GFS (Global Forecast System of the Global and Regional Assimilation and Prediction System) error patterns and evolution, a systematic model error correction is constructed from the past MEs using a least-squares approach. To test the correction, we applied the approach in GRAPES-GFS for July 2009 and January 2010. The datasets associated with the initial condition and SST used in this study were based on NCEP (National Centers for Environmental Prediction) FNL (final) data. The results indicated that the equator-to-pole geopotential gradient and westerly winds in the Northern Hemisphere, which GRAPES-GFS systematically underestimates, were largely strengthened, and the biases of temperature and wind in the tropics were strongly reduced. The correction therefore yields a more skillful forecast, with lower mean bias and root-mean-square error and a higher anomaly correlation coefficient.
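    In the simplest constant-tendency case, the least-squares estimate of a systematic error term from an archive of past MEs reduces to their time mean. A toy sketch of that idea (invented shapes and data, not the GRAPES-GFS implementation):

```python
import numpy as np

# Toy setup: 40 archived 6-h model errors on a 100-point grid, each equal to a
# fixed systematic bias plus noise. All shapes and values are invented.
rng = np.random.default_rng(0)
n_past, n_grid = 40, 100
true_bias = 0.5 * np.sin(np.linspace(0.0, np.pi, n_grid))
past_errors = true_bias + 0.2 * rng.standard_normal((n_past, n_grid))

# For a constant-in-time tendency error, the least-squares estimate at each
# grid point is simply the time mean of the archived errors.
correction = past_errors.mean(axis=0)

biased_forecast = true_bias.copy()       # a forecast carrying the full bias
corrected = biased_forecast - correction

# Subtracting the estimated systematic error shrinks the bias substantially.
print(np.abs(biased_forecast).max(), np.abs(corrected).max())
```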

  1. Modeling Systematic Change in Stopover Duration Does Not Improve Bias in Trends Estimated from Migration Counts

    PubMed Central

    Crewe, Tara L.; Taylor, Philip D.; Lepage, Denis

    2015-01-01

    The use of counts of unmarked migrating animals to monitor long-term population trends assumes independence of daily counts and a constant rate of detection. However, migratory stopovers often last days or weeks, violating the assumption of count independence. Further, a systematic change in stopover duration will result in a change in the probability of detecting individuals once, but also in the probability of detecting individuals on more than one sampling occasion. We tested how variation in stopover duration influenced the accuracy and precision of population trends by simulating migration count data with a known, constant rate of population change and by allowing the daily probability of survival (an index of stopover duration) to remain constant, or to vary randomly, cyclically, or increase linearly over time by various amounts. Using simulated datasets with a systematic increase in stopover duration, we also tested whether any resulting bias in population trend could be reduced by modeling the underlying source of variation in detection, or by subsampling data to every three or five days to reduce the incidence of recounting. Mean bias in population trend did not differ significantly from zero when stopover duration remained constant or varied randomly over time, but bias and the detection of false trends increased significantly with a systematic increase in stopover duration. Importantly, an increase in stopover duration over time resulted in a compounding effect on counts due to the increased probability of detection and of recounting on subsequent sampling occasions. Under this scenario, bias in population trend could not be modeled using a covariate for stopover duration alone. Rather, to improve inference drawn about long-term population change using counts of unmarked migrants, analyses must include a covariate for stopover duration, as well as incorporate sampling modifications (e.g., subsampling) to reduce the probability that individuals will be detected on
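    The compounding effect described above is easy to reproduce in a toy model: if each individual is counted on roughly d consecutive days, expected counts scale with d, so a drift in stopover duration appears as a spurious trend. A hedged sketch with invented numbers:

```python
import numpy as np

# Invented scenario: a constant population of 1000 birds, but stopover
# duration (mean days an individual is present, and hence recountable)
# drifts upward by 2% per year.
years = np.arange(20)
true_pop = np.full(years.shape, 1000.0)
stopover_days = 3.0 * (1.0 + 0.02 * years)

# Expected seasonal count totals scale with both population size and the
# number of days each individual is available to be counted.
expected_counts = true_pop * stopover_days

# A log-linear trend fit recovers a spurious positive "population" trend
# even though the population is constant.
slope = np.polyfit(years, np.log(expected_counts), 1)[0]
print(slope > 0.0)   # True: apparent growth driven entirely by stopover drift
```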

  2. Systematic coarse-graining of the wormlike chain model for dynamic simulations

    NASA Astrophysics Data System (ADS)

    Koslover, Elena; Spakowitz, Andrew

    2014-03-01

    One of the key goals of macromolecular modeling is to elucidate how macroscale physical properties arise from the microscale behavior of the polymer constituents. For many biological and industrial applications, a direct simulation approach is impractical due to the wide range of length and time scales that must be spanned by the model, necessitating physically sound and practically relevant procedures for coarse-graining polymer systems. We present a highly general systematic coarse-graining procedure that maps any detailed polymer model onto effective elastic-chain models at intermediate and large length scales, and we specifically focus on the wormlike chain model of semiflexible polymers. Our approach defines a continuous flow of coarse-grained models starting from the wormlike chain model, proceeding through an intermediate-scale stretchable, shearable wormlike chain, and finally resolving to a Gaussian chain at the longest lengths. Using Brownian dynamics simulations of our coarse-grained polymer, we show that this approach to coarse-graining the wormlike chain model captures analytical predictions for stress relaxation in a semiflexible polymer. Since we can coarse-grain the polymer arbitrarily in these dynamic simulations, our approach greatly accelerates simulations.
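    The Gaussian-chain endpoint of the coarse-graining flow can be sketched with standard polymer theory (not the authors' procedure): a wormlike chain of contour length L and persistence length lp maps onto N = L/b Kuhn segments of length b = 2 lp, matching the long-chain mean-squared end-to-end distance ⟨R²⟩ = 2 lp L.

```python
import math

# Standard wormlike-chain -> Gaussian-chain mapping (textbook polymer theory,
# not the authors' coarse-graining procedure). Lengths in nanometres.
def gaussian_chain_map(contour_length, persistence_length):
    b = 2.0 * persistence_length        # Kuhn length
    n_kuhn = contour_length / b         # number of Kuhn segments
    r2 = n_kuhn * b ** 2                # <R^2> = N b^2 = 2 lp L
    return b, n_kuhn, r2

# DNA-like parameters: lp = 50 nm, contour length L = 5000 nm.
b, n_kuhn, r2 = gaussian_chain_map(5000.0, 50.0)
print(b, n_kuhn, math.sqrt(r2))   # 100.0 50.0 ~707.1
```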

  3. Use of the Caulobacter crescentus Genome Sequence To Develop a Method for Systematic Genetic Mapping

    PubMed Central

    West, Lisandra; Yang, Desiree; Stephens, Craig

    2002-01-01

    The functional analysis of sequenced genomes will be facilitated by the development of tools for the rapid mapping of mutations. We have developed a systematic approach to genetic mapping in Caulobacter crescentus that is based on bacteriophage-mediated transduction of strategically placed antibiotic resistance markers. The genomic DNA sequence was used to identify sites distributed evenly around the chromosome at which plasmids could be nondisruptively integrated. DNA fragments from these sites were amplified by PCR and cloned into a kanamycin-resistant (Kanr) suicide vector. Delivery of these plasmids into C. crescentus resulted in integration via homologous recombination. A set of 41 strains containing Kanr markers at 100-kb intervals was thereby generated. These strains serve as donors for generalized transduction using bacteriophage φCr30, which can transduce at least 120 kb of DNA. Transductants are selected with kanamycin and screened for loss of the mutant phenotype to assess linkage between the marker and the site of the mutation. The dependence of cotransduction frequency on sequence distance was evaluated using several markers and mutant strains. With these data as a standard, previously unmapped mutations were readily localized to DNA sequence intervals equivalent to less than 1% of the genome. Candidate genes within the interval were then examined further by subcloning and complementation analysis. Mutations resulting in sensitivity to ampicillin, in nutritional auxotrophies, or temperature-sensitive growth were mapped. This approach to genetic mapping should be applicable to other bacteria with sequenced genomes for which generalized transducing phage are available. PMID:11914347
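    The abstract does not state which distance-frequency relation was used; the classical Wu (1966) formula, f = (1 - d/L)^3, is the standard choice for converting cotransduction frequency f to marker distance d given a transduced fragment length L (taken here as the ~120 kb quoted for φCr30). A sketch assuming that formula:

```python
# Classical Wu (1966) cotransduction relation, assumed here for illustration:
# f = (1 - d/L)^3 with marker distance d and transduced fragment length L
# (taken as 120 kb, the minimum fragment size quoted in the abstract).
L_KB = 120.0

def cotransduction_frequency(d_kb, l_kb=L_KB):
    """Expected fraction of transductants that also carry the second locus."""
    if d_kb >= l_kb:
        return 0.0
    return (1.0 - d_kb / l_kb) ** 3

def distance_from_frequency(f, l_kb=L_KB):
    """Invert the relation: estimate distance from an observed frequency."""
    return l_kb * (1.0 - f ** (1.0 / 3.0))

f = cotransduction_frequency(30.0)        # marker 30 kb from the mutation
print(round(f, 3), round(distance_from_frequency(f), 1))   # 0.422 30.0
```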

  4. Systematic approach to development of pressure sensors using dielectric electro-active polymer membranes

    NASA Astrophysics Data System (ADS)

    York, A.; Dunn, J.; Seelecke, S.

    2013-09-01

    Dielectric electro-active polymers (DEAPs) have become attractive materials for various actuation and sensing applications due to their high energy and power density, high efficiency, light weight, and fast response speed. However, commercial development has been hindered due to a variety of constraints such as reliability, non-linear behavior, cost of driving electronics, and form factor requirements. This paper presents the systematic development from laboratory concept to commercial readiness of a novel pressure sensing system using a DEAP membrane. The pressure sensing system was designed for in-line pressure measurements for low pressure applications such as health systems monitoring. A first generation sensor was designed, built and tested with a focus on the qualitative capabilities of EAP membranes as sensors. Experimental measurements were conducted that demonstrated the capability of the sensor to output a voltage signal proportional to a changing pressure. Several undesirable characteristics were observed during these initial tests such as strong hysteresis, non-linearity, very limited pressure range, and low fatigue life. A second generation prototype was then designed to remove or compensate for these undesirable characteristics. This prototype was then built and tested. The new design showed an almost complete removal of hysteretic non-linear effects and was capable of operating at 10× the pressure range of the initial generation. This new design is the framework for a novel DEAP-based pressure sensor ready for commercial applications.

  5. Systematic approach to developing empirical interatomic potentials for III-N semiconductors

    NASA Astrophysics Data System (ADS)

    Ito, Tomonori; Akiyama, Toru; Nakamura, Kohji

    2016-05-01

    A systematic approach to the derivation of empirical interatomic potentials is developed for III-N semiconductors with the aid of ab initio calculations. The parameter values of the empirical potential, which is based on a bond-order potential, are determined by reproducing the cohesive energy differences among the 3-fold coordinated hexagonal, 4-fold coordinated zinc blende, wurtzite, and 6-fold coordinated rocksalt structures of BN, AlN, GaN, and InN. The bond order p is successfully introduced as a function of the coordination number Z, in the form p = a·exp(−bZ^n) for Z ≤ 4 and p = (4/Z)^α for Z ≥ 4, in the empirical interatomic potential. Moreover, the energy difference between the wurtzite and zinc blende structures can be successfully evaluated by considering interactions beyond the second-nearest neighbors as a function of ionicity. This approach is feasible for developing empirical interatomic potentials applicable to systems containing poorly coordinated atoms at surfaces and interfaces, including nanostructures.
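The piecewise bond-order function above translates directly into code. The parameter values a, b, n, and α below are placeholders, not the fitted III-N values from the paper; note that since (4/Z)^α = 1 at Z = 4, continuity of the two branches requires a·exp(−b·4^n) = 1.

```python
import math

def bond_order(Z: float, a: float, b: float, n: float, alpha: float) -> float:
    """Bond order p as a function of coordination number Z, in the piecewise
    form from the abstract: p = a*exp(-b*Z**n) for Z <= 4 and p = (4/Z)**alpha
    for Z >= 4. (a, b, n, alpha) are material-specific fitted parameters;
    the values used below are placeholders, not the paper's fitted values."""
    if Z <= 4:
        return a * math.exp(-b * Z ** n)
    return (4.0 / Z) ** alpha

# Placeholder parameters chosen to satisfy continuity at Z = 4,
# i.e. a * exp(-b * 4**n) = 1:
a, b, n, alpha = math.exp(0.4), 0.1, 1.0, 2.0
print(round(bond_order(4.0, a, b, n, alpha), 6))  # → 1.0 (both branches agree)
print(bond_order(8.0, a, b, n, alpha))            # → 0.25
```

The decaying exponential captures the weakening of individual bonds as coordination grows toward 4, while the (4/Z)^α branch handles over-coordinated environments such as the 6-fold rocksalt structure.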

  6. A comprehensive model for executing knowledge management audits in organizations: a systematic review.

    PubMed

    Shahmoradi, Leila; Ahmadi, Maryam; Sadoughi, Farahnaz; Piri, Zakieh; Gohari, Mahmood Reza

    2015-01-01

    A knowledge management audit (KMA) is the first phase in knowledge management implementation. Incomplete or incomprehensive execution of the KMA has caused many knowledge management programs to fail. A study was undertaken to investigate how KMAs are performed systematically in organizations and present a comprehensive model for performing KMAs based on a systematic review. Studies were identified by searching electronic databases such as Emerald, LISA, and the Cochrane library and e-journals such as the Oxford Journal and hand searching of printed journals, theses, and books in the Tehran University of Medical Sciences digital library. The sources used in this study consisted of studies available through the digital library of the Tehran University of Medical Sciences that were published between 2000 and 2013, including both Persian- and English-language sources, as well as articles explaining the steps involved in performing a KMA. A comprehensive model for KMAs is presented in this study. To successfully execute a KMA, it is necessary to perform the appropriate preliminary activities in relation to the knowledge management infrastructure, determine the knowledge management situation, and analyze and use the available data on this situation. PMID:25627852

  7. Consequences of systematic model drift in DYNAMO MJO hindcasts with SP-CAM and CAM5

    NASA Astrophysics Data System (ADS)

    Hannah, Walter M.; Maloney, Eric D.; Pritchard, Michael S.

    2015-09-01

    Hindcast simulations of MJO events during the dynamics of the MJO (DYNAMO) field campaign are conducted with two models, one with conventional parameterization (CAM5) and a comparable model that utilizes superparameterization (SP-CAM). SP-CAM is shown to produce a qualitatively better reproduction of the fluctuations of precipitation and low-level zonal wind associated with the first two DYNAMO MJO events compared to CAM5. Interestingly, skill metrics using the real-time multivariate MJO index (RMM) suggest the opposite conclusion that CAM5 has more skill than SP-CAM. This inconsistency can be explained by a systematic increase of RMM amplitude with lead time, which results from a drift of the large-scale wind field in SP-CAM that projects strongly onto the RMM index. CAM5 hindcasts exhibit a contraction of the moisture distribution, in which extreme wet and dry conditions become less frequent with lead time. SP-CAM hindcasts better reproduce the observed moisture distribution, but also have stronger drift patterns of moisture budget terms, such as an increase in drying by meridional advection in SP-CAM. This advection tendency in SP-CAM appears to be associated with enhanced off-equatorial synoptic eddy activity with lead time. Systematic drift moisture tendencies in SP-CAM are of similar magnitude to intraseasonal moisture tendencies, and therefore are important for understanding MJO prediction skill.
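The RMM amplitude discussed above is, by the standard Wheeler-Hendon definition, the magnitude of the two-component index vector; the numeric values in this sketch are hypothetical, chosen only to illustrate how a systematic drift that projects onto the index inflates amplitude with lead time regardless of true MJO activity.

```python
import math

def rmm_amplitude(rmm1: float, rmm2: float) -> float:
    """MJO amplitude as the magnitude of the two-component
    Wheeler-Hendon real-time multivariate MJO (RMM) index vector."""
    return math.hypot(rmm1, rmm2)

# Hypothetical values: a wind-field drift that projects onto (RMM1, RMM2)
# inflates the apparent amplitude even when the true MJO state is unchanged.
true_state = (1.0, 0.5)   # hypothetical "true" MJO state
drift = (0.8, 0.3)        # hypothetical drift projection onto the index
print(round(rmm_amplitude(*true_state), 3))               # → 1.118
print(round(rmm_amplitude(true_state[0] + drift[0],
                          true_state[1] + drift[1]), 3))  # → 1.97
```

This is why an amplitude-based skill metric can reward a drifting model: the drift masquerades as a stronger MJO signal at longer lead times.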

  8. Provider payment in community-based health insurance schemes in developing countries: a systematic review

    PubMed Central

    Robyn, Paul Jacob; Sauerborn, Rainer; Bärnighausen, Till

    2013-01-01

    Objectives Community-based health insurance (CBI) is a common mechanism to generate financial resources for health care in developing countries. We review for the first time provider payment methods used in CBI in developing countries and their impact on CBI performance. Methods We conducted a systematic review of the literature on provider payment methods used by CBI in developing countries published up to January 2010. Results Information on provider payment was available for a total of 32 CBI schemes in 34 reviewed publications: 17 schemes in South Asia, 10 in sub-Saharan Africa, 4 in East Asia and 1 in Latin America. Various types of provider payment were applied by the CBI schemes: 17 used fee-for-service, 12 used salaries, 9 applied a coverage ceiling, 7 used capitation and 6 applied a co-insurance. The evidence suggests that provider payment impacts CBI performance through provider participation and support for CBI, population enrolment and patient satisfaction with CBI, quantity and quality of services provided and provider and patient retention. Lack of provider participation in designing and choosing a CBI payment method can lead to reduced provider support for the scheme. Conclusion CBI schemes in developing countries have used a wide range of provider payment methods. The existing evidence suggests that payment methods are a key determinant of CBI performance and sustainability, but the strength of this evidence is limited since it is largely based on observational studies rather than on trials or on quasi-experimental research. According to the evidence, provider payment can affect provider participation, satisfaction and retention in CBI; the quantity and quality of services provided to CBI patients; patient demand for CBI services; and population enrolment, risk pooling and financial sustainability of CBI. CBI schemes should carefully consider how their current payment methods influence their performance, how changes in the methods could improve

  9. Life course socio-economic position and quality of life in adulthood: a systematic review of life course models

    PubMed Central

    2012-01-01

    Background A relationship between current socio-economic position and subjective quality of life has been demonstrated, using wellbeing, life and needs satisfaction approaches. Less is known regarding the influence of different life course socio-economic trajectories on later quality of life. Several conceptual models have been proposed to help explain potential life course effects on health, including accumulation, latent, pathway and social mobility models. This systematic review aimed to assess whether evidence supported an overall relationship between life course socio-economic position and quality of life during adulthood and if so, whether there was support for one or more life course models. Methods A review protocol was developed detailing explicit inclusion and exclusion criteria, search terms, data extraction items and quality appraisal procedures. Literature searches were performed in 12 electronic databases during January 2012 and the references and citations of included articles were checked for additional relevant articles. Narrative synthesis was used to analyze extracted data and studies were categorized based on the life course model analyzed. Results Twelve studies met the eligibility criteria and used data from 10 datasets and five countries. Study quality varied and heterogeneity between studies was high. Seven studies assessed social mobility models, five assessed the latent model, two assessed the pathway model and three tested the accumulation model. Evidence indicated an overall relationship, but mixed results were found for each life course model. Some evidence was found to support the latent model among women, but not men. Social mobility models were supported in some studies, but overall evidence suggested little to no effect. Few studies addressed accumulation and pathway effects and study heterogeneity limited synthesis. Conclusions To improve potential for synthesis in this area, future research should aim to increase study

  10. Immortalized endothelial cell lines for in vitro blood-brain barrier models: A systematic review.

    PubMed

    Rahman, Nurul Adhwa; Rasil, Alifah Nur'ain Haji Mat; Meyding-Lamade, Uta; Craemer, Eva Maria; Diah, Suwarni; Tuah, Ani Afiqah; Muharram, Siti Hanna

    2016-07-01

    Endothelial cells play the most important role in construction of the blood-brain barrier. Many studies have opted to use commercially available, easily transfected or immortalized endothelial cell lines as in vitro blood-brain barrier models. Numerous endothelial cell lines are available, but we do not currently have strong evidence for which cell lines are optimal for establishment of such models. This review aimed to investigate the application of immortalized endothelial cell lines as in vitro blood-brain barrier models. The databases used for this review were PubMed, OVID MEDLINE, ProQuest, ScienceDirect, and SpringerLink. A narrative systematic review was conducted and identified 155 studies. As a result, 36 immortalized endothelial cell lines of human, mouse, rat, porcine and bovine origins were found for the establishment of in vitro blood-brain barrier and brain endothelium models. This review provides a summary of immortalized endothelial cell lines as a guideline for future studies and improvements in the establishment of in vitro blood-brain barrier models. It is important to establish a good and reproducible model that has the potential for multiple applications, in particular a model of a compartment as complex as the blood-brain barrier. PMID:27086967