Ji, Xiaonan; Yen, Po-Yin
2015-08-31
Systematic reviews and their implementation in practice provide high-quality evidence for clinical practice but are both time- and labor-intensive due to the large number of articles. Automatic text classification has proven to be instrumental in identifying relevant articles for systematic reviews. Existing approaches use machine learning model training to generate classification algorithms for the article screening process but have limitations. We applied a network approach to assist in the article screening process for systematic reviews using predetermined article relationships (similarity). The article similarity metric is calculated using the MEDLINE elements title (TI), abstract (AB), medical subject heading (MH), author (AU), and publication type (PT). We used an article network to illustrate the concept of article relationships. Using this concept, each article can be modeled as a node in the network and the relationship between two articles is modeled as an edge connecting them. The purpose of our study was to use these article relationships to facilitate an interactive article recommendation process. We used 15 completed systematic reviews produced by the Drug Effectiveness Review Project and demonstrated the use of article networks to assist article recommendation. We evaluated the predictive performance of MEDLINE elements and compared our approach with existing machine learning model training approaches. Performance was measured by work saved over sampling at 95% recall (WSS95) and the F-measure (F1). We also used repeated analysis of variance and Hommel's multiple comparison adjustment to provide statistical evidence. We found that although there is no significant difference across elements (except AU), TI and AB have better predictive capability in general. Collaborative elements bring performance improvements in both F1 and WSS95. With our approach, a simple combination of TI+AB+PT could achieve a WSS95 of 37%, which is competitive with traditional machine learning model training approaches (23%-41% WSS95). We demonstrated a new approach to assist in labor-intensive systematic reviews. The predictive ability of different elements (both single and composite) was explored. Without using model training approaches, we established a generalizable method that achieves competitive performance.
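For readers unfamiliar with the WSS95 metric, a minimal sketch follows; the helper names and the Jaccard similarity are illustrative assumptions, not the authors' implementation, which builds similarity from the listed MEDLINE elements.

    # Work saved over sampling at recall R (Cohen et al.): the fraction of articles a
    # reviewer is spared from screening, penalized by the allowed recall loss.
    def wss_at_recall(n_total, true_negatives, false_negatives, recall=0.95):
        return (true_negatives + false_negatives) / n_total - (1.0 - recall)

    # Illustrative article-to-article edge weight over one element (e.g., TI tokens),
    # using Jaccard overlap of token sets (an assumed stand-in for the paper's metric).
    def jaccard_similarity(tokens_a, tokens_b):
        a, b = set(tokens_a), set(tokens_b)
        return len(a & b) / len(a | b) if (a or b) else 0.0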
ERIC Educational Resources Information Center
Mahalik, James R.
1990-01-01
Presents and evaluates four systematic eclectic models of psychotherapy: Beutler's eclectic psychotherapy; Howard, Nance, and Myers' adaptive counseling and therapy; Lazarus' multimodal therapy; and Prochaska and DiClemente's transtheoretical approach. Examines support for these models and makes conceptual and empirical recommendations.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chetvertkov, Mikhail A., E-mail: chetvertkov@wayne
2016-10-15
Purpose: To develop standard (SPCA) and regularized (RPCA) principal component analysis models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients and assess their potential use in adaptive radiation therapy, and for extracting quantitative information for treatment response assessment. Methods: Planning CT (pCT) images of ten H&N patients were artificially deformed to create “digital phantom” images, which modeled systematic anatomical changes during radiation therapy. Artificial deformations closely mirrored patients’ actual deformations and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms) and between pCT and clinical CBCTs. Patient-specific SPCA and RPCA models were built from these synthetic and clinical DVF sets. EigenDVFs (EDVFs) having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Results: Principal component analysis (PCA) models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade PCA’s ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. Conclusions: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical change during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in a treatment course, or based on population models.
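As a rough illustration of the kind of PCA model described (not the authors' implementation), eigenDVFs can be extracted by stacking each fraction's flattened DVF into a row of a matrix and taking the SVD of its mean-centered form:

    import numpy as np

    def eigen_dvfs(dvf_stack, n_modes=3):
        """dvf_stack: (n_fractions, n_voxels * 3) array, one flattened DVF per fraction."""
        mean_dvf = dvf_stack.mean(axis=0)
        centered = dvf_stack - mean_dvf
        # Rows of vt are the principal deformation modes (eigenDVFs);
        # s**2 / (n - 1) are the corresponding eigenvalues.
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        return mean_dvf, vt[:n_modes], (s ** 2) / max(len(dvf_stack) - 1, 1)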
Systematic Applications of Metabolomics in Metabolic Engineering
Dromms, Robert A.; Styczynski, Mark P.
2012-01-01
The goals of metabolic engineering are well-served by the biological information provided by metabolomics: information on how the cell is currently using its biochemical resources is perhaps one of the best ways to inform strategies to engineer a cell to produce a target compound. Using the analysis of extracellular or intracellular levels of the target compound (or a few closely related molecules) to drive metabolic engineering is quite common. However, there is surprisingly little systematic use of metabolomics datasets, which simultaneously measure hundreds of metabolites rather than just a few, for that same purpose. Here, we review the most common systematic approaches to integrating metabolite data with metabolic engineering, with emphasis on existing efforts to use whole-metabolome datasets. We then review some of the most common approaches for computational modeling of cell-wide metabolism, including constraint-based models, and discuss current computational approaches that explicitly use metabolomics data. We conclude with discussion of the broader potential of computational approaches that systematically use metabolomics data to drive metabolic engineering. PMID:24957776
Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R
2015-06-17
In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic and this study aims to identify and compare the decision frameworks proposed to date on this topic area. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focusing on a different set of modelling approaches and employing a different collection of selection criteria. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally-accepted framework for selecting an economic modelling approach. Rather, each highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications on its efficiency, transparency and relevance to decision-makers.
Schwander, Bjoern; Hiligsmann, Mickaël; Nuijten, Mark; Evers, Silvia
2016-10-01
Given the increasing clinical and economic burden of obesity, it is of major importance to identify cost-effective approaches for obesity management. Areas covered: This study aims to systematically review and compile an overview of published decision models for health economic assessments (HEA) in obesity, in order to summarize and compare their key characteristics as well as to identify, inform and guide future research. Of the 4,293 abstracts identified, 87 papers met our inclusion criteria. A wide range of different methodological approaches have been identified. Of the 87 papers, 69 (79%) applied unique/distinctive modelling approaches. Expert commentary: This wide range of approaches suggests the need to develop recommendations/minimal requirements for model-based HEA of obesity. In order to reach this long-term goal, further research is required. Valuable future research steps would be to investigate the predictiveness, validity and quality of the identified modelling approaches.
Aiassa, E; Higgins, J P T; Frampton, G K; Greiner, M; Afonso, A; Amzal, B; Deeks, J; Dorne, J-L; Glanville, J; Lövei, G L; Nienstedt, K; O'connor, A M; Pullin, A S; Rajić, A; Verloo, D
2015-01-01
Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare, and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimize biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and use of meta-analysis to estimate all parameters required by a risk model may not be always feasible. This paper describes novel approaches for determining question suitability and for prioritizing questions for systematic review in this area. Risk assessment questions that aim to estimate a parameter are likely to be suitable for systematic review. Such questions can be structured by their "key elements" [e.g., for intervention questions, the population(s), intervention(s), comparator(s), and outcome(s)]. Prioritization of questions to be addressed by systematic review relies on the likely impact and related uncertainty of individual parameters in the risk model. This approach to planning and prioritizing systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.
Systematic Error Modeling and Bias Estimation
Zhang, Feihu; Knoll, Alois
2016-01-01
This paper analyzes the statistical properties of the systematic error in terms of range and bearing during the transformation process. Furthermore, we rely on a weighted nonlinear least squares method to calculate the biases based on the proposed models. The results show the high performance of the proposed approach for error modeling and bias estimation. PMID:27213386
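A generic sketch of the weighted nonlinear least-squares step (the residual form and weights are assumptions for illustration; the paper's specific range/bearing error models are not reproduced):

    import numpy as np
    from scipy.optimize import least_squares

    def estimate_bias(residual_fn, bias0, sigmas):
        """residual_fn(bias) -> raw residuals; sigmas are per-residual standard deviations."""
        weighted = lambda bias: residual_fn(bias) / np.asarray(sigmas)
        return least_squares(weighted, bias0).x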
Toward a systematic exploration of nano-bio interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, Xue; Liu, Fang; Liu, Yin
Many studies of nanomaterials make non-systematic alterations of nanoparticle physicochemical properties. Given the immense size of the property space for nanomaterials, such approaches are not very useful in elucidating fundamental relationships between inherent physicochemical properties of these materials and their interactions with, and effects on, biological systems. Data-driven artificial intelligence methods such as machine learning algorithms have proven highly effective in generating models with good predictivity and some degree of interpretability. They can provide a viable method of reducing or eliminating animal testing. However, careful experimental design with the modelling of the results in mind is a proven and efficient way of exploring large materials spaces. This approach, coupled with high speed automated experimental synthesis and characterization technologies now appearing, is the fastest route to developing models that regulatory bodies may find useful. We advocate greatly increased focus on systematic modification of physicochemical properties of nanoparticles combined with comprehensive biological evaluation and computational analysis. This is essential to obtain better mechanistic understanding of nano-bio interactions, and to derive quantitatively predictive and robust models for the properties of nanomaterials that have useful domains of applicability. - Highlights: • Nanomaterials studies make non-systematic alterations to nanoparticle properties. • Vast nanomaterials property spaces require systematic studies of nano-bio interactions. • Experimental design and modelling are efficient ways of exploring materials spaces. • We advocate systematic modification and computational analysis to probe nano-bio interactions.
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
A Systematic Approach to Sensor Selection for Aircraft Engine Health Estimation
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay
2009-01-01
A systematic approach for selecting an optimal suite of sensors for on-board aircraft gas turbine engine health estimation is presented. The methodology optimally chooses the engine sensor suite and the model tuning parameter vector to minimize the Kalman filter mean squared estimation error in the engine's health parameters or other unmeasured engine outputs. This technique specifically addresses the underdetermined estimation problem where there are more unknown system health parameters representing degradation than available sensor measurements. This paper presents the theoretical estimation error equations, and describes the optimization approach that is applied to select the sensors and model tuning parameters to minimize these errors. Two different model tuning parameter vector selection approaches are evaluated: the conventional approach of selecting a subset of health parameters to serve as the tuning parameters, and an alternative approach that selects tuning parameters as a linear combination of all health parameters. Results from the application of the technique to an aircraft engine simulation are presented, and compared to those from an alternative sensor selection strategy.
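Sketched below, under simplifying assumptions (linear time-invariant model, steady-state Kalman filter, exhaustive search over equal-sized sensor subsets), is one way to realize this kind of selection; it is illustrative rather than the NASA implementation:

    import itertools
    import numpy as np
    from scipy.linalg import solve_discrete_are

    def error_trace(A, C_full, Q, R_full, subset):
        """Trace of the steady-state estimation error covariance for one sensor subset."""
        idx = list(subset)
        C, R = C_full[idx, :], R_full[np.ix_(idx, idx)]
        # Filter Riccati equation obtained via duality with the control ARE (A.T, C.T).
        P = solve_discrete_are(A.T, C.T, Q, R)
        return np.trace(P)

    def best_sensor_suite(A, C_full, Q, R_full, n_sensors):
        subsets = itertools.combinations(range(C_full.shape[0]), n_sensors)
        return min(subsets, key=lambda s: error_trace(A, C_full, Q, R_full, s))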
How Do We Model Learning at Scale? A Systematic Review of Research on MOOCs
ERIC Educational Resources Information Center
Joksimovic, Srecko; Poquet, Oleksandra; Kovanovic, Vitomir; Dowell, Nia; Mills, Caitlin; Gaševic, Dragan; Dawson, Shane; Graesser, Arthur C.; Brooks, Christopher
2018-01-01
Despite a surge of empirical work on student participation in online learning environments, the causal links between the learning-related factors and processes with the desired learning outcomes remain unexplored. This study presents a systematic literature review of approaches to model learning in Massive Open Online Courses offering an analysis…
NASA Astrophysics Data System (ADS)
Del Giudice, Dario; Löwe, Roland; Madsen, Henrik; Mikkelsen, Peter Steen; Rieckermann, Jörg
2015-07-01
In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two approaches which use stochastic processes to describe systematic deviations and to discuss their advantages and drawbacks for urban drainage modeling. The two methodologies are an external bias description (EBD) and an internal noise description (IND, also known as stochastic gray-box modeling). They emerge from different fields and have not yet been compared in environmental modeling. To compare the two approaches, we develop a unifying terminology, evaluate them theoretically, and apply them to conceptual rainfall-runoff modeling in the same drainage system. Our results show that both approaches can provide probabilistic predictions of wastewater discharge in a similarly reliable way, both for periods ranging from a few hours up to more than 1 week ahead of time. The EBD produces more accurate predictions on long horizons but relies on computationally heavy MCMC routines for parameter inferences. These properties make it more suitable for off-line applications. The IND can help in diagnosing the causes of output errors and is computationally inexpensive. It produces best results on short forecast horizons that are typical for online applications.
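A minimal sketch of the external bias description idea (observed discharge = deterministic model output + autocorrelated bias + observation noise), with an AR(1) process standing in for the stochastic bias; all parameter names are illustrative assumptions:

    import numpy as np

    def ebd_predictive_sample(model_output, phi=0.9, sigma_bias=0.1, sigma_obs=0.05, rng=None):
        """Draw one predictive realization by adding an AR(1) bias and white noise to the hydrograph."""
        rng = rng or np.random.default_rng()
        bias = np.zeros_like(model_output)
        for t in range(1, len(model_output)):
            bias[t] = phi * bias[t - 1] + rng.normal(0.0, sigma_bias)
        return model_output + bias + rng.normal(0.0, sigma_obs, size=model_output.shape)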
NASA Astrophysics Data System (ADS)
Chetvertkov, Mikhail A.
Purpose: To develop standard and regularized principal component analysis (PCA) models of anatomical changes from daily cone beam CTs (CBCTs) of head and neck (H&N) patients, assess their potential use in adaptive radiation therapy (ART), and to extract quantitative information for treatment response assessment. Methods: Planning CT (pCT) images of H&N patients were artificially deformed to create "digital phantom" images, which modeled systematic anatomical changes during Radiation Therapy (RT). Artificial deformations closely mirrored patients' actual deformations, and were interpolated to generate 35 synthetic CBCTs, representing evolving anatomy over 35 fractions. Deformation vector fields (DVFs) were acquired between pCT and synthetic CBCTs (i.e., digital phantoms), and between pCT and clinical CBCTs. Patient-specific standard PCA (SPCA) and regularized PCA (RPCA) models were built from these synthetic and clinical DVF sets. Eigenvectors, or eigenDVFs (EDVFs), having the largest eigenvalues were hypothesized to capture the major anatomical deformations during treatment. Modeled anatomies were used to assess the dose deviations with respect to the planned dose distribution. Results: PCA models achieve variable results, depending on the size and location of anatomical change. Random changes prevent or degrade SPCA's ability to detect underlying systematic change. RPCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes, and is therefore more successful than SPCA at capturing systematic changes early in treatment. SPCA models were less successful at modeling systematic changes in clinical patient images, which contain a wider range of random motion than synthetic CBCTs, while the regularized approach was able to extract major modes of motion. For dose assessment it has been shown that the modeled dose distribution was different from the planned dose for the parotid glands due to their shrinkage and shift into the higher dose volumes during the radiotherapy course. Modeled DVHs still underestimated the effect of parotid shrinkage due to the large compression factor (CF) used to acquire DVFs. Conclusion: Leading EDVFs from both PCA approaches have the potential to capture systematic anatomical changes during H&N radiotherapy when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the RPCA approach appears to be more reliable than SPCA at capturing systematic changes, enabling dosimetric consequences to be projected to the future treatment fractions based on trends established early in a treatment course, or, potentially, based on population models. This work showed that PCA has a potential in identifying the major mode of anatomical changes during the radiotherapy course and subsequent use of this information in future dose predictions is feasible. Use of smaller CF values for DVFs is preferred, otherwise anatomical motion will be underestimated.
Systematic Assessment for University Sexuality Programming.
ERIC Educational Resources Information Center
Westefeld, John S.; Winkelpleck, Judy M.
1982-01-01
Suggests systematic empirical assessment is needed to plan university sexuality programing. Proposes the traditional approach of asking about students' attitudes, knowledge, and behavior is useful for developing specific programing content. Presents an assessment model emphasizing assessment of students' desires for sexuality programing in terms…
Tuning a climate model using nudging to reanalysis.
NASA Astrophysics Data System (ADS)
Cheedela, S. K.; Mapes, B. E.
2014-12-01
Tuning an atmospheric general circulation model involves the daunting task of adjusting non-observable parameters to adjust the mean climate. These parameters arise from the necessity to describe unresolved flow through parametrizations. Tuning a climate model is often done with a certain set of priorities, such as global mean temperature and net top-of-the-atmosphere radiation. These priorities are hard enough to reach, let alone reducing systematic biases in the models. The goal of the current study is to explore alternative ways to tune a climate model to reduce some systematic biases, to be used in synergy with existing efforts. Nudging a climate model to a known state is a poor man's inverse of the tuning process described above. Our approach involves nudging the atmospheric model to state-of-the-art reanalysis fields, thereby providing a balanced state with respect to the global mean temperature and winds. The tendencies derived from nudging are the negative of the errors from the physical parametrizations, as the errors from the dynamical core would be small. Patterns of nudging are compared to the patterns of the different physical parametrizations to decipher the cause of certain biases in relation to tuning parameters. This approach might also help in understanding certain compensating errors that arise from the tuning process. ECHAM6 is a comprehensive general circulation model, also used in the recent Coupled Model Intercomparison Project (CMIP5). The approach used to tune it and the effect of certain parameters that affect its mean climate are clearly reported, hence it serves as a benchmark for our approach. Our planned experiments include nudging the ECHAM6 atmospheric model to the European Centre reanalysis (ERA-Interim) and the reanalysis from the National Centers for Environmental Prediction (NCEP), and deciphering the choice of certain parameters that lead to systematic biases in its simulations. Of particular interest is reducing long-standing biases related to the simulation of the Asian summer monsoon.
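In its simplest form, the Newtonian relaxation ("nudging") term added to a prognostic variable X can be written as (notation assumed for illustration; tau is the relaxation time scale):

    \frac{\partial X}{\partial t} = M(X) + \frac{X_{\mathrm{reanalysis}} - X}{\tau}

where M(X) denotes the model's own tendency; the accumulated nudging increments then approximate the negative of the parametrization errors discussed above.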
ERIC Educational Resources Information Center
Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca
2018-01-01
A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…
A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.
Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C
2017-07-01
Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chetvertkov, M; Henry Ford Health System, Detroit, MI; Siddiqui, F
2016-06-15
Purpose: To use daily cone beam CTs (CBCTs) to develop regularized principal component analysis (PCA) models of anatomical changes in head and neck (H&N) patients, to guide replanning decisions in adaptive radiation therapy (ART). Methods: Known deformations were applied to planning CT (pCT) images of 10 H&N patients to model several different systematic anatomical changes. A Pinnacle plugin was used to interpolate systematic changes over 35 fractions, generating a set of 35 synthetic CTs for each patient. Deformation vector fields (DVFs) were acquired between the pCT and synthetic CTs and random fraction-to-fraction changes were superimposed on the DVFs. Standard non-regularized and regularized patient-specific PCA models were built using the DVFs. The ability of PCA to extract the known deformations was quantified. PCA models were also generated from clinical CBCTs, for which the deformations and DVFs were not known. It was hypothesized that the resulting eigenvectors/eigenfunctions with the largest eigenvalues represent the major anatomical deformations during the course of treatment. Results: As demonstrated with quantitative results in the supporting document, regularized PCA is more successful than standard PCA at capturing systematic changes early in the treatment. Regularized PCA is able to detect smaller systematic changes against the background of random fraction-to-fraction changes. To be successful at guiding ART, regularized PCA should be coupled with models of when anatomical changes occur: early, late or throughout the treatment course. Conclusion: The leading eigenvector/eigenfunction from both PCA approaches can tentatively be identified as a major systematic change during the radiotherapy course when systematic changes are large enough with respect to random fraction-to-fraction changes. In all cases the regularized PCA approach appears to be more reliable at capturing systematic changes, enabling dosimetric consequences to be projected once trends are established early in the treatment course. This work is supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J
2017-08-04
There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into the unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Petrou, Stavros; Kwon, Joseph; Madan, Jason
2018-05-10
Economic analysts are increasingly likely to rely on systematic reviews and meta-analyses of health state utility values to inform the parameter inputs of decision-analytic modelling-based economic evaluations. Beyond the context of economic evaluation, evidence from systematic reviews and meta-analyses of health state utility values can be used to inform broader health policy decisions. This paper provides practical guidance on how to conduct a systematic review and meta-analysis of health state utility values. The paper outlines a number of stages in conducting a systematic review, including identifying the appropriate evidence, study selection, data extraction and presentation, and quality and relevance assessment. The paper outlines three broad approaches that can be used to synthesise multiple estimates of health utilities for a given health state or condition, namely fixed-effect meta-analysis, random-effects meta-analysis and mixed-effects meta-regression. Each approach is illustrated by a synthesis of utility values for a hypothetical decision problem, and software code is provided. The paper highlights a number of methodological issues pertinent to the conduct of meta-analysis or meta-regression. These include the importance of limiting synthesis to 'comparable' utility estimates, for example those derived using common utility measurement approaches and sources of valuation; the effects of reliance on limited or poorly reported published data from primary utility assessment studies; the use of aggregate outcomes within analyses; approaches to generating measures of uncertainty; handling of median utility values; challenges surrounding the disentanglement of utility estimates collected serially within the context of prospective observational studies or prospective randomised trials; challenges surrounding the disentanglement of intervention effects; and approaches to measuring model validity. Areas of methodological debate and avenues for future research are highlighted.
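The synthesis approaches mentioned can be illustrated with a compact DerSimonian-Laird random-effects pooling of comparable study-level utility estimates (illustrative only; function and variable names are assumptions, and the fixed-effect and meta-regression variants are omitted):

    import numpy as np

    def random_effects_pool(utilities, variances):
        """DerSimonian-Laird pooling of mean utilities with within-study variances."""
        y, v = np.asarray(utilities, dtype=float), np.asarray(variances, dtype=float)
        w = 1.0 / v
        fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - fixed) ** 2)
        tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_star = 1.0 / (v + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        return pooled, tau2, np.sqrt(1.0 / np.sum(w_star))  # pooled mean, heterogeneity, SE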
A systematic approach to engineering ethics education.
Li, Jessica; Fu, Shengli
2012-06-01
Engineering ethics education is a complex field characterized by dynamic topics and diverse students, which results in significant challenges for engineering ethics educators. The purpose of this paper is to introduce a systematic approach to determine what to teach and how to teach in an ethics curriculum. This is a topic that has not been adequately addressed in the engineering ethics literature. This systematic approach provides a method to: (1) develop a context-specific engineering ethics curriculum using the Delphi technique, a process-driven research method; and (2) identify appropriate delivery strategies and instructional strategies using an instructional design model. This approach considers the context-specific needs of different engineering disciplines in ethics education and leverages the collaboration of engineering professors, practicing engineers, engineering graduate students, ethics scholars, and instructional design experts. The proposed approach is most suitable for a department, a discipline/field or a professional society. The approach helps to enhance learning outcomes and to facilitate ethics education curriculum development as part of the regular engineering curriculum.
Evaluating clinical librarian services: a systematic review.
Brettle, Alison; Maden-Jenkins, Michelle; Anderson, Lucy; McNally, Rosalind; Pratchett, Tracey; Tancock, Jenny; Thornton, Debra; Webb, Anne
2011-03-01
Previous systematic reviews have indicated limited evidence and poor quality evaluations of clinical librarian (CL) services. Rigorous evaluations should demonstrate the value of CL services, but guidance is needed before this can be achieved. To undertake a systematic review which examines models of CL services, quality, methods and perspectives of clinical librarian service evaluations. Systematic review methodology and synthesis of evidence, undertaken collaboratively by a group of 8 librarians to develop research and critical appraisal skills. There are four clear models of clinical library service provision. Clinical librarians are effective in saving health professionals time, providing relevant, useful information and high quality services. Clinical librarians have a positive effect on clinical decision making by contributing to better informed decisions, diagnosis and choice of drug or therapy. The quality of CL studies is improving, but more work is needed on reducing bias and providing evidence of specific impacts on patient care. The Critical Incident Technique as part of a mixed method approach appears to offer a useful approach to demonstrating impact. This systematic review provides practical guidance regarding the evaluation of CL services. It also provides updated evidence regarding the effectiveness and impact of CL services. The approach used was successful in developing research and critical appraisal skills in a group of librarians. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.
Mahomed, Ozayr Haroon; Asmall, Shaidah; Freeman, Melvyn
2014-11-01
The integrated chronic disease management model provides a systematic framework for creating a fundamental change in the orientation of the health system. This model adopts a diagonal approach to health system strengthening by establishing a service-linked base to training, supervision, and the opportunity to try out, assess, and implement integrated interventions.
MUSiC—An Automated Scan for Deviations between Data and Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Meyer, Arnd
2010-02-01
A model independent analysis approach is presented, systematically scanning the data for deviations from the standard model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of event generators. The approach is sensitive to a variety of models of new physics, including those not yet thought of.
Lee, Da-Sheng
2010-01-01
Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.
Modeling motivated misreports to sensitive survey questions.
Böckenholt, Ulf
2014-07-01
Asking sensitive or personal questions in surveys or experimental studies can both lower response rates and increase item non-response and misreports. Although non-response is easily diagnosed, misreports are not. However, misreports cannot be ignored because they give rise to systematic bias. The purpose of this paper is to present a modeling approach that identifies misreports and corrects for them. Misreports are conceptualized as a motivated process under which respondents edit their answers before they report them. For example, systematic bias introduced by overreports of socially desirable behaviors or underreports of less socially desirable ones can be modeled, leading to more-valid inferences. The proposed approach is applied to a large-scale experimental study and shows that respondents who feel powerful tend to overclaim their knowledge.
Surman, Rebecca; Mumpower, Matthew; McLaughlin, Gail
2017-02-27
Unknown nuclear masses are a major source of nuclear physics uncertainty for r-process nucleosynthesis calculations. Here we examine the systematic and statistical uncertainties that arise in r-process abundance predictions due to uncertainties in the masses of nuclear species on the neutron-rich side of stability. There is a long history of examining systematic uncertainties by the application of a variety of different mass models to r-process calculations. Here we expand upon such efforts by examining six DFT mass models, where we capture the full impact of each mass model by updating the other nuclear properties — including neutron capture rates, β-decay lifetimes, and β-delayed neutron emission probabilities — that depend on the masses. Unlike systematic effects, statistical uncertainties in the r-process pattern have just begun to be explored. Here we apply a global Monte Carlo approach, starting from the latest FRDM masses and considering random mass variations within the FRDM rms error. In each approach we find that uncertain nuclear masses produce dramatic uncertainties in calculated r-process yields, which can be reduced in upcoming experimental campaigns.
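The global Monte Carlo step can be pictured as repeatedly perturbing every neutron-rich mass within the FRDM rms error and propagating each realization through the abundance calculation; run_r_process below is a placeholder for the (not shown) network calculation with consistently updated rates:

    import numpy as np

    def monte_carlo_abundances(frdm_masses, rms_error_mev, run_r_process, n_samples=50, rng=None):
        """Return one abundance pattern per random mass realization."""
        rng = rng or np.random.default_rng()
        patterns = []
        for _ in range(n_samples):
            perturbed = {nuc: m + rng.normal(0.0, rms_error_mev) for nuc, m in frdm_masses.items()}
            patterns.append(run_r_process(perturbed))  # placeholder for the full reaction network
        return patterns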
MCD Process Model: A Systematic Approach to Curriculum Development in Black Studies.
ERIC Educational Resources Information Center
Miller, Howard J.
1986-01-01
Holds that Black Studies programs have had problems surviving because of (1) resistance to curriculum change in colleges and universities, (2) their lack of supporters in positions of administrative power, and (3) lack of an organized, conceptual approach to developing and implementing a Black Studies curriculum. Presents a model designed to…
NASA Astrophysics Data System (ADS)
Kaiser, Olga; Martius, Olivia; Horenko, Illia
2017-04-01
Regression-based Generalized Pareto Distribution (GPD) models are often used to describe the dynamics of hydrological threshold excesses, relying on the explicit availability of all of the relevant covariates. But in real applications the complete set of relevant covariates might not be available. In this context, it was shown that under weak assumptions the influence coming from systematically missing covariates can be reflected by nonstationary and nonhomogeneous dynamics. We present a data-driven, semiparametric and adaptive approach for spatio-temporal regression-based clustering of threshold excesses in the presence of systematically missing covariates. The nonstationary and nonhomogeneous behavior of threshold excesses is described by a set of local stationary GPD models, where the parameters are expressed as regression models, and a non-parametric spatio-temporal hidden switching process. Exploiting the nonparametric Finite Element time-series analysis Methodology (FEM) with Bounded Variation of the model parameters (BV) for resolving the spatio-temporal switching process, the approach goes beyond the strong a priori assumptions made in standard latent class models such as Mixture Models and Hidden Markov Models. Additionally, the presented FEM-BV-GPD approach provides a pragmatic description of the corresponding spatial dependence structure by grouping together all locations that exhibit similar behavior of the switching process. The performance of the framework is demonstrated on daily accumulated precipitation series over 17 different locations in Switzerland from 1981 to 2013, showing that the introduced approach allows for a better description of the historical data.
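For a single local model, the regression-based GPD component can be sketched as a scale parameter depending log-linearly on covariates and fitted by maximum likelihood; the FEM-BV switching machinery is omitted and all names below are illustrative:

    import numpy as np
    from scipy.optimize import minimize

    def gpd_neg_loglik(params, excesses, covariates):
        """GPD with shape xi and scale exp(b0 + covariates @ b); excesses are amounts above the threshold."""
        xi, b0, *b = params
        scale = np.exp(b0 + covariates @ np.asarray(b))
        z = 1.0 + xi * excesses / scale
        if np.any(z <= 0):
            return np.inf
        return np.sum(np.log(scale) + (1.0 + 1.0 / xi) * np.log(z))

    def fit_gpd_regression(excesses, covariates):
        x0 = np.concatenate(([0.1, np.log(excesses.mean())], np.zeros(covariates.shape[1])))
        return minimize(gpd_neg_loglik, x0, args=(excesses, covariates), method="Nelder-Mead").x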
Remodeling of legacy systems in health care using UML.
Garde, Sebastian; Knaup, Petra; Herold, Ralf
2002-01-01
Research projects in the field of Medical Informatics often involve the development of application systems. Usually these are developed over a long period of time, so that at some point a systematically planned reimplementation becomes necessary. The first step of reimplementation should be a systematic and comprehensive remodeling. When using UML for this task, a systematic approach to remodeling activities is missing. Therefore, we developed a method for remodeling of legacy systems (Qumquad) and applied it to DOSPO, a documentation and therapy planning system for pediatric oncology. Qumquad helps to systematically carry out three steps: modeling the current actual state of the application system, systematically identifying weak points, and developing a target concept for reimplementation that considers the identified weak points. Results show that this approach is valuable and feasible and could be applied to various application systems in health care.
Quality metrics in high-dimensional data visualization: an overview and systematization.
Bertini, Enrico; Tatu, Andrada; Keim, Daniel
2011-12-01
In this paper, we present a systematization of techniques that use quality metrics to help in the visual exploration of meaningful patterns in high-dimensional data. In a number of recent papers, different quality metrics are proposed to automate the demanding search through large spaces of alternative visualizations (e.g., alternative projections or ordering), allowing the user to concentrate on the most promising visualizations suggested by the quality metrics. Over the last decade, this approach has witnessed a remarkable development but few reflections exist on how these methods are related to each other and how the approach can be developed further. For this purpose, we provide an overview of approaches that use quality metrics in high-dimensional data visualization and propose a systematization based on a thorough literature review. We carefully analyze the papers and derive a set of factors for discriminating the quality metrics, visualization techniques, and the process itself. The process is described through a reworked version of the well-known information visualization pipeline. We demonstrate the usefulness of our model by applying it to several existing approaches that use quality metrics, and we provide reflections on implications of our model for future research. © 2010 IEEE
Motion planning for an adaptive wing structure with macro-fiber composite actuators
NASA Astrophysics Data System (ADS)
Schröck, J.; Meurer, T.; Kugi, A.
2009-05-01
A systematic approach for flatness-based motion planning and feedforward control is presented for the transient shaping of a piezo-actuated rectangular cantilevered plate modeling an adaptive wing. In the first step, the consideration of an idealized infinite-dimensional input makes it possible to determine the state and input parametrization in terms of a flat or basic output, which is used for a systematic motion planning approach. Subsequently, the obtained idealized input function is projected onto a finite number of suitably placed Macro-fiber Composite (MFC) patch actuators. The tracking performance of the proposed approach is evaluated in a simulation scenario.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-04-01
To identify the key methodological challenges for public health economic modelling and set an agenda for future research. An iterative literature search identified papers describing methodological challenges for developing the structure of public health economic models. Additional multidisciplinary literature searches helped expand upon important ideas raised within the review. Fifteen articles were identified within the formal literature search, highlighting three key challenges: inclusion of non-healthcare costs and outcomes; inclusion of equity; and modelling complex systems and multi-component interventions. Based upon these and multidisciplinary searches about dynamic complexity, the social determinants of health, and models of human behaviour, six areas for future research were specified. Future research should focus on: the use of systems approaches within health economic modelling; approaches to assist the systematic consideration of the social determinants of health; methods for incorporating models of behaviour and social interactions; consideration of equity; and methodology to help modellers develop valid, credible and transparent public health economic model structures.
Lotfi, Tamara; Bou-Karroum, Lama; Darzi, Andrea; Hajjar, Rayan; El Rahyel, Ahmed; El Eid, Jamale; Itani, Mira; Brax, Hneine; Akik, Chaza; Osman, Mona; Hassan, Ghayda; El-Jardali, Fadi; Akl, Elie
2016-08-03
Our objective was to identify published models of coordination between entities funding or delivering health services in humanitarian crises, whether the coordination took place during or after the crises. We included reports describing models of coordination in sufficient detail to allow reproducibility. We also included reports describing implementation of identified models, as case studies. We searched Medline, PubMed, EMBASE, Cochrane Central Register of Controlled Trials, CINAHL, PsycINFO, and the WHO Global Health Library. We also searched websites of relevant organizations. We followed standard systematic review methodology. Our search captured 14,309 citations. The screening process identified 34 eligible papers describing five models of coordination of delivering health services: the "Cluster Approach" (with 16 case studies), the 4Ws "Who is Where, When, doing What" mapping tool (with four case studies), the "Sphere Project" (with two case studies), the "5x5" model (with one case study), and the "model of information coordination" (with one case study). The 4Ws and the 5x5 focus on coordination of services for mental health; the remaining models do not focus on a specific health topic. The Cluster Approach appears to be the most widely used. One case study was a mixed implementation of the Cluster Approach and the Sphere model. We identified no model of coordination for the funding of health services. This systematic review identified five proposed coordination models that have been implemented by entities funding or delivering health services in humanitarian crises. There is a need to compare the effect of these different models on outcomes such as availability of and access to health services.
A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales
Ayton, Gary S.; Voth, Gregory A.
2009-01-01
A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167
Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian
2017-08-01
Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in health, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how to best achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with increasing complexity of review methods; difficulty keeping authors on board and on track; and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps; breaking reviews into smaller "chunks", and improving approaches to building capacity of and sharing information between authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.
Collaborative Modeling: Experience of the U.S. Preventive Services Task Force.
Petitti, Diana B; Lin, Jennifer S; Owens, Douglas K; Croswell, Jennifer M; Feuer, Eric J
2018-01-01
Models can be valuable tools to address uncertainty, trade-offs, and preferences when trying to understand the effects of interventions. Availability of results from two or more independently developed models that examine the same question (comparative modeling) allows systematic exploration of differences between models and the effect of these differences on model findings. Guideline groups sometimes commission comparative modeling to support their recommendation process. In this commissioned collaborative modeling, modelers work with the people who are developing a recommendation or policy not only to define the questions to be addressed but ideally, work side-by-side with each other and with systematic reviewers to standardize selected inputs and incorporate selected common assumptions. This paper describes the use of commissioned collaborative modeling by the U.S. Preventive Services Task Force (USPSTF), highlighting the general challenges and opportunities encountered and specific challenges for some topics. It delineates other approaches to use modeling to support evidence-based recommendations and the many strengths of collaborative modeling compared with other approaches. Unlike systematic reviews prepared for the USPSTF, the commissioned collaborative modeling reports used by the USPSTF in making recommendations about screening have not been required to follow a common format, sometimes making it challenging to understand key model features. This paper presents a checklist developed to critically appraise commissioned collaborative modeling reports about cancer screening topics prepared for the USPSTF. Copyright © 2017 American Journal of Preventive Medicine. All rights reserved.
Marsac, Meghan L.; Winston, Flaura K.; Hildenbrand, Aimee K.; Kohser, Kristen L.; March, Sonja; Kenardy, Justin; Kassam-Adams, Nancy
2015-01-01
Background Millions of children are affected by acute medical events annually, creating need for resources to promote recovery. While web-based interventions promise wide reach and low cost for users, development can be time- and cost-intensive. A systematic approach to intervention development can help to minimize costs and increase likelihood of effectiveness. Using a systematic approach, our team integrated evidence on the etiology of traumatic stress, an explicit program theory, and a user-centered design process to intervention development. Objective To describe evidence and the program theory model applied to the Coping Coach intervention and present pilot data evaluating intervention feasibility and acceptability. Method Informed by empirical evidence on traumatic stress prevention, an overarching program theory model was articulated to delineate pathways from a) specific intervention content to b) program targets and proximal outcomes to c) key longer-term health outcomes. Systematic user-testing with children ages 8–12 (N = 42) exposed to an acute medical event and their parents was conducted throughout intervention development. Results Functionality challenges in early prototypes necessitated revisions. Child engagement was positive throughout revisions to the Coping Coach intervention. Final pilot-testing demonstrated promising feasibility and high user-engagement and satisfaction. Conclusion Applying a systematic approach to the development of Coping Coach led to the creation of a functional intervention that is accepted by children and parents. Development of new e-health interventions may benefit from a similar approach. Future research should evaluate the efficacy of Coping Coach in achieving targeted outcomes of reduced trauma symptoms and improved health-related quality of life. PMID:25844276
Reservoir studies with geostatistics to forecast performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, R.W.; Behrens, R.A.; Emanuel, A.S.
1991-05-01
In this paper, example geostatistics and streamtube applications are presented for waterflood and CO2 flood in two low-permeability sandstone reservoirs. The hybrid approach of combining fine vertical resolution in cross-sectional models with streamtubes resulted in models that showed water channeling and provided realistic performance estimates. Results indicate that the combination of detailed geostatistical cross sections and fine-grid streamtube models offers a systematic approach for realistic performance forecasts.
Sin, Gürkan; Van Hulle, Stijn W H; De Pauw, Dirk J W; van Griensven, Ann; Vanrolleghem, Peter A
2005-07-01
Modelling activated sludge systems has gained an increasing momentum after the introduction of activated sludge models (ASMs) in 1987. Application of dynamic models for full-scale systems requires essentially a calibration of the chosen ASM to the case under study. Numerous full-scale model applications have been performed so far which were mostly based on ad hoc approaches and expert knowledge. Further, each modelling study has followed a different calibration approach: e.g. different influent wastewater characterization methods, different kinetic parameter estimation methods, different selection of parameters to be calibrated, different priorities within the calibration steps, etc. In short, there was no standard approach in performing the calibration study, which makes it difficult, if not impossible, to (1) compare different calibrations of ASMs with each other and (2) perform internal quality checks for each calibration study. To address these concerns, systematic calibration protocols have recently been proposed to bring guidance to the modeling of activated sludge systems and in particular to the calibration of full-scale models. In this contribution four existing calibration approaches (BIOMATH, HSG, STOWA and WERF) will be critically discussed using a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis. It will also be assessed in what way these approaches can be further developed in view of further improving the quality of ASM calibration. In this respect, the potential of automating some steps of the calibration procedure by use of mathematical algorithms is highlighted.
NASA Astrophysics Data System (ADS)
Jansen, Peter A.; Watter, Scott
2012-03-01
Connectionist language modelling typically has difficulty with syntactic systematicity, or the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word categories using a developmentally plausible infant-scale database of grounded sensorimotor conceptual representations, as well as a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar using correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour indicative of the general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream, outperforms previous contemporary models that contain primarily noun and verb word categories, and successfully generalises broadly to novel untrained sensorimotor grounded sentences composed of unfamiliar nouns and verbs. Limitations as well as implications for later grammar learning are discussed.
Modeling and applications in microbial food safety
USDA-ARS?s Scientific Manuscript database
Mathematical modeling is a scientific and systematic approach to study and describe the recurrent events or phenomena with successful application track for decades. When models are properly developed and validated, their applications may save costs and time. For the microbial food safety concerns, ...
Calculation of the detection limit in radiation measurements with systematic uncertainties
NASA Astrophysics Data System (ADS)
Kirkpatrick, J. M.; Russ, W.; Venkataraman, R.; Young, B. M.
2015-06-01
The detection limit (LD) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case.
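To make the contrast concrete, the sketch below (Python, illustrative only) compares Currie's closed-form detection limit with a brute-force numerical estimate that marginalises a Gaussian systematic uncertainty on the counting efficiency. The background level, efficiency uncertainty, confidence levels and iteration scheme are assumptions chosen for illustration, not the statistical model used in the paper.

```python
# Hedged sketch: numerical detection-limit estimate in the spirit described above.
import numpy as np
from scipy import stats

def critical_level(mean_bkg, alpha=0.05):
    """Smallest net-count threshold giving a false-positive rate <= alpha
    for a Poisson background."""
    return stats.poisson.ppf(1.0 - alpha, mean_bkg) - mean_bkg

def detection_limit_numeric(mean_bkg, eff=1.0, rel_sys=0.1, alpha=0.05, beta=0.05,
                            n_draws=20000, rng=np.random.default_rng(0)):
    """Iteratively find the smallest true signal whose detection probability
    reaches 1-beta, marginalising a Gaussian systematic on the efficiency."""
    lc = critical_level(mean_bkg, alpha)
    signal, step = 0.0, 0.5
    while signal < 1e4:
        signal += step
        # draw efficiencies around the nominal value (systematic uncertainty)
        effs = rng.normal(eff, rel_sys * eff, n_draws).clip(min=1e-6)
        gross = rng.poisson(mean_bkg + signal * effs)
        p_detect = np.mean(gross - mean_bkg > lc)
        if p_detect >= 1.0 - beta:
            return signal
    return np.inf

bkg = 10.0
ld_currie = 2.71 + 4.65 * np.sqrt(bkg)          # Currie-style closed form, k = 1.645
ld_numeric = detection_limit_numeric(bkg, rel_sys=0.2)
print(f"Currie approximation: {ld_currie:.1f} counts; numeric with 20% systematic: {ld_numeric:.1f}")
```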
PROCRU: A model for analyzing flight crew procedures in approach to landing
NASA Technical Reports Server (NTRS)
Baron, S.; Zacharias, G.; Muraidharan, R.; Lancraft, R.
1982-01-01
A model for the human performance of approach and landing tasks that would provide a means for systematic exploration of questions concerning the impact of procedural and equipment design and the allocation of resources in the cockpit on performance and safety in approach-to-landing is discussed. A system model is needed that accounts for the interactions of crew, procedures, vehicle, approach geometry, and environment. The issues of interest revolve principally around allocation of tasks in the cockpit and crew performance with respect to the cognitive aspects of the tasks. The model must, therefore, deal effectively with information processing and decision-making aspects of human performance.
Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable?
Booth, Andrew; Carroll, Christopher
2015-09-01
In recognising the potential value of theory in understanding how interventions work comes a challenge - how to make identification of theory less haphazard? To explore the feasibility of systematic identification of theory. We searched PubMed for published reviews (1998-2012) that had explicitly sought to identify theory. Systematic searching may be characterised by a structured question, methodological filters and an itemised search procedure. We constructed a template (BeHEMoTh - Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory. The authors tested the template within two systematic reviews. Of 34 systematic reviews, only 12 reviews (35%) reported a method for identifying theory. Nineteen did not specify how they identified studies containing theory. Data were unavailable for three reviews. Candidate terms include concept(s)/conceptual, framework(s), model(s), and theory/theories/theoretical. Information professionals must overcome inadequate reporting and the use of theory out of context. The review team faces an additional concern in lack of 'theory fidelity'. Based on experience with two systematic reviews, the BeHEMoTh template and procedure offers a feasible and useful approach for identification of theory. Applications include realist synthesis, framework synthesis or review of complex interventions. The procedure requires rigorous evaluation. © 2015 Health Libraries Group.
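As an illustration of how the BeHEMoTh facets might be operationalised, the short Python sketch below assembles a Boolean search string. The behaviour, health-context and exclusion terms are placeholders; only the theory facet uses the candidate terms listed in the abstract (concept*, framework*, model*, theory/theories/theoretical).

```python
# Hedged sketch: assembling a BeHEMoTh-style Boolean search string.
behaviour = ["smoking cessation", "quit*"]              # Behaviour of interest (placeholder)
health_context = ["primary care", "general practice"]   # Health context (placeholder)
exclusions = ["animal model*"]                           # Exclusions (placeholder)
theory_terms = ["concept*", "conceptual", "framework*", "model*",
                "theory", "theories", "theoretical"]     # Models or Theories facet

def or_block(terms):
    """Join terms with OR, quoting multi-word phrases."""
    return "(" + " OR ".join(f'"{t}"' if " " in t else t for t in terms) + ")"

query = " AND ".join([or_block(behaviour), or_block(health_context), or_block(theory_terms)])
query += " NOT " + or_block(exclusions)
print(query)
```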
A Three-Step Approach To Model Tree Mortality in the State of Georgia
Qingmin Meng; Chris J. Cieszewski; Roger C. Lowe; Michal Zasada
2005-01-01
Tree mortality is one of the most complex phenomena of forest growth and yield. Many types of factors affect tree mortality, which is considered difficult to predict. This study presents a new systematic approach to simulate tree mortality based on the integration of statistical models and geographical information systems. This method begins with variable preselection...
An approach to achieve progress in spacecraft shielding
NASA Astrophysics Data System (ADS)
Thoma, K.; Schäfer, F.; Hiermaier, S.; Schneider, E.
2004-01-01
Progress in shield design against space debris can be achieved only when a combined approach based on several tools is used. This approach depends on the combined application of advanced numerical methods, specific material models and experimental determination of input parameters for these models. Examples of experimental methods for material characterization are given, covering the range from quasi static to very high strain rates for materials like Nextel and carbon fiber-reinforced materials. Mesh free numerical methods have extraordinary capabilities in the simulation of extreme material behaviour including complete failure with phase changes, combined with shock wave phenomena and the interaction with structural components. In this paper the benefits from combining numerical methods, material modelling and detailed experimental studies for shield design are demonstrated. The following examples are given: (1) Development of a material model for Nextel and Kevlar-Epoxy to enable numerical simulation of hypervelocity impacts on complex heavy protection shields for the International Space Station. (2) The influence of projectile shape on protection performance of Whipple Shields and how experimental problems in accelerating such shapes can be overcome by systematic numerical simulation. (3) The benefits of using metallic foams in "sandwich bumper shields" for spacecraft and how to approach systematic characterization of such materials.
Lee, Da-Sheng
2010-01-01
Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated lift testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design. PMID:22315563
MUSiC - A Generic Search for Deviations from Monte Carlo Predictions in CMS
NASA Astrophysics Data System (ADS)
Hof, Carsten
2009-05-01
We present a model independent analysis approach, systematically scanning the data for deviations from the Standard Model Monte Carlo expectation. Such an analysis can contribute to the understanding of the CMS detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. We outline the importance of systematic uncertainties, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving supersymmetry and new heavy gauge bosons have been used as an input to the search algorithm.
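A minimal sketch of the kind of region scan described above is given below (Python). The binned expectation, observed counts and systematic uncertainty are invented, and the real algorithm's region definitions and trials (look-elsewhere) correction via pseudo-experiments are omitted here.

```python
# Hedged sketch of a MUSiC-like region scan on one binned distribution.
import numpy as np

rng = np.random.default_rng(1)
mc_expect   = np.array([120., 80., 45., 20., 8., 3.])   # SM expectation per bin (hypothetical)
mc_sys_rel  = 0.15                                       # relative systematic uncertainty (hypothetical)
data_counts = np.array([118,  85, 60, 31, 12, 4])        # observed counts (hypothetical)

def region_p_value(obs, expect, rel_sys, n_toys=50000):
    """Probability of a fluctuation at least as large as observed,
    marginalising a Gaussian systematic on the total expectation."""
    smeared = rng.normal(expect, rel_sys * expect, n_toys).clip(min=1e-9)
    toys = rng.poisson(smeared)
    return np.mean(toys >= obs) if obs >= expect else np.mean(toys <= obs)

# scan all contiguous bin regions and keep the most significant deviation
best = min(
    ((i, j, region_p_value(data_counts[i:j].sum(), mc_expect[i:j].sum(), mc_sys_rel))
     for i in range(len(data_counts)) for j in range(i + 1, len(data_counts) + 1)),
    key=lambda t: t[2],
)
print(f"most deviant region: bins {best[0]}..{best[1]-1}, p = {best[2]:.4f}")
```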
Clinical Problem Analysis (CPA): A Systematic Approach To Teaching Complex Medical Problem Solving.
ERIC Educational Resources Information Center
Custers, Eugene J. F. M.; Robbe, Peter F. De Vries; Stuyt, Paul M. J.
2000-01-01
Discusses clinical problem analysis (CPA) in medical education, an approach to solving complex clinical problems. Outlines the five step CPA model and examines the value of CPA's content-independent (methodical) approach. Argues that teaching students to use CPA will enable them to avoid common diagnostic reasoning errors and pitfalls. Compares…
Evaluation of Models of the Reading Process.
ERIC Educational Resources Information Center
Balajthy, Ernest
A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…
Characterization and effectiveness of pay-for-performance in ophthalmology: a systematic review.
Herbst, Tim; Emmert, Martin
2017-06-05
To identify, characterize and compare existing pay-for-performance approaches and their impact on the quality of care and efficiency in ophthalmology. A systematic evidence-based review was conducted. English, French and German written literature published between 2000 and 2015 were searched in the following databases: Medline (via PubMed), NCBI web site, Scopus, Web of Knowledge, Econlit and the Cochrane Library. Empirical as well as descriptive articles were included. Controlled clinical trials, meta-analyses, randomized controlled studies as well as observational studies were included as empirical articles. Systematic characterization of identified pay-for-performance approaches (P4P approaches) was conducted according to the "Model for Implementing and Monitoring Incentives for Quality" (MIMIQ). Methodological quality of empirical articles was assessed according to the Critical Appraisal Skills Programme (CASP) checklists. Overall, 13 relevant articles were included. Eleven articles were descriptive and two articles included empirical analyses. Based on these articles, four different pay-for-performance approaches implemented in the United States were identified. With regard to quality and incentive elements, systematic comparison showed numerous differences between P4P approaches. Empirical studies showed isolated cost or quality effects, while a simultaneous examination of these effects was missing. Research results show that experiences with pay-for-performance approaches in ophthalmology are limited. Identified approaches differ with regard to quality and incentive elements restricting comparability. Two empirical studies are insufficient to draw strong conclusions about the effectiveness and efficiency of these approaches.
Depression and Distortion in the Attribution of Causality
ERIC Educational Resources Information Center
Rizley, Ross
1978-01-01
Two cognitive models of depression have attracted considerable attention recently: Seligman's (1975) learned helplessness model and Beck's (1967) cognitive schema approach. Describes each model and, in two studies, evaluates the assumption that depression is associated with systematic distortion in cognition regarding causal and controlling…
Process of Continual Improvement in a School of Nursing.
ERIC Educational Resources Information Center
Norman, Linda D.; Lutenbacher, Melanie
1996-01-01
Vanderbilt University School of Nursing used the Batalden model of systems improvement to change its program. The model analyzes services and products, customers, social community need, and customer knowledge to approach improvements in a systematic way. (JOW)
NASA Astrophysics Data System (ADS)
Riccio, A.; Giunta, G.; Galmarini, S.
2007-04-01
In this paper we present an approach for the statistical analysis of multi-model ensemble results. The models considered here are operational long-range transport and dispersion models, also used for the real-time simulation of pollutant dispersion or the accidental release of radioactive nuclides. We first introduce the theoretical basis (with its roots sinking into the Bayes theorem) and then apply this approach to the analysis of model results obtained during the ETEX-1 exercise. We recover some interesting results, supporting the heuristic approach called "median model", originally introduced in Galmarini et al. (2004a, b). This approach also provides a way to systematically reduce (and quantify) model uncertainties, thus supporting the decision-making process and/or regulatory-purpose activities in a very effective manner.
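A minimal numerical sketch of the "median model" idea follows (Python/NumPy); the ensemble fields are random placeholders standing in for concentration maps produced by the individual dispersion models.

```python
# Minimal sketch of a "median model" combination of ensemble dispersion fields.
import numpy as np

rng = np.random.default_rng(2)
n_models, ny, nx = 7, 40, 60
ensemble = rng.lognormal(mean=0.0, sigma=1.0, size=(n_models, ny, nx))  # hypothetical fields

median_model = np.median(ensemble, axis=0)                              # robust central estimate per cell
spread_iqr = np.percentile(ensemble, 75, axis=0) - np.percentile(ensemble, 25, axis=0)

# cells where the ensemble disagrees strongly (large spread relative to the median)
uncertain = spread_iqr > 1.5 * median_model
print(f"fraction of grid cells flagged as highly uncertain: {uncertain.mean():.2f}")
```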
Economic evaluation in chronic pain: a systematic review and de novo flexible economic model.
Sullivan, W; Hirst, M; Beard, S; Gladwell, D; Fagnani, F; López Bastida, J; Phillips, C; Dunlop, W C N
2016-07-01
There is unmet need in patients suffering from chronic pain, yet innovation may be impeded by the difficulty of justifying economic value in a field beset by data limitations and methodological variability. A systematic review was conducted to identify and summarise the key areas of variability and limitations in modelling approaches in the economic evaluation of treatments for chronic pain. The results of the literature review were then used to support the development of a fully flexible open-source economic model structure, designed to test structural and data assumptions and act as a reference for future modelling practice. The key model design themes identified from the systematic review included: time horizon; titration and stabilisation; number of treatment lines; choice/ordering of treatment; and the impact of parameter uncertainty (given reliance on expert opinion). Exploratory analyses using the model to compare a hypothetical novel therapy versus morphine as first-line treatments showed cost-effectiveness results to be sensitive to structural and data assumptions. Assumptions about the treatment pathway and choice of time horizon were key model drivers. Our results suggest structural model design and data assumptions may have driven previous cost-effectiveness results and ultimately decisions based on economic value. We therefore conclude that it is vital that future economic models in chronic pain are designed to be fully transparent and hope our open-source code is useful in order to aspire to a common approach to modelling pain that includes robust sensitivity analyses to test structural and parameter uncertainty.
Christensen, Jette; El Allaki, Farouk; Vallières, André
2014-05-01
Scenario tree models with temporal discounting have been applied in four continents to support claims of freedom from animal disease. Recently, a second (new) model was developed for the same population and disease. This is a natural development because surveillance is a dynamic process that needs to adapt to changing circumstances - the difficulty is the justification for, documentation of, presentation of and the acceptance of the changes. Our objective was to propose a systematic approach to present changes to an existing scenario tree model for freedom from disease. We used the example of how we adapted the deterministic Canadian Notifiable Avian Influenza scenario tree model published in 2011 to a stochastic scenario tree model where the definition of sub-populations and the estimation of probability of introduction of the pathogen were modified. We found that the standardized approach by Vanderstichel et al. (2013) with modifications provided a systematic approach to make and present changes to an existing scenario tree model. We believe that the new 2013 CanNAISS scenario tree model is a better model than the 2011 model because the 2013 model included more surveillance data. In particular, the new data on Notifiable Avian Influenza in Canada from the last 5 years were used to improve input parameters and model structure. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
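The sketch below (Python) illustrates the generic scenario-tree logic of updating a probability of disease freedom with temporal discounting. All inputs (design prevalence, flock-level sensitivity, probability of introduction, number of flocks tested) are illustrative assumptions and not the CanNAISS parameters.

```python
# Hedged sketch of a stochastic scenario-tree / temporal-discounting calculation.
import numpy as np

rng = np.random.default_rng(3)
n_iter = 10000
design_prev = 0.01                     # between-flock design prevalence (illustrative)
p_intro = 0.02                         # per-period probability of pathogen introduction (illustrative)
prior_free = 0.5                       # starting prior probability of freedom (illustrative)

flock_se = rng.beta(20, 5, n_iter)     # flock-level sensitivity as a stochastic input
n_tested = 300                         # flocks tested per surveillance period (illustrative)

p_freedom = np.full(n_iter, prior_free)
for period in range(5):                # five surveillance periods with no detections
    # surveillance component sensitivity: P(detect | infected at design prevalence)
    sse = 1.0 - (1.0 - design_prev * flock_se) ** n_tested
    # Bayes update after a period with no detections
    p_freedom = p_freedom / (p_freedom + (1.0 - p_freedom) * (1.0 - sse))
    # temporal discounting: allow for new introductions before the next period
    p_freedom = p_freedom * (1.0 - p_intro)

print(f"median probability of freedom after 5 periods: {np.median(p_freedom):.3f}")
```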
ERIC Educational Resources Information Center
Bashaw, W. L., Ed.; Findley, Warren G., Ed.
This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…
Simulation models in population breast cancer screening: A systematic review.
Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H
2015-08-01
The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed, incorporating model type; input parameters; modeling approach (transparency of input data sources/assumptions, sensitivity analyses and risk of bias); validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized control trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except one model) with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) compared with the 10% MR (95% CI: -2% to 21%) estimated from optimal RCTs. Potential harms due to regular breast cancer screening have only recently been reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematic evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dynamics and phenomenology of higher order gravity cosmological models
NASA Astrophysics Data System (ADS)
Moldenhauer, Jacob Andrew
2010-10-01
I present here some new results about a systematic approach to higher-order gravity (HOG) cosmological models. The HOG models are derived from curvature invariants that are more general than the Einstein-Hilbert action. Some of the models exhibit late-time cosmic acceleration without the need for dark energy and fit some current observations. The open question is that there are an infinite number of invariants that one could select, and many of the published papers have stressed the need to find a systematic approach that will allow one to study methodically the various possibilities. We explore a new connection that we made between theorems from the theory of invariants in general relativity and these cosmological models. In summary, the theorems demonstrate that curvature invariants are not all independent from each other and that for a given Ricci Segre type and Petrov type (symmetry classification) of the space-time, there exists a complete minimal set of independent invariants (a basis) in terms of which all the other invariants can be expressed. As an immediate consequence of the proposed approach, the number of invariants to consider is dramatically reduced from infinity to four invariants in the worst case and to only two invariants in the cases of interest, including all Friedmann-Lemaitre-Robertson-Walker metrics. We derive models that pass stability and physical acceptability conditions. We derive dynamical equations and phase portrait analyses that show the promise of the systematic approach. We consider observational constraints from magnitude-redshift Supernovae Type Ia data, distance to the last scattering surface of the Cosmic Microwave Background radiation, and Baryon Acoustic Oscillations. We put observational constraints on general HOG models. We constrain different forms of the Gauss-Bonnet, f(G), modified gravity models with these observations. We show some of these models pass solar system tests. We seek to find models that pass physical and observational constraints and give fits to the data that are almost as good as those of the standard Lambda-Cold-Dark-Matter model. Finding accelerating HOG models with late-time acceleration that pass physical acceptability conditions, solar system tests, and cosmological constraints will constitute serious contenders to explain cosmic acceleration.
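For orientation, the Gauss-Bonnet-type models mentioned above are commonly written in terms of an action of the following generic form. This is a sketch of the standard convention only; the specific functional forms of f(G) and the broader invariant-based models constrained in the thesis are not reproduced here.

```latex
% Generic Gauss-Bonnet-type modified gravity action (sketch; conventions assumed).
\begin{equation}
  S = \int d^4x \,\sqrt{-g}\,\left[\frac{R}{2\kappa^2} + f(\mathcal{G})\right] + S_m,
  \qquad
  \mathcal{G} \equiv R^2 - 4R_{\mu\nu}R^{\mu\nu} + R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma}.
\end{equation}
```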
Robitschek, Jon; Dresner, Harley; Hilger, Peter
2017-12-01
Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. To discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. Graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and ratio of the ala to tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, with the mean (SD) postintervention test score doubling compared with their baseline performance (7.5 [2.7] vs 10.3 [2.5]; P < .001). At 10 weeks after the intervention, the mean comparative improvement in overall graded nasal analysis was 17% (95% CI, 10%-23%; P < .001). Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule ratio. Findings with this novel 10-minute teaching model should be validated at other teaching institutions, and the instruction model should be further enhanced to teach more sophisticated analysis to residents as they proceed through training. NA.
Influenza forecasting in human populations: a scoping review.
Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis
2014-01-01
Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize these methodological approaches and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They included areas of North America (N = 15), Europe (N = 14), and/or Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests need for use of good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.
Simplified models for displaced dark matter signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buchmueller, Oliver; De Roeck, Albert; Hahn, Kristian
We propose a systematic programme to search for long-lived neutral particle signatures through a minimal set of displaced missing transverse energy searches (dMETs). Here, our approach is to extend the well-established dark matter simplified models to include displaced vertices. The dark matter simplified models are used to describe the primary production vertex. A displaced secondary vertex, characterised by the mass of the long-lived particle and its lifetime, is added for the displaced signature. We show how these models can be motivated by, and mapped onto, complete models such as gauge-mediated SUSY breaking and models of neutral naturalness. We also outline how this approach may be used to extend other simplified models to incorporate displaced signatures and to characterise searches for long-lived charged particles. Displaced vertices are a striking signature which is often virtually background free, and thus provide an excellent target for the high-luminosity run of the Large Hadron Collider. The proposed models and searches provide a first step towards a systematic broadening of the displaced dark matter search programme.
Biofidelic Human Activity Modeling and Simulation with Large Variability
2014-11-25
A systematic approach was developed for biofidelic human activity modeling and simulation by using body scan data and motion capture data to...replicate a human activity in 3D space. Since technologies for simultaneously capturing human motion and dynamic shapes are not yet ready for practical use, a...that can replicate a human activity in 3D space with the true shape and true motion of a human. Using this approach, a model library was built to
ERIC Educational Resources Information Center
Panza, Carol M.
2001-01-01
Suggests that human performance technologists need to have an analysis approach to support the development of an appropriate set of improvement recommendations for clients and then move to an action plan to help them see results. Presents a performance improvement model and a systematic approach that considers organizational context, ownership,…
Adding a Systemic Touch to Rational-Emotive Therapy for Families.
ERIC Educational Resources Information Center
Russell, Todd T.; Morrill, Correen M.
1989-01-01
Proposes a theoretical and practical hybrid model for family counseling based on integrating the rational-emotive and family systems approach. Notes that these combined approaches offer the counselor a systematic theoretical structure for conceptualizing family dysfunction, from which additional strategies for changing inappropriate belief systems…
Treatment of systematic errors in land data assimilation systems
NASA Astrophysics Data System (ADS)
Crow, W. T.; Yilmaz, M.
2012-12-01
Data assimilation systems are generally designed to minimize the influence of random error on the estimation of system states. Yet, experience with land data assimilation systems has also revealed the presence of large systematic differences between model-derived and remotely-sensed estimates of land surface states. Such differences are commonly resolved prior to data assimilation through implementation of a pre-processing rescaling step whereby observations are scaled (or non-linearly transformed) to somehow "match" comparable predictions made by an assimilation model. While the rationale for removing systematic differences in means (i.e., bias) between models and observations is well-established, relatively little theoretical guidance is currently available to determine the appropriate treatment of higher-order moments during rescaling. This talk presents a simple analytical argument to define an optimal linear-rescaling strategy for observations prior to their assimilation into a land surface model. While a technique based on triple collocation theory is shown to replicate this optimal strategy, commonly-applied rescaling techniques (e.g., so called "least-squares regression" and "variance matching" approaches) are shown to represent only sub-optimal approximations to it. Since the triple collocation approach is likely infeasible in many real-world circumstances, general advice for deciding between various feasible (yet sub-optimal) rescaling approaches will be presented with an emphasis on the implications of this work for the case of directly assimilating satellite radiances. While the bulk of the analysis will deal with linear rescaling techniques, its extension to nonlinear cases will also be discussed.
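The sketch below (Python/NumPy, synthetic data) contrasts the three linear rescaling strategies mentioned above: variance matching, least-squares regression, and a triple-collocation-based scaling that uses an independent third estimate. The time series and error magnitudes are invented for illustration.

```python
# Hedged sketch comparing observation-rescaling strategies on synthetic soil moisture data.
import numpy as np

rng = np.random.default_rng(4)
n = 2000
truth = rng.normal(0.25, 0.06, n)                       # synthetic "true" state
model = truth + rng.normal(0, 0.02, n)                  # model estimate
obs   = 0.6 * truth + 0.05 + rng.normal(0, 0.03, n)     # biased, damped observation
third = 1.3 * truth - 0.10 + rng.normal(0, 0.04, n)     # independent third product

def variance_matching(y, x):
    """Rescale y to the mean and variance of x."""
    return (y - y.mean()) * (x.std() / y.std()) + x.mean()

def regression_rescaling(y, x):
    """Rescale y onto x by ordinary least squares."""
    slope = np.cov(x, y, ddof=0)[0, 1] / np.var(y)
    return slope * (y - y.mean()) + x.mean()

def tc_rescaling(y, x, z):
    """Triple-collocation-based rescaling of y into the space of x,
    using the third estimate z to avoid error-variance bias."""
    xa, ya, za = x - x.mean(), y - y.mean(), z - z.mean()
    beta = np.mean(xa * za) / np.mean(ya * za)
    return beta * ya + x.mean()

for name, rescaled in [("variance matching", variance_matching(obs, model)),
                       ("regression", regression_rescaling(obs, model)),
                       ("triple collocation", tc_rescaling(obs, model, third))]:
    rmsd = np.sqrt(np.mean((rescaled - model) ** 2))
    print(f"{name:>20s}: RMSD to model = {rmsd:.4f}")
```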
Leading Undergraduate Research Projects in Mathematical Modeling
ERIC Educational Resources Information Center
Seshaiyer, Padmanabhan
2017-01-01
In this article, we provide some useful perspectives and experiences in mentoring students in undergraduate research (UR) in mathematical modeling using differential equations. To engage students in this topic, we present a systematic approach to the creation of rich problems from real-world phenomena; present mathematical models that are derived…
Quantum Chemical Modeling of Enzymatic Reactions: The Case of Decarboxylation.
Liao, Rong-Zhen; Yu, Jian-Guo; Himo, Fahmi
2011-05-10
We present a systematic study of the decarboxylation step of the enzyme aspartate decarboxylase with the purpose of assessing the quantum chemical cluster approach for modeling this important class of decarboxylase enzymes. Active site models ranging in size from 27 to 220 atoms are designed, and the barrier and reaction energy of this step are evaluated. To model the enzyme surrounding, homogeneous polarizable medium techniques are used with several dielectric constants. The main conclusion is that when the active site model reaches a certain size, the solvation effects from the surroundings saturate. Similar results have previously been obtained from systematic studies of other classes of enzymes, suggesting that they are of a quite general nature.
Chiang, Austin W T; Liu, Wei-Chung; Charusanti, Pep; Hwang, Ming-Jing
2014-01-15
A major challenge in mathematical modeling of biological systems is to determine how model parameters contribute to systems dynamics. As biological processes are often complex in nature, it is desirable to address this issue using a systematic approach. Here, we propose a simple methodology that first performs an enrichment test to find patterns in the values of globally profiled kinetic parameters with which a model can produce the required system dynamics; this is then followed by a statistical test to elucidate the association between individual parameters and different parts of the system's dynamics. We demonstrate our methodology on a prototype biological system of perfect adaptation dynamics, namely the chemotaxis model for Escherichia coli. Our results agreed well with those derived from experimental data and theoretical studies in the literature. Using this model system, we showed that there are motifs in kinetic parameters and that these motifs are governed by constraints of the specified system dynamics. A systematic approach based on enrichment statistical tests has been developed to elucidate the relationships between model parameters and the roles they play in affecting system dynamics of a prototype biological network. The proposed approach is generally applicable and therefore can find wide use in systems biology modeling research.
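A minimal sketch of the enrichment step is shown below (Python). The sampled "model" is a stand-in scoring function rather than the E. coli chemotaxis model, and the median split combined with Fisher's exact test is one simple way to formalise the association test described above.

```python
# Hedged sketch: globally sample kinetic parameters, label the sets that reproduce
# the target dynamics, and test whether large values of each parameter are
# over-represented among the successful sets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_sets, n_params = 5000, 4
params = rng.uniform(0.01, 10.0, size=(n_sets, n_params))

def adapts(p):
    """Placeholder criterion standing in for 'required system dynamics achieved'."""
    return (p[0] > p[1]) and (p[2] * p[3] < 5.0)

success = np.array([adapts(p) for p in params])

for j in range(n_params):
    high = params[:, j] > np.median(params[:, j])
    table = [[np.sum(high & success),  np.sum(high & ~success)],
             [np.sum(~high & success), np.sum(~high & ~success)]]
    odds, pval = stats.fisher_exact(table)
    print(f"parameter k{j+1}: enrichment p-value = {pval:.2e}")
```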
Vilela, Paulina; Liu, Hongbin; Lee, SeungChul; Hwangbo, Soonho; Nam, KiJeon; Yoo, ChangKyoo
2018-08-15
The release of silver nanoparticles (AgNPs) to wastewater, caused by over-generation and poor treatment of the remaining nanomaterial, has raised the interest of researchers. AgNPs can have a negative impact on watersheds and degrade the effluent quality of wastewater treatment plants (WWTPs). The aim of this research is to design and analyze an integrated model system for the removal of AgNPs with high effluent quality in WWTPs, using a systematic approach comprising modeling of the removal mechanisms, optimization, and control of AgNP removal. The activated sludge model 1 was modified to include AgNP removal mechanisms such as adsorption/desorption, dissolution, and inhibition of microbial organisms. Response surface methodology was performed to minimize the AgNP and total nitrogen concentrations in the effluent by optimizing the operating conditions of the system. The optimal operating conditions were then used to implement control strategies in the system and to further analyze the enhancement of AgNP removal efficiency. The overall AgNP removal efficiency was found to be slightly higher than 80%, an improvement of almost 7% compared to the BSM1 reference value. This study provides a systematic approach for finding an optimal solution to enhance AgNP removal efficiency in WWTPs and thereby prevent pollution in the environment. Copyright © 2018 Elsevier B.V. All rights reserved.
A systematic study of multiple minerals precipitation modelling in wastewater treatment.
Kazadi Mbamba, Christian; Tait, Stephan; Flores-Alsina, Xavier; Batstone, Damien J
2015-11-15
Mineral solids precipitation is important in wastewater treatment. However approaches to minerals precipitation modelling are varied, often empirical, and mostly focused on single precipitate classes. A common approach, applicable to multi-species precipitates, is needed to integrate into existing wastewater treatment models. The present study systematically tested a semi-mechanistic modelling approach, using various experimental platforms with multiple minerals precipitation. Experiments included dynamic titration with addition of sodium hydroxide to synthetic wastewater, and aeration to progressively increase pH and induce precipitation in real piggery digestate and sewage sludge digestate. The model approach consisted of an equilibrium part for aqueous phase reactions and a kinetic part for minerals precipitation. The model was fitted to dissolved calcium, magnesium, total inorganic carbon and phosphate. Results indicated that precipitation was dominated by the mineral struvite, forming together with varied and minor amounts of calcium phosphate and calcium carbonate. The model approach was noted to have the advantage of requiring a minimal number of fitted parameters, so the model was readily identifiable. Kinetic rate coefficients, which were statistically fitted, were generally in the range 0.35-11.6 h(-1) with confidence intervals of 10-80% relative. Confidence regions for the kinetic rate coefficients were often asymmetric with model-data residuals increasing more gradually with larger coefficient values. This suggests that a large kinetic coefficient could be used when actual measured data is lacking for a particular precipitate-matrix combination. Correlation between the kinetic rate coefficients of different minerals was low, indicating that parameter values for individual minerals could be independently fitted (keeping all other model parameters constant). Implementation was therefore relatively flexible, and would be readily expandable to include other minerals. Copyright © 2015 Elsevier Ltd. All rights reserved.
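As a minimal illustration of the semi-mechanistic kinetic form, the sketch below (Python/SciPy) integrates a single-mineral (struvite) precipitation rate driven by the saturation ratio. The solubility product, rate coefficient and initial concentrations are illustrative assumptions, and pH speciation and activity corrections are omitted.

```python
# Minimal sketch: kinetic precipitation of one mineral driven by supersaturation.
import numpy as np
from scipy.integrate import solve_ivp

Ksp = 10 ** (-13.26)        # struvite solubility product (illustrative value)
k_precip = 1e-6             # kinetic rate coefficient, mol/(L*h) (illustrative)

def rhs(t, y):
    mg, nh4, po4, solid = y                       # mol/L
    iap = max(mg, 0) * max(nh4, 0) * max(po4, 0)  # ion activity product (concentrations)
    omega = (iap / Ksp) ** (1.0 / 3.0)            # saturation ratio (3 ions per formula unit)
    rate = k_precip * max(omega - 1.0, 0.0) ** 2  # precipitates only when supersaturated
    return [-rate, -rate, -rate, rate]

y0 = [3e-3, 30e-3, 5e-3, 0.0]                     # initial dissolved Mg, NH4, PO4 and solid
sol = solve_ivp(rhs, (0.0, 10.0), y0, max_step=0.1)
print(f"struvite formed after 10 h: {sol.y[3, -1] * 1e3:.2f} mmol/L")
```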
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
Development of Prototype Driver Models for Highway Design: Research Update
DOT National Transportation Integrated Search
1999-06-01
One of the high-priority research areas of the Federal Highway Administration (FHWA) is the development of the Interactive Highway Safety Design Model (IHSDM). The goal of the IHSDM research program is to develop a systematic approach that will allow...
Nikdel, Ali; Braatz, Richard D; Budman, Hector M
2018-05-01
Dynamic flux balance analysis (DFBA) has become an instrumental modeling tool for describing the dynamic behavior of bioprocesses. DFBA involves the maximization of a biologically meaningful objective subject to kinetic constraints on the rate of consumption/production of metabolites. In this paper, we propose a systematic data-based approach for finding both the biological objective function and a minimum set of active constraints necessary for matching the model predictions to the experimental data. The proposed algorithm accounts for the errors in the experiments and eliminates the need for ad hoc choices of objective function and constraints as done in previous studies. The method is illustrated for two cases: (1) for in silico (simulated) data generated by a mathematical model for Escherichia coli and (2) for actual experimental data collected from the batch fermentation of Bordetella pertussis (the causative agent of whooping cough).
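The inner optimization of a DFBA step is an ordinary flux balance analysis linear programme; the toy example below (Python/SciPy) maximises a biomass flux subject to steady-state mass balances and a kinetic uptake bound. The stoichiometry, bounds and objective are invented for illustration and do not correspond to the E. coli or B. pertussis models discussed above.

```python
# Minimal FBA sketch (the inner problem of a DFBA step) on a toy two-metabolite network.
import numpy as np
from scipy.optimize import linprog

# columns: v1 substrate uptake, v2 A -> B conversion, v3 biomass drain, v4 byproduct secretion
S = np.array([[1, -1,  0,  0],    # metabolite A balance
              [0,  1, -1, -1]])   # metabolite B balance
bounds = [(0, 10),                 # uptake limited by a kinetic constraint (illustrative)
          (0, None), (0, None), (0, None)]
c = np.zeros(4)
c[2] = -1.0                        # maximize biomass flux (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(f"optimal biomass flux: {res.x[2]:.2f}")
```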
MUSiC - A general search for deviations from monte carlo predictions in CMS
NASA Astrophysics Data System (ADS)
Biallass, Philipp A.; CMS Collaboration
2009-06-01
A model independent analysis approach in CMS is presented, systematically scanning the data for deviations from the Monte Carlo expectation. Such an analysis can contribute to the understanding of the detector and the tuning of the event generators. Furthermore, due to the minimal theoretical bias this approach is sensitive to a variety of models of new physics, including those not yet thought of. Events are classified into event classes according to their particle content (muons, electrons, photons, jets and missing transverse energy). A broad scan of various distributions is performed, identifying significant deviations from the Monte Carlo simulation. The importance of systematic uncertainties is outlined, which are taken into account rigorously within the algorithm. Possible detector effects and generator issues, as well as models involving Supersymmetry and new heavy gauge bosons are used as an input to the search algorithm.
Multiconfiguration calculations of electronic isotope shift factors in Al I
NASA Astrophysics Data System (ADS)
Filippin, Livio; Beerwerth, Randolf; Ekman, Jörgen; Fritzsche, Stephan; Godefroid, Michel; Jönsson, Per
2016-12-01
The present work reports results from systematic multiconfiguration Dirac-Hartree-Fock calculations of electronic isotope shift factors for a set of transitions between low-lying levels of neutral aluminium. These electronic quantities together with observed isotope shifts between different pairs of isotopes provide the changes in mean-square charge radii of the atomic nuclei. Two computational approaches are adopted for the estimation of the mass- and field-shift factors. Within these approaches, different models for electron correlation are explored in a systematic way to determine a reliable computational strategy and to estimate theoretical error bars of the isotope shift factors.
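For context, the mass- and field-shift factors enter the observed isotope shift through the standard first-order relation sketched below; conventions for the sign of the field-shift factor F and the splitting of the mass-shift constant into normal and specific parts vary between papers.

```latex
% Standard first-order isotope shift relation (sketch; sign conventions assumed).
\begin{equation}
  \delta\nu^{A,A'} \;=\; \nu^{A'} - \nu^{A}
  \;=\; \left(K_{\mathrm{NMS}} + K_{\mathrm{SMS}}\right)
        \left(\frac{1}{M_{A'}} - \frac{1}{M_{A}}\right)
  \;+\; F\,\delta\langle r^2\rangle^{A,A'}.
\end{equation}
```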
Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David
2015-01-01
Introduction Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix, evaluating model components and methodology. Conclusion When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes, test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases, (HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
Small-Area Estimation of Spatial Access to Care and Its Implications for Policy.
Gentili, Monica; Isett, Kim; Serban, Nicoleta; Swann, Julie
2015-10-01
Local or small-area estimates to capture emerging trends across large geographic regions are critical in identifying and addressing community-level health interventions. However, they are often unavailable due to lack of analytic capabilities in compiling and integrating extensive datasets and complementing them with the knowledge about variations in state-level health policies. This study introduces a modeling approach for small-area estimation of spatial access to pediatric primary care that is data "rich" and mathematically rigorous, integrating data and health policy in a systematic way. We illustrate the sensitivity of the model to policy decision making across large geographic regions by performing a systematic comparison of the estimates at the census tract and county levels for Georgia and California. Our results show the proposed approach is able to overcome limitations of other existing models by capturing patient and provider preferences and by incorporating possible changes in health policies. The primary finding is systematic underestimation of spatial access, and inaccurate estimates of disparities across population and across geography at the county level with respect to those at the census tract level with implications on where to focus and which type of interventions to consider.
Systematic approaches to toxicology in the zebrafish.
Peterson, Randall T; Macrae, Calum A
2012-01-01
As the current paradigms of drug discovery evolve, it has become clear that a more comprehensive understanding of the interactions between small molecules and organismal biology will be vital. The zebrafish is emerging as a complement to existing in vitro technologies and established preclinical in vivo models that can be scaled for high-throughput. In this review, we highlight the current status of zebrafish toxicology studies, identify potential future niches for the model in the drug development pipeline, and define the hurdles that must be overcome as zebrafish technologies are refined for systematic toxicology.
A continued fraction resummation form of bath relaxation effect in the spin-boson model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhihao; Tang, Zhoufei; Wu, Jianlan, E-mail: jianlanwu@zju.edu.cn
2015-02-28
In the spin-boson model, a continued fraction form is proposed to systematically resum high-order quantum kinetic expansion (QKE) rate kernels, accounting for the bath relaxation effect beyond the second-order perturbation. In particular, the analytical expression of the sixth-order QKE rate kernel is derived for resummation. With higher-order correction terms systematically extracted from higher-order rate kernels, the resummed quantum kinetic expansion approach in the continued fraction form extends the Padé approximation and can fully recover the exact quantum dynamics as the expansion order increases.
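Schematically, a continued-fraction resummation of the rate-kernel series K(2) + K(4) + K(6) + ... can be written as below. This is a generic two-level construction whose Taylor expansion reproduces the series through sixth order; the precise correction terms extracted in the paper may differ.

```latex
% Generic continued-fraction (Pade-type) resummation of the QKE rate kernel,
% constructed so its expansion matches K^(2) + K^(4) + K^(6) through sixth order.
\begin{equation}
  \mathcal{K}(s) \;\approx\;
  \cfrac{\mathcal{K}^{(2)}(s)}
        {1 \;-\; \cfrac{\mathcal{K}^{(4)}(s)\,/\,\mathcal{K}^{(2)}(s)}
                       {1 \;-\; \dfrac{\mathcal{K}^{(6)}(s)\,\mathcal{K}^{(2)}(s)
                                       - \bigl(\mathcal{K}^{(4)}(s)\bigr)^{2}}
                                      {\mathcal{K}^{(4)}(s)\,\mathcal{K}^{(2)}(s)}}}
\end{equation}
```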
Guastaferro, Kate; Miller, Katy; Shanley Chatham, Jenelle R.; Whitaker, Daniel J.; McGilly, Kate; Lutzker, John R.
2017-01-01
An effective approach in early intervention for children and families, including child maltreatment prevention, is home-based services. Though several evidence-based programs exist, they are often grouped together, despite having different foci. This paper describes an ongoing cluster randomized trial systematically braiding two evidence-based home-based models, SafeCare® and Parents as Teachers (PAT)®, to better meet the needs of families at-risk. We describe the methodology for braiding model implementation and curriculum, specifically focusing on how structured qualitative feedback from pilot families and providers was used to create the braided curriculum and implementation. Systematic braiding of two models at the implementation and curriculum levels is a mechanism that has the potential to meet the more comprehensive needs of families at-risk for maltreatment. PMID:27870760
Error analysis and system optimization of non-null aspheric testing system
NASA Astrophysics Data System (ADS)
Luo, Yongjie; Yang, Yongying; Liu, Dong; Tian, Chao; Zhuo, Yongmo
2010-10-01
A non-null aspheric testing system, which employs a partial null lens (PNL) and a reverse iterative optimization reconstruction (ROR) technique, is proposed in this paper. Based on system modeling in ray-tracing software, the parameters of each optical element are optimized, making the system model more precise. The systematic error of the non-null aspheric testing system is analyzed and categorized into two types: the error due to the surface parameters of the PNL in the system model, and the remaining error from the non-null interferometer, which is isolated by an error-storage subtraction approach. Experimental results show that, after the systematic error is removed from the testing result of the non-null aspheric testing system, the aspheric surface is precisely reconstructed by the ROR technique, and accounting for the systematic error greatly increases the test accuracy of the non-null aspheric testing system.
NASA Astrophysics Data System (ADS)
Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.
2016-05-01
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers
Sun, Ting; Xing, Fei; You, Zheng
2013-01-01
The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes a detailed approach for the synthetic error analysis of the star tracker that avoids complicated theoretical derivation. This approach determines the error propagation relationships of the star tracker and builds an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment was designed and conducted, and excellent calibration results were achieved based on the calibration model. In summary, the error analysis approach and the calibration method are shown to be adequate and precise, and can provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527
ERIC Educational Resources Information Center
Stern, Carolyn; Keislar, Evan
In an attempt to explore a systematic approach to language expansion and improved sentence structure, echoic and modeling procedures for language instruction were compared. Four hypotheses were formulated: (1) children who use modeling procedures will produce better structured sentences than children who use echoic prompting, (2) both echoic and…
An investigation of modelling and design for software service applications.
Anjum, Maria; Budgen, David
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.
Leonidou, Chrysanthi; Panayiotou, Georgia
2018-08-01
According to the cognitive-behavioral model, illness anxiety is developed and maintained through biased processing of health-threatening information and maladaptive responses to such information. This study is a systematic review of research that attempted to validate central tenets of the cognitive-behavioral model regarding etiological and maintenance mechanisms in illness anxiety. Sixty-two studies, including correlational and experimental designs, were identified through a systematic search of databases and were evaluated for their quality. Outcomes were synthesized following a qualitative thematic approach under categories of theoretically driven mechanisms derived from the cognitive-behavioral model: attention, memory and interpretation biases, perceived awareness and inaccuracy in perception of somatic sensations, negativity bias, emotion dysregulation, and behavioral avoidance. Findings partly support the cognitive-behavioral model, but several of its hypothetical mechanisms only receive weak support due to the scarcity of relevant studies. Directions for future research are suggested based on identified gaps in the existing literature. Copyright © 2018 Elsevier Inc. All rights reserved.
Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.
Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir
2013-10-31
Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.
2014-01-01
Background: There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. Methods: This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. Results: The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. Conclusions: The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. Trial registration: PROSPERO registration number CRD42013004037. PMID:24885751
Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth
2014-05-10
There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.
Diffusion of Super-Gaussian Profiles
ERIC Educational Resources Information Center
Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.
2007-01-01
The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…
The ESA21 Project: A Model for Civic Engagement
ERIC Educational Resources Information Center
Pratte, John; Laposata, Matt
2005-01-01
There have been many systematic approaches to solving the problem of how to make science courses interesting to students. One that is currently receiving attention in the sciences is the use of civic engagement within the classroom. This approach works well in small enrollment courses, but it is logistically difficult to implement in large…
2006-03-01
1989) present an innovative approach to quantifying risk. Their approach is to utilize linguistic terms or words and to systematically assign a... Together, these 15 factors were a first step in the problem of quantifying risk. These factors, and the four categories within which they fall, are
Improvements in safety testing of lithium cells
NASA Astrophysics Data System (ADS)
Stinebring, R. C.; Krehl, P.
1985-07-01
A systematic approach was developed for evaluating the basic safety parameters of high power lithium soluble cathode cells. This approach consists of performing a series of tests on each cell model during the design, prototype and production phases. Abusive testing is performed in a facility where maximum protection is given to test personnel.
Improvements in safety testing of lithium cells
NASA Technical Reports Server (NTRS)
Stinebring, R. C.; Krehl, P.
1985-01-01
A systematic approach was developed for evaluating the basic safety parameters of high power lithium soluble cathode cells. This approach consists of performing a series of tests on each cell model during the design, prototype and production phases. Abusive testing is performed in a facility where maximum protection is given to test personnel.
A Systematic Approach for Real-Time Operator Functional State Assessment
NASA Technical Reports Server (NTRS)
Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean; Li, Jiang; Li, Feng; McKenzie, Frederick
2012-01-01
A task overload condition often leads to high stress for an operator, causing performance degradation and possibly disastrous consequences. Just as dangerous, with automated flight systems, an operator may experience a task underload condition (during the en-route flight phase, for example), becoming easily bored and finding it difficult to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, the disengaged operator may neglect, misunderstand, or respond slowly/inappropriately to the situation. In this paper, we discuss an approach for Operator Functional State (OFS) monitoring in a typical aviation environment. A systematic ground truth finding procedure has been designed based on subjective evaluations, performance measures, and strong physiological indicators. The derived OFS ground truth is continuous in time compared to a very sparse estimation of OFS based on an expert review or subjective evaluations. It can capture the variations of OFS during a mission to better guide through the training process of the OFS assessment model. Furthermore, an OFS assessment model framework based on advanced machine learning techniques was designed and the systematic approach was then verified and validated with experimental data collected in a high fidelity Boeing 737 simulator. Preliminary results show highly accurate engagement/disengagement detection making it suitable for real-time applications to assess pilot engagement.
New Methodology for Estimating Fuel Economy by Vehicle Class
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling
2011-01-01
Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class to the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. The approach also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
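A minimal sketch of the kind of reconciliation step described above, assuming hypothetical per-class VMT figures, prior-year MPG estimates, and stock-model MPG estimates (none of these numbers come from the report): squared deviations from the published and stock-model values are minimized subject to the constraint that implied fuel use across classes sums to the reported total.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical inputs (illustrative only, not Highway Statistics data)
classes    = ["car", "light_truck", "combination_truck"]
vmt        = np.array([2.1e12, 1.1e12, 1.8e11])   # vehicle-miles traveled per class
mpg_prev   = np.array([23.5, 17.3, 6.0])          # prior-year published MPG
mpg_stock  = np.array([24.1, 17.8, 6.1])          # vehicle stock model MPG
total_fuel = 1.78e11                              # reported total gallons (Table MF-21 analogue)

def objective(mpg):
    # Penalize deviation from both the prior-year series and the stock-model estimates
    return np.sum((mpg - mpg_prev) ** 2) + np.sum((mpg - mpg_stock) ** 2)

def fuel_balance(mpg):
    # Implied fuel use across classes must equal the reported total
    return np.sum(vmt / mpg) - total_fuel

result = minimize(objective, x0=mpg_stock, method="SLSQP",
                  constraints=[{"type": "eq", "fun": fuel_balance}],
                  bounds=[(1.0, None)] * len(classes))

for c, m in zip(classes, result.x):
    print(f"{c}: reconciled MPG = {m:.2f}")
```

The trade-off between smoothness of the published time series and fidelity to the stock models is controlled simply by how the two penalty terms are weighted.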
Research utilization in the building industry: decision model and preliminary assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, R.L.; Johnson, D.R.; Smith, S.A.
1985-10-01
The Research Utilization Program was conceived as a far-reaching means for managing the interactions of the private sector and the federal research sector as they deal with energy conservation in buildings. The program emphasizes a private-public partnership in planning a research agenda and in applying the results of ongoing and completed research. The results of this task support the hypothesis that the transfer of R and D results to the buildings industry can be accomplished more efficiently and quickly by a systematic approach to technology transfer. This systematic approach involves targeting decision makers, assessing research and information needs, properly formatting information, and then transmitting the information through trusted channels. The purpose of this report is to introduce elements of a market-oriented knowledge base, which would be useful to the Building Systems Division, the Office of Buildings and Community Systems, and their associated laboratories in managing a private-public research partnership on a rational, systematic basis. This report presents conceptual models and data bases that can be used in formulating a technology transfer strategy and in planning technology transfer programs.
Koch, Marcus A.; German, Dmitry A.
2013-01-01
Taxonomy and systematics provide the names and evolutionary framework for any biological study. Without these names there is no access to a biological context of the evolutionary processes which gave rise to a given taxon: close relatives and sister species (hybridization), more distantly related taxa (ancestral states), for example. This is not only true for the single species a research project is focusing on, but also for its relatives, which might be selected for comparative approaches and future research. Nevertheless, taxonomical and systematic knowledge is rarely fully explored and considered across biological disciplines. One would expect the situation to be more developed with model organisms such as Noccaea, Arabidopsis, Schrenkiella and Eutrema (Thellungiella). However, we show the reverse. Using Arabidopsis halleri and Noccaea caerulescens, two model species among metal accumulating taxa, we summarize and reflect past taxonomy and systematics of Arabidopsis and Noccaea and provide a modern synthesis of taxonomic, systematic and evolutionary perspectives. The same is presented for several species of Eutrema s. l. and Schrenkiella recently appeared as models for studying stress tolerance in plants and widely known under the name Thellungiella. PMID:23914192
Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O
2016-06-01
Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
An investigation of modelling and design for software service applications
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905
Some aspects of modeling hydrocarbon oxidation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gal, D.; Botar, L.; Danoczy, E.
1981-01-01
A modeling procedure for the study of hydrocarbon oxidation is suggested, and its effectiveness for the oxidation of ethylbenzene is demonstrated. As a first step in modeling, systematization involves compilation of possible mechanisms. Then, by introduction of the concept of kinetic communication, the chaotic set of possible mechanisms is systematized into a network. Experimentation serves both as feedback to the systematic arrangement of information and source of new information. Kinetic treatment of the possible mechanism has been accomplished by two different approaches: by classical inductive calculations starting with a small mechanism and using kinetic approximations, and by computer simulation. The authors have compiled a so-called Main Contributory Mechanism, involving processes - within the possible mechanism - which contribute basically to the formation and consumption of the intermediates, to the consumption of the starting compounds and to the formation of the end products. 24 refs.
ERIC Educational Resources Information Center
Bhola, H. S.
The definitional and conceptual structure of the Esman model of institution building is described in great detail, emphasizing its philosophic and process assumptions and its latent dynamics. The author systematically critiques the Esman model in terms of its (1) specificity to the universe of institution building, (2) generalizability across…
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2009-01-01
Structural equation modeling (SEM) is widely used as a statistical framework to test complex models in behavioral and social sciences. When the number of publications increases, there is a need to systematically synthesize them. Methodology of synthesizing findings in the context of SEM is known as meta-analytic SEM (MASEM). Although correlation…
ERIC Educational Resources Information Center
Chieu, Vu Minh; Luengo, Vanda; Vadcard, Lucile; Tonetti, Jerome
2010-01-01
Cognitive approaches have been used for student modeling in intelligent tutoring systems (ITSs). Many of those systems have tackled fundamental subjects such as mathematics, physics, and computer programming. The change of the student's cognitive behavior over time, however, has not been considered and modeled systematically. Furthermore, the…
ERIC Educational Resources Information Center
Wiio, Osmo A.
A more unified approach to communication theory can evolve through systems modeling of information theory, communication modes, and mass media operations. Such systematic analysis proposes, as is the case care here, that information models be based upon combinations of energy changes and exchanges and changes in receiver systems. The mass media is…
Leaf area index uncertainty estimates for model-data fusion applications
Andrew D. Richardson; D. Bryan Dail; D.Y. Hollinger
2011-01-01
Estimates of data uncertainties are required to integrate different observational data streams as model constraints using model-data fusion. We describe an approach with which random and systematic uncertainties in optical measurements of leaf area index [LAI] can be quantified. We use data from a measurement campaign at the spruce-dominated Howland Forest AmeriFlux...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.
Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.
Toward Systematic Study of the History and Foundations of Literacy
ERIC Educational Resources Information Center
King, James R.; Stahl, Norman A.
2012-01-01
This study of a literacy course begins with methodological approaches useful in the historical study of the literacy profession, its practices, beliefs, and participants. A model course is presented via "moments" in the history of literacy. Results from implementations of the model course are also presented.
Classroom Crisis Intervention through Contracting: A Moral Development Model.
ERIC Educational Resources Information Center
Smaby, Marlowe H.; Tamminen, Armas W.
1981-01-01
A counselor can arbitrate problem situations using a systematic approach to classroom intervention which includes meetings with the teacher and students. This crisis intervention model based on moral development can be more effective than reliance on guidance activities disconnected from the actual classroom settings where the problems arise.…
Practical Applications of Response-to-Intervention Research
ERIC Educational Resources Information Center
Griffiths, Amy-Jane; VanDerHeyden, Amanda M.; Parson, Lorien B.; Burns, Matthew K.
2006-01-01
Several approaches to response to intervention (RTI) described in the literature could be blended into an RTI model that would be effective in the schools. An effective RTI model should employ three fundamental variables: (a) systematic data collection to identify students in need, (b) effective implementation of interventions for adequate…
This case study examines how systematic planning, an evolving conceptual site model (CSM), dynamic work strategies, and real time measurement technologies can be used to unravel complex contaminant distribution patterns...
A Qualitative Approach to Portfolios: The Early Assessment for Exceptional Potential Model.
ERIC Educational Resources Information Center
Shaklee, Beverly D.; Viechnicki, Karen J.
1995-01-01
The Early Assessment for Exceptional Potential portfolio assessment model assesses children as exceptional learners, users, generators, and pursuers of knowledge. It is based on use of authentic learning opportunities; interaction of assessment, curriculum, and instruction; multiple criteria derived from multiple sources; and systematic teacher…
A Systematic Approach to Error Free Telemetry
2017-06-28
A Systematic Approach to Error-Free Telemetry, 412TW-TIM-17-03: Technical Information Memorandum submitted by the Commander, 412th Test Wing, Edwards AFB, California 93524. Dates covered: from February 2016. DISTRIBUTION A: Approved for public release.
Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea
2017-12-29
Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument of assessing and controlling for the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objectives of this systematic review are to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of quality of these models in regard to the external validation based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of models with respect to T2DM incidence and progression in humans built on simulation techniques. PROSPERO CRD42017069983 .
Active Learning to Understand Infectious Disease Models and Improve Policy Making
Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel
2014-01-01
Modeling plays a major role in policy making, especially for infectious disease interventions but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques as iterative surrogate modeling and model-guided experimentation to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. Provided insight is used to focus research, reduce dimensionality and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as emulator to improve rapid policy making in various settings. PMID:24743387
Active learning to understand infectious disease models and improve policy making.
Willem, Lander; Stijven, Sean; Vladislavleva, Ekaterina; Broeckhove, Jan; Beutels, Philippe; Hens, Niel
2014-04-01
Modeling plays a major role in policy making, especially for infectious disease interventions but such models can be complex and computationally intensive. A more systematic exploration is needed to gain a thorough systems understanding. We present an active learning approach based on machine learning techniques as iterative surrogate modeling and model-guided experimentation to systematically analyze both common and edge manifestations of complex model runs. Symbolic regression is used for nonlinear response surface modeling with automatic feature selection. First, we illustrate our approach using an individual-based model for influenza vaccination. After optimizing the parameter space, we observe an inverse relationship between vaccination coverage and cumulative attack rate reinforced by herd immunity. Second, we demonstrate the use of surrogate modeling techniques on input-response data from a deterministic dynamic model, which was designed to explore the cost-effectiveness of varicella-zoster virus vaccination. We use symbolic regression to handle high dimensionality and correlated inputs and to identify the most influential variables. Provided insight is used to focus research, reduce dimensionality and decrease decision uncertainty. We conclude that active learning is needed to fully understand complex systems behavior. Surrogate models can be readily explored at no computational expense, and can also be used as emulator to improve rapid policy making in various settings.
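The loop described above can be sketched as follows; the paper uses symbolic regression as the surrogate, while this illustration substitutes a Gaussian-process surrogate (so the uncertainty used to pick new runs is readily available) and a cheap toy function in place of the individual-based simulator, purely to show the iterate-fit-refine structure.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def simulator(x):
    # Toy stand-in for an expensive epidemic model run:
    # x[0] ~ vaccination coverage, x[1] ~ transmissibility (illustrative only)
    return np.exp(-3.0 * x[0]) * (0.2 + 0.8 * x[1])   # "cumulative attack rate"

# Initial space-filling design of model runs
X = rng.uniform(0.0, 1.0, size=(10, 2))
y = np.array([simulator(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for it in range(15):                        # active-learning iterations
    gp.fit(X, y)
    candidates = rng.uniform(0.0, 1.0, size=(500, 2))
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]      # query where the surrogate is least certain
    X = np.vstack([X, x_new])
    y = np.append(y, simulator(x_new))

print(f"surrogate trained on {len(X)} model runs")
```

Once fitted, the surrogate can be evaluated at negligible cost to explore input-response behavior or to support rapid what-if analyses, which is the emulator role highlighted in the abstract.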
Effect of inventory method on niche models: random versus systematic error
Heather E. Lintz; Andrew N. Gray; Bruce McCune
2013-01-01
Data from large-scale biological inventories are essential for understanding and managing Earth's ecosystems. The Forest Inventory and Analysis Program (FIA) of the U.S. Forest Service is the largest biological inventory in North America; however, the FIA inventory recently changed from an amalgam of different approaches to a nationally-standardized approach in...
ERIC Educational Resources Information Center
Lee, Young S.
2014-01-01
The article focuses on a systematic approach to the instructional framework to incorporate three aspects of sustainable design. It also aims to provide an instruction model for sustainable design stressing a collective effort to advance knowledge creation as a community. It develops a framework conjoining the concept of integrated process in…
ERIC Educational Resources Information Center
Kamei-Hannan, Cheryl; Holbrook, M. Cay; Ricci, Leila A.
2012-01-01
Introduction: Response to intervention (RTI) has become widely recognized and used in education. Propelling its significance is its systematic and schoolwide approach and emphasis on using a problem-solving approach to providing appropriate instruction for each child. Children with visual impairments (that is, blindness and low vision) are…
Synthesis of Single-Case Experimental Data: A Comparison of Alternative Multilevel Approaches
ERIC Educational Resources Information Center
Ferron, John; Van den Noortgate, Wim; Beretvas, Tasha; Moeyaert, Mariola; Ugille, Maaike; Petit-Bois, Merlande; Baek, Eun Kyeng
2013-01-01
Single-case or single-subject experimental designs (SSED) are used to evaluate the effect of one or more treatments on a single case. Although SSED studies are growing in popularity, the results are in theory case-specific. One systematic and statistical approach for combining single-case data within and across studies is multilevel modeling. The…
Long-distance effects in B → K*ℓℓ from analyticity
NASA Astrophysics Data System (ADS)
Bobeth, Christoph; Chrzaszcz, Marcin; van Dyk, Danny; Virto, Javier
2018-06-01
We discuss a novel approach to systematically determine the dominant long-distance contribution to B → K*ℓℓ decays in the kinematic region where the dilepton invariant mass is below the open-charm threshold. This approach provides the most consistent and reliable determination to date and can be used to compute Standard Model predictions for all observables of interest, including the kinematic region where the dilepton invariant mass lies between the J/ψ and ψ(2S) resonances. We illustrate the power of our results by performing a New Physics fit to the Wilson coefficient C_9. This approach is systematically improvable from the theoretical and experimental sides, and applies to other decay modes of the type B → Vℓℓ, B → Pℓℓ, and B → Vγ.
Incorporating time-delays in S-System model for reverse engineering genetic networks.
Chowdhury, Ahsan Raja; Chetty, Madhu; Vinh, Nguyen Xuan
2013-06-18
In any gene regulatory network (GRN), the complex interactions occurring amongst transcription factors and target genes can be either instantaneous or time-delayed. However, many existing modeling approaches currently applied for inferring GRNs are unable to represent both these interactions simultaneously. As a result, all these approaches cannot detect important interactions of the other type. S-System model, a differential equation based approach which has been increasingly applied for modeling GRNs, also suffers from this limitation. In fact, all S-System based existing modeling approaches have been designed to capture only instantaneous interactions, and are unable to infer time-delayed interactions. In this paper, we propose a novel Time-Delayed S-System (TDSS) model which uses a set of delay differential equations to represent the system dynamics. The ability to incorporate time-delay parameters in the proposed S-System model enables simultaneous modeling of both instantaneous and time-delayed interactions. Furthermore, the delay parameters are not limited to just positive integer values (corresponding to time stamps in the data), but can also take fractional values. Moreover, we also propose a new criterion for model evaluation exploiting the sparse and scale-free nature of GRNs to effectively narrow down the search space, which not only reduces the computation time significantly but also improves model accuracy. The evaluation criterion systematically adapts the max-min in-degrees and also systematically balances the effect of network accuracy and complexity during optimization. The four well-known performance measures applied to the experimental studies on synthetic networks with various time-delayed regulations clearly demonstrate that the proposed method can capture both instantaneous and delayed interactions correctly with high precision. The experiments carried out on two well-known real-life networks, namely IRMA and SOS DNA repair network in Escherichia coli show a significant improvement compared with other state-of-the-art approaches for GRN modeling.
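For context, a generic S-System writes each gene's rate of change as a difference of production and degradation power laws; the time-delayed variant described above replaces the instantaneous regulator levels with delayed ones. The sketch below uses the conventional symbols (rate constants α_i, β_i and kinetic orders g_ij, h_ij) and per-edge delays τ_ij, σ_ij; it is the standard form implied by the abstract rather than a transcription of the paper's equations.

```latex
\frac{dX_i(t)}{dt} \;=\; \alpha_i \prod_{j=1}^{N} X_j\!\left(t-\tau_{ij}\right)^{g_{ij}}
\;-\; \beta_i \prod_{j=1}^{N} X_j\!\left(t-\sigma_{ij}\right)^{h_{ij}},
\qquad i = 1,\dots,N
```

Setting all delays to zero recovers the instantaneous S-System, while non-zero (possibly fractional) delays encode time-delayed regulation.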
A systematic petri net approach for multiple-scale modeling and simulation of biochemical processes.
Chen, Ming; Hu, Minjie; Hofestädt, Ralf
2011-06-01
A method to exploit hybrid Petri nets for modeling and simulating biochemical processes in a systematic way is introduced, covering both molecular biology and biochemical engineering aspects. With discrete and continuous elements, hybrid Petri nets can easily handle biochemical factors such as metabolite concentrations and kinetic behaviors, so both molecular biological behavior and biochemical process workflows can be translated into hybrid Petri nets in a natural manner. As an example, a penicillin production bioprocess is modeled to illustrate the concepts of the methodology. The dynamics of the production parameters in the bioprocess were simulated and observed diagrammatically. Current problems and post-genomic perspectives are also discussed.
Systematics of first 2+ state g factors around mass 80
NASA Astrophysics Data System (ADS)
Mertzimekis, T. J.; Stuchbery, A. E.; Benczer-Koller, N.; Taylor, M. J.
2003-11-01
The systematics of the first 2+ state g factors in the mass 80 region are investigated in terms of an IBM-II analysis, a pairing-corrected geometrical model, and a shell-model approach. Subshell closure effects at N=38 and overall trends were examined using IBM-II. A large-space shell-model calculation was successful in describing the behavior for N=48 and N=50 nuclei, where single-particle features are prominent. A schematic truncated-space calculation was applied to the lighter isotopes. The variations of the effective boson g factors are discussed in connection with the role of F-spin breaking, and comparisons are made between the mass 80 and mass 180 regions.
Genetic Network Inference: From Co-Expression Clustering to Reverse Engineering
NASA Technical Reports Server (NTRS)
Dhaeseleer, Patrik; Liang, Shoudan; Somogyi, Roland
2000-01-01
Advances in molecular biological, analytical, and computational technologies are enabling us to systematically investigate the complex molecular processes underlying biological systems. In particular, using high-throughput gene expression assays, we are able to measure the output of the gene regulatory network. We aim here to review data mining and modeling approaches for conceptualizing and unraveling the functional relationships implicit in these datasets. Clustering of co-expression profiles allows us to infer shared regulatory inputs and functional pathways. We discuss various aspects of clustering, ranging from distance measures to clustering algorithms and multiple-cluster memberships. More advanced analysis aims to infer causal connections between genes directly, i.e., who is regulating whom and how. We discuss several approaches to the problem of reverse engineering of genetic networks, from discrete Boolean networks, to continuous linear and non-linear models. We conclude that the combination of predictive modeling with systematic experimental verification will be required to gain a deeper insight into living organisms, therapeutic targeting, and bioengineering.
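As a small illustration of the first step discussed above (co-expression clustering), the sketch below hierarchically clusters synthetic expression profiles with a correlation-based distance; gene names and data are made up, and the choice of linkage and threshold is arbitrary.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Synthetic expression matrix: 6 genes x 8 time points (illustrative only)
base_a = np.sin(np.linspace(0, np.pi, 8))
base_b = np.linspace(1.0, 0.2, 8)
profiles = np.vstack([
    base_a + 0.05 * rng.normal(size=8),
    base_a + 0.05 * rng.normal(size=8),
    base_b + 0.05 * rng.normal(size=8),
    base_b + 0.05 * rng.normal(size=8),
    rng.normal(size=8),
    rng.normal(size=8),
])
genes = [f"gene_{i}" for i in range(len(profiles))]

# Correlation distance (1 - Pearson r) groups genes with similar profile shapes
dist = pdist(profiles, metric="correlation")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=0.5, criterion="distance")

for gene, cluster in zip(genes, labels):
    print(gene, "-> cluster", cluster)
```

Reverse-engineering approaches go one step further and fit directed models (Boolean, linear, or non-linear differential equations) to the same data, rather than only grouping genes by profile similarity.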
Raftery, James; Hanney, Steve; Greenhalgh, Trish; Glover, Matthew; Blatch-Jones, Amanda
2016-10-01
This report reviews approaches and tools for measuring the impact of research programmes, building on, and extending, a 2007 review. (1) To identify the range of theoretical models and empirical approaches for measuring the impact of health research programmes; (2) to develop a taxonomy of models and approaches; (3) to summarise the evidence on the application and use of these models; and (4) to evaluate the different options for the Health Technology Assessment (HTA) programme. We searched databases including Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature and The Cochrane Library from January 2005 to August 2014. This narrative systematic literature review comprised an update, extension and analysis/discussion. We systematically searched eight databases, supplemented by personal knowledge, in August 2014 through to March 2015. The literature on impact assessment has much expanded. The Payback Framework, with adaptations, remains the most widely used approach. It draws on different philosophical traditions, enhancing an underlying logic model with an interpretative case study element and attention to context. Besides the logic model, other ideal type approaches included constructionist, realist, critical and performative. Most models in practice drew pragmatically on elements of several ideal types. Monetisation of impact, an increasingly popular approach, shows a high return from research but relies heavily on assumptions about the extent to which health gains depend on research. Despite usually requiring systematic reviews before funding trials, the HTA programme does not routinely examine the impact of those trials on subsequent systematic reviews. The York/Patient-Centered Outcomes Research Institute and the Grading of Recommendations Assessment, Development and Evaluation toolkits provide ways of assessing such impact, but need to be evaluated. The literature, as reviewed here, provides very few instances of a randomised trial playing a major role in stopping the use of a new technology. The few trials funded by the HTA programme that may have played such a role were outliers. The findings of this review support the continued use of the Payback Framework by the HTA programme. Changes in the structure of the NHS, the development of NHS England and changes in the National Institute for Health and Care Excellence's remit pose new challenges for identifying and meeting current and future research needs. Future assessments of the impact of the HTA programme will have to take account of wider changes, especially as the Research Excellence Framework (REF), which assesses the quality of universities' research, seems likely to continue to rely on case studies to measure impact. The HTA programme should consider how the format and selection of case studies might be improved to aid more systematic assessment. The selection of case studies, such as in the REF, but also more generally, tends to be biased towards high-impact rather than low-impact stories. Experience for other industries indicate that much can be learnt from the latter. The adoption of researchfish ® (researchfish Ltd, Cambridge, UK) by most major UK research funders has implications for future assessments of impact. Although the routine capture of indexed research publications has merit, the degree to which researchfish will succeed in collecting other, non-indexed outputs and activities remains to be established. 
There were limitations in how far we could address challenges that faced us as we extended the focus beyond that of the 2007 review, and well beyond a narrow focus just on the HTA programme. Research funders can benefit from continuing to monitor and evaluate the impacts of the studies they fund. They should also review the contribution of case studies and expand work on linking trials to meta-analyses and to guidelines. The National Institute for Health Research HTA programme.
Begon, Mickaël; Andersen, Michael Skipper; Dumas, Raphaël
2018-03-01
Multibody kinematics optimization (MKO) aims to reduce soft tissue artefact (STA) and is a key step in musculoskeletal modeling. The objective of this review was to identify the numerical methods, their validation and performance for the estimation of the human joint kinematics using MKO. Seventy-four papers were extracted from a systematized search in five databases and cross-referencing. Model-derived kinematics were obtained using either constrained optimization or Kalman filtering to minimize the difference between measured (i.e., by skin markers, electromagnetic or inertial sensors) and model-derived positions and/or orientations. While hinge, universal, and spherical joints prevail, advanced models (e.g., parallel and four-bar mechanisms, elastic joint) have been introduced, mainly for the knee and shoulder joints. Models and methods were evaluated using: (i) simulated data based, however, on oversimplified STA and joint models; (ii) reconstruction residual errors, ranging from 4 mm to 40 mm; (iii) sensitivity analyses which highlighted the effect (up to 36 deg and 12 mm) of model geometrical parameters, joint models, and computational methods; (iv) comparison with other approaches (i.e., single body kinematics optimization and nonoptimized kinematics); (v) repeatability studies that showed low intra- and inter-observer variability; and (vi) validation against ground-truth bone kinematics (with errors between 1 deg and 22 deg for tibiofemoral rotations and between 3 deg and 10 deg for glenohumeral rotations). Moreover, MKO was applied to various movements (e.g., walking, running, arm elevation). Additional validations, especially for the upper limb, should be undertaken and we recommend a more systematic approach for the evaluation of MKO. In addition, further model development, scaling, and personalization methods are required to better estimate the secondary degrees-of-freedom (DoF).
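A compact way to state the multibody kinematics optimization problem reviewed above (the notation here is generic, not taken from any single paper in the review): at each frame, the generalized coordinates q of the kinematic model are chosen to minimize the weighted mismatch between measured marker positions m_i and the model-predicted positions, subject to the joint constraints.

```latex
\hat{\mathbf q}(t) \;=\; \arg\min_{\mathbf q}\;
\sum_{i=1}^{M} w_i \,\bigl\lVert \mathbf m_i(t) - \boldsymbol\phi_i(\mathbf q) \bigr\rVert^2
\quad \text{subject to} \quad \boldsymbol\Phi(\mathbf q) = \mathbf 0
```

Here φ_i maps the joint coordinates to the model position of marker i and Φ(q) = 0 collects the kinematic constraints (hinge, universal, spherical, or more advanced joint models); Kalman-filter variants solve the same mismatch problem recursively over time rather than frame by frame.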
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2010-05-23
The increasing asymmetric nature of threats to the security, health and sustainable growth of our society requires that anticipatory reasoning become an everyday activity. Currently, the use of anticipatory reasoning is hindered by the lack of systematic methods for combining knowledge- and evidence-based models, integrating modeling algorithms, and assessing model validity, accuracy and utility. The workshop addresses these gaps with the intent of fostering the creation of a community of interest on model integration and evaluation that may serve as an aggregation point for existing efforts and a launch pad for new approaches.
Hasse, Katelyn; Neylon, John; Sheng, Ke; Santhanam, Anand P
2016-03-01
Breast elastography is a critical tool for improving the targeted radiotherapy treatment of breast tumors. Current breast radiotherapy imaging protocols only involve prone and supine CT scans. There is a lack of knowledge on the quantitative accuracy with which breast elasticity can be systematically measured using only prone and supine CT datasets. The purpose of this paper is to describe a quantitative elasticity estimation technique for breast anatomy using only these supine/prone patient postures. Using biomechanical, high-resolution breast geometry obtained from CT scans, a systematic assessment was performed in order to determine the feasibility of this methodology for clinically relevant elasticity distributions. A model-guided inverse analysis approach is presented in this paper. A graphics processing unit (GPU)-based linear elastic biomechanical model was employed as a forward model for the inverse analysis with the breast geometry in a prone position. The elasticity estimation was performed using a gradient-based iterative optimization scheme and a fast simulated annealing (FSA) algorithm. Numerical studies were conducted to systematically analyze the feasibility of elasticity estimation. For simulating gravity-induced breast deformation, the breast geometry was anchored at its base, resembling the chest-wall/breast tissue interface. Ground-truth elasticity distributions were assigned to the model, representing tumor presence within breast tissue. Model geometry resolution was varied to estimate its influence on convergence of the system. A priori information was approximated and utilized to record the effect on time and accuracy of convergence. The role of the FSA process was also recorded. A novel error metric that combined elasticity and displacement error was used to quantify the systematic feasibility study. For the authors' purposes, convergence was set to be obtained when each voxel of tissue was within 1 mm of ground-truth deformation. The authors' analyses showed that a ∼97% model convergence was systematically observed with no a priori information. Varying the model geometry resolution showed no significant accuracy improvements. The GPU-based forward model enabled the inverse analysis to be completed within 10-70 min. Using a priori information about the underlying anatomy, the computation time decreased by as much as 50%, while accuracy improved from 96.81% to 98.26%. The use of FSA was observed to allow the iterative estimation methodology to converge more precisely. By utilizing a forward iterative approach to solve the inverse elasticity problem, this work indicates the feasibility and potential of the fast reconstruction of breast tissue elasticity using supine/prone patient postures.
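A minimal sketch of the model-guided inverse loop described above, with a placeholder forward_deform function standing in for the GPU-based linear elastic forward model and a plain simulated-annealing acceptance rule; the region count, step sizes, and cooling schedule are illustrative assumptions rather than the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions = 4                                  # coarse elasticity regions (illustrative)
true_E = np.array([3.0, 3.0, 12.0, 3.0])       # kPa; the stiff region mimics a tumor

def forward_deform(E):
    # Placeholder for the biomechanical forward model: maps elasticity to
    # gravity-induced displacements. Here, softer regions simply deform more.
    return 10.0 / E

target = forward_deform(true_E)                # "measured" prone-to-supine displacement

def error(E):
    return np.sqrt(np.mean((forward_deform(E) - target) ** 2))

# Simulated-annealing style search over region elasticities
E = np.full(n_regions, 5.0)                    # initial guess
best_E, best_err = E.copy(), error(E)
T = 1.0
for step in range(5000):
    candidate = np.clip(E + rng.normal(scale=0.3, size=n_regions), 0.5, 50.0)
    delta = error(candidate) - error(E)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        E = candidate
        if error(E) < best_err:
            best_E, best_err = E.copy(), error(E)
    T *= 0.999                                 # cooling schedule

print("estimated elasticities:", np.round(best_E, 2),
      "RMS displacement error:", round(best_err, 4))
```

In the actual study the forward model is a full GPU finite-element solve, so keeping the number of iterations manageable (for example by exploiting a priori anatomical information, as the abstract notes) is what makes the approach practical.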
Wei, Zhengxian; Song, Min; Yin, Guisheng; Wang, Hongbin; Ma, Xuefei; Song, Houbing
2017-07-12
Underwater wireless sensor networks (UWSNs) have become a new hot research area. However, due to the work dynamics and harsh ocean environment, how to obtain a UWSN with the best systematic performance while deploying as few sensor nodes as possible and setting up self-adaptive networking is an urgent problem that needs to be solved. Consequently, sensor deployment, networking, and performance calculation of UWSNs are challenging issues; hence the study in this paper centers on this topic, and three relevant methods and models are put forward. First, the normal body-centered cubic lattice is improved to a cross body-centered cubic lattice (CBCL), and a deployment process and topology generation method are built. Then, most importantly, a cross deployment networking method (CDNM) for UWSNs suitable for the underwater environment is proposed. Furthermore, a systematic quar-performance calculation model (SQPCM) is proposed from an integrated perspective, in which the systematic performance of a UWSN includes coverage, connectivity, durability and rapid-reactivity. In addition, measurement models are established based on the relationship between systematic performance and influencing parameters. Finally, the influencing parameters are divided into three types, namely constraint parameters, device performance and networking parameters. Based on these, a networking parameters adjustment method (NPAM) for optimized systematic performance of UWSNs has been presented. The simulation results demonstrate that the approach proposed in this paper is feasible and efficient in networking and performance calculation of UWSNs.
Wei, Zhengxian; Song, Min; Yin, Guisheng; Wang, Hongbin; Ma, Xuefei
2017-01-01
Underwater wireless sensor networks (UWSNs) have become a new hot research area. However, due to the work dynamics and harsh ocean environment, how to obtain a UWSN with the best systematic performance while deploying as few sensor nodes as possible and setting up self-adaptive networking is an urgent problem that needs to be solved. Consequently, sensor deployment, networking, and performance calculation of UWSNs are challenging issues; hence the study in this paper centers on this topic, and three relevant methods and models are put forward. First, the normal body-centered cubic lattice is improved to a cross body-centered cubic lattice (CBCL), and a deployment process and topology generation method are built. Then, most importantly, a cross deployment networking method (CDNM) for UWSNs suitable for the underwater environment is proposed. Furthermore, a systematic quar-performance calculation model (SQPCM) is proposed from an integrated perspective, in which the systematic performance of a UWSN includes coverage, connectivity, durability and rapid-reactivity. In addition, measurement models are established based on the relationship between systematic performance and influencing parameters. Finally, the influencing parameters are divided into three types, namely constraint parameters, device performance and networking parameters. Based on these, a networking parameters adjustment method (NPAM) for optimized systematic performance of UWSNs has been presented. The simulation results demonstrate that the approach proposed in this paper is feasible and efficient in networking and performance calculation of UWSNs. PMID:28704959
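To make the deployment idea concrete, the sketch below generates node positions on a plain body-centered cubic lattice and estimates sensing coverage by Monte Carlo sampling; the authors' cross body-centered cubic variant (CBCL) and their four performance measures are not reproduced here, and the spacing and sensing radius are arbitrary.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

L = 100.0          # side of the cubic deployment region (m), illustrative
a = 40.0           # lattice constant (m), illustrative
r_sense = 25.0     # sensing radius of each node (m), illustrative

# Body-centered cubic lattice: corner nodes plus a node at each cell center
corners = [np.array(p) * a for p in product(range(int(L // a) + 1), repeat=3)]
centers = [np.array(p) * a + a / 2 for p in product(range(int(L // a)), repeat=3)]
nodes = np.array(corners + centers)

# Monte Carlo estimate of volumetric sensing coverage
samples = rng.uniform(0.0, L, size=(20000, 3))
dists = np.linalg.norm(samples[:, None, :] - nodes[None, :, :], axis=2)
covered = (dists.min(axis=1) <= r_sense).mean()

print(f"{len(nodes)} nodes deployed, estimated coverage = {covered:.1%}")
```

Varying the lattice constant and sensing radius in such a sketch shows the basic trade-off the paper targets: fewer nodes versus coverage and connectivity of the resulting network.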
NASA Astrophysics Data System (ADS)
Vu, Tuan V.; Papavassiliou, Dimitrios V.
2018-05-01
In order to investigate the interfacial region between oil and water with the presence of surfactants using coarse-grained computations, both the interaction between different components of the system and the number of surfactant molecules present at the interface play an important role. However, in many prior studies, the amount of surfactants used was chosen rather arbitrarily. In this work, a systematic approach to develop coarse-grained models for anionic surfactants (such as sodium dodecyl sulfate) and nonionic surfactants (such as octaethylene glycol monododecyl ether) in oil-water interfaces is presented. The key is to place the theoretically calculated number of surfactant molecules on the interface at the critical micelle concentration. Based on this approach, the molecular description of surfactants and the effects of various interaction parameters on the interfacial tension are investigated. The results indicate that the interfacial tension is affected mostly by the head-water and tail-oil interaction. Even though the procedure presented herein is used with dissipative particle dynamics models, it can be applied for other coarse-grained methods to obtain the appropriate set of parameters (or force fields) to describe the surfactant behavior on the oil-water interface.
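The key quantity in the approach described above is the number of surfactant molecules to place on a given interfacial area. A standard way to estimate it (an assumption here, since the abstract does not spell out the formula) is from the saturated surface excess at the CMC, obtained via the Gibbs adsorption isotherm from interfacial tension data:

```latex
\Gamma_{\max} \;=\; -\frac{1}{nRT}\left(\frac{\partial \gamma}{\partial \ln c}\right)_{T,\;c\to\mathrm{CMC}},
\qquad
N_{\text{surf}} \;=\; \Gamma_{\max}\, A_{\text{interface}}\, N_A
```

Here γ is the interfacial tension, c the surfactant concentration, n the number of adsorbing species (e.g., 2 for a 1:1 ionic surfactant with its counterion), A_interface the area of the simulated oil-water interface, and N_A Avogadro's number.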
An Approach to Remove the Systematic Bias from the Storm Surge forecasts in the Venice Lagoon
NASA Astrophysics Data System (ADS)
Canestrelli, A.
2017-12-01
In this work, a novel approach is proposed for removing the systematic bias from the storm surge forecast computed by a two-dimensional shallow-water model. The model covers both the Adriatic and Mediterranean seas and provides the forecast at the entrance of the Venice Lagoon. The wind drag coefficient at the water-air interface is treated as a calibration parameter, with a different value for each range of wind velocities and wind directions. This sums up to a total of 16-64 parameters to be calibrated, depending on the chosen resolution. The best set of parameters is determined by means of an optimization procedure, which minimizes the RMS error between measured and modeled water level in Venice for the period 2011-2015. It is shown that a bias is present, in that the peaks of wind velocity provided by the weather forecast are largely underestimated, and that the calibration procedure removes this bias. When the calibrated model is used to reproduce events not included in the calibration dataset, the forecast error is strongly reduced, thus confirming the quality of our procedure. The proposed approach is not site-specific and could be applied to different situations, such as storm surges caused by intense hurricanes.
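A schematic of the calibration described above, with a placeholder surge_model standing in for the two-dimensional shallow-water model: wind speeds and directions are binned, one drag coefficient per bin is exposed as a parameter, and the set that minimizes the RMS water-level error is selected. The bin edges, the optimizer, and all numbers are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Illustrative forcing record and observed water levels at Venice (synthetic)
wind_speed = rng.uniform(0.0, 25.0, size=500)           # m/s
wind_dir   = rng.uniform(0.0, 360.0, size=500)          # degrees
observed   = 0.02 * wind_speed**1.3 + 0.05 * rng.normal(size=500)   # m, toy "measurements"

speed_edges = np.array([0.0, 5.0, 10.0, 15.0, 25.0])    # 4 speed classes
dir_edges   = np.array([0.0, 90.0, 180.0, 270.0, 360.0])  # 4 sectors -> 16 parameters

def bin_index(speed, direction):
    i = np.clip(np.digitize(speed, speed_edges) - 1, 0, 3)
    j = np.clip(np.digitize(direction, dir_edges) - 1, 0, 3)
    return i * 4 + j

def surge_model(cd_per_bin):
    # Placeholder hydrodynamic response: surge grows with wind stress ~ Cd * U^2
    cd = cd_per_bin[bin_index(wind_speed, wind_dir)]
    return 1.5 * cd * wind_speed**2

def rmse(cd_per_bin):
    return np.sqrt(np.mean((surge_model(cd_per_bin) - observed) ** 2))

result = differential_evolution(rmse, bounds=[(5e-4, 5e-3)] * 16, seed=0, maxiter=50)
print("calibrated drag coefficients per (speed, direction) bin:")
print(result.x.reshape(4, 4).round(5), "\nRMS error (m):", round(result.fun, 3))
```

In practice each objective evaluation requires a full hindcast run of the hydrodynamic model over the calibration period, which is why the number of drag-coefficient bins (16-64) is kept modest.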
Yong, Alan K.; Hough, Susan E.; Iwahashi, Junko; Braverman, Amy
2012-01-01
We present an approach based on geomorphometry to predict material properties and characterize site conditions using the VS30 parameter (time‐averaged shear‐wave velocity to a depth of 30 m). Our framework consists of an automated terrain classification scheme based on taxonomic criteria (slope gradient, local convexity, and surface texture) that systematically identifies 16 terrain types from 1‐km spatial resolution (30 arcsec) Shuttle Radar Topography Mission digital elevation models (SRTM DEMs). Using 853 VS30 values from California, we apply a simulation‐based statistical method to determine the mean VS30 for each terrain type in California. We then compare the VS30 values with models based on individual proxies, such as mapped surface geology and topographic slope, and show that our systematic terrain‐based approach consistently performs better than semiempirical estimates based on individual proxies. To further evaluate our model, we apply our California‐based estimates to terrains of the contiguous United States. Comparisons of our estimates with 325 VS30 measurements outside of California, as well as estimates based on the topographic slope model, indicate our method to be statistically robust and more accurate. Our approach thus provides an objective and robust method for extending estimates of VS30 for regions where in situ measurements are sparse or not readily available.
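The paper's simulation-based statistical method is not spelled out in the abstract; as a simple stand-in, the sketch below bootstraps the mean VS30 (and its uncertainty) for each terrain type from measured values, which conveys the "mean VS30 per terrain class" idea without reproducing the authors' exact procedure. The terrain labels and values are synthetic.

```python
import numpy as np

def mean_vs30_by_terrain(terrain_labels, vs30_values, n_boot=1000, seed=0):
    """Bootstrap the mean VS30 for each terrain class.

    terrain_labels: integer class per site (e.g. 1-16 terrain types)
    vs30_values:    measured VS30 (m/s) per site
    Returns {class: (mean of bootstrap means, std of bootstrap means)}."""
    rng = np.random.default_rng(seed)
    terrain_labels = np.asarray(terrain_labels)
    vs30_values = np.asarray(vs30_values, dtype=float)
    summary = {}
    for t in np.unique(terrain_labels):
        vals = vs30_values[terrain_labels == t]
        boot_means = [rng.choice(vals, size=vals.size, replace=True).mean()
                      for _ in range(n_boot)]
        summary[int(t)] = (float(np.mean(boot_means)), float(np.std(boot_means)))
    return summary

# Toy data: three terrain classes with different characteristic VS30 levels.
labels = np.repeat([1, 2, 3], 50)
values = np.concatenate([np.random.default_rng(1).normal(m, 40, 50)
                         for m in (250.0, 400.0, 620.0)])
print(mean_vs30_by_terrain(labels, values))
```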
A Multilevel Bifactor Approach to Construct Validation of Mixed-Format Scales
ERIC Educational Resources Information Center
Wang, Yan; Kim, Eun Sook; Dedrick, Robert F.; Ferron, John M.; Tan, Tony
2018-01-01
Wording effects associated with positively and negatively worded items have been found in many scales. Such effects may threaten construct validity and introduce systematic bias in the interpretation of results. A variety of models have been applied to address wording effects, such as the correlated uniqueness model and the correlated traits and…
Classroom Strategies Coaching Model: Integration of Formative Assessment and Instructional Coaching
ERIC Educational Resources Information Center
Reddy, Linda A.; Dudek, Christopher M.; Lekwa, Adam
2017-01-01
This article describes the theory, key components, and empirical support for the Classroom Strategies Coaching (CSC) Model, a data-driven coaching approach that systematically integrates data from multiple observations to identify teacher practice needs and goals, design practice plans, and evaluate progress towards goals. The primary aim of the…
A Category Adjustment Approach to Memory for Spatial Location in Natural Scenes
ERIC Educational Resources Information Center
Holden, Mark P.; Curby, Kim M.; Newcombe, Nora S.; Shipley, Thomas F.
2010-01-01
Memories for spatial locations often show systematic errors toward the central value of the surrounding region. This bias has been explained using a Bayesian model in which fine-grained and categorical information are combined (Huttenlocher, Hedges, & Duncan, 1991). However, experiments testing this model have largely used locations contained in…
An effective model for ergonomic optimization applied to a new automotive assembly line
NASA Astrophysics Data System (ADS)
Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio
2016-06-01
An efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute for Occupational Safety and Health, adapted to Italian legislation. The proposed model provides an ergonomic optimization by analyzing the ergonomic relations of manual work performed under correct conditions. The model includes a schematic and systematic method for analyzing the operations, and identifies all possible ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of the operations makes optimization fundamental. The proposed application clearly demonstrates the effectiveness of the new approach.
Agent-Based Modeling in Public Health: Current Applications and Future Directions.
Tracy, Melissa; Cerdá, Magdalena; Keyes, Katherine M
2018-04-01
Agent-based modeling is a computational approach in which agents with a specified set of characteristics interact with each other and with their environment according to predefined rules. We review key areas in public health where agent-based modeling has been adopted, including both communicable and noncommunicable disease, health behaviors, and social epidemiology. We also describe the main strengths and limitations of this approach for questions with public health relevance. Finally, we describe both methodologic and substantive future directions that we believe will enhance the value of agent-based modeling for public health. In particular, advances in model validation, comparisons with other causal modeling procedures, and the expansion of the models to consider comorbidity and joint influences more systematically will improve the utility of this approach to inform public health research, practice, and policy.
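For readers new to the technique, here is a minimal agent-based sketch in the "agents + rules + environment" spirit the review describes: agents meet random contacts each step and transmit or recover an infection with fixed probabilities. It is a generic illustration, not a model from any of the reviewed studies, and all parameter values are arbitrary.

```python
import random

def run_abm(n_agents=1000, n_steps=100, p_transmit=0.05, p_recover=0.1,
            contacts_per_step=5, seed=1):
    """Minimal agent-based infection model: each infected agent contacts a few
    random agents per step, possibly transmitting, and recovers with a fixed
    probability.  Returns the number of infected agents at each step."""
    random.seed(seed)
    states = ["S"] * n_agents      # susceptible / infected / recovered
    states[0] = "I"
    history = []
    for _ in range(n_steps):
        infected = [i for i, s in enumerate(states) if s == "I"]
        for i in infected:
            for j in random.sample(range(n_agents), contacts_per_step):
                if states[j] == "S" and random.random() < p_transmit:
                    states[j] = "I"
            if random.random() < p_recover:
                states[i] = "R"
        history.append(states.count("I"))
    return history

print(max(run_abm()))   # peak number of simultaneously infected agents
```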
Business model framework applications in health care: A systematic review.
Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl
2017-11-01
It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.
Variability extraction and modeling for product variants.
Linsbauer, Lukas; Lopez-Herrejon, Roberto Erick; Egyed, Alexander
2017-01-01
Fast-changing hardware and software technologies in addition to larger and more specialized customer bases demand software tailored to meet very diverse requirements. Software development approaches that aim at capturing this diversity on a single consolidated platform often require large upfront investments, e.g., time or budget. Alternatively, companies resort to developing one variant of a software product at a time by reusing as much as possible from already-existing product variants. However, identifying and extracting the parts to reuse is an error-prone and inefficient task compounded by the typically large number of product variants. Hence, more disciplined and systematic approaches are needed to cope with the complexity of developing and maintaining sets of product variants. Such approaches require detailed information about the product variants, the features they provide and their relations. In this paper, we present an approach to extract such variability information from product variants. It identifies traces from features and feature interactions to their implementation artifacts, and computes their dependencies. This work can be useful in many scenarios ranging from ad hoc development approaches such as clone-and-own to systematic reuse approaches such as software product lines. We applied our variability extraction approach to six case studies and provide a detailed evaluation. The results show that the extracted variability information is consistent with the variability in our six case study systems given by their variability models and available product variants.
An effectiveness analysis of healthcare systems using a systems theoretic approach.
Chuang, Sheuwen; Inder, Kerry
2009-10-24
The use of accreditation and quality measurement and reporting to improve healthcare quality and patient safety has been widespread across many countries. A review of the literature reveals no association between the accreditation system and the quality measurement and reporting systems, even when hospital compliance with these systems is satisfactory. Improvement of health care outcomes needs to be based on an appreciation of the whole system that contributes to those outcomes. The research literature currently lacks an appropriate analysis and is fragmented among activities. This paper aims to propose an integrated research model of these two systems and to demonstrate the usefulness of the resulting model for strategic research planning. To achieve these aims, a systematic integration of the healthcare accreditation and quality measurement/reporting systems is structured hierarchically. A holistic systems relationship model of the administration segment is developed to act as an investigation framework. A literature-based empirical study is used to validate the proposed relationships derived from the model. Australian experiences are used as evidence for the system effectiveness analysis and design base for an adaptive-control study proposal to show the usefulness of the system model for guiding strategic research. Three basic relationships were revealed and validated from the research literature. The systemic weaknesses of the accreditation system and quality measurement/reporting system from a system flow perspective were examined. The approach provides a systems thinking structure to assist the design of quality improvement strategies. The proposed model discovers a fourth implicit relationship, a feedback between quality performance reporting components and choice of accreditation components that is likely to play an important role in health care outcomes. An example involving accreditation surveyors is developed that provides a systematic search for improving the impact of accreditation on quality of care and hence on the accreditation/performance correlation. There is clear value in developing a theoretical systems approach to achieving quality in health care. The introduction of the systematic surveyor-based search for improvements creates an adaptive-control system to optimize health care quality. It is hoped that these outcomes will stimulate further research in the development of strategic planning using a systems theoretic approach for the improvement of quality in health care.
Validation of a common data model for active safety surveillance research
Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E
2011-01-01
Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
Schofield, Paul N; Sundberg, John P; Hoehndorf, Robert; Gkoutos, Georgios V
2011-09-01
The systematic investigation of the phenotypes associated with genotypes in model organisms holds the promise of revealing genotype-phenotype relations directly and without additional, intermediate inferences. Large-scale projects are now underway to catalog the complete phenome of a species, notably the mouse. With the increasing amount of phenotype information becoming available, a major challenge that biology faces today is the systematic analysis of this information and the translation of research results across species and into an improved understanding of human disease. The challenge is to integrate and combine phenotype descriptions within a species and to systematically relate them to phenotype descriptions in other species, in order to form a comprehensive understanding of the relations between those phenotypes and the genotypes involved in human disease. We distinguish between two major approaches for comparative phenotype analyses: the first relies on evolutionary relations to bridge the species gap, while the other approach compares phenotypes directly. In particular, the direct comparison of phenotypes relies heavily on the quality and coherence of phenotype and disease databases. We discuss major achievements and future challenges for these databases in light of their potential to contribute to the understanding of the molecular mechanisms underlying human disease. In particular, we discuss how the use of ontologies and automated reasoning can significantly contribute to the analysis of phenotypes and demonstrate their potential for enabling translational research.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
A partial Hamiltonian approach for current value Hamiltonian systems
NASA Astrophysics Data System (ADS)
Naz, R.; Mahomed, F. M.; Chaudhry, Azam
2014-10-01
We develop a partial Hamiltonian framework to obtain reductions and closed-form solutions, via first integrals, of current value Hamiltonian systems of ordinary differential equations (ODEs). The approach is algorithmic and applies to current value Hamiltonians with many state and costate variables, although we apply the method to models with one control, one state and one costate variable to illustrate its effectiveness. Current value Hamiltonian systems arise in economic growth theory and other economic models. We explain our approach with the help of a simple illustrative example and then apply it to two widely used economic growth models: the Ramsey model with a constant relative risk aversion (CRRA) utility function and Cobb-Douglas technology, and a one-sector AK model of endogenous growth. We show that our newly developed systematic approach can be used to deduce results given in the literature and also to find new solutions.
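For orientation, the current value Hamiltonian of the Ramsey model with CRRA utility and Cobb-Douglas technology can be written as below; the notation is illustrative and may differ from the authors'.

```latex
\begin{align*}
&\max_{c(t)} \int_0^{\infty} e^{-\rho t}\,\frac{c^{1-\sigma}-1}{1-\sigma}\,dt
\quad \text{s.t.} \quad \dot{k} = A k^{\alpha} - c - \delta k,\\[4pt]
&H_c(k,c,\lambda) = \frac{c^{1-\sigma}-1}{1-\sigma}
  + \lambda\left(A k^{\alpha} - c - \delta k\right),\\[4pt]
&\frac{\partial H_c}{\partial c} = 0 \;\Rightarrow\; c^{-\sigma} = \lambda,
\qquad
\dot{\lambda} = \rho\lambda - \frac{\partial H_c}{\partial k}
  = \lambda\left(\rho + \delta - \alpha A k^{\alpha-1}\right).
\end{align*}
```

A first integral of this state-costate system is exactly the kind of object the partial Hamiltonian framework is designed to deliver.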
Biesta-Peters, Elisabeth G.; Reij, Martine W.; Zwietering, Marcel H.; Gorris, Leon G. M.
2011-01-01
This research aims to test the absence (gamma hypothesis) or occurrence of synergy between two growth-limiting factors, i.e., pH and water activity (aw), using a systematic approach for model selection. In this approach, preset criteria were used to evaluate the performance of models. Such a systematic approach is required to be confident in the correctness of the individual components of the combined (synergy) models. With Bacillus cereus F4810/72 as the test organism, estimated growth boundaries for the aw-lowering solutes NaCl, KCl, and glucose were 1.13 M, 1.13 M, and 1.68 M, respectively. The accompanying aw values were 0.954, 0.956, and 0.961, respectively, indicating that equal aw values result in similar effects on growth. Out of the 12 models evaluated using the preset criteria, the model of J. H. T. Luong (Biotechnol. Bioeng. 27:280–285, 1985) was the best model to describe the effect of aw on growth. This aw model and the previously selected pH model were combined into a gamma model and into two synergy models. None of the three models was able to describe the combined pH and aw conditions sufficiently well to satisfy the preset criteria. The best matches between predicted and experimental data were obtained with the gamma model, followed by the synergy model of Y. Le Marc et al. (Int. J. Food Microbiol. 73:219–237, 2002). No combination of models that was able to predict the impact of both individual and combined hurdles correctly could be found. Consequently, in this case we could not prove the existence of synergy nor falsify the gamma hypothesis. PMID:21705525
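For context, the gamma hypothesis states that growth-limiting factors act independently and multiplicatively. One common way to write it is shown below, with illustrative gamma factors; these are not necessarily the forms selected in the study, which chose Luong's model for the water-activity term.

```latex
\mu_{\max}(\mathrm{pH}, a_w) \;=\; \mu_{\mathrm{opt}}\,\gamma(\mathrm{pH})\,\gamma(a_w),
\qquad
\gamma(a_w) \;=\; \frac{a_w - a_{w,\min}}{1 - a_{w,\min}},
\qquad
\gamma(\mathrm{pH}) \;=\;
\frac{(\mathrm{pH}-\mathrm{pH}_{\min})(\mathrm{pH}_{\max}-\mathrm{pH})}
     {(\mathrm{pH}_{\mathrm{opt}}-\mathrm{pH}_{\min})(\mathrm{pH}_{\max}-\mathrm{pH}_{\mathrm{opt}})}
```

Synergy models multiply this product by an additional interaction term, bounded above by 1, that depresses growth further when both hurdles act near their limits.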
Latent class analysis of diagnostic science assessment data using Bayesian networks
NASA Astrophysics Data System (ADS)
Steedle, Jeffrey Thomas
2008-10-01
Diagnostic science assessments seek to draw inferences about student understanding by eliciting evidence about the mental models that underlie students' reasoning about physical systems. Measurement techniques for analyzing data from such assessments embody one of two contrasting assessment programs: learning progressions and facet-based assessments. Learning progressions assume that students have coherent theories that they apply systematically across different problem contexts. In contrast, the facet approach makes no such assumption, so students should not be expected to reason systematically across different problem contexts. A systematic comparison of these two approaches is of great practical value to assessment programs such as the National Assessment of Educational Progress as they seek to incorporate small clusters of related items in their tests for the purpose of measuring depth of understanding. This dissertation describes an investigation comparing learning progression and facet models. Data comprised student responses to small clusters of multiple-choice diagnostic science items focusing on narrow aspects of understanding of Newtonian mechanics. Latent class analysis was employed using Bayesian networks in order to model the relationship between students' science understanding and item responses. Separate models reflecting the assumptions of the learning progression and facet approaches were fit to the data. The technical qualities of inferences about student understanding resulting from the two models were compared in order to determine if either modeling approach was more appropriate. Specifically, models were compared on model-data fit, diagnostic reliability, diagnostic certainty, and predictive accuracy. In addition, the effects of test length were evaluated for both models in order to inform the number of items required to obtain adequately reliable latent class diagnoses. Lastly, changes in student understanding over time were studied with a longitudinal model in order to provide educators and curriculum developers with a sense of how students advance in understanding over the course of instruction. Results indicated that expected student response patterns rarely reflected the assumptions of the learning progression approach. That is, students tended not to systematically apply a coherent set of ideas across different problem contexts. Even those students expected to express scientifically-accurate understanding had substantial probabilities of reporting certain problematic ideas. The learning progression models failed to make as many substantively-meaningful distinctions among students as the facet models. In statistical comparisons, model-data fit was better for the facet model, but the models were quite comparable on all other statistical criteria. Studying the effects of test length revealed that approximately 8 items are needed to obtain adequate diagnostic certainty, but more items are needed to obtain adequate diagnostic reliability. The longitudinal analysis demonstrated that students either advance in their understanding (i.e., switch to the more advanced latent class) over a short period of instruction or stay at the same level. There was no significant relationship between the probability of changing latent classes and time between testing occasions. 
In all, this study is valuable because it provides evidence informing decisions about modeling and reporting on student understanding, it assesses the quality of measurement available from short clusters of diagnostic multiple-choice items, and it provides educators with knowledge of the paths that students may take as they advance from novice to expert understanding over the course of instruction.
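As a generic illustration of the latent-class machinery behind such analyses (not the dissertation's Bayesian-network models), the sketch below fits a simple latent class model to binary item responses with expectation-maximization; the class count, initialization, and smoothing constants are arbitrary.

```python
import numpy as np

def latent_class_em(X, n_classes=2, n_iter=200, seed=0):
    """Fit a latent class model to binary item responses X (students x items)
    with EM.  Returns class proportions, per-class item-correct probabilities,
    and the posterior class membership of each student."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)               # class proportions
    theta = rng.uniform(0.25, 0.75, size=(n_classes, m))   # P(correct | class)
    for _ in range(n_iter):
        # E-step: posterior class membership for each student
        log_lik = (X[:, None, :] * np.log(theta)[None] +
                   (1 - X[:, None, :]) * np.log(1 - theta)[None]).sum(axis=2)
        log_post = np.log(pi)[None] + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update proportions and conditional response probabilities
        pi = post.mean(axis=0)
        theta = (post.T @ X + 1e-6) / (post.sum(axis=0)[:, None] + 2e-6)
    return pi, theta, post

# Toy data: 200 simulated students answering 8 items.
rng = np.random.default_rng(3)
true_class = rng.integers(0, 2, 200)
probs = np.where(true_class[:, None] == 1, 0.8, 0.3)
X = (rng.uniform(size=(200, 8)) < probs).astype(int)
pi, theta, post = latent_class_em(X)
print(np.round(pi, 2))
```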
Cummins, Carla A; McInerney, James O
2011-12-01
Current phylogenetic methods attempt to account for evolutionary rate variation across characters in a matrix. This is generally achieved by the use of sophisticated evolutionary models, combined with dense sampling of large numbers of characters. However, systematic biases and superimposed substitutions make this task very difficult. Model adequacy can sometimes be achieved at the cost of adding large numbers of free parameters, with each parameter being optimized according to some criterion, resulting in increased computation times and large variances in the model estimates. In this study, we develop a simple approach that estimates the relative evolutionary rate of each homologous character. The method that we describe uses the similarity between characters as a proxy for evolutionary rate. In this article, we work on the premise that if the character-state distribution of a homologous character is similar to many other characters, then this character is likely to be relatively slowly evolving. If the character-state distribution of a homologous character is not similar to many or any of the rest of the characters in a data set, then it is likely to be the result of rapid evolution. We show that in some test cases, at least, the premise can hold and the inferences are robust. Importantly, the method does not use a "starting tree" to make the inference and therefore is tree independent. We demonstrate that this approach can work as well as a maximum likelihood (ML) approach, though the ML method needs to have a known phylogeny, or at least a very good estimate of that phylogeny. We then demonstrate some uses for this method of analysis, including the improvement in phylogeny reconstruction for both deep-level and recent relationships and overcoming systematic biases such as base composition bias. Furthermore, we compare this approach to two well-established methods for reweighting or removing characters. These other methods are tree-based and we show that they can be systematically biased. We feel this method can be useful for phylogeny reconstruction, understanding evolutionary rate variation, and for understanding selection variation on different characters.
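A minimal sketch of the "similarity as a rate proxy" idea: each character (alignment column) is scored by how often it partitions taxon pairs the same way as the other characters, with low scores flagging fast-evolving, noisy characters. The pair-agreement score used here is a simple stand-in, not a reimplementation of the authors' measure.

```python
import itertools
import numpy as np

def pairwise_agreement(col_a, col_b):
    """Fraction of taxon pairs that two characters partition the same way
    (both assign the pair to the same state, or both to different states)."""
    pairs = list(itertools.combinations(range(len(col_a)), 2))
    same = sum((col_a[i] == col_a[j]) == (col_b[i] == col_b[j]) for i, j in pairs)
    return same / len(pairs)

def relative_rates(alignment):
    """Score each character by its mean agreement with all other characters;
    high scores suggest slowly evolving characters, low scores fast ones.
    No starting tree is used anywhere in the calculation."""
    cols = list(zip(*alignment))          # one tuple of states per character
    n = len(cols)
    scores = np.zeros(n)
    for i in range(n):
        scores[i] = np.mean([pairwise_agreement(cols[i], cols[j])
                             for j in range(n) if j != i])
    return scores

alignment = ["ACGTA", "ACGTT", "ACGAA", "TCGAT"]   # 4 taxa x 5 characters (toy data)
print(np.round(relative_rates(alignment), 2))
```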
ERIC Educational Resources Information Center
Wolf, Peter
2007-01-01
In the fall of 2003, Teaching Support Services (TSS), a department at the University of Guelph, was approached by a faculty member in the department of food sciences. Professor Art Hill was interested in seeking support in systematically assessing the department's undergraduate curriculum and using that assessment to trigger further improvement of…
A systematic linear space approach to solving partially described inverse eigenvalue problems
NASA Astrophysics Data System (ADS)
Hu, Sau-Lon James; Li, Haujun
2008-06-01
Most applications of the inverse eigenvalue problem (IEP), which concerns the reconstruction of a matrix from prescribed spectral data, are associated with special classes of structured matrices. Solving the IEP requires one to satisfy both the spectral constraint and the structural constraint. If the spectral constraint consists of only one or a few prescribed eigenpairs, this kind of inverse problem has been referred to as the partially described inverse eigenvalue problem (PDIEP). This paper develops an efficient, general and systematic approach to solving the PDIEP. Basically, the approach, applicable to various structured matrices, converts the PDIEP into an ordinary inverse problem formulated as a set of simultaneous linear equations. The simultaneous linear equations for the model parameters are solved with the singular value decomposition method. Because of the conversion to an ordinary inverse problem, other constraints associated with the model parameters can easily be incorporated into the solution procedure. Detailed derivations and numerical examples implementing the newly developed approach are presented for symmetric Toeplitz and quadratic pencil (including the mass, damping and stiffness matrices of a linear dynamic system) PDIEPs. Excellent numerical results are achieved for both kinds of problem, in situations where the solution is either unique or one of infinitely many.
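As a toy instance of "convert the PDIEP into simultaneous linear equations and solve them with the SVD", the sketch below recovers the first row of a symmetric Toeplitz matrix from a few prescribed eigenpairs. The matrix size, the number of eigenpairs, and the test values are illustrative; with too few eigenpairs the least-squares answer is just one member of an infinite family of solutions, which mirrors the unique/infinitely-many distinction in the abstract.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_from_eigenpairs(eigvals, eigvecs):
    """Recover the first row t of a symmetric Toeplitz matrix T (T[i, j] =
    t[|i-j|]) from prescribed eigenpairs, by writing T v = lambda v as a
    linear system in t and solving it with SVD-based least squares.

    eigvals: array of p prescribed eigenvalues
    eigvecs: n x p array whose columns are the prescribed eigenvectors"""
    n, p = eigvecs.shape
    rows, rhs = [], []
    for s in range(p):
        v, lam = eigvecs[:, s], eigvals[s]
        for i in range(n):
            coeff = np.zeros(n)
            for j in range(n):
                coeff[abs(i - j)] += v[j]      # (T v)_i is linear in t
            rows.append(coeff)
            rhs.append(lam * v[i])
    A, b = np.asarray(rows), np.asarray(rhs)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based least squares
    return t

# Quick self-check on a known 4 x 4 symmetric Toeplitz matrix.
t_true = np.array([4.0, 1.0, 0.5, 0.2])
T = toeplitz(t_true)
w, V = np.linalg.eigh(T)
t_est = toeplitz_from_eigenpairs(w[:3], V[:, :3])  # use only three eigenpairs
print(np.round(t_est, 3))
```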
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model-independent search in CMS. By systematically scanning the data for deviations from the Standard Model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias, the analysis is furthermore sensitive to a wide range of models for new physics, including the countless models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
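A bare-bones illustration of the region-scanning idea (excesses only, plain Poisson statistics, no systematic uncertainties, so far simpler than the analysis described): scan all contiguous bin ranges of a histogram and report the range that deviates most from the Monte Carlo expectation. All numbers are made up.

```python
import numpy as np
from scipy.stats import poisson

def scan_distribution(data_counts, mc_expectation):
    """Scan every contiguous bin region of a histogram and return the region
    with the smallest Poisson p-value for an excess over the expectation."""
    best = (1.0, None)
    n = len(data_counts)
    for lo in range(n):
        for hi in range(lo + 1, n + 1):
            obs = int(np.sum(data_counts[lo:hi]))
            exp = float(np.sum(mc_expectation[lo:hi]))
            p = poisson.sf(obs - 1, exp)       # P(N >= obs | expectation)
            if p < best[0]:
                best = (p, (lo, hi))
    return best

data = np.array([12, 9, 14, 30, 28, 11, 8])     # observed event counts per bin
mc = np.array([10.0, 10.0, 11.0, 12.0, 12.0, 10.0, 9.0])  # SM expectation
print(scan_distribution(data, mc))
```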
Continuing education for general practice. 2. Systematic learning from experience.
al-Shehri, A; Stanley, I; Thomas, P
1993-01-01
Prompted by evidence that the recently-adopted arrangements for ongoing education among established general practitioners are unsatisfactory, the first of a pair of papers examined the theoretical basis of continuing education for general practice and proposed a model of self-directed learning in which the experience of established practitioners is connected, through the media of reading, reflection and audit, with competence for the role. In this paper a practical, systematic approach to self-directed learning by general practitioners is described based on the model. The contribution which appropriate participation in continuing medical education can make to enhancing learning from experience is outlined. PMID:8373649
NASA Astrophysics Data System (ADS)
Zammit-Mangion, Andrew; Stavert, Ann; Rigby, Matthew; Ganesan, Anita; Rayner, Peter; Cressie, Noel
2017-04-01
The Orbiting Carbon Observatory-2 (OCO-2) satellite was launched on 2 July 2014, and it has been a source of atmospheric CO2 data since September 2014. The OCO-2 dataset contains a number of variables, but the one of most interest for flux inversion has been the column-averaged dry-air mole fraction (in units of ppm). These global level-2 data offer the possibility of inferring CO2 fluxes at Earth's surface and tracking those fluxes over time. However, as well as having a component of random error, the OCO-2 data have a component of systematic error that is dependent on the instrument's mode, namely land nadir, land glint, and ocean glint. Our statistical approach to CO2-flux inversion starts with constructing a statistical model for the random and systematic errors with parameters that can be estimated from the OCO-2 data and possibly in situ sources from flasks, towers, and the Total Column Carbon Observing Network (TCCON). Dimension reduction of the flux field is achieved through the use of physical basis functions, while temporal evolution of the flux is captured by modelling the basis-function coefficients as a vector autoregressive process. For computational efficiency, flux inversion uses only three months of sensitivities of mole fraction to changes in flux, computed using MOZART; any residual variation is captured through the modelling of a stochastic process that varies smoothly as a function of latitude. The second stage of our statistical approach is to simulate from the posterior distribution of the basis-function coefficients and all unknown parameters given the data using a fully Bayesian Markov chain Monte Carlo (MCMC) algorithm. Estimates and posterior variances of the flux field can then be obtained straightforwardly from this distribution. Our statistical approach is different than others, as it simultaneously makes inference (and quantifies uncertainty) on both the error components' parameters and the CO2 fluxes. We compare it to more classical approaches through an Observing System Simulation Experiment (OSSE) on a global scale. By changing the size of the random and systematic errors in the OSSE, we can determine the corresponding spatial and temporal resolutions at which useful flux signals could be detected from the OCO-2 data.
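One plausible way to write the hierarchy sketched above (the notation is illustrative, not the authors'): data are sensitivities applied to fluxes plus a mode-dependent systematic term and random error, fluxes are expanded in physical basis functions, and the basis-function coefficients follow a first-order vector autoregression.

```latex
\begin{aligned}
\mathbf{Y}_t &= \mathbf{H}_t\,\mathbf{f}_t + \mathbf{b}_{m(t)} + \boldsymbol{\varepsilon}_t,
  &\boldsymbol{\varepsilon}_t &\sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_\varepsilon),\\
\mathbf{f}_t &= \boldsymbol{\Phi}\,\boldsymbol{\alpha}_t + \boldsymbol{\nu}_t, &&\\
\boldsymbol{\alpha}_t &= \mathbf{A}\,\boldsymbol{\alpha}_{t-1} + \boldsymbol{\eta}_t,
  &\boldsymbol{\eta}_t &\sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}_\eta),
\end{aligned}
```

where Y_t are the mole-fraction data in period t, H_t the transport-model sensitivities, b_{m(t)} the systematic error for instrument mode m(t) (land nadir, land glint, or ocean glint), f_t the flux field, Phi the physical basis functions, and alpha_t their coefficients; all unknowns are sampled jointly by MCMC.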
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
Nonholonomic Hamiltonian Method for Meso-macroscale Simulations of Reacting Shocks
NASA Astrophysics Data System (ADS)
Fahrenthold, Eric; Lee, Sangyup
2015-06-01
The seamless integration of macroscale, mesoscale, and molecular scale models of reacting shock physics has been hindered by dramatic differences in the model formulation techniques normally used at different scales. In recent research the authors have developed the first unified discrete Hamiltonian approach to multiscale simulation of reacting shock physics. Unlike previous work, the formulation employs reacting thermomechanical Hamiltonian formulations at all scales, including the continuum. Unlike previous work, the formulation employs a nonholonomic modeling approach to systematically couple the models developed at all scales. Example applications of the method show meso-macroscale shock-to-detonation simulations in nitromethane and RDX. Research supported by the Defense Threat Reduction Agency.
PARAGON: A Systematic, Integrated Approach to Aerosol Observation and Modeling
NASA Technical Reports Server (NTRS)
Diner, David J.; Kahn, Ralph A.; Braverman, Amy J.; Davies, Roger; Martonchik, John V.; Menzies, Robert T.; Ackerman, Thomas P.; Seinfeld, John H.; Anderson, Theodore L.; Charlson, Robert J.;
2004-01-01
Aerosols are generated and transformed by myriad processes operating across many spatial and temporal scales. Evaluation of climate models and their sensitivity to changes, such as in greenhouse gas abundances, requires quantifying natural and anthropogenic aerosol forcings and accounting for other critical factors, such as cloud feedbacks. High accuracy is required to provide sufficient sensitivity to perturbations, separate anthropogenic from natural influences, and develop confidence in inputs used to support policy decisions. Although many relevant data sources exist, the aerosol research community does not currently have the means to combine these diverse inputs into an integrated data set for maximum scientific benefit. Bridging observational gaps, adapting to evolving measurements, and establishing rigorous protocols for evaluating models are necessary, while simultaneously maintaining consistent, well understood accuracies. The Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON) concept represents a systematic, integrated approach to global aerosol characterization, bringing together modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies to provide the machinery necessary for achieving a comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system. We outline a framework for integrating and interpreting observations and models and establishing an accurate, consistent and cohesive long-term data record.
Using Laser Scanners to Augment the Systematic Error Pointing Model
NASA Astrophysics Data System (ADS)
Wernicke, D. R.
2016-08-01
The antennas of the Deep Space Network (DSN) rely on precise pointing algorithms to communicate with spacecraft that are billions of miles away. Although the existing systematic error pointing model is effective at reducing blind pointing errors due to static misalignments, several of its terms have a strong dependence on seasonal and even daily thermal variation and are thus not easily modeled. Changes in the thermal state of the structure create a separation from the model and introduce a varying pointing offset. Compensating for this varying offset is possible by augmenting the pointing model with laser scanners. In this approach, laser scanners mounted to the alidade measure structural displacements while a series of transformations generate correction angles. Two sets of experiments were conducted in August 2015 using commercially available laser scanners. When compared with historical monopulse corrections under similar conditions, the computed corrections are within 3 mdeg of the mean. However, although the results show promise, several key challenges relating to the sensitivity of the optical equipment to sunlight render an implementation of this approach impractical. Other measurement devices such as inclinometers may be implementable at a significantly lower cost.
Design of experiments applications in bioprocessing: concepts and approach.
Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S
2014-01-01
Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE to the development of different bioprocessing unit operations. However, a systematic approach for evaluating the different DOE designs and choosing the optimal design for a given application has not been published yet. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for constructing the model and for precisely predicting the responses in three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
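A minimal sketch of AIC-based model selection of the kind mentioned above, comparing a main-effects response-surface model against one with an interaction term. The data are synthetic, and the simplified AIC drops constant terms, so only differences between models are meaningful.

```python
import numpy as np

def aic_linear_model(X, y):
    """Akaike information criterion for an ordinary least-squares fit with
    Gaussian errors: AIC = n*ln(RSS/n) + 2k, constant terms dropped."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * k

# Synthetic two-factor response with a genuine interaction effect.
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + 0.5 * x1 * x2 + rng.normal(0, 0.1, 50)

main_effects = np.column_stack([np.ones(50), x1, x2])
with_interaction = np.column_stack([np.ones(50), x1, x2, x1 * x2])
print(aic_linear_model(main_effects, y), aic_linear_model(with_interaction, y))
```

The candidate model with the lower AIC is preferred; here the interaction model should win because the simulated response truly contains an interaction.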
NASA Astrophysics Data System (ADS)
Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.
2015-12-01
Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and prediction. There are several advanced and still developing global MHD (GMHD) models that are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events like geomagnetic storms. Systematic validation of GMHD models against observations continues to be a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored/tested, produced by (ii) a specially designed set of computer runs that simulate realistic statistical distributions of critical solar wind parameters, and compared to (iii) observation-based empirical relationships for these parameters. Tested under approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is supposed to be a valid approach. At the same time, the models have systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after a north-south interplanetary magnetic field turning. According to the validation results, none of the models emerges as an absolute leader. The new approach suggested for evaluating the models' performance against reality may be used by model users when planning their investigations, as well as by model developers and those interested in quantitatively evaluating progress in magnetospheric modeling.
Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N.; Mantalaris, Athanasios
2013-01-01
The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals. PMID:24688682
NASA Astrophysics Data System (ADS)
Gorbunov, Michael E.; Kirchengast, Gottfried
2018-01-01
A new reference occultation processing system (rOPS) will include a Global Navigation Satellite System (GNSS) radio occultation (RO) retrieval chain with integrated uncertainty propagation. In this paper, we focus on wave-optics bending angle (BA) retrieval in the lower troposphere and introduce (1) an empirically estimated boundary layer bias (BLB) model then employed to reduce the systematic uncertainty of excess phases and bending angles in about the lowest 2 km of the troposphere and (2) the estimation of (residual) systematic uncertainties and their propagation together with random uncertainties from excess phase to bending angle profiles. Our BLB model describes the estimated bias of the excess phase transferred from the estimated bias of the bending angle, for which the model is built, informed by analyzing refractivity fluctuation statistics shown to induce such biases. The model is derived from regression analysis using a large ensemble of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) RO observations and concurrent European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields. It is formulated in terms of predictors and adaptive functions (powers and cross products of predictors), where we use six main predictors derived from observations: impact altitude, latitude, bending angle and its standard deviation, canonical transform (CT) amplitude, and its fluctuation index. Based on an ensemble of test days, independent of the days of data used for the regression analysis to establish the BLB model, we find the model very effective for bias reduction and capable of reducing bending angle and corresponding refractivity biases by about a factor of 5. The estimated residual systematic uncertainty, after the BLB profile subtraction, is lower bounded by the uncertainty from the (indirect) use of ECMWF analysis fields but is significantly lower than the systematic uncertainty without BLB correction. The systematic and random uncertainties are propagated from excess phase to bending angle profiles, using a perturbation approach and the wave-optical method recently introduced by Gorbunov and Kirchengast (2015), starting with estimated excess phase uncertainties. The results are encouraging and this uncertainty propagation approach combined with BLB correction enables a robust reduction and quantification of the uncertainties of excess phases and bending angles in the lower troposphere.
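The sketch below shows the generic shape of such a bias model: regress the estimated bias on the predictors, their powers, and their cross products, and fit by least squares. The six random columns merely stand in for the real predictor profiles (impact altitude, latitude, bending angle, its standard deviation, CT amplitude, fluctuation index); the paper's actual adaptive functions and fitting choices are not reproduced here.

```python
from itertools import combinations_with_replacement
import numpy as np

def fit_bias_model(predictors, bias, degree=2):
    """Least-squares regression of an estimated bias on predictors plus all
    their powers and cross products up to the given degree.

    predictors: (n_samples x n_predictors) array
    bias:       (n_samples,) array of estimated bending-angle biases
    Returns the fitted coefficients and the design matrix."""
    n, p = predictors.shape
    cols = [np.ones(n)]                       # intercept
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(p), d):
            cols.append(np.prod(predictors[:, idx], axis=1))
    design = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(design, bias, rcond=None)
    return coef, design

# Toy usage with random numbers standing in for real predictor profiles.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))
b = 0.3 * X[:, 0] - 0.1 * X[:, 2] * X[:, 4] + rng.normal(0, 0.05, 500)
coef, design = fit_bias_model(X, b)
print(design.shape, np.round(coef[:5], 3))
```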
Exploring Measurement Error with Cookies: A Real and Virtual Approach via Interactive Excel
ERIC Educational Resources Information Center
Sinex, Scott A; Gage, Barbara A.; Beck, Peggy J.
2007-01-01
A simple, guided-inquiry investigation using stacked sandwich cookies is employed to develop a simple linear mathematical model and to explore measurement error by incorporating errors as part of the investigation. Both random and systematic errors are presented. The model and errors are then investigated further by engaging with an interactive…
ERIC Educational Resources Information Center
Buttram, Joan L.; Covert, Robert W.
The Discrepancy Evaluation Model (DEM), developed in 1966 by Malcolm Provus, provides information for program assessment and program improvement. Under the DEM, evaluation is defined as the comparison of an actual performance to a desired standard. The DEM embodies five stages of evaluation based upon a program's natural development: program…
ERIC Educational Resources Information Center
Cavendish, Wendy; Harry, Beth; Menda, Anne Maria; Espinosa, Anabel; Mahotiere, Margarette
2016-01-01
Background: The Response to Intervention (RTI) approach involves the use of a dynamic model built around the systematic documentation of students' response to research-based instructional interventions. Although there has been widespread implementation of RTI models for early intervention and in some cases, as a means to identify students with…
Enhancing Services to the Rural Elderly through Primary Care Centers.
ERIC Educational Resources Information Center
Leighton, Jeannette; Sprague, Patricia
This paper describes a systematic, coordinated approach to the delivery of health and social services to the rural elderly of Maine provided by the Kennebec Valley Regional Health Agency. Four points of the model are described which distinguish it from other models of coordination: (1) a strong medical orientation in the assessment process; (2)…
ERIC Educational Resources Information Center
Jordan, Jakarla
2016-01-01
This research examines the systematic process of developing an integrative play therapy group model for middle school male students, ages 11-15 who participate in bullying behaviors. Play therapy approaches and evidence-based practices are documented as effective measures for addressing bullying behaviors with children and adolescents. This group…
A Systematic Approach for Identifying Level-1 Error Covariance Structures in Latent Growth Modeling
ERIC Educational Resources Information Center
Ding, Cherng G.; Jane, Ten-Der; Wu, Chiu-Hui; Lin, Hang-Rung; Shen, Chih-Kang
2017-01-01
It has been pointed out in the literature that misspecification of the level-1 error covariance structure in latent growth modeling (LGM) has detrimental impacts on the inferences about growth parameters. Since correct covariance structure is difficult to specify by theory, the identification needs to rely on a specification search, which,…
[Thinking on the Training of Uniportal Video-assisted Thoracic Surgery].
Zhu, Yuming; Jiang, Gening
2018-04-20
Recently, uniportal video-assisted thoracic surgery (VATS) has developed rapidly and has become the main theme of global surgical development. Specific, standardized and systematic training in this technique has become an important topic. Specific training in the uniportal VATS approach is crucial to ensure safety and radical treatment. Such a training approach, including direct interaction with experienced surgeons in high-volume centers, is crucial and represents an indispensable step. Another form of training that usually occurs after preceptorship is proctorship: an experienced mentor can be invited to a trainee's own center to provide specific on-site tutelage. Videos published online are commonly used as training material. Technology has allowed the use of different models of simulators for training. The most common is animal wet-laboratory training. Other models, however, have been used most recently, such as 3D and VR technology, virtual reality simulators, and completely artificial models of the human thorax with synthetic lung, vessel, airway, and nodal tissues. Short-duration, high-volume clinical immersion training and long-term systematic training in high-volume centers are receiving more and more attention. Based on graded evaluation of the trainees, a diversified training mode is adopted, and training targeted to the needs of individual trainees helps to improve the training effect. We have done some work on systematic and standardized training of uniportal VATS in a single center. We believe such training is feasible and absolutely necessary.
Owens, Douglas K; Whitlock, Evelyn P; Henderson, Jillian; Pignone, Michael P; Krist, Alex H; Bibbins-Domingo, Kirsten; Curry, Susan J; Davidson, Karina W; Ebell, Mark; Gillman, Matthew W; Grossman, David C; Kemper, Alex R; Kurth, Ann E; Maciosek, Michael; Siu, Albert L; LeFevre, Michael L
2016-10-04
The U.S. Preventive Services Task Force (USPSTF) develops evidence-based recommendations about preventive care based on comprehensive systematic reviews of the best available evidence. Decision models provide a complementary, quantitative approach to support the USPSTF as it deliberates about the evidence and develops recommendations for clinical and policy use. This article describes the rationale for using modeling, an approach to selecting topics for modeling, and how modeling may inform recommendations about clinical preventive services. Decision modeling is useful when clinical questions remain about how to target an empirically established clinical preventive service at the individual or program level or when complex determinations of magnitude of net benefit, overall or among important subpopulations, are required. Before deciding whether to use decision modeling, the USPSTF assesses whether the benefits and harms of the preventive service have been established empirically, assesses whether there are key issues about applicability or implementation that modeling could address, and then defines the decision problem and key questions to address through modeling. Decision analyses conducted for the USPSTF are expected to follow best practices for modeling. For chosen topics, the USPSTF assesses the strengths and limitations of the systematically reviewed evidence and the modeling analyses and integrates the results of each to make preventive service recommendations.
Making the Most of What We Already Know: A Three-Stage Approach to Systematic Reviewing.
Rebelo Da Silva, Natalie; Zaranyika, Hazel; Langer, Laurenz; Randall, Nicola; Muchiri, Evans; Stewart, Ruth
2016-09-06
Conducting a systematic review in social policy is a resource-intensive process in terms of time and funds. It is thus important to understand the scope of the evidence base of a topic area prior to conducting a synthesis of primary research in order to maximize these resources. One approach to conserving resources is to map out the available evidence prior to undertaking a traditional synthesis. A few examples of this approach exist in the form of gap maps, overviews of reviews, and systematic maps supported by social policy and systematic review agencies alike. Despite this growing call for alternative approaches to systematic reviews, it is still common for systematic review teams to embark on a traditional in-depth review only. This article describes a three-stage approach to systematic reviewing that was applied to a systematic review focusing on interventions for smallholder farmers in Africa. We argue that this approach proved useful in helping us to understand the evidence base. By applying preliminary steps as part of a three-stage approach, we were able to maximize the resources needed to conduct a traditional systematic review on a more focused research question. This enabled us to identify and fill real knowledge gaps, build on work that had already been done, and avoid wasting resources on areas of work that would have no useful outcome. It also facilitated meaningful engagement between the review team and our key policy stakeholders. © The Author(s) 2016.
Qumquad: a UML-based approach for remodeling of legacy systems in health care.
Garde, Sebastian; Knaup, Petra; Herold, Ralf
2003-07-01
Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems a thorough remodeling is indispensable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematic identification of weak points in this model, and (iii) development of a target concept for the reimplementation considering the identified weak points. We applied Qumquad to remodel a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO, and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO, since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.
Quantifying Overdiagnosis in Cancer Screening: A Systematic Review to Evaluate the Methodology.
Ripping, Theodora M; Ten Haaf, Kevin; Verbeek, André L M; van Ravesteyn, Nicolien T; Broeders, Mireille J M
2017-10-01
Overdiagnosis is the main harm of cancer screening programs but is difficult to quantify. This review aims to evaluate existing approaches to estimate the magnitude of overdiagnosis in cancer screening in order to gain insight into the strengths and limitations of these approaches and to provide researchers with guidance to obtain reliable estimates of overdiagnosis in cancer screening. A systematic review was done of primary research studies in PubMed that were published before January 1, 2016, and quantified overdiagnosis in breast cancer screening. The studies meeting inclusion criteria were then categorized by their methods to adjust for lead time and to obtain an unscreened reference population. For each approach, we provide an overview of the data required, assumptions made, limitations, and strengths. A total of 442 studies were identified in the initial search. Forty studies met the inclusion criteria for the qualitative review. We grouped the approaches to adjust for lead time in two main categories: the lead time approach and the excess incidence approach. The lead time approach was further subdivided into the mean lead time approach, lead time distribution approach, and natural history modeling. The excess incidence approach was subdivided into the cumulative incidence approach and early vs late-stage cancer approach. The approaches used to obtain an unscreened reference population were grouped into the following categories: control group of a randomized controlled trial, nonattenders, control region, extrapolation of a prescreening trend, uninvited groups, adjustment for the effect of screening, and natural history modeling. Each approach to adjust for lead time and obtain an unscreened reference population has its own strengths and limitations, which should be taken into consideration when estimating overdiagnosis. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
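For concreteness, the excess-incidence family of estimates can be written as below; this is one common variant, and the review stresses that the choice of denominator and of follow-up window (to let lead time elapse) changes the estimate substantially.

```latex
\text{Overdiagnosis}\,(\%) \;=\; 100 \times \frac{C_{\mathrm{S}} - C_{\mathrm{R}}}{D},
```

where C_S is the cumulative number of cancers diagnosed in the screened group over the screening period plus follow-up, C_R the corresponding number in the unscreened reference population, and D the number of screen-detected cancers; alternative denominators (all cancers in the screened group, or all participants) are also in use and yield different percentages.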
An effective model for ergonomic optimization applied to a new automotive assembly line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio
2016-06-08
An efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute of Occupational Safety and Health, adapted to Italian legislation. The proposed model provides an ergonomic optimization by analyzing the ergonomic relations of manual work performed under correct conditions. The model includes a schematic and systematic method for analyzing the operations, and identifies all possible ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of the operations makes optimization fundamental. The application clearly demonstrates the effectiveness of the new approach.
Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick
2013-01-01
Introduction Cost effectiveness analyses (CEA) can provide useful information on how to invest limited funds; however, they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods A systematic review of studies was conducted with particular focus on study quality and the variability in the inputs of the models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify how the observed differences in model inputs taken from the identified studies affect predicted results. Results Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA and TST strategies were compared using our common decision analysis model, their predicted effectiveness largely overlapped. Implications Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
NASA Astrophysics Data System (ADS)
Moeck, Christian; Affolter, Annette; Radny, Dirk; Dressmann, Horst; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario
2018-02-01
A three-dimensional groundwater model was used to improve water resource management for a study area in north-west Switzerland, where drinking-water production is close to former landfills and industrial areas. To avoid drinking-water contamination, artificial groundwater recharge with surface water is used to create a hydraulic barrier between the contaminated sites and drinking-water extraction wells. The model was used for simulating existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction between existing observation points using a developed three-point estimation method for a large number of scenarios was carried out. It is demonstrated that systematically applying the developed methodology helps to identify vulnerable locations which are sensitive to changing boundary conditions such as those arising from changes to artificial groundwater recharge rates. At these locations, additional investigations and protection are required. The presented integrated approach, using the groundwater flow direction between observation points, can be easily transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
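The paper's three-point estimation method is not spelled out in this abstract; as a generic point of reference, the sketch below solves the classic hydrogeological three-point problem, fitting a plane through the hydraulic heads at three observation wells and taking the flow direction as the negative head gradient. Well coordinates and heads are made-up values.

```python
import numpy as np

# Hypothetical observation wells: (x, y) in metres, hydraulic head h in m a.s.l.
wells = np.array([[0.0, 0.0], [250.0, 40.0], [80.0, 300.0]])
heads = np.array([252.3, 251.8, 251.1])

# Fit the plane h = a*x + b*y + c through the three points.
A = np.column_stack([wells, np.ones(3)])
a, b, c = np.linalg.solve(A, heads)

# Groundwater flows down-gradient, i.e. along -(grad h) = -(a, b).
gradient = np.array([a, b])
flow_dir = -gradient / np.linalg.norm(gradient)
azimuth = np.degrees(np.arctan2(flow_dir[0], flow_dir[1])) % 360  # clockwise from north

print(f"Hydraulic gradient magnitude: {np.linalg.norm(gradient):.4f}")
print(f"Flow direction (azimuth, deg from north): {azimuth:.1f}")
```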
Galkin, A A
2012-01-01
On the basis of graphic models of the human response to environmental factors, two main types of complex quantitative influence were revealed, as well as the interrelation between deterministic effects at the level of the individual and stochastic effects at the level of the population. It is suggested that two main kinds of factors be distinguished: essential factors and accidental factors. Essential factors are inherent to the environment, whereas accidental factors are foreign to it. The two kinds call for different approaches to hygienic standardization: accidental factors need a dot-like approach, whereas a two-level range approach is suitable for essential factors.
Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios
2018-05-02
Currently, design and optimisation of biotechnological bioprocesses are performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. While elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially-relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics expression patterns of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability and potential as a systematic optimal bioprocess design tool was demonstrated by effectively predicting bioprocess performance, which was in agreement with experimental values, when compared to four commonly used models that deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through the model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
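The GRN-informed kinetic model itself is not reproduced in this abstract; for orientation, the sketch below integrates the kind of unstructured, Monod-type dual-substrate batch growth kinetics that the authors replace, for a culture growing on toluene and m-xylene. All parameter values and initial conditions are assumptions made for illustration, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed Monod parameters (illustrative only): mu_max [1/h], Ks [g/L], yield [gX/gS]
mu_max_tol, Ks_tol, Y_tol = 0.35, 0.02, 0.60
mu_max_xyl, Ks_xyl, Y_xyl = 0.25, 0.03, 0.55

def batch(t, y):
    """Unstructured dual-substrate batch model: biomass X, toluene S_tol, m-xylene S_xyl."""
    X, S_tol, S_xyl = y
    mu_tol = mu_max_tol * S_tol / (Ks_tol + S_tol)
    mu_xyl = mu_max_xyl * S_xyl / (Ks_xyl + S_xyl)
    return [(mu_tol + mu_xyl) * X,          # dX/dt
            -mu_tol * X / Y_tol,            # dS_tol/dt
            -mu_xyl * X / Y_xyl]            # dS_xyl/dt

sol = solve_ivp(batch, (0.0, 24.0), [0.05, 0.3, 0.3], t_eval=np.linspace(0, 24, 49))
print(f"Final biomass: {sol.y[0, -1]:.3f} g/L; residual toluene: {sol.y[1, -1]:.4f} g/L")
```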
Sandia fracture challenge 2: Sandia California's modeling approach
Karlson, Kyle N.; James W. Foulk, III; Brown, Arthur A.; ...
2016-03-09
The second Sandia Fracture Challenge illustrates that predicting the ductile fracture of Ti-6Al-4V subjected to moderate and elevated rates of loading requires thermomechanical coupling, elasto-thermo-poro-viscoplastic constitutive models with the physics of anisotropy and regularized numerical methods for crack initiation and propagation. We detail our initial approach with an emphasis on iterative calibration and systematically increasing complexity to accommodate anisotropy in the context of an isotropic material model. Blind predictions illustrate strengths and weaknesses of our initial approach. We then revisit our findings to illustrate the importance of including anisotropy in the failure process. Furthermore, mesh-independent solutions of continuum damage models having both isotropic and anisotropic yield surfaces are obtained through nonlocality and localization elements.
Montoya-Castillo, Andrés; Reichman, David R
2017-01-14
We derive a semi-analytical form for the Wigner transform for the canonical density operator of a discrete system coupled to a harmonic bath based on the path integral expansion of the Boltzmann factor. The introduction of this simple and controllable approach allows for the exact rendering of the canonical distribution and permits systematic convergence of static properties with respect to the number of path integral steps. In addition, the expressions derived here provide an exact and facile interface with quasi- and semi-classical dynamical methods, which enables the direct calculation of equilibrium time correlation functions within a wide array of approaches. We demonstrate that the present method represents a practical path for the calculation of thermodynamic data for the spin-boson and related systems. We illustrate the power of the present approach by detailing the improvement of the quality of Ehrenfest theory for the correlation function $C_{zz}(t) = \mathrm{Re}\langle\sigma_z(0)\sigma_z(t)\rangle$ for the spin-boson model with systematic convergence to the exact sampling function. Importantly, the numerically exact nature of the scheme presented here and its compatibility with semiclassical methods allows for the systematic testing of commonly used approximations for the Wigner-transformed canonical density.
Reavley, Nicola; Livingston, Jenni; Buchbinder, Rachelle; Bennell, Kim; Stecki, Chris; Osborne, Richard Harry
2010-02-01
Despite demands for evidence-based research and practice, little attention has been given to systematic approaches to the development of complex interventions to tackle workplace health problems. This paper outlines an approach to the initial stages of a workplace program development which integrates health promotion and disease management. The approach commences with systematic and genuine processes of obtaining information from key stakeholders with broad experience of these interventions. This information is constructed into a program framework in which practice-based and research-informed elements are both valued. We used this approach to develop a workplace education program to reduce the onset and impact of a common chronic disease - osteoarthritis. To gain information systematically at a national level, a structured concept mapping workshop with 47 participants from across Australia was undertaken. Participants were selected to maximise the whole-of-workplace perspective and included health education providers, academics, clinicians and policymakers. Participants generated statements in response to a seeding statement: Thinking as broadly as possible, what changes in education and support should occur in the workplace to help in the prevention and management of arthritis? Participants grouped the resulting statements into conceptually coherent groups and a computer program was used to generate a 'cluster map' along with a list of statements sorted according to cluster membership. In combination with research-based evidence, the concept map informed the development of a program logic model incorporating the program's guiding principles, possible service providers, services, training modes, program elements and the causal processes by which participants might benefit. The program logic model components were further validated through research findings from diverse fields, including health education, coaching, organisational learning, workplace interventions, workforce development and osteoarthritis disability prevention. In summary, wide and genuine consultation, concept mapping, and evidence-based program logic development were integrated to develop a whole-of-system complex intervention in which potential effectiveness and assimilation into the workplace for which optimised. Copyright 2009 Elsevier Ltd. All rights reserved.
Miraldi Utz, Virginia
2017-01-01
Myopia is the most common eye disorder and major cause of visual impairment worldwide. As the incidence of myopia continues to rise, the need to further understand the complex roles of molecular and environmental factors controlling variation in refractive error is of increasing importance. Tkatchenko and colleagues applied a systematic approach using a combination of gene set enrichment analysis, genome-wide association studies, and functional analysis of a murine model to identify a myopia susceptibility gene, APLP2. Differential expression of refractive error was associated with time spent reading for those with low frequency variants in this gene. This provides support for the longstanding hypothesis of gene-environment interactions in refractive error development.
Performing Systematic Literature Reviews with Novices: An Iterative Approach
ERIC Educational Resources Information Center
Lavallée, Mathieu; Robillard, Pierre-N.; Mirsalari, Reza
2014-01-01
Reviewers performing systematic literature reviews require understanding of the review process and of the knowledge domain. This paper presents an iterative approach for conducting systematic literature reviews that addresses the problems faced by reviewers who are novices in one or both levels of understanding. This approach is derived from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Training programs at DOE facilities should prepare personnel to safely and efficiently operate and maintain the facilities in accordance with DOE requirements. This guide presents good practices for a systematic approach to on-the-job training (OJT) and OJT programs and should be used in conjunction with DOE Training Program Handbook: A Systematic Approach to Training, and with the DOE Handbook entitled Alternative Systematic Approaches to Training to develop performance-based OJT programs. DOE contractors may also use this guide to modify existing OJT programs that do not meet the systematic approach to training (SAT) objectives.
Ataman, Meric
2017-01-01
Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and in integrating experimental data, they are often inconsistent across different studies and laboratories due to different criteria and detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these “consistently-reduced” models will help to clarify and facilitate integration of different experimental data to draw new understanding that can be directly extendable to genome-scale models. PMID:28727725
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jennings, Elise; Wolf, Rachel; Sako, Masao
2016-11-09
Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the 'Tripp' and 'Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of $\sim$1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying $\Omega_m$, $w_0$, $\alpha$ and $\beta$ and a magnitude offset parameter, with no systematics we obtain $\Delta(w_0) = w_0^{\rm true} - w_0^{\rm best\,fit} = -0.036 \pm 0.109$ (a $\sim$11% 1$\sigma$ uncertainty) using the Tripp metric and $\Delta(w_0) = -0.055 \pm 0.068$ (a $\sim$7% 1$\sigma$ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain $\Delta(w_0) = -0.062 \pm 0.132$ (a $\sim$14% 1$\sigma$ uncertainty) using the Tripp metric. Overall we find a 17% increase in the uncertainty on $w_0$ with systematics compared to without. We contrast this with a MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
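The superABC sampler itself is not reproduced here; as a conceptual illustration of likelihood-free inference with a forward model and a distance metric, the toy ABC rejection sampler below recovers the mean of Gaussian data. The prior, summary statistic, distance, and tolerance are all assumptions made for this sketch, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data from a hidden true parameter.
theta_true = 0.7
data_obs = rng.normal(theta_true, 1.0, size=500)
summary_obs = data_obs.mean()

def forward_model(theta, n=500):
    """Forward-simulate a data set for a given parameter (systematics could be injected here)."""
    return rng.normal(theta, 1.0, size=n)

# ABC rejection: sample from the prior, keep draws whose simulated summary is close to the data.
epsilon = 0.05
accepted = []
for _ in range(20_000):
    theta = rng.uniform(-2.0, 2.0)                   # flat prior
    if abs(forward_model(theta).mean() - summary_obs) < epsilon:
        accepted.append(theta)

accepted = np.array(accepted)
print(f"ABC posterior mean {accepted.mean():.3f} +/- {accepted.std():.3f} "
      f"from {accepted.size} accepted samples")
```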
NASA Astrophysics Data System (ADS)
Baker, D. F.; Oda, T.; O'Dell, C.; Wunch, D.; Jacobson, A. R.; Yoshida, Y.; Partners, T.
2012-12-01
Measurements of column CO2 concentration from space are now being taken at a spatial and temporal density that permits regional CO2 sources and sinks to be estimated. Systematic errors in the satellite retrievals must be minimized for these estimates to be useful, however. CO2 retrievals from the TANSO instrument aboard the GOSAT satellite are compared to similar column retrievals from the Total Carbon Column Observing Network (TCCON) as the primary method of validation; while this is a powerful approach, it can only be done for overflights of 10-20 locations and has not, for example, permitted validation of GOSAT data over the oceans or deserts. Here we present a complementary approach that uses a global atmospheric transport model and flux inversion method to compare different types of CO2 measurements (GOSAT, TCCON, surface in situ, and aircraft) at different locations, at the cost of added transport error. The measurements from any single type of data are used in a variational carbon data assimilation method to optimize surface CO2 fluxes (with a CarbonTracker prior), then the corresponding optimized CO2 concentration fields are compared to those data types not inverted, using the appropriate vertical weighting. With this approach, we find that GOSAT column CO2 retrievals from the ACOS project (version 2.9 and 2.10) contain systematic errors that make the modeled fit to the independent data worse. However, we find that the differences between the GOSAT data and our prior model are correlated with certain physical variables (aerosol amount, surface albedo, correction to total column mass) that are likely driving errors in the retrievals, independent of CO2 concentration. If we correct the GOSAT data using a fit to these variables, then we find the GOSAT data to improve the fit to independent CO2 data, which suggests that the useful information in the measurements outweighs the negative impact of the remaining systematic errors. With this assurance, we compare the flux estimates given by assimilating the ACOS GOSAT retrievals to similar ones given by NIES GOSAT column retrievals, bias-corrected in a similar manner. Finally, we have found systematic differences on the order of a half ppm between column CO2 integrals from 18 TCCON sites and those given by assimilating NOAA in situ data (both surface and aircraft profile) in this approach. We assess how these differences change in switching to a newer version of the TCCON retrieval software.
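As a minimal sketch of the kind of empirical bias correction described above, the code below fits a linear regression of (retrieval minus model) CO2 differences against a few retrieval-level physical variables and removes the fitted component. The predictor names and all data are synthetic placeholders, not the ACOS or NIES products.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Placeholder retrieval-level predictors: aerosol amount, surface albedo,
# and a total-column-mass correction term (all synthetic).
X = np.column_stack([rng.uniform(0.0, 0.3, n),
                     rng.uniform(0.05, 0.5, n),
                     rng.normal(0.0, 1.0, n)])

# Synthetic "retrieval minus prior model" CO2 differences (ppm) with a structured bias.
true_coeffs = np.array([4.0, -2.5, 0.8])
dco2 = X @ true_coeffs + rng.normal(0.0, 0.7, n)

# Fit the bias model by least squares (with an intercept) and subtract it.
A = np.column_stack([X, np.ones(n)])
coeffs, *_ = np.linalg.lstsq(A, dco2, rcond=None)
dco2_corrected = dco2 - A @ coeffs

print("Fitted bias coefficients:", np.round(coeffs, 2))
print(f"RMS scatter before/after correction: {dco2.std():.2f} / {dco2_corrected.std():.2f} ppm")
```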
DOT National Transportation Integrated Search
1996-06-01
To approach the reflection cracking problem in AC overlays systematically the properties of the materials intended to be used in an ISAC system were first identified. Various thermal/structural models and laboratory equipment were used for this purpo...
Behavioral Treatment of Children's Fears and Phobias: A Review.
ERIC Educational Resources Information Center
Morris, Richard J.; Kratochwill, Thomas R.
1985-01-01
An overview of the behaviorally-oriented fear reduction methods for children is presented. Systematic desensitization and related procedures, flooding-related therapies, contingency management approaches, modeling procedures, and self-control methods are discussed after reviewing normative and prevalence data regarding children's fears. Research…
Assessing theoretical uncertainties in fission barriers of superheavy nuclei
Agbemava, S. E.; Afanasjev, A. V.; Ray, D.; ...
2017-05-26
Here, theoretical uncertainties in the predictions of inner fission barrier heights in superheavy elements have been investigated in a systematic way for a set of state-of-the-art covariant energy density functionals which represent major classes of the functionals used in covariant density functional theory. They differ in basic model assumptions and fitting protocols. Both systematic and statistical uncertainties have been quantified, where the former turn out to be larger. Systematic uncertainties are substantial in superheavy elements and their behavior as a function of proton and neutron numbers contains a large random component. The benchmarking of the functionals to the experimental data on fission barriers in the actinides allows one to reduce the systematic theoretical uncertainties for the inner fission barriers of unknown superheavy elements. However, even then they on average increase on moving away from the region where benchmarking has been performed. In addition, a comparison with the results of non-relativistic approaches is performed in order to define full systematic theoretical uncertainties over the state-of-the-art models. Even for the models benchmarked in the actinides, the difference in the inner fission barrier height of some superheavy elements reaches 5-6 MeV. This uncertainty in the fission barrier heights will translate into huge (many tens of orders of magnitude) uncertainties in the spontaneous fission half-lives.
Salvi, Daniele; Macali, Armando; Mariottini, Paolo
2014-01-01
The bivalve family Ostreidae has a worldwide distribution and includes species of high economic importance. Phylogenetics and systematics of oysters based on morphology have proved difficult because of their high phenotypic plasticity. In this study we explore the phylogenetic information of the DNA sequence and secondary structure of the nuclear, fast-evolving ITS2 rRNA and the mitochondrial 16S rRNA genes from the Ostreidae, and we implement a multi-locus framework based on four loci for oyster phylogenetics and systematics. Sequence-structure rRNA models aided sequence alignment and improved the accuracy and nodal support of phylogenetic trees. In agreement with previous molecular studies, our phylogenetic results indicate that none of the currently recognized subfamilies, Crassostreinae, Ostreinae, and Lophinae, is monophyletic. Single gene trees based on Maximum likelihood (ML) and Bayesian (BA) methods and on sequence-structure ML were congruent with multilocus trees based on concatenated (ML and BA) and coalescent-based (BA) approaches and consistently supported three main clades: (i) Crassostrea, (ii) Saccostrea, and (iii) an Ostreinae-Lophinae lineage. Therefore, the subfamilies Crassostreinae (including Crassostrea), Saccostreinae subfam. nov. (including Saccostrea and tentatively Striostrea) and Ostreinae (including Ostreinae and Lophinae taxa) are recognized. Based on phylogenetic and biogeographical evidence the Asian species of Crassostrea from the Pacific Ocean are assigned to Magallana gen. nov., whereas an integrative taxonomic revision is required for the genera Ostrea and Dendostrea. This study points out the suitability of the ITS2 marker for DNA barcoding of oysters and the relevance of using sequence-structure rRNA models and features of the ITS2 folding in molecular phylogenetics and taxonomy. The multilocus approach allowed us to infer a robust phylogeny of Ostreidae, providing a broad molecular perspective on their systematics. PMID:25250663
Chen, Yong; Liu, Yulun; Ning, Jing; Cormier, Janice; Chu, Haitao
2014-01-01
Systematic reviews of diagnostic tests often involve a mixture of case-control and cohort studies. The standard methods for evaluating diagnostic accuracy only focus on sensitivity and specificity and ignore the information on disease prevalence contained in cohort studies. Consequently, such methods cannot provide estimates of measures related to disease prevalence, such as population averaged or overall positive and negative predictive values, which reflect the clinical utility of a diagnostic test. In this paper, we propose a hybrid approach that jointly models the disease prevalence along with the diagnostic test sensitivity and specificity in cohort studies, and the sensitivity and specificity in case-control studies. In order to overcome the potential computational difficulties in the standard full likelihood inference of the proposed hybrid model, we propose an alternative inference procedure based on the composite likelihood. Such composite likelihood based inference does not suffer computational problems and maintains high relative efficiency. In addition, it is more robust to model mis-specifications compared to the standard full likelihood inference. We apply our approach to a review of the performance of contemporary diagnostic imaging modalities for detecting metastases in patients with melanoma. PMID:25897179
Systematic Assessment Through Mathematical Model For Sustainability Reporting In Malaysia Context
NASA Astrophysics Data System (ADS)
Lanang, Wan Nurul Syahirah Wan; Turan, Faiz Mohd; Johan, Kartina
2017-08-01
Sustainability assessment has been studied and is increasingly recognized as a powerful and valuable tool to measure the performance of sustainability in a company or industry. Many tools for sustainable development already exist, and various initiatives have been taken, though most of the tools focus on environmental, economic and social aspects. Using the Green Project Management (GPM) P5 concept, which suggests that firms need to engage not only in responsible behaviours under the 3P principles of planet, profit and people, but also to include product and process in their practices, this study introduces a new mathematical model for assessing the level of sustainability practice in a company. Based on multiple case studies, involving in-depth interviews with senior directors, feedback from experts, and previous engineering reports, a systematic approach is taken with the aim of developing the collected data and feedback into a new mathematical model. The methodology comprises several phases: it starts with the analysis of the parameters and criteria selected according to the Malaysian industrial context, continues with data analysis involving regression, and ends with a normalisation process that determines whether the research objective has been met. This study is expected to provide a clear guideline for any company or organization to assimilate sustainability assessment into its development stage, and to support a better understanding of sustainability assessment so that the process approach can be integrated into a systematic approach for sustainability assessment.
On the engineering design for systematic integration of agent-orientation in industrial automation.
Yu, Liyong; Schüller, Andreas; Epple, Ulrich
2014-09-01
In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
The ILRS Contribution to ITRF2013
NASA Astrophysics Data System (ADS)
Pavlis, Erricos C.; Luceri, Cinzia; Sciarretta, Cecilia; Evans, Keith
2014-05-01
Satellite Laser Ranging (SLR) data have contributed to the definition of the International Terrestrial Reference Frame (ITRF) over the past three decades. The development of ITRF2005 ushered in a new era with the use of weekly or session contributions, allowing greater flexibility in the editing, relative weighting and the combination of information from the four contributing techniques. The new approach allows each Service to generate a solution based on the rigorous combination of the individual Analysis Centers' contributions that provides an opportunity to verify the intra-technique consistency and a comparison of internal procedures and adopted models. The intra- and inter-technique comparisons that the time series approach facilitates are an extremely powerful diagnostic that highlights differences and inconsistencies at the single station level. Over the past year the ILRS Analysis Working Group (AWG) worked on designing an improved ILRS contribution for the development of ITRF2013. The ILRS approach is based on the current IERS Conventions 2010 and our internal ILRS standards, with a few deviations that are documented. Since the Global Geodetic Observing System (GGOS) identified the ITRF as its key project, the ILRS has taken a two-pronged approach in order to meet its stringent goals: modernizing the engineering components (ground and space segments), and revising the modeling standards taking advantage of recent improvements in system Earth modeling. The main concern in the case of SLR is monitoring systematic errors at individual stations, accounting for undocumented discontinuities, and improving the target signature models. The latter has been addressed with the adoption of mm-level models for all of our targets. As far as the station systematics are concerned, the AWG had already embarked on a major effort to improve the handling of such errors prior to the development of ITRF2008. The results of that effort formed the foundation for the re-examination of the systematic errors at all sites. The new process benefited extensively from the results of the quality control process that ILRS provides on a daily basis as feedback to the stations, and the recovery of systematic error corrections from the data themselves through targeted investigations. The present re-analysis extends from 1983 to the end of 2013. The data quality for the early period 1983-1993 is significantly poorer than for the recent years. However, it contributes to the overall stability of the datum definition, especially in terms of its origin and scale and, as the more recent and higher quality data accumulate, the significance of the early data will progressively diminish. As in the case of ITRF2008, station engineers and analysts have worked together to determine the magnitude and cause of systematic errors that were noticed during the analysis, rationalize them based on events at the stations, and develop appropriate corrections whenever possible. This presentation will give an overview of the process and examples from the various steps.
What is a "good enough" termination?
Gabbard, Glen O
2009-06-01
In Freud's technique papers, he failed to develop a systematic approach to termination. Much of the existing literature is based on psychoanalytic mythologies about the way patients are expected to end analysis. The models described in the literature are often starkly at odds with what one sees in clinical practice. A wish for idealized versions of termination underlies much of what has been written, and we need to shift to a conceptual model involving "good enough" termination. A number of different endings to psychoanalysis may, in the long run, lead to productive outcomes; these models are examined, as are various approaches to the dilemmas presented at the time of termination.
A heuristic approach to determine an appropriate number of topics in topic modeling
2015-01-01
Background Topic modelling is an active research field in machine learning. While mainly used to build models from unstructured textual data, it offers an effective means of data mining where samples represent documents, and different biological endpoints or omics data represent words. Latent Dirichlet Allocation (LDA) is the most commonly used topic modelling method across a wide number of technical fields. However, model development can be arduous and tedious, and requires burdensome and systematic sensitivity studies in order to find the best set of model parameters. Often, time-consuming subjective evaluations are needed to compare models. Currently, research has yielded no easy way to choose the proper number of topics in a model beyond a major iterative approach. Methods and results Based on analysis of variation of statistical perplexity during topic modelling, a heuristic approach is proposed in this study to estimate the most appropriate number of topics. Specifically, the rate of perplexity change (RPC) as a function of numbers of topics is proposed as a suitable selector. We test the stability and effectiveness of the proposed method for three markedly different types of grounded-truth datasets: Salmonella next generation sequencing, pharmacological side effects, and textual abstracts on computational biology and bioinformatics (TCBB) from PubMed. Conclusion The proposed RPC-based method is demonstrated to choose the best number of topics in three numerical experiments of widely different data types, and for databases of very different sizes. The work required was markedly less arduous than if full systematic sensitivity studies had been carried out with number of topics as a parameter. We understand that additional investigation is needed to substantiate the method's theoretical basis, and to establish its generalizability in terms of dataset characteristics. PMID:26424364
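A minimal sketch of the rate-of-perplexity-change idea follows, using scikit-learn's LDA on a toy corpus. The corpus, candidate topic counts, and the way RPC is reported here are assumptions made for illustration and simplify the paper's procedure (e.g., perplexity is computed on the training data rather than a held-out split).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus; in practice this would be the document collection of interest.
docs = ["gene expression sequencing salmonella genome",
        "drug side effect adverse reaction liver",
        "sequencing reads genome assembly pipeline",
        "adverse drug event pharmacology dose",
        "topic model text mining abstracts",
        "latent dirichlet allocation text corpus"] * 20

X = CountVectorizer().fit_transform(docs)

topic_counts = [2, 4, 6, 8, 10]
perplexities = []
for k in topic_counts:
    lda = LatentDirichletAllocation(n_components=k, random_state=0).fit(X)
    perplexities.append(lda.perplexity(X))

# Rate of perplexity change (RPC) between successive candidate numbers of topics.
for i in range(1, len(topic_counts)):
    rpc = abs(perplexities[i] - perplexities[i - 1]) / (topic_counts[i] - topic_counts[i - 1])
    print(f"k={topic_counts[i-1]}->{topic_counts[i]}: "
          f"perplexity {perplexities[i-1]:.1f}->{perplexities[i]:.1f}, RPC={rpc:.2f}")
```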
Assessing Institutional Fitness: A Population Ecology Perspective on College and University Health.
ERIC Educational Resources Information Center
Emmert, Mark A.
1985-01-01
A population ecology model of institutional fitness broadens the scope of perspectives on organizational success. The approach allows systematic thinking about internal and external factors and identifies the critical dependency relationships between a college and other organizations that supply resources. (MLW)
Simulation-Based Evaluation of Learning Sequences for Instructional Technologies
ERIC Educational Resources Information Center
McEneaney, John E.
2016-01-01
Instructional technologies critically depend on systematic design, and learning hierarchies are a commonly advocated tool for designing instructional sequences. But hierarchies routinely allow numerous sequences and choosing an optimal sequence remains an unsolved problem. This study explores a simulation-based approach to modeling learning…
75 FR 12753 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-17
... effective at improving health care quality. While evidence-based approaches for decisionmaking have become standard in healthcare, this has been limited in laboratory medicine. No single- evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...
IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.
Bayard, David S; Schumitzky, Alan
2010-03-01
This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
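The full implicit dual control scheme (H-block, forward dynamic programming) is beyond a short sketch; the fragment below shows only the particle-filter ingredient, maintaining a belief over an unknown pendulum length from noisy angle measurements with the parameter carried in the augmented state. The pendulum model, noise levels, and resampling/jitter choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
g, dt, n_steps, n_particles = 9.81, 0.02, 200, 1000
true_length, meas_std = 0.8, 0.02

def step(theta, omega, length):
    """One explicit-Euler step of a frictionless pendulum (vectorizes over particles)."""
    omega_new = omega - (g / length) * np.sin(theta) * dt
    return theta + omega_new * dt, omega_new

# Simulate the "true" system and noisy angle measurements.
theta_t, omega_t, measurements = 0.5, 0.0, []
for _ in range(n_steps):
    theta_t, omega_t = step(theta_t, omega_t, true_length)
    measurements.append(theta_t + rng.normal(0.0, meas_std))

# Bootstrap particle filter with the unknown length in the augmented state.
theta_p = np.full(n_particles, 0.5)
omega_p = np.zeros(n_particles)
length_p = rng.uniform(0.3, 1.5, n_particles)                 # prior over the unknown parameter

for z in measurements:
    theta_p, omega_p = step(theta_p, omega_p, length_p)
    weights = np.exp(-0.5 * ((z - theta_p) / meas_std) ** 2) + 1e-300  # guard against underflow
    weights /= weights.sum()
    idx = rng.choice(n_particles, size=n_particles, p=weights)         # multinomial resampling
    theta_p, omega_p, length_p = theta_p[idx], omega_p[idx], length_p[idx]
    length_p = length_p + rng.normal(0.0, 0.002, n_particles)          # jitter against degeneracy

print(f"True length {true_length:.2f} m; posterior mean "
      f"{length_p.mean():.2f} m +/- {length_p.std():.2f} m")
```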
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat
Several transportation network vulnerability models have been proposed. However, most only consider disruptions as a static snapshot in time and the impact on total travel time. These approaches cannot consider the time-varying nature of travel demand nor other undesirable outcomes that follow from transportation network disruptions. This paper proposes an algorithmic approach to assess the vulnerability of a transportation network that considers the time-varying demand with an open source dynamic transportation simulation tool. The open source nature of the tool allows us to systematically consider many disruption scenarios and quantitatively compare their relative criticality. This is far more efficient than traditional approaches, which would require days or weeks of a transportation engineer's time to manually set up, run, and assess these simulations. In addition to travel time, we also collect statistics on additional fuel consumed and the corresponding carbon dioxide emissions. Our approach thus provides a more systematic assessment that is both time-varying and can consider additional negative consequences of disruptions for decision makers to evaluate.
System Dynamics Modeling for Public Health: Background and Opportunities
Homer, Jack B.; Hirsch, Gary B.
2006-01-01
The systems modeling methodology of system dynamics is well suited to address the dynamic complexity that characterizes many public health issues. The system dynamics approach involves the development of computer simulation models that portray processes of accumulation and feedback and that may be tested systematically to find effective policies for overcoming policy resistance. System dynamics modeling of chronic disease prevention should seek to incorporate all the basic elements of a modern ecological approach, including disease outcomes, health and risk behaviors, environmental factors, and health-related resources and delivery systems. System dynamics shows promise as a means of modeling multiple interacting diseases and risks, the interaction of delivery systems and diseased populations, and matters of national and state policy. PMID:16449591
The Limitations of Model-Based Experimental Design and Parameter Estimation in Sloppy Systems.
White, Andrew; Tolman, Malachi; Thames, Howard D; Withers, Hubert Rodney; Mason, Kathy A; Transtrum, Mark K
2016-12-01
We explore the relationship among experimental design, parameter estimation, and systematic error in sloppy models. We show that the approximate nature of mathematical models poses challenges for experimental design in sloppy models. In many models of complex biological processes it is unknown which physical mechanisms must be included to explain system behaviors. As a consequence, models are often overly complex, with many practically unidentifiable parameters. Furthermore, which mechanisms are relevant/irrelevant varies among experiments. By selecting complementary experiments, experimental design may inadvertently make details that were omitted from the model become relevant. When this occurs, the model will have a large systematic error and fail to give a good fit to the data. We use a simple hyper-model of model error to quantify a model's discrepancy and apply it to two models of complex biological processes (EGFR signaling and DNA repair) with optimally selected experiments. We find that although parameters may be accurately estimated, the discrepancy in the model renders it less predictive than it was in the sloppy regime where systematic error is small. We introduce the concept of a sloppy system: a sequence of models of increasing complexity that become sloppy in the limit of microscopic accuracy. We explore the limits of accurate parameter estimation in sloppy systems and argue that identifying underlying mechanisms controlling system behavior is better approached by considering a hierarchy of models of varying detail rather than focusing on parameter estimation in a single model.
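As a small illustration of what "sloppy" means in practice, the sketch below computes the eigenvalue spectrum of the Fisher information matrix (J^T J) for a sum-of-exponentials model at assumed parameter values and time points; the many-orders-of-magnitude spread of eigenvalues is the usual signature of a sloppy model. This is a generic textbook-style example, not the EGFR or DNA-repair models from the paper.

```python
import numpy as np

# Sum-of-exponentials model, a classic sloppy example: y(t) = sum_i A_i * exp(-k_i * t)
t = np.linspace(0.1, 5.0, 40)
params = np.array([1.0, 0.5, 1.0, 1.5, 1.0, 3.0])   # assumed (A1, k1, A2, k2, A3, k3)

def model(p):
    y = np.zeros_like(t)
    for A, k in zip(p[0::2], p[1::2]):
        y += A * np.exp(-k * t)
    return y

def jacobian(p, h=1e-6):
    """Central-difference Jacobian of the model output w.r.t. log-parameters."""
    J = np.zeros((t.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = h * p[j]                              # step proportional to p_j -> d/d(log p_j)
        J[:, j] = (model(p + dp) - model(p - dp)) / (2 * h)
    return J

J = jacobian(params)
eigvals = np.linalg.eigvalsh(J.T @ J)[::-1]           # largest to smallest
print("FIM eigenvalues:", np.array2string(eigvals, formatter={'float_kind': lambda x: f'{x:.2e}'}))
```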
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used by utility engineers and researchers as guidance for systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
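As a minimal sketch of quantitative (rather than visual) model validation, the code below compares a simulated load response against field measurements using RMSE, a normalized error, and a bootstrap confidence interval on the mean error. The measurement and simulation arrays are synthetic placeholders, and the specific statistics are assumptions, not the paper's exact framework.

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder field measurement and load-model simulation of real power (MW) after a disturbance.
t = np.linspace(0.0, 10.0, 501)
measured = 50.0 + 5.0 * np.exp(-t / 2.0) + rng.normal(0.0, 0.3, t.size)
simulated = 50.0 + 4.6 * np.exp(-t / 1.8)

error = simulated - measured
rmse = np.sqrt(np.mean(error ** 2))
nrmse = rmse / (measured.max() - measured.min())

# Bootstrap 95% confidence interval on the mean error (model bias).
boot_means = [rng.choice(error, size=error.size, replace=True).mean() for _ in range(2000)]
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])

print(f"RMSE = {rmse:.3f} MW, NRMSE = {nrmse:.3%}")
print(f"Mean error (bias) 95% CI: [{ci_low:.3f}, {ci_high:.3f}] MW")
```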
Inherent noise can facilitate coherence in collective swarm motion
Yates, Christian A.; Erban, Radek; Escudero, Carlos; Couzin, Iain D.; Buhl, Jerome; Kevrekidis, Ioannis G.; Maini, Philip K.; Sumpter, David J. T.
2009-01-01
Among the most striking aspects of the movement of many animal groups are their sudden coherent changes in direction. Recent observations of locusts and starlings have shown that this directional switching is an intrinsic property of their motion. Similar direction switches are seen in self-propelled particle and other models of group motion. Comprehending the factors that determine such switches is key to understanding the movement of these groups. Here, we adopt a coarse-grained approach to the study of directional switching in a self-propelled particle model assuming an underlying one-dimensional Fokker–Planck equation for the mean velocity of the particles. We continue with this assumption in analyzing experimental data on locusts and use a similar systematic Fokker–Planck equation coefficient estimation approach to extract the relevant information for the assumed Fokker–Planck equation underlying that experimental data. In the experiment itself the motion of groups of 5 to 100 locust nymphs was investigated in a homogeneous laboratory environment, helping us to establish the intrinsic dynamics of locust marching bands. We determine the mean time between direction switches as a function of group density for the experimental data and the self-propelled particle model. This systematic approach allows us to identify key differences between the experimental data and the model, revealing that individual locusts appear to increase the randomness of their movements in response to a loss of alignment by the group. We give a quantitative description of how locusts use noise to maintain swarm alignment. We discuss further how properties of individual animal behavior, inferred by using the Fokker–Planck equation coefficient estimation approach, can be implemented in the self-propelled particle model to replicate qualitatively the group level dynamics seen in the experimental data. PMID:19336580
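As a sketch of the Fokker-Planck coefficient-estimation idea, the code below estimates the drift of a one-dimensional mean-velocity time series by binning conditional increments (a Kramers-Moyal style estimate). The time series is generated from an assumed bistable drift standing in for aligned marching bands, not from the locust data.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n = 0.01, 200_000

def drift_true(v):
    return v - v ** 3          # assumed drift with two stable states at v = +/-1

# Synthetic mean-velocity series (Euler-Maruyama integration of the assumed model).
noise = rng.normal(size=n - 1)
v = np.empty(n)
v[0] = 1.0
for i in range(1, n):
    v[i] = v[i - 1] + drift_true(v[i - 1]) * dt + 0.5 * np.sqrt(dt) * noise[i - 1]

# Bin-wise conditional moments of the increments give the drift D1 (and D2 for diffusion).
bins = np.linspace(-1.6, 1.6, 33)
centers = 0.5 * (bins[:-1] + bins[1:])
dv = np.diff(v)
which = np.digitize(v[:-1], bins) - 1

D1 = np.full(centers.size, np.nan)
for b in range(centers.size):
    mask = which == b
    if mask.sum() > 50:
        D1[b] = dv[mask].mean() / dt          # (dv**2).mean() / (2*dt) would give diffusion D2

for c, d1 in zip(centers[::4], D1[::4]):
    if not np.isnan(d1):
        print(f"v = {c:+.2f}: estimated drift {d1:+.3f}, true drift {drift_true(c):+.3f}")
```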
2014-01-01
Background mRNA translation involves simultaneous movement of multiple ribosomes on the mRNA and is also subject to regulatory mechanisms at different stages. Translation can be described by various codon-based models, including ODE, TASEP, and Petri net models. Although such models have been extensively used, the overlap and differences between these models and the implications of the assumptions of each model have not been systematically elucidated. The selection of the most appropriate modelling framework, and the most appropriate way to develop coarse-grained/fine-grained models in different contexts, are not clear. Results We systematically analyze and compare how different modelling methodologies can be used to describe translation. We define various statistically equivalent codon-based simulation algorithms and analyze the importance of the update rule in determining the steady state, an aspect often neglected. Then a novel probabilistic Boolean network (PBN) model is proposed for modelling translation, which enjoys an exact numerical solution. This solution matches those of numerical simulation from other methods and acts as a complementary tool to analytical approximations and simulations. The advantages and limitations of various codon-based models are compared, and illustrated by examples with real biological complexities such as slow codons, premature termination and feedback regulation. Our studies reveal that while different models give broadly similar trends in many cases, important differences also arise and can be clearly seen in the dependence of the translation rate on different parameters. Furthermore, the update rule affects the steady state solution. Conclusions The codon-based models are based on different levels of abstraction. Our analysis suggests that a multiple model approach to understanding translation allows one to ascertain which aspects of the conclusions are robust with respect to the choice of modelling methodology, and when (and why) important differences may arise. This approach also allows for an optimal use of analysis tools, which is especially important when additional complexities or regulatory mechanisms are included. This approach can provide a robust platform for dissecting translation, and results in an improved predictive framework for applications in systems and synthetic biology. PMID:24576337
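As a minimal sketch of one of the codon-based frameworks compared above, the code below runs a small TASEP-style Monte Carlo of ribosomes hopping along an mRNA lattice with exclusion, including an assumed slow codon. The rates, lattice length, random-sequential update rule, and one-site ribosome footprint are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

L = 100                       # number of codons
hop = np.ones(L)              # elongation rate per codon
hop[60] = 0.1                 # an assumed slow codon
alpha, beta = 0.3, 1.0        # initiation and termination rates

lattice = np.zeros(L, dtype=bool)
completed = 0
n_sweeps = 20_000

for _ in range(n_sweeps):
    for _ in range(L):                       # random-sequential update rule
        i = rng.integers(-1, L)
        if i == -1:                          # initiation attempt at codon 0
            if not lattice[0] and rng.random() < alpha:
                lattice[0] = True
        elif i == L - 1:                     # termination attempt at the last codon
            if lattice[i] and rng.random() < beta:
                lattice[i] = False
                completed += 1
        elif lattice[i] and not lattice[i + 1] and rng.random() < hop[i]:
            lattice[i], lattice[i + 1] = False, True   # hop forward if the next site is empty

print(f"Ribosome density at the end of the run: {lattice.mean():.3f}")
print(f"Protein production rate (incl. transient): {completed / n_sweeps:.3f} per sweep")
```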
NASA Technical Reports Server (NTRS)
Duong, N.; Winn, C. B.; Johnson, G. R.
1975-01-01
Two approaches to an identification problem in hydrology are presented, based upon concepts from modern control and estimation theory. The first approach treats the identification of unknown parameters in a hydrologic system subject to noisy inputs as an adaptive linear stochastic control problem; the second approach alters the model equation to account for the random part in the inputs, and then uses a nonlinear estimation scheme to estimate the unknown parameters. Both approaches use state-space concepts. The identification schemes are sequential and adaptive and can handle either time-invariant or time-dependent parameters. They are used to identify parameters in the Prasad model of rainfall-runoff. The results obtained are encouraging and confirm the results from two previous studies; the first using numerical integration of the model equation along with a trial-and-error procedure, and the second using a quasi-linearization technique. The proposed approaches offer a systematic way of analyzing the rainfall-runoff process when the input data are imbedded in noise.
Systematic Review of Model-Based Economic Evaluations of Treatments for Alzheimer's Disease.
Hernandez, Luis; Ozen, Asli; DosSantos, Rodrigo; Getsios, Denis
2016-07-01
Numerous economic evaluations using decision-analytic models have assessed the cost effectiveness of treatments for Alzheimer's disease (AD) in the last two decades. It is important to understand the methods used in the existing models of AD and how they could impact results, as they could inform new model-based economic evaluations of treatments for AD. The aim of this systematic review was to provide a detailed description of the relevant aspects and components of existing decision-analytic models of AD, identifying areas for improvement and future development, and to conduct a quality assessment of the included studies. We performed a systematic and comprehensive review of cost-effectiveness studies of pharmacological treatments for AD published in the last decade (January 2005 to February 2015) that used decision-analytic models, also including studies considering patients with mild cognitive impairment (MCI). The background information of the included studies and specific information on the decision-analytic models, including their approach and components, assumptions, data sources, analyses, and results, were obtained from each study. A description of how the modeling approaches and assumptions differ across studies, identifying areas for improvement and future development, is provided. At the end, we present our own view of the potential future directions of decision-analytic models of AD and the challenges they might face. The included studies present a variety of different approaches, assumptions, and scope of decision-analytic models used in the economic evaluation of pharmacological treatments of AD. The major areas for improvement in future models of AD are to include domains of cognition, function, and behavior, rather than cognition alone; include a detailed description of how data used to model the natural course of disease progression were derived; state and justify the economic model selected and structural assumptions and limitations; provide a detailed (rather than high-level) description of the cost components included in the model; and report on the face-, internal-, and cross-validity of the model to strengthen the credibility and confidence in model results. The quality scores of most studies were rated as fair to good (average 87.5, range 69.5-100, on a scale of 0-100). Despite the advancements in decision-analytic models of AD, there remain several areas of improvement that are necessary to more appropriately and realistically capture the broad nature of AD and the potential benefits of treatments in future models of AD.
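To make the modeling vocabulary concrete, the sketch below runs a deliberately simple Markov cohort model (mild/moderate/severe/dead) comparing a hypothetical treatment that slows progression against no treatment, with discounted costs, QALYs, and an ICER. All transition probabilities, costs, and utilities are invented for illustration; the review argues that realistic AD models need much richer structure (cognition, function, and behavior) than a model like this.

```python
import numpy as np

# Assumed annual transition matrices over states (mild, moderate, severe, dead); rows sum to 1.
P_no_tx = np.array([[0.70, 0.20, 0.05, 0.05],
                    [0.00, 0.65, 0.25, 0.10],
                    [0.00, 0.00, 0.80, 0.20],
                    [0.00, 0.00, 0.00, 1.00]])
P_tx = np.array([[0.78, 0.14, 0.03, 0.05],
                 [0.00, 0.72, 0.18, 0.10],
                 [0.00, 0.00, 0.80, 0.20],
                 [0.00, 0.00, 0.00, 1.00]])

cost = np.array([10_000.0, 25_000.0, 60_000.0, 0.0])   # assumed annual care cost per state
utility = np.array([0.70, 0.50, 0.30, 0.0])            # assumed QALY weight per state
tx_cost, horizon, disc = 4_000.0, 10, 0.03             # annual drug cost, years, discount rate

def run(P, on_treatment):
    dist = np.array([1.0, 0.0, 0.0, 0.0])              # cohort starts in mild AD
    total_cost = total_qaly = 0.0
    for year in range(horizon):
        dist = dist @ P
        df = 1.0 / (1.0 + disc) ** (year + 1)          # discount factor
        alive = 1.0 - dist[3]
        total_cost += df * (dist @ cost + (tx_cost * alive if on_treatment else 0.0))
        total_qaly += df * (dist @ utility)
    return total_cost, total_qaly

c0, q0 = run(P_no_tx, False)
c1, q1 = run(P_tx, True)
print(f"Incremental cost: {c1 - c0:,.0f}; incremental QALYs: {q1 - q0:.3f}")
print(f"ICER: {(c1 - c0) / (q1 - q0):,.0f} per QALY")
```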
Application of zonal model on indoor air sensor network design
NASA Astrophysics Data System (ADS)
Chen, Y. Lisa; Wen, Jin
2007-04-01
Growing concerns over the safety of the indoor environment have made the use of sensors ubiquitous. Sensors that detect chemical and biological warfare agents can offer early warning of dangerous contaminants. However, current sensor system design is informed more by intuition and experience than by systematic design. To develop a sensor system design methodology, a proper indoor airflow modeling approach is needed. Various indoor airflow modeling techniques, from complicated computational fluid dynamics approaches to simplified multi-zone approaches, exist in the literature. In this study, the effects of two airflow modeling techniques, the multi-zone modeling technique and the zonal modeling technique, on indoor air protection sensor system design are discussed. Common building attack scenarios, using a typical CBW agent, are simulated. Both multi-zone and zonal models are used to predict airflows and contaminant dispersion. A genetic algorithm is then applied to optimize the sensor location and quantity. Differences in the sensor system design resulting from the two airflow models are discussed for a typical office environment and a large hall environment.
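As a sketch of the optimization step, the code below runs a small elitist genetic algorithm that places a fixed number of sensors in zones so as to maximize the fraction of release scenarios detected. The detection matrix here is random stand-in data; in practice it would be derived from the multi-zone or zonal dispersion simulations, and the GA settings are assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical detection matrix: detect[i, j] is True if a sensor in zone j detects release
# scenario i within the required time (stand-in for airflow/dispersion model output).
n_scenarios, n_zones, n_sensors = 40, 25, 3
detect = rng.random((n_scenarios, n_zones)) < 0.15

def fitness(layout):
    """Fraction of release scenarios detected by at least one sensor in the layout."""
    return detect[:, layout].any(axis=1).mean()

def crossover(a, b):
    return np.where(rng.random(n_sensors) < 0.5, a, b)

def mutate(layout):
    layout = layout.copy()
    layout[rng.integers(n_sensors)] = rng.integers(n_zones)   # move one sensor to a random zone
    return layout

# Simple elitist genetic algorithm over candidate layouts (arrays of zone indices).
pop = [rng.integers(n_zones, size=n_sensors) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    children = [mutate(crossover(elite[rng.integers(10)], elite[rng.integers(10)]))
                for _ in range(30)]
    pop = elite + children

best = max(pop, key=fitness)
print(f"Best sensor zones: {sorted(set(best.tolist()))}, scenario coverage: {fitness(best):.1%}")
```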
Hunt, Pete; Barrios, Lisa; Telljohann, Susan K; Mazyck, Donna
2015-11-01
The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. The existing literature, including scientific articles, programmatic guidance, and publications by national agencies and organizations, was reviewed and synthesized to describe an overview of the interrelatedness of learning and health and the 10 components of the WSCC model. The literature suggests potential benefits of applying the WSCC model at the district and school level. However, the model lacks specific guidance as to how it can be made actionable. A collaborative approach to health and learning is suggested, including a 10-step systematic process to help schools and districts develop an action plan for improving health and education outcomes. Essential preliminary actions are suggested to minimize the impact of the challenges that commonly derail systematic planning processes and program implementation, such as lack of readiness, personnel shortages, insufficient resources, and competing priorities. All new models require testing and evidence to confirm their value. Districts and schools will need to test this model and put plans into action to show that significant, substantial, and sustainable health and academic outcomes can be achieved. © 2015 The Authors. Journal of School Health published by Wiley Periodicals, Inc. on behalf of American School Health Association.
Interventions and approaches to integrating HIV and mental health services: a systematic review
Chuah, Fiona Leh Hoon; Haldane, Victoria Elizabeth; Cervero-Liceras, Francisco; Ong, Suan Ee; Sigfrid, Louise A; Murphy, Georgina; Watt, Nicola; Balabanova, Dina; Hogarth, Sue; Maimaris, Will; Otero, Laura; Buse, Kent; McKee, Martin; Piot, Peter; Perel, Pablo; Legido-Quigley, Helena
2017-01-01
Abstract Background The frequency with which HIV and AIDS and mental health problems co-exist, and the complex bi-directional relationship between them, highlights the need for effective care models combining services for HIV and mental health. Here, we present a systematic review that synthesizes the literature on interventions and approaches integrating these services. Methods This review was part of a larger systematic review on integration of services for HIV and non-communicable diseases. Eligible studies included those that described or evaluated an intervention or approach aimed at integrating HIV and mental health care. We searched multiple databases from inception until October 2015, independently screened articles identified for inclusion, conducted data extraction, and assessed evaluative papers for risk of bias. Results Forty-five articles were eligible for this review. We identified three models of integration at the meso and micro levels: single-facility integration, multi-facility integration, and integrated care coordinated by a non-physician case manager. Single-site integration enhances multidisciplinary coordination and reduces access barriers for patients. However, the practicality and cost-effectiveness of providing a full continuum of specialized care on-site for patients with complex needs is arguable. Integration based on a collaborative network of specialized agencies may serve those with multiple co-morbidities but fragmented and poorly coordinated care can pose barriers. Integrated care coordinated by a single case manager can enable continuity of care for patients but requires appropriate training and support for case managers. Involving patients as key actors in facilitating integration within their own treatment plan is a promising approach. Conclusion This review identified much diversity in integration models combining HIV and mental health services, which are shown to have potential in yielding positive patient and service delivery outcomes when implemented within appropriate contexts. Our review revealed a lack of research in low- and middle-income countries, and most included studies were descriptive. Overall, studies that seek to evaluate and compare integration models in terms of long-term outcomes and cost-effectiveness are needed, particularly at the health system level and in regions with high HIV and AIDS burden. PMID:29106512
Multiscale modeling of lithium ion batteries: thermal aspects
Zausch, Jochen
2015-01-01
Summary The thermal behavior of lithium ion batteries has a huge impact on their lifetime and the initiation of degradation processes. The development of hot spots or large local overpotentials leading, e.g., to lithium metal deposition depends on material properties as well as on the nano- and microstructure of the electrodes. In recent years a theoretical structure has emerged that opens the possibility of establishing a systematic modeling strategy from the atomistic to the continuum scale to capture and couple the relevant phenomena on each scale. We outline the building blocks for such a systematic approach and discuss in detail a rigorous approach for the continuum scale based on rational thermodynamics and homogenization theories. Our focus is on the development of a systematic thermodynamically consistent theory for thermal phenomena in batteries at the microstructure scale and at the cell scale. We discuss the importance of carefully defining the continuum fields for being able to compare seemingly different phenomenological theories and for obtaining rules to determine unknown parameters of the theory by experiments or lower-scale theories. The resulting continuum models for the microscopic and the cell scale are numerically solved in full 3D resolution. The complex, very localized distributions of heat sources in the microstructure of a battery and the problems of mapping these localized sources onto an averaged porous electrode model are discussed by comparing the detailed 3D microstructure-resolved simulations of the heat distribution with the result of the upscaled porous electrode model. It is shown that not all heat sources that exist on the microstructure scale are represented in the averaged theory due to subtle cancellation effects of interface and bulk heat sources. Nevertheless, we find that in special cases the averaged thermal behavior can be captured very well by porous electrode theory. PMID:25977870
A systematic approach for the location of hand sanitizer dispensers in hospitals.
Cure, Laila; Van Enk, Richard; Tiong, Ewing
2014-09-01
Compliance with hand hygiene practices is directly affected by the accessibility and availability of cleaning agents. Nevertheless, the decision of where to locate these dispensers is often not explicitly or fully addressed in the literature. In this paper, we study the problem of selecting the locations to install alcohol-based hand sanitizer dispensers throughout a hospital unit as an indirect approach to maximize compliance with hand hygiene practices. We investigate the relevant criteria in selecting dispenser locations that promote hand hygiene compliance, propose metrics for the evaluation of various location configurations, and formulate a dispenser location optimization model that systematically incorporates such criteria. A complete methodology to collect data and obtain the model parameters is described. We illustrate the proposed approach using data from a general care unit at a collaborating hospital. A cost analysis was performed to study the trade-offs between usability and cost. The proposed methodology can help in evaluating the current location configuration, determining the need for change, and establishing the best possible configuration. It can be adapted to incorporate alternative metrics, tailored to different institutions and updated as needed with new internal policies or safety regulation.
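As a rough illustration of how such a dispenser location model can be operationalized, the sketch below uses a greedy coverage heuristic rather than the paper's optimization formulation; the candidate sites, demand points, weights, and distance threshold are all hypothetical placeholders.

```python
# Minimal sketch (not the authors' model): a greedy heuristic that picks
# dispenser locations to cover weighted "demand points" (e.g., room entries or
# points along caregiver paths) within a walking-distance threshold.
import math

candidate_sites = {"hall_A": (0, 0), "hall_B": (10, 0), "nurse_station": (5, 4)}
demand_points = {"room_1": ((1, 1), 12), "room_2": ((9, 1), 8),   # (xy, visits/shift)
                 "room_3": ((5, 6), 20), "supply": ((6, 0), 5)}
MAX_DIST = 4.0       # acceptable walking distance to a dispenser
N_DISPENSERS = 2

def covered(site_xy, point_xy):
    return math.dist(site_xy, point_xy) <= MAX_DIST

def total_coverage(sites):
    """Total demand weight covered by at least one chosen site."""
    return sum(w for (xy, w) in demand_points.values()
               if any(covered(candidate_sites[s], xy) for s in sites))

chosen = []
for _ in range(N_DISPENSERS):
    remaining = [s for s in candidate_sites if s not in chosen]
    # pick the site with the largest marginal gain in covered demand
    best = max(remaining, key=lambda s: total_coverage(chosen + [s]))
    chosen.append(best)

print("chosen sites:", chosen, "covered demand:", total_coverage(chosen))
```

A full formulation of the kind described in the paper would typically also trade coverage against installation and refill cost, which is the cost analysis the authors report.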
78 FR 9698 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-11
... effective at improving health care quality. While evidence-based approaches for decision-making have become standard in healthcare, this has been limited in laboratory medicine. No single-evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...
CAMCE: An Environment to Support Multimedia Courseware Projects.
ERIC Educational Resources Information Center
Barrese, R. M.; And Others
1992-01-01
Presents results of CAMCE (Computer-Aided Multimedia Courseware Engineering) project research concerned with definition of a methodology to describe a systematic approach for multimedia courseware development. Discussion covers the CAMCE methodology, requirements of an advanced authoring environment, use of an object-based model in the CAMCE…
Error assessment of biogeochemical models by lower bound methods (NOMMA-1.0)
NASA Astrophysics Data System (ADS)
Sauerland, Volkmar; Löptien, Ulrike; Leonhard, Claudine; Oschlies, Andreas; Srivastav, Anand
2018-03-01
Biogeochemical models, capturing the major feedbacks of the pelagic ecosystem of the world ocean, are today often embedded into Earth system models which are increasingly used for decision making regarding climate policies. These models contain poorly constrained parameters (e.g., maximum phytoplankton growth rate), which are typically adjusted until the model shows reasonable behavior. Systematic approaches determine these parameters by minimizing the misfit between the model and observational data. In most common model approaches, however, the underlying functions mimicking the biogeochemical processes are nonlinear and non-convex. Thus, systematic optimization algorithms are likely to get trapped in local minima and might lead to non-optimal results. To judge the quality of an obtained parameter estimate, we propose determining a lower bound for the global optimum that is as large as possible, relatively easy to obtain, and helpful for assessing the quality of an optimum generated by an optimization algorithm. Due to the unavoidable noise component in all observations, such a lower bound is typically larger than zero. We suggest deriving such lower bounds based on typical properties of biogeochemical models (e.g., a limited number of extremes and a bounded time derivative). We illustrate the applicability of the method with two real-world examples. The first example uses real-world observations of the Baltic Sea in a box model setup. The second example considers a three-dimensional coupled ocean circulation model in combination with satellite chlorophyll a.
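One ingredient of the lower-bound idea can be illustrated with a toy calculation, under the assumption (made here purely for illustration) that the model class is constrained only by a bounded time derivative: the best constrained least-squares fit to the noisy observations is then a lower bound on the misfit achievable by any model in that class.

```python
# Toy illustration of one ingredient of the lower-bound idea (not the NOMMA
# code): any model trajectory whose time derivative is bounded by L cannot fit
# the noisy observations better than the best trajectory satisfying the same
# constraint, so the constrained least-squares misfit below bounds the
# attainable misfit from below.  Signal, noise level and L are made up.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 50)
dt = t[1] - t[0]
obs = np.sin(t) + rng.normal(0.0, 0.3, t.size)   # noisy observations

L = 0.5   # assumed bound on |d(model)/dt|
# Parameterize a trajectory by its initial value f0 and bounded increments d_k:
# f_j = f0 + sum_{k<=j} d_k, with |d_k| <= L*dt.  This is linear in (f0, d).
n = t.size
A = np.zeros((n, n))          # columns: [f0, d_1, ..., d_{n-1}]
A[:, 0] = 1.0
for k in range(1, n):
    A[k:, k] = 1.0
lb = np.r_[-np.inf, np.full(n - 1, -L * dt)]
ub = np.r_[np.inf, np.full(n - 1, L * dt)]

res = lsq_linear(A, obs, bounds=(lb, ub))
fit = A @ res.x
lower_bound_rmse = np.sqrt(np.mean((fit - obs) ** 2))
print("lower bound on attainable RMSE for derivative-bounded models:",
      round(lower_bound_rmse, 3))
```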
Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K
2015-07-05
We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (response 57 %) at pretest and 69 respondents (52 %) at posttest. Nvivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.
Hiner, Jacqueline; Pyka, Jeanine; Burks, Colleen; Pisegna, Lily; Gador, Rachel Ann
2012-01-01
Ensuring the safety of infants born in a hospital is a top priority and, therefore, requires a solid infant security plan. Using an interdisciplinary approach and a systematic change process, nursing leadership in collaboration with clinical nurses and security personnel analyzed the infant security program at this community hospital to identify vulnerabilities. By establishing an interdisciplinary approach to infant security, participants were able to unravel a complicated concept, systematically analyze the gaps, and agree to a plan of action. This resulted in improved communication and clarification of roles between the nursing and security divisions. Supply costs decreased by 17.4% after the first year of implementation. Most importantly, this project enhanced and strengthened the existing infant abduction prevention measures, hard wired the importance of infant security, and minimized vulnerabilities.
A Comparison of Two Balance Calibration Model Building Methods
NASA Technical Reports Server (NTRS)
DeLoach, Richard; Ulbrich, Norbert
2007-01-01
Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
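The second building method, stepwise regression, can be sketched in a few lines; the code below is a generic forward-selection-by-BIC toy on simulated data, not the actual regression analysis or balance calibration data used in the study.

```python
# Simplified sketch of the stepwise-regression idea (forward selection by BIC)
# on simulated calibration-like data; the candidate terms, true model and
# noise levels are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2 + rng.normal(0, 0.05, n) + 0.02  # small systematic offset

candidates = {"x1": x1, "x2": x2, "x1*x2": x1 * x2,
              "x1^2": x1 ** 2, "x2^2": x2 ** 2}

def bic(X, y):
    """Bayesian information criterion of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + X.shape[1] * np.log(n)

selected = []
current = bic(np.ones((n, 1)), y)          # intercept-only model
while True:
    best_name, best_score = None, current
    for name in candidates:
        if name in selected:
            continue
        cols = [np.ones(n)] + [candidates[t] for t in selected + [name]]
        score = bic(np.column_stack(cols), y)
        if score < best_score:
            best_name, best_score = name, score
    if best_name is None:                  # no candidate improves the criterion
        break
    selected.append(best_name)
    current = best_score

print("selected terms:", selected, "BIC:", round(current, 1))
```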
Asymptotically inspired moment-closure approximation for adaptive networks
NASA Astrophysics Data System (ADS)
Shkarayev, Maxim
2013-03-01
Dynamics of adaptive social networks, in which nodes and network structure co-evolve, are often described using a mean-field system of equations for the density of node and link types. These equations constitute an open system due to dependence on higher order topological structures. We propose a systematic approach to moment closure approximation based on the analytical description of the system in an asymptotic regime. We apply the proposed approach to two examples of adaptive networks: recruitment to a cause model and adaptive epidemic model. We show a good agreement between the mean-field prediction and simulations of the full network system.
Sobel, Michael E; Lindquist, Martin A
2014-07-01
Functional magnetic resonance imaging (fMRI) has facilitated major advances in understanding human brain function. Neuroscientists are interested in using fMRI to study the effects of external stimuli on brain activity and causal relationships among brain regions, but have not stated what is meant by causation or defined the effects they purport to estimate. Building on Rubin's causal model, we construct a framework for causal inference using blood oxygenation level dependent (BOLD) fMRI time series data. In the usual statistical literature on causal inference, potential outcomes, assumed to be measured without systematic error, are used to define unit and average causal effects. However, in general the potential BOLD responses are measured with stimulus dependent systematic error. Thus we define unit and average causal effects that are free of systematic error. In contrast to the usual case of a randomized experiment where adjustment for intermediate outcomes leads to biased estimates of treatment effects (Rosenbaum, 1984), here the failure to adjust for task dependent systematic error leads to biased estimates. We therefore adjust for systematic error using measured "noise covariates", using a linear mixed model to estimate the effects and the systematic error. Our results are important for neuroscientists, who typically do not adjust for systematic error. They should also prove useful to researchers in other areas where responses are measured with error and in fields where large amounts of data are collected on relatively few subjects. To illustrate our approach, we re-analyze data from a social evaluative threat task, comparing the findings with results that ignore systematic error.
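A minimal sketch of this kind of adjustment, assuming hypothetical variable names and simulated data rather than the authors' fMRI series, is a linear mixed model with the task regressor, a measured noise covariate, and a random intercept per run:

```python
# Hedged sketch of the general idea (not the authors' exact model): regress the
# BOLD response on the task while also adjusting for a measured "noise
# covariate" (e.g., motion) correlated with the task, with a random intercept
# per run.  All data below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_runs, n_scans = 6, 100
rows = []
for run in range(n_runs):
    task = rng.integers(0, 2, n_scans)                     # on/off task regressor
    motion = 0.5 * task + rng.normal(0, 1, n_scans)        # task-dependent noise covariate
    run_effect = rng.normal(0, 0.5)
    bold = 1.0 * task + 0.7 * motion + run_effect + rng.normal(0, 1, n_scans)
    rows.append(pd.DataFrame({"bold": bold, "task": task,
                              "motion": motion, "run": run}))
data = pd.concat(rows, ignore_index=True)

# Omitting `motion` here biases the estimated task effect because the noise
# covariate is correlated with the task; including it adjusts for that
# task-dependent systematic error.
model = smf.mixedlm("bold ~ task + motion", data, groups=data["run"])
result = model.fit()
print(result.summary())
```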
Up on the Roof: A Systematic Approach to Roof Maintenance.
ERIC Educational Resources Information Center
Burd, William
1979-01-01
A systematic roof maintenance program is characterized by carefully prepared long- and short-range plans. An essential feature of a systematic approach to roof maintenance is the stress on preventive measures rather than the patching of leaks. (Author)
Unrean, Pornkamol; Khajeeram, Sutamat; Laoteng, Kobkul
2016-03-01
An integrative simultaneous saccharification and fermentation (SSF) model is a useful guiding tool for rapid process optimization to meet the techno-economic requirements of industrial-scale lignocellulosic ethanol production. In this work, we have developed an SSF model composed of a metabolic network of a Saccharomyces cerevisiae cell associated with fermentation kinetics and an enzyme hydrolysis model to quantitatively capture the dynamic responses of yeast cell growth and fermentation during SSF. By using model-based design of feeding profiles for substrate and yeast cells in the fed-batch SSF process, efficient ethanol production with a high titer of up to 65 g/L and a high yield of 85 % of the theoretical yield was accomplished. The ethanol titer and productivity were increased by 47 and 41 %, respectively, in the optimized fed-batch SSF as compared to the batch process. The developed integrative SSF model is, therefore, considered a promising approach for the systematic design of economical and sustainable SSF bioprocessing of lignocellulose.
Manios, Y; Grammatikaki, E; Androutsos, O; Chinapaw, M J M; Gibson, E L; Buijs, G; Iotova, V; Socha, P; Annemans, L; Wildgruber, A; Mouratidou, T; Yngve, A; Duvinage, K; de Bourdeaudhuij, I
2012-03-01
The increasing childhood obesity epidemic calls for appropriate measures and effective policies to be applied early in life. Large-scale socioecological frameworks providing a holistic multifactorial and cost-effective approach necessary to support obesity prevention initiatives in this age are however currently missing. To address this missing link, ToyBox-study aims to build and evaluate a cost-effective kindergarten-based, family-involved intervention scheme to prevent obesity in early childhood, which could potentially be expanded on a pan-European scale. A multidisciplinary team of researchers from 10 countries have joined forces and will work to realize this according to a systematic stepwise approach that combines the use of the PRECEDE-PROCEED model and intervention mapping protocol. ToyBox-study will conduct systematic and narrative reviews, secondary data analyses, focus group research and societal assessment to design, implement and evaluate outcome, impact, process and cost effectiveness of the intervention. This is the first time that such a holistic approach has been used on a pan-European scale to promote healthy weight and healthy energy balance-related behaviours for the prevention of early childhood obesity. The results of ToyBox-study will be disseminated among key stakeholders including researchers, policy makers, practitioners and the general population. © 2012 The Authors. obesity reviews © 2012 International Association for the Study of Obesity.
Kaltenthaler, Eva; Tappenden, Paul; Paisley, Suzy
2013-01-01
Health technology assessments (HTAs) typically require the development of a cost-effectiveness model, which necessitates the identification, selection, and use of other types of information beyond clinical effectiveness evidence to populate the model parameters. The reviewing activity associated with model development should be transparent and reproducible but can result in a tension between being both timely and systematic. Little procedural guidance exists in this area. The purpose of this article was to provide guidance, informed by focus groups, on what might constitute a systematic and transparent approach to reviewing information to populate model parameters. A focus group series was held with HTA experts in the United Kingdom including systematic reviewers, information specialists, and health economic modelers to explore these issues. Framework analysis was used to analyze the qualitative data elicited during focus groups. Suggestions included the use of rapid reviewing methods and the need to consider the trade-off between relevance and quality. The need for transparency in the reporting of review methods was emphasized. It was suggested that additional attention should be given to the reporting of parameters deemed to be more important to the model or where the preferred decision regarding the choice of evidence is equivocal. These recommendations form part of a Technical Support Document produced for the National Institute for Health and Clinical Excellence Decision Support Unit in the United Kingdom. It is intended that these recommendations will help to ensure a more systematic, transparent, and reproducible process for the review of model parameters within HTA. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
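The flavor of such a physics-model-plus-noise-model alarm rule can be sketched as follows; the span-loss model, noise level, injected anomaly, and threshold are invented for illustration and are not the monitoring tool described in the chapter.

```python
# Minimal sketch of the general pattern described above: compare a measured
# quantity against a physics-model prediction and raise an alarm when the
# residual is unlikely under the measurement-noise model.
import numpy as np

rng = np.random.default_rng(0)

def predicted_span_loss(length_km, fiber_loss_db_per_km=0.2, raman_gain_db=8.0):
    """Toy physics model: fiber attenuation partially offset by Raman gain."""
    return length_km * fiber_loss_db_per_km - raman_gain_db

sigma_meas = 0.3                       # assumed measurement noise (dB)
length_km = 80.0
measurements = predicted_span_loss(length_km) + rng.normal(0, sigma_meas, 200)
measurements[150:] += 1.5              # inject an anomalous extra loss event

expected = predicted_span_loss(length_km)
z = (measurements - expected) / sigma_meas
alarm = np.abs(z) > 4.0                # decision rule: ~4-sigma residual
print("first alarmed sample:", int(np.argmax(alarm)), "of", alarm.size)
```

In the chapter's setting, the threshold itself follows from the combined physics and observational uncertainty models rather than a fixed sigma multiple, and separate rules distinguish measurement errors, anomalous losses, and pump failures.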
Statistical bias correction modelling for seasonal rainfall forecast for the case of Bali island
NASA Astrophysics Data System (ADS)
Lealdi, D.; Nurdiati, S.; Sopaheluwakan, A.
2018-04-01
Rainfall is an element of climate which is highly influential to the agricultural sector. Rain pattern and distribution highly determine the sustainability of agricultural activities. Therefore, information on rainfall is very useful for the agriculture sector and farmers in anticipating the possibility of extreme events which often cause failures of agricultural production. This research aims to identify the biases in seasonal forecast products from the ECMWF (European Centre for Medium-Range Weather Forecasts) rainfall forecast and to build a transfer function that corrects the distribution biases as a new prediction model, using a quantile mapping approach. We apply this approach to the case of Bali Island, and as a result, the use of bias correction methods to remove the systematic biases of the model gives better results: the corrected prediction model outperforms the raw forecast. We found that, in general, the bias correction approach performs better during the rainy season than during the dry season.
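Empirical quantile mapping, one common way to build such a transfer function, can be sketched as below; the gamma-distributed series are synthetic stand-ins for the ECMWF hindcasts and the observed station rainfall.

```python
# Illustrative empirical quantile mapping sketch (one common way to build the
# transfer function described above); all series below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
obs_hist = rng.gamma(shape=2.0, scale=12.0, size=3000)          # "observed" rainfall
fcst_hist = rng.gamma(shape=2.0, scale=9.0, size=3000) + 5.0    # biased forecasts

quantiles = np.linspace(0.01, 0.99, 99)
fcst_q = np.quantile(fcst_hist, quantiles)
obs_q = np.quantile(obs_hist, quantiles)

def bias_correct(fcst_new):
    """Map new forecast values through forecast quantiles to observed quantiles."""
    return np.interp(fcst_new, fcst_q, obs_q)

new_fcst = rng.gamma(shape=2.0, scale=9.0, size=10) + 5.0
print(np.round(bias_correct(new_fcst), 1))
```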
Illuminating the Black Box of Entrepreneurship Education Programmes: Part 2
ERIC Educational Resources Information Center
Maritz, Alex
2017-01-01
Purpose: The purpose of this paper is to provide a justified, legitimate and validated model on entrepreneurship education programmes (EEPs), by combining recent research and scholarship in leading edge entrepreneurship education (EE). Design/methodology/approach: A systematic literature review of recent EE research and scholarship is followed by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jeff
This report discusses the UHSP monitoring program, a radioactive material accounting process, and its purpose. It describes the systematic approach used to implement Lean principles and to determine key requirements and the root causes of variation and disruption that interfere with program efficiency and effectiveness. Preexisting issues within the UHSP are modeled to illustrate the impact that they have on these large and extensive systems.
This paper presents a new system for automated 2D-3D migration of chemicals in large databases with conformer multiplication. The main advantages of this system are its straightforward performance, reasonable execution time, simplicity, and applicability to building large 3D che...
Constituent Aspects of Workplace Guidance in Secondary VET
ERIC Educational Resources Information Center
Swager, Robert; Klarus, Ruud; van Merriënboer, Jeroen J. G.; Nieuwenhuis, Loek F. M.
2015-01-01
Purpose: This paper aims to present an integrated model of workplace guidance to enhance awareness of what constitutes good guidance, to improve workplace guidance practices in vocational education and training. Design/methodology/approach: To identify constituent aspects of workplace guidance, a systematic search of Web of Science was conducted,…
Standard Operating Procedures for Collecting Data from Local Education Agencies.
ERIC Educational Resources Information Center
McElreath, Nancy R., Ed.
A systematic approach to planning and presenting the data collection activities of a State Department of Education is described. The Information Communication System, a model communication system used by the state of New Jersey, conveys narrative and statistical information relating to a school district's students, teachers, finances, facilities…
Enhancing Capacity to Improve Student Learning
ERIC Educational Resources Information Center
Mayotte, Gail; Wei, Dan; Lamphier, Sarah; Doyle, Thomas
2013-01-01
Professional development provides a means to build capacity among school personnel when it is delivered as part of a systematic, long-term approach to school and teacher improvement. This research examines a sustained, diocesan-wide professional development model, called the ACE Collaborative for Academic Excellence, that aims to build capacity…
ERIC Educational Resources Information Center
Lillis, Deirdre
2012-01-01
Higher education institutions worldwide invest significant resources in their quality assurance systems. Little empirical evidence exists that demonstrates the effectiveness (or otherwise) of these systems. Methodological approaches for determining effectiveness are also underdeveloped. Self-study-with-peer-review is a widely used model for…
Ethics: A Bridge for Studying the Social Contexts of Professional Communication.
ERIC Educational Resources Information Center
Speck, Bruce W.
1989-01-01
Describes a method for helping students evaluate ethical issues in a systematic way, based on Lawrence Kohlberg's stages of moral development. Recommends the case-study approach for creating social constructs in which students face ethical dilemmas, and outlines a case-study ethics unit using Kohlberg's model. (MM)
ERIC Educational Resources Information Center
Forbes, Raymond L., Jr.; Nickols, Frederick W.
The basic similarities between educational technology and organizational development provide a powerful rationale for collaboration. The two disciplines are essentially in the same business, that of systematically changing human behavior. System theory and the system model appear to supply the language and the technology through which such efforts…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillingham, Kenneth; Bollinger, Bryan
This is the final report for a systematic, evidence-based project using an unprecedented series of large-scale field experiments to examine the effectiveness and cost-effectiveness of novel approaches to reduce the soft costs of residential solar photovoltaics. The approaches were based on grassroots marketing campaigns, called 'Solarize' campaigns, which were designed to lower costs and increase adoption of solar technology. This study quantified the effectiveness and cost-effectiveness of the Solarize programs and tested new approaches to further improve the model.
Behavioral facilitation: a cognitive model of individual differences in approach motivation.
Robinson, Michael D; Meier, Brian P; Tamir, Maya; Wilkowski, Benjamin M; Ode, Scott
2009-02-01
Approach motivation consists of the active, engaged pursuit of one's goals. The purpose of the present three studies (N = 258) was to examine whether approach motivation could be cognitively modeled, thereby providing process-based insights into personality functioning. Behavioral facilitation was assessed in terms of faster (or facilitated) reaction time with practice. As hypothesized, such tendencies predicted higher levels of approach motivation, higher levels of positive affect, and lower levels of depressive symptoms, and did so across cognitive, behavioral, self-reported, and peer-reported outcomes. Tendencies toward behavioral facilitation, on the other hand, did not correlate with self-reported traits (Study 1) and did not predict avoidance motivation or negative affect (all studies). The results indicate a systematic relationship between behavioral facilitation in cognitive tasks and approach motivation in daily life. Results are discussed in terms of the benefits of modeling the cognitive processes hypothesized to underlie individual differences in motivation, affect, and depression. (c) 2009 APA, all rights reserved
Modern control concepts in hydrology
NASA Technical Reports Server (NTRS)
Duong, N.; Johnson, G. R.; Winn, C. B.
1974-01-01
Two approaches to an identification problem in hydrology are presented, based upon concepts from modern control and estimation theory. The first approach treats the identification of unknown parameters in a hydrologic system subject to noisy inputs as an adaptive linear stochastic control problem; the second approach alters the model equation to account for the random part in the inputs, and then uses a nonlinear estimation scheme to estimate the unknown parameters. Both approaches use state-space concepts. The identification schemes are sequential and adaptive and can handle either time-invariant or time-dependent parameters. They are used to identify parameters in the Prasad model of rainfall-runoff. The results obtained are encouraging and agree with results from two previous studies, the first using numerical integration of the model equation along with a trial-and-error procedure and the second using a quasi-linearization technique. The proposed approaches offer a systematic way of analyzing the rainfall-runoff process when the input data are embedded in noise.
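The spirit of the sequential, adaptive estimation schemes can be illustrated with a much simpler example than the Prasad model: a single linear reservoir whose recession parameter is unknown. Stacking the storage and the parameter into one augmented state and running an extended Kalman filter estimates both from noisy runoff; this is an illustrative sketch, not the paper's algorithm, and all values are made up.

```python
# Hedged sketch of sequential state/parameter estimation (a simple linear
# reservoir, not the Prasad rainfall-runoff model): the storage S and the
# unknown recession parameter k are stacked into one state vector and
# estimated jointly with an extended Kalman filter from noisy runoff data.
import numpy as np

rng = np.random.default_rng(0)
dt, n = 1.0, 200
k_true = 0.15
rain = rng.gamma(1.5, 2.0, n) * (rng.random(n) < 0.3)     # intermittent rainfall

# Generate synthetic "observed" runoff q = k*S with measurement noise.
S, q_obs = 0.0, np.zeros(n)
for t in range(n):
    S += dt * (rain[t] - k_true * S)
    q_obs[t] = k_true * S + rng.normal(0, 0.05)

# EKF on the augmented state x = [S, k].
x = np.array([1.0, 0.05])                 # initial guesses
P = np.diag([1.0, 0.1])
Q = np.diag([0.01, 1e-5])                 # process noise (k nearly constant)
R = 0.05 ** 2
for t in range(n):
    # Prediction step: S <- S + dt*(rain - k*S), k <- k
    F = np.array([[1.0 - dt * x[1], -dt * x[0]],
                  [0.0, 1.0]])            # Jacobian of the transition
    x = np.array([x[0] + dt * (rain[t] - x[1] * x[0]), x[1]])
    P = F @ P @ F.T + Q
    # Update step with the runoff observation q = k*S
    H = np.array([[x[1], x[0]]])          # Jacobian of the measurement
    y = q_obs[t] - x[1] * x[0]
    Sinn = H @ P @ H.T + R
    K = (P @ H.T) / Sinn
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P

print("estimated k:", round(float(x[1]), 3), "(true 0.15)")
```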
NASA Technical Reports Server (NTRS)
Phatak, A. V.
1980-01-01
A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development, and validation.
A Bayesian Approach to Systematic Error Correction in Kepler Photometric Time Series
NASA Astrophysics Data System (ADS)
Jenkins, Jon Michael; VanCleve, J.; Twicken, J. D.; Smith, J. C.; Kepler Science Team
2011-01-01
In order for the Kepler mission to achieve its required 20 ppm photometric precision for 6.5 hr observations of 12th magnitude stars, the Presearch Data Conditioning (PDC) software component of the Kepler Science Processing Pipeline must reduce systematic errors in flux time series to the limit of stochastic noise for errors with time-scales less than three days, without smoothing or over-fitting away the transits that Kepler seeks. The current version of PDC co-trends against ancillary engineering data and Pipeline generated data using essentially a least squares (LS) approach. This approach is successful for quiet stars when all sources of systematic error have been identified. If the stars are intrinsically variable or some sources of systematic error are unknown, LS will nonetheless attempt to explain all of a given time series, not just the part the model can explain well. Negative consequences can include loss of astrophysically interesting signal, and injection of high-frequency noise into the result. As a remedy, we present a Bayesian Maximum A Posteriori (MAP) approach, in which a subset of intrinsically quiet and highly-correlated stars is used to establish the probability density function (PDF) of robust fit parameters in a diagonalized basis. The PDFs then determine a "reasonable" range for the fit parameters for all stars, and brake the runaway fitting that can distort signals and inject noise. We present a closed-form solution for Gaussian PDFs, and show examples using publicly available Quarter 1 Kepler data. A companion poster (Van Cleve et al.) shows applications and discusses current work in more detail. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA, Science Mission Directorate.
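The closed-form Gaussian MAP solution mentioned above has the familiar ridge-like form; the sketch below contrasts it with plain least squares on synthetic data, with a made-up prior standing in for the ensemble of quiet stars.

```python
# Illustrative numpy sketch of a closed-form Gaussian MAP estimate of cotrending
# coefficients, contrasted with plain least squares; the "basis vectors", prior
# and noise level are synthetic placeholders, not Kepler PDC data.
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p))                 # cotrending basis vectors
theta_true = np.array([0.8, -0.3, 0.1])
astro_signal = 0.5 * np.sin(np.linspace(0, 20, n))   # signal we must not remove
flux = X @ theta_true + astro_signal + rng.normal(0, 0.2, n)

sigma = 0.2
mu_prior = np.array([0.7, -0.25, 0.05])     # e.g., from an ensemble of quiet stars
Sigma_prior = np.diag([0.05, 0.05, 0.05]) ** 2

# Plain LS fit for comparison.
theta_ls, *_ = np.linalg.lstsq(X, flux, rcond=None)

# MAP: theta = (X^T X / s^2 + Sigma^-1)^-1 (X^T y / s^2 + Sigma^-1 mu)
Sigma_inv = np.linalg.inv(Sigma_prior)
A = X.T @ X / sigma**2 + Sigma_inv
b = X.T @ flux / sigma**2 + Sigma_inv @ mu_prior
theta_map = np.linalg.solve(A, b)

print("LS :", np.round(theta_ls, 3))
print("MAP:", np.round(theta_map, 3))
```

The prior term is what keeps the fit parameters inside the "reasonable" range established from the quiet-star ensemble, which is the braking behavior described in the abstract.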
NASA Astrophysics Data System (ADS)
Ingram, G. Walter; Alvarez-Berastegui, Diego; Reglero, Patricia; Balbín, Rosa; García, Alberto; Alemany, Francisco
2017-06-01
Fishery independent indices of bluefin tuna larvae in the Western Mediterranean Sea are presented utilizing ichthyoplankton survey data collected from 2001 through 2005 and 2012 through 2013. Indices were developed using larval catch rates collected using two different types of bongo sampling, by first standardizing catch rates by gear/fishing-style and then employing a delta-lognormal modeling approach. The delta-lognormal models were developed three ways: 1) a basic larval index including the following covariates: time of day, a systematic geographic area variable, month and year; 2) a standard environmental larval index including the following covariates: mean water temperature over the mixed layer depth, mean salinity over the mixed layer depth, geostrophic velocity, time of day, a systematic geographic area variable, month and year; and 3) a habitat-adjusted larval index including the following covariates: a potential habitat variable, time of day, a systematic geographic area variable, month and year. Results indicated that all three model-types had similar precision in index values. However, the habitat-adjusted larval index demonstrated a high correlation with estimates of spawning stock biomass from the previous stock assessment model, and, therefore, is recommended as a tuning index in future stock assessment models.
BeiDou Geostationary Satellite Code Bias Modeling Using Fengyun-3C Onboard Measurements.
Jiang, Kecai; Li, Min; Zhao, Qile; Li, Wenwen; Guo, Xiang
2017-10-27
This study validated and investigated elevation- and frequency-dependent systematic biases observed in ground-based code measurements of the Chinese BeiDou navigation satellite system, using the onboard BeiDou code measurement data from the Chinese meteorological satellite Fengyun-3C. Particularly for geostationary earth orbit satellites, sky-view coverage can be achieved over the entire elevation and azimuth angle ranges with the available onboard tracking data, which is more favorable to modeling code biases. Apart from the BeiDou-satellite-induced biases, the onboard BeiDou code multipath effects also indicate pronounced near-field systematic biases that depend only on signal frequency and the line-of-sight directions. To correct these biases, we developed a proposed code correction model by estimating the BeiDou-satellite-induced biases as linear piece-wise functions in different satellite groups and the near-field systematic biases in a grid approach. To validate the code bias model, we carried out orbit determination using single-frequency BeiDou data with and without code bias corrections applied. Orbit precision statistics indicate that those code biases can seriously degrade single-frequency orbit determination. After the correction model was applied, the orbit position errors, 3D root mean square, were reduced from 150.6 to 56.3 cm.
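The piecewise-linear, elevation-dependent part of such a correction reduces to simple interpolation at application time; the node values below are invented placeholders, not the estimated BeiDou corrections.

```python
# Minimal sketch of an elevation-dependent, piecewise-linear code bias
# correction (the general form described above); node elevations and bias
# values are made-up placeholders for one satellite group.
import numpy as np

elev_nodes = np.array([0.0, 15.0, 30.0, 45.0, 60.0, 75.0, 90.0])   # degrees
bias_nodes = np.array([0.0, -0.15, -0.35, -0.55, -0.70, -0.80, -0.85])  # metres

def correct_code(pseudorange_m, elevation_deg):
    """Subtract the interpolated satellite-induced code bias."""
    bias = np.interp(elevation_deg, elev_nodes, bias_nodes)
    return pseudorange_m - bias

print(correct_code(21_000_000.0, 37.5))
```

In the study, separate node values are estimated per satellite group, and a second, grid-based table handles the near-field biases as a function of line-of-sight direction.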
Automatic determination of fault effects on aircraft functionality
NASA Technical Reports Server (NTRS)
Feyock, Stefan
1989-01-01
The problem of determining the behavior of physical systems subsequent to the occurrence of malfunctions is discussed. It is established that while it was reasonable to assume that the most important fault behavior modes of primitive components and simple subsystems could be known and predicted, interactions within composite systems reached levels of complexity that precluded the use of traditional rule-based expert system techniques. Reasoning from first principles, i.e., on the basis of causal models of the physical system, was required. The first question that arises is, of course, how the causal information required for such reasoning should be represented. The bond graphs presented here occupy a position intermediate between qualitative and quantitative models, allowing the automatic derivation of Kuipers-like qualitative constraint models as well as state equations. Their most salient feature, however, is that entities corresponding to components and interactions in the physical system are explicitly represented in the bond graph model, thus permitting systematic model updates to reflect malfunctions. The researchers show how this is done and present a number of techniques for obtaining qualitative information from the state equations derivable from bond graph models. One insight is that one of the most important advantages of the bond graph ontology is the highly systematic approach to model construction it imposes on the modeler, who is forced to classify the relevant physical entities into a small number of categories, and to look for two highly specific types of interactions among them. The systematic nature of bond graph model construction facilitates the process to the point where the guidelines are sufficiently specific to be followed by modelers who are not domain experts. As a result, models of a given system constructed by different modelers will have extensive similarities. The researchers conclude by pointing out that the ease of updating bond graph models to reflect malfunctions is a manifestation of the systematic nature of bond graph construction, and of the regularity of the relationship between bond graph models and physical reality.
Towards a Generalizable Time Expression Model for Temporal Reasoning in Clinical Notes
Velupillai, Sumithra; Mowery, Danielle L.; Abdelrahman, Samir; Christensen, Lee; Chapman, Wendy W
2015-01-01
Accurate temporal identification and normalization is imperative for many biomedical and clinical tasks such as generating timelines and identifying phenotypes. A major natural language processing challenge is developing and evaluating a generalizable temporal modeling approach that performs well across corpora and institutions. Our long-term goal is to create such a model. We initiate our work on reaching this goal by focusing on temporal expression (TIMEX3) identification. We present a systematic approach to 1) generalize existing solutions for automated TIMEX3 span detection, and 2) assess similarities and differences between various instantiations of TIMEX3 models applied to separate clinical corpora. When evaluated on the 2012 i2b2 and the 2015 Clinical TempEval challenge corpora, our conclusion is that our approach is successful: we achieve competitive results for automated classification, and we identify similarities and differences in TIMEX3 modeling that will be informative in the development of a simplified, general temporal model. PMID:26958265
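For orientation, a rule-based TIMEX3 span detector, one of the simplest possible instantiations and far cruder than the systems compared in the paper, can be written as a handful of regular expressions; the patterns and the example note below are purely illustrative.

```python
# Toy sketch of rule-based TIMEX3 span detection; the patterns are illustrative
# only and do not reflect the generalized models evaluated in the paper.
import re

TIMEX_PATTERNS = [
    r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",                 # 03/14/2012
    r"\b\d{4}-\d{2}-\d{2}\b",                        # 2012-03-14
    r"\b(?:yesterday|today|tomorrow)\b",
    r"\b\d+\s+(?:day|week|month|year)s?\s+ago\b",
    r"\bpostoperative day\s+\d+\b",
]
TIMEX_RE = re.compile("|".join(TIMEX_PATTERNS), flags=re.IGNORECASE)

def find_timex_spans(text):
    """Return (start, end, matched text) for each candidate TIMEX3 span."""
    return [(m.start(), m.end(), m.group()) for m in TIMEX_RE.finditer(text)]

note = "Patient seen on 2012-03-14; symptoms began 3 days ago; follow-up tomorrow."
print(find_timex_spans(note))
```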
A two-stage DEA approach for environmental efficiency measurement.
Song, Malin; Wang, Shuhong; Liu, Wei
2014-05-01
The slacks-based measure (SBM) model based on constant returns to scale has achieved some good results in addressing undesirable outputs, such as wastewater and waste gas, in measuring environmental efficiency. However, the traditional SBM model cannot deal with the scenario in which desirable outputs are constant. Based on the axiomatic theory of productivity, this paper carries out systematic research on the SBM model considering undesirable outputs, and further expands the SBM model from the perspective of network analysis. The new model can not only perform efficiency evaluation considering undesirable outputs, but also calculate desirable and undesirable outputs separately. The latter advantage successfully solves the "dependence" problem of outputs, that is, that we cannot increase the desirable outputs without producing any undesirable outputs. The following illustration shows that the efficiency values obtained by the two-stage approach are smaller than those obtained by the traditional SBM model. Our approach provides a more profound analysis of how to improve the environmental efficiency of decision making units.
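For readers unfamiliar with the baseline, the widely used Tone-type SBM formulation with undesirable outputs, which work in this area typically builds on, has roughly the following form; this is my transcription of the standard model (m inputs, s_1 desirable and s_2 undesirable outputs, slack vectors s^-, s^g, s^b), not the paper's two-stage network extension.

```latex
% Commonly cited SBM formulation with undesirable outputs (Tone-type);
% shown for orientation only -- the two-stage network model modifies this baseline.
\rho^{*} \;=\; \min_{\lambda,\,s^{-},\,s^{g},\,s^{b}}
\frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{i0}}
     {1 + \frac{1}{s_1+s_2}\left(\sum_{r=1}^{s_1} s_r^{g}/y_{r0}^{g}
          + \sum_{r=1}^{s_2} s_r^{b}/y_{r0}^{b}\right)}
\quad\text{s.t.}\quad
x_0 = X\lambda + s^{-},\;\;
y_0^{g} = Y^{g}\lambda - s^{g},\;\;
y_0^{b} = Y^{b}\lambda + s^{b},\;\;
\lambda,\, s^{-},\, s^{g},\, s^{b} \ge 0 .
```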
Tissue engineering of the bladder--reality or myth? A systematic review.
Sloff, Marije; Simaioforidis, Vasileios; de Vries, Rob; Oosterwijk, Egbert; Feitz, Wout
2014-10-01
We systematically reviewed preclinical studies in the literature to evaluate the potential of tissue engineering of the bladder. Study outcomes were compared to the available clinical evidence to assess the feasibility of tissue engineering for future clinical use. Preclinical studies of tissue engineering for bladder augmentation were identified through a systematic search of PubMed and Embase™ from January 1, 1980 to January 1, 2014. Primary studies in English were included if bladder reconstruction after partial cystectomy was performed using a tissue engineered biomaterial in any animal species, with cystometric bladder capacity as an outcome measure. Outcomes were compared to clinical studies available at http://www.clinicaltrials.gov and published clinical studies. A total of 28 preclinical studies are included, demonstrating remarkable heterogeneity in study characteristics and design. Studies in which preoperative bladder volumes were compared to postoperative volumes were considered the most clinically relevant (18 studies). Bladder augmentation through tissue engineering resulted in a normal bladder volume in healthy animals, with the influence of a cellular component being negligible. Furthermore, experiments in large animal models (pigs and dogs) approximated the desired bladder volume more accurately than in smaller species. The initial clinical experience was based on seemingly predictive healthy animal models with a promising outcome. Unfortunately these results were not substantiated in all clinical trials, revealing dissimilar outcomes in different clinical/disease backgrounds. Thus, the translational predictability of a model using healthy animals might be questioned. Through this systematic approach we present an unbiased overview of all published preclinical studies investigating the effect of bladder tissue engineering on cystometric bladder capacity. Preclinical research in healthy animals appears to show the feasibility of bladder augmentation by tissue engineering. However, in view of the disappointing clinical results based on healthy animal models new approaches should also be evaluated in preclinical models using dysfunctional/diseased bladders. This endeavor may aid in the development of clinically applicable tissue engineered bladder augmentation with satisfactory long-term outcome. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Modeling and Reduction With Applications to Semiconductor Processing
1999-01-01
The model reduction problem is one of finding a systematic methodology within a given mathematical framework to produce an efficient or optimal trade-off of...
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
New Approach for Investigating Reaction Dynamics and Rates with Ab Initio Calculations.
Fleming, Kelly L; Tiwary, Pratyush; Pfaendtner, Jim
2016-01-21
Herein, we demonstrate a convenient approach to systematically investigate chemical reaction dynamics using the metadynamics (MetaD) family of enhanced sampling methods. Using a symmetric SN2 reaction as a model system, we applied infrequent metadynamics, a theoretical framework based on acceleration factors, to quantitatively estimate the rate of reaction from biased and unbiased simulations. A systematic study of the algorithm and its application to chemical reactions was performed by sampling over 5000 independent reaction events. Additionally, we quantitatively reweighed exhaustive free-energy calculations to obtain the reaction potential-energy surface and showed that infrequent metadynamics works to effectively determine Arrhenius-like activation energies. Exact agreement with unbiased high-temperature kinetics is also shown. The feasibility of using the approach on actual ab initio molecular dynamics calculations is then presented by using Car-Parrinello MD+MetaD to sample the same reaction using only 10-20 calculations of the rare event. Owing to the ease of use and comparatively low-cost of computation, the approach has extensive potential applications for catalysis, combustion, pyrolysis, and enzymology.
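The bookkeeping behind infrequent metadynamics rate estimates can be sketched briefly: the biased escape time is rescaled by an acceleration factor computed from the bias experienced along the run, and the ensemble of rescaled times can be checked against the expected exponential (Poisson) statistics. The numbers below are synthetic placeholders, not data from the SN2 or Car-Parrinello simulations described above.

```python
# Sketch of the infrequent-metadynamics bookkeeping (not the authors' scripts):
# rescale the biased escape time by alpha = < exp(V(s,t)/kT) > and check the
# collection of rescaled times against exponential statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
kT = 2.494          # kJ/mol at ~300 K
dt_ps = 0.5         # time between bias evaluations

# Placeholder: bias energy experienced along one biased trajectory (kJ/mol).
bias_kj_mol = np.clip(rng.normal(15.0, 5.0, 4000), 0.0, None)
alpha = np.mean(np.exp(bias_kj_mol / kT))          # acceleration factor
t_biased_ps = bias_kj_mol.size * dt_ps             # biased time to the event
t_rescaled_ps = alpha * t_biased_ps                # estimate of the true time
print(f"alpha = {alpha:.3g}, rescaled escape time = {t_rescaled_ps:.3g} ps")

# With many independent events, test whether rescaled times look exponential.
rescaled_times = rng.exponential(t_rescaled_ps, 50)    # stand-in for repeats
ks = stats.kstest(rescaled_times, "expon", args=(0, rescaled_times.mean()))
print("KS p-value vs exponential:", round(ks.pvalue, 3))
```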
A Systematic Process for Developing High Quality SaaS Cloud Services
NASA Astrophysics Data System (ADS)
La, Hyun Jung; Kim, Soo Dong
Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.
NASA Astrophysics Data System (ADS)
Pacifici, Camilla; da Cunha, Elisabete; Charlot, Stéphane; Rix, Hans-Walter; Fumagalli, Mattia; Wel, Arjen van der; Franx, Marijn; Maseda, Michael V.; van Dokkum, Pieter G.; Brammer, Gabriel B.; Momcheva, Ivelina; Skelton, Rosalind E.; Whitaker, Katherine; Leja, Joel; Lundgren, Britt; Kassin, Susan A.; Yi, Sukyoung K.
2015-02-01
Interpreting observations of distant galaxies in terms of constraints on physical parameters - such as stellar mass (M★), star formation rate (SFR) and dust optical depth (τ̂_V) - requires spectral synthesis modelling. We analyse the reliability of these physical parameters as determined under commonly adopted 'classical' assumptions: star formation histories assumed to be exponentially declining functions of time, a simple dust law and no emission-line contribution. Improved modelling techniques and data quality now allow us to use a more sophisticated approach, including realistic star formation histories, combined with modern prescriptions for dust attenuation and nebular emission. We present a Bayesian analysis of the spectra and multiwavelength photometry of 1048 galaxies from the 3D-HST survey in the redshift range 0.7 < z < 2.8 and in the stellar mass range 9 ≲ log (M★/M⊙) ≲ 12. We find that, using the classical spectral library, stellar masses are systematically overestimated (˜0.1 dex) and SFRs are systematically underestimated (˜0.6 dex) relative to our more sophisticated approach. We also find that the simultaneous fit of photometric fluxes and emission-line equivalent widths helps break a degeneracy between SFR and τ̂_V, reducing the uncertainties on these parameters. Finally, we show how the biases of classical approaches can affect the correlation between M★ and SFR for star-forming galaxies (the 'star-formation main sequence'). We conclude that the normalization, slope and scatter of this relation strongly depend on the adopted approach and demonstrate that the classical, oversimplified approach cannot recover the true distribution of M★ and SFR.
Research Applications of Magnetic Resonance Spectroscopy (MRS) to Investigate Psychiatric Disorders
Dager, SR; Oskin, NM; Richards, TL; Posse, S
2009-01-01
Advances in magnetic resonance spectroscopy (MRS) methodology and related analytic strategies allow sophisticated testing of neurobiological models of disease pathology in psychiatric disorders. An overview of principles underlying MRS, methodological considerations and investigative approaches is presented. A review of recent research is presented that highlights innovative approaches applying MRS, in particular 1H MRS, to systematically investigate specific psychiatric disorders, including autism spectrum disorders, schizophrenia, panic disorder, major depression and bipolar disorder. PMID:19363431
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
Holmes, Tyson H.; Lewis, David B.
2014-01-01
Bayesian estimation techniques offer a systematic and quantitative approach for synthesizing data drawn from the literature to model immunological systems. As detailed here, the practitioner begins with a theoretical model and then sequentially draws information from source data sets and/or published findings to inform estimation of model parameters. Options are available to weigh these various sources of information differentially per objective measures of their corresponding scientific strengths. This approach is illustrated in depth through a carefully worked example for a model of decline in T-cell receptor excision circle content of peripheral T cells during development and aging. Estimates from this model indicate that 21 years of age is plausible for the developmental timing of mean age of onset of decline in T-cell receptor excision circle content of peripheral T cells. PMID:25179832
NASA Technical Reports Server (NTRS)
Dennehy, Cornelius J.
2010-01-01
This final report summarizes the results of a comparative assessment of the fault tolerance and reliability of different Guidance, Navigation and Control (GN&C) architectural approaches. This study was proactively performed by a combined Massachusetts Institute of Technology (MIT) and Draper Laboratory team as a GN&C "Discipline-Advancing" activity sponsored by the NASA Engineering and Safety Center (NESC). This systematic comparative assessment of GN&C system architectural approaches was undertaken as a fundamental step towards understanding the opportunities for, and limitations of, architecting highly reliable and fault tolerant GN&C systems composed of common avionic components. The primary goal of this study was to obtain architectural 'rules of thumb' that could positively influence future designs in the direction of an optimized (i.e., most reliable and cost-efficient) GN&C system. A secondary goal was to demonstrate the application and the utility of a systematic modeling approach that maps the entire possible architecture solution space.
Dynamic metabolic modeling for a MAB bioprocess.
Gao, Jianying; Gorenflo, Volker M; Scharer, Jeno M; Budman, Hector M
2007-01-01
Production of monoclonal antibodies (MAb) for diagnostic or therapeutic applications has become an important task in the pharmaceutical industry. The efficiency of high-density reactor systems can be potentially increased by model-based design and control strategies. Therefore, a reliable kinetic model for cell metabolism is required. A systematic procedure based on metabolic modeling is used to model nutrient uptake and key product formation in a MAb bioprocess during both the growth and post-growth phases. The approach combines the key advantages of stoichiometric and kinetic models into a complete metabolic network while integrating the regulation and control of cellular activity. This modeling procedure can be easily applied to any cell line during both the cell growth and post-growth phases. Quadratic programming (QP) has been identified as a suitable method to solve the underdetermined constrained problem related to model parameter identification. The approach is illustrated for the case of murine hybridoma cells cultivated in stirred spinners.
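As a rough illustration of the quadratic-programming step, the sketch below poses a toy flux-identification problem: find non-negative fluxes v that best reproduce hypothetical measured rates r under a made-up stoichiometric matrix S. SciPy's non-negative least squares is used as a stand-in QP solver; none of the numbers come from the paper.

import numpy as np
from scipy.optimize import nnls

# Hypothetical stoichiometric matrix (rows: measured metabolites, cols: fluxes).
S = np.array([[ 1.0, -1.0,  0.0,  0.0],
              [ 0.0,  1.0, -1.0,  0.0],
              [ 0.0,  0.0,  1.0, -1.0]])
# Hypothetical measured uptake/production rates (mmol/gDW/h).
r = np.array([0.8, 0.1, 0.3])

# Non-negative least squares solves min ||S v - r||^2 subject to v >= 0,
# a simple instance of the constrained QP used for parameter identification.
v, residual = nnls(S, r)
print("estimated fluxes:", np.round(v, 3), "residual:", round(residual, 3))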
NASA Astrophysics Data System (ADS)
Mehrotra, Rajeshwar; Sharma, Ashish
2012-12-01
The quality of the absolute estimates of general circulation models (GCMs) calls into question the direct use of GCM outputs for climate change impact assessment studies, particularly at regional scales. Statistical correction of GCM output is often necessary when significant systematic biases occur between the modeled output and observations. A common procedure is to correct the GCM output by removing the systematic biases in low-order moments relative to observations or to reanalysis data at daily, monthly, or seasonal timescales. In this paper, we present an extension of a recently published nested bias correction (NBC) technique to correct for the low- as well as higher-order moment biases in the GCM-derived variables across selected multiple timescales. The proposed recursive nested bias correction (RNBC) approach offers an improved basis for applying bias correction at multiple timescales over the original NBC procedure. The method ensures that the bias-corrected series exhibits improvements that are consistently spread over all of the timescales considered. Different variations of the approach, from the standard NBC to the more complex recursive alternatives, are tested to assess their impacts on a range of GCM-simulated atmospheric variables of interest in downscaling applications related to hydrology and water resources. Results of the study suggest that RNBCs with three to five iterations are the most effective in removing distributional and persistence-related biases across the timescales considered.
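For orientation, the sketch below shows a single-timescale, moment-matching bias correction (monthly mean and standard deviation only) on synthetic data; it is a much simpler relative of the nested and recursive corrections described above, not the published NBC/RNBC algorithm.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("1981-01-01", "2000-12-31", freq="D")
obs = pd.Series(5 + 3 * rng.standard_normal(len(idx)), index=idx)   # synthetic observations
gcm = pd.Series(7 + 5 * rng.standard_normal(len(idx)), index=idx)   # synthetic biased GCM output

def correct_month(g, o_mean, o_std):
    # Rescale one calendar month of GCM data to the observed mean and std.
    return (g - g.mean()) / g.std() * o_std + o_mean

corrected = gcm.copy()
for m in range(1, 13):
    sel = gcm.index.month == m
    corrected[sel] = correct_month(gcm[sel], obs[sel].mean(), obs[sel].std())

# Corrected monthly means now track the observed climatology.
print(corrected.groupby(corrected.index.month).mean().round(2))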
Kane, Jennifer; Landes, Megan; Carroll, Christopher; Nolen, Amy; Sodhi, Sumeet
2017-03-23
Chronic diseases, primarily cardiovascular disease, respiratory disease, diabetes and cancer, are the leading cause of death and disability worldwide. In sub-Saharan Africa (SSA), where communicable disease prevalence still outweighs that of non-communicable disease (NCDs), rates of NCDs are rapidly rising and evidence for primary healthcare approaches for these emerging NCDs is needed. A systematic review and evidence synthesis of primary care approaches for chronic disease in SSA. Quantitative and qualitative primary research studies were included that focused on priority NCDs interventions. The method used was best-fit framework synthesis. Three conceptual models of care for NCDs in low- and middle-income countries were identified and used to develop an a priori framework for the synthesis. The literature search for relevant primary research studies generated 3759 unique citations of which 12 satisfied the inclusion criteria. Eleven studies were quantitative and one used mixed methods. Three higher-level themes of screening, prevention and management of disease were derived. This synthesis permitted the development of a new evidence-based conceptual model of care for priority NCDs in SSA. For this review there was a near-consensus that passive rather than active case-finding approaches are suitable in resource-poor settings. Modifying risk factors among existing patients through advice on diet and lifestyle was a common element of healthcare approaches. The priorities for disease management in primary care were identified as: availability of essential diagnostic tools and medications at local primary healthcare clinics and the use of standardized protocols for diagnosis, treatment, monitoring and referral to specialist care.
Shea, Beverley; Nahwegahbow, Amy; Andersson, Neil
2010-01-01
Many efforts to reduce family violence are documented in the published literature. We conducted a systematic review of interventions intended to prevent family violence in Aboriginal communities. We retrieved studies published up to October 2009; 506 papers included one systematic review, two randomized controlled trials, and fourteen nonrandomized studies or reviews. Two reviews discussed interventions relevant to primary prevention (reducing the risk factors for family violence), including parenting, role modelling, and active participation. More studies addressed secondary prevention (where risk factors exist, reducing outbreaks of violence) such as restriction on the trading hours for take away alcohol and home visiting programs for high risk families. Examples of tertiary prevention (preventing recurrence) include traditional healing circles and group counselling. Most studies contributed a low level of evidence. PMID:21052554
An Event-Based Approach to Distributed Diagnosis of Continuous Systems
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon
2010-01-01
Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
A Data Based Gymnasium: A Systematic Approach to Physical Education for the Handicapped.
ERIC Educational Resources Information Center
Dunn, John M.; And Others
The authors describe a data based physical education curriculum designed for low incidence severely handicapped students by Oregon State University in conjunction with Teaching Research. Chapter 1 provides a brief introduction to the physical education curriculum and the Teaching Research model with emphasis placed on the importance of…
Disability Policy Evaluation: Combining Logic Models and Systems Thinking
ERIC Educational Resources Information Center
Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica
2017-01-01
Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an…
Systematizing Scaffolding for Problem-Based Learning: A View from Case-Based Reasoning
ERIC Educational Resources Information Center
Tawfik, Andrew A.; Kolodner, Janet L.
2016-01-01
Current theories and models of education often argue that instruction is best administered when knowledge is situated within a context. Problem-based learning (PBL) provides an approach to education that has particularly powerful affordances for learning disciplinary content and practices by solving authentic problems within a discipline. However,…
Since the publication of the Adverse Outcome Pathway (AOP) for skin sensitization, there have been many efforts to develop systematic approaches to integrate the information generated from different key events for decision making. The types of information characterizing key event...
Diagnosing EAP Writing Ability Using the Reduced Reparameterized Unified Model
ERIC Educational Resources Information Center
Kim, Youn-Hee
2011-01-01
Despite the increasing interest in and need for test information for use in instructional practice and student learning, there have been few attempts to systematically link a diagnostic approach to English for academic purposes (EAP) writing instruction and assessment. In response to this need for research, this study examined the extent to which…
Implementing a Project-Based Learning Model in a Pre-Service Leadership Program
ERIC Educational Resources Information Center
Albritton, Shelly; Stacks, Jamie
2016-01-01
This paper describes two instructors' efforts to more authentically engage students in a preservice leadership program's course called Program Planning and Evaluation by using a project-based learning approach. Markham, Larmer, and Ravitz (2003) describe project-based learning (PjBL) as "a systematic teaching method that engages students in…
Emotion and Emotionality as a Hidden Dimension of Lexicon and Discourse
ERIC Educational Resources Information Center
Viberg, Ake
2008-01-01
In her thought-provoking article, Aneta Pavlenko approaches emotion and emotion-laden words in the bilingual lexicon from an impressive number of different perspectives. This is particularly welcome, since most models of linguistic structure do not account for emotional meanings in a systematic way. One exception worth mentioning, however, is…
Modeling human target acquisition in ground-to-air weapon systems
NASA Technical Reports Server (NTRS)
Phatak, A. V.; Mohr, R. L.; Vikmanis, M.; Wei, K. C.
1982-01-01
The problems associated with formulating and validating mathematical models for describing and predicting human target acquisition response are considered. In particular, the extension of the human observer model to include the acquisition phase as well as the tracking segment is presented. Relationship of the Observer model structure to the more complex Standard Optimal Control model formulation and to the simpler Transfer Function/Noise representation is discussed. Problems pertinent to structural identifiability and the form of the parameterization are elucidated. A systematic approach toward the identification of the observer acquisition model parameters from ensemble tracking error data is presented.
A probabilistic approach to remote compositional analysis of planetary surfaces
Lapotre, Mathieu G.A.; Ehlmann, Bethany L.; Minson, Sarah E.
2017-01-01
Reflected light from planetary surfaces provides information, including mineral/ice compositions and grain sizes, by study of albedo and absorption features as a function of wavelength. However, deconvolving the compositional signal in spectra is complicated by the nonuniqueness of the inverse problem. Trade-offs between mineral abundances and grain sizes in setting reflectance, instrument noise, and systematic errors in the forward model are potential sources of uncertainty, which are often unquantified. Here we adopt a Bayesian implementation of the Hapke model to determine sets of acceptable-fit mineral assemblages, as opposed to single best fit solutions. We quantify errors and uncertainties in mineral abundances and grain sizes that arise from instrument noise, compositional end members, optical constants, and systematic forward model errors for two suites of ternary mixtures (olivine-enstatite-anorthite and olivine-nontronite-basaltic glass) in a series of six experiments in the visible-shortwave infrared (VSWIR) wavelength range. We show that grain sizes are generally poorly constrained from VSWIR spectroscopy. Abundance and grain size trade-offs lead to typical abundance errors of ≤1 wt % (occasionally up to ~5 wt %), while ~3% noise in the data increases errors by up to ~2 wt %. Systematic errors further increase inaccuracies by a factor of 4. Finally, phases with low spectral contrast or inaccurate optical constants can further increase errors. Overall, typical errors in abundance are <10%, but sometimes significantly increase for specific mixtures, prone to abundance/grain-size trade-offs that lead to high unmixing uncertainties. These results highlight the need for probabilistic approaches to remote determination of planetary surface composition.
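The "sets of acceptable fits" idea can be illustrated with a deliberately simplified sketch: a linear mixing forward model with three invented end-member spectra stands in for the nonlinear Hapke model, and a Metropolis random walk over abundances (grain sizes omitted) returns a posterior spread rather than a single best fit. All spectra and noise levels below are made up.

import numpy as np

rng = np.random.default_rng(1)
n_bands = 50
E = np.abs(rng.normal(0.5, 0.2, size=(3, n_bands)))   # hypothetical end-member spectra
true_f = np.array([0.6, 0.3, 0.1])
sigma = 0.01
obs = true_f @ E + rng.normal(0, sigma, n_bands)       # noisy synthetic observation

def log_like(f):
    if np.any(f < 0):
        return -np.inf
    f = f / f.sum()                                    # abundances constrained to sum to one
    return -0.5 * np.sum((obs - f @ E) ** 2) / sigma**2

# Metropolis random walk over the abundance vector.
f = np.array([1/3, 1/3, 1/3])
ll = log_like(f)
samples = []
for _ in range(20000):
    prop = f + rng.normal(0, 0.02, 3)
    ll_prop = log_like(prop)
    if np.log(rng.random()) < ll_prop - ll:
        f, ll = prop / prop.sum(), ll_prop
    samples.append(f)

samples = np.array(samples[5000:])                     # discard burn-in
print("posterior mean abundances:", samples.mean(axis=0).round(3))
print("posterior std:", samples.std(axis=0).round(3))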
Application of linear regression analysis in accuracy assessment of rolling force calculations
NASA Astrophysics Data System (ADS)
Poliak, E. I.; Shim, M. K.; Kim, G. S.; Choo, W. Y.
1998-10-01
Efficient operation of the computational models employed in process control systems requires periodic assessment of the accuracy of their predictions. Linear regression is proposed as a tool that allows systematic and random prediction errors to be separated from those related to measurements. A quantitative characteristic of the model predictive ability is introduced in addition to standard statistical tests for model adequacy. Rolling force calculations are considered as an example application. However, the outlined approach can be used to assess the performance of any computational model.
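A minimal sketch of the proposed use of regression, on synthetic numbers: measured rolling force is regressed on model-predicted force, so that departures of the slope from 1 and of the intercept from 0 flag systematic prediction error, while the residual scatter reflects random error (separating out measurement error proper would need extra information, e.g. repeated measurements).

import numpy as np

rng = np.random.default_rng(2)
predicted = rng.uniform(800, 1500, 200)                    # model-predicted rolling force, kN (synthetic)
measured = 0.95 * predicted + 40 + rng.normal(0, 15, 200)  # synthetic measurements with bias + noise

slope, intercept = np.polyfit(predicted, measured, 1)
residuals = measured - (slope * predicted + intercept)

print(f"slope = {slope:.3f} (systematic gain error if != 1)")
print(f"intercept = {intercept:.1f} kN (systematic offset)")
print(f"random scatter (residual std) = {residuals.std(ddof=2):.1f} kN")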
NASA Astrophysics Data System (ADS)
Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg
2014-06-01
A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.
A systematic review of innovative diabetes care models in low-and middle-income countries (LMICs).
Esterson, Yonah B; Carey, Michelle; Piette, John D; Thomas, Nihal; Hawkins, Meredith
2014-02-01
Over 70% of the world's patients with diabetes reside in low-and middle-income countries (LMICs), where adequate infrastructure and resources for diabetes care are often lacking. Therefore, academic institutions, health care organizations, and governments from Western nations and LMICs have worked together to develop a variety of effective diabetes care models for resource-poor settings. A focused search of PubMed was conducted with the goal of identifying reports that addressed the implementation of diabetes care models or initiatives to improve clinical and/or biochemical outcomes in patients with diabetes mellitus. A total of 15 published manuscripts comprising nine diabetes care models in 16 locations in sub-Saharan Africa, Latin America, and Asia identified by the above approach were systematically reviewed. The reviewed models shared a number of principles including collaboration, education, standardization, resource optimization, and technological innovation. The most comprehensive models used a number of these principles, which contributed to their success. Reviewing the principles shared by these successful programs may help guide the development of effective future models for diabetes care in low-income settings.
Systems science and systems thinking for public health: a systematic review of the field
Carey, Gemma; Malbon, Eleanor; Carey, Nicole; Joyce, Andrew; Crammond, Brad; Carey, Alan
2015-01-01
Objectives This paper reports on findings from a systematic review designed to investigate the state of systems science research in public health. The objectives were to: (1) explore how systems methodologies are being applied within public health and (2) identify fruitful areas of activity. Design A systematic review was conducted from existing literature that draws on or uses systems science (in its various forms) and relates to key public health areas of action and concern, including tobacco, alcohol, obesity and the social determinants of health. Data analysis 117 articles were included in the review. An inductive qualitative content analysis was used for data extraction. The following were systematically extracted from the articles: approach, methodology, transparency, strengths and weaknesses. These were then organised according to theme (ie, commonalities between studies within each category), in order to provide an overview of the state of the field as a whole. The assessment of data quality was intrinsic to the goals of the review itself, and therefore, was carried out as part of the analysis. Results 4 categories of research were identified from the review, ranging from editorial and commentary pieces to complex system dynamic modelling. Our analysis of each of these categories of research highlighted areas of potential for systems science to strengthen public health efforts, while also revealing a number of limitations in the dynamic systems modelling being carried out in public health. Conclusions There is a great deal of interest in how the application of systems concepts and approach might aid public health. Our analysis suggests that soft systems modelling techniques are likely to be the most useful addition to public health, and align well with current debate around knowledge transfer and policy. However, the full range of systems methodologies is yet to be engaged with by public health researchers. PMID:26719314
VALUE - Validating and Integrating Downscaling Methods for Climate Change Research
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose
2013-04-01
Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been established. VALUE is a research network, currently with participants from 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "Future Earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle, every downscaling method can be validated and compared with competing methods. The results of this exercise will directly provide end users with important information about the uncertainty of regional climate scenarios, and will furthermore provide the basis for further developing downscaling methods. This presentation will provide background information on VALUE and discuss the identified characteristics and the validation framework.
Schüle, Steffen Andreas; Bolte, Gabriele
2015-01-01
Background The research question how contextual factors of neighbourhood environments influence individual health has gained increasing attention in public health research. Both socioeconomic neighbourhood characteristics and factors of the built environment play an important role for health and health-related behaviours. However, their reciprocal relationships have not been systematically reviewed so far. This systematic review aims to identify studies applying a multilevel modelling approach which consider both neighbourhood socioeconomic position (SEP) and factors of the objective built environment simultaneously in order to disentangle their independent and interactive effects on individual health. Methods The three databases PubMed, PsycINFO, and Web of Science were systematically searched with terms for title and abstract screening. Grey literature was not included. Observational studies from USA, Canada, Australia, New Zealand, and Western European countries were considered which analysed simultaneously factors of neighbourhood SEP and the objective built environment with a multilevel modelling approach. Adjustment for individual SEP was a further inclusion criterion. Results Thirty-three studies were included in qualitative synthesis. Twenty-two studies showed an independent association between characteristics of neighbourhood SEP or the built environment and individual health outcomes or health-related behaviours. Twenty-one studies found cross-level or within-level interactions either between neighbourhood SEP and the built environment, or between neighbourhood SEP or the built environment and individual characteristics, such as sex, individual SEP or ethnicity. Due to the large variation of study design and heterogeneous reporting of results the identification of consistent findings was problematic and made quantitative analysis not possible. Conclusions There is a need for studies considering multiple neighbourhood dimensions and applying multilevel modelling in order to clarify their causal relationship towards individual health. Especially, more studies using comparable characteristics of neighbourhood SEP and the objective built environment and analysing interactive effects are necessary to disentangle health impacts and identify vulnerable neighbourhoods and population groups. PMID:25849569
Exploring "patient-centered" hospitals: a systematic review to understand change.
Gabutti, Irene; Mascia, Daniele; Cicchetti, Americo
2017-05-22
The healthcare scenario in developed countries is changing profoundly: patients, who are frequently affected by multi-pathological chronic conditions, have raised their expectations. Simultaneously, dramatic financial pressures require healthcare organizations to provide more and better services with equal (or decreasing) resources. In response to these challenges, hospitals are facing radical transformations by bridging, redesigning and engaging their organization and staff. This study has the ambitious aim of shedding light on, and clearly labelling, the trends of change hospitals are pursuing in developed economies, in order to fully understand the presence of common trends and which organizational models and features are inspiring the most innovative organizations. The purpose is to take stock of what is known in the field of hospital organization about how hospitals are changing, as well as of how such change may be implemented effectively through managerial tools. To do so, the methodology adopted integrates a systematic literature review with a wider engaged-research approach. Evidence suggests that the three main pillars of change of the system are the progressive patient care model, the patient-centered approach and the lean approach. However, a number of gaps emerge in what is known about how to exploit drivers of change and their effects. This study confirms that efforts in the literature concentrate on analyzing circumscribed experiences in the implementation of new models and approaches, and therefore fail to extend the analysis to the organizational and inter-organizational level so that generalizable conclusions can be drawn. There seem to be a number of "gaps" in what is known about how to exploit drivers of change and their effects, suggesting that the research approach favoured until now fails to provide clear guidance to policy makers and to organizations' management on how to concretely and effectively implement new organizational models.
Three approaches to investigating the multidimensional nature of a science assessment
NASA Astrophysics Data System (ADS)
Gokiert, Rebecca Jayne
The purpose of this study was to investigate a multi-method approach for collecting validity evidence about the underlying knowledge and skills measured by a large-scale science assessment. The three approaches included analysis of dimensionality, differential item functioning (DIF), and think-aloud interviews. The specific research questions addressed were: (1) Does the 4-factor model previously found by Hamilton et al. (1995) for the grade 8 sample explain the data? (2) Do the performances of male and female students systematically differ? Are these performance differences captured in the dimensions? (3) Can think-aloud reports aid in the generation of hypotheses about the underlying knowledge and skills that are measured by this test? A confirmatory factor analysis of the 4-factor model revealed good model data fit for both the AB and AC tests. Twenty-four of the 83 AB test items and 16 of the 77 AC test items displayed significant DIF, however, items were found, on average, to favour both males and females equally. There were some systematic differences found across the 4-factors; items favouring males tended to be related to earth and space sciences, stereotypical male related activities, and numerical operations. Conversely, females were found to outperform males on items that required careful reading and attention to detail. Concurrent and retrospective verbal reports (Ericsson & Simon, 1993) were collected from 16 grade 8 students (9 male and 7 female) while they solved 12 DIF items. Four general cognitive processing themes were identified from the student protocols that could be used to explain male and female problem solving. The themes included comprehension (verbal and visual), visualization, background knowledge/experience (school or life), and strategy use. There were systematic differences in cognitive processing between the students that answered the items correctly and the students who answered the items incorrectly; however, this did not always correspond with the statistical gender DIF results. Although the multifaceted approach produced interpretable and meaningful validity evidence about the knowledge and skills, these forms of validity evidence only begin to provide a basic understanding of the underlying construct(s) that are being measured.
Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice
2017-01-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170
Endoscopic versus surgical treatment of ampullary adenomas: a systematic review and meta-analysis
Mendonça, Ernesto Quaresma; Bernardo, Wanderley Marques; de Moura, Eduardo Guimarães Hourneaux; Chaves, Dalton Marques; Kondo, André; Pu, Leonardo Zorrón Cheng Tao; Baracat, Felipe Iankelevich
2016-01-01
The aim of this study is to address the outcomes of endoscopic resection compared with surgery in the treatment of ampullary adenomas. A systematic review and meta-analysis were performed according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations. For this purpose, the Medline, Embase, Cochrane, Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS), Scopus and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases were scanned. Studies included patients with ampullary adenomas and data considering endoscopic treatment compared with surgery. The entire analysis was based on a fixed-effects model. Five retrospective cohort studies were selected (466 patients). All five studies (466 patients) had complete primary resection data available and showed a difference that favored surgical treatment (risk difference [RD] = -0.24, 95% confidence interval [CI] = -0.44 to -0.04). Primary success data were identified in all five studies as well. Analysis showed that the surgical approach outperformed endoscopic treatment for this outcome (RD = -0.37, 95% CI = -0.50 to -0.24). Recurrence data were found in all studies (466 patients), with a benefit indicated for surgical treatment (RD = 0.10, 95% CI = -0.01 to 0.19). Three studies (252 patients) presented complication data, but analysis showed no difference between the approaches for this parameter (RD = -0.15, 95% CI = -0.53 to 0.23). Considering complete primary resection, primary success and recurrence outcomes, the surgical approach achieves significantly better results. Regarding complication data, this systematic review concludes that rates are not significantly different. PMID:26872081
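The fixed-effects pooling used for outcomes such as complete primary resection can be sketched as inverse-variance weighting of study-level risk differences; the per-study numbers below are placeholders, not the values of the five included cohorts.

import numpy as np

# Placeholder per-study risk differences with 95% CIs (lower, upper).
studies = [(-0.30, -0.55, -0.05),
           (-0.20, -0.40,  0.00),
           (-0.25, -0.50,  0.00),
           (-0.15, -0.35,  0.05),
           (-0.28, -0.52, -0.04)]

rd = np.array([s[0] for s in studies])
se = np.array([(s[2] - s[1]) / (2 * 1.96) for s in studies])  # SE recovered from CI width

w = 1.0 / se**2                          # inverse-variance weights (fixed-effect model)
rd_pooled = np.sum(w * rd) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = (rd_pooled - 1.96 * se_pooled, rd_pooled + 1.96 * se_pooled)

print(f"pooled RD = {rd_pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")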
Dretzke, Janine; Ensor, Joie; Bayliss, Sue; Hodgkinson, James; Lordkipanidzé, Marie; Riley, Richard D; Fitzmaurice, David; Moore, David
2014-12-03
Prognostic factors are associated with the risk of future health outcomes in individuals with a particular health condition. The prognostic ability of such factors is increasingly being assessed in both primary research and systematic reviews. Systematic review methodology in this area is continuing to evolve, reflected in variable approaches to key methodological aspects. The aim of this article was to (i) explore and compare the methodology of systematic reviews of prognostic factors undertaken for the same clinical question, (ii) to discuss implications for review findings, and (iii) to present recommendations on what might be considered to be 'good practice' approaches. The sample was comprised of eight systematic reviews addressing the same clinical question, namely whether 'aspirin resistance' (a potential prognostic factor) has prognostic utility relative to future vascular events in patients on aspirin therapy for secondary prevention. A detailed comparison of methods around study identification, study selection, quality assessment, approaches to analysis, and reporting of findings was undertaken and the implications discussed. These were summarised into key considerations that may be transferable to future systematic reviews of prognostic factors. Across systematic reviews addressing the same clinical question, there were considerable differences in the numbers of studies identified and overlap between included studies, which could only partially be explained by different study eligibility criteria. Incomplete reporting and differences in terminology within primary studies hampered study identification and selection process across reviews. Quality assessment was highly variable and only one systematic review considered a checklist for studies of prognostic questions. There was inconsistency between reviews in approaches towards analysis, synthesis, addressing heterogeneity and reporting of results. Different methodological approaches may ultimately affect the findings and interpretation of systematic reviews of prognostic research, with implications for clinical decision-making.
Field-theoretic approach to fluctuation effects in neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buice, Michael A.; Cowan, Jack D.; Mathematics Department, University of Chicago, Chicago, Illinois 60637
A well-defined stochastic theory for neural activity, which permits the calculation of arbitrary statistical moments and equations governing them, is a potentially valuable tool for theoretical neuroscience. We produce such a theory by analyzing the dynamics of neural activity using field theoretic methods for nonequilibrium statistical processes. Assuming that neural network activity is Markovian, we construct the effective spike model, which describes both neural fluctuations and response. This analysis leads to a systematic expansion of corrections to mean field theory, which for the effective spike model is a simple version of the Wilson-Cowan equation. We argue that neural activity governed by this model exhibits a dynamical phase transition which is in the universality class of directed percolation. More general models (which may incorporate refractoriness) can exhibit other universality classes, such as dynamic isotropic percolation. Because of the extremely high connectivity in typical networks, it is expected that higher-order terms in the systematic expansion are small for experimentally accessible measurements, and thus, consistent with measurements in neocortical slice preparations, we expect mean field exponents for the transition. We provide a quantitative criterion for the relative magnitude of each term in the systematic expansion, analogous to the Ginsburg criterion. Experimental identification of dynamic universality classes in vivo is an outstanding and important question for neuroscience.
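For readers unfamiliar with the mean-field limit referred to above, the standard Wilson-Cowan-type rate equation can be written schematically (the precise nonlinearity and kernel used for the effective spike model may differ, so this is only the textbook form):

\frac{\partial a(x,t)}{\partial t} = -\alpha\, a(x,t) + f\!\left( \int w(x - x')\, a(x',t)\, \mathrm{d}x' + h(x,t) \right)

where a(x,t) is the mean activity, α a decay rate, w the synaptic connectivity kernel, f a firing-rate nonlinearity and h an external input; the fluctuation corrections discussed in the abstract are organized as a systematic expansion around this equation.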
Lyon, Aaron R.; Maras, Melissa A.; Pate, Christina M.; Igusa, Takeru; Stoep, Ann Vander
2016-01-01
Although it is widely known that the occurrence of depression increases over the course of adolescence, symptoms of mood disorders frequently go undetected. While schools are viable settings for conducting universal screening to systematically identify students in need of services for common health conditions, particularly those that adversely affect school performance, few school districts routinely screen their students for depression. Among the most commonly referenced barriers are concerns that the number of students identified may exceed schools’ service delivery capacities, but few studies have evaluated this concern systematically. System dynamics (SD) modeling may prove a useful approach for answering questions of this sort. The goal of the current paper is therefore to demonstrate how SD modeling can be applied to inform implementation decisions in communities. In our demonstration, we used SD modeling to estimate the additional service demand generated by universal depression screening in a typical high school. We then simulated the effects of implementing “compensatory approaches” designed to address anticipated increases in service need through (1) the allocation of additional staff time and (2) improvements in the effectiveness of mental health interventions. Results support the ability of screening to facilitate more rapid entry into services and suggest that improving the effectiveness of mental health services for students with depression via the implementation of an evidence-based treatment protocol may have a limited impact on overall recovery rates and service availability. In our example, the SD approach proved useful in informing systems’ decision-making about the adoption of a new school mental health service. PMID:25601192
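A stripped-down stock-and-flow sketch of the kind of system dynamics model described above, with invented rates: screening moves students from an "undetected" stock to a "waiting" stock, which drains into services at a staff-limited rate. It illustrates the structure only; the published model and its parameter values are not reproduced here.

# Euler integration of a toy stock-and-flow model (units: students, weeks).
undetected, waiting, in_service = 120.0, 10.0, 20.0
detect_rate = 0.20      # fraction of undetected students identified per week via screening
capacity = 25.0         # maximum students who can be in service at once (staff-limited)
intake_rate = 0.50      # fraction of open slots filled per week
recovery_rate = 0.10    # fraction of students in service who exit (recover) per week

for week in range(26):
    detected = detect_rate * undetected
    open_slots = max(capacity - in_service, 0.0)
    started = min(intake_rate * open_slots, waiting)
    finished = recovery_rate * in_service

    undetected += -detected
    waiting += detected - started
    in_service += started - finished

print(f"after 26 weeks: waiting={waiting:.1f}, in service={in_service:.1f}")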
Ambulatory Antibiotic Stewardship through a Human Factors Engineering Approach: A Systematic Review.
Keller, Sara C; Tamma, Pranita D; Cosgrove, Sara E; Miller, Melissa A; Sateia, Heather; Szymczak, Julie; Gurses, Ayse P; Linder, Jeffrey A
2018-01-01
In the United States, most antibiotics are prescribed in ambulatory settings. Human factors engineering, which explores interactions between people and the place where they work, has successfully improved quality of care. However, human factors engineering models have not been explored to frame what is known about ambulatory antibiotic stewardship (AS) interventions and barriers and facilitators to their implementation. We conducted a systematic review and searched OVID MEDLINE, Embase, Scopus, Web of Science, and CINAHL to identify controlled interventions and qualitative studies of ambulatory AS and determine whether and how they incorporated principles from a human factors engineering model, the Systems Engineering Initiative for Patient Safety 2.0 model. This model describes how a work system (ambulatory clinic) contributes to a process (antibiotic prescribing) that leads to outcomes. The work system consists of 5 components: tools and technology, organization, person, tasks, and environment, within an external environment. Of 1,288 abstracts initially identified, 42 quantitative studies and 17 qualitative studies met inclusion criteria. Effective interventions focused on tools and technology (eg, clinical decision support and point-of-care testing), the person (eg, clinician education), organization (eg, audit and feedback and academic detailing), tasks (eg, delayed antibiotic prescribing), the environment (eg, commitment posters), and the external environment (media campaigns). Studies have not focused on clinic-wide approaches to AS. A human factors engineering approach suggests that investigating the role of the clinic's processes or physical layout or external pressures' role in antibiotic prescribing may be a promising way to improve ambulatory AS.
Ottmann, Goetz; Allen, Jacqui; Feldman, Peter
2013-11-01
Consumer-directed care is increasingly becoming a mainstream option in community-based aged care. However, a systematic review describing how the current evaluation research translates into practice has not been published to date. This review aimed to systematically establish an evidence base of user preferences for and satisfaction with services associated with consumer-directed care programmes for older people. Twelve databases were searched, including MedLine, BioMed Central, Cinahl, Expanded Academic ASAP, PsychInfo, ProQuest, Age Line, Science Direct, Social Citation Index, Sociological Abstracts, Web of Science and the Cochrane Library. Google Scholar and Google were also searched. Eligible studies were those reporting on choice, user preferences and service satisfaction outcomes regarding a programme or model of home-based care in the United States or United Kingdom. This systematic narrative review retrieved literature published from January 1992 to August 2011. A total of 277 references were identified. Of these, 17 met the selection criteria and were reviewed. Findings indicate that older people report varying preferences for consumer-directed care, with some demonstrating limited interest. Clients and carers reported good service satisfaction. However, research comparing user preferences across countries or investigating how ecological factors shape user preferences has received limited attention. Policy-makers and practitioners need to carefully consider the diverse contexts, needs and preferences of older adults in adopting consumer-directed care approaches in community aged care. The review calls for the development of consumer-directed care programmes offering a broad range of options that allow for personalisation and greater control over services without necessarily transferring administrative responsibilities to service users. Review findings suggest that consumer-directed care approaches have the potential to empower older people.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balderson, Michael, E-mail: michael.balderson@rmp.uhn.ca; Brown, Derek; Johnson, Patricia
The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic–based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15 mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT.
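The core calculation (Poisson TCP from linear-quadratic cell kill, averaged over Monte Carlo draws of radiosensitivity and a geometric shift) can be sketched in one dimension with invented parameters; the actual study uses full 3-D dose distributions from the IMRT and VMAT plans, so the numbers below only illustrate the mechanics.

import numpy as np

rng = np.random.default_rng(3)
n_fractions, d_presc = 30, 2.0                    # 30 x 2 Gy schedule (illustrative)
n0 = 1e7                                          # initial clonogen number (illustrative)
alpha_mean, alpha_sd, ab_ratio = 0.3, 0.06, 10.0  # alpha (Gy^-1), alpha/beta = 10 Gy

def dose_profile(x):
    # 1-D "planning" dose: prescription within +/- 30 mm, linear falloff outside (toy profile).
    return d_presc * np.clip(1.0 - np.maximum(np.abs(x) - 30.0, 0.0) / 10.0, 0.0, 1.0)

target = np.linspace(-25, 25, 51)                 # 1-D clonogen positions (mm)

def mean_tcp(shift_sd_mm, n_patients=2000):
    tcps = []
    for _ in range(n_patients):
        alpha = max(rng.normal(alpha_mean, alpha_sd), 1e-3)
        beta = alpha / ab_ratio
        shift = rng.normal(0.0, shift_sd_mm)      # random geometric miss (same every fraction here)
        d = dose_profile(target + shift)          # per-fraction dose seen by each clonogen
        sf = np.exp(-n_fractions * (alpha * d + beta * d**2))   # LQ surviving fraction
        tcps.append(np.exp(-(n0 / target.size) * np.sum(sf)))   # Poisson TCP over clonogens
    return np.mean(tcps)

for sd in (0.0, 5.0, 10.0):
    print(f"shift sd = {sd:4.1f} mm -> mean TCP = {mean_tcp(sd):.3f}")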
Arepeva, Maria; Kolbin, Alexey; Kurylev, Alexey; Balykina, Julia; Sidorenko, Sergey
2015-01-01
Acquired bacterial resistance is one of the causes of mortality and morbidity from infectious diseases. Mathematical modeling allows us to predict the spread of resistance and to some extent to control its dynamics. The purpose of this review was to examine existing mathematical models in order to understand the pros and cons of currently used approaches and to build our own model. During the analysis, seven articles on mathematical approaches to studying resistance that satisfied the inclusion/exclusion criteria were selected. All models were classified according to the approach used to study resistance in the presence of an antibiotic and were analyzed in terms of our research. Some models require modifications due to the specifics of the research. The plan for further work on model building is as follows: modify some models, according to our research, check all obtained models against our data, and select the optimal model or models with the best quality of prediction. After that we would be able to build a model for the development of resistance using the obtained results. PMID:25972847
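The simplest members of the model family reviewed here are compartmental ODEs for antibiotic-sensitive and resistant subpopulations competing for a shared carrying capacity under drug pressure; the sketch below, with invented parameters and an arbitrary intermittent dosing pattern, shows the structure of this class of models rather than the model the authors ultimately adopt.

import numpy as np
from scipy.integrate import odeint

def dynamics(y, t, r_s, r_r, k, kill, conc):
    s, r = y                                # sensitive, resistant densities
    total = s + r
    ds = r_s * s * (1 - total / k) - kill * conc(t) * s   # drug kills sensitive cells only
    dr = r_r * r * (1 - total / k)                        # resistant cells carry a fitness cost (r_r < r_s)
    return [ds, dr]

conc = lambda t: 1.0 if (t % 24) < 8 else 0.0             # intermittent antibiotic exposure (toy)
t = np.linspace(0, 240, 2401)                             # hours
sol = odeint(dynamics, [1e6, 1e3], t, args=(0.7, 0.5, 1e9, 2.0, conc))

print(f"final sensitive: {sol[-1, 0]:.2e}, final resistant: {sol[-1, 1]:.2e}")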
Integrating Identity Management With Federated Healthcare Data Models
NASA Astrophysics Data System (ADS)
Hu, Jun; Peyton, Liam
In order to manage performance and provide integrated services, health care data needs to be linked and aggregated across data sources from different organizations. The Internet and secure B2B networks offer the possibility of providing near real-time integration. However, there are three major stumbling blocks. One is to standardize and agree upon a common data model across organizations. The second is to match identities between different locations in order to link and aggregate records. The third is to protect identity and ensure compliance with privacy laws. In this paper, we analyze three main approaches to the problem and use a healthcare scenario to illustrate how each one addresses different aspects of the problem while failing to address others. We then present a systematic framework in which the different approaches can be flexibly combined for a more comprehensive approach to integrate identity management with federated healthcare data models.
A time domain frequency-selective multivariate Granger causality approach.
Leistritz, Lutz; Witte, Herbert
2016-08-01
The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.
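Setting aside the frequency-selective signal decomposition that is the novelty of this approach, the underlying Granger causality index can be sketched as a comparison of residual variances between a "full" autoregressive model (target predicted from its own past and the candidate source's past) and a "restricted" one (own past only); the simulated data, model order and helper function below are illustrative assumptions.

import numpy as np

def ar_residual_var(y, regressors, order):
    """Least-squares AR fit of y on lagged regressors; returns the residual variance."""
    n = len(y)
    X = np.column_stack([r[order - lag:n - lag] for r in regressors for lag in range(1, order + 1)])
    target = y[order:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ coef)

rng = np.random.default_rng(4)
n, order = 2000, 3
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.3 * rng.standard_normal()  # x drives y

var_full = ar_residual_var(y, [y, x], order)       # predict y from past of y and x
var_restricted = ar_residual_var(y, [y], order)    # predict y from past of y only
gci_x_to_y = np.log(var_restricted / var_full)     # Granger causality index x -> y

print(f"GCI x->y = {gci_x_to_y:.3f} (positive: past of x improves prediction of y)")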
Developing parenting programs to prevent child health risk behaviors: a practice model
Jackson, Christine; Dickinson, Denise M.
2009-01-01
Research indicates that developing public health programs to modify parenting behaviors could lead to multiple beneficial health outcomes for children. Developing feasible effective parenting programs requires an approach that applies a theory-based model of parenting to a specific domain of child health and engages participant representatives in intervention development. This article describes this approach to intervention development in detail. Our presentation emphasizes three points that provide key insights into the goals and procedures of parenting program development. These are a generalized theoretical model of parenting derived from the child development literature, an established eight-step parenting intervention development process and an approach to integrating experiential learning methods into interventions for parents and children. By disseminating this framework for a systematic theory-based approach to developing parenting programs, we aim to support the program development efforts of public health researchers and practitioners who recognize the potential of parenting programs to achieve primary prevention of health risk behaviors in children. PMID:19661165
NASA Astrophysics Data System (ADS)
Kopytova, Taisiya
2016-01-01
When studying isolated brown dwarfs and directly imaged exoplanets with insignificant orbital motion, we have to rely on theoretical models to determine basic parameters such as mass, age, effective temperature, and surface gravity. While stellar and atmospheric models are rapidly evolving, we need a powerful tool to test and calibrate them. In my thesis, I focussed on comparing interior and atmospheric models with observational data, in the effort of taking into account various systematic effects that can significantly influence the data analysis. As a first step, about 460 candidate members of the Hyades were screened for companions using diffraction-limited imaging observations (both our own data and archival data). As a result, I could establish the single-star sequence for the Hyades comprising about 250 stars (Kopytova et al. 2015, accepted to A&A). Open clusters contain many coeval objects of the same chemical composition and age, spanning a range of masses. We compare the obtained sequence with a set of theoretical isochrones, identifying systematic offsets and revealing probable issues in the models. However, there are many cases when it is impossible to test models before comparing them with observations. As a second step, we apply atmospheric models for constraining parameters of WISE 0855-07, the coolest known Y dwarf (Kopytova et al. 2014, ApJ 797, 3). We demonstrate the limits of constraining effective temperature and the presence/absence of water clouds. As a third step, we introduce a novel method to take into account the above-mentioned systematics. We construct a "systematics vector" that allows us to reveal problematic wavelength ranges when fitting atmospheric models to observed near-infrared spectra of brown dwarfs and exoplanets (Kopytova et al., in prep.). This approach plays a crucial role when retrieving abundances for these objects, in particular the C/O ratio. The latter parameter is an important key to formation scenarios of brown dwarfs and exoplanets. We show how to constrain the C/O ratio while eliminating systematic effects, which significantly improves the reliability of the final result and our conclusions about the formation history of certain exoplanets and brown dwarfs.
Theory of wavelet-based coarse-graining hierarchies for molecular dynamics.
Rinderspacher, Berend Christopher; Bardhan, Jaydeep P; Ismail, Ahmed E
2017-07-01
We present a multiresolution approach to compressing the degrees of freedom and potentials associated with molecular dynamics, such as the bond potentials. The approach suggests a systematic way to accelerate large-scale molecular simulations with more than two levels of coarse graining, particularly for applications to polymeric materials. In particular, we derive explicit models for (arbitrarily large) linear (homo)polymers and iterative methods to compute large-scale wavelet decompositions from fragment solutions. This approach does not require explicit preparation of atomistic-to-coarse-grained mappings, but instead uses the theory of diffusion wavelets for graph Laplacians to develop system-specific mappings. Our methodology leads to a hierarchy of system-specific coarse-grained degrees of freedom that provides a conceptually clear and mathematically rigorous framework for modeling chemical systems at relevant model scales. The approach is capable of automatically generating as many coarse-grained model scales as necessary, that is, of going beyond the two scales in conventional coarse-graining strategies; furthermore, the wavelet-based coarse-grained models explicitly link time and length scales. Finally, a straightforward method for the reintroduction of omitted degrees of freedom is presented, which plays a major role in maintaining model fidelity in long-time simulations and in capturing emergent behaviors.
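As a loose illustration of the diffusion-wavelet idea (not the authors' implementation), the sketch below builds the graph Laplacian of a linear homopolymer modeled as a path graph and examines dyadic powers of the associated diffusion operator, whose shrinking effective rank hints at a hierarchy of coarse-grained degrees of freedom; the 16-bead chain, the function names and the tolerance are assumptions, and the orthogonalization that defines true diffusion wavelets is omitted.

```python
import numpy as np

def path_graph_laplacian(n):
    """Graph Laplacian of a linear (homo)polymer modeled as a path graph of n beads."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = 1.0
    A[idx + 1, idx] = 1.0                      # bonds between neighbouring beads
    return np.diag(A.sum(axis=1)) - A

def diffusion_operator(L, t=1.0):
    """Diffusion operator exp(-t L); dyadic powers of such operators underlie
    diffusion-wavelet hierarchies on graphs."""
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-t * w)) @ V.T

# Dyadic powers T, T^2, T^4, ... progressively suppress fine-scale (high-frequency)
# degrees of freedom, suggesting successive coarse-grained model scales.
L = path_graph_laplacian(16)
T = diffusion_operator(L)
levels = [np.linalg.matrix_power(T, 2 ** k) for k in range(4)]
print([int(np.linalg.matrix_rank(M, tol=1e-6)) for M in levels])  # effective rank shrinks per level
```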
Modeling and Measurement Constraints in Fault Diagnostics for HVAC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Najafi, Massieh; Auslander, David M.; Bartlett, Peter L.
2010-05-30
Many studies have shown that energy savings of five to fifteen percent are achievable in commercial buildings by detecting and correcting building faults, and optimizing building control systems. However, in spite of good progress in developing tools for determining HVAC diagnostics, methods to detect faults in HVAC systems are still generally undeveloped. Most approaches use numerical filtering or parameter estimation methods to compare data from energy meters and building sensors to predictions from mathematical or statistical models. They are effective when models are relatively accurate and data contain few errors. In this paper, we address the case where models are imperfect and data are variable, uncertain, and can contain error. We apply a Bayesian updating approach that is systematic in managing and accounting for most forms of model and data errors. The proposed method uses both knowledge of first principle modeling and empirical results to analyze the system performance within the boundaries defined by practical constraints. We demonstrate the approach by detecting faults in commercial building air handling units. We find that the limitations that exist in air handling unit diagnostics due to practical constraints can generally be effectively addressed through the proposed approach.
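To make the Bayesian updating idea concrete, here is a minimal sketch (not the paper's formulation) that updates a discrete belief over two air-handling-unit hypotheses from noisy supply-air temperature readings; the hypotheses, predicted temperatures and noise level are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Hypotheses about the air handling unit: no fault vs. a stuck cooling-coil valve.
# Each hypothesis predicts a supply-air temperature (deg C) from a first-principles model.
hypotheses = {"no_fault": 13.0, "valve_stuck": 17.5}
prior = {"no_fault": 0.9, "valve_stuck": 0.1}
sensor_sigma = 1.5  # assumed combined measurement + model error (deg C)

def bayes_update(belief, measurement):
    """One Bayesian update: posterior proportional to likelihood(measurement | hypothesis) * prior."""
    posterior = {h: norm.pdf(measurement, loc=pred, scale=sensor_sigma) * belief[h]
                 for h, pred in hypotheses.items()}
    z = sum(posterior.values())
    return {h: p / z for h, p in posterior.items()}

belief = prior
for temp in [16.8, 17.2, 17.6]:        # a sequence of noisy supply-air temperature readings
    belief = bayes_update(belief, temp)
print(belief)                          # belief shifts toward the stuck-valve hypothesis
```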
Interventions and approaches to integrating HIV and mental health services: a systematic review.
Chuah, Fiona Leh Hoon; Haldane, Victoria Elizabeth; Cervero-Liceras, Francisco; Ong, Suan Ee; Sigfrid, Louise A; Murphy, Georgina; Watt, Nicola; Balabanova, Dina; Hogarth, Sue; Maimaris, Will; Otero, Laura; Buse, Kent; McKee, Martin; Piot, Peter; Perel, Pablo; Legido-Quigley, Helena
2017-11-01
The frequency with which HIV/AIDS and mental health problems co-exist, and the complex bi-directional relationship between them, highlights the need for effective care models combining services for HIV and mental health. Here, we present a systematic review that synthesizes the literature on interventions and approaches integrating these services. This review was part of a larger systematic review on integration of services for HIV and non-communicable diseases. Eligible studies included those that described or evaluated an intervention or approach aimed at integrating HIV and mental health care. We searched multiple databases from inception until October 2015, independently screened articles identified for inclusion, conducted data extraction, and assessed evaluative papers for risk of bias. Forty-five articles were eligible for this review. We identified three models of integration at the meso and micro levels: single-facility integration, multi-facility integration, and integrated care coordinated by a non-physician case manager. Single-site integration enhances multidisciplinary coordination and reduces access barriers for patients. However, the practicality and cost-effectiveness of providing a full continuum of specialized care on-site for patients with complex needs is arguable. Integration based on a collaborative network of specialized agencies may serve those with multiple co-morbidities, but fragmented and poorly coordinated care can pose barriers. Integrated care coordinated by a single case manager can enable continuity of care for patients but requires appropriate training and support for case managers. Involving patients as key actors in facilitating integration within their own treatment plan is a promising approach. This review identified much diversity in integration models combining HIV and mental health services, which are shown to have potential in yielding positive patient and service delivery outcomes when implemented within appropriate contexts. Our review revealed a lack of research in low- and middle-income countries, and most included studies were descriptive rather than evaluative. Overall, studies that seek to evaluate and compare integration models in terms of long-term outcomes and cost-effectiveness are needed, particularly at the health system level and in regions with a high HIV and AIDS burden. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
A systematic review of health economic models and utility estimation methods in schizophrenia.
Németh, Bertalan; Fasseeh, Ahmad; Molnár, Anett; Bitter, István; Horváth, Margit; Kóczián, Kristóf; Götze, Árpád; Nagy, Balázs
2018-06-01
There is a growing need for economic evaluations describing the disease course, as well as the costs and clinical outcomes related to the treatment of schizophrenia. Areas covered: A systematic review of studies describing health economic models in schizophrenia and a targeted literature review on utility mapping algorithms in schizophrenia were carried out. Models found in the review were collated and assessed in detail according to their type and various other attributes. Fifty-nine studies were included in the review. Modeling techniques varied from simple decision trees to complex simulation models. The models used various clinical endpoints as value drivers: 47% of the models used quality-adjusted life years and 8% used disability-adjusted life years to measure benefits, while others applied various clinical outcomes. Most models considered patients switching between therapies, and therapeutic adherence, compliance or persistence. The targeted literature review identified four main approaches to map PANSS scores to utility values. Expert commentary: Health economic models developed for schizophrenia showed great variability, with simulation models becoming more frequently used in the last decade. Using PANSS scores as the basis of utility estimations is justifiable.
Calcium dynamics and signaling in vascular regulation: computational models
Tsoukias, Nikolaos Michael
2013-01-01
Calcium is a universal signaling molecule with a central role in a number of vascular functions, including the regulation of tone and blood flow. Experimentation has provided insights into signaling pathways that lead to, or are affected by, Ca2+ mobilization in the vasculature. Mathematical modeling offers a systematic approach to the analysis of these mechanisms and can serve as a tool for data interpretation and for guiding new experimental studies. Comprehensive models of calcium dynamics are well advanced for some systems, such as the heart. This review summarizes the progress that has been made in modeling Ca2+ dynamics and signaling in vascular cells. Model simulations show how Ca2+ signaling emerges as a result of complex, nonlinear interactions that cannot be properly analyzed using only a reductionist's approach. A strategy of integrative modeling in the vasculature is outlined that will allow linking macroscale pathophysiological responses to the underlying cellular mechanisms. PMID:21061306
The Cirrus Parcel Model Comparison Project. Phase 1
NASA Technical Reports Server (NTRS)
Lin, Ruei-Fong; Starr, D.; DeMott, P.; Cotten, R.; Jensen, E.; Sassen, K.
2000-01-01
The Cirrus Parcel Model Comparison Project involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase 1 of the project reported here, simulated cirrus cloud microphysical properties are compared for situations of "warm" (-40 C) and "cold" (-60 C) cirrus subject to updrafts of 4, 20 and 100 centimeters per second. Five models are participating in the project. These models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins. Simulations are made including both homogeneous and heterogeneous ice nucleation mechanisms. A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. To isolate the treatment of the homogeneous freezing (of haze drops) nucleation process, the heterogeneous nucleation mechanism is disabled for a second parallel set of simulations. Qualitative agreement is found amongst the models for the homogeneous-nucleation-only simulations, e.g., the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, non-negligible quantitative differences are found. Systematic bias exists between results of a model based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each approach is constrained by critical freezing data from laboratory studies. This information is necessary, but not sufficient, to construct consistent formulae for the two approaches. Large haze particles may deviate considerably from equilibrium size in moderate to strong updrafts (20-100 centimeters per second) at -60 C when the commonly invoked equilibrium assumption is lifted. The resulting difference in particle-size-dependent solution concentration of haze particles may significantly affect the ice nucleation rate during the initial nucleation interval. The uptake rate for water vapor excess by ice crystals is another key component regulating the total number of nucleated ice crystals. This rate, the product of ice number concentration and ice crystal diffusional growth rate, partially controls the peak nucleation rate achieved in an air parcel and the duration of the active nucleation time period.
Low-energy nuclear spectroscopy in a microscopic multiphonon approach
NASA Astrophysics Data System (ADS)
Lo Iudice, N.; Ponomarev, V. Yu; Stoyanov, Ch; Sushkov, A. V.; Voronov, V. V.
2012-04-01
The low-lying spectra of heavy nuclei are investigated within the quasiparticle-phonon model. This microscopic approach goes beyond the quasiparticle random-phase approximation by treating a Hamiltonian of separable form in a microscopic multiphonon basis. It is therefore able to describe the anharmonic features of collective modes as well as the multiphonon states, whose experimental evidence is continuously growing. The method can be put in close correspondence with the proton-neutron interacting boson model. By associating the microscopic isoscalar and isovector quadrupole phonons with proton-neutron symmetric and mixed-symmetry quadrupole bosons, respectively, the microscopic states can be classified, just as in the algebraic model, according to their phonon content and their symmetry. In addition, these states disclose the nuclear properties which are to be ascribed to genuine shell effects, not included in the algebraic approach. Due to its flexibility, the method can be implemented numerically for systematic studies of spectroscopic properties throughout entire regions of vibrational nuclei. The spectra and multipole transition strengths so computed are in overall good agreement with the experimental data. By exploiting the correspondence of the method with the interacting boson model, it is possible to embed the microscopic states into this algebraic frame and, therefore, face the study of nuclei far from shell closures, not directly accessible to merely microscopic approaches. Here, it is shown how this task is accomplished through systematic investigations of magnetic dipole and, especially, electric dipole modes along paths moving from the vibrational to the transitional regions. The method is very well suited to the study of well-deformed nuclei. It provides reliable descriptions of low-lying magnetic as well as electric multipole modes of nuclei throughout the rare-earth and actinide regions. Attention is focused here on the low-lying 0+ states produced in large abundance in recent experiments. The analysis shows that the quasiparticle-phonon model accounts for the occurrence of so many 0+ levels and discloses their nature.
Design of a framework for modeling, integration and simulation of physiological models.
Erson, E Zeynep; Cavuşoğlu, M Cenk
2012-09-01
Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation through new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Solid waste forecasting using modified ANFIS modeling.
Younes, Mohammad K; Nopiah, Z M; Basri, N E Ahmad; Basri, H; Abushammala, Mohammed F M; K N A, Maulud
2015-10-01
Solid waste prediction is crucial for sustainable solid waste management. Accurate waste generation records are usually a challenge in developing countries, which complicates the modelling process. Solid waste generation is related to demographic, economic, and social factors. However, these factors are highly varied due to population and economic growth. The objective of this research is to determine the most influential demographic and economic factors that affect solid waste generation using a systematic approach, and then to develop a model to forecast solid waste generation using a modified Adaptive Neuro-Fuzzy Inference System (MANFIS). The model evaluation was performed using Root Mean Square Error (RMSE), Mean Absolute Error (MAE) and the coefficient of determination (R²). The results show that the best input variables are the population age groups 0-14, 15-64, and above 65 years, and the best model structure is 3 triangular fuzzy membership functions and 27 fuzzy rules. The model has been validated using testing data: the training RMSE, MAE and R² were 0.2678, 0.045 and 0.99, respectively, while for the testing phase RMSE = 3.986, MAE = 0.673 and R² = 0.98. To date, few attempts have been made to predict annual solid waste generation in developing countries. This paper presents modeling of annual solid waste generation using a modified ANFIS: a systematic approach to search for the most influential factors and then modify the ANFIS structure to simplify the model. The proposed method can be used to forecast waste generation in developing countries where accurate, reliable data are not always available. Moreover, annual solid waste prediction is essential for sustainable planning.
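A minimal sketch of two ingredients named in the abstract, triangular fuzzy membership functions and the RMSE/MAE/R² evaluation metrics; the membership breakpoints, inputs and forecast values are invented, and the actual ANFIS training and rule base are not reproduced.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership function rising from a to a peak at b and falling to c."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def rmse(y, yhat): return float(np.sqrt(np.mean((y - yhat) ** 2)))
def mae(y, yhat):  return float(np.mean(np.abs(y - yhat)))
def r2(y, yhat):   return float(1.0 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2))

# Fuzzify a normalized demographic input with three triangular sets (as in a 3-membership
# ANFIS input), then score a hypothetical forecast of annual waste generation.
x = np.linspace(0.0, 1.0, 5)
sets = {"low": (-0.5, 0.0, 0.5), "medium": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.5)}
memberships = {name: triangular(x, *abc) for name, abc in sets.items()}
y_obs  = np.array([10.2, 10.8, 11.5, 12.1])   # observed annual totals (arbitrary units)
y_pred = np.array([10.0, 11.0, 11.3, 12.4])   # hypothetical model forecast
print(rmse(y_obs, y_pred), mae(y_obs, y_pred), r2(y_obs, y_pred))
```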
NASA Astrophysics Data System (ADS)
Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.
2017-12-01
Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, and (ii) implement an online correction scheme (i.e., within the model) to correct the GFS, following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis increments represent the corrections that new observations make on, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6-hr, assuming that initial model errors grow linearly and, at first, ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low-dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal, diurnal and semidiurnal model biases in the GFS to reduce both systematic and random errors. As the error growth in the short term is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with the GFS, correcting temperature and specific humidity online, show a reduction in model bias in the 6-hr forecast. This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements, which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
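A minimal sketch of the bias-estimation step as described above: the time-mean analysis increment divided by the 6-hr assimilation window gives a bias correction that is then added as a forcing term to the model tendency; the array shapes and numbers are illustrative, not GFS data.

```python
import numpy as np

def bias_correction(analysis_increments, window_hours=6.0):
    """Time-mean analysis increment divided by the assimilation window, assuming
    linear error growth over the 6-hr cycle (as described in the abstract)."""
    return np.mean(analysis_increments, axis=0) / window_hours

def corrected_tendency(model_tendency, correction):
    """Online correction: the estimated bias correction is added as a forcing term."""
    return model_tendency + correction

# Toy 1-D temperature field: increments with a persistent 0.3 K per 6-hr correction.
rng = np.random.default_rng(0)
increments = 0.3 + 0.1 * rng.standard_normal((120, 50))     # 120 cycles x 50 grid points (K per 6 hr)
correction = bias_correction(increments)                     # K per hour
tendency = corrected_tendency(np.zeros(50), correction)
print(round(float(correction.mean()), 3), "K/h added to the model tendency")
```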
Habchi, Johnny; Chia, Sean; Limbocker, Ryan; Mannini, Benedetta; Ahn, Minkoo; Perni, Michele; Hansson, Oskar; Arosio, Paolo; Kumita, Janet R; Challa, Pavan Kumar; Cohen, Samuel I A; Linse, Sara; Dobson, Christopher M; Knowles, Tuomas P J; Vendruscolo, Michele
2017-01-10
The aggregation of the 42-residue form of the amyloid-β peptide (Aβ42) is a pivotal event in Alzheimer's disease (AD). The use of chemical kinetics has recently enabled highly accurate quantification of the effects of small molecules on specific microscopic steps in Aβ42 aggregation. Here, we exploit this approach to develop a rational drug discovery strategy against Aβ42 aggregation that uses as a read-out the changes in the nucleation and elongation rate constants caused by candidate small molecules. We thus identify a pool of compounds that target specific microscopic steps in Aβ42 aggregation. We then further test these small molecules in human cerebrospinal fluid and in a Caenorhabditis elegans model of AD. Our results show that this strategy represents a powerful approach to systematically identify small-molecule lead compounds, thus offering an appealing opportunity to reduce the attrition problem in drug discovery.
Systematic design for trait introgression projects.
Cameron, John N; Han, Ye; Wang, Lizhi; Beavis, William D
2017-10-01
Using an Operations Research approach, we demonstrate the design of optimal trait introgression projects with respect to competing objectives. We demonstrate an innovative approach for designing Trait Introgression (TI) projects based on optimization principles from Operations Research. If the designs of TI projects are based on clear and measurable objectives, they can be translated into mathematical models with decision variables and constraints, which can then be summarized as Pareto optimality plots associated with any arbitrary selection strategy. The Pareto plots can be used to make rational decisions concerning the trade-offs between maximizing the probability of success while minimizing costs and time. The systematic rigor associated with a cost, time and probability of success (CTP) framework is well suited to designing TI projects that require dynamic decision making. The CTP framework also revealed that previously identified 'best' strategies can be improved to be at least twice as effective without increasing time or expenses.
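A minimal sketch of the Pareto-optimality screening implied by the CTP framework, assuming each candidate TI strategy can be summarized by its cost, time and probability of success; the strategies and numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Strategy:
    name: str
    cost: float        # e.g., total genotyping + nursery cost (arbitrary units)
    time: float        # number of generations
    p_success: float   # probability of recovering the target genotype

def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly better on at least one."""
    no_worse = a.cost <= b.cost and a.time <= b.time and a.p_success >= b.p_success
    better = a.cost < b.cost or a.time < b.time or a.p_success > b.p_success
    return no_worse and better

def pareto_front(strategies):
    """Keep only strategies not dominated by any other (the Pareto optimality plot's frontier)."""
    return [s for s in strategies if not any(dominates(o, s) for o in strategies if o is not s)]

candidates = [Strategy("backcross-3", 100, 3, 0.60),
              Strategy("backcross-4", 130, 4, 0.85),
              Strategy("marker-assisted", 160, 3, 0.90),
              Strategy("wasteful", 200, 5, 0.55)]
print([s.name for s in pareto_front(candidates)])   # "wasteful" is dominated and drops out
```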
Chattoraj, Sayantan; Bhugra, Chandan; Li, Zheng Jane; Sun, Changquan Calvin
2014-12-01
The nonisothermal crystallization kinetics of amorphous materials is routinely analyzed by statistically fitting the crystallization data to kinetic models. In this work, we systematically evaluate how the model-dependent crystallization kinetics is impacted by variations in the heating rate and the selection of the kinetic model, two key factors that can lead to significant differences in the crystallization activation energy (Ea) of an amorphous material. Using amorphous felodipine, we show that the Ea decreases with an increase in the heating rate, irrespective of the kinetic model evaluated in this work. The model that best describes the crystallization phenomenon cannot be identified readily through the statistical fitting approach because several kinetic models yield comparable R². Here, we propose an alternative paired model-fitting model-free (PMFMF) approach for identifying the most suitable kinetic model, where Ea obtained from model-dependent kinetics is compared with that obtained from model-free kinetics. The most suitable kinetic model is identified as the one that yields Ea values comparable with the model-free kinetics. Through this PMFMF approach, nucleation and growth is identified as the main mechanism that controls the crystallization kinetics of felodipine. Using this PMFMF approach, we further demonstrate that the crystallization mechanism from the amorphous phase varies with heating rate. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
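For illustration of the model-free side of such a comparison, the sketch below estimates Ea from peak crystallization temperatures at several heating rates using the Kissinger relation, one common model-free route; this is an assumption chosen for the example, not necessarily the model-free method used in the paper, and the peak temperatures are invented.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def kissinger_ea(heating_rates, peak_temps_K):
    """Model-free activation energy from the Kissinger relation:
    ln(beta / Tp^2) = const - Ea / (R * Tp), so the slope of ln(beta/Tp^2) vs 1/Tp gives -Ea/R."""
    y = np.log(np.asarray(heating_rates, dtype=float) / np.asarray(peak_temps_K, dtype=float) ** 2)
    x = 1.0 / np.asarray(peak_temps_K, dtype=float)
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R  # J/mol

# Hypothetical DSC crystallization peaks at several heating rates (K/min) and peak temperatures (K)
beta = [2, 5, 10, 20]
Tp = [378.0, 384.5, 389.0, 394.0]
print(round(kissinger_ea(beta, Tp) / 1000, 1), "kJ/mol")
```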
Bodner-Adler, Barbara; Hanzal, Engelbert; Pablik, Eleonore; Koelbl, Heinz; Bodner, Klaus
2017-01-01
Background Vesicovaginal fistulas (VVF) are the most commonly acquired fistulas of the urinary tract, but we lack a standardized algorithm for their management. Surgery is the most commonly preferred approach to treat women with primary VVF following benign gynaecologic surgery. Objective To carry out a systematic review and meta-analysis on the effectiveness of operative techniques or conservative treatment for patients with postsurgical VVF. Our secondary objective was to define the surgical time and determine the types of study designs. Methods PubMed, Old Medline, Embase and the Cochrane Central Register of Controlled Trials were used as data sources. This systematic review was modelled on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement, including a registration number (CRD42012002097). Results We reviewed 282 full text articles to identify 124 studies for inclusion. In all, 1379/1430 (96.4%) patients were treated surgically. Overall, the transvaginal approach was performed in the majority of patients (39%), followed by a transabdominal/transvesical route (36%), a laparoscopic/robotic approach (15%) and a combined transabdominal-transvaginal approach in 3% of cases. The success rate was 92.86% (95% CI: 79.54-99.89) for conservative treatment, 97.98% (95% CI: 96.13-99.29) in surgical cases and 91.63% (95% CI: 87.68-97.03) in patients with prolonged catheter drainage followed by surgery. 79/124 studies (63.7%) provided information on the length of follow-up, but showed a poor reporting standard regarding prognosis. Complications were studied only selectively. Due to the inconsistency of these data, it was impossible to analyse them collectively. Conclusions Although the literature is imprecise and inconsistent, existing studies indicate that operation, mainly through a transvaginal approach, is the most commonly preferred treatment strategy in females with postsurgical VVF. Our data showed no clear favorite regarding disease management or surgical approach, and current evidence on the surgical management of VVF does not allow any accurate estimation of success and complication rates. Standardisation of the terminology is required so that VVF can be managed with a proper surgical treatment algorithm based on characteristics of the fistula. PMID:28225769
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horstemeyer, Mark R.; Chaudhuri, Santanu
2015-09-30
A multiscale modeling Internal State Variable (ISV) constitutive model was developed that captures the fundamental structure-property relationships. The macroscale ISV model used lower length scale simulations (Butler-Volmer and electronic structure results) to inform the ISVs at the macroscale. The chemomechanical ISV model was calibrated and validated using experiments with magnesium (Mg) alloys investigated under corrosive environments, coupled with experimental electrochemical studies. Because the ISV chemomechanical model is physically based, it can be used for other material systems to predict corrosion behavior. As such, others can use the chemomechanical model for analyzing corrosion effects on their designs.
Teaching Reading for Students with Intellectual Disabilities: A Systematic Review
ERIC Educational Resources Information Center
Alnahdi, Ghaleb Hamad
2015-01-01
A systematic review of the literature related to instructional strategies to improve reading skills for students with intellectual disabilities was conducted. Studies reviewed fell within three categories: early reading approaches, comprehensive approaches, and one-method approaches. It was concluded that students with intellectual disabilities are…
Planck 2015 results. III. LFI systematic uncertainties
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Burigana, C.; Butler, R. C.; Calabrese, E.; Catalano, A.; Christensen, P. R.; Colombo, L. P. L.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Frailis, M.; Franceschet, C.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Knoche, J.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Nati, F.; Natoli, P.; Noviello, F.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Pearson, T. J.; Perdereau, O.; Pettorino, V.; Piacentini, F.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Stolyarov, V.; Stompor, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.
2016-09-01
We present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (I) simulations based on measured data and physical models of the known systematic effects; and (II) analysis of difference maps containing the same sky signal ("null-maps"). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10-20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.
Planck 2015 results: III. LFI systematic uncertainties
Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; ...
2016-09-20
In this paper, we present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal ("null-maps"). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10–20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70GHz EE spectrum using the 30 and 353GHz channels as foreground templates. At 30GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Finally, some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.
Model for a patient-centered comparative effectiveness research center.
Costlow, Monica R; Landsittel, Douglas P; James, A Everette; Kahn, Jeremy M; Morton, Sally C
2015-04-01
This special report describes the systematic approach the University of Pittsburgh and the University of Pittsburgh Medical Center (UPMC) undertook in creating an infrastructure for comparative effectiveness and patient-centered outcomes research resources. We specifically highlight the administrative structure, communication and training opportunities, stakeholder engagement resources, and support services offered. © 2015 Wiley Periodicals, Inc.
Summary of Research on the Effectiveness of Math Professional Development Approaches. REL 2014-010
ERIC Educational Resources Information Center
Gersten, Russell; Taylor, Mary Jo; Keys, Tran D.; Rolfhus, Eric; Newman-Gonchar, Rebecca
2014-01-01
This study used a systematic process modeled after the What Works Clearinghouse (WWC) study review process to answer the question: What does the causal research say are effective math professional development interventions for K-12 teachers aimed at improving student achievement? The study identified and screened 910 research studies in a…
ERIC Educational Resources Information Center
WHITMAN, LAURIS B.; AND OTHERS
AS PART OF AN OVERALL EVALUATION OF ITS EDUCATIONAL CURRICULUM, THE UNITED PRESBYTERIAN CHURCH, IN 1964, COMMISSIONED THE DEPARTMENT OF RESEARCH OF THE NATIONAL COUNCIL OF CHURCHES TO PROVIDE SYSTEMATIC AND COHERENT PROFILES OF COMMUNICANTS, YOUTH, CHURCH SCHOOL TEACHERS, AND MINISTERS. THIS RESEARCH WAS BASED ON THE INTERDISCIPLINARY APPROACH TO…
The Applied Behavior Analytic Heritage of PBS: A Dynamic Model of Action-Oriented Research
ERIC Educational Resources Information Center
Dunlap, Glen; Horner, Robert H., Ed.
2006-01-01
In the past two decades, positive behavior support (PBS) has emerged from applied behavior analysis (ABA) as a newly fashioned approach to problems of behavioral adaptation. ABA was established in the 1960s as a science in which learning principles are systematically applied to produce socially important changes in behavior, whereas PBS was…
ERIC Educational Resources Information Center
Klebansky, Anna; Fraser, Sharon P.
2013-01-01
This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…
The Value of Experiments in Education
ERIC Educational Resources Information Center
Whitehurst, Grover J.
2012-01-01
One of the major story lines of the growth of civilization is the advance of the experiment. From the food we eat to the diseases we conquer to our understanding of how we think and behave, we have profited enormously from an approach that marries our models of the world with tests of their validity through systematic variation to determine cause…
Once upon a Time. . . at the Tenth SOBRAMFA International and Academic Meeting--S. Paulo--Brazil
ERIC Educational Resources Information Center
De Benedetto, Maria Auxiliadora C.; Blasco, Pablo G.; de Castro, Ariane G.; de Carvalho, Elsi
2006-01-01
In Brazil, medical practice and the predominant medical education model are based on specialization. Methodologies such as patient-centered medicine and narrative medicine are either unknown or not applied in a systematic way. In order to draw students' and doctors' attention to these approaches during the TENTH SOBRAMFA INTERNATIONAL AND ACADEMIC…
An Analysis of the Learning Center in Community Colleges.
ERIC Educational Resources Information Center
Peterson, Gary T.
A study was made to relate: (1) the concepts of a library of materials and (2) newer concepts such as instructional development activities which initiate a more scientific, systematic approach to the improvement and individualization of learning experiences. The major output of the study was to be a definitive model so that the fields of library…
Qualitative reasoning for biological network inference from systematic perturbation experiments.
Badaloni, Silvana; Di Camillo, Barbara; Sambo, Francesco
2012-01-01
The systematic perturbation of the components of a biological system has proven to be among the most informative experimental setups for the identification of causal relations between the components. In this paper, we present Systematic Perturbation-Qualitative Reasoning (SPQR), a novel Qualitative Reasoning approach to automate the interpretation of the results of systematic perturbation experiments. Our method is based on a qualitative abstraction of the experimental data: for each perturbation experiment, measured values of the observed variables are modeled as lower, equal or higher than the measurements in the wild type condition, when no perturbation is applied. The algorithm exploits a set of IF-THEN rules to infer causal relations between the variables, analyzing the patterns of propagation of the perturbation signals through the biological network, and is specifically designed to minimize the rate of false positives among the inferred relations. Tested on both simulated and real perturbation data, SPQR indeed exhibits a significantly higher precision than the state of the art.
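A minimal sketch of the qualitative abstraction and a single illustrative IF-THEN rule; the 5% tolerance, the sign convention and the toy perturbation data are assumptions, and SPQR's actual rule set and propagation analysis are considerably richer.

```python
def qualitative(value, wild_type, tol=0.05):
    """Abstract a measurement to 'lower', 'equal' or 'higher' relative to the wild type."""
    if value < wild_type * (1 - tol):
        return "lower"
    if value > wild_type * (1 + tol):
        return "higher"
    return "equal"

def infer_edges(perturbations, wild_type):
    """Illustrative IF-THEN rule: if perturbing (knocking out) gene g changes the qualitative
    reading of gene t, infer a signed causal edge g -> t."""
    edges = []
    for knocked_out, readings in perturbations.items():
        for target, value in readings.items():
            if target == knocked_out:
                continue
            q = qualitative(value, wild_type[target])
            if q != "equal":
                sign = "-" if q == "higher" else "+"   # target rises after knockout -> repression
                edges.append((knocked_out, target, sign))
    return edges

wild_type = {"A": 1.0, "B": 1.0, "C": 1.0}
perturbations = {"A": {"B": 0.4, "C": 1.02}, "B": {"A": 0.98, "C": 1.6}}
print(infer_edges(perturbations, wild_type))   # [('A', 'B', '+'), ('B', 'C', '-')]
```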
NASA Astrophysics Data System (ADS)
Butler, Christopher J.; Tseng, Yi; Hsing, Cheng-Rong; Wu, Yu-Mi; Sankar, Raman; Wang, Mei-Fang; Wei, Ching-Ming; Chou, Fang-Cheng; Lin, Minn-Tsong
2017-02-01
The Dirac semimetal phase found in Cd3As2 is protected by a C4 rotational symmetry derived from a corkscrew arrangement of systematic Cd vacancies in its complicated crystal structure. It is therefore surprising that no microscopic observation, direct or indirect, of these systematic vacancies has so far been described. To this end, we revisit the cleaved (112) surface of Cd3As2 using a combined approach of scanning tunneling microscopy and ab initio calculations. We determine the exact position of the (112) plane at which Cd3As2 naturally cleaves, and describe in detail a structural periodicity found at the reconstructed surface, consistent with that expected to arise from the systematic Cd vacancies. This reconciles the current state of microscopic surface observations with those of crystallographic and theoretical models, and demonstrates that this vacancy superstructure, central to the preservation of the Dirac semimetal phase, survives the cleavage process and retains order at the surface.
What do we know about managing Dupuytren's disease cost-effectively?
Dritsaki, Melina; Rivero-Arias, Oliver; Gray, Alastair; Ball, Catherine; Nanchahal, Jagdeep
2018-01-25
Dupuytren's disease (DD) is a common and progressive fibroproliferative disorder of the palmar and digital fascia of the hand. Various treatments have been recommended for advanced disease, or to retard progression of early disease and to prevent deterioration of the finger contracture and quality of life. Recent studies have tried to evaluate the clinical and cost-effectiveness of therapies for DD, but there is currently no systematic assessment and appraisal of the economic evaluations. A systematic literature review was conducted, following PRISMA guidelines, to identify studies reporting economic evaluations of interventions for managing DD. Databases searched included Ovid MEDLINE/Embase (without time restriction), the National Health Service (NHS) Economic Evaluation Database (all years) and the National Institute for Health Research (NIHR) Journals Library Health Technology Assessment (HTA). Cost-effectiveness analyses of treating DD were identified and their quality was assessed using the CHEERS assessment tool for quality of reporting and the Phillips checklist for model evaluation. A total of 103 studies were screened, of which 4 met the study inclusion criteria. Two studies were from the US, one from the UK and one from Canada. They all assessed the same interventions for advanced DD, namely collagenase Clostridium histolyticum injection, percutaneous needle fasciotomy and partial fasciectomy. All studies conducted a cost-utility analysis; two implemented a decision-analytic model and two a Markov model approach. None of them were based on a single randomised controlled trial, but rather synthesised evidence from various sources. Studies varied in their time horizon, sources of utility estimates and perspective of analysis. The overall quality of study reporting was good based on the CHEERS checklist. The quality of the model reporting in terms of model structure, data synthesis and model consistency varied across the included studies. Cost-effectiveness analyses for patients with advanced DD are limited and have applied different approaches with respect to modelling. Future studies should improve the way they are conducted and report their findings according to established guidance for conducting economic modelling of health care technologies. The protocol was registered (CRD42016032989; date 08/01/2016) with the PROSPERO international prospective register of systematic reviews.
2014-09-30
...through downscaling future projection simulations. APPROACH: To address the scientific objectives, we plan to develop, implement, and validate a... ITD and FSD at the same time. The development of MIZMAS will be based on systematic model parameterization, calibration, and validation, and data
TRILEX and G W +EDMFT approach to d -wave superconductivity in the Hubbard model
NASA Astrophysics Data System (ADS)
Vučičević, J.; Ayral, T.; Parcollet, O.
2017-09-01
We generalize the recently introduced TRILEX approach (TRiply irreducible local EXpansion) to superconducting phases. The method treats simultaneously Mott and spin-fluctuation physics using an Eliashberg theory supplemented by local vertex corrections determined by a self-consistent quantum impurity model. We show that, in the two-dimensional Hubbard model, at strong coupling, TRILEX yields a d -wave superconducting dome as a function of doping. Contrary to the standard cluster dynamical mean field theory (DMFT) approaches, TRILEX can capture d -wave pairing using only a single-site effective impurity model. We also systematically explore the dependence of the superconducting temperature on the bare dispersion at weak coupling, which shows a clear link between strong antiferromagnetic (AF) correlations and the onset of superconductivity. We identify a combination of hopping amplitudes particularly favorable to superconductivity at intermediate doping. Finally, we study within G W +EDMFT the low-temperature d -wave superconducting phase at strong coupling in a region of parameter space with reduced AF fluctuations.
Unraveling the Mechanisms of Manual Therapy: Modeling an Approach.
Bialosky, Joel E; Beneciuk, Jason M; Bishop, Mark D; Coronado, Rogelio A; Penza, Charles W; Simon, Corey B; George, Steven Z
2018-01-01
Synopsis Manual therapy interventions are popular among individual health care providers and their patients; however, systematic reviews do not strongly support their effectiveness. Small treatment effect sizes of manual therapy interventions may result from a "one-size-fits-all" approach to treatment. Mechanistic-based treatment approaches to manual therapy offer an intriguing alternative for identifying patients likely to respond to manual therapy. However, the current lack of knowledge of the mechanisms through which manual therapy interventions inhibit pain limits such an approach. The nature of manual therapy interventions further confounds such an approach, as the related mechanisms are likely a complex interaction of factors related to the patient, the provider, and the environment in which the intervention occurs. Therefore, a model to guide both study design and the interpretation of findings is necessary. We have previously proposed a model suggesting that the mechanical force from a manual therapy intervention results in systemic neurophysiological responses leading to pain inhibition. In this clinical commentary, we provide a narrative appraisal of the model and recommendations to advance the study of manual therapy mechanisms. J Orthop Sports Phys Ther 2018;48(1):8-18. doi:10.2519/jospt.2018.7476.
The role of local stress perturbation on the simultaneous opening of orthogonal fractures
NASA Astrophysics Data System (ADS)
Boersma, Quinten; Hardebol, Nico; Barnhoorn, Auke; Bertotti, Giovanni; Drury, Martyn
2016-04-01
Orthogonal fracture networks (ladder-like networks) are arrangements that are commonly observed in outcrop studies. They form a particularly dense and well connected network which can play an important role in the effective permeability of tight hydrocarbon or geothermal reservoirs. One issue is the extent to which both the long systematic fractures and the smaller cross fractures can be simultaneously critically stressed under a given stress condition. Fractures in an orthogonal network form by opening mode-I displacements in which the main component is separation of the two fracture walls. This opening is driven by effective tensile stresses, with the smallest principal stress acting perpendicular to the fracture wall, in accordance with linear elastic fracture mechanics. What has been well recognized in previous field and modelling studies is that both the systematic fractures and the perpendicular cross fractures require the minimum principal stress to act perpendicular to the fracture wall. Thus, these networks either require a rotation of the regional stress field or local perturbations in the stress field. Using mechanical finite element modelling software, a geological case of layer-perpendicular systematic mode-I opening fractures is generated. New in our study is that we not only address tensile stresses at the boundary, but also address models using pore fluid pressure. The local stress in between systematic fractures is then assessed in order to derive the probability and orientation of micro-crack propagation using the theory of subcritical crack growth and Griffith's theory. Under effective tensile conditions, the results indicate that in between critically spaced systematic fractures, local effective tensile stresses flip. Therefore the orientation of the least principal stress will rotate 90°, and hence an orthogonal fracture is more likely to form. Our new findings for models with pore fluid pressures instead of boundary tension show that the magnitude of effective tension in between systematic fractures is reduced, but the stress flip still occurs. However, putting effective tension on the boundaries will overestimate the reduction of the local effective tensile stress perpendicular to the larger systematic fractures and therefore the magnitude of the stress flip. In conclusion, both model approaches indicate that orthogonal fractures can form while experiencing a single regional stress regime. This also means that under these specific loading and locally perturbed stress conditions both sets of orthogonal fractures stay open and can provide a pathway for fluid circulation.
AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku
2014-05-27
The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets, which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multi-nominal logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
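A minimal sketch of the pre-processing step (dropping variables absent in more than 50% of records, then complete-case analysis) followed by a soft-voting ensemble standing in for the study's voting classifier; the column names and data are hypothetical, and scikit-learn's LogisticRegression and GaussianNB are substitutes for the multi-nominal logistic regression and voting feature intervals classifiers actually used in the study.

```python
import pandas as pd
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

def drop_sparse_columns(df, max_missing=0.5):
    """Remove variables that are absent in more than max_missing of the records."""
    keep = df.columns[df.isna().mean() <= max_missing]
    return df[keep]

# Hypothetical hospitalization records; 'readmit_30d' is the outcome of interest.
df = pd.DataFrame({"age": [72, 81, 65, 90],
                   "anemia": [1, 0, 1, 1],
                   "rare_lab": [None, None, None, 2.1],   # >50% missing -> dropped
                   "readmit_30d": [1, 0, 1, 1]})
df = drop_sparse_columns(df).dropna()                     # complete-case analysis
X, y = df.drop(columns="readmit_30d"), df["readmit_30d"]

# Soft voting averages the predicted probabilities of the member classifiers.
model = VotingClassifier([("logit", LogisticRegression(max_iter=1000)), ("nb", GaussianNB())],
                         voting="soft").fit(X, y)
print(model.predict_proba(X)[:, 1])   # estimated readmission risk for each record
```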
NASA Astrophysics Data System (ADS)
Nahar, Jannatun; Johnson, Fiona; Sharma, Ashish
2017-07-01
Use of General Circulation Model (GCM) precipitation and evapotranspiration sequences for hydrologic modelling can result in unrealistic simulations due to the coarse scales at which GCMs operate and the systematic biases they contain. The Bias Correction Spatial Disaggregation (BCSD) method is a popular statistical downscaling and bias correction method developed to address this issue. The advantage of BCSD is its ability to reduce biases in the distribution of precipitation totals at the GCM scale and then introduce more realistic variability at finer scales than simpler spatial interpolation schemes. Although BCSD corrects biases at the GCM scale before disaggregation, at finer spatial scales biases are re-introduced by the assumptions made in the spatial disaggregation process. Our study focuses on this limitation of BCSD and proposes a rank-based approach that aims to reduce the spatial disaggregation bias, especially for both low and high precipitation extremes. BCSD requires the specification of a multiplicative bias correction anomaly field that represents the ratio of the fine-scale precipitation to the disaggregated precipitation. It is shown that there is significant temporal variation in the anomalies, which is masked when a mean anomaly field is used. This can be improved by modelling the anomalies in rank space. Results from the application of the rank-BCSD procedure improve the match between the distributions of observed and downscaled precipitation at the fine scale compared to the original BCSD approach. Further improvements in the distribution are identified when a scaling correction to preserve mass in the disaggregation process is implemented. An assessment of the approach using a single GCM over Australia shows clear advantages, especially in the simulation of particularly low and high downscaled precipitation amounts.
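A minimal sketch of the two BCSD ingredients discussed above, empirical quantile mapping for the GCM-scale bias correction and the multiplicative spatial-disaggregation anomaly; the gamma-distributed series are synthetic, and the rank-space modelling of the anomalies proposed in the study is only indicated in a comment.

```python
import numpy as np

def quantile_map(gcm_series, obs_series, values):
    """Empirical quantile mapping: replace a GCM value by the observed value at the same
    empirical quantile (a simplified version of the BCSD bias-correction step)."""
    gcm_sorted = np.sort(gcm_series)
    q = np.searchsorted(gcm_sorted, values, side="right") / gcm_sorted.size
    return np.quantile(np.sort(obs_series), np.clip(q, 0.0, 1.0))

def multiplicative_anomaly(fine_precip, coarse_precip, eps=1e-6):
    """Spatial-disaggregation anomaly: ratio of fine-scale to disaggregated precipitation;
    the rank-based variant would model the temporal variation of these ratios in rank space."""
    return fine_precip / np.maximum(coarse_precip, eps)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, 1000)      # "observed" precipitation aggregated to the GCM grid
gcm = rng.gamma(2.0, 2.0, 1000)      # biased GCM precipitation
print(quantile_map(gcm, obs, gcm[:5]))   # corrected values drawn from the observed distribution
```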
ERIC Educational Resources Information Center
Baeza-Baeza, Juan J.; Garcia-Alvarez-Coque, M. Celia
2012-01-01
A general systematic approach including ionic strength effects is proposed for the numerical calculation of concentrations of chemical species in multiequilibrium problems. This approach extends the versatility of the approach presented in a previous article and is applied using the Solver option of the Excel spreadsheet to solve real problems…
Identification of metabolic pathways using pathfinding approaches: a systematic review.
Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang
2017-03-01
Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular, mathematical pathfinding approaches. This article can facilitate the understanding of computational analysis of metabolic pathways in genomics. Moreover, stoichiometric and pathfinding approaches in metabolic pathway analysis are discussed. Three major types of studies are elaborated: stoichiometric identification models, pathway-based graph analysis and pathfinding approaches in cellular metabolism. Furthermore, evaluation of the outcomes of the pathways with mathematical benchmarking metrics is provided. This review would lead to better comprehension of metabolism behaviors in living cells, in terms of computed pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
From research to evidence-informed decision making: a systematic approach
Poot, Charlotte C; van der Kleij, Rianne M; Brakema, Evelyn A; Vermond, Debbie; Williams, Siân; Cragg, Liza; van den Broek, Jos M; Chavannes, Niels H
2018-01-01
Background: Knowledge creation forms an integral part of the knowledge-to-action framework aimed at bridging the gap between research and evidence-informed decision making. Although principles of science communication, data visualisation and user-centred design largely impact the effectiveness of communication, their role in knowledge creation is still limited. Hence, this article aims to provide researchers a systematic approach on how knowledge creation can be put into practice. Methods: A systematic two-phased approach towards knowledge creation was formulated and executed. First, during a preparation phase the purpose and audience of the knowledge were defined. Subsequently, a developmental phase facilitated how the content is 'said' (language) and communicated (channel). This developmental phase proceeded via two pathways: a translational cycle and design cycle, during which core translational and design components were incorporated. The entire approach was demonstrated by a case study. Results: The case study demonstrated how the phases in this systematic approach can be operationalised. It furthermore illustrated how created knowledge can be delivered. Conclusion: The proposed approach offers researchers a systematic, practical and easy-to-implement tool to facilitate effective knowledge creation towards decision-makers in healthcare. Through the integration of core components of knowledge creation evidence-informed decision making will ultimately be optimized. PMID:29538728
Advantages and Disadvantages of Health Care Accreditation Models.
Tabrizi, Jafar S; Gharibi, Farid; Wilson, Andrew J
2011-01-01
This systematic review seeks to define the general advantages and disadvantages of accreditation programs to assist in choosing the most appropriate approach. Systematic search of SID, Ovid Medline & PubMed databases was conducted by the keywords of accreditation, hospital, medical practice, clinic, accreditation models, health care and Persian meanings. From 2379 initial articles, 83 articles met the full inclusion criteria. From initial analysis, 23 attributes were identified which appeared to define advantages and disadvantages of different accreditation approaches and the available systems were compared on these. Six systems were identified in the international literature including the JCAHO from USA, the Canadian program of CCHSA, and the accreditation programs of UK, Australia, New Zealand and France. The main distinguishing attributes among them were: quality improvement, patient and staff safety, improving health services integration, public's confidence, effectiveness and efficiency of health services, innovation, influence global standards, information management, breadth of activity, history, effective relationship with stakeholders, agreement with AGIL attributes and independence from government. Based on 23 attributes of comprehensive accreditation systems we have defined from a systematic review, the JCAHO accreditation program of USA and then CCHSA of Canada offered the most comprehensive systems with the least disadvantages. Other programs such as the ACHS of Australia, ANAES of France, QHNZ of New Zealand and UK accreditation programs were fairly comparable according to these criteria. However the decision for any country or health system should be based on an assessment weighing up their specific objectives and needs.
Gravitational decoupled anisotropies in compact stars
NASA Astrophysics Data System (ADS)
Gabbanelli, Luciano; Rincón, Ángel; Rubio, Carlos
2018-05-01
Simple generic extensions of isotropic Durgapal-Fuloria stars to the anisotropic domain are presented. These anisotropic solutions are obtained by guided minimal deformations over the isotropic system. When the anisotropic sector interacts in a purely gravitational manner, the conditions to decouple both sectors by means of the minimal geometric deformation approach are satisfied. Hence the anisotropic field equations are isolated, resulting in a more tractable set. The simplicity of the equations allows one to manipulate the anisotropies, which can be implemented in a systematic way to obtain different realistic models for anisotropic configurations. Later on, observational effects of such anisotropies when measuring the surface redshift are discussed. To conclude, the consistency of the application of the method over the obtained anisotropic configurations is shown. In this manner, different anisotropic sectors can be isolated from each other and modeled in a simple and systematic way.
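For orientation, in the usual minimal geometric deformation (gravitational decoupling) notation the radial metric function and the total source split as

e^{-\lambda(r)} = \mu(r) + \alpha\, f^{*}(r), \qquad \tilde{T}_{\mu\nu} = T_{\mu\nu} + \alpha\, \theta_{\mu\nu},

so that the Einstein equations separate into the standard set for the isotropic seed T_{\mu\nu} and a quasi-Einstein set for the anisotropic sector \theta_{\mu\nu}. This is the generic textbook form of the method and not necessarily the exact conventions used in this paper.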
Lewin, Simon; Hendry, Maggie; Chandler, Jackie; Oxman, Andrew D; Michie, Susan; Shepperd, Sasha; Reeves, Barnaby C; Tugwell, Peter; Hannes, Karin; Rehfuess, Eva A; Welch, Vivien; Mckenzie, Joanne E; Burford, Belinda; Petkovic, Jennifer; Anderson, Laurie M; Harris, Janet; Noyes, Jane
2017-04-26
Health interventions fall along a spectrum from simple to more complex. There is wide interest in methods for reviewing 'complex interventions', but few transparent approaches for assessing intervention complexity in systematic reviews. Such assessments may assist review authors in, for example, systematically describing interventions and developing logic models. This paper describes the development and application of the intervention Complexity Assessment Tool for Systematic Reviews (iCAT_SR), a new tool to assess and categorise levels of intervention complexity in systematic reviews. We developed the iCAT_SR by adapting and extending an existing complexity assessment tool for randomized trials. We undertook this adaptation using a consensus approach in which possible complexity dimensions were circulated for feedback to a panel of methodologists with expertise in complex interventions and systematic reviews. Based on these inputs, we developed a draft version of the tool. We then invited a second round of feedback from the panel and a wider group of systematic reviewers. This informed further refinement of the tool. The tool comprises ten dimensions: (1) the number of active components in the intervention; (2) the number of behaviours of recipients to which the intervention is directed; (3) the range and number of organizational levels targeted by the intervention; (4) the degree of tailoring intended or flexibility permitted across sites or individuals in applying or implementing the intervention; (5) the level of skill required by those delivering the intervention; (6) the level of skill required by those receiving the intervention; (7) the degree of interaction between intervention components; (8) the degree to which the effects of the intervention are context dependent; (9) the degree to which the effects of the interventions are changed by recipient or provider factors; (10) and the nature of the causal pathway between intervention and outcome. Dimensions 1-6 are considered 'core' dimensions. Dimensions 7-10 are optional and may not be useful for all interventions. The iCAT_SR tool facilitates more in-depth, systematic assessment of the complexity of interventions in systematic reviews and can assist in undertaking reviews and interpreting review findings. Further testing of the tool is now needed.
Whitlock, Evelyn P; Eder, Michelle; Thompson, Jamie H; Jonas, Daniel E; Evans, Corinne V; Guirguis-Blake, Janelle M; Lin, Jennifer S
2017-03-02
Guideline developers and other users of systematic reviews need information about whether a medical or preventive intervention is likely to benefit or harm some patients more (or less) than the average in order to make clinical practice recommendations tailored to these populations. However, guidance is lacking on how to include patient subpopulation considerations into the systematic reviews upon which guidelines are often based. In this article, we describe methods developed to consistently consider the evidence for relevant subpopulations in systematic reviews conducted to support primary care clinical preventive service recommendations made by the U.S. Preventive Services Task Force (USPSTF). Our approach is grounded in our experience conducting systematic reviews for the USPSTF and informed by a review of existing guidance on subgroup analysis and subpopulation issues. We developed and refined our approach based on feedback from the Subpopulation Workgroup of the USPSTF and pilot testing on reviews being conducted for the USPSTF. This paper provides processes and tools for incorporating evidence-based identification of important sources of potential heterogeneity of intervention effects into all phases of systematic reviews. Key components of our proposed approach include targeted literature searches and key informant interviews to identify the most important subpopulations a priori during topic scoping, a framework for assessing the credibility of subgroup analyses reported in studies, and structured investigation of sources of heterogeneity of intervention effects. Further testing and evaluation are necessary to refine this proposed approach and demonstrate its utility to the producers and users of systematic reviews beyond the context of the USPSTF. Gaps in the evidence on important subpopulations identified by routinely applying this process in systematic reviews will also inform future research needs.
Impact of compressibility on heat transport characteristics of large terrestrial planets
NASA Astrophysics Data System (ADS)
Čížková, Hana; van den Berg, Arie; Jacobs, Michel
2017-07-01
We present heat transport characteristics for mantle convection in large terrestrial exoplanets (M ⩽ 8M⊕). Our thermal convection model is based on a truncated anelastic liquid approximation (TALA) for compressible fluids and takes into account a self-consistent thermodynamic description of material properties derived from mineral physics based on a multi-Einstein vibrational approach. We compare heat transport characteristics in compressible models with those obtained with incompressible models based on the classical and extended Boussinesq approximations (BA and EBA, respectively). Our scaling analysis shows that heat flux scales with effective dissipation number as Nu ∼ Di_eff^-0.71 and with Rayleigh number as Nu ∼ Ra_eff^0.27. The surface heat flux of the BA models strongly overestimates the values from the corresponding compressible models, whereas the EBA models systematically underestimate the heat flux by ∼10%-15% with respect to a corresponding compressible case. Compressible models are also systematically warmer than the EBA models. Compressibility effects are therefore important for mantle dynamic processes, especially for large rocky exoplanets and consequently also for the formation of planetary atmospheres, through outgassing, and the existence of a magnetic field, through thermal coupling of mantle and core dynamic systems.
Primary care access improvement: an empowerment-interaction model.
Ledlow, G R; Bradshaw, D M; Shockley, C
2000-05-01
Improving community primary care access is a difficult and dynamic undertaking. Realizing a need to improve appointment availability, a systematic approach based on measurement, empowerment, and interaction was developed. The model fostered exchange of information and problem solving between interdependent staff sections within a managed care system. Measuring appointments demanded but not available proved to be a credible customer-focused approach to benchmark against set goals. Changing the organizational culture to become more sensitive to changing beneficiary needs was a paramount consideration. Dependent-group t tests were performed to compare the pretreatment and posttreatment effect. The empowerment-interaction model significantly improved the availability of routine and wellness-type appointments. The availability of urgent appointments improved but not significantly; a better prospective model needs to be developed. In aggregate, appointments demanded but not available (empowerment-interaction model) were more than 10% before the treatment and less than 3% with the treatment.
Leerlooijer, Joanne N; Ruiter, Robert A C; Reinders, Jo; Darwisyah, Wati; Kok, Gerjo; Bartholomew, L Kay
2011-06-01
Evidence-based health promotion programmes, including HIV/AIDS prevention and sexuality education programmes, are often transferred to other cultures, priority groups and implementation settings. Challenges in this process include identifying and retaining the core elements that relate to the programme's effectiveness while making changes that enhance acceptance in the new context and for the new priority group. This paper describes the use of a systematic approach to programme adaptation using a case study as an example. Intervention Mapping, a protocol for the development of evidence-based behaviour change interventions, was used to adapt the comprehensive school-based sexuality education programme 'The World Starts With Me'. The programme was developed for a priority population in Uganda and adapted to a programme for Indonesian secondary school students. The approach helped to systematically address the complexity and challenges of programme adaptation and to find a balance between preservation of essential programme elements (i.e. logic models) that may be crucial to the programme's effectiveness, including key objectives and theoretical behaviour change methods, and the adaptation of the programme to be acceptable to the new priority group and the programme implementers.
Reuse at the Software Productivity Consortium
NASA Technical Reports Server (NTRS)
Weiss, David M.
1989-01-01
The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.
Hinton-Bayre, Anton D
2011-02-01
There is an ongoing debate over the preferred method(s) for determining the reliable change (RC) in individual scores over time. In the present paper, specificity comparisons of several classic and contemporary RC models were made using a real data set. This included a more detailed review of a new RC model recently proposed in this journal, that used the within-subjects standard deviation (WSD) as the error term. It was suggested that the RC(WSD) was more sensitive to change and theoretically superior. The current paper demonstrated that even in the presence of mean practice effects, false-positive rates were comparable across models when reliability was good and initial and retest variances were equivalent. However, when variances differed, discrepancies in classification across models became evident. Notably, the RC using the WSD provided unacceptably high false-positive rates in this setting. It was considered that the WSD was never intended for measuring change in this manner. The WSD actually combines systematic and error variance. The systematic variance comes from measurable between-treatment differences, commonly referred to as practice effect. It was further demonstrated that removal of the systematic variance and appropriate modification of the residual error term for the purpose of testing individual change yielded an error term already published and criticized in the literature. A consensus on the RC approach is needed. To that end, further comparison of models under varied conditions is encouraged.
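As a concrete point of reference, the sketch below computes the classic Jacobson-Truax reliable change index and a practice-adjusted variant on made-up test-retest scores; these are standard textbook forms, offered for orientation, and not the specific WSD-based model examined in the paper.

import numpy as np

baseline = np.array([98, 102, 95, 110, 101, 97, 104, 99], dtype=float)
retest   = np.array([103, 108, 99, 116, 104, 101, 112, 103], dtype=float)

r_xx = 0.90                                    # assumed test-retest reliability
sem = baseline.std(ddof=1) * np.sqrt(1 - r_xx) # standard error of measurement
s_diff = np.sqrt(2 * sem ** 2)                 # standard error of the difference score

practice = (retest - baseline).mean()          # mean practice effect

rc_classic  = (retest - baseline) / s_diff               # Jacobson-Truax index
rc_adjusted = (retest - baseline - practice) / s_diff    # practice-adjusted variant

print(np.round(rc_classic, 2))
print(np.round(rc_adjusted, 2))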
Beauchamp, Alison; Batterham, Roy W; Dodson, Sarity; Astbury, Brad; Elsworth, Gerald R; McPhee, Crystal; Jacobson, Jeanine; Buchbinder, Rachelle; Osborne, Richard H
2017-03-03
The need for healthcare strengthening to enhance equity is critical, requiring systematic approaches that focus on those experiencing lesser access and outcomes. This project developed and tested the Ophelia (OPtimising HEalth LIteracy and Access) approach for co-design of interventions to improve health literacy and equity of access. Eight principles guided this development: Outcomes focused; Equity driven, Needs diagnosis, Co-design, Driven by local wisdom, Sustainable, Responsive and Systematically applied. We report the application of the Ophelia process where proof-of-concept was defined as successful application of the principles. Nine sites were briefed on the aims of the project around health literacy, co-design and quality improvement. The sites were rural/metropolitan, small/large hospitals, community health centres or municipalities. Each site identified their own priorities for improvement; collected health literacy data using the Health Literacy Questionnaire (HLQ) within the identified priority groups; engaged staff in co-design workshops to generate ideas for improvement; developed program-logic models; and implemented their projects using Plan-Do-Study-Act (PDSA) cycles. Evaluation included assessment of impacts on organisations, practitioners and service users, and whether the principles were applied. Sites undertook co-design workshops involving discussion of service user needs informed by HLQ (n = 813) and interview data. Sites generated between 21 and 78 intervention ideas and then planned their selected interventions through program-logic models. Sites successfully implemented interventions and refined them progressively with PDSA cycles. Interventions generally involved one of four pathways: development of clinician skills and resources for health literacy, engagement of community volunteers to disseminate health promotion messages, direct impact on consumers' health literacy, and redesign of existing services. Evidence of application of the principles was found in all sites. The Ophelia approach guided identification of health literacy issues at each participating site and the development and implementation of locally appropriate solutions. The eight principles provided a framework that allowed flexible application of the Ophelia approach and generation of a diverse set of interventions. Changes were observed at organisational, staff, and community member levels. The Ophelia approach can be used to generate health service improvements that enhance health outcomes and address inequity of access to healthcare.
De Nunzio, Cosimo; Presicce, Fabrizio; Lombardo, Riccardo; Trucchi, Alberto; Bellangino, Mariangela; Tubaro, Andrea; Moja, Egidio
2018-06-26
Even though evidence based medicine, guidelines and algorithms still represent the pillars of the management of chronic diseases (e.g., hypertension, diabetes mellitus), a patient centred approach has recently been proposed as a successful strategy, in particular to improve drug adherence. The aim of the present review is to evaluate the unmet needs in LUTS/BPH management and the possible impact of a patient centred approach in this setting. A National Center for Biotechnology Information (NCBI) PubMed search for relevant articles published from January 2000 until December 2016 was performed by combining the following MESH terms: patients centred medicine, patient centered care, person centered care, patient centered outcomes, value based care, shared decision making, male, Lower Urinary Tract Symptoms, Benign Prostatic Hyperplasia, treatment. We followed the Preferred Reporting Items for Systematic Review and Meta-Analysis (PRISMA). All studies reporting on a patient centred approach, shared decision making and evidence-based medicine were included in the review, covering original articles, reviews, letters, congress abstracts, and editorial comments. Studies reporting single case reports, experimental studies on animal models and studies not in English were not included. Overall, 751 abstracts were reviewed; 87 full texts were analysed, resulting in 36 papers included. The evidence summarised in this systematic review confirmed that a patient centred visit may improve patients' adherence to medication. Although a patient centred model has rarely been used in urology, the management of Lower Urinary Tract Symptoms (LUTS) and Benign Prostatic Obstruction (BPO) may represent the perfect ground to experiment with and improve this approach. Notwithstanding all the innovations in LUTS/BPO medical treatment, the real life picture is far from ideal. Recent evidence shows dramatically low drug adherence and satisfaction with medical treatment in LUTS/BPH patients. A patient centred approach may improve drug adherence and address some unmet needs in this area, potentially reducing complications and costs. However, further well designed studies are needed to confirm these data.
Interplay between proton-neutron pairing and deformation in self-conjugated medium mass nuclei
NASA Astrophysics Data System (ADS)
Gambacurta, Danilo; Lacroix, Denis
2016-05-01
We employ a model combining self-consistent mean-field and shell model techniques to study the competition between particle-like and proton-neutron pairing correlations in fp-shell even-even self-conjugate nuclei. Deformation effects are realistically and microscopically described. The resulting approach can give a precise description of pairing correlations and can also treat the coexistence of different condensates formed of pairs with different total spin/isospin. The standard BCS calculations are systematically compared with approaches including correlation effects beyond the independent quasi-particle picture. The competition between proton-neutron correlations in the isoscalar and isovector channels is also analyzed, as well as their dependence on the deformation properties.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
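For orientation only, a common analytic starting point for the predictability of a dynamical system dx/dt = f(x, t) is the tangent-linear error dynamics and the associated covariance evolution,

\frac{d\,\delta x}{dt} = J(x,t)\,\delta x, \qquad \frac{dP}{dt} = J P + P J^{\mathsf{T}} + Q,

where J is the Jacobian of f, P the error covariance and Q a model-error forcing term. This is a generic illustration and not necessarily the governing equations of predictability derived in this work.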
Cyril, Sheila; Smith, Ben J.; Possamai-Inesedy, Alphia; Renzaho, Andre M. N.
2015-01-01
Background: Although community engagement (CE) is widely used in health promotion, components of CE models associated with improved health are poorly understood. This study aimed to examine the magnitude of the impact of CE on health and health inequalities among disadvantaged populations, which methodological approaches maximise the effectiveness of CE, and components of CE that are acceptable, feasible, and effective when used among disadvantaged populations. Design: The systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. We carried out methodological assessments of the included studies using rating scales. The analysis focussed on model synthesis to identify the key CE components linked to positive study outcomes and comparative analysis between positive study outcomes, processes, and quality indicators of CE. Results: Out of 24 studies that met our inclusion criteria, 21 (87.5%) had positively impacted health behaviours, public health planning, health service access, health literacy, and a range of health outcomes. More than half of the studies (58%) were of good quality, whereas 71% and 42% of studies showed good community involvement in research and achieved high levels of CE, respectively. Key CE components that affected health outcomes included real power-sharing, collaborative partnerships, bidirectional learning, incorporating the voice and agency of beneficiary communities in research protocol, and using bicultural health workers for intervention delivery. Conclusions: The findings suggest that CE models can lead to improved health and health behaviours among disadvantaged populations if designed properly and implemented through effective community consultation and participation. We also found several gaps in the current measurement of CE in health intervention studies, which suggests the importance of developing innovative approaches to measure CE impact on health outcomes in a more rigorous way. PMID:26689460
The Emergence of Systematic Review in Toxicology
Stephens, Martin L.; Betts, Kellyn; Beck, Nancy B.; Cogliano, Vincent; Dickersin, Kay; Fitzpatrick, Suzanne; Freeman, James; Gray, George; Hartung, Thomas; McPartland, Jennifer; Rooney, Andrew A.; Scherer, Roberta W.; Verloo, Didier; Hoffmann, Sebastian
2016-01-01
The Evidence-based Toxicology Collaboration hosted a workshop on “The Emergence of Systematic Review and Related Evidence-based Approaches in Toxicology,” on November 21, 2014 in Baltimore, Maryland. The workshop featured speakers from agencies and organizations applying systematic review approaches to questions in toxicology, speakers with experience in conducting systematic reviews in medicine and healthcare, and stakeholders in industry, government, academia, and non-governmental organizations. Based on the workshop presentations and discussion, here we address the state of systematic review methods in toxicology, historical antecedents in both medicine and toxicology, challenges to the translation of systematic review from medicine to toxicology, and thoughts on the way forward. We conclude with a recommendation that as various agencies and organizations adapt systematic review methods, they continue to work together to ensure that there is a harmonized process for how the basic elements of systematic review methods are applied in toxicology. PMID:27208075
It's time to rework the blueprints: building a science for clinical psychology.
Millon, Theodore
2003-11-01
The aims in this article are to connect the conceptual structure of clinical psychological science to what the author believes to be the omnipresent principles of evolution, use the evolutionary model to create a deductively derived clinical theory and taxonomy, link the theory and taxonomy to comprehensive and integrated approaches to assessment, and outline a framework for an integrative synergistic model of psychotherapy. These foundations also provide a framework for a systematic approach to the subject realms of personology and psychopathology. Exploring nature's deep principles, the model revives the personologic concept christened by Henry Murray some 65 years ago; it also parallels the interface between human social functioning and evolutionary biology proposed by Edward Wilson in his concept of sociobiology. (c) 2003 APA, all rights reserved.
A systematic reactor design approach for the synthesis of active pharmaceutical ingredients.
Emenike, Victor N; Schenkendorf, René; Krewer, Ulrike
2018-05-01
Today's highly competitive pharmaceutical industry is in dire need of an accelerated transition from the drug development phase to the drug production phase. At the heart of this transition are chemical reactors that facilitate the synthesis of active pharmaceutical ingredients (APIs) and whose design can affect subsequent processing steps. Inspired by this challenge, we present a model-based approach for systematic reactor design. The proposed concept is based on the elementary process functions (EPF) methodology to select an optimal reactor configuration from existing state-of-the-art reactor types or can possibly lead to the design of novel reactors. As a conceptual study, this work summarizes the essential steps in adapting the EPF approach to optimal reactor design problems in the field of API syntheses. Practically, the nucleophilic aromatic substitution of 2,4-difluoronitrobenzene was analyzed as a case study of pharmaceutical relevance. Here, a small-scale tubular coil reactor with controlled heating was identified as the optimal set-up reducing the residence time by 33% in comparison to literature values. Copyright © 2017 Elsevier B.V. All rights reserved.
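A stripped-down illustration of model-based reactor optimization in this spirit is sketched below: it picks the operating temperature of an isothermal tubular reactor, at fixed residence time, that maximizes the yield of the intermediate B in consecutive reactions A -> B -> C. The kinetics are invented and the formulation is far simpler than the paper's EPF treatment of the nucleophilic aromatic substitution case.

import numpy as np
from scipy.optimize import minimize_scalar

R, tau = 8.314, 600.0              # gas constant [J/mol/K], residence time [s]
A1, Ea1 = 1.0e6, 50e3              # assumed Arrhenius parameters of A -> B
A2, Ea2 = 1.0e9, 80e3              # assumed Arrhenius parameters of B -> C

def neg_yield_B(T):
    k1 = A1 * np.exp(-Ea1 / (R * T))
    k2 = A2 * np.exp(-Ea2 / (R * T))
    yB = k1 / (k2 - k1) * (np.exp(-k1 * tau) - np.exp(-k2 * tau))  # yield of B at time tau
    return -yB

res = minimize_scalar(neg_yield_B, bounds=(300.0, 420.0), method="bounded")
print(f"optimal T ~ {res.x:.0f} K, yield of B ~ {-res.fun:.2f}")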
Wilson, Michael G; Lavis, John N; Gauvin, Francois-Pierre
2016-11-01
Living with multiple chronic conditions (multimorbidity) - and facing complex, uncoordinated and fragmented care - is part of the daily life of a growing number of Canadians. We undertook: a knowledge synthesis; a "gap analysis" of existing systematic reviews; an issue brief that synthesized the available evidence about the problem, three options for addressing it and implementation considerations; a stakeholder dialogue involving key health-system leaders; and a citizen panel. We identified several recommendations for actions that can be taken, including: developing evidence-based guidance that providers can use to help achieve goals set by patients; embracing approaches to supporting self-management; supporting greater communication and collaboration across healthcare providers as well as between healthcare providers and patients; and investing more efforts in health promotion and disease prevention. Our results point to the need for health system decision-makers to support bottom-up, person-centred approaches to developing models of care that are tailored for people with multimorbidity and support a research agenda to address the identified priorities. Copyright © 2016 Longwoods Publishing.
Conducting systematic reviews of association (etiology): The Joanna Briggs Institute's approach.
Moola, Sandeep; Munn, Zachary; Sears, Kim; Sfetcu, Raluca; Currie, Marian; Lisy, Karolina; Tufanaru, Catalin; Qureshi, Rubab; Mattis, Patrick; Mu, Peifan
2015-09-01
The systematic review of evidence is the research method which underpins the traditional approach to evidence-based healthcare. There is currently no uniform methodology for conducting a systematic review of association (etiology). This study outlines and describes the Joanna Briggs Institute's approach and guidance for synthesizing evidence related to association with a predominant focus on etiology and contributes to the emerging field of systematic review methodologies. It should be noted that questions of association typically address etiological or prognostic issues. The systematic review of studies to answer questions of etiology follows the same basic principles of systematic review of other types of data. An a priori protocol must inform the conduct of the systematic review, comprehensive searching must be performed and critical appraisal of retrieved studies must be carried out. The overarching objective of systematic reviews of etiology is to identify and synthesize the best available evidence on the factors of interest that are associated with a particular disease or outcome. The traditional PICO (population, interventions, comparators and outcomes) format for systematic reviews of effects does not align with questions relating to etiology. A systematic review of etiology should include the following aspects: population, exposure of interest (independent variable) and outcome (dependent variable). Studies of etiology are predominantly explanatory or predictive. The objective of reviews of explanatory or predictive studies is to contribute to, and improve our understanding of, the relationship of health-related events or outcomes by examining the association between variables. When interpreting possible associations between variables based on observational study data, caution must be exercised due to the likely presence of confounding variables or moderators that may impact on the results. As with all systematic reviews, there are various approaches to present the results, including a narrative, graphical or tabular summary, or meta-analysis. When meta-analysis is not possible, a set of alternative methods for synthesizing research is available. On the basis of the research question and objectives, narrative, tabular and/or visual approaches can be used for data synthesis. There are some special considerations when conducting meta-analysis for questions related to risk and correlation. These include, but are not limited to, causal inference. Systematic review and meta-analysis of studies related to etiology is an emerging methodology in the field of evidence synthesis. These reviews can provide useful information for healthcare professionals and policymakers on the burden of disease. The standardized Joanna Briggs Institute approach offers a rigorous and transparent method to conduct reviews of etiology.
Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P
2003-06-01
Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concerns, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not now include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis and our analysis was aimed at determining if the IT implementation approaches in the published literature were characterized by an approach that considered at least two levels of IT usage determinants. We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans that aim to address factors associated with IT acceptance and subsequent positive use behavior.
NASA Astrophysics Data System (ADS)
Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.
2017-09-01
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as Particle filters (PF) using two different resampling approaches, Multinomial Resampling and Systematic Resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is the case in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering all of Australia. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated by independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, i.e. the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), respectively, improving the model groundwater estimation errors by 34% and 31% compared to a model run without assimilation. Applying the PF along with Systematic Resampling successfully decreases the model estimation error by 23%.
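As a small illustration of one ingredient mentioned here, the sketch below implements the systematic (low-variance) resampling step of a particle filter; the actual W3RA/GRACE assimilation additionally involves the model propagation, the observation operator and the EnKF variants discussed above.

import numpy as np

def systematic_resample(weights, rng=np.random.default_rng(0)):
    # Return particle indices drawn by systematic (low-variance) resampling.
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one random offset, n equally spaced strata
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)

weights = np.array([0.05, 0.05, 0.6, 0.2, 0.1])
print(systematic_resample(weights))                 # indices concentrate on the heavy particles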
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice
2017-02-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
A systematic comparison of recurrent event models for application to composite endpoints.
Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine
2018-01-04
Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility reasons, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. There exist a number of such models, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods were already compared within the literature, there exists no systematic investigation for the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can considerably deviate under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
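As a pointer to practice, an Andersen-Gill-type analysis amounts to a Cox model fitted on counting-process (start, stop] data. The sketch below uses the lifelines CoxTimeVaryingFitter on a tiny, made-up recurrent-event dataset; in a real analysis a robust variance clustered on the subject id would normally be added, and the other models compared in the paper require different data preparations.

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Each row is one at-risk interval for one subject; 'event' marks whether the
# interval ended with an event. The numbers are invented for illustration.
records = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3, 3],
    "start": [0, 20, 45, 0, 60, 0, 30],
    "stop":  [20, 45, 90, 60, 90, 30, 90],
    "event": [1, 1, 0, 1, 0, 1, 0],
    "treat": [1, 1, 1, 0, 0, 1, 1],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()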
ERIC Educational Resources Information Center
Asikainen, Henna; Gijbels, David
2017-01-01
The focus of the present paper is on the contribution of the research in the student approaches to learning tradition. Several studies in this field have started from the assumption that students' approaches to learning develop towards more deep approaches to learning in higher education. This paper reports on a systematic review of longitudinal…
Systematic Approach to Calculate the Concentration of Chemical Species in Multi-Equilibrium Problems
ERIC Educational Resources Information Center
Baeza-Baeza, Juan Jose; Garcia-Alvarez-Coque, Maria Celia
2011-01-01
A general systematic approach is proposed for the numerical calculation of multi-equilibrium problems. The approach involves several steps: (i) the establishment of balances involving the chemical species in solution (e.g., mass balances, charge balance, and stoichiometric balance for the reaction products), (ii) the selection of the unknowns (the…
Elements of integrated care approaches for older people: a review of reviews.
Briggs, Andrew M; Valentijn, Pim P; Thiyagarajan, Jotheeswaran A; Araujo de Carvalho, Islene
2018-04-07
The World Health Organization (WHO) recently proposed an Integrated Care for Older People approach to guide health systems and services in better supporting functional ability of older people. A knowledge gap remains in the key elements of integrated care approaches used in health and social care delivery systems for older populations. The objective of this review was to identify and describe the key elements of integrated care models for elderly people reported in the literature. Review of reviews using a systematic search method. A systematic search was performed in MEDLINE and the Cochrane database in June 2017. Reviews of interventions aimed at care integration at the clinical (micro), organisational/service (meso) or health system (macro) levels for people aged ≥60 years were included. Non-Cochrane reviews published before 2015 were excluded. Reviews were assessed for quality using the Assessment of Multiple Systematic Reviews (AMSTAR) 1 tool. Fifteen reviews (11 systematic reviews, of which six were Cochrane reviews) were included, representing 219 primary studies. Three reviews (20%) included only randomised controlled trials (RCT), while 10 reviews (65%) included both RCTs and non-RCTs. The region where the largest number of primary studies originated was North America (n=89, 47.6%), followed by Europe (n=60, 32.1%) and Oceania (n=31, 16.6%). Eleven (73%) reviews focused on clinical 'micro' and organisational 'meso' care integration strategies. The most commonly reported elements of integrated care models were multidisciplinary teams, comprehensive assessment and case management. Nurses, physiotherapists, general practitioners and social workers were the most commonly reported service providers. Methodological quality was variable (AMSTAR scores: 1-11). Seven (47%) reviews were scored as high quality (AMSTAR score ≥8). Evidence of elements of integrated care for older people focuses particularly on micro clinical care integration processes, while there is a relative lack of information regarding the meso organisational and macro system-level care integration strategies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Lavrentiev, N. A.; Rodimova, O. B.; Fazliev, A. Z.; Vigasin, A. A.
2017-11-01
An approach is suggested for forming applied ontologies in subject domains where results are represented in graphical form. An approach is also given for systematizing research graphics that contain information on weakly bound carbon dioxide complexes. The results of systematizing research plots and images that characterize the spectral properties of the CO2 complexes are presented.
Hoang, Van Phuong; Shanahan, Marian; Shukla, Nagesh; Perez, Pascal; Farrell, Michael; Ritter, Alison
2016-04-13
The overarching goal of health policies is to maximize health and societal benefits. Economic evaluations can play a vital role in assessing whether or not such benefits occur. This paper reviews the application of modelling techniques in economic evaluations of drug and alcohol interventions with regard to (i) the modelling paradigms themselves; (ii) the perspective taken on costs and benefits; and (iii) the time frame. Papers that use modelling approaches for economic evaluations of drug and alcohol interventions were identified by carrying out searches of major databases. Thirty-eight papers met the inclusion criteria. Overall, cohort Markov models remain the most popular approach, followed by decision trees, individual-based models and system dynamics (SD) models. Most of the papers adopted a long-term time frame to reflect the long-term costs and benefits of health interventions. However, it was fairly common among the reviewed papers to adopt a narrow perspective that only takes into account costs and benefits borne by the health care sector. This review paper informs policy makers about the availability of modelling techniques that can be used to enhance the quality of economic evaluations for drug and alcohol treatment interventions.
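For readers new to these techniques, the sketch below runs a minimal three-state cohort Markov model, the model type the review identifies as most common; the states, transition probabilities, costs and utilities are invented purely for illustration.

import numpy as np

P = np.array([[0.75, 0.20, 0.05],            # annual transitions from 'in treatment'
              [0.15, 0.75, 0.10],            # annual transitions from 'relapsed'
              [0.00, 0.00, 1.00]])           # 'dead' is absorbing
cost    = np.array([4000.0, 2500.0, 0.0])    # annual cost per state
utility = np.array([0.85, 0.60, 0.0])        # QALY weight per state
discount = 0.03

state = np.array([1.0, 0.0, 0.0])            # cohort starts in treatment
total_cost = total_qaly = 0.0
for year in range(20):                       # 20-year time horizon
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * state @ cost
    total_qaly += df * state @ utility
    state = state @ P

print(f"discounted cost: {total_cost:.0f}, discounted QALYs: {total_qaly:.2f}")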
Li, Xianfeng; Murthy, N. Sanjeeva; Becker, Matthew L.; Latour, Robert A.
2016-01-01
A multiscale modeling approach is presented for the efficient construction of an equilibrated all-atom model of a cross-linked poly(ethylene glycol) (PEG)-based hydrogel using the all-atom polymer consistent force field (PCFF). The final equilibrated all-atom model was built with a systematic simulation toolset consisting of three consecutive parts: (1) building a global cross-linked PEG-chain network at experimentally determined cross-link density using an on-lattice Monte Carlo method based on the bond fluctuation model, (2) recovering the local molecular structure of the network by transitioning from the lattice model to an off-lattice coarse-grained (CG) model parameterized from PCFF, followed by equilibration using high performance molecular dynamics methods, and (3) recovering the atomistic structure of the network by reverse mapping from the equilibrated CG structure, hydrating the structure with explicitly represented water, followed by final equilibration using PCFF parameterization. The developed three-stage modeling approach has application to a wide range of other complex macromolecular hydrogel systems, including the integration of peptide, protein, and/or drug molecules as side-chains within the hydrogel network for the incorporation of bioactivity for tissue engineering, regenerative medicine, and drug delivery applications. PMID:27013229
A functional-dynamic reflection on participatory processes in modeling projects.
Seidl, Roman
2015-12-01
The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it is not well investigated if and how explicitly these functions and the dynamics of a participatory process are reflected by modeling projects in particular. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation-most often, more than one per project can be identified, along with the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling covering diverse approaches and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and perfectly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.
Lennox, L; Maher, L; Reed, J
2018-02-09
Improvement initiatives offer a valuable mechanism for delivering and testing innovations in healthcare settings. Many of these initiatives deliver meaningful and necessary changes to patient care and outcomes. However, many improvement initiatives fail to sustain to a point where their full benefits can be realised. This has led many researchers and healthcare practitioners to develop frameworks, models and tools to support and monitor sustainability. This work aimed to identify what approaches are available to assess and influence sustainability in healthcare and to describe the different perspectives, applications and constructs within these approaches to guide their future use. A systematic review was carried out following PRISMA guidelines to identify publications that reported approaches to support or influence sustainability in healthcare. Eligibility criteria were defined through an iterative process in which two reviewers independently assessed 20% of articles to test the objectivity of the selection criteria. Data were extracted from the identified articles, and a template analysis was undertaken to identify and assess the sustainability constructs within each reported approach. The search strategy identified 1748 publications with 227 articles retrieved in full text for full documentary analysis. In total, 62 publications identifying a sustainability approach were included in this review (32 frameworks, 16 models, 8 tools, 4 strategies, 1 checklist and 1 process). Constructs across approaches were compared and 40 individual constructs for sustainability were found. Comparison across approaches demonstrated consistent constructs were seen regardless of proposed interventions, setting or level of application with 6 constructs included in 75% of the approaches. Although similarities were found, no approaches contained the same combination of the constructs nor did any single approach capture all identified constructs. From these results, a consolidated framework for sustainability constructs in healthcare was developed. Choosing a sustainability method can pose a challenge because of the diverse approaches reported in the literature. This review provides a valuable resource to researchers, healthcare professionals and improvement practitioners by providing a summary of available sustainability approaches and their characteristics. This review was registered on the PROSPERO database: CRD42016040081 in June 2016.
Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren
2011-05-01
The mobile behavior in ion mobility spectrometry (IMS) of 1481 peptides generated by protease digestion of the Drosophila melanogaster proteome is modeled and predicted based on two different types of characterization methods, i.e. a sequence-based approach and a structure-based approach. In this procedure, the sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, the nonlinear support vector machine (SVM) and Gaussian process (GP) as well as linear partial least squares (PLS) regression are employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validations as well as the rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison of the resulting statistics arising from the different combinations of variable types with modeling methods and find that the sequence-based approach can give QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, since the sequence-based approach does not require preparing energy-minimized peptide structures before modeling, it is considerably more efficient than the structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.
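A schematic version of such a sequence-based QSSR workflow on synthetic data is sketched below, using the three regression families compared in the paper (SVM, Gaussian process and PLS) as implemented in scikit-learn; the composition descriptor, data and settings are placeholders rather than those of the Drosophila peptide set.

import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(1)
peptides = ["".join(rng.choice(list(AA), size=rng.integers(6, 15))) for _ in range(120)]

def composition(seq):
    # Amino acid composition: the simplest sequence-based descriptor.
    return np.array([seq.count(a) / len(seq) for a in AA])

X = np.vstack([composition(p) for p in peptides])
y = 5.0 + 20.0 * X[:, AA.index("K")] + 10.0 * X[:, AA.index("L")] \
    + rng.normal(0, 0.5, len(peptides))          # synthetic "drift times"

for name, model in [("SVR", SVR(C=10.0)),
                    ("GP", GaussianProcessRegressor(alpha=1e-2, normalize_y=True)),
                    ("PLS", PLSRegression(n_components=5))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean 5-fold R^2 = {r2:.2f}")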
Semi-supervised anomaly detection - towards model-independent searches of new physics
NASA Astrophysics Data System (ADS)
Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu
2012-06-01
Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate, for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is considerably more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
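The sketch below illustrates the general idea in simplified form: fit a Gaussian mixture to a background-only sample and flag observed events with anomalously low likelihood under that model. It is not the paper's algorithm, which instead fits a mixture of the frozen background model plus additional Gaussians; all data here are synthetic.

```python
# Simplified stand-in for semi-supervised anomaly detection: model the background
# with a Gaussian mixture, then flag observed events whose likelihood under that
# background model is anomalously low.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
background = rng.normal(loc=0.0, scale=1.0, size=(5000, 4))     # background-only sample
signal = rng.normal(loc=3.0, scale=0.3, size=(200, 4))          # unexpected excess
observed = np.vstack([rng.normal(size=(2000, 4)), signal])      # data = background + signal

bkg_model = GaussianMixture(n_components=5, random_state=0).fit(background)

log_like = bkg_model.score_samples(observed)                    # per-event log-likelihood
threshold = np.quantile(bkg_model.score_samples(background), 0.01)
anomalies = observed[log_like < threshold]
print(f"flagged {len(anomalies)} candidate anomalous events")
```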
NASA Astrophysics Data System (ADS)
Liang, Dong; Song, Yimin; Sun, Tao; Jin, Xueying
2017-09-01
A systematic dynamic modeling methodology is presented to develop the rigid-flexible coupling dynamic model (RFDM) of an emerging flexible parallel manipulator with multiple actuation modes. By virtue of the assumed mode method, the general dynamic model of an arbitrary flexible body with any number of lumped parameters is derived in an explicit closed form, which possesses a modular characteristic. Then the complete dynamic model of the system is formulated based on the flexible multi-body dynamics (FMD) theory and the augmented Lagrangian multipliers method. An approach combining the Udwadia-Kalaba formulation with the hybrid TR-BDF2 numerical algorithm is proposed to solve the nonlinear RFDM. Two simulation cases are performed to investigate the dynamic performance of the manipulator with different actuation modes. The results indicate that the redundant actuation modes can effectively attenuate vibration and guarantee higher dynamic performance compared to the traditional non-redundant actuation modes. Finally, a virtual prototype model is developed to demonstrate the validity of the presented RFDM. The systematic methodology proposed in this study can be conveniently extended for the dynamic modeling and controller design of other planar flexible parallel manipulators, especially the emerging ones with multiple actuation modes.
Multiscale Modeling of Hematologic Disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fedosov, Dmitry A.; Pivkin, Igor; Pan, Wenxiao
Parasitic infectious diseases and other hereditary hematologic disorders are often associated with major changes in the shape and viscoelastic properties of red blood cells (RBCs). Such changes can disrupt blood flow and even brain perfusion, as in the case of cerebral malaria. Modeling of these hematologic disorders requires a seamless multiscale approach, where blood cells and blood flow in the entire arterial tree are represented accurately using physiologically consistent parameters. In this chapter, we present a computational methodology based on dissipative particle dynamics (DPD) which models RBCs as well as whole blood in health and disease. DPD is a Lagrangian method that can be derived from systematic coarse-graining of molecular dynamics but can scale efficiently up to small arteries and can also be used to model RBCs down to spectrin level. To this end, we present two complementary mathematical models for RBCs and describe a systematic procedure on extracting the relevant input parameters from optical tweezers and microfluidic experiments for single RBCs. We then use these validated RBC models to predict the behavior of whole healthy blood and compare with experimental results. The same procedure is applied to modeling malaria, and results for infected single RBCs and whole blood are presented.
Cross-validation analysis for genetic evaluation models for ranking in endurance horses.
García-Ballesteros, S; Varona, L; Valera, M; Gutiérrez, J P; Cervantes, I
2018-01-01
Ranking trait was used as a selection criterion for competition horses to estimate racing performance. In the literature the most common approaches to estimate breeding values are the linear or threshold statistical models. However, recent studies have shown that a Thurstonian approach was able to fix the race effect (competitive level of the horses that participate in the same race), thus suggesting a better prediction accuracy of breeding values for ranking trait. The aim of this study was to compare the predictability of linear, threshold and Thurstonian approaches for genetic evaluation of ranking in endurance horses. For this purpose, eight genetic models were used for each approach with different combinations of random effects: rider, rider-horse interaction and environmental permanent effect. All genetic models included gender, age and race as systematic effects. The database that was used contained 4065 ranking records from 966 horses and that for the pedigree contained 8733 animals (47% Arabian horses), with an estimated heritability around 0.10 for the ranking trait. The prediction ability of the models for racing performance was evaluated using a cross-validation approach. The average correlation between real and predicted performances across genetic models was around 0.25 for threshold, 0.58 for linear and 0.60 for Thurstonian approaches. Although no significant differences were found between models within approaches, the best genetic model included: the rider and rider-horse random effects for threshold, only rider and environmental permanent effects for linear approach and all random effects for Thurstonian approach. The absolute correlations of predicted breeding values among models were higher between threshold and Thurstonian: 0.90, 0.91 and 0.88 for all animals, top 20% and top 5% best animals. For rank correlations these figures were 0.85, 0.84 and 0.86. The lower values were those between linear and threshold approaches (0.65, 0.62 and 0.51). In conclusion, the Thurstonian approach is recommended for the routine genetic evaluations for ranking in endurance horses.
Modelling of phase transformations occurring in low activation martensitic steels
NASA Astrophysics Data System (ADS)
Brachet, J.-C.; Gavard, L.; Boussidan, C.; Lepoittevin, C.; Denis, S.; Servant, C.
1998-10-01
The main objective of this paper is to summarize modelling of on-heating and on-cooling phase transformations occurring in Low Activation Martensitic (LAM) steels. Calculations of thermodynamic equilibrium phase fractions and kinetic aspects of phase transformations have been performed by using different approaches from experimental data (CCT and TTT diagrams obtained by dilatometry). All the calculated data have been compared to an extensive and systematic set of experimental data obtained on different LAM steels of the 7.5-11% CrWVTa type.
Simplified stock markets described by number operators
NASA Astrophysics Data System (ADS)
Bagarello, F.
2009-06-01
In this paper we continue our systematic analysis of the operatorial approach previously proposed in an economic context and we discuss a mixed toy model of a simplified stock market, i.e. a model in which the price of the shares is given as an input. We deduce the time evolution of the portfolio of the various traders of the market, as well as of other observable quantities. As in a previous paper, we solve the equations of motion by means of a fixed-point-like approximation.
Comparing different stimulus configurations for population receptive field mapping in human fMRI
Alvarez, Ivan; de Haas, Benjamin; Clark, Chris A.; Rees, Geraint; Schwarzkopf, D. Samuel
2015-01-01
Population receptive field (pRF) mapping is a widely used approach to measuring aggregate human visual receptive field properties by recording non-invasive signals using functional MRI. Despite growing interest, no study to date has systematically investigated the effects of different stimulus configurations on pRF estimates from human visual cortex. Here we compared the effects of three different stimulus configurations on a model-based approach to pRF estimation: size-invariant bars and eccentricity-scaled bars defined in Cartesian coordinates and traveling along the cardinal axes, and a novel simultaneous “wedge and ring” stimulus defined in polar coordinates, systematically covering polar and eccentricity axes. We found that the presence or absence of eccentricity scaling had a significant effect on goodness of fit and pRF size estimates. Further, variability in pRF size estimates was directly influenced by stimulus configuration, particularly for higher visual areas including V5/MT+. Finally, we compared eccentricity estimation between phase-encoded and model-based pRF approaches. We observed a tendency for more peripheral eccentricity estimates using phase-encoded methods, independent of stimulus size. We conclude that both eccentricity scaling and polar rather than Cartesian stimulus configuration are important considerations for optimal experimental design in pRF mapping. While all stimulus configurations produce adequate estimates, simultaneous wedge and ring stimulation produced higher fit reliability, with a significant advantage in reduced acquisition time. PMID:25750620
Elbogen, Eric B.; Fuller, Sara; Johnson, Sally C.; Brooks, Stephanie; Kinneer, Patricia; Calhoun, Patrick; Beckham, Jean C.
2010-01-01
Despite increased media attention on violent acts against others committed by military Veterans, few models have been developed to systematically guide violence risk assessment among Veterans. Ideally, a model would identify which Veterans are most at risk for violence and increased attention could then be turned to determining what could be done to prevent violent behavior. This article suggests how empirical approaches to risk assessment used successfully in civilian populations can be applied to Veterans. A review was conducted of the scientific literature on Veteran populations regarding factors related to interpersonal violence generally and to domestic violence specifically. A list was then generated of empirically-supported risk factors for clinicians to consider in practice. To conceptualize how these known risk factors relate to a Veteran’s violence potential, risk assessment scholarship was utilized to develop an evidence-based method to guide mental health professionals. The goals of this approach are to integrate science into practice, overcome logistical barriers, and permit more effective assessment, monitoring, and management of violence risk for clinicians working with Veterans, both in Veteran Administration settings and in the broader community. It is likely that the use of a systematic, empirical framework could lead to improved clinical decision-making in the area of risk assessment, and help reduce violence among Veterans. PMID:20627387
Hypnotherapy for traumatic grief: janetian and modern approaches integrated.
van der Hart, O; Brown, P; Turco, R N
1990-04-01
Traumatic grief occurs when psychological trauma obstructs mourning. Nosologically, it is related to pathological grief and posttraumatic stress disorder (PTSD). Therapeutic advances from both fields make it clear that the trauma per se must be accessed before mourning can proceed. The gamut of psychotherapies has been employed, but hypnosis appears to be the most specific. Pierre Janet provided a remarkably modern conceptual basis for diagnosis and treatment based on a dissociation model. His approach is combined with contemporary innovations to present a systematic and integrated account of hypnotherapy for traumatic grief.
Adaptive Prior Variance Calibration in the Bayesian Continual Reassessment Method
Zhang, Jin; Braun, Thomas M.; Taylor, Jeremy M.G.
2012-01-01
Use of the Continual Reassessment Method (CRM) and other model-based approaches to design in Phase I clinical trials has increased due to the ability of the CRM to identify the maximum tolerated dose (MTD) better than the 3+3 method. However, the CRM can be sensitive to the variance selected for the prior distribution of the model parameter, especially when a small number of patients are enrolled. While methods have emerged to adaptively select skeletons and to calibrate the prior variance only at the beginning of a trial, there has not been any approach developed to adaptively calibrate the prior variance throughout a trial. We propose three systematic approaches to adaptively calibrate the prior variance during a trial and compare them via simulation to methods proposed to calibrate the variance at the beginning of a trial. PMID:22987660
ERIC Educational Resources Information Center
Packard, Richard D.; Dereshiwsky, Mary I.
Despite current interest with the concept of the "New American School" model discussed in "America 2000," school systems continue to approach educational reform and restructuring by tinkering with key organizational components in isolation. The total school organization requires assessment and profiling to determine which key components are drags…
ERIC Educational Resources Information Center
Bush, Michael D.
2010-01-01
The development of online learning materials is a complex and expensive process that can benefit from the application of consistent and organized principles of instructional design. This article discusses the development at Brigham Young University of the online portion of a one-semester course in Swahili using the ADDIE Model (Analysis, Design,…
Commentary: exploring hormonal influences on problem sexual behavior.
Sullivan, Danny H; Mullen, Paul E
2012-01-01
The conceptualization of sexual offending remains problematic and prey to fashion and enthusiasm. Progress can come only on the basis of sound research on the biological, social, and psychological associations to such offending. This study, though in some ways modest in its contribution, offers a model of the systematic approaches which offer the best chances of eventually understanding and managing sexual offending.
A Cross-Case Analysis of Growth Model Programs in Three States
ERIC Educational Resources Information Center
Gardella, Jennifer L.
2013-01-01
Signed into Law on January 8, 2002, the 1,180 page No Child Left Behind Act (NCLB) shifted the course of public education in America. For the first time accountability was firmly placed at the center of school operations by requiring a systematic approach to achieving reform and improving all areas of school life (Wanker & Christie, 2005). As…
ERIC Educational Resources Information Center
Simpson, Steve; Clemens, Rebecca; Killingsworth, Drea Rae; Ford, Julie Dyke
2015-01-01
A flurry of recent research in writing studies has addressed the need for more systematic approaches to graduate-level writing support, though more research is needed into more organic models that account for graduate students' specific needs and that build infrastructure for writing support within university departments. This article reports on a…
ERIC Educational Resources Information Center
Slauson, Gayla Jo; Carpenter, Donald; Snyder, Johnny
2011-01-01
Systems in the Foundations of Information Systems course can be used to connect with students in computer information systems programs; a systematic approach to beginning student relationship management in this course is helpful. The authors suggest that four systems be created in the Foundations Course. These theoretical systems include an…
ERIC Educational Resources Information Center
Diamond, Robert M.
This book is intended to help college faculty effectively design and evaluate courses and curricula. The 16 chapters address the following topics: a learning-centered approach to course and curriculum design; a systematic design model (showing benefits); the decision to begin a curriculum project; getting started; linking goals, courses, and…
ERIC Educational Resources Information Center
Schrader, Marvin A.; And Others
The project was designed to determine the feasibility of having a vocational technical adult education (VTAE) district provide continuing education inservice training for health care facilities using videotape equipment so that employees could gain knowledge and skills without leaving the facility or having to involve time outside the normal…
Evans, Natalie; Meñaca, Arantza; Koffman, Jonathan; Harding, Richard; Higginson, Irene J; Pool, Robert; Gysels, Marjolein
2012-07-01
Cultural competency is increasingly recommended in policy and practice to improve end-of-life (EoL) care for minority ethnic groups in multicultural societies. It is imperative to critically analyze this approach to understand its underlying concepts. Our aim was to appraise cultural competency approaches described in the British literature on EoL care and minority ethnic groups. This is a critical review. Articles on cultural competency were identified from a systematic review of the literature on minority ethnic groups and EoL care in the United Kingdom. Terms, definitions, and conceptual models of cultural competency approaches were identified and situated according to purpose, components, and origin. Content analysis of definitions and models was carried out to identify key components. One-hundred thirteen articles on minority ethnic groups and EoL care in the United Kingdom were identified. Over half (n=60) contained a term, definition, or model for cultural competency. In all, 17 terms, 17 definitions, and 8 models were identified. The most frequently used term was "culturally sensitive," though "cultural competence" was defined more often. Definitions contained one or more of the components: "cognitive," "implementation," or "outcome." Models were categorized for teaching or use in patient assessment. Approaches were predominantly of American origin. The variety of terms, definitions, and models underpinning cultural competency approaches demonstrates a lack of conceptual clarity, and potentially complicates implementation. Further research is needed to compare the use of cultural competency approaches in diverse cultures and settings, and to assess the impact of such approaches on patient outcomes.
Propagation of stage measurement uncertainties to streamflow time series
NASA Astrophysics Data System (ADS)
Horner, Ivan; Le Coz, Jérôme; Renard, Benjamin; Branger, Flora; McMillan, Hilary
2016-04-01
Streamflow uncertainties due to stage measurement errors are generally overlooked in the promising probabilistic approaches that have emerged in the last decade. We introduce an original error model for propagating stage uncertainties through a stage-discharge rating curve within a Bayesian probabilistic framework. The method takes into account both rating curve uncertainty (parametric and structural errors) and stage uncertainty (systematic and non-systematic errors). Practical ways to estimate the different types of stage errors are also presented: (1) non-systematic errors due to instrument resolution and precision, and non-stationary waves and (2) systematic errors due to gauge calibration against the staff gauge. The method is illustrated at a site where the rating-curve-derived streamflow can be compared with an accurate streamflow reference. The agreement between the two time series is overall satisfactory. Moreover, the quantification of uncertainty is also satisfactory, since the streamflow reference is compatible with the streamflow uncertainty intervals derived from the rating curve and the stage uncertainties. Illustrations from other sites are also presented. Results vary strongly depending on the site features. In some cases, streamflow uncertainty is mainly due to stage measurement errors. The results also show the importance of discriminating systematic and non-systematic stage errors, especially for long-term flow averages. Perspectives for improving and validating the streamflow uncertainty estimates are finally discussed.
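A minimal Monte Carlo illustration of propagating stage errors through a power-law rating curve is sketched below; it is not the authors' Bayesian implementation, and the rating-curve parameters and error magnitudes are invented for illustration.

```python
# Illustrative Monte Carlo propagation of stage errors through a power-law rating
# curve Q = a*(h - b)^c. Non-systematic errors are redrawn for every stage reading;
# the systematic gauge-calibration offset is drawn once per realisation and applied
# to the whole series.
import numpy as np

a, b, c = 30.0, 0.10, 1.6                 # assumed rating-curve parameters
stage = np.linspace(0.5, 2.0, 100)        # recorded stage series (m)
sigma_ns = 0.005                          # non-systematic stage error (m)
sigma_sys = 0.01                          # systematic gauge offset (m)
n_mc = 2000

q_real = np.empty((n_mc, stage.size))
rng = np.random.default_rng(42)
for i in range(n_mc):
    offset = rng.normal(0.0, sigma_sys)                  # one offset per realisation
    noise = rng.normal(0.0, sigma_ns, size=stage.size)   # fresh noise per reading
    h = np.clip(stage + offset + noise, b + 1e-6, None)
    q_real[i] = a * (h - b) ** c

q_low, q_med, q_high = np.percentile(q_real, [2.5, 50, 97.5], axis=0)
idx = stage.searchsorted(1.0)
print("95% streamflow interval at h = 1 m:", q_low[idx], q_high[idx])
```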
Cross-cultural perspectives on physician and lay models of the common cold.
Baer, Roberta D; Weller, Susan C; de Alba García, Javier García; Rocha, Ana L Salcedo
2008-06-01
We compare physicians and laypeople within and across cultures, focusing on similarities and differences across samples, to determine whether cultural differences or lay-professional differences have a greater effect on explanatory models of the common cold. Data on explanatory models for the common cold were collected from physicians and laypeople in South Texas and Guadalajara, Mexico. Structured interview materials were developed on the basis of open-ended interviews with samples of lay informants at each locale. A structured questionnaire was used to collect information from each sample on causes, symptoms, and treatments for the common cold. Consensus analysis was used to estimate the cultural beliefs for each sample. Instead of systematic differences between samples based on nationality or level of professional training, all four samples largely shared a single explanatory model of the common cold, with some differences on subthemes, such as the role of hot and cold forces in the etiology of the common cold. An evaluation of our findings indicates that, although there has been conjecture about whether cultural or lay-professional differences are of greater importance in understanding variation in explanatory models of disease and illness, systematic data collected on community and professional beliefs indicate that such differences may be a function of the specific illness. Further generalizations about lay-professional differences need to be based on detailed data for a variety of illnesses, to discern patterns that may be present. Finally, a systematic approach indicates that agreement across individual explanatory models is sufficient to allow for a community-level explanatory model of the common cold.
Solving work-related ethical problems.
Laukkanen, Laura; Suhonen, Riitta; Leino-Kilpi, Helena
2016-12-01
Nurse managers are responsible for solving work-related ethical problems to promote a positive ethical culture in healthcare organizations. The aim of this study was to describe the activities that nurse managers use to solve work-related ethical problems. The ultimate aim was to enhance the ethical awareness of all nurse managers. The data for this descriptive cross-sectional survey were analyzed through inductive content analysis and quantification. Participants and research context: The data were collected in 2011 using a questionnaire that included an open-ended question and background factors. Participants were nurse managers working in Finnish healthcare organizations (n = 122). Ethical considerations: Permission for the study was given by the Finnish Association of Academic Managers and Experts of Health Sciences. Nurse managers identified a variety of activities they use to solve work-related ethical problems: discussion (30%), cooperation (25%), work organization (17%), intervention (10%), personal values (9%), operational models (4%), statistics and feedback (4%), and personal examples (1%). However, these activities did not follow any common or systematic model. In the future, nurse managers need a more systematic approach to solve ethical problems. It is important to establish new kinds of ethics structures in organizations, such as a common, systematic ethical decision-making model and an ethics club for nurse manager problems, to support nurse managers in solving work-related ethical problems.
Lofton, Saria; Julion, Wrenetha A; McNaughton, Diane B; Bergren, Martha Dewey; Keim, Kathryn S
2016-02-01
Obesity and overweight prevalence in African American (AA) youth continues to be one of the highest of all major ethnic groups, which has led researchers to pursue culturally based approaches as a means to improve obesity prevention interventions. The purpose of this systematic review was to evaluate culturally adapted obesity prevention interventions targeting AA youth. A search of electronic databases, limited to multicomponent culturally adapted obesity prevention controlled trials from 2003 to 2013, was conducted for key terms. Eleven studies met inclusion criteria. We used the PEN-3 model to evaluate the strengths and weaknesses of interventions as well as to identify cultural adaptation strategies. The PEN-3 model highlighted the value of designing joint parent-youth interventions, building a relationship between AA mentors and youth, and emphasizing healthful activities that the youth preferred. The PEN-3 model shows promise as an overarching framework to develop culturally adapted obesity interventions. © The Author(s) 2015.
NASA Technical Reports Server (NTRS)
Papike, J. J.; Simon, S. B.; White, C.; Laul, J. C.
1982-01-01
The first part of a study of the 'less than 10 micrometer' soil fraction and agglutinates is concerned with the chemical systematics of the considered fraction of lunar soils, taking into account a model for agglutinate formation based on the fusion of the finest fraction (FFF). Attention is given to some evidence which supports the FFF model. The evidence is based on some indirect approaches to an estimation of the composition of the fused soil component. It is found that the 'less than 10 micrometer' soil fraction from all Apollo sites except Apollo 16 (which can be explained) is more feldspathic and enriched in incompatible elements (e.g., K and Th) than the bulk soil. It is concluded that these systematics result from simple comminution in which feldspar breaks down to finer sizes than pyroxene and olivine and the fine-grained incompatible-element-enriched mesostasis concentrates in the 'less than 10 micrometer' soil fraction.
Lean leadership attributes: a systematic review of the literature.
Aij, Kjeld Harald; Teunissen, Maurits
2017-10-09
Purpose Emphasis on quality and reducing costs has led many health-care organizations to reconfigure their management, process, and quality control infrastructures. Many have adopted lean, a management philosophy with roots in manufacturing industries that emphasizes elimination of waste. Successful lean implementation requires systemic change and strong leadership. Despite the importance of leadership to successful lean implementation, few researchers have probed the question of ideal leadership attributes to achieve lean thinking in health care. The purpose of this paper is to provide insight into applicable attributes for lean leaders in health care. Design/methodology/approach The authors systematically reviewed the literature on principles of leadership and, using Dombrowski and Mielke's (2013) conceptual model of lean leadership, developed a parallel theoretical model for lean leadership in health care. Findings This work contributes to the development of a new framework for describing leadership attributes within lean management of health care. Originality/value The summary of attributes can provide a model for health-care leaders to apply lean in their organizations.
Uribe-Convers, Simon; Duke, Justin R.; Moore, Michael J.; Tank, David C.
2014-01-01
• Premise of the study: We present an alternative approach for molecular systematic studies that combines long PCR and next-generation sequencing. Our approach can be used to generate templates from any DNA source for next-generation sequencing. Here we test our approach by amplifying complete chloroplast genomes, and we present a set of 58 potentially universal primers for angiosperms to do so. Additionally, this approach is likely to be particularly useful for nuclear and mitochondrial regions. • Methods and Results: Chloroplast genomes of 30 species across angiosperms were amplified to test our approach. Amplification success varied depending on whether PCR conditions were optimized for a given taxon. To further test our approach, some amplicons were sequenced on an Illumina HiSeq 2000. • Conclusions: Although here we tested this approach by sequencing plastomes, long PCR amplicons could be generated using DNA from any genome, expanding the possibilities of this approach for molecular systematic studies. PMID:25202592
An Alternative Approach to the Teaching of Systematic Transition Metal Chemistry.
ERIC Educational Resources Information Center
Hathaway, Brian
1979-01-01
Presents an alternative approach to the teaching of Systematic Transition Metal Chemistry, built around a transition metal chemistry "skeleton" of features of interest. The "skeleton" is intended as a guide to predicting the chemistry of a selected compound. (Author/SA)
Model-data integration to improve the LPJmL dynamic global vegetation model
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Thonicke, Kirsten; Schaphoff, Sibyll; Thurner, Martin; von Bloh, Werner; Dorigo, Wouter; Carvalhais, Nuno
2017-04-01
Dynamic global vegetation models show large uncertainties regarding the development of the land carbon balance under future climate change conditions. This uncertainty is partly caused by differences in how vegetation carbon turnover is represented in global vegetation models. Model-data integration approaches might help to systematically assess and improve model performances and thus to potentially reduce the uncertainty in terrestrial vegetation responses under future climate change. Here we present several applications of model-data integration with the LPJmL (Lund-Potsdam-Jena managed Lands) dynamic global vegetation model to systematically improve the representation of processes or to estimate model parameters. In a first application, we used global satellite-derived datasets of FAPAR (fraction of absorbed photosynthetically active radiation), albedo and gross primary production to estimate phenology- and productivity-related model parameters using a genetic optimization algorithm. Thereby we identified major limitations of the phenology module and implemented an alternative empirical phenology model. The new phenology module and optimized model parameters resulted in a better performance of LPJmL in representing global spatial patterns of biomass, tree cover, and the temporal dynamic of atmospheric CO2. In a second application, we therefore additionally used global datasets of biomass and land cover to estimate model parameters that control vegetation establishment and mortality. The results demonstrate the ability to improve simulations of vegetation dynamics but also highlight the need to improve the representation of mortality processes in dynamic global vegetation models. In a third application, we used multiple site-level observations of ecosystem carbon and water exchange, biomass and soil organic carbon to jointly estimate various model parameters that control ecosystem dynamics. This exercise demonstrates the strong influence of individual data streams on the simulated ecosystem dynamics, which consequently changed the development of ecosystem carbon stocks and fluxes under future climate and CO2 change. In summary, our results demonstrate challenges and the potential of using model-data integration approaches to improve a dynamic global vegetation model.
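The toy sketch below conveys the parameter-estimation idea in the first application: calibrate a simple phenology curve against a synthetic "satellite FAPAR" series with an evolutionary optimizer standing in for the genetic algorithm used with LPJmL; the model form, parameters, and data are placeholders.

```python
# Toy illustration of the model-data integration idea (not LPJmL itself): calibrate
# the parameters of a simple logistic phenology curve against a synthetic FAPAR
# time series by minimizing RMSE with an evolutionary optimizer.
import numpy as np
from scipy.optimize import differential_evolution

doy = np.arange(1, 366)                                   # day of year

def fapar_model(params, doy):
    fmax, onset, rate = params
    return fmax / (1.0 + np.exp(-rate * (doy - onset)))   # simple green-up curve

true_params = (0.85, 120.0, 0.08)
rng = np.random.default_rng(3)
obs = fapar_model(true_params, doy) + rng.normal(0.0, 0.03, doy.size)

def cost(params):
    return np.sqrt(np.mean((fapar_model(params, doy) - obs) ** 2))  # RMSE

result = differential_evolution(cost, bounds=[(0.1, 1.0), (50, 250), (0.01, 0.5)],
                                seed=0, tol=1e-6)
print("estimated parameters:", result.x)
```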
Krzyzanowska, M K; Walker-Dilks, C; Morris, A M; Gupta, R; Halligan, R; Kouroukis, C T; McCann, K; Atzema, C L
2016-12-01
To define the optimal model of care for patients receiving outpatient chemotherapy who experience a fever. Fever is a common symptom in patients receiving chemotherapy, but the approach to evaluation of fever is not standardized. We conducted a search for existing guidelines and a systematic review of the primary literature from database inception to November 2015. Full-text reports and conference abstracts were considered for inclusion. The search focused on the following topics: the relationship between temperature and poor outcome; predictors for the development of febrile neutropenia (FN); the timing, location, and personnel involved in fever assessment; and the provision of information to patients receiving chemotherapy. Eight guidelines and 38 studies were included. None of the guidelines were directly relevant to the target population because they dealt primarily with the management of FN after diagnosis. The primary studies tended to include fever as one of many symptoms assessed in the setting of chemotherapy. Temperature level was a weak predictor of poor outcomes. We did not find validated prediction models for identifying patients at risk of FN among patients receiving chemotherapy. Several studies presented approaches to symptom management that included fever among the symptoms, but results were not mature enough to merit widespread adoption. Despite the frequency and risks of fever in the setting of chemotherapy, there is limited evidence to define who needs urgent assessment, where the assessment should be performed, and how quickly. Future research in this area is greatly needed to inform new models of care. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Follin, B.; Knox, L.
2018-03-01
Recent determination of the Hubble constant via Cepheid-calibrated supernovae by Riess et al. (2016) (R16) finds ~3σ tension with inferences based on cosmic microwave background temperature and polarization measurements from Planck. This tension could be an indication of inadequacies in the concordance ΛCDM model. Here we investigate the possibility that the discrepancy could instead be due to systematic bias or uncertainty in the Cepheid calibration step of the distance ladder measurement by R16. We consider variations in total-to-selective extinction of Cepheid flux as a function of line-of-sight, hidden structure in the period-luminosity relationship, and potentially different intrinsic colour distributions of Cepheids as a function of host galaxy. Considering all potential sources of error, our final determination of H0 = 73.3 ± 1.7 km/s/Mpc (not including systematic errors from the treatment of geometric distances or Type Ia Supernovae) shows remarkable robustness and agreement with R16. We conclude systematics from the modelling of Cepheid photometry, including Cepheid selection criteria, cannot explain the observed tension between Cepheid-variable and CMB-based inferences of the Hubble constant. Considering a `model-independent' approach to relating Cepheids in galaxies with known distances to Cepheids in galaxies hosting a Type Ia supernova and finding agreement with the R16 result, we conclude no generalization of the model relating anchor and host Cepheid magnitude measurements can introduce significant bias in the H0 inference.
NASA Astrophysics Data System (ADS)
Follin, B.; Knox, L.
2018-07-01
Recent determination of the Hubble constant via Cepheid-calibrated supernovae by Riess et al. finds ~3σ tension with inferences based on cosmic microwave background (CMB) temperature and polarization measurements from Planck. This tension could be an indication of inadequacies in the concordance Λ cold dark matter model. Here, we investigate the possibility that the discrepancy could instead be due to systematic bias or uncertainty in the Cepheid calibration step of the distance ladder measurement by Riess et al. We consider variations in total-to-selective extinction of Cepheid flux as a function of line of sight, hidden structure in the period-luminosity relationship, and potentially different intrinsic colour distributions of Cepheids as a function of host galaxy. Considering all potential sources of error, our final determination of H0 = 73.3 ± 1.7 km s⁻¹ Mpc⁻¹ (not including systematic errors from the treatment of geometric distances or Type Ia supernovae) shows remarkable robustness and agreement with Riess et al. We conclude systematics from the modelling of Cepheid photometry, including Cepheid selection criteria, cannot explain the observed tension between Cepheid-variable and CMB-based inferences of the Hubble constant. Considering a `model-independent' approach to relating Cepheids in galaxies with known distances to Cepheids in galaxies hosting a Type Ia supernova and finding agreement with the Riess et al. result, we conclude no generalization of the model relating anchor and host Cepheid magnitude measurements can introduce significant bias in the H0 inference.
Living systematic review: 1. Introduction-the why, what, when, and how.
Elliott, Julian H; Synnot, Anneliese; Turner, Tari; Simmonds, Mark; Akl, Elie A; McDonald, Steve; Salanti, Georgia; Meerpohl, Joerg; MacLehose, Harriet; Hilton, John; Tovey, David; Shemilt, Ian; Thomas, James
2017-11-01
Systematic reviews are difficult to keep up to date, but failure to do so leads to a decay in review currency, accuracy, and utility. We are developing a novel approach to systematic review updating termed "Living systematic review" (LSR): systematic reviews that are continually updated, incorporating relevant new evidence as it becomes available. LSRs may be particularly important in fields where research evidence is emerging rapidly, current evidence is uncertain, and new research may change policy or practice decisions. We hypothesize that a continual approach to updating will achieve greater currency and validity, and increase the benefits to end users, with feasible resource requirements over time. Copyright © 2017 Elsevier Inc. All rights reserved.
Bayesian component separation: The Planck experience
NASA Astrophysics Data System (ADS)
Wehus, Ingunn Kathrine; Eriksen, Hans Kristian
2018-05-01
Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.
Designing perturbative metamaterials from discrete models.
Matlack, Kathryn H; Serra-Garcia, Marc; Palermo, Antonio; Huber, Sebastian D; Daraio, Chiara
2018-04-01
Identifying material geometries that lead to metamaterials with desired functionalities presents a challenge for the field. Discrete, or reduced-order, models provide a concise description of complex phenomena, such as negative refraction, or topological surface states; therefore, the combination of geometric building blocks to replicate discrete models presenting the desired features represents a promising approach. However, there is no reliable way to solve such an inverse problem. Here, we introduce 'perturbative metamaterials', a class of metamaterials consisting of weakly interacting unit cells. The weak interaction allows us to associate each element of the discrete model with individual geometric features of the metamaterial, thereby enabling a systematic design process. We demonstrate our approach by designing two-dimensional elastic metamaterials that realize Veselago lenses, zero-dispersion bands and topological surface phonons. While our selected examples are within the mechanical domain, the same design principle can be applied to acoustic, thermal and photonic metamaterials composed of weakly interacting unit cells.
Improving effectiveness of systematic conservation planning with density data.
Veloz, Samuel; Salas, Leonardo; Altman, Bob; Alexander, John; Jongsomjit, Dennis; Elliott, Nathan; Ballard, Grant
2015-08-01
Systematic conservation planning aims to design networks of protected areas that meet conservation goals across large landscapes. The optimal design of these conservation networks is most frequently based on the modeled habitat suitability or probability of occurrence of species, despite evidence that model predictions may not be highly correlated with species density. We hypothesized that conservation networks designed using species density distributions more efficiently conserve populations of all species considered than networks designed using probability of occurrence models. To test this hypothesis, we used the Zonation conservation prioritization algorithm to evaluate conservation network designs based on probability of occurrence versus density models for 26 land bird species in the U.S. Pacific Northwest. We assessed the efficacy of each conservation network based on predicted species densities and predicted species diversity. High-density model Zonation rankings protected more individuals per species when networks protected the highest priority 10-40% of the landscape. Compared with density-based models, the occurrence-based models protected more individuals in the lowest 50% priority areas of the landscape. The 2 approaches conserved species diversity in similar ways: predicted diversity was higher in higher priority locations in both conservation networks. We conclude that both density and probability of occurrence models can be useful for setting conservation priorities but that density-based models are best suited for identifying the highest priority areas. Developing methods to aggregate species count data from unrelated monitoring efforts and making these data widely available through ecoinformatics portals such as the Avian Knowledge Network will enable species count data to be more widely incorporated into systematic conservation planning efforts. © 2015, Society for Conservation Biology.
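A toy version of the comparison described above is sketched below (it does not use the Zonation software): landscape cells are ranked either by predicted density or by predicted occurrence probability, and the fraction of all individuals captured in the top-ranked portion of the landscape is reported; all values are simulated.

```python
# Toy comparison (not Zonation): rank landscape cells by predicted density or by
# predicted probability of occurrence, then ask what fraction of all individuals
# falls inside the top-ranked portion of the landscape.
import numpy as np

rng = np.random.default_rng(7)
n_cells = 10_000
true_density = rng.gamma(shape=0.5, scale=2.0, size=n_cells)        # individuals per cell
pred_density = true_density * rng.lognormal(0.0, 0.3, n_cells)      # density model
pred_occurrence = 1 - np.exp(-0.2 * true_density * rng.lognormal(0.0, 0.8, n_cells))

def protected_fraction(scores, top_frac):
    k = int(top_frac * n_cells)
    idx = np.argsort(scores)[::-1][:k]          # highest-priority cells
    return true_density[idx].sum() / true_density.sum()

for frac in (0.1, 0.2, 0.4):
    print(f"top {frac:.0%}: density-ranked {protected_fraction(pred_density, frac):.2f}, "
          f"occurrence-ranked {protected_fraction(pred_occurrence, frac):.2f}")
```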
Reliable inference of light curve parameters in the presence of systematics
NASA Astrophysics Data System (ADS)
Gibson, Neale P.
2016-10-01
Time-series photometry and spectroscopy of transiting exoplanets allow us to study their atmospheres. Unfortunately, the required precision to extract atmospheric information surpasses the design specifications of most general purpose instrumentation. This results in instrumental systematics in the light curves that are typically larger than the target precision. Systematics must therefore be modelled, leaving the inference of light-curve parameters conditioned on the subjective choice of systematics models and model-selection criteria. Here, I briefly review the use of systematics models commonly used for transmission and emission spectroscopy, including model selection, marginalisation over models, and stochastic processes. These form a hierarchy of models with increasing degree of objectivity. I argue that marginalisation over many systematics models is a minimal requirement for robust inference. Stochastic models provide even more flexibility and objectivity, and therefore produce the most reliable results. However, no systematics models are perfect, and the best strategy is to compare multiple methods and repeat observations where possible.
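As a minimal illustration of marginalising a light-curve parameter over a family of systematics models, the sketch below fits a box-shaped transit plus polynomial baselines of increasing order and combines the depth estimates with BIC-based weights; the review also covers Gaussian-process systematics models, which are not shown, and the data are synthetic.

```python
# Toy marginalisation over systematics models: polynomial baselines of increasing
# order are treated as candidate models, and the transit-depth estimates are
# combined with BIC-derived weights.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(-0.1, 0.1, 400)
in_transit = np.abs(t) < 0.03
true_depth = 0.01
flux = 1.0 - true_depth * in_transit + 0.002 * t + 0.0005 * rng.normal(size=t.size)

depths, bics = [], []
for order in range(4):                                  # candidate systematics models
    # design matrix: polynomial baseline of this order plus a transit box
    X = np.column_stack([t ** k for k in range(order + 1)] + [in_transit.astype(float)])
    beta = np.linalg.lstsq(X, flux, rcond=None)[0]
    resid = flux - X @ beta
    n, k = t.size, X.shape[1]
    bics.append(n * np.log(np.mean(resid ** 2)) + k * np.log(n))
    depths.append(-beta[-1])                            # depth estimate under this model

w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()
print("marginalised transit depth:", float(np.dot(w, depths)))
```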
Airflow and Particle Transport Through Human Airways: A Systematic Review
NASA Astrophysics Data System (ADS)
Kharat, S. B.; Deoghare, A. B.; Pandey, K. M.
2017-08-01
This paper reviews the relevant literature on two-phase analysis of air and particle flow through human airways. Emphasis is placed on elaborating the steps involved in two-phase analysis, namely geometric modelling methods and mathematical models. The first two parts describe the various approaches that are followed for constructing an airway model upon which analyses are conducted. Two broad categories of geometric modelling, viz. simplified modelling and accurate modelling using medical scans, are discussed briefly. The ease and limitations of simplified models are discussed, followed by examples of CT-based models. The later part of the review briefly describes the different mathematical models implemented by researchers for analysis. Mathematical models used for the air and particle phases are elaborated separately.
Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.
2012-01-01
Background: The ability to identify the risk factors related to an adverse condition, e.g., heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge and data driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003–2010. We demonstrate that the proposed method can identify complementary risk factors that are not in the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting onset of HF using the performance metric, the Area Under the ROC Curve (AUC). The combined risk factors between knowledge and data significantly outperform knowledge-based risk factors alone. Furthermore, those additional risk factors are confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge and data driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting onset of HF, where our approach can successfully identify intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
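The sketch below conveys the flavour of combining known and candidate risk factors in a sparse model; it is not the paper's exact formulation. It emulates a weighted L1 penalty with an off-the-shelf solver by rescaling the columns of known (guideline-based) risk factors so they are penalized less than data-driven candidates; the data and feature labels are placeholders.

```python
# Hedged sketch, not the paper's model: L1-regularised logistic regression in which
# known risk-factor columns are rescaled so they incur a weaker penalty than
# candidate data-driven factors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n, p = 2000, 40
X = rng.normal(size=(n, p))
known = np.zeros(p, dtype=bool)
known[:5] = True                                   # first 5 columns: known risk factors
logit = X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 10]   # column 10 is a "new" factor hidden in the data
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

penalty_weight = np.where(known, 0.2, 1.0)         # weaker penalty on known factors
X_scaled = X / penalty_weight                      # column rescaling emulates a weighted L1 penalty

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_scaled, y)
coef = model.coef_.ravel() / penalty_weight        # map coefficients back to the original scale
print("selected risk-factor columns:", np.flatnonzero(coef != 0))
```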
NASA Astrophysics Data System (ADS)
Berendsen, Herman J. C.
2004-06-01
The simulation of physical systems requires a simplified, hierarchical approach which models each level from the atomistic to the macroscopic scale. From quantum mechanics to fluid dynamics, this book systematically treats the broad scope of computer modeling and simulations, describing the fundamental theory behind each level of approximation. Berendsen evaluates each stage in relation to its applications, giving the reader insight into the possibilities and limitations of the models. Practical guidance for applications and sample programs in Python are provided. With a strong emphasis on molecular models in chemistry and biochemistry, this book will be suitable for advanced undergraduate and graduate courses on molecular modeling and simulation within physics, biophysics, physical chemistry and materials science. It will also be a useful reference to all those working in the field. Additional resources for this title, including solutions for instructors and programs, are available online at www.cambridge.org/9780521835275. It is the first book to cover the wide range of modeling and simulations, from the atomistic to the macroscopic scale, in a systematic fashion. Providing a wealth of background material, it does not assume advanced knowledge and is eminently suitable for course use. It contains practical examples and sample programs in Python.
Hmielowski, Jay D; Wang, Meredith Y; Donaway, Rebecca R
2018-04-25
This article attempts to connect literatures from the Risk Information Seeking and Processing (RISP) model and cultural cognition theory. We do this by assessing the relationship between the two prominent cultural cognition variables (i.e., group and grid) and risk perceptions. We then examine whether these risk perceptions are associated with three outcomes important to the RISP model: information seeking, systematic processing, and heuristic processing, through a serial mediation model. We used 2015 data collected from 10 communities across the United States to test our hypotheses. Our results show that people high on group and low on grid (egalitarian communitarians) show greater risk perceptions regarding water quality issues. Moreover, these higher levels of perceived risk translate into increased information seeking, systematic processing of information, and lower heuristic processing through intervening variables from the RISP model (e.g., negative emotions and information insufficiency). These results extend the extant literature by expanding on the treatment of political ideology within the RISP model literature and taking a more nuanced approach to political beliefs in accordance with the cultural cognitions literature. Our article also expands on the RISP literature by looking at information-processing variables. © 2018 Society for Risk Analysis.
Bayesian model selection applied to artificial neural networks used for water resources modeling
NASA Astrophysics Data System (ADS)
Kingston, Greer B.; Maier, Holger R.; Lambert, Martin F.
2008-04-01
Artificial neural networks (ANNs) have proven to be extremely valuable tools in the field of water resources engineering. However, one of the most difficult tasks in developing an ANN is determining the optimum level of complexity required to model a given problem, as there is no formal systematic model selection method. This paper presents a Bayesian model selection (BMS) method for ANNs that provides an objective approach for comparing models of varying complexity in order to select the most appropriate ANN structure. The approach uses Markov Chain Monte Carlo posterior simulations to estimate the evidence in favor of competing models and, in this study, three known methods for doing this are compared in terms of their suitability for being incorporated into the proposed BMS framework for ANNs. However, it is acknowledged that it can be particularly difficult to accurately estimate the evidence of ANN models. Therefore, the proposed BMS approach for ANNs incorporates a further check of the evidence results by inspecting the marginal posterior distributions of the hidden-to-output layer weights, which unambiguously indicate any redundancies in the hidden layer nodes. The fact that this check is available is one of the greatest advantages of the proposed approach over conventional model selection methods, which do not provide such a test and instead rely on the modeler's subjective choice of selection criterion. The advantages of a total Bayesian approach to ANN development, including training and model selection, are demonstrated on two synthetic and one real world water resources case study.
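As a crude stand-in for the evidence-based comparison described above (the paper estimates the evidence by MCMC), the sketch below scores ANN structures of increasing hidden-layer size with a BIC-style criterion built from training error and parameter count; the data are synthetic and the score is only a rough proxy for the Bayesian evidence.

```python
# Crude stand-in for Bayesian model selection of ANN structure: compare hidden-layer
# sizes with a BIC-style score (training error plus a parameter-count penalty).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(400, 3))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

for n_hidden in (1, 2, 4, 8, 16):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000,
                       random_state=0).fit(X, y)
    resid = y - net.predict(X)
    n_params = sum(w.size for w in net.coefs_) + sum(b.size for b in net.intercepts_)
    bic = X.shape[0] * np.log(np.mean(resid ** 2)) + n_params * np.log(X.shape[0])
    print(f"{n_hidden:2d} hidden nodes: BIC-like score = {bic:.1f}")
```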
Attitude Accessibility as a Function of Emotionality.
Rocklage, Matthew D; Fazio, Russell H
2018-04-01
Despite the centrality of both attitude accessibility and attitude basis to the last 30 years of theoretical and empirical work concerning attitudes, little work has systematically investigated their relation. The research that does exist provides conflicting results and is not at all conclusive given the methodology that has been used. The current research uses recent advances in statistical modeling and attitude measurement to provide the most systematic examination of the relation between attitude accessibility and basis to date. Specifically, we use mixed-effects modeling, which accounts for variation across individuals and attitude objects, in conjunction with the Evaluative Lexicon (EL), a linguistic approach that allows for the simultaneous measurement of an attitude's valence, extremity, and emotionality. We demonstrate across four studies, over 10,000 attitudes, and nearly 50 attitude objects that attitudes based on emotion tend to be more accessible in memory, particularly if the attitude is positive.
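A minimal mixed-effects sketch in the spirit of the analysis described above is given below; it is not the authors' code. It regresses a simulated accessibility measure on an EL-style emotionality rating with random intercepts for participants; the published models also account for variation across attitude objects, which would require a richer (crossed) random-effects specification.

```python
# Minimal mixed-effects sketch: accessibility ~ emotionality with random intercepts
# per participant. Data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_part, n_obj = 100, 20
part_int = rng.normal(0, 0.5, n_part)                  # participant-level intercepts
rows = []
for p in range(n_part):
    for o in range(n_obj):
        emot = rng.uniform(0, 9)                       # EL-style emotionality rating
        access = 0.3 * emot + part_int[p] + rng.normal(0, 1)
        rows.append({"participant": p, "object": o,
                     "emotionality": emot, "accessibility": access})
df = pd.DataFrame(rows)

result = smf.mixedlm("accessibility ~ emotionality", df, groups="participant").fit()
print(result.summary())
```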
Fluorescence decay data analysis correcting for detector pulse pile-up at very high count rates
NASA Astrophysics Data System (ADS)
Patting, Matthias; Reisch, Paja; Sackrow, Marcus; Dowler, Rhys; Koenig, Marcelle; Wahl, Michael
2018-03-01
Using time-correlated single photon counting for the purpose of fluorescence lifetime measurements is usually limited in speed due to pile-up. With modern instrumentation, this limitation can be lifted significantly, but some artifacts due to frequent merging of closely spaced detector pulses (detector pulse pile-up) remain an issue to be addressed. We propose a data analysis method correcting for this type of artifact and the resulting systematic errors. It physically models the photon losses due to detector pulse pile-up and incorporates the loss in the decay fit model employed to obtain fluorescence lifetimes and relative amplitudes of the decay components. Comparison of results with and without this correction shows a significant reduction of systematic errors at count rates approaching the excitation rate. This allows quantitatively accurate fluorescence lifetime imaging at very high frame rates.
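The sketch below is only schematic and does not reproduce the authors' physical loss model: it folds an illustrative count-rate-dependent loss factor into a single-exponential decay before fitting, to show how pile-up losses can be incorporated directly into the fit model; all parameter values are invented.

```python
# Schematic only: a single-exponential decay multiplied by an illustrative loss
# factor, so that photons arriving shortly after an earlier photon are partly lost,
# fitted to simulated count data.
import numpy as np
from scipy.optimize import curve_fit

def decay_with_pileup(t, amp, tau, rate):
    ideal = amp * np.exp(-t / tau)
    cum_frac = 1.0 - np.exp(-t / tau)            # fraction of photons arriving before t
    loss = np.exp(-rate * cum_frac)              # schematic pulse-merging loss factor
    return ideal * loss

rng = np.random.default_rng(4)
t = np.linspace(0, 20, 400)                      # time in ns
truth = decay_with_pileup(t, amp=1000.0, tau=3.0, rate=0.8)
counts = rng.poisson(truth).astype(float)

popt, _ = curve_fit(decay_with_pileup, t, counts, p0=[800.0, 2.0, 0.5])
print("fitted amplitude, lifetime (ns), loss parameter:", popt)
```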
Direct-to-consumer-advertising of prescription medicines: a theoretical approach to understanding.
Harker, Michael; Harker, Debra
2007-01-01
The pharmaceutical industry is a leader in research and development investment. New treatments need to be communicated to the market, and consumers are increasingly interested in learning about new drugs. Direct to consumer advertising of prescription medicines (DTCA) is a controversial practice where many of the arguments for and against are not supported by strong evidence. This paper aims to contribute to a research agenda that is forming in this area. The paper reports on a systematic review that was conducted and applies accepted theoretical models to the DTCA context. The systematic review methodology is widely accepted in the medical sector and is successfully applied here in the marketing field. The hierarchy of effects model is specifically applied to DTCA with a clear emphasis on consumer rights, empowerment, protection and knowledge. This paper provides healthcare practitioners with insight into how consumers process DTCA messages and provides guidance into how to assist in this message processing.
Systematic study of source mask optimization and verification flows
NASA Astrophysics Data System (ADS)
Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi
2012-06-01
Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flow and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand the commonality and difference, and provide a generic guideline for RET selection via SMO. The paper discusses three different types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF and MEEF is examined.
Douglas, Joy W; Lawrence, Jeannine C; Turner, Lori W
2017-01-01
Dementia is a progressive, debilitating disease that often results in weight loss, malnutrition, and dehydration. Feeding tubes are often prescribed; however, this practice can lead to complications. The purpose of this systematic review was to examine the use of feeding tubes in elderly demented patients from a social ecological perspective. Results indicated that family members often receive inadequate decision-making education. Many health care professionals lack knowledge of evidence-based guidelines pertaining to feeding tube use. Organizational and financial reimbursement structures influence feeding tube use. Feeding practices for patients with advanced dementia is a complex issue, warranting approaches that target each level of the Social Ecological Model.
Using ICT techniques for improving mechatronic systems' dependability
NASA Astrophysics Data System (ADS)
Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe
2013-10-01
The use of analysis techniques such as simulation and formal verification for the analysis of industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require high investment in skilled human resources with sufficient theoretical knowledge of those domains. This paper aims mainly to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and discusses the main difficulties that can be encountered and a possible way to handle them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform, which has nowadays become a common delivery model for many applications because SaaS is typically accessed by users via the internet.
Modeling startle eyeblink electromyogram to assess fear learning.
Khemka, Saurabh; Tzovara, Athina; Gerster, Samuel; Quednow, Boris B; Bach, Dominik R
2017-02-01
Pavlovian fear conditioning is widely used as a laboratory model of associative learning in human and nonhuman species. In this model, an organism is trained to predict an aversive unconditioned stimulus from initially neutral events (conditioned stimuli, CS). In humans, fear memory is typically measured via conditioned autonomic responses or fear-potentiated startle. For the latter, various analysis approaches have been developed, but a systematic comparison of competing methodologies is lacking. Here, we investigate the suitability of a model-based approach to startle eyeblink analysis for assessment of fear memory, and compare this to extant analysis strategies. First, we build a psychophysiological model (PsPM) on a generic startle response. Then, we optimize and validate this PsPM on three independent fear-conditioning data sets. We demonstrate that our model can robustly distinguish aversive (CS+) from nonaversive stimuli (CS-, i.e., has high predictive validity). Importantly, our model-based approach captures fear-potentiated startle during fear retention as well as fear acquisition. Our results establish a PsPM-based approach to assessment of fear-potentiated startle, and qualify previous peak-scoring methods. Our proposed model represents a generic startle response and can potentially be used beyond fear conditioning, for example, to quantify affective startle modulation or prepulse inhibition of the acoustic startle response. © 2016 The Authors. Psychophysiology published by Wiley Periodicals, Inc. on behalf of Society for Psychophysiological Research.
Jiang, Rui ; Yang, Hua ; Zhou, Linqi ; Kuo, C.-C. Jay ; Sun, Fengzhu ; Chen, Ting
2007-01-01
The increasing demand for the identification of genetic variation responsible for common diseases has translated into a need for sophisticated methods for effectively prioritizing mutations occurring in disease-associated genetic regions. In this article, we prioritize candidate nonsynonymous single-nucleotide polymorphisms (nsSNPs) through a bioinformatics approach that takes advantage of a set of improved numeric features derived from protein-sequence information and a new statistical learning model called “multiple selection rule voting” (MSRV). The sequence-based features can maximize the scope of applications of our approach, and the MSRV model can capture subtle characteristics of individual mutations. Systematic validation demonstrates that this approach is capable of prioritizing causal mutations for both simple monogenic diseases and complex polygenic diseases. Further studies of familial Alzheimer disease and diabetes show that the approach can enrich mutations underlying these polygenic diseases among the top-ranked candidate mutations. Application of this approach to unclassified mutations suggests that there are 10 suspicious mutations likely to cause diseases, and there is strong support for this in the literature. PMID:17668383
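The MSRV algorithm itself is not specified in the abstract; the sketch below is only a generic illustration of rule voting, assuming shallow decision stumps trained on bootstrap samples of the sequence-derived features stand in for the "selection rules", with candidates ranked by the fraction of rules voting "disease-causing".

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_rule_ensemble(X, y, n_rules=200, seed=0):
    # Train many shallow "selection rules" (depth-1 trees) on bootstrap samples
    # of sequence-derived numeric features.
    rng = np.random.default_rng(seed)
    rules = []
    for _ in range(n_rules):
        idx = rng.integers(0, len(X), len(X))          # bootstrap resample
        rules.append(DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx]))
    return rules

def prioritize(rules, X_candidates):
    # Rank candidate nsSNPs by the fraction of rules voting class 1 (deleterious).
    votes = np.mean([r.predict(X_candidates) for r in rules], axis=0)
    return np.argsort(-votes)                           # indices, most suspicious first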
Modeling a terminology-based electronic nursing record system: an object-oriented approach.
Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo
2007-10-01
The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system - which was one of the components of an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language (UML) were used as the development process and the modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.
Systematic Dissemination of Research and Development Program Improvement Efforts.
ERIC Educational Resources Information Center
Sanders, Carol S.
A systematic approach to dissemination of vocational education research and development program improvement efforts is comprehensive, effective, and efficient. Systematic dissemination is a prerequisite link to assessing the impact of research and development--for program improvement to occur, successful dissemination is crucial. A systematic approach…
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Distributed parameter modeling of repeated truss structures
NASA Technical Reports Server (NTRS)
Wang, Han-Ching
1994-01-01
A new approach to finding homogeneous models for beam-like repeated flexible structures is proposed, which conceptually involves two steps. The first step involves the approximation of the 3-D non-homogeneous model by a 1-D periodic beam model. The structure is modeled as a 3-D non-homogeneous continuum. The displacement field is approximated by a Taylor series expansion. Then, the cross sectional mass and stiffness matrices are obtained by energy equivalence using their additive properties. Due to the repeated nature of the flexible bodies, the mass and stiffness matrices are also periodic. This procedure is systematic and requires less dynamics detail. The second step involves the homogenization of the 1-D periodic beam model into a 1-D homogeneous beam model. The periodic beam model is homogenized into an equivalent homogeneous beam model using the additive property of compliance along the generic axis. The major departure from previous approaches in the literature is using compliance instead of stiffness in homogenization. An obvious justification is that the stiffness is additive at each cross section but not along the generic axis. The homogenized model preserves many properties of the original periodic model.
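A worked equation (a textbook illustration, not taken from the paper) may clarify why compliance rather than stiffness is the additive quantity along the generic axis: segments of a repeating cell act in series, so for the axial stiffness EA of a cell of length L built from segments of lengths \ell_k,

\[
  \frac{L}{(EA)_{\mathrm{eff}}} \;=\; \sum_{k}\frac{\ell_k}{(EA)_k}
  \qquad\Longrightarrow\qquad
  (EA)_{\mathrm{eff}} \;=\; \frac{L}{\sum_{k}\ell_k/(EA)_k},
  \qquad L=\sum_{k}\ell_k .
\]

The homogenized stiffness is thus a harmonic-type mean of the segment stiffnesses; stiffnesses themselves add only across a cross section (in parallel), which is the justification quoted above.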
The Emergence of Systematic Review in Toxicology.
Stephens, Martin L; Betts, Kellyn; Beck, Nancy B; Cogliano, Vincent; Dickersin, Kay; Fitzpatrick, Suzanne; Freeman, James; Gray, George; Hartung, Thomas; McPartland, Jennifer; Rooney, Andrew A; Scherer, Roberta W; Verloo, Didier; Hoffmann, Sebastian
2016-07-01
The Evidence-based Toxicology Collaboration hosted a workshop on "The Emergence of Systematic Review and Related Evidence-based Approaches in Toxicology," on November 21, 2014 in Baltimore, Maryland. The workshop featured speakers from agencies and organizations applying systematic review approaches to questions in toxicology, speakers with experience in conducting systematic reviews in medicine and healthcare, and stakeholders in industry, government, academia, and non-governmental organizations. Based on the workshop presentations and discussion, here we address the state of systematic review methods in toxicology, historical antecedents in both medicine and toxicology, challenges to the translation of systematic review from medicine to toxicology, and thoughts on the way forward. We conclude with a recommendation that as various agencies and organizations adapt systematic review methods, they continue to work together to ensure that there is a harmonized process for how the basic elements of systematic review methods are applied in toxicology. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
Modeling Zone-3 Protection with Generic Relay Models for Dynamic Contingency Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Qiuhua; Vyakaranam, Bharat GNVSR; Diao, Ruisheng
This paper presents a cohesive approach for calculating and coordinating the settings of multiple zone-3 protections for dynamic contingency analysis. The zone-3 protections are represented by generic distance relay models. A two-step approach for determining zone-3 relay settings is proposed. The first step is to calculate the settings, particularly the reach, of each zone-3 relay individually by iteratively running line open-end fault short circuit analysis; the blinder is also employed and properly set to meet the industry standard under extreme loading conditions. The second step is to systematically coordinate the protection settings of the zone-3 relays. The main objective of this coordination step is to address over-reaching issues. We have developed a tool to automate the proposed approach and generate the settings of all distance relays in a PSS/E dyr format file. The calculated zone-3 settings have been tested on a modified IEEE 300 system using a dynamic contingency analysis tool (DCAT).
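A highly simplified sketch of the two-step workflow, with run_open_end_fault() and apparent_impedance() as hypothetical stand-ins for a real short-circuit engine (e.g., driven through a PSS/E API); the object attributes and the coordination rule are assumptions for illustration, not the tool described in the paper.

def compute_zone3_reach(line, margin=1.25, max_iter=20):
    # Step 1: iteratively enlarge the reach until the relay covers an open-end
    # fault at the remote end of the worst adjacent line (hypothetical helpers).
    reach = line.impedance
    for _ in range(max_iter):
        z_seen = max(apparent_impedance(run_open_end_fault(adj), line.relay)
                     for adj in line.adjacent_lines)
        new_reach = margin * z_seen
        if abs(new_reach - reach) < 1e-3:
            break
        reach = new_reach
    return reach

def coordinate(relays):
    # Step 2: trim any reach that extends beyond the next relay's zone-1 cover,
    # a simplified placeholder for the paper's over-reach coordination rules.
    for r in relays:
        limit = min(n.line_impedance + n.zone1_reach for n in r.downstream_relays)
        r.zone3_reach = min(r.zone3_reach, limit)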
A Goal Oriented Approach for Modeling and Analyzing Security Trade-Offs
NASA Astrophysics Data System (ADS)
Elahi, Golnaz; Yu, Eric
In designing software systems, security is typically only one design objective among many. It may compete with other objectives such as functionality, usability, and performance. Too often, security mechanisms such as firewalls, access control, or encryption are adopted without explicit recognition of competing design objectives and their origins in stakeholder interests. Recently, there is increasing acknowledgement that security is ultimately about trade-offs. One can only aim for "good enough" security, given the competing demands from many parties. In this paper, we examine how conceptual modeling can provide explicit and systematic support for analyzing security trade-offs. After considering the desirable criteria for conceptual modeling methods, we examine several existing approaches for dealing with security trade-offs. From analyzing the limitations of existing methods, we propose an extension to the i* framework for security trade-off analysis, taking advantage of its multi-agent and goal orientation. The method was applied to several case studies used to exemplify existing approaches.
Hermenau, Katharin; Goessmann, Katharina; Rygaard, Niels Peter; Landolt, Markus A; Hecker, Tobias
2017-12-01
Quality of child care has been shown to have a crucial impact on children's development and psychological adjustment, particularly for orphans with a history of maltreatment and trauma. However, adequate care for orphans is often impacted by unfavorable caregiver-child ratios and poorly trained, overburdened personnel, especially in institutional care in countries with limited resources and large numbers of orphans. This systematic review investigated the effects of structural interventions and caregiver trainings on child development in institutional environments. The 24 intervention studies included in this systematic review reported beneficial effects on the children's emotional, social, and cognitive development. Yet, few studies focused on effects of interventions on the child-caregiver relationship or the general institutional environment. Moreover, our review revealed that interventions aimed at improving institutional care settings have largely neglected violence and abuse prevention. Unfortunately, our findings are partially limited by constraints of study design and methodology. In sum, this systematic review sheds light on obstacles and possibilities for the improvement in institutional care. There must be greater efforts at preventing violence, abuse, and neglect of children living in institutional care. Therefore, we advocate for combining attachment theory-based models with maltreatment prevention approaches and then testing them using rigorous scientific standards. By using approaches grounded in the evidence, it could be possible to enable more children to grow up in supportive and nonviolent environments.
Neuropsychology in Finland - over 30 years of systematically trained clinical practice.
Hokkanen, Laura; Nybo, Taina; Poutiainen, Erja
2016-11-01
The aim of this invited paper for a special issue of international practice in The Clinical Neuropsychologist is to provide information on training models, clinical practice, and professional issues within neuropsychology in Finland. Relevant information was gathered via literature searches, a survey by the Neuropsychology Working Group of the Finnish Psychological Association, archives of the Finnish Neuropsychological Society, and personal communication with professionals in Finland. The roots of Finnish neuropsychology are linked to the early German tradition of experimental psychology. Since the 1970s, it has been strongly influenced by both the psychometric approach in the U.S. and the qualitative approach of Luria. A systematic specialization training program began in Finland in 1983. It was first organized by the Finnish Neuropsychological Society and, since 1997, by Finnish universities. At present, around 260 neuropsychologists have completed this training. According to the survey by the Finnish Psychological Association in 2014, 67% of Finnish neuropsychologists work in the public sector, 36% in the private sector, and 28% reported that they had private practice. Work includes assessments for 90% of the respondents, rehabilitation for 74%, and many are involved in teaching and research. Of the respondents, 20% worked with both adults and children, 44% with adults only, and 36% with children only. Within test development, pediatric neuropsychology is an especially prominent field. A unique blend of approaches and a solid systematic training tradition has led to a strong position of neuropsychologists as distinguished experts in the Finnish health care system.
Yoshimura, Humberto N; Chimanski, Afonso; Cesar, Paulo F
2015-10-01
Ceramic composites are promising materials for dental restorations. However, it is difficult to prepare highly translucent composites due to the light scattering that occurs in multiphase ceramics. The objective of this work was to verify the effectiveness of a systematic approach in designing specific glass compositions with target properties in order to prepare glass infiltrated ceramic composites with high translucency. First, the viscosity of the glass at the infiltration temperature was calculated from literature data using the SciGlass software. Then, a glass composition was designed for the targeted viscosity and refractive index. The glass of the system SiO2-B2O3-Al2O3-La2O3-TiO2, prepared by melting the oxide raw materials, was spontaneously infiltrated into porous alumina preforms at 1200°C. The optical properties were evaluated using a refractometer and a spectrophotometer. The absorption and scattering coefficients were calculated using the Kubelka-Munk model. The light transmittance of the prepared composite was significantly higher than that of a commercial ceramic-glass composite, due to the matching of the glass and preform refractive indexes, which decreased scattering, and also to the decrease in the absorption coefficient. The proposed systematic approach was efficient for the development of glass infiltrated ceramic composites with high translucency, whose benefits include better aesthetic performance of the final prosthesis. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
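For reference, a minimal sketch of the Kubelka-Munk relation mentioned above, for an optically thick, diffusely reflecting sample; separating the absorption coefficient K from the scattering coefficient S additionally requires transmittance or two-thickness measurements, which are omitted here.

import numpy as np

def kubelka_munk_ratio(reflectance):
    # K/S = (1 - R)^2 / (2 R): ratio of absorption to scattering coefficients
    # inferred from the diffuse reflectance R of an optically thick layer.
    r = np.asarray(reflectance, dtype=float)
    return (1.0 - r) ** 2 / (2.0 * r)

# Example: 60% diffuse reflectance at a given wavelength
print(kubelka_munk_ratio(0.60))   # ~0.133, i.e. low absorption relative to scattering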
Commonality and Variability Analysis for Xenon Family of Separation Virtual Machine Monitors (CVAX)
2017-07-18
The technical approach is a systematic application of Software Product Line Engineering (SPLE). A systematic application requires describing the family and engineering the software family of separation virtual machine monitors provided by the evolving open-source Xen hypervisor.
Spatial dynamics of ecosystem service flows: a comprehensive approach to quantifying actual services
Bagstad, Kenneth J.; Johnson, Gary W.; Voigt, Brian; Villa, Ferdinando
2013-01-01
Recent ecosystem services research has highlighted the importance of spatial connectivity between ecosystems and their beneficiaries. Despite this need, a systematic approach to ecosystem service flow quantification has not yet emerged. In this article, we present such an approach, which we formalize as a class of agent-based models termed “Service Path Attribution Networks” (SPANs). These models, developed as part of the Artificial Intelligence for Ecosystem Services (ARIES) project, expand on ecosystem services classification terminology introduced by other authors. Conceptual elements needed to support flow modeling include a service's rivalness, its flow routing type (e.g., through hydrologic or transportation networks, lines of sight, or other approaches), and whether the benefit is supplied by an ecosystem's provision of a beneficial flow to people or by absorption of a detrimental flow before it reaches them. We describe our implementation of the SPAN framework for five ecosystem services and discuss how to generalize the approach to additional services. SPAN model outputs include maps of ecosystem service provision, use, depletion, and flows under theoretical, possible, actual, inaccessible, and blocked conditions. We highlight how these different ecosystem service flow maps could be used to support various types of decision making for conservation and resource management planning.
Ensembles vs. information theory: supporting science under uncertainty
NASA Astrophysics Data System (ADS)
Nearing, Grey S.; Gupta, Hoshin V.
2018-05-01
Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.
Anticipatory nausea and vomiting due to chemotherapy.
Kamen, Charles; Tejani, Mohamedtaki A; Chandwani, Kavita; Janelsins, Michelle; Peoples, Anita R; Roscoe, Joseph A; Morrow, Gary R
2014-01-05
As a specific variation of chemotherapy-induced nausea and vomiting, anticipatory nausea and vomiting (ANV) appears particularly linked to psychological processes. The three predominant factors related to ANV are classical conditioning; demographic and treatment-related factors; and anxiety or negative expectancies. Laboratory models have provided some support for these underlying mechanisms for ANV. ANV may be treated with medical or pharmacological interventions, including benzodiazepines and other psychotropic medications. However, behavioral treatments, including systematic desensitization, remain first line options for addressing ANV. Some complementary treatment approaches have shown promise in reducing ANV symptoms. Additional research into these approaches is needed. This review will address the underlying models of ANV and provide a discussion of these various treatment options. © 2013 Published by Elsevier B.V.
Nuclear physics: Macroscopic aspects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiatecki, W.J.
1993-12-01
A systematic macroscopic, leptodermous approach to nuclear statics and dynamics is described, based formally on the assumptions ℏ → 0 and b/R << 1, where b is the surface diffuseness and R the nuclear radius. The resulting static model of shell-corrected nuclear binding energies and deformabilities is accurate to better than 1 part in a thousand and yields a firm determination of the principal properties of the nuclear fluid. As regards dynamics, the above approach suggests that nuclear shape evolutions will often be dominated by dissipation, but quantitative comparisons with experimental data are more difficult than in the case of statics. In its simplest liquid drop version the model exhibits interesting formal connections to the classic astronomical problem of rotating gravitating masses.
System approach to modeling of industrial technologies
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, E. S.
2018-03-01
The authors presented a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies. The structure follows a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place as the technological process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part. These methods pay particular attention to energy transformations in the technological process. Applying the system will allow the approach to improving technologies and obtaining new technical solutions to be systematized.
Comparing multiple imputation methods for systematically missing subject-level data.
Kline, David; Andridge, Rebecca; Kaizar, Eloise
2017-06-01
When conducting research synthesis, the studies to be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur at the observation level (time-varying) or the subject level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogeneous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.
Optimal Elastomeric Scaffold Leaflet Shape for Pulmonary Heart Valve Leaflet Replacement
Fan, Rong; Bayoumi, Ahmed S.; Chen, Peter; Hobson, Christopher M.; Wagner, William R.; Mayer, John E.; Sacks, Michael S.
2012-01-01
Surgical replacement of the pulmonary valve (PV) is a common treatment option for congenital pulmonary valve defects. Engineered tissue approaches to develop novel PV replacements are intrinsically complex, and will require methodical approaches for their development. Single leaflet replacement utilizing an ovine model is an attractive approach in that candidate materials can be evaluated under valve level stresses in blood contact without the confounding effects of a particular valve design. In the present study an approach for optimal leaflet shape design based on finite element (FE) simulation of a mechanically anisotropic, elastomeric scaffold for PV replacement is presented. The scaffold was modeled as an orthotropic hyperelastic material using a generalized Fung-type constitutive model. The optimal shape of the fully loaded PV replacement leaflet was systematically determined by minimizing the difference between the deformed shape obtained from FE simulation and an ex-vivo microCT scan of a native ovine PV leaflet. Effects of material anisotropy, dimensional changes of PV root, and fiber orientation on the resulting leaflet deformation were investigated. In-situ validation demonstrated that the approach could guide the design of the leaflet shape for PV replacement surgery. PMID:23294966
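A schematic of the shape-optimization loop implied above, assuming a hypothetical run_fe_simulation() wrapper around the FE solver and spline-type design parameters; it is a sketch of the general idea, not the authors' pipeline.

import numpy as np
from scipy.optimize import minimize

def shape_mismatch(design_params, target_points):
    # Objective: RMS distance between the FE-predicted deformed leaflet surface
    # and the microCT-derived native leaflet surface (run_fe_simulation is hypothetical).
    deformed_points = run_fe_simulation(design_params)
    return np.sqrt(np.mean(np.sum((deformed_points - target_points) ** 2, axis=1)))

# x0: initial unloaded leaflet shape parameters (e.g., spline control points)
# result = minimize(shape_mismatch, x0, args=(microct_points,), method="Nelder-Mead")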
Lichtenberg, Peter A.; Stoltman, Jonathan; Ficker, Lisa J.; Iris, Madelyn; Mast, Benjamin
2014-01-01
Financial exploitation and financial capacity issues often overlap when a gerontologist assesses whether an older adult’s financial decision is an autonomous, capable choice. Our goal is to describe a new conceptual model for assessing financial decisions using principles of person-centered approaches and to introduce a new instrument, the Lichtenberg Financial Decision Rating Scale (LFDRS). We created a conceptual model, convened meetings of experts from various disciplines to critique the model and provide input on content and structure, and select final items. We then videotaped administration of the LFDRS to five older adults and had 10 experts provide independent ratings. The LFDRS demonstrated good to excellent inter-rater agreement. The LFDRS is a new tool that allows gerontologists to systematically gather information about a specific financial decision and the decisional abilities in question. PMID:25866438
Mathematical modeling and computational prediction of cancer drug resistance.
Sun, Xiaoqiang; Hu, Bin
2017-06-23
Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
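As a toy example of the ODE class of mechanism-based models listed above, the sketch below integrates a two-population model of drug-sensitive and resistant tumor cells under therapy; the model structure and all parameter values are illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp

def tumor_ode(t, y, r_s, r_r, k_kill, mu):
    # S: drug-sensitive cells (grow at r_s, killed at k_kill under therapy,
    # acquire resistance at rate mu); R: resistant cells (grow at r_r).
    S, R = y
    dS = r_s * S - k_kill * S - mu * S
    dR = r_r * R + mu * S
    return [dS, dR]

sol = solve_ivp(tumor_ode, (0, 60), [1e6, 1e2],
                args=(0.05, 0.04, 0.12, 1e-4), dense_output=True)
t = np.linspace(0, 60, 200)
S, R = sol.sol(t)
print(f"resistant fraction at day 60: {R[-1] / (S[-1] + R[-1]):.2%}")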
Planning in Education--A Systematic Approach
ERIC Educational Resources Information Center
Barker, L. J.
1977-01-01
Presents a case for and poses a procedure including techniques for a systematic approach to planning in education as a means of improving efficiency and effectiveness. Available from: Australian College of Education, 916 Swanston Street, Carlton, Victoria 3053, Australia, $2.50 single copy. (Author/MLF)
A Systematic Approach to Terminal Training.
ERIC Educational Resources Information Center
Sheffield, John
1980-01-01
Describes the systematic approach used by the training department of the Canada Trust Company to develop a training program for operators of the new terminals for the online banking system to which the bank was converting, the Canadian On-Line Financial Information System (COFIS). (JD)
NASA Astrophysics Data System (ADS)
Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph
2016-12-01
Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase; thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. Therefore this paper takes a novel approach to systematically mining and organizing existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.
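A small sketch of how such a semi-quantitative network can be queried, assuming a toy fragment in which edge weights count supporting studies (all numbers invented) and weighted degree serves as a crude importance proxy.

import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("temperature", "photosynthesis", 42),
    ("precipitation", "soil water", 35),
    ("soil water", "photosynthesis", 18),
    ("temperature", "grain filling", 21),
    ("photosynthesis", "biomass", 50),
    ("biomass", "yield", 60),
    ("grain filling", "yield", 33),
])

# Rank nodes by the total weight of incident edges (in + out) as a simple
# stand-in for the paper's relative-importance assignment.
strength = dict(G.degree(weight="weight"))
for node, score in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{node:15s} {score}")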
Ortho-geriatric service--a literature review comparing different models.
Kammerlander, C; Roth, T; Friedman, S M; Suhm, N; Luger, T J; Kammerlander-Knauer, U; Krappinger, D; Blauth, M
2010-12-01
In the fast-growing geriatric population, we are confronted with both osteoporosis, which makes fixation of fractures more and more challenging, and several comorbidities, which are most likely to cause postoperative complications. Several models of shared care for these patients are described, and the goal of our systematic literature research was to point out the differences between the individual models. A systematic electronic database search was performed, identifying articles that evaluate elderly hip fracture patients with a multidisciplinary approach involving at least a geriatrician and an orthopedic surgeon focused on in-hospital treatment. The different investigations were categorized into four groups defined by the type of intervention. The main outcome parameters were pooled across the studies and weighted by sample size. Out of 656 potentially relevant citations, 21 could be extracted and categorized into four groups. Regarding the main outcome parameters, the group with integrated care showed the lowest in-hospital mortality rate (1.14%), the lowest length of stay (7.39 days), and the lowest mean time to surgery (1.43 days). No clear statement could be found for the medical complication rates and the activities of daily living due to their inhomogeneity when comparing the models. The review of these investigations cannot tell us the best model, but there is a trend toward more recent models using an integrated approach. Integrated care combines all the positive features reported in the various investigations, such as integration of a geriatrician in the trauma unit, having a multidisciplinary team, prioritizing geriatric fracture patients, and developing guidelines for the patients' treatment. Each hospital implementing a special model for geriatric hip fracture patients should collect detailed data about the patients, process of care, and outcomes to be able to participate in audit processes and avoid peerlessness.
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in an integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). The two approaches can complement each other, yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, which is why analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
A Report on the Achievements of Subgroup 19 on Activation Cross Sections of the WPEC, OECD-NEA
NASA Astrophysics Data System (ADS)
Plompen, A. J. M.; Smith, D. L.; Semkova, V. M.
2005-05-01
Subgroup 19 on Activation Cross Sections of the Working Party on Evaluation Cooperation of the OECD-NEA has recently concluded its activities. The goal of the subgroup was to generate a large set of new measured activation cross sections relevant to nuclear applications and make these data available to the nuclear science community. In addition, modeling efforts and sensitivity studies were undertaken to evaluate the use of measured data and model calculations for the prediction of unknown cross sections. The latter addresses the potential of model calculations to satisfy issues on the High Priority Request List when no measured data are available. In the course of the activities of the subgroup, over ninety reaction channels were studied experimentally. All except the most recent of these data have been compiled into EXFOR format by members of the subgroup and submitted to the OECD-NEA. As a result, most of these data can now be retrieved online from the data centers. A systematic comparison was made with the new evaluated data files JEF3.0/EAF2003, JENDL3.3, and ENDF/B-VI.8 and with the current status of the global parameter systematics of the model code TALYS-0.57. In addition, a considerable number of locally optimized parameter sets were developed. Both the global and local approaches emphasize the use of consistent physics modelling for all important reaction channels and nuclides involved in the decay. Comparison of the two approaches allows assessment of the effort required when model estimates to a certain accuracy must be made. Parameter sensitivity studies were undertaken to further assess the accuracy requirements on model parameters if target uncertainties for the cross sections have been specified. The value of this approach has been demonstrated and indicates the need for model codes that fit all available experimental data in order to connect the data covariance with the covariance of the model predictions. The work of the subgroup was a joint effort between IRMM, ANL, FZ-Jülich, INRNE Sofia, NIPNE Bucharest, IEP and Atomki Debrecen, JAERI, and Tohoku University. A comprehensive review of the subgroup activities will be presented.
The layered learning practice model: Lessons learned from implementation.
Pinelli, Nicole R; Eckel, Stephen F; Vu, Maihan B; Weinberger, Morris; Roth, Mary T
2016-12-15
Pharmacists' views about the implementation, benefits, and attributes of a layered learning practice model (LLPM) were examined. Eligible and willing attending pharmacists at the same institution that had implemented an LLPM completed an individual, 90-minute, face-to-face interview using a structured interview guide developed by the interdisciplinary study team. Interviews were digitally recorded and transcribed verbatim without personal identifiers. Three researchers independently reviewed preliminary findings to reach consensus on emerging themes. In cases where thematic coding diverged, the researchers discussed their analyses until consensus was reached. Of 25 eligible attending pharmacists, 24 (96%) agreed to participate. The sample was drawn from both acute and ambulatory care practice settings and all clinical specialty areas. Attending pharmacists described several experiences implementing the LLPM and perceived benefits of the model. Attending pharmacists identified seven key attributes for hospital and health-system pharmacy departments that are needed to design and implement effective LLPMs: shared leadership, a systematic approach, good communication, flexibility for attending pharmacists, adequate resources, commitment, and evaluation. Participants also highlighted several potential challenges and obstacles for organizations to consider before implementing an LLPM. According to attending pharmacists involved in an LLPM, successful implementation of an LLPM required shared leadership, a systematic approach, communication, flexibility, resources, commitment, and a process for evaluation. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
A systematic approach to embedded biomedical decision making.
Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver
2012-11-01
Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems; therefore, they have to work reliably. This paper describes how we applied systems engineering principles to design a high performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specifications refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm which enables us to integrate automated decision making in embedded systems. With a formal model we establish the functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
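A desktop-scale sketch of the same pattern — an SVM classifier whose scoring is split across parallel workers — using scikit-learn and joblib on a public dataset; this is an illustrative analogue, not the formally verified XMOS implementation described in the paper.

import numpy as np
from joblib import Parallel, delayed
from sklearn import datasets, svm

X, y = datasets.load_breast_cancer(return_X_y=True)
clf = svm.SVC(kernel="rbf", gamma="scale").fit(X[:400], y[:400])

# Split the incoming records into batches and classify them in parallel worker
# processes; adding workers gives close to linear speedup for large batches.
batches = np.array_split(X[400:], 4)
decisions = np.concatenate(Parallel(n_jobs=4)(delayed(clf.predict)(b) for b in batches))
print("accuracy:", np.mean(decisions == y[400:]))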
Springer, Janice; Casey-Lockyer, Mary
2016-12-01
From the time of Clara Barton, Red Cross nursing has had a key role in the care and support of persons affected by disasters in the United States. Hurricane Katrina and other events brought to light the need for a shelter model that was inclusive of the whole community, including persons with disabilities, at-risk and vulnerable populations, and children. From an intake process to a nursing model for assessment, an evidence-guided process informed a systematic approach for a registered nurse-led model of care. Copyright © 2016 Elsevier Inc. All rights reserved.
Transforming the urban food desert from the grassroots up: a model for community change.
Lewis, LaVonna Blair; Galloway-Gilliam, Lark; Flynn, Gwendolyn; Nomachi, Jonathan; Keener, LaTonya Chavis; Sloane, David C
2011-01-01
Confronted by continuing health disparities in vulnerable communities, Community Health Councils (CHC), a nonprofit community-based organization in South Los Angeles, worked with the African Americans Building a Legacy of Health Coalition and research partners to develop a community change model to address the root causes of health disparities within the community's African American population. This article discusses how the CHC Model's development and application led to public policy interventions in a "food desert." The CHC Model provided a systematic approach to engaging impacted communities in support of societal level reforms, with the goal to influence health outcomes.
NASA Astrophysics Data System (ADS)
Chouika, N.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.
2018-05-01
A systematic approach for the model building of Generalized Parton Distributions (GPDs), based on their overlap representation within the DGLAP kinematic region and a further covariant extension to the ERBL one, is applied to the valence-quark pion's case, using light-front wave functions inspired by the Nakanishi representation of the pion Bethe-Salpeter amplitudes (BSA). This simple but fruitful pion GPD model illustrates the general model building technique and, in addition, allows for the ambiguities related to the covariant extension, grounded on the Double Distribution (DD) representation, to be constrained by requiring a soft-pion theorem to be properly observed.
Towards Semantic Modelling of Business Processes for Networked Enterprises
NASA Astrophysics Data System (ADS)
Furdík, Karol; Mach, Marián; Sabol, Tomáš
The paper presents an approach to the semantic modelling and annotation of business processes and information resources, as it was designed within the FP7 ICT EU project SPIKE to support creation and maintenance of short-term business alliances and networked enterprises. A methodology for the development of the resource ontology, as a shareable knowledge model for semantic description of business processes, is proposed. Systematically collected user requirements, conceptual models implied by the selected implementation platform as well as available ontology resources and standards are employed in the ontology creation. The process of semantic annotation is described and illustrated using an example taken from a real application case.
Timoshenko, J.; Shivhare, A.; Scott, R. W.; ...
2016-06-30
We adopted ab-initio X-ray Absorption Near Edge Structure (XANES) modelling for structural refinement of local environments around metal impurities in a large variety of materials. Our method enables both direct modelling, where the candidate structures are known, and inverse modelling, where the unknown structural motifs are deciphered from the experimental spectra. We also present estimates of systematic errors and their influence on the stability and accuracy of the obtained results. We illustrate our approach by following the evolution of the local environment of palladium atoms in palladium-doped gold thiolate clusters upon chemical and thermal treatments.
Morell, Jonathan A
2018-06-01
This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. Following will be a discussion of various issues that are relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating, settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
Designing a model for trauma system management using public health approach: the case of Iran.
Tarighi, Payam; Tabibi, Seyed Jamaledin; Motevalian, Seyed Abbas; Tofighi, Shahram; Maleki, Mohammad Reza; Delgoshaei, Bahram; Panahi, Farzad; Masoomi, Gholam Reza
2012-01-01
Trauma is a leading cause of death and disability around the world. Injuries are responsible for about six million deaths annually, of which ninety percent occur in developing countries. In Iran, injuries are the most common cause of death among age groups below fifty. Trauma system development is a systematic and comprehensive approach to injury prevention and treatment whose effectiveness has been proved. The present study aims at designing a trauma system management model as the first step toward trauma system establishment in Iran. In this qualitative research, a conceptual framework was developed based on the public health approach and three well-known trauma system models. We used Benchmarks, Indicators and Scoring (BIS) to analyze the current situation of Iran's trauma care system. The trauma system management model was then designed using the policy development phase of the public health approach. The model, validated by a panel of experts, describes the lead agency, trauma system plan, policy-making councils, and data-based control according to the four main functions of management: leading, planning, organizing and controlling. This model may be implemented in two phases: the exclusive phase, focusing on resource integration, and the inclusive phase, which concentrates on system development. The model could facilitate the development of a trauma system in Iran through pilot studies as the assurance phase of the public health approach. Furthermore, the model can provide a practical framework for trauma system management at the international level.
Chetty, Mersha; Kenworthy, James J; Langham, Sue; Walker, Andrew; Dunlop, William C N
2017-02-24
Opioid dependence is a chronic condition with substantial health, economic and social costs. The study objective was to conduct a systematic review of published health-economic models of opioid agonist therapy for non-prescription opioid dependence, to review the different modelling approaches identified, and to inform future modelling studies. Literature searches were conducted in March 2015 in eight electronic databases, supplemented by hand-searching reference lists and searches on six National Health Technology Assessment Agency websites. Studies were included if they: investigated populations that were dependent on non-prescription opioids and were receiving opioid agonist or maintenance therapy; compared any pharmacological maintenance intervention with any other maintenance regimen (including placebo or no treatment); and were health-economic models of any type. A total of 18 unique models were included. These used a range of modelling approaches, including Markov models (n = 4), decision tree with Monte Carlo simulations (n = 3), decision analysis (n = 3), dynamic transmission models (n = 3), decision tree (n = 1), cohort simulation (n = 1), Bayesian (n = 1), and Monte Carlo simulations (n = 2). Time horizons ranged from 6 months to lifetime. The most common evaluation was cost-utility analysis reporting cost per quality-adjusted life-year (n = 11), followed by cost-effectiveness analysis (n = 4), budget-impact analysis/cost comparison (n = 2) and cost-benefit analysis (n = 1). Most studies took the healthcare provider's perspective. Only a few models included some wider societal costs, such as productivity loss or costs of drug-related crime, disorder and antisocial behaviour. Costs to individuals and impacts on family and social networks were not included in any model. A relatively small number of studies of varying quality were found. Strengths and weaknesses relating to model structure, inputs and approach were identified across all the studies. There was no indication of a single standard emerging as a preferred approach. Most studies omitted societal costs, an important issue since the implications of drug abuse extend widely beyond healthcare services. Nevertheless, elements from previous models could together form a framework for future economic evaluations in opioid agonist therapy including all relevant costs and outcomes. This could more adequately support decision-making and policy development for treatment of non-prescription opioid dependence.
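As a minimal illustration of the most common approach found in the review (a Markov state-transition model evaluated by cost-utility analysis), the sketch below runs a three-state cohort over a ten-year horizon; the states, transition probabilities, costs and utilities are invented for illustration and are not taken from any included study.

import numpy as np

P = np.array([[0.90, 0.08, 0.02],   # from "on treatment": stay, relapse, die
              [0.20, 0.75, 0.05],   # from "relapse": re-enter treatment, stay, die
              [0.00, 0.00, 1.00]])  # "dead" is absorbing
monthly_cost = np.array([300.0, 150.0, 0.0])          # cost per state per cycle
monthly_qaly = np.array([0.80, 0.55, 0.0]) / 12.0     # utility accrued per cycle

cohort = np.array([1.0, 0.0, 0.0])                    # whole cohort starts on treatment
total_cost = total_qaly = 0.0
for cycle in range(120):                              # 120 monthly cycles, no discounting
    total_cost += cohort @ monthly_cost
    total_qaly += cohort @ monthly_qaly
    cohort = cohort @ P                               # advance the state distribution

print(f"expected cost per person: {total_cost:,.0f}   QALYs: {total_qaly:.2f}")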
Systematic approach to verification and validation: High explosive burn models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph; Scovel, Christina A.
2012-04-16
Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.
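A hypothetical illustration of the automation pattern described above — reading key experimental parameters from a data-file header, emitting a hydro-code input deck, and plotting simulation against gauge data. The header format, keys, and input-deck syntax are invented and do not reflect the actual HED or LANL tooling.

import json
import matplotlib.pyplot as plt

def build_input_deck(data_file, deck_file="run.inp"):
    # Assume the first line of the data file is a JSON metadata header, e.g.
    # # {"explosive": "PBX-9502", "initial_density": 1.89, "gauge_positions": [0.5, 1.0]}
    with open(data_file) as f:
        header = json.loads(f.readline().lstrip("# "))
    with open(deck_file, "w") as out:
        out.write(f"explosive {header['explosive']}\n")
        out.write(f"density   {header['initial_density']}\n")
        out.write("gauges    " + ",".join(map(str, header["gauge_positions"])) + "\n")
    return header

def plot_comparison(header, sim_t, sim_v, exp_t, exp_v):
    # Overlay simulated and measured velocity-gauge records for one experiment.
    plt.plot(exp_t, exp_v, ".", label="experiment")
    plt.plot(sim_t, sim_v, "-", label="simulation")
    plt.xlabel("time"); plt.ylabel("particle velocity"); plt.legend()
    plt.savefig(str(header.get("experiment_id", "comparison")) + ".png")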
Eric J. Gustafson; Brian R. Miranda; Arjan M.G. De Bruijn; Brian R. Sturtevant; Mark E. Kubiske
2017-01-01
Forest landscape models (FLM) are increasingly used to project the effects of climate change on forested landscapes, yet most use phenomenological approaches with untested assumptions about future forest dynamics. We used a FLM that relies on first principles to mechanistically simulate growth (LANDIS-II with PnET-Succession) to systematically explore how landscapes...
A Systematic Approach to the Study of Accelerated weathering of Building Joint Sealants
Christopher C. White; Donald L. Hunston; Kar Tean Tan; James J. Filliben; Adam L. Pintar; Greg Schueneman
2012-01-01
An accurate service life prediction model is needed for building joint sealants in order to greatly reduce the time to market of a new product and reduce the risk of introducing a poorly performing product into the marketplace. A stepping stone to the success of this effort is the precise control of environmental variables in a laboratory accelerated test apparatus in...
ERIC Educational Resources Information Center
Christal, Melodie E., Ed.
Practitioner papers and research papers on higher education planning and budgeting are presented. "Before the Roof Caves In: A Predictive Model for Physical Plant Renewal" by Frederick M. Biedenweg and Robert E. Hutson outlines a systematic approach that was used at Stanford University to predict the associated costs of physical plant…
ERIC Educational Resources Information Center
Hilton, Jason Theodore
2016-01-01
As emerging technology continues to enter the social studies classroom, teachers need to approach integration of such technology in a systematic manner to ensure that such technology enhances the learning of their students. Currently, scholars of technology integration advocate for the use of one of two different models, either SAMR or TPACK. This…
Patel, Sapana R; Margolies, Paul J; Covell, Nancy H; Lipscomb, Cristine; Dixon, Lisa B
2018-01-01
Implementation science lacks a systematic approach to the development of learning strategies for online training in evidence-based practices (EBPs) that takes the context of real-world practice into account. The field of instructional design offers ecologically valid and systematic processes to develop learning strategies for workforce development and performance support. This report describes the application of an instructional design framework-Analyze, Design, Develop, Implement, and Evaluate (ADDIE) model-in the development and evaluation of e-learning modules as one strategy among a multifaceted approach to the implementation of individual placement and support (IPS), a model of supported employment for community behavioral health treatment programs, in New York State. We applied quantitative and qualitative methods to develop and evaluate three IPS e-learning modules. Throughout the ADDIE process, we conducted formative and summative evaluations and identified determinants of implementation using the Consolidated Framework for Implementation Research (CFIR). Formative evaluations consisted of qualitative feedback received from recipients and providers during early pilot work. The summative evaluation consisted of levels 1 and 2 (reaction to the training, self-reported knowledge, and practice change) quantitative and qualitative data and was guided by the Kirkpatrick model for training evaluation. Formative evaluation with key stakeholders identified a range of learning needs that informed the development of a pilot training program in IPS. Feedback on this pilot training program informed the design document of three e-learning modules on IPS: Introduction to IPS, IPS Job development, and Using the IPS Employment Resource Book . Each module was developed iteratively and provided an assessment of learning needs that informed successive modules. All modules were disseminated and evaluated through a learning management system. Summative evaluation revealed that learners rated the modules positively, and self-report of knowledge acquisition was high (mean range: 4.4-4.6 out of 5). About half of learners indicated that they would change their practice after watching the modules (range: 48-51%). All learners who completed the level 1 evaluation demonstrated 80% or better mastery of knowledge on the level 2 evaluation embedded in each module. The CFIR was used to identify implementation barriers and facilitators among the evaluation data which facilitated planning for subsequent implementation support activities in the IPS initiative. Instructional design approaches such as ADDIE may offer implementation scientists and practitioners a flexible and systematic approach for the development of e-learning modules as a single component or one strategy in a multifaceted approach for training in EBPs.
Dimitrov, Borislav D; Motterlini, Nicola; Fahey, Tom
2015-01-01
Objective Estimating calibration performance of clinical prediction rules (CPRs) in systematic reviews of validation studies is not possible when predicted values are neither published nor accessible or sufficient or no individual participant or patient data are available. Our aims were to describe a simplified approach for outcomes prediction and calibration assessment and evaluate its functionality and validity. Study design and methods: Methodological study of systematic reviews of validation studies of CPRs: a) ABCD2 rule for prediction of 7 day stroke; and b) CRB-65 rule for prediction of 30 day mortality. Predicted outcomes in a sample validation study were computed by CPR distribution patterns (“derivation model”). As confirmation, a logistic regression model (with derivation study coefficients) was applied to CPR-based dummy variables in the validation study. Meta-analysis of validation studies provided pooled estimates of “predicted:observed” risk ratios (RRs), 95% confidence intervals (CIs), and indexes of heterogeneity (I2) on forest plots (fixed and random effects models), with and without adjustment of intercepts. The above approach was also applied to the CRB-65 rule. Results Our simplified method, applied to ABCD2 rule in three risk strata (low, 0–3; intermediate, 4–5; high, 6–7 points), indicated that predictions are identical to those computed by univariate, CPR-based logistic regression model. Discrimination was good (c-statistics =0.61–0.82), however, calibration in some studies was low. In such cases with miscalibration, the under-prediction (RRs =0.73–0.91, 95% CIs 0.41–1.48) could be further corrected by intercept adjustment to account for incidence differences. An improvement of both heterogeneities and P-values (Hosmer-Lemeshow goodness-of-fit test) was observed. Better calibration and improved pooled RRs (0.90–1.06), with narrower 95% CIs (0.57–1.41) were achieved. Conclusion Our results have an immediate clinical implication in situations when predicted outcomes in CPR validation studies are lacking or deficient by describing how such predictions can be obtained by everyone using the derivation study alone, without any need for highly specialized knowledge or sophisticated statistics. PMID:25931829
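The core of the simplified approach, applying derivation-study stratum risks to validation-study stratum counts and pooling predicted:observed risk ratios by inverse-variance weighting, can be sketched in a few lines of Python. The stratum risks, study counts and variance approximation below are illustrative assumptions, not figures from the ABCD2 or CRB-65 reviews.

```python
# Illustrative sketch of predicted:observed risk-ratio pooling across validation studies.
import numpy as np

derivation_risk = {"low": 0.012, "intermediate": 0.059, "high": 0.118}  # assumed stratum risks

def predicted_observed_rr(strata_counts, observed_events):
    """Log predicted:observed RR for one validation study, with an approximate variance."""
    predicted = sum(derivation_risk[s] * n for s, n in strata_counts.items())
    total_n = sum(strata_counts.values())
    rr = (predicted / total_n) / (observed_events / total_n)
    var_log_rr = 1.0 / observed_events - 1.0 / total_n   # treats predicted risk as fixed
    return np.log(rr), var_log_rr

validation_studies = [                                   # hypothetical stratum counts / events
    ({"low": 400, "intermediate": 250, "high": 90}, 41),
    ({"low": 900, "intermediate": 500, "high": 160}, 95),
    ({"low": 300, "intermediate": 220, "high": 70}, 28),
]

logs, variances = zip(*(predicted_observed_rr(c, e) for c, e in validation_studies))
w = 1.0 / np.array(variances)
pooled = np.sum(w * np.array(logs)) / np.sum(w)          # fixed-effect pooled log RR
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled predicted:observed RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se):.2f}-{np.exp(pooled + 1.96 * se):.2f})")
```

A pooled RR below 1 would correspond to the under-prediction described in the abstract, which the authors then address by adjusting the model intercept for incidence differences.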
Systematicity and a Categorical Theory of Cognitive Architecture: Universal Construction in Context
Phillips, Steven; Wilson, William H.
2016-01-01
Why does the capacity to think certain thoughts imply the capacity to think certain other, structurally related, thoughts? Despite decades of intensive debate, cognitive scientists have yet to reach a consensus on an explanation for this property of cognitive architecture—the basic processes and modes of composition that together afford cognitive capacity—called systematicity. Systematicity is generally considered to involve a capacity to represent/process common structural relations among the equivalently cognizable entities. However, the predominant theoretical approaches to the systematicity problem, i.e., classical (symbolic) and connectionist (subsymbolic), require arbitrary (ad hoc) assumptions to derive systematicity. That is, their core principles and assumptions do not provide the necessary and sufficient conditions from which systematicity follows, as required of a causal theory. Hence, these approaches fail to fully explain why systematicity is a (near) universal property of human cognition, albeit in restricted contexts. We review an alternative, category theory approach to the systematicity problem. As a mathematical theory of structure, category theory provides necessary and sufficient conditions for systematicity in the form of universal construction: each systematically related cognitive capacity is composed of a common component and a unique component. Moreover, every universal construction can be viewed as the optimal construction in the given context (category). From this view, universal constructions are derived from learning, as an optimization. The ultimate challenge, then, is to explain the determination of context. If context is a category, then a natural extension toward addressing this question is higher-order category theory, where categories themselves are the objects of construction. PMID:27524975
After-hours care and its coordination with primary care in the U.S.
O'Malley, Ann S; Samuel, Divya; Bond, Amelia M; Carrier, Emily
2012-11-01
Despite expectations that medical homes provide "24 × 7 coverage" there is little to guide primary care practices in developing sustainable models for accessible and coordinated after-hours care. To identify and describe models of after-hours care in the U.S. that are delivered in primary care sites or coordinated with a patient's usual primary care provider. Qualitative analysis of data from in-depth telephone interviews. Primary care practices in 16 states and the organizations they partner with to provide after-hours coverage. Forty-four primary care physicians, practice managers, nurses and health plan representatives from 28 organizations. Analyses examined after-hours care models, facilitators, barriers and lessons learned. Based on 28 organizations interviewed, five broad models of after-hours care were identified, ranging in the extent to which they provide continuity and patient access. Key themes included: 1) The feasibility of a model varies for many reasons, including patient preferences and needs, the local health care market supply, and financial compensation; 2) A shared electronic health record and systematic notification procedures were extremely helpful in maintaining information continuity between providers; and 3) after-hours care is best implemented as part of a larger practice approach to access and continuity. After-hours care coordinated with a patient's usual primary care provider is facilitated by consideration of patient demand, provider capacity, a shared electronic health record, systematic notification procedures and a broader practice approach to improving primary care access and continuity. Payer support is important to increasing patients' access to after-hours care.
Model-based synthesis of locally contingent responses to global market signals
NASA Astrophysics Data System (ADS)
Magliocca, N. R.
2015-12-01
Rural livelihoods and the land systems on which they depend are increasingly influenced by distant markets through economic globalization. Place-based analyses of land and livelihood system sustainability must then consider both proximate and distant influences on local decision-making. Thus, advancing land change theory in the context of economic globalization calls for a systematic understanding of the general processes as well as local contingencies shaping local responses to global signals. Synthesis of insights from place-based case studies of land and livelihood change is a path forward for developing such systematic knowledge. This paper introduces a model-based synthesis approach to investigating the influence of local socio-environmental and agent-level factors in mediating land-use and livelihood responses to changing global market signals. A generalized agent-based modeling framework is applied to six case-study sites that differ in environmental conditions, market access and influence, and livelihood settings. The largest modeled land conversions and livelihood transitions to market-oriented production occurred in sites with relatively productive agricultural land and/or with limited livelihood options. Experimental shifts in the distributions of agents' risk tolerances generally acted to attenuate or amplify responses to changes in global market signals. Importantly, however, responses of agents at different points in the risk tolerance distribution varied widely, with the wealth gap growing wider between agents with higher or lower risk tolerance. These results demonstrate that model-based synthesis is a promising approach for overcoming many of the challenges of current synthesis methods in land change science, and for identifying generalized as well as locally contingent responses to global market signals.
Zhao, Xiuli; Asante Antwi, Henry; Yiranbon, Ethel
2014-01-01
The idea of aggregating information is clearly recognizable in the daily lives of all entities whether as individuals or as a group, since time immemorial corporate organizations, governments, and individuals as economic agents aggregate information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. To do this there are varying methods ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and need to use solar energy in order to avoid future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, "least-cost," and stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards addressing optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of the sustainable energy issues in China, we have reviewed and classified the various models that existing studies have used to predict the influences of the weather influences and the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor.
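The exemplary ensemble model mentioned at the end, which combines the output of two forecasters with a dynamic weighting factor, can be illustrated with a toy Python sketch in which each model's weight is the inverse of its recent error. The synthetic data and window length are invented and stand in for real weather and solar-output forecasts.

```python
# Toy dynamic-weighting ensemble: weights follow recent inverse mean absolute error.
import numpy as np

rng = np.random.default_rng(0)
actual = 5.0 + np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.1 * rng.standard_normal(200)
model_a = actual + 0.3 * rng.standard_normal(200)         # stand-ins for two forecasters
model_b = actual + 0.15 + 0.2 * rng.standard_normal(200)

def dynamic_ensemble(f1, f2, y, window=10):
    """Combine two forecasts, re-weighting each step from recent errors."""
    combined = np.empty_like(y)
    for t in range(len(y)):
        lo = max(0, t - window)
        if t == 0:
            w1 = w2 = 0.5
        else:
            e1 = np.mean(np.abs(f1[lo:t] - y[lo:t])) + 1e-9
            e2 = np.mean(np.abs(f2[lo:t] - y[lo:t])) + 1e-9
            w1, w2 = (1 / e1) / (1 / e1 + 1 / e2), (1 / e2) / (1 / e1 + 1 / e2)
        combined[t] = w1 * f1[t] + w2 * f2[t]
    return combined

ens = dynamic_ensemble(model_a, model_b, actual)
for name, f in [("model A", model_a), ("model B", model_b), ("ensemble", ens)]:
    print(f"{name}: RMSE = {np.sqrt(np.mean((f - actual) ** 2)):.3f}")
```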
Habchi, Johnny; Chia, Sean; Limbocker, Ryan; Mannini, Benedetta; Ahn, Minkoo; Perni, Michele; Hansson, Oskar; Arosio, Paolo; Kumita, Janet R.; Challa, Pavan Kumar; Cohen, Samuel I. A.; Dobson, Christopher M.; Knowles, Tuomas P. J.; Vendruscolo, Michele
2017-01-01
The aggregation of the 42-residue form of the amyloid-β peptide (Aβ42) is a pivotal event in Alzheimer’s disease (AD). The use of chemical kinetics has recently enabled highly accurate quantifications of the effects of small molecules on specific microscopic steps in Aβ42 aggregation. Here, we exploit this approach to develop a rational drug discovery strategy against Aβ42 aggregation that uses as a read-out the changes in the nucleation and elongation rate constants caused by candidate small molecules. We thus identify a pool of compounds that target specific microscopic steps in Aβ42 aggregation. We then test further these small molecules in human cerebrospinal fluid and in a Caenorhabditis elegans model of AD. Our results show that this strategy represents a powerful approach to identify systematically small molecule lead compounds, thus offering an appealing opportunity to reduce the attrition problem in drug discovery. PMID:28011763
NASA Technical Reports Server (NTRS)
Bolin, B.
1984-01-01
The global biosphere is an exceedingly complex system. To gain an understanding of its structure and dynamic features, it is necessary to increase knowledge about the detailed processes, but also to develop models of how global interactions take place. Attempts to analyze the detailed physical, chemical and biological processes need, in this context, to be guided by an advancement of understanding of the latter. It is necessary to develop a strategy of data gathering that serves both these purposes simultaneously. Climate research during the last decade may serve as a useful example of how to approach this difficult problem in a systematic way. Large programs for data collection may easily become rigid and costly. While realizing the necessity of a systematic and long-lasting effort of observing the atmosphere, the oceans, land and life on Earth, such a program must remain flexible enough to permit the modifications and even sometimes improvisations that are necessary to maintain a viable program.
A Systematic Approach to Programmatic Assessment
ERIC Educational Resources Information Center
Moffit, Dani M.; Mansell, Jamie L.; Russ, Anne C.
2016-01-01
Context: Accrediting bodies and universities increasingly require evidence of student learning within courses and programs. Within athletic training, programmatic assessment has been a source of angst for program directors. While there are many ways to assess educational programs, this article introduces 1 systematic approach. Objective: This…
How effects on health equity are assessed in systematic reviews of interventions.
Welch, Vivian; Tugwell, Peter; Petticrew, Mark; de Montigny, Joanne; Ueffing, Erin; Kristjansson, Betsy; McGowan, Jessie; Benkhalti Jandu, Maria; Wells, George A; Brand, Kevin; Smylie, Janet
2010-12-08
Enhancing health equity has now achieved international political importance with endorsement from the World Health Assembly in 2009. The failure of systematic reviews to consider effects on health equity is cited by decision-makers as a limitation to their ability to inform policy and program decisions. To systematically review methods to assess effects on health equity in systematic reviews of effectiveness. We searched the following databases up to July 2, 2010: MEDLINE, PsycINFO, the Cochrane Methodology Register, CINAHL, Education Resources Information Center, Education Abstracts, Criminal Justice Abstracts, Index to Legal Periodicals, PAIS International, Social Services Abstracts, Sociological Abstracts, Digital Dissertations and the Health Technology Assessment Database. We searched SCOPUS on October 7, 2010 to identify articles that cited any of the included studies. We included empirical studies of cohorts of systematic reviews that assessed methods for measuring effects on health inequalities. Data were extracted using a pre-tested form by two independent reviewers. Risk of bias was appraised for included studies according to the potential for bias in selection and detection of systematic reviews. Thirty-four methodological studies were included. The methods used by these included studies were: 1) targeted approaches (n=22); 2) gap approaches (n=12); and 3) a gradient approach (n=1). Gender or sex was assessed in eight out of 34 studies, socioeconomic status in ten studies, race/ethnicity in seven studies, age in seven studies, low- and middle-income countries in 14 studies, and two studies assessed multiple factors across which health inequity may exist. Only three studies provided a definition of health equity. Four methodological approaches to assessing effects on health equity were identified: 1) descriptive assessment of reporting and analysis in systematic reviews (all 34 studies used a type of descriptive method); 2) descriptive assessment of reporting and analysis in original trials (12/34 studies); 3) analytic approaches (10/34 studies); and 4) applicability assessment (11/34 studies). Neither the analytic nor the applicability approaches were reported transparently or in sufficient detail to judge their credibility. There is a need for improvement in conceptual clarity about the definition of health equity, for describing analytic approaches (including subgroup analyses) in sufficient detail, and for transparent reporting of the judgments required for applicability assessments, in order to assess and report effects on health equity in systematic reviews.
Capturing security requirements for software systems.
El-Hadary, Hassan; El-Kassas, Sherif
2014-07-01
Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before the process moves further and rework is avoided. A more effective approach to security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. Its main goal is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We use evaluation criteria to assess the resulting security requirements, concentrating on the identification of conflicts among requirements. We show that such a methodology elicits more complete security requirements while offering developers a more systematic elicitation process.
Arguel, Amaël; Perez-Concha, Oscar; Li, Simon Y W; Lau, Annie Y S
2018-02-01
The aim of this review was to identify general theoretical frameworks used in online social network interventions for behavioral change. To address this research question, a PRISMA-compliant systematic review (PROSPERO registration number CRD42014007555) was conducted using three electronic databases (PsycINFO, PubMed, and Embase). Four reviewers screened 1788 abstracts, and 15 studies were selected according to the eligibility criteria. Randomized controlled trials and controlled studies were assessed using the Cochrane Collaboration's "risk-of-bias" tool, and findings were summarized by narrative synthesis. Five eligible articles used social cognitive theory as a framework to develop interventions targeting behavioral change. Other theoretical frameworks were related to the dynamics of social networks, intention models, and community engagement theories. Only one of the studies selected in the review mentioned a well-known theory from the field of health psychology. We conclude that guidelines are lacking for the design of online social network interventions for behavioral change. Existing theories and models from health psychology that are traditionally used for in situ behavioral change should be considered when designing online social network interventions in a health care setting. © 2016 John Wiley & Sons, Ltd.
Gravitational decoupling and the Picard-Lefschetz approach
NASA Astrophysics Data System (ADS)
Brown, Jon; Cole, Alex; Shiu, Gary; Cottrell, William
2018-01-01
In this work, we consider tunneling between nonmetastable states in gravitational theories. Such processes arise in various contexts, e.g., in inflationary scenarios where the inflaton potential involves multiple fields or multiple branches. They are also relevant for bubble wall nucleation in some cosmological settings. However, we show that the transition amplitudes computed using the Euclidean method generally do not approach the corresponding field theory limit as M_P → ∞. This implies that in the Euclidean framework, there is no systematic expansion in powers of G_N for such processes. Such considerations also carry over directly to no-boundary scenarios involving Hawking-Turok instantons. In this note, we illustrate this failure of decoupling in the Euclidean approach with a simple model of axion monodromy and then argue that the situation can be remedied with a Lorentzian prescription such as the Picard-Lefschetz theory. As a proof of concept, we illustrate with a simple model how tunneling transition amplitudes can be calculated using the Picard-Lefschetz approach.
Manual search approaches used by systematic reviewers in dermatology.
Vassar, Matt; Atakpo, Paul; Kash, Melissa J
2016-10-01
Manual searches are supplemental approaches to database searches to identify additional primary studies for systematic reviews. The authors argue that these manual approaches, in particular hand-searching and perusing reference lists, are often considered the same yet lead to different outcomes. We conducted a PubMed search for systematic reviews in the top 10 dermatology journals (January 2006-January 2016). After screening, the final sample comprised 292 reviews. Statements related to manual searches were extracted from each review and categorized by the primary and secondary authors. Each statement was categorized as either "Search of Reference List," "Hand Search," "Both," or "Unclear." Of the 292 systematic reviews included in our sample, 143 reviews (48.97%) did not report a hand-search or scan of reference lists. One-hundred thirty-six reviews (46.58%) reported searches of reference lists, while 4 reviews (1.37%) reported systematic hand-searches. Three reviews (1.03%) reported use of both hand-searches and scanning reference lists. Six reviews (2.05%) were classified as unclear due to vague wording. Authors of systematic reviews published in dermatology journals in our study sample scanned reference lists more frequently than they conducted hand-searches, possibly contributing to biased search outcomes. We encourage systematic reviewers to routinely practice hand-searching in order to minimize bias.
Dannan, Aous
2009-01-01
Background Evidence-based healthcare is not an easier approach to patient management, but it should provide both clinicians and patients with greater confidence and trust in their mutual relationship. The intellectual embrace of evidence-based methods, coupled with clinical expertise and consideration of the patient's individual uniqueness and requirements, is needed for all periodontal therapists if optimum care is the goal. One important element of evidence-based decision making in periodontology is the systematic review. Systematic reviews usually provide the periodontist with the highest level of evidence, which should be taken into consideration when constructing any treatment plan in the dental clinic. However, locating systematic reviews can be a time-consuming procedure that requires additional skills. Methods In this paper, a novel chair-side approach to facilitate the incorporation of systematic reviews into daily periodontal practice is presented. It is based on three simple tools, namely, a list of suitable periodontics-related key words, a data bank of all up-to-date published systematic reviews in periodontology, and hand-made paper sheets to match the key words with their related systematic review statements. Results and Conclusions A preliminary validation of this method indicated that it is simple to learn and apply. Keywords Chair-side; Evidence-based medicine; Periodontology; Systematic review PMID:22461868
Costa, Marcelle Barrueco; Melnik, Tamara
2016-01-01
Eating disorders are psychiatric conditions that originate from and are perpetuated by individual, family and sociocultural factors. The psychosocial approach to treatment and prevention of relapse is crucial. The aim was to present an overview of the scientific evidence on the effectiveness of psychosocial interventions in the treatment of eating disorders. All systematic reviews published by the Cochrane Database of Systematic Reviews (Cochrane Library) on the topic were included. Afterwards, starting from the date of the least recent of these reviews (2001), an additional search was conducted in PubMed with a sensitive search strategy and the same keywords. A total of 101 primary studies and 30 systematic reviews (5 Cochrane systematic reviews), meta-analyses, guidelines or narrative reviews of the literature were included. The main outcomes were symptomatic remission, body image, cognitive distortion, psychiatric comorbidity, psychosocial functioning and patient satisfaction. The cognitive behavioral approach was the most effective treatment, especially for bulimia nervosa, binge eating disorder and the night eating syndrome. For anorexia nervosa, the family approach showed greater effectiveness. Other effective approaches were interpersonal psychotherapy, dialectic behavioral therapy, support therapy and self-help manuals. Moreover, there was an increasing number of preventive and promotional approaches that addressed individual, family and social risk factors, which appear promising for the development of positive self-image and self-efficacy. Further studies are required to evaluate the impact of multidisciplinary approaches on all eating disorders, as well as the cost-effectiveness of some effective modalities, such as cognitive behavioral therapy. PMID:27462898
Waste in health information systems: a systematic review.
Awang Kalong, Nadia; Yusof, Maryati
2017-05-08
Purpose The purpose of this paper is to discuss a systematic review on waste identification related to health information systems (HIS) in Lean transformation. Design/methodology/approach A systematic review was conducted on 19 studies to evaluate Lean transformation and tools used to remove waste related to HIS in clinical settings. Findings Ten waste categories were identified, along with their relationships and applications of Lean tool types related to HIS. Different Lean tools were used at the early and final stages of Lean transformation; the tool selection depended on the waste characteristic. Nine studies reported a positive impact from Lean transformation in improving daily work processes. The selection of Lean tools should be made based on the timing, purpose and characteristics of waste to be removed. Research limitations/implications Overview of waste and its category within HIS and its analysis from socio-technical perspectives enabled the identification of its root cause in a holistic and rigorous manner. Practical implications Understanding waste types, their root cause and review of Lean tools could subsequently lead to the identification of mitigation approach to prevent future error occurrence. Originality/value Specific waste models for HIS settings are yet to be developed. Hence, the identification of the waste categories could guide future implementation of Lean transformations in HIS settings.
Jolani, Shahab
2018-03-01
In health and medical sciences, multiple imputation (MI) is now becoming popular to obtain valid inferences in the presence of missing data. However, MI of clustered data such as multicenter studies and individual participant data meta-analysis requires advanced imputation routines that preserve the hierarchical structure of data. In clustered data, a specific challenge is the presence of systematically missing data, when a variable is completely missing in some clusters, and sporadically missing data, when it is partly missing in some clusters. Unfortunately, little is known about how to perform MI when both types of missing data occur simultaneously. We develop a new class of hierarchical imputation approach based on chained equations methodology that simultaneously imputes systematically and sporadically missing data while allowing for arbitrary patterns of missingness among them. Here, we use a random effect imputation model and adopt a simplification over fully Bayesian techniques such as Gibbs sampler to directly obtain draws of parameters within each step of the chained equations. We justify through theoretical arguments and extensive simulation studies that the proposed imputation methodology has good statistical properties in terms of bias and coverage rates of parameter estimates. An illustration is given in a case study with eight individual participant datasets. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
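To make the chained-equations backbone concrete, here is a deliberately bare Python sketch in which two incomplete variables are imputed from each other in turn, with draws from the fitted regression. The hierarchical random-effect component that the authors use to handle systematically missing clusters is omitted for brevity, and the data are simulated.

```python
# Bare-bones chained-equations (MICE-style) loop on simulated data; no cluster effects.
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.normal(0, 1, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)
x_obs = np.where(rng.random(n) < 0.25, np.nan, x)   # sporadically missing
y_obs = np.where(rng.random(n) < 0.25, np.nan, y)

def draw_imputations(target, predictor, rng):
    """Regress target on predictor over complete rows, then draw the missing values."""
    miss = np.isnan(target)
    X = np.column_stack([np.ones(len(predictor)), predictor])
    beta, *_ = np.linalg.lstsq(X[~miss], target[~miss], rcond=None)
    sd = np.std(target[~miss] - X[~miss] @ beta)
    out = target.copy()
    out[miss] = X[miss] @ beta + rng.normal(0, sd, miss.sum())
    return out

# Initialize with mean fill, then cycle the chained equations.
x_imp = np.where(np.isnan(x_obs), np.nanmean(x_obs), x_obs)
y_imp = np.where(np.isnan(y_obs), np.nanmean(y_obs), y_obs)
for _ in range(10):
    x_imp = draw_imputations(x_obs, y_imp, rng)
    y_imp = draw_imputations(y_obs, x_imp, rng)
print("post-imputation means:", round(x_imp.mean(), 2), round(y_imp.mean(), 2))
```

In the published method, the regression inside each step is a random-effect model, which is what allows a variable that is completely missing in one cluster to borrow information from the other clusters.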
Quantum chemical approach to estimating the thermodynamics of metabolic reactions.
Jinich, Adrian; Rappoport, Dmitrij; Dunn, Ian; Sanchez-Lengeling, Benjamin; Olivares-Amaya, Roberto; Noor, Elad; Even, Arren Bar; Aspuru-Guzik, Alán
2014-11-12
Thermodynamics plays an increasingly important role in modeling and engineering metabolism. We present the first nonempirical computational method for estimating standard Gibbs reaction energies of metabolic reactions based on quantum chemistry, which can help fill in the gaps in the existing thermodynamic data. When applied to a test set of reactions from core metabolism, the quantum chemical approach is comparable in accuracy to group contribution methods for isomerization and group transfer reactions and for reactions not including multiply charged anions. The errors in standard Gibbs reaction energy estimates are correlated with the charges of the participating molecules. The quantum chemical approach is amenable to systematic improvements and holds potential for providing thermodynamic data for all of metabolism.
Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements
NASA Astrophysics Data System (ADS)
Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga
The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
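A stripped-down version of the estimation idea, functional size adjusted by a non-functional-requirements factor and regressed against effort, might look like the following Python sketch. The project history and adjustment factors are invented for illustration and are not from the reported case study.

```python
# Illustrative size-based effort estimation with an NFR adjustment factor.
import numpy as np

# (functional size, NFR adjustment factor, effort in person-hours), all hypothetical
history = np.array([
    [120, 1.05,  980],
    [300, 1.20, 2900],
    [ 80, 1.00,  590],
    [210, 1.10, 1850],
    [450, 1.30, 4700],
])
adjusted_size = history[:, 0] * history[:, 1]
effort = history[:, 2]

# Ordinary least squares: effort = a * adjusted_size + b
A = np.vstack([adjusted_size, np.ones_like(adjusted_size)]).T
(a, b), *_ = np.linalg.lstsq(A, effort, rcond=None)

new_size, new_nfr_factor = 260, 1.15      # hypothetical new project
print(f"estimated effort: {a * new_size * new_nfr_factor + b:.0f} person-hours")
```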
Critchlow, Simone; Hirst, Matthew; Akehurst, Ron; Phillips, Ceri; Philips, Zoe; Sullivan, Will; Dunlop, Will C N
2017-02-01
Complexities in the neuropathic-pain care pathway make the condition difficult to manage and difficult to capture in cost-effectiveness models. The aim of this study is to understand, through a systematic review of previous cost-effectiveness studies, some of the key strengths and limitations in data and modeling practices in neuropathic pain. Thus, the aim is to guide future research and practice to improve resource allocation decisions and encourage continued investment to find novel and effective treatments for patients with neuropathic pain. The search strategy was designed to identify peer-reviewed cost-effectiveness evaluations of non-surgical, pharmaceutical therapies for neuropathic pain published since January 2000, accessing five key databases. All identified publications were reviewed and screened according to pre-defined eligibility criteria. Data extraction was designed to reflect key data challenges and approaches to modeling in neuropathic pain and based on published guidelines. The search strategy identified 20 cost-effectiveness analyses meeting the inclusion criteria, of which 14 had original model structures. Cost-effectiveness modeling in neuropathic pain is established and increasing across multiple jurisdictions; however, amongst these studies, there is substantial variation in modeling approach, and there are common limitations. Capturing the effect of treatments upon health outcomes, particularly health-related quality-of-life, is challenging, and the health effects of multiple lines of ineffective treatment, common for patients with neuropathic pain, have not been consistently or robustly modeled. To improve future economic modeling in neuropathic pain, further research is suggested into the effect of multiple lines of treatment and treatment failure upon patient outcomes and subsequent treatment effectiveness; the impact of treatment-emergent adverse events upon patient outcomes; and consistent and appropriate pain measures to inform models. The authors further encourage transparent reporting of inputs used to inform cost-effectiveness models, with robust, comprehensive and clear uncertainty analysis and, where feasible, open-source modeling is encouraged.
Closed-form dynamics of a hexarot parallel manipulator by means of the principle of virtual work
NASA Astrophysics Data System (ADS)
Pedrammehr, Siamak; Nahavandi, Saeid; Abdi, Hamid
2018-04-01
In this research, a systematic approach to solving the inverse dynamics of hexarot manipulators is addressed using the methodology of virtual work. For the first time, a closed form of the mathematical formulation of the standard dynamic model is presented for this class of mechanisms. An efficient algorithm for solving this closed-form dynamic model of the mechanism is developed and it is used to simulate the dynamics of the system for different trajectories. Validation of the proposed model is performed using SimMechanics and it is shown that the results of the proposed mathematical model match with the results obtained by the SimMechanics model.
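The virtual-work step at the heart of such formulations can be shown generically: with a Jacobian J mapping the platform twist to actuator rates, the principle of virtual work gives J^T tau = w, where w is the total wrench the actuators must balance. The Python sketch below uses a placeholder Jacobian, a simplified inertia matrix and an example wrench rather than the hexarot model derived in the paper.

```python
# Generic virtual-work step: actuator forces from the platform wrench via J^T tau = w.
import numpy as np

rng = np.random.default_rng(4)
J = np.eye(6) + 0.1 * rng.standard_normal((6, 6))   # placeholder 6x6 Jacobian (twist -> actuator rates)

m, g = 50.0, 9.81
accel = np.array([0.5, 0.0, 1.0, 0.0, 0.0, 0.0])    # desired platform acceleration (example)
inertia = np.diag([m, m, m, 2.0, 2.0, 3.0])         # simplified platform inertia matrix
gravity_wrench = np.array([0.0, 0.0, -m * g, 0.0, 0.0, 0.0])

# Wrench the actuators must supply: inertial terms minus the gravity load.
w = inertia @ accel - gravity_wrench

tau = np.linalg.solve(J.T, w)                       # virtual work: J^T tau = w
print("actuator forces/torques:", np.round(tau, 2))
```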
Writing and reading: connections between language by hand and language by eye.
Berninger, Virginia W; Abbott, Robert D; Abbott, Sylvia P; Graham, Steve; Richards, Todd
2002-01-01
Four approaches to the investigation of connections between language by hand and language by eye are described and illustrated with studies from a decade-long research program. In the first approach, multigroup structural equation modeling is applied to reading and writing measures given to typically developing writers to examine unidirectional and bidirectional relationships between specific components of the reading and writing systems. In the second approach, structural equation modeling is applied to a multivariate set of language measures given to children and adults with reading and writing disabilities to examine how the same set of language processes is orchestrated differently to accomplish specific reading or writing goals, and correlations between factors are evaluated to examine the level at which the language-by-hand system and the language-by-eye system communicate most easily. In the third approach, mode of instruction and mode of response are systematically varied in evaluating effectiveness of treating reading disability with and without a writing component. In the fourth approach, functional brain imaging is used to investigate residual spelling problems in students whose problems with word decoding have been remediated. The four approaches support a model in which language by hand and language by eye are separate systems that interact in predictable ways.
NASA Astrophysics Data System (ADS)
Greenwald, Jared
Any good physical theory must resolve current experimental data as well as offer predictions for potential searches in the future. The Standard Model of particle physics, Grand Unified Theories, Minimal Supersymmetric Models and Supergravity are all attempts to provide such a framework. However, they all lack the ability to predict many of the parameters that each of the theories utilizes. String theory may yield a solution to this naturalness (or self-predictiveness) problem as well as offer a unified theory of gravity. Studies in particle physics phenomenology based on perturbative low energy analysis of various string theories can help determine the candidacy of such models. After a review of principles and problems leading up to our current understanding of the universe, we will discuss some of the best particle physics model building techniques that have been developed using string theory. This will culminate in the introduction of a novel approach to a computational, systematic analysis of the various physical phenomena that arise from these string models. We focus on the necessary assumptions, complexity and open questions that arise while building a fully automated flat-direction analysis program.
Balderson, Michael; Brown, Derek; Johnson, Patricia; Kirkby, Charles
2016-01-01
The purpose of this work was to compare static gantry intensity-modulated radiation therapy (IMRT) with volume-modulated arc therapy (VMAT) in terms of tumor control probability (TCP) under scenarios involving large geometric misses, i.e., those beyond what are accounted for when margin expansion is determined. Using a planning approach typical for these treatments, a linear-quadratic-based model for TCP was used to compare mean TCP values for a population of patients who experiences a geometric miss (i.e., systematic and random shifts of the clinical target volume within the planning target dose distribution). A Monte Carlo approach was used to account for the different biological sensitivities of a population of patients. Interestingly, for errors consisting of coplanar systematic target volume offsets and three-dimensional random offsets, static gantry IMRT appears to offer an advantage over VMAT in that larger shift errors are tolerated for the same mean TCP. For example, under the conditions simulated, erroneous systematic shifts of 15mm directly between or directly into static gantry IMRT fields result in mean TCP values between 96% and 98%, whereas the same errors on VMAT plans result in mean TCP values between 45% and 74%. Random geometric shifts of the target volume were characterized using normal distributions in each Cartesian dimension. When the standard deviations were doubled from those values assumed in the derivation of the treatment margins, our model showed a 7% drop in mean TCP for the static gantry IMRT plans but a 20% drop in TCP for the VMAT plans. Although adding a margin for error to a clinical target volume is perhaps the best approach to account for expected geometric misses, this work suggests that static gantry IMRT may offer a treatment that is more tolerant to geometric miss errors than VMAT. Copyright © 2016 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
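A toy version of this kind of calculation, a Poisson/linear-quadratic TCP averaged over sampled radiosensitivities and random target shifts within a fixed one-dimensional dose profile, is sketched below in Python. The dose profile, clonogen density and population parameters are illustrative assumptions, not values from the study.

```python
# 1-D Monte Carlo sketch of population-mean TCP under random target shifts (illustrative).
import numpy as np

rng = np.random.default_rng(1)
n_fractions = 35
alpha_beta = 10.0
clonogen_density = 1e6                     # clonogens per mm of target (assumed)

x = np.linspace(-60, 60, 1201)             # mm, dose profile coordinate
dose_profile = np.where(np.abs(x) <= 35, 70.0, 70.0 * np.exp(-(np.abs(x) - 35) / 5.0))

def tcp_for_shift(shift_mm, alpha):
    """Poisson TCP for a 60 mm target shifted rigidly by shift_mm inside the profile."""
    target = np.abs(x - shift_mm) <= 30
    d_total = dose_profile[target]
    d_fx = d_total / n_fractions
    sf = np.exp(-alpha * d_total * (1 + d_fx / alpha_beta))   # LQ surviving fraction
    surviving = clonogen_density * (x[1] - x[0]) * sf
    return np.exp(-surviving.sum())

tcps = []
for _ in range(2000):                              # Monte Carlo over patients
    alpha = max(rng.normal(0.3, 0.06), 1e-3)       # population spread in radiosensitivity
    shift = rng.normal(0.0, 3.0)                   # random setup error (mm)
    tcps.append(tcp_for_shift(shift, alpha))
print(f"population mean TCP: {np.mean(tcps):.3f}")
```

Comparing IMRT and VMAT in this framework amounts to swapping in the two plans' dose distributions and repeating the same Monte Carlo average.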
NASA Astrophysics Data System (ADS)
Shen, Xiang; Liu, Bin; Li, Qing-Quan
2017-03-01
The Rational Function Model (RFM) has proven to be a viable alternative to the rigorous sensor models used for geo-processing of high-resolution satellite imagery. Because of various errors in the satellite ephemeris and instrument calibration, the Rational Polynomial Coefficients (RPCs) supplied by image vendors are often not sufficiently accurate, and there is therefore a clear need to correct the systematic biases in order to meet the requirements of high-precision topographic mapping. In this paper, we propose a new RPC bias-correction method using the thin-plate spline modeling technique. Benefiting from its excellent performance and high flexibility in data fitting, the thin-plate spline model has the potential to remove complex distortions in vendor-provided RPCs, such as the errors caused by short-period orbital perturbations. The performance of the new method was evaluated by using Ziyuan-3 satellite images and was compared against the recently developed least-squares collocation approach, as well as the classical affine-transformation and quadratic-polynomial based methods. The results show that the accuracies of the thin-plate spline and the least-squares collocation approaches were better than the other two methods, which indicates that strong non-rigid deformations exist in the test data because they cannot be adequately modeled by simple polynomial-based methods. The performance of the thin-plate spline method was close to that of the least-squares collocation approach when only a few Ground Control Points (GCPs) were used, and it improved more rapidly with an increase in the number of redundant observations. In the test scenario using 21 GCPs (some of them located at the four corners of the scene), the correction residuals of the thin-plate spline method were about 36%, 37%, and 19% smaller than those of the affine transformation method, the quadratic polynomial method, and the least-squares collocation algorithm, respectively, which demonstrates that the new method can be more effective at removing systematic biases in vendor-supplied RPCs.
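The bias-correction idea, interpolating GCP residuals in image space with a thin-plate spline and subtracting them from subsequent RPC projections, can be sketched with SciPy's RBFInterpolator, whose default kernel is a thin-plate spline. The rpc_project() stub and the simulated bias below are placeholders, not the Ziyuan-3 RPCs or the authors' implementation.

```python
# Thin-plate spline correction of RPC projection residuals at GCPs (illustrative).
import numpy as np
from scipy.interpolate import RBFInterpolator

def rpc_project(lon, lat, h):
    """Stand-in for the vendor RPC projection (ground -> image line/sample)."""
    return np.column_stack([1000 + 9000 * lat, 2000 + 9000 * lon])   # hypothetical

rng = np.random.default_rng(2)
gcp_ground = np.column_stack([rng.uniform(0, 1, 21), rng.uniform(0, 1, 21),
                              rng.uniform(0, 500, 21)])              # lon, lat, height
projected = rpc_project(*gcp_ground.T)
measured = projected + np.column_stack([                             # simulated RPC bias
    3.0 + 0.5 * np.sin(projected[:, 1] / 2000),
    -2.0 + 0.3 * np.cos(projected[:, 0] / 2000)])

# Fit a thin-plate spline to the GCP residuals (one output per image coordinate).
residuals = measured - projected
tps = RBFInterpolator(projected, residuals, kernel="thin_plate_spline")

def corrected_project(lon, lat, h):
    raw = rpc_project(lon, lat, h)
    return raw + tps(raw)

check = corrected_project(*gcp_ground.T)
print("RMS residual after correction (pixels):",
      np.sqrt(np.mean((check - measured) ** 2)))
```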
State-Space Formulation for Circuit Analysis
ERIC Educational Resources Information Center
Martinez-Marin, T.
2010-01-01
This paper presents a new state-space approach for temporal analysis of electrical circuits. The method systematically obtains the state-space formulation of nondegenerate linear networks without using concepts of topology. It employs nodal/mesh systematic analysis to reduce the number of undesired variables. This approach helps students to…
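For readers unfamiliar with the formulation, a series RLC circuit gives the smallest useful example: with states x = (v_C, i_L) and the source voltage u as input, dx/dt = A x + B u with A = [[0, 1/C], [-1/L, -R/L]] and B = [0, 1/L]^T. The short Python sketch below simulates a step response; it is a standard textbook example with arbitrary component values, not the article's nodal/mesh procedure.

```python
# State-space step response of a series RLC circuit (example values).
import numpy as np
from scipy.integrate import solve_ivp

R, L, C = 100.0, 10e-3, 1e-6             # ohms, henries, farads
A = np.array([[0.0, 1.0 / C],
              [-1.0 / L, -R / L]])
B = np.array([[0.0], [1.0 / L]])

def f(t, x):
    u = 5.0                               # 5 V step input
    return A @ x + (B * u).ravel()

sol = solve_ivp(f, (0.0, 5e-3), [0.0, 0.0], max_step=1e-6)
v_c, i_l = sol.y
print(f"capacitor voltage at t = 5 ms: {v_c[-1]:.3f} V (settles toward 5 V)")
```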
A Systematic Approach to Subgroup Classification in Intellectual Disability
ERIC Educational Resources Information Center
Schalock, Robert L.; Luckasson, Ruth
2015-01-01
This article describes a systematic approach to subgroup classification based on a classification framework and sequential steps involved in the subgrouping process. The sequential steps are stating the purpose of the classification, identifying the classification elements, using relevant information, and using clearly stated and purposeful…